Multisensory Integration of Vision and Intracortical Microstimulation for Sensory Substitution and Augmentation

Maria C. Dadarlat¹,²,³, Joseph E. O’Doherty¹ & Philip N. Sabes¹,²

The performance of brain-machine interfaces (BMIs) will ultimately be limited by the quality and nature of the sensory feedback provided to the user. For example, in a recent comparison of BMI and natural-movement control of a visible cursor, BMI performance approached that of natural movements only when proprioceptive feedback was received from the intact arm, congruent with the cursor movement [1]. In real BMI applications, natural afferent sensory input is limited to vision, so there is widespread interest in developing techniques for providing artificial, supplemental feedback to replace proprioception. Here we present a novel approach to delivering supplemental sensory feedback via intracortical microstimulation (ICMS). We take advantage of the brain’s natural plasticity to implement a learning-based approach to artificial sensory feedback: a monkey is exposed to a novel ICMS input with a fixed but arbitrary mapping to a natural sensory signal, vision. We hypothesized that strict temporal congruency between these two signals would be sufficient for the monkey to learn to interpret and utilize the ICMS signal. This approach worked: we have demonstrated multisensory integration of vision with an arbitrary ICMS signal, leading to both sensory augmentation and sensory substitution during a visually guided reaching task.

Methods: One monkey was trained to perform an instructed-delay center-out reaching task to a hidden target, guided by a continuously updating error signal indicating the vector between the current hand position and the target (Fig. 1). The error vector is encoded by a random-dot visual flow field of variable coherence and/or a continuous multichannel ICMS signal (Fig. 1, bottom panel) delivered to the shoulder/arm area of primary somatosensory cortex.
There are three trial types: vision-only trials serve as a measure of baseline performance, ICMS-only trials measure sensory substitution, and vision+ICMS trials measure sensory augmentation via multisensory integration. We evaluated performance for each trial type using a variety of behavioral measures.

Results: As expected, on vision-only trials the variability of the animal’s performance improves with increasing visual coherence; an increase in the precision of the visual input improves the precision of reaches. This can be seen in the correlation between the angle of the target (with respect to the initial hand position) and the actual initial movement direction (Fig. 2, blue trace). Sensory substitution of vision by ICMS is evident in ICMS-only trials: the ICMS signal alone is sufficient for the animal to complete a smooth, direct reach to an unseen target, with performance equivalent to that of the vision-only condition at low-to-moderate coherences (Fig. 2, green trace). Next, we observe sensory augmentation when both ICMS and vision are available, with performance better than in either the vision-only or the ICMS-only condition (Fig. 2, red trace). Indeed, performance in vision+ICMS trials is on par with that expected given optimal (minimum-variance) cue combination of the visual and ICMS signals (Fig. 2, dashed red trace). Similar evidence for sensory substitution and sensory augmentation was observed for other behavioral measures, including total movement time, path length, and the number of movement corrections.

In summary, we have demonstrated the use of continuous, multichannel, non-biomimetic ICMS to deliver learning-based artificial sensory feedback that can augment or even replace natural visual feedback in a complex behavioral task. These results demonstrate the potential power of a learning-based approach to artificial sensory feedback in the BMI context.

[1] Suminski A.J., Tkach D.C., Fagg A.H., Hatsopoulos N.G., J. Neurosci. (2010) 30(50): 16777-16787
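The minimum-variance prediction for the vision+ICMS condition follows from standard inverse-variance-weighted cue combination. The sketch below is a hypothetical illustration, not the authors' analysis code; the weighted average treats the two directional estimates as locally linear, which is a reasonable approximation only for small angular errors.

```python
def combine_cues(est_v, var_v, est_i, var_i):
    """Minimum-variance combination of two independent cue estimates.

    Each cue is weighted by its inverse variance (its reliability), so the
    combined estimate has lower variance than either cue alone.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_i)  # weight on vision
    combined = w_v * est_v + (1.0 - w_v) * est_i
    combined_var = 1.0 / (1.0 / var_v + 1.0 / var_i)   # always <= min(var_v, var_i)
    return combined, combined_var
```

In this scheme, the predicted bimodal performance in Fig. 2 (dashed red trace) exceeds both unimodal traces precisely because the combined variance is smaller than either unimodal variance.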

¹ Center for Integrative Neuroscience and Department of Physiology, University of California, San Francisco, CA
² Joint Graduate Group in Bioengineering, University of California, Berkeley and University of California, San Francisco
³ Contact: [email protected]

Figure 1. Behavioral Task: A monkey performs an instructed-delay center-out reaching task to an unseen target, guided by information about the error vector between the current hand location and the target. This vector is encoded visually, using a random-dot flow field (not shown), and/or via a multichannel ICMS signal. The figure shows the timeline for a sample trial (A) and the ICMS pulse trains (from four of eight electrodes) on that trial (B). The eight electrodes were arbitrarily assigned to eight preferred directions; the ICMS pulse rate on each electrode was the product of two factors, one scaling linearly with the distance to the target and the other representing a “cosine tuning” of rate with respect to the preferred direction of the given electrode. In the example trial, the preferred direction of electrode 3 is closest to the initial error vector.
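The encoding rule described in the caption can be sketched as follows. The gain value and the half-wave rectification of the cosine factor (so that pulse rates never go negative) are illustrative assumptions, not parameters reported here.

```python
import numpy as np

def pulse_rates(error_vec, preferred_dirs_deg, gain=2.0):
    """Sketch of the described encoding: each electrode's pulse rate is the
    product of a factor scaling linearly with distance to the target and a
    cosine-tuning factor relative to that electrode's preferred direction.

    gain and the rectification below are hypothetical choices.
    """
    dist = np.linalg.norm(error_vec)                   # distance factor
    theta = np.arctan2(error_vec[1], error_vec[0])     # error-vector angle
    pd = np.deg2rad(np.asarray(preferred_dirs_deg, dtype=float))
    tuning = np.cos(theta - pd)                        # cosine tuning, in [-1, 1]
    return gain * dist * np.clip(tuning, 0.0, None)    # rectified pulse rates
```

Under this rule, the electrode whose preferred direction best matches the current error vector fires fastest, and all rates fall to zero as the hand reaches the target (distance factor goes to zero), consistent with the example trial in panel B.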

Figure 2. Initial Angle Estimation: The monkey’s ability to extract target-angle information from ICMS is quantified by modeling the relationship between the target angle and the initial movement angle. The circular coefficient of determination is calculated for the vision-only, vision+ICMS, and ICMS-only conditions, averaged over seven days of testing. Predicted values for vision+ICMS are based on a minimum-variance model of cue combination, with cue variability inferred from the unimodal conditions, under the simplifying assumption that variation in performance is due only to variability in estimating the error vector.
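One common way to quantify the association between two angular variables is the squared circular-circular correlation of Jammalamadaka and SenGupta; the sketch below uses that definition as a plausible stand-in, though the exact "circular coefficient of determination" used for Fig. 2 may differ.

```python
import numpy as np

def circular_mean(a):
    """Mean direction of a set of angles (radians)."""
    return np.arctan2(np.sin(a).mean(), np.cos(a).mean())

def circular_r2(target_ang, move_ang):
    """Squared circular-circular correlation between target angles and
    initial movement angles (both in radians).

    Returns a value in [0, 1]; 1 means movement angle is perfectly
    predicted by target angle.
    """
    a = np.asarray(target_ang) - circular_mean(np.asarray(target_ang))
    b = np.asarray(move_ang) - circular_mean(np.asarray(move_ang))
    num = np.sum(np.sin(a) * np.sin(b))
    den = np.sqrt(np.sum(np.sin(a) ** 2) * np.sum(np.sin(b) ** 2))
    return (num / den) ** 2
```

Applied per condition (vision-only, ICMS-only, vision+ICMS), a measure like this would rise toward 1 as reaches become more precisely directed, matching the trend with coherence described in the Results.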
