Encyclopedia of Computational Neuroscience

Living Edition
| Editors: Dieter Jaeger, Ranu Jung

Somatosensory Prosthesis

  • Sliman J. Bensmaia
Living reference work entry
DOI: https://doi.org/10.1007/978-1-4614-7320-6_561-1

Our ability to dexterously manipulate objects relies heavily on somatosensory signals from the hand (Johansson and Flanagan 2009). Receptors embedded in the skin, joints, and muscles convey information about the size, shape, and texture of grasped objects, and signal whether these are slipping from our grasp. The sensory experience of the hand also plays an important role in conferring embodiment upon it, making it feel like a part of us. Without these signals, using the hand to perform even the most rudimentary tasks would be slow, clumsy, and effortful. Given the importance of somatosensation in natural motor behavior, achieving a clinically viable upper-limb neuroprosthesis will require that this sense be restored.

Sensory Substitution

There are a variety of approaches to conveying useful somatosensory feedback for use in upper-limb neuroprostheses. The least invasive approach consists of substituting lost sensation by mechanically stimulating – with vibratory motors, for example – regions of the sensory sheet that are still intact, such as the face (Stepp et al. 2012). In both amputees and individuals with upper spinal cord injury – the two patient populations targeted by these technologies – mechanical stimulation can be applied to signal movements of the hand and objects contacting it. While sensory substitution does not require surgical intervention, the patient must learn to associate patterns of skin stimulation with hand movements or with object contact, which may require extensive training. Furthermore, given spatial constraints, patients can only be instrumented with a relatively small number of stimulators, so sensory substitution can convey only limited information compared to the high bandwidth of the native hand.
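As an illustration, the mapping from prosthesis sensors to substitute stimulators can be as simple as a monotonic transformation of measured contact force into vibration intensity. The sketch below is a minimal, hypothetical example; the sensor range, force threshold, and motor interface are assumptions for illustration, not drawn from any specific device.

```python
# Minimal sketch of a sensory-substitution mapping: contact force measured on a
# prosthetic fingertip is re-expressed as vibration intensity on an intact skin
# region (e.g., the upper arm or face). The sensor range and the notion of a
# "duty cycle" output are hypothetical placeholders, not a specific device API.

def force_to_vibration(force_newtons, max_force=10.0):
    """Map contact force (0..max_force N) to a vibration duty cycle in [0, 1]."""
    normalized = max(0.0, min(force_newtons / max_force, 1.0))
    # A compressive (square-root) mapping allocates more of the vibrotactile
    # range to light touches, which carry most of the information during grasp.
    return normalized ** 0.5

# Example: a 2.5 N grip force drives the motor at ~50% duty cycle.
print(force_to_vibration(2.5))
```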

Neural Interfaces

An alternative approach consists of interfacing directly with the nervous system. In intact individuals, signals from the hand are carried via nerve fibers that ascend the arm and the spinal cord before they synapse onto neurons in the brainstem (in the dorsal column nuclei). These second-order neurons then cross the midline and project to the ventroposterior nucleus of the thalamus, which in turn projects to primary somatosensory cortex. In principle, an interface can be developed to interact with any of these neuronal populations by developing the right type of implantable device (Weber et al. 2012), but the different potential interface sites vary in their accessibility. The neural interface approach consists of eliciting patterns of neuronal activation informative of the state of the limb and of events impinging upon it, typically through electrical stimulation of the neuronal tissue. Let us briefly compare and contrast interfaces with the nerve and with the brain, at the two anatomical extremes of the interface continuum.

Peripheral interfaces: Afferents can be accessed either in the residual nerve (Dhillon and Horch 2005) or at the dorsal root ganglia (Hokanson et al. 2011), where their cell bodies are located. Only a relatively small number of different types of mechanoreceptive afferents convey proprioceptive and cutaneous information, and the properties of these neurons are both highly stereotyped and well characterized. Furthermore, these neurons are not interconnected, so they essentially convey independent signals to the brain, which greatly simplifies stimulation strategies. The idea, then, is to attempt to reproduce natural patterns of activation in the nerve by strategically injecting small electrical currents into it, ideally through many independently controlled electrodes. The patterns of electrical stimulation can be modulated in space and time to produce naturalistic patterns of afferent activation. A variety of powerful single-input, single-output models have been developed to convert the output of sensors on the prosthesis into desired patterns of neuronal activation, and these patterns can in principle be effected in the nerve through electrical stimulation (Dong et al. 2012; Kim et al. 2009, 2010); multi-input, multi-output models have also been explored (Daly et al. 2012; Liu et al. 2011).
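For concreteness, the sketch below caricatures a single-input, single-output encoding model of the kind referenced above: a measured skin-indentation trace is converted into a train of afferent spike times by a leaky integrate-and-fire element, and each spike time would then be translated into a stimulation pulse. This is a toy stand-in under stated assumptions, not the models of Kim et al. or Dong et al.; all parameter values are illustrative.

```python
import numpy as np

def indentation_to_spike_times(indentation_mm, dt=0.001, gain=100.0,
                               tau=0.02, threshold=1.0):
    """Toy leaky integrate-and-fire encoder: skin indentation -> spike times (s).

    Each returned spike time could then trigger a biphasic current pulse
    delivered through a nerve electrode. All parameters are illustrative.
    """
    v, spikes = 0.0, []
    for i, x in enumerate(indentation_mm):
        # Leaky integration of the (rectified) indentation signal.
        v += dt * (-v / tau + gain * max(x, 0.0))
        if v >= threshold:
            spikes.append(i * dt)
            v = 0.0  # reset after each spike
    return spikes

# Example: a 0.5 s ramp-and-hold indentation sampled at 1 kHz.
t = np.arange(0, 0.5, 0.001)
indentation = np.clip(t / 0.1, 0, 1.0)  # ramp to 1 mm over 100 ms, then hold
print(indentation_to_spike_times(indentation)[:5])
```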

Cortical interfaces: A cortical interface has the advantage that it can be applied to patients with upper spinal cord injury, for whom communication between the nerve and the brain has been severed. Another desirable feature of cortical interfaces is that they tap into a representation that has already been elaborated: sensory systems extract behaviorally relevant information from the relatively unelaborated representation in the nerve through successive stages of processing, and individual cortical neurons accordingly encode more complex stimulus features than do their peripheral counterparts. Furthermore, cortical neurons are organized topographically, such that nearby neurons tend to respond to similar stimulus features. This organizational scheme, largely absent in the nerve, may thus be exploited in attempting to elicit artificial percepts.

There are two general approaches to conveying sensory information through a cortical interface. The first, analogous to its peripheral counterpart, consists of attempting to reproduce naturalistic patterns of neuronal activation through electrical stimulation. To the extent that the patterns of activation are naturalistic, the evoked percepts will be verisimilar and thus intuitive, requiring little training on the part of the patient. Within this approach, the functional topography of the brain can be exploited. For example, information about contact location – where is the object contacting the skin? – is important for object manipulation. In intact individuals, where we feel a poke on the skin is determined by which population of neurons is activated. Thus, information about contact location might be conveyed by stimulating small populations of somatosensory neurons, thereby eliciting a percept that is localized to a small patch of skin (Tabot et al. 2013). In amputees, the sensation is projected to a phantom limb; in tetraplegic patients, to their deafferented limb. Now imagine that, any time the prosthetic thumb touches an object, stimulation is triggered through electrodes implanted in the thumb representation of somatosensory cortex. The evoked sensation will be projected to the phantom or deafferented thumb, leading to an intuitive sense of where contact occurred. In fact, studies suggest that, although stimulation of the thumb representation is initially experienced on the phantom or deafferented thumb, if patients consistently see the prosthetic thumb contacting objects while feeling the paired sensations, the sensations will begin to be experienced as arising from the prosthesis itself (Marasco et al. 2011). Artificial touch may thus lead to embodiment of the robotic limb! While some degree of naturalism might be achieved with a brain interface, cortical neurons are embedded in highly complex networks, and electrical stimulation of cortical tissue leads to diffuse activation of neurons with very different response properties despite the spatial organization of the brain (Histed et al. 2009). Producing truly naturalistic patterns of activation through electrical stimulation is therefore likely to be very difficult, if not impossible.
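A minimal sketch of this location-based scheme follows: each tactile sensor on the prosthetic hand is associated with the cortical electrode(s) whose evoked percept falls on the corresponding skin region, and contact triggers stimulation through those electrodes. The sensor labels, electrode indices, and the stimulate() placeholder are invented for illustration; they do not correspond to any particular array or stimulator API.

```python
# Sketch of a somatotopic contact-location mapping for a cortical interface.
# Each prosthetic sensor is paired with electrodes whose stimulation evokes a
# percept on the corresponding (phantom or deafferented) skin region. The
# sensor labels, electrode indices, and the stimulate() call are hypothetical.

SOMATOTOPIC_MAP = {
    "thumb_tip": [12, 13],    # electrodes in the thumb representation
    "index_tip": [27],        # electrode in the index-finger representation
    "palm": [41, 42, 45],     # electrodes in the palm representation
}

def on_contact(sensor_name, contact_force, force_threshold=0.2):
    """Trigger stimulation on the electrodes mapped to the contacted region."""
    if contact_force < force_threshold:
        return []  # ignore incidental, very light contacts
    electrodes = SOMATOTOPIC_MAP.get(sensor_name, [])
    for electrode in electrodes:
        # Placeholder for a call into the stimulator: amplitude could be
        # scaled with contact force to convey pressure as well as location.
        print(f"stimulate(electrode={electrode}, force={contact_force:.2f})")
    return electrodes

# Example: the prosthetic thumb touches an object with 1.5 N of force.
on_contact("thumb_tip", 1.5)
```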

Given the difficulty of exactly reproducing natural patterns of activation, a strictly biomimetic approach – the faithful reproduction of existing neuronal representations – may also be unnecessary. Another approach to conveying sensory feedback through a cortical interface consists of exploiting the brain's ability to learn. Indeed, the brain is known to be highly adaptable, so it is not unreasonable to hypothesize that, if a systematic mapping is created between sensory events and patterns of intracortical microstimulation, patients will learn to use this artificial sensory feedback to control the limb. In fact, animals are able to learn to use completely arbitrary but systematic artificial sensations to guide behavior (Thomson et al. 2013; O'Doherty et al. 2011). However, the space of sensations tested in these experiments is small relative to the almost infinite space of sensory events associated with the hand, and it is unclear whether the adaptation-based approach will scale up sufficiently for use in upper-limb neuroprostheses. Most likely, biomimicry and adaptation will both play critical roles in successful attempts to convey sensory feedback.
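The sketch below illustrates what an arbitrary but systematic mapping of this kind might look like: a single continuous sensor variable is converted into the pulse rate of an intracortical microstimulation train. The choice of variable (grip aperture), the frequency range, and the linear rule are illustrative assumptions, not the protocols used in the cited studies.

```python
# Sketch of a learned (non-biomimetic) feedback mapping: an arbitrary sensor
# variable, here the prosthetic hand's grip aperture, is systematically mapped
# onto the pulse rate of an ICMS train. The mapping is arbitrary by design;
# the user is expected to learn its meaning through consistent use.

def aperture_to_pulse_rate(aperture_cm, min_rate=20.0, max_rate=300.0,
                           max_aperture=10.0):
    """Map grip aperture (0..max_aperture cm) to an ICMS pulse rate (Hz)."""
    fraction = max(0.0, min(aperture_cm / max_aperture, 1.0))
    # Linear rule; any consistent monotonic mapping would do, since the
    # brain is expected to adapt to whatever systematic code is used.
    return min_rate + fraction * (max_rate - min_rate)

# Example: a half-open hand (5 cm aperture) yields a 160 Hz pulse train.
print(aperture_to_pulse_rate(5.0))
```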

Conclusions

Multielectrode arrays have been implanted in the brains of human patients, and algorithms have been developed to decode, from signals in the motor areas of the brain, how these patients wish to move anthropomorphic robotic arms (Hochberg et al. 2012; Collinger et al. 2013). In other words, these patients can control robotic limbs by thought alone. While these studies constitute staggering examples of scientific and technological achievement, the evoked movements are slow and inaccurate, and the neuroprostheses are only viable in a laboratory setting. Given the importance of somatosensation in guiding movement, incorporating somatosensory feedback into the next generation of prostheses may bring about a major improvement in the dexterity of these limbs, and may eventually lead to a clinically viable option for restoring sensorimotor function in amputees and tetraplegic patients.

References

  1. Collinger JL et al (2013) High-performance neuroprosthetic control by an individual with tetraplegia. Lancet 381(9866):557–564
  2. Daly J, Liu J, Aghagolzadeh M, Oweiss K (2012) Optimal space-time precoding of artificial sensory feedback through multichannel microstimulation in bi-directional brain-machine interfaces. J Neural Eng 9(6):065004
  3. Dhillon GS, Horch KW (2005) Direct neural sensory feedback and control of a prosthetic arm. IEEE Trans Neural Syst Rehabil Eng 13(4):468–472
  4. Dong Y et al (2012) A simple model of mechanotransduction in primate glabrous skin. J Neurophysiol 109:1350–1359
  5. Histed MH, Bonin V, Reid RC (2009) Direct activation of sparse, distributed populations of cortical neurons by electrical microstimulation. Neuron 63(4):508–522
  6. Hochberg LR et al (2012) Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature 485(7398):372–375
  7. Hokanson JA, Ayers CA, Gaunt RA, Bruns TM, Weber DJ (2011) Effects of spatial and temporal parameters of primary afferent microstimulation on neural responses evoked in primary somatosensory cortex of an anesthetized cat. In: Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, pp 7533–7536
  8. Johansson RS, Flanagan JR (2009) Coding and use of tactile signals from the fingertips in object manipulation tasks. Nat Rev Neurosci 10(5):345–359
  9. Kim SS et al (2009) Conveying tactile feedback in sensorized hand neuroprostheses using a biofidelic model of mechanotransduction. IEEE Trans Biomed Circuits Syst 3(6):398–404
  10. Kim SS, Sripati AP, Bensmaia SJ (2010) Predicting the timing of spikes evoked by tactile stimulation of the hand. J Neurophysiol 104(3):1484–1496
  11. Liu JB, Khalil HK, Oweiss KG (2011) Neural feedback for instantaneous spatiotemporal modulation of afferent pathways in bi-directional brain-machine interfaces. IEEE Trans Neural Syst Rehabil Eng 19(5):521–533
  12. Marasco PD, Kim K, Colgate JE, Peshkin MA, Kuiken TA (2011) Robotic touch shifts perception of embodiment to a prosthesis in targeted reinnervation amputees. Brain 134(Pt 3):747–758
  13. O'Doherty JE et al (2011) Active tactile exploration using a brain-machine-brain interface. Nature 479(7372):228–231
  14. Stepp CE, An Q, Matsuoka Y (2012) Repeated training with augmentative vibrotactile feedback increases object manipulation performance. PLoS One 7(2):e32743
  15. Tabot GA, Dammann JF III, Berg JA, Tenore FV, Boback JL, Vogelstein RJ, Bensmaia SJ (2013) Restoring the sense of touch with a prosthetic hand through a brain interface. Proc Natl Acad Sci USA 110:18279–18284
  16. Thomson EE, Carra R, Nicolelis MA (2013) Perceiving invisible light through a somatosensory cortical prosthesis. Nat Commun 4:1482
  17. Weber DJ, Friesen R, Miller LE (2012) Interfacing the somatosensory system to restore touch and proprioception: essential considerations. J Mot Behav 44(6):403–418

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  1. Department of Organismal Biology and Anatomy, University of Chicago, Chicago, USA