Sensory Prosthetics

Many disabled people live in soundless or sightless worlds. Psychological research has sought to combine knowledge of how sensory systems work with advances in bioengineering, producing sensory prosthetic devices that supply sensory input a person's own receptors cannot provide (Patil & Turner, 2008). Lozano, Kaczmarek and Santello (2009, p. 50) noted that electrical stimulation applied to the body's surface can convey touch information by activating afferents beneath the skin, thereby producing perception. According to them, augmentation and sensory substitution are the two common applications of electrotactile stimulation (ETS). One notable device in this regard is Paul Bach-y-Rita's (2004) tactile, tongue-based stimulator, which converts digitized images from a camera into patterns on an array of electrodes that stimulate tactile receptors on the tongue, communicating spatial information to the brain.

This review investigates whether the tongue is the most suitable organ for providing detailed, high-resolution input of visual stimuli. It examines the sensory and neural mechanisms involved when Bach-y-Rita's stimulator substitutes for visual input by working through the tactile receptors of the tongue. Stated as a research question: to what extent can an artificial sensory system, such as Bach-y-Rita's device, compensate for blindness? Is there room for further advancement? Beyond helping the blind, could the device have practical applications for people with normal vision, for example in pitch-black, hazy, or smoky environments? The aim of this paper is to provide an in-depth overview of current scientific research on Bach-y-Rita's device, presenting a critical review of relevant findings, examining the device from several perspectives, and considering some conflicting research data in the same domain.

An individual who loses an entire sensory system has, in effect, suffered a brain lesion; in the case of blindness, this amounts to the loss of more than one million nerve fibers. According to Bach-y-Rita (2003, p. 643), there is growing scientific evidence that the human brain is capable of extensive reorganization after damage, even several years after the trauma, given suitable rehabilitation. Laboratory studies of sensory substitution have shown that information originating from artificial sensory receptors can be relayed to the brain and that, with appropriate training, experiences characteristic of the missing sensory system can be perceived. Blind people have demonstrated the ability to recognize faces, read, locate objects, and observe and trace movements in three dimensions (Sampaio, Maris & Bach-y-Rita, 2001, p. 204).

With advances in technology and related disciplines such as bioengineering, devices with human-machine interfaces have been developed that offer convincing potential for practical, cosmetically acceptable aids for people who have suffered sensory loss. These devices are commonly known as vision substitution systems. A typical system consists of a video camera, a video capture card, a laptop computer, a tongue display unit (TDU) with an electrode array, and image-processing software (Sampaio, Maris & Bach-y-Rita, 2001, p. 204). An experiment in which human subjects used a tongue-based human-machine interface with this equipment (Sampaio, Maris & Bach-y-Rita, 2001, p. 204) produced interesting findings.

The results indicated that it is possible to quantify visual acuity using a sensory substitution system, and that a human-machine interface via the tongue can provide a pathway to useful sensory substitution. Using the central and peripheral structures of the intact somatosensory system to convey information from an auxiliary receptor, such as a TV camera, would also permit evaluation of the late brain plasticity associated with sensory substitution (p. 206).

Why, then, is there so much interest in a tongue-based device? The tongue is appropriate because it provides an interface around which sensory substitution systems can be built and combined with supplementary technology. In principle, any transducible input can be rendered as a two-dimensional pattern on the tongue array and conveyed to the brain, where, with training, it becomes integrated into perception. Research has shown that, for the tongue, voltage-controlled stimulation has reasonably desirable qualities; it also permits simpler circuitry, which facilitates miniaturization using MEMS technology (Bach-y-Rita, Tyler & Kaczmarek, 2009, p. 289).

Electrotactile stimuli are delivered via the TDU, a 12×12 grid of gold-plated circular copper electrodes held together by a Mylar (polyester) strip. The tongue display is a flexible component placed on the dorsum of the tongue and is designed to allow simultaneous stimulation of all electrodes (Bach-y-Rita, Tyler & Kaczmarek, 2009, p. 289). The stimuli are delivered as 40-microsecond pulses in bursts of three, with bursts repeated at 50 Hz and a 200 Hz pulse rate within each burst (p. 290). Dorsal and lateral areas of the tongue show sensation thresholds roughly 32% higher than frontal and medial locations, owing to "differences in tactile sensor density and distribution" (Bach-y-Rita, Tyler & Kaczmarek, 2009, p. 290). Linear regression models fitted to the experimental data made it possible to develop algorithms that let users adjust the average stimulus level and intensity as a function of location on the tongue (p. 291).
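
To make the timing scheme and the location-dependent intensity correction concrete, here is a minimal Python sketch assuming the parameters quoted above (40-microsecond pulses, bursts of three at 50 Hz, a 200 Hz within-burst rate, a 12×12 array). The function names, the purely linear row-wise correction, and the uniform use of the ~32% figure are illustrative assumptions, not the published algorithm.

```python
PULSE_WIDTH_US = 40        # each pulse lasts ~40 microseconds (documented parameter, unused below)
PULSES_PER_BURST = 3       # three pulses per burst
BURST_RATE_HZ = 50         # bursts repeat at 50 Hz
INTRA_BURST_RATE_HZ = 200  # pulse rate within a burst

def pulse_onsets_us(duration_ms=40):
    """Return pulse onset times (in microseconds) over one stimulation window."""
    burst_period_us = 1_000_000 // BURST_RATE_HZ        # 20,000 us between burst onsets
    pulse_period_us = 1_000_000 // INTRA_BURST_RATE_HZ  # 5,000 us between pulses in a burst
    onsets = []
    t = 0
    while t < duration_ms * 1000:
        for k in range(PULSES_PER_BURST):
            onsets.append(t + k * pulse_period_us)
        t += burst_period_us
    return onsets

def location_scaled_voltage(base_voltage, row, n_rows=12, rear_penalty=0.32):
    """Crude linear correction: raise the stimulus toward the rear (dorsal) rows,
    which the cited data report as roughly 32% less sensitive than frontal sites."""
    rear_fraction = row / (n_rows - 1)   # 0.0 at the front of the array, 1.0 at the back
    return base_voltage * (1 + rear_penalty * rear_fraction)

print(pulse_onsets_us(40))               # onsets for a 40 ms window: two bursts of three pulses
print(location_scaled_voltage(5.0, 11))  # rearmost row receives ~32% more than the base level
```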


The sensory receptors of the tongue lie very close to the surface, because the surrounding mouth provides sufficient protection. Saliva, rich in ions, is a good electrolytic medium for signal transduction from the tongue receptors to the brain. The tongue has been shown to be more sensitive than the fingertips in interpreting tactile stimuli: it requires only about 3% of the voltage the fingertip requires to respond to stimulation, as well as much less current (Bach-y-Rita & Kercel, 2003, p. 542).

Earlier studies showed that a TV camera could capture optical images that were transduced into direct or vibratory electrical signals and delivered through skin receptors at various body sites, such as the fingertip and the back. The visual information then reaches perceptual centers for analysis via somatosensory structures and pathways. Changing the location of the input array does not disorient subjects; they adapt easily because they are trained to perceive the image not on the skin but correctly located in space. Even a reduced-resolution sensory system can provide the information required to perceive complex images (Bach-y-Rita & Kercel, 2003, p. 543).

In these experiments, skin limitations such as poor two-point resolution did not prevent high performance, because the brain is able to extract information from the patterns of stimulation. In an experiment in which an immobile tactile vision substitution system (TVSS) presented the tactile matrix to the subjects' backs, blind subjects identified a ball rolling off a table, estimated its position on the table, and judged the precise moment and position at which to bat it once it was in free fall. With training, performance was nearly perfect (Bach-y-Rita & Kercel, 2003, p. 543).


Natural sensory systems do not normally experience overload, chiefly because the central nervous system (CNS) can select the specific information required to handle the situation a given context presents. Indeed, research as early as 1970 revealed that various attempts to develop sensory aids ran into problems of sensory overload. Humans are sharply limited in the rate at which they can process serially presented information, whereas visual perception thrives when flooded with information, as when viewing an entire scene or a full page of text; when the input is reduced to a trickle, visual perception gradually falls apart or diminishes (Bach-y-Rita & Kercel, 2003, p. 543).

In another experiment, blind subjects viewed letters cut from black cardboard on a white background; a camera transmitted the images to a laptop computer that translated them into electrical impulses delivered to the tongue as electrotactile pulses (Chebat et al., 2007, p. 1901). Translation of the image follows a grayscale: the strongest signal corresponds to black, no signal is transmitted for white, and intermediate shades produce intermediate intensities depending on how close they are to black. This encoding allows the brain, with time and practice, to distinguish the intensities and interpret them as shades. Stimuli were therefore presented at full contrast, with black converted into electrotactile pulses and white producing no stimulation (p. 1902). In a similar experiment, Essick et al. established that the smallest letter the tongue could recognize averaged 5.1 mm (p. 1903).
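
The grayscale-to-stimulus mapping described above can be sketched as follows, assuming a 12×12 frame matching the TDU array; the function and variable names and the linear scaling are illustrative assumptions rather than code from the cited studies.

```python
def image_to_stimuli(gray_image, max_level=255):
    """gray_image: 12x12 nested list of grayscale values (0 = black, 255 = white).
    Returns a 12x12 grid of stimulation intensities in [0.0, 1.0],
    with black mapped to full stimulation and white mapped to none."""
    return [[(max_level - pixel) / max_level for pixel in row] for row in gray_image]

# Example frame: a white field with a black vertical bar in the middle columns.
frame = [[0 if 5 <= col <= 6 else 255 for col in range(12)] for _ in range(12)]
stimuli = image_to_stimuli(frame)
print(stimuli[0])  # full stimulation (1.0) only where the black bar falls
```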

Neural systems can adapt so as to benefit from neuroprosthetic devices. Beyond error-correcting feedback mechanisms, neural adaptation to the device's circuitry, and adaptive learning built into the circuitry itself, are probably the critical features that would allow a neuroprosthetic device to achieve usability equivalent to that of a normal limb. Primates can consciously modulate their neural activity to perform tasks, and considerable evidence indicates that neural activity is reasonably plastic; for example, the activity of selected neurons has already been brought under regulation in an operant training paradigm (Patil & Turner, 2008, p. 143).

Several advances in sensory substitution cater not only to the needs of people with disabilities but also to those with normal vision and hearing. For instance, sensory substitution is applied in artistic presentations, augmented reality, and games. Examples include technologies that convert visual stimuli into tactile or auditory signals, and auditory stimuli into tactile signals. According to the National Institute on Deafness and Other Communication Disorders, National Institutes of Health (2009), about 7.5 million U.S. citizens have speech difficulties generally associated with aphasias. Building on advances in cochlear implants, cortical implants could be developed to assist the brain's speech areas and thereby improve speech in people with these impairments.

Another key advance in sensory substitution is tactile-vestibular substitution, which aims to help people who, often as a result of severe reactions to antibiotics, suffer from bilateral vestibular damage (BVD) and cannot control their posture or gait. This application uses a brain-machine interface and a head-mounted accelerometer to generate electrotactile stimulation on the surface of the tongue, relaying head-body orientation information to the wearer so that posture and gait can be adjusted appropriately (Tyler, Danilov & Bach-y-Rita, 2003).
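
A minimal sketch of how head tilt measured by an accelerometer might be mapped to a position on the tongue array is shown below; the ±30° range, the mapping itself, and all names are illustrative assumptions, not the published implementation.

```python
import math

def tilt_to_electrode(ax, ay, az, n=12):
    """ax, ay, az: head-mounted accelerometer readings in g.
    Returns the (row, col) of the tongue-array electrode to activate."""
    pitch = math.atan2(ax, math.sqrt(ay * ay + az * az))  # forward/backward lean, radians
    roll = math.atan2(ay, az)                             # left/right lean, radians

    def to_index(angle, full_range=math.radians(30)):
        # Map +/- 30 degrees of lean onto the array, clamped at its edges.
        return max(0, min(n - 1, int((angle / full_range + 1) * (n - 1) / 2)))

    return to_index(pitch), to_index(roll)

print(tilt_to_electrode(0.0, 0.0, 1.0))    # upright head -> centre of the array
print(tilt_to_electrode(0.5, 0.0, 0.87))   # forward lean -> stimulation shifts forward
```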

Finally, in 2005, research explored magnetic perception through sensory augmentation using a vibrotactile compass belt worn around the waist. The belt's transducers vibrated to indicate magnetic directions such as north (Nagel et al., 2005). Other areas of advancement in this field include self-charging implants, memory off-loading, control of complex machinery, and mood induction. Current experiments indicate that vibrotactile stimuli can activate the auditory cortex, potentially supporting hearing in both normal-hearing and hearing-impaired people; a promising technology called the 'Sense Organs Synthesizer' is already aiding this advancement (Schürmann et al., 2006).
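
As an illustration of the compass-belt idea, the following sketch selects the waist transducer nearest magnetic north from the wearer's heading; the number of transducers and the function name are assumptions, not details reported by Nagel et al. (2005).

```python
def north_transducer(heading_deg, n_transducers=13):
    """heading_deg: the wearer's compass heading (0 = facing magnetic north).
    Returns the index of the belt transducer that should vibrate; index 0 sits
    at the front of the waist and indices increase clockwise around the body."""
    bearing_to_north = (360 - heading_deg) % 360   # where north lies relative to the wearer's front
    return round(bearing_to_north / 360 * n_transducers) % n_transducers

print(north_transducer(0))    # facing north -> the front transducer vibrates
print(north_transducer(90))   # facing east  -> a transducer on the wearer's left vibrates
```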

Modern progress in the tongue-based human-machine interface now offers the opportunity to carry this research further: toward practical, real-world sensory substitution systems, tactile internet-based communication, surgical applications, and many other uses (Bach-y-Rita, Kaczmarek & Tyler, 2003, p. 184). The ultimate goal of research on neuroprosthetic devices is to identify the principles of neural ensemble physiology that would guide the development of cortical neuroprosthetic devices capable of restoring full body mobility to patients with devastating degrees of paralysis resulting from degenerative lesions of the CNS or from trauma (Nicolelis & Lebedev, 2009, p. 533). Sensory prosthetics have their critics, however; one argument holds that 'substitution' is a misleading term, since such devices add a sensory modality rather than replace one (Lenay et al., 2003).


References

Bach-y-Rita, P. (2003). Theoretical basis for brain plasticity after a TBI. Brain Injury, 17(8), 643-651.

Bach-y-Rita, P., & Kercel, S. W. (2003). Sensory substitution and the human–machine interface. Trends in Cognitive Sciences, 7(12), 541–546.

Bach-y-Rita, P., Tyler, M. E., & Kaczmarek, K. A. (2009). Seeing with the brain. International Journal of Human-Computer Interaction, 15(2), 285–295.

Bach-y-Rita, P., Kaczmarek, K. A., & Tyler, M. E. (2003). A tongue-based tactile display for portrayal of environmental characteristics. Wicab, Inc., 169–186.

Chebat et al. (2007). Tactile 'visual' acuity of the tongue in early blind individuals. NeuroReport, 18(18), 1901–1904.

Lenay et al. (2003). Sensory substitution: Limits and perspectives. In Touching for Knowing: Cognitive Psychology of Haptic Manual Perception (pp. 275–292).

Lozano, C. A., Kaczmarek, K. A., & Santello, M. (2009). Electrotactile stimulation on the tongue: Intensity perception, discrimination, and cross-modality estimation. Somatosensory and Motor Research, 26(2–3), 50–63.

Nagel et al. (2005). Beyond sensory substitution – learning the sixth sense. Journal of Neural Engineering, 2(4), R13–R26.

Nicolelis, M. A. L., & Lebedev, M. A. (2009). Principles of neural ensemble physiology underlying the operation of brain–machine interfaces. Nature Reviews Neuroscience, 10(1), 530–539.

Patil, P. G., & Turner, D. A. (2008). The development of brain-machine interface neuroprosthetic devices. Neurotherapeutics: The Journal of the American Society for Experimental NeuroTherapeutics, 5(1), 137–146.

Sampaio, E., Maris, S., & Bach-y-Rita, P. (2001). Brain plasticity: 'Visual' acuity of blind persons via the tongue. Brain Research, 908, 204–207.

Schürmann et al. (2006). Touch activates human auditory cortex. NeuroImage, 30(1), 1325–1331.

Tyler, M., Danilov, Y. & Bach-y-Rita, P. (2003). Closing an open-loop control system: vestibular substitution through the tongue. Journal of Integrative Neuroscience, 2(1), 159-164.

National Institute on Deafness and Other Communication Disorders, National Institutes of Health. (2009, June 18). Retrieved December 16, 2012, from http://www.nidcd.nih.gov/health/statistics/vsl.asp
