Hearing Red – Sensory Substitution With Your Smartphone
by Sara Adaes, PhD | February 21, 2015
Sensory substitution is a means of using input from one sensory modality to acquire information normally gathered by another. It is most often applied to acquiring visual information through auditory or tactile stimuli. Sensory substitution devices convert one type of sensory signal into another, and they can greatly improve the quality of life of individuals with some form of sensory loss, particularly blindness.
Sensory substitution as a therapy for patients with sensory impairments was first introduced in the late 1960s by the American neuroscientist Paul Bach-y-Rita. At the time, Bach-y-Rita and colleagues developed a sensory substitution apparatus consisting of a small camera attached to the subject’s head that sent its image to a 10 in² (64.5 cm²) tactile stimulation device in contact with the skin of the back. The device was composed of 400 stimulators in a 20×20 grid, each representing a small area of the captured image. The results were remarkable, with subjects being able to recognize many common objects within just a few hours of training. Longer training even allowed subjects to recognize faces and acquire spatial perception – a striking example of the brain’s plasticity.
In recent years, visual-to-auditory sensory substitution has gained ground, with different sensory substitution devices and software being made available. These artificial synesthesia devices convert live visual images into auditory signals. Some of them are free mobile apps whose efficacy has been scientifically demonstrated.
One of them is “The vOICe” (the capitalized letters read “Oh, I see”). It was developed in 1992 by Peter Meijer, a Dutch scientist, and has been upgraded ever since. It is freely available for Android.
The vOICe converts a visual image to an auditory signal by translating live images from a smartphone camera into “soundscapes” that are transmitted to the user via headphones. The image is scanned from left to right, so horizontal position is conveyed by time, while pitch conveys height and loudness conveys brightness. A rising bright line therefore sounds as a rising tone, a bright spot as a beep, a bright filled rectangle as a noise burst, and a vertical grid as a rhythm, for example. The app also includes many other features that assist in this process of providing artificial vision. It is complex and requires training, but the brain’s plasticity ultimately makes it work.
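The basic mapping can be illustrated with a minimal Python sketch. This is not The vOICe’s actual algorithm – the frequency range, scan duration, and sample rate below are illustrative assumptions – but it shows the core idea: scan columns left to right, and mix one sine tone per bright pixel, with row position setting pitch and brightness setting loudness.

```python
import math

def column_to_tone(column, duration=0.02, sample_rate=8000,
                   f_min=200.0, f_max=2000.0):
    """Mix one sine tone per pixel in a column: higher rows map to
    higher pitch, brighter pixels to louder tones (values 0-255)."""
    n_rows = len(column)
    samples = [0.0] * int(duration * sample_rate)
    for row, brightness in enumerate(column):
        if brightness == 0:
            continue  # dark pixels are silent
        # row 0 is the top of the image, so it gets the highest frequency
        frac = 1.0 - row / max(n_rows - 1, 1)
        freq = f_min * (f_max / f_min) ** frac  # log-spaced pitch scale
        amp = brightness / 255.0
        for i in range(len(samples)):
            samples[i] += amp * math.sin(2 * math.pi * freq * i / sample_rate)
    return samples

def image_to_soundscape(image):
    """Scan columns left to right and concatenate their tones,
    so horizontal position becomes time."""
    n_rows, n_cols = len(image), len(image[0])
    soundscape = []
    for col in range(n_cols):
        column = [image[row][col] for row in range(n_rows)]
        soundscape.extend(column_to_tone(column))
    return soundscape
```

With this scheme, a bright diagonal line rising from left to right produces a sequence of tones of increasing pitch – the “rising tone” described above.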
Experimental research has shown that it works. One study even found that congenitally, fully blind adults were able to reach the highest acuity reported at the time with any visual rehabilitation approach, including those costing thousands of dollars. This shows the incredible potential of sensory substitution devices as inexpensive and non-invasive rehabilitation tools.
EyeMusic also creates soundscapes from live smartphone camera images, but it includes musical cues for colors, hence its name. EyeMusic represents colors using different musical instruments. It also uses higher notes on a given musical instrument to translate higher pixels of the image and lower notes to translate lower pixels; pixels closer to the left side of the image are heard before pixels closer to the right side of the picture. It thus allows shapes and colors to be heard.
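EyeMusic’s mapping can be sketched as a conversion from pixels to timed note events. The instrument palette and note spacing below are hypothetical placeholders (EyeMusic’s actual color-to-instrument assignments and musical scale differ); the sketch only illustrates the scheme the paragraph describes: column index becomes onset time, row index becomes pitch, and color selects the instrument.

```python
# Hypothetical color-to-instrument palette for illustration only.
INSTRUMENTS = {"white": "choir", "blue": "trumpet", "red": "organ",
               "green": "reed", "yellow": "strings"}

def image_to_notes(image, scale=(0, 2, 4, 7, 9), base_midi=60,
                   col_duration=0.1):
    """Turn a grid of color names (None = dark pixel) into note events:
    left columns sound first, higher rows get higher notes."""
    notes = []
    n_rows = len(image)
    for row_idx, row in enumerate(image):
        for col_idx, color in enumerate(row):
            if color is None:
                continue
            # top rows (small row_idx) map to higher scale degrees
            degree = n_rows - 1 - row_idx
            midi = (base_midi + 12 * (degree // len(scale))
                    + scale[degree % len(scale)])
            notes.append({"time": col_idx * col_duration,
                          "pitch": midi,
                          "instrument": INSTRUMENTS.get(color, "piano")})
    return sorted(notes, key=lambda n: n["time"])
```

Feeding the resulting note list to any MIDI or software synthesizer would then let shapes and colors be heard, as described above.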
Amedi’s group showed that, by using EyeMusic, blind subjects were able to learn to read with sounds topographically representing visual images of letters, and also to recognize soundscapes of complex object categories such as faces, houses, and body parts. As Amir Amedi states:
“The general concept is that you don’t need to teach each object individually, you teach the principles—just like the brain understands the principles of dots and lines and how to combine them.”
And again, in case it went unnoticed: these apps are free! Why? Because, as the developers of The vOICe state:
“our foremost goal is to make a real change by lowering barriers to use as much as we can.”
Even sighted users can try these apps. Learn how to see with sound – it’s a great exercise for your brain’s plasticity.
Abboud S, Hanassy S, Levy-Tzedek S, Maidenbaum S, & Amedi A (2014). EyeMusic: Introducing a “visual” colorful experience for the blind using auditory sensory substitution. Restorative neurology and neuroscience, 32 (2), 247-57 PMID: 24398719
Bach-y-Rita P, Collins CC, Saunders FA, White B, & Scadden L (1969). Vision substitution by tactile image projection. Nature, 221 (5184), 963-4 PMID: 5818337
Levy-Tzedek S, Hanassy S, Abboud S, Maidenbaum S, & Amedi A (2012). Fast, accurate reaching movements with a visual-to-auditory sensory substitution device. Restorative neurology and neuroscience, 30 (4), 313-23 PMID: 22596353
Haigh A, Brown DJ, Meijer P, & Proulx MJ (2013). How well do you see what you hear? The acuity of visual-to-auditory sensory substitution. Frontiers in psychology, 4 PMID: 23785345
Meijer PB (1992). An experimental system for auditory image representations. IEEE transactions on bio-medical engineering, 39 (2), 112-21 PMID: 1612614
Striem-Amit E, Cohen L, Dehaene S, & Amedi A (2012). Reading with sounds: sensory substitution selectively activates the visual word form area in the blind. Neuron, 76 (3), 640-52 PMID: 23141074
Striem-Amit E, Guendelman M, & Amedi A (2012). ‘Visual’ acuity of the congenitally blind using visual-to-auditory sensory substitution. PloS one, 7 (3) PMID: 22438894