By Devin Powell
SEATTLE — About a year and a half after her stroke, a 36-year-old professor started to feel sounds. A radio announcer’s voice made her tingle. Background noise in a plane felt physically uncomfortable.
Now Tony Ro, a neuroscientist at the City College of New York and the Graduate Center of the City University of New York, might have figured out the cause of this synesthesia. Sophisticated imaging of the woman’s brain revealed that new links had grown between its auditory region, which processes sound, and its somatosensory region, which handles touch.
“The auditory area of her brain started taking over the somatosensory area,” says Ro, who used diffusion tensor imaging, which focuses on the brain’s white matter connections, to spot the change.
This connection between sound and touch may run deep in the rest of us as well, Ro and colleagues said during presentations May 25 at a meeting of the Acoustical Society of America. Both hearing and touch, the scientists pointed out, rely on nerves set atwitter by vibration. A cell phone set to vibrate can be sensed by the skin of the hand, and the phone’s ring tone generates sound waves — vibrations of air — that move the eardrum.
Elizabeth Courtenay Wilson, a neuroscientist who did not attend the Seattle meeting, has also seen strong connections between areas of the brain that process hearing and touch. “We’re suggesting that the ear evolved out of the skin in order to do more finely tuned frequency analysis,” says Wilson, of Beth Israel Deaconess Medical Center in Boston.
Wilson earned her Ph.D. in an MIT laboratory that studied whether vibrations could boost hearing aid performance. She published a series of papers showing that people with normal hearing were much better at detecting an extremely weak sound paired with an extremely weak vibration applied to the skin than at detecting either stimulus on its own.
Other researchers have shown that hearing a sound can boost touch sensitivity. Ro calls this the mosquito effect: The bug’s buzz makes our skin prickle. The frequency of the sound and the frequency of the vibrations our hands feel must match for this to work, according to a 2009 paper he published in Experimental Brain Research.
Frequency may be the common thread that unites these two senses in the brain, and the influence runs both ways, says Jeffrey Yau, a neuroscientist at the Johns Hopkins University School of Medicine in Baltimore. A vibration with a higher or lower frequency than a simultaneous sound, he found, tends to skew pitch perception up or down. Sounds can also bias whether a vibration is perceived.
The ability of skin and ears to confuse each other also extends to volume, Yau said at the meeting. A car radio may sound louder to a driver than to the passengers because of the shaking of the steering wheel.
“As you make a vibration more intense, what people hear seems louder,” says Yau. Sound, on the other hand, doesn’t seem to change how intense vibrations feel.
Functional MRI scans of people’s brains have shown that the auditory region can activate during a touch, and some researchers speculate that chunks of brain specialized to process frequency may play a role in crossing the wires. But, like touch and sound themselves, the question of exactly where these two senses come together in the brain remains a muddle.