Joined at the Senses
Perception may feast on a sensory stew, not a five-sense buffet
By Bruce Bower
Many adults with hearing loss receive implants of small devices that stimulate the inner ear and permit a person at least partial reentry into the world of sound. But there’s more to these cochlear implants than meets the ear. The remarkable gadgets improve vision along with hearing.
That, at any rate, is the implication of a new study directed by neuroscientist Anne-Lise Giraud of Johann Wolfgang Goethe University in Frankfurt, Germany. Giraud’s team examined 18 adults for up to 3 years after they received cochlear implants. The researchers also examined 18 volunteers who had no hearing problem.
As participants listened to syllables, words, and various noises, Giraud and her colleagues used positron emission tomography (PET) scans to follow ebbs and peaks in neural activity.
As expected, several brain regions that deal with incoming sounds–part of a larger swath of tissue called the auditory cortex–became more active the longer people used implants. Syllables and words elicited stronger neural responses than noises did, a sign of the brain’s improved ability to unravel speech sounds.
That wasn’t all, though. Meaningful sounds also yielded intensified reactions in the visual cortex of implant users. Normal-hearing adults displayed no such responses. This finding may reflect the implant users’ learning to use lip reading–a visual skill–to augment what they heard during conversations, the researchers suggest in the June Neuron.
Lip reading and other uses of sight don’t compensate for lingering hearing problems in implant users, in Giraud and her coworkers’ view. Instead, they say, the visual cortex “becomes more tightly tuned to meaningful sounds.” In other words, the brain’s sight and hearing centers form a new cooperative venture dedicated to speech perception. The team’s findings join a rapidly growing number of studies that tap into the brain’s sensory cross talk.
For many neuroscientists, these investigations light the way toward a solution to the fundamental mystery of how the brain unites separate sensations into multifaceted experiences. In their view, the brain somehow assembles separate sights, sounds, and other sensations into reasonable approximations of what’s out there–say, a piano sonata wafting across a concert hall or flashing red lights in the rearview mirror of a moving car.
Other investigators use Giraud and others’ evidence on sensory mixing, however, to challenge mainstream thinking about perception. It’s time to drop the assumption that people and other animals perceive the world through separate sensory channels of vision, hearing, touch, taste, and smell, they argue.
Perception isn’t grounded in simply seeing what is visible or hearing what is audible, in this scenario. Instead, a person simultaneously exploits different sources of energy, such as light radiation and acoustic pressure waves, as he or she strives to attain certain goals, such as understanding speech. If so, then full-blown perceptions bloom in the brain without its having to assemble separately processed sensory bits and pieces into a facsimile of reality.
“Interactions across sensory receiving areas in the cortex may be more widespread than previously suspected,” remarks neuroscientist Robert J. Zatorre of McGill University in Montreal, who remains neutral on how such perception actually works. These intriguing partnerships underscore the need to move beyond “simple models of cerebral organization,” he says.
Multi-talented neurons
Scientists’ appreciation for brain cells that respond to two or more types of sensory input has expanded dramatically in the past several years. Researchers refer to such cells as multisensory, or multimodal, neurons.
Consider the primary visual cortex. In 1997, researchers reported that this neural gateway for visual information in sighted people also boosts its activity when people who have been blind since childhood read Braille and make other discriminations with their sense of touch.
Moreover, blind volunteers suddenly developed problems in reading Braille and performing other tactile tasks when researchers used a magnetic device outside volunteers’ heads to temporarily disrupt nerve transmissions in the visual cortex.
In blind people, cells that usually focus on vision are drafted into tactile service and promote a heightened sense of touch relative to that of sighted folk, the scientists theorized. Other investigators find that tactile stimulation on the tongue, of all places, can partially restore eyesight to blind people (SN: 9/1/01, p. 140).
The visual cortex has its own touching effects on sighted people, as well, according to a 1999 report. Investigators found that parts of the visual cortex revved up as volunteers tried to discern the orientation of raised ridges on a pad that they could feel but not see. Temporary interruption of visual brain activity rendered volunteers unable to get a feel for ridge orientations.
The visual cortex may have enabled the participants to use their mind’s eye to visualize the direction or arrangement of the ridges they touched. Most participants who successfully detected the orientations noted later that they had pictured the raised areas as they felt them.
Mental imagery draws the senses together in other ways. For instance, when people think about a visual scene with their eyes closed, activated brain areas include the visual cortex; if they silently hum a tune in their head, the auditory cortex begins to bustle.
The approach of potential dangers may engage a unique set of multisensory structures. The human brain contains a system of areas that react with equal vigor to nearby visual, tactile, and auditory stimuli that are moving, according to a report in the January Neuron.
A team led by Frank Bremmer of Ruhr University in Bochum, Germany, conducted brain scans on eight adults during three separate experiences: first, a stream of air blew across the subjects’ faces; then they watched moving dots on a screen; and finally they wore stereo headphones that made a sound seem to move back and forth in front of their faces.
Each trial elicited elevated activity in a separate area for touch, sight, or sound in the brain’s outer layer, the cortex. Moreover, three other interconnected regions of the cortex displayed comparable activity boosts during all trials. Similar cross-sensory brain regions attuned to nearby movements that pose potential threats have been found in monkeys.
These multisensory areas possibly detect the approach of threatening objects and coordinate defensive reactions, comments Michael S.A. Graziano of Princeton University. He adds that the finding joins a growing body of research on the integration of the senses with each other and with the control of movement.
The amygdala, an almond-shaped brain region located below the cortex, also integrates various sensations related to fear and danger, according to a report in the Aug. 14 Proceedings of the National Academy of Sciences.
Brain scans of 12 volunteers showed increased activity in the amygdala as they looked at faces with fearful expressions and heard an actor read sentences in a fearful tone. The same structure stayed relatively calm during presentations of other types of emotional faces and voices, as well as when participants saw clashing combinations, such as a fearful face with a happy voice.
Sensory blending in the amygdala underlies adults’ previously noted ability to recognize the emotion in a scared-looking face more adeptly if they simultaneously hear a fearful voice, concludes the team led by neuroscientist Raymond J. Dolan of University College London.
Bundles of information
People and other animals routinely perceive useful bundles of sensory information without needing first to process separate sensations, argues psychologist Thomas A. Stoffregen of the University of Minnesota in Minneapolis-St. Paul. What matters is the task at hand, in his view. As a person pursues an activity, sense organs tap simultaneously into related forms of energy in the environment that get translated into an integrated form of brain activity, Stoffregen theorizes.
It’s time to stop thinking of perception as a process grounded in separate domains of sight, sound, touch, taste, and smell, argues Stoffregen. Each species exploits patterns of information from the electromagnetic spectrum, acoustic waves, and other energy sources in its own way.
He and Benoît G. Bardy of the University of Paris presented the case for this radical revision of perceptual theory in the April Behavioral and Brain Sciences.
“Multisensory perception is not merely the primary type of perception, it’s the only type of perception,” Stoffregen holds. “Give a brain cell a chance to deal with different sensory stimuli, and it usually shows a wide range of responses.” Cells in some brain areas respond strongly to a single type of sensory input, but many, he argues, also react to input from multiple systems.
Simply to turn one’s head and walk away without tipping over requires the coordination of what one sees, hears, and feels, at the very least, he asserts. This sensory merger is more than the sum of its parts. It can’t be meaningfully portrayed as a pie chart with slices of vision, sound, and so on.
Speech perception, too, hinges on integrated versions of what’s seen and heard, Stoffregen says. He illustrates this point by reconsidering a strange laboratory finding dubbed the McGurk effect, after its discoverer, developmental psychologist Harry McGurk.
In studies of this effect, volunteers watch a videotape showing a speaker’s face as he or she utters a syllable, while they hear a different syllable on the sound track. When asked what they heard, participants often report a syllable that was neither shown on the videotape nor uttered on the sound track.
For instance, people who see the mouth movements for “g” at the beginning of a syllable while hearing a voice say “b” frequently say that the syllable started with a “d.”
Researchers have typically explained the McGurk effect as a neural compromise. The brain infers a sound’s identity by comparing conflicting lines of sensory information and coming up with an in-between sound.
In contrast, Stoffregen argues that this effect derives from an unfamiliar pattern of visual and auditory information. Neurons capable of handling sight and sound information cooperatively discern what’s being said rather than separately concentrating on what’s visible and what’s audible. As a result, the Minnesota researcher proposes, participants’ unified perception of the syllable differs from the sound track alone.
Given more experience with mismatching lip movements and speech sounds, volunteers would generate more-nuanced perceptions as they picked up on a wider array of sensory cues, Stoffregen predicts. He speculates that eventually, they would notice the disparities in visible and audible syllables.
Separate channels
Stoffregen and Bardy’s effort to replace independent senses with multisensory patterns provokes controversy among neuroscientists. “It’s like they’re telling me that I don’t have toes, when I can look down and see that I do,” comments Barry Stein of Wake Forest University in Winston-Salem, N.C.
Much evidence indicates that visual, acoustic, and other sensory information passes through separate neural channels before entering multisensory structures in the brain, Stein says. These sensory crossroads weave diverse sensory strands into a tapestry of perceived objects and scenes, he theorizes.
For instance, Stein and his coworkers have demonstrated that tactile and visual information converge from different directions on the superior colliculus, a midbrain region packed with multisensory cells.
Teija Kujala of the University of Helsinki agrees that sensory systems are often entwined in largely unappreciated ways. “However, there are limits on cross-sensory representations,” Kujala adds. “The senses are also in many respects distinct.”
In support of this view, she notes that while the visual cortex proves critical for detecting the orientation of ridges that can be felt but not seen, it shows no sign of influencing the capability to tell one texture from another–such as a smooth surface from a rough one.
In contrast, some investigators of infant perception staunchly support Stoffregen and Bardy’s approach. Tests on infants illustrate that the perceptual system deals with global and multisensory types of information beginning early in life, says psychologist Arlene Walker-Andrews of Rutgers University in New Brunswick, N.J.
The integration of seeing, hearing, and other senses enables babies to generate practical, action-oriented perceptions, in her opinion. In one study, 5-month-olds who heard an increasing or decreasing engine noise spent much of their time looking at videotapes of a train moving toward or away from them in step with the sound. Infants largely ignored other videotapes in which, for example, an approaching train was accompanied by diminishing engine noise or a train changed in size without moving as engine noise increased.
Babies perceive phenomena such as the synchronized timing of events, the tempo of action, rhythms, and the spatial location of objects, says psychologist Lorraine E. Bahrick of Florida International University in Miami. These perceptions thrive on multiple sensory cues, according to her research.
So, for example, 5-month-olds stop what they’re doing and look toward a hammer that they can see rhythmically banging in concert with its impact sounds. The same infants show no interest if they only see or hear the hammer beat out a rhythm.
The proposition that perception rests on a foundation of unified sensations is far from confirmed. But its proponents plan to continue hammering away at the popular notion of independent senses.
“The question of whether [unified sensory] information actually exists and is used by perceptual systems matters very much to both theory and research,” says psychologist John B. Pittenger of the University of Arkansas at Little Rock. “We need to stop assuming that the answer is self-evident.”