Touch and sight push each other around
When the fingers feel downward motion, the eyes see upward motion
What people see sometimes depends on what they just felt, a study appearing online April 9 in Current Biology shows. A light ripple of pins moving up the fingertip tricked study participants into perceiving lines on a screen as moving down, and vice versa, suggesting that vision and touch are integrated in the human brain.
“I think this is an important advancement,” comments Jon Kaas, a neuroscientist at Vanderbilt University in Nashville. “It changes our way of thinking about how the brain works.”
For decades, instructors in medical schools have taught students that the senses — including vision, touch and sound — are interpreted in different, discrete parts of the brain, says Michael Beauchamp of the University of Texas Medical School at Houston. “Now it turns out what we’re teaching them is wrong,” he says. “There’s a lot more cross talk between the modalities.”
Recent studies have shown a close link between hearing and vision. To understand the presumed links between touch and vision, researchers led by Talia Konkle and Christopher Moore of the Massachusetts Institute of Technology in Cambridge employed a trick of perception called an aftereffect.
Watching a waterfall causes one well-known motion aftereffect. After a person stares at the falling water long enough, the stationary rocks at the bottom appear to float upward. The predominant explanation, Moore says, involves brain fatigue. Some nerve cells in the brain detect things moving down, while others detect things moving up. The “down” neurons get tired when a person stares at water plunging downward, but the “up” neurons remain fresh and active, causing the rocks to appear to drift upward.
To study motion aftereffects caused by touching, Moore and his colleagues used a stamp-sized gadget made of 60 small pins in rows. The participants rested a finger on top of the gadget, which sat on the table in front of them. Raising rows of pins at different times created a gentle prodding motion either away from (up) or toward the person (down).
After feeling motion in either direction on the fingertip for 10 seconds, study participants looked at a commonly used pattern of black and white horizontal lines on a computer screen. The black and white lines repeatedly traded places, so their overall motion could be interpreted as either up or down, even though there was no net movement. Typically, people are split evenly on which way they perceive the lines to be moving.
But people who had felt the pins moving up the fingertip interpreted the ambiguous lines on the screen as moving down, and vice versa. The touch conditioning created a visual aftereffect.
“I can basically make you see things that weren’t there by touching your hand,” Moore says.
The researchers also found that vision created a touch aftereffect: People who watched lines on a screen move up were more likely to interpret patternless touches from the pin rows as moving down.
“These two things [touch and vision] are able to push each other around more than we thought,” suggesting that vision and touch are deeply entwined in the brain, Moore says. Imaging studies could reveal where in the brain this integration takes place.