We hear with our ears, right? Yes, but scientists have known for years that we also hear with our eyes. In a landmark study published in 1976, researchers found that people integrated both auditory cues and visual ones, like mouth and face movements, when they heard speech.
That study, and many that followed, raised a fundamental question about speech perception: If humans can integrate different sensory cues, do they learn to do so through experience (by seeing countless speaking faces over time), or has evolution hard-wired them to do it?
A new study that looks at a different set of sensory cues adds to a growing body of evidence that suggests such integration is innate. In a paper in Nature, Bryan Gick and Donald Derrick of the University of British Columbia report that people can hear with their skin.
The researchers had subjects listen to spoken syllables while hooked up to a device that would simultaneously blow a tiny puff of air onto the skin of their hand or neck. The syllables included “pa” and “ta,” which produce a brief puff from the mouth when spoken, and “da” and “ba,” which do not produce puffs. They found that when listeners heard “da” or “ba” while a puff of air was blown onto their skin, they perceived the sound as “ta” or “pa.”
Dr. Gick said the findings were similar to those from the 1976 study, in which visual cues trumped auditory ones — subjects listened to one syllable but perceived another because they were watching video of mouth movements corresponding to the second syllable. In his study, he said, cues from sensory receptors in the skin likewise trumped auditory ones. “Our skin is doing the hearing for us,” he said.
Dr. Gick noted that it would normally be rare for someone to actually sense a puff of air produced by another speaker, although people might occasionally sense their own puffs. Either way, he said, the stimulus is very subtle, “which suggests it is very powerful.”
“What’s so persuasive about this particular effect,” he added, “is that people are picking up on this information that they don’t know they are using.” That supports the idea that integrating different sensory cues is innate.
Dr. Gick said the finding also suggested that other sensory cues might be at work in speech perception — that, as he put it, “we are these fantastic perception machines that take in all the information available to us and integrate it seamlessly.”