Touch may influence what we are hearing

By Carolyn Y. Johnson
Globe Staff / November 26, 2009

Listening is more than a matter of being “all ears.” People can also hear with their skin, according to new research that deepens our understanding of the senses, showing they can work together but also override one another.

Strange though it seems, scientists are finding that multiple senses contribute to even the simplest perceptions. People can see with their ears, hear with their eyes, or hear through touch.

In the work published yesterday in the journal Nature, researchers found they could influence what people hear by delivering puffs of air to the back of the hand or the neck. By demonstrating that the perception of speech is affected by touch, the experiment raises the possibility that one sense could substitute for another, creating new ways for deaf people to hear. Researchers at MIT are already using this basic idea to develop technology that could one day assist people with hearing impairments.

“This study is part of us . . . reconsidering how humans perceive the world, how humans interact with the world,” said coauthor Bryan Gick, a phonetician at the University of British Columbia and senior scientist at Haskins Laboratories, a speech research think tank in New Haven.

For years, scientists have known that watching another person speak can affect what we hear. In a well-known phenomenon called the McGurk effect, a person who listens to audio of someone saying “ba ba ba,” while watching another person’s lips forming the words “ga ga ga,” hears something in between: “da da da.”

Now, Gick is exploring whether touch also affects hearing. In the experiment, subjects heard the sounds “pa” or “ba” and “ta” or “da.” Sometimes, participants received a puff of air on the back of their hand or neck when the words had an aspirated sound - a sound like “pa” or “ta” that requires the speaker to expel a puff of air. (Hold your hand to your mouth and say “pa” and compare it with “ba” to feel the difference.) Other times, they got the reverse: a puff of air when they heard “ba” or “da” - non-aspirated sounds.

The researchers found that when the puff of air was paired with an aspirated word, people got better at identifying the sound. When the puff of air was paired with “ba” or “da,” accuracy declined.
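
To make the pairing concrete, here is a minimal sketch, in Python, of how such trials might be scored. It is an illustration only: the trial records and field names below are hypothetical, not the study’s actual data or analysis code.

    # Illustrative only: scoring accuracy when an air puff is paired with
    # aspirated ("pa", "ta") versus non-aspirated ("ba", "da") sounds.
    ASPIRATED = {"pa", "ta"}

    # Hypothetical trials: (sound played, puff delivered?, listener's response)
    trials = [
        ("pa", True, "pa"),   # congruent: puff accompanies an aspirated sound
        ("ta", True, "ta"),
        ("ba", True, "pa"),   # incongruent: puff biases "ba" toward "pa"
        ("da", True, "ta"),   # incongruent: puff biases "da" toward "ta"
    ]

    def pairing(sound, puff):
        """Congruent if the puff matches the sound's natural burst of air."""
        return "congruent" if puff == (sound in ASPIRATED) else "incongruent"

    counts = {"congruent": [0, 0], "incongruent": [0, 0]}  # [correct, total]
    for sound, puff, response in trials:
        tally = counts[pairing(sound, puff)]
        tally[0] += response == sound
        tally[1] += 1

    for condition, (right, total) in counts.items():
        print(f"{condition}: {right}/{total} correct")

On real responses, the pattern the researchers report would appear as higher accuracy on the congruent trials than on the incongruent ones.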

“This is a very intriguing finding, raising lots of theoretical possibilities and future studies,” said Shinsuke Shimojo, head of a psychophysics laboratory at the California Institute of Technology. He said it was interesting that the puff seemed to be an implicit signal for one sound over another, and that it worked at both spots on the body. The study participants were not told or taught that the puff was meant to signal a certain sound, and the researchers tried to choose spots on the body that probably do not feel aspirated puffs when people are speaking.

Gick plans to use brain imaging to reveal what is happening in the brains of people who “hear” puffs of air. His experiment supports the idea that people’s brains integrate information from touch with sounds that they hear. In a 2006 study from Finland, researchers used brain imaging to study 13 subjects and found that touch activated the auditory cortex, a part of the brain involved in hearing.

Charlotte Reed, a senior research scientist at the Massachusetts Institute of Technology, specializes in the ways touch can be used to interpret speech, studying deaf-blind people who learn the Tadoma method - a way of learning to talk and hear by placing a hand on a speaker’s neck and mouth.

“We know the auditory and tactile senses interact,” Reed said. She pointed out that the new study certainly shows touch can give a conflicting signal, making someone hear a “ba” as a “pa,” but also that a person’s accuracy in understanding a sound can be enhanced. Using touch to enhance hearing is the basic idea behind her research, which uses a machine called the “tactuator” - three vibrating, moving prongs, one each for the thumb, forefinger, and middle finger to rest on - to turn speech into something people feel.

The idea is to create an aid for lip-reading, and eventually to use the information gleaned from the bulky, three-pronged device to create software that could one day turn a simple device like a cellphone into a prosthetic for deaf people. The microphone on a cellphone could translate a speaker’s voice into tactile signals, helping a listener understand the speaker while lip-reading.
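
As a rough sketch of that pipeline - and only a sketch, since the article does not describe how the tactuator or any future cellphone software actually encodes speech - a program could split each short slice of microphone audio into a few frequency bands and map each band’s energy to a vibration level for one prong. The band edges, frame length, and compression below are arbitrary assumptions, written in Python with NumPy.

    import numpy as np

    # Three bands, one per prong (thumb, forefinger, middle finger).
    # The edges, in hertz, are arbitrary choices for illustration.
    BANDS = ((0, 500), (500, 2000), (2000, 8000))

    def band_energies(frame, rate):
        """Energy of one audio frame in each frequency band."""
        spectrum = np.abs(np.fft.rfft(frame)) ** 2
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / rate)
        return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                         for lo, hi in BANDS])

    def drive_levels(energies):
        """Compress band energies into 0..1 vibration intensities."""
        logs = np.log10(energies + 1e-9)
        return (logs - logs.min()) / (logs.max() - logs.min() + 1e-12)

    # Demo on a synthetic 40-millisecond frame: a low tone plus a high tone.
    rate = 16000
    t = np.arange(int(0.04 * rate)) / rate
    frame = np.sin(2 * np.pi * 200 * t) + 0.3 * np.sin(2 * np.pi * 3000 * t)
    print(drive_levels(band_energies(frame, rate)))  # one level per prong

A working aid would repeat this frame by frame, driving each prong at its level as the person speaks.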

Research on how the senses intersect in productive and conflicting ways goes beyond touch and speech.

A study published earlier this year showed that people also hear through their faces: stretching the facial skin into shapes normally associated with speech made listeners hear words differently. For example, when the researchers played an ambiguous word, people were more likely to hear it as “head” when their facial skin was stretched upward, and as “had” when it was stretched downward.

Caltech’s Shimojo studies vision and has demonstrated that people shown one flash of light perceive two flashes if they also hear two beeps.

“It’s starting to look more and more like we’re perceiving machines,” Gick said. “Are we physically constructed to take in information, irrespective of where it’s coming from or what form it’s in?”

Carolyn Y. Johnson can be reached at cjohnson@globe.com.