Saturday, November 28, 2009

I Hear You

 "Touch a scientist and you touch a child."
-- Ray Bradbury




One Sense Can Be Deceiving


In 1976, scientists demonstrated the importance of the eyes to the sense of hearing by showing that the eyes can fool the ears, a peculiar phenomenon now known as the McGurk effect. (Harry McGurk and John MacDonald, "Hearing Lips and Seeing Voices," Nature, 1976) The effect shows that visual articulatory information is integrated into the perception of speech automatically and unconsciously: people cannot help but fold what they see of a speaker's lips into what they "hear." Which syllable people perceive depends on the relative strength of the auditory and visual information, and on whether some compromise between the two can be reached. (Audiovisual Speech Web Lab)


When participants watched a video of a person saying "ga" while the audio played "ba," they reported hearing a third sound entirely: "da." Now, by mixing audio with the tactile sense of airflow, researchers have found that our perception of certain sounds relies, in part, on being able to feel them.
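To see how a compromise percept like "da" can emerge from two conflicting cues, consider a minimal Python sketch of weighted likelihood fusion over candidate syllables. All numbers and weights below are invented for illustration; this is not the model used in the cited research.

```python
# Illustrative sketch: fuse auditory and visual evidence over candidate
# syllables with a weighted product of likelihoods. Every number here is
# made up for demonstration purposes.

CANDIDATES = ["ba", "da", "ga"]

# Hypothetical likelihoods: how well each candidate explains each cue.
# The audio track is "ba", so acoustically "ba" fits best and "da" is close.
audio_likelihood = {"ba": 0.70, "da": 0.25, "ga": 0.05}
# The video shows "ga"; visually "ga" fits best and "da" is close,
# while "ba" fits poorly because it requires visibly closed lips.
visual_likelihood = {"ba": 0.02, "da": 0.38, "ga": 0.60}

def fuse(audio, visual, w_audio=0.5, w_visual=0.5):
    """Combine the two cues with a weighted product and renormalize."""
    scores = {s: (audio[s] ** w_audio) * (visual[s] ** w_visual)
              for s in CANDIDATES}
    total = sum(scores.values())
    return {s: v / total for s, v in scores.items()}

posterior = fuse(audio_likelihood, visual_likelihood)
print(max(posterior, key=posterior.get))  # -> "da", the best compromise
```

Neither cue favors "da" on its own, yet "da" wins the fused score because it is the only candidate that explains both reasonably well; down-weighting one channel (say, noisy audio) shifts the winner, mirroring the claim that the percept depends on the strength of each cue.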



Do Only Ears Hear?

Natalie James (www.themoneytimes.com, November 26, 2009) reports that new Canadian research shows humans hear not just with their ears but also with their skin and hair follicles: sensations on the skin surface help listeners understand what is being said. The research was carried out by Bryan Gick and colleagues at the University of British Columbia in Vancouver. (Nature, November 26, 2009)

The brain gathers information from the various senses and integrates it into a single picture of a person's surroundings. "It gets integrated into a single event in your mind," Gick told Nature.

Gick says people have traditionally believed that "we see with our eyes, and we hear with our ears, and we feel with our skin and so on, and that there are parts of the brain that only deal with seeing and only deal with hearing." (Richard J. Dalton Jr., Canwest News Service, November 27, 2009) "Recent research, including ours, has been leading in a different direction," Gick continues. "We are naturally multimodal, very versatile perceivers, and we can use any part of our body to pick up information about objects and events around us in our environment."

Gick's team found that when a puff of air blown onto the neck was paired with an aspirated syllable such as "pa" or "ta," people got better at identifying the sound. When the puff was paired with the unaspirated "ba" or "da," accuracy declined. "This is a very intriguing finding, raising lots of theoretical possibilities and future studies," says Shinsuke Shimojo, head of a psychophysics laboratory at the California Institute of Technology, who finds it interesting that the puff seems to act as an implicit signal for one sound over another. (Carolyn Y. Johnson, www.boston.com, November 26, 2009)

"Standing a foot or closer to someone speaking normally should produce tactile puffs," Gick says. However, if the conversation were taking place on a windy street, this sensory input would be destroyed. Carina Storrs (www.scientificamerican.com, November 26 2009) concludes that although people can hear sounds in the absence of airflow, these sensory cues could make it easier to distinguish between two words, such as "tall" and "doll," especially if there is a lot of ambient noise.



What Applications Might the Research Have?

The feel of sound could be built into devices for groups such as the hearing impaired. Gick is in the early stages of exploring how to incorporate airflow-detecting sensors into hearing aids; the sensors would produce a synthetic puff against the side of the neck. Because the skin mechanoreceptors of the hearing impaired typically function normally, Gick says, this additional tactile stimulus could help the wearer perceive sounds. A similar concept could aid pilots in their noisy work environments.
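As a rough illustration of the kind of processing such a device might perform, here is a speculative Python sketch: it scans incoming audio for the short, noisy burst that marks an aspirated consonant and fires a puff actuator. The frame size, the threshold, and the trigger_puff interface are all invented placeholders, not features of Gick's prototype.

```python
# Speculative sketch of an aspiration-triggered tactile aid. The detection
# heuristic (short-time energy above ~4 kHz, where aspiration noise sits)
# and the actuator interface are invented for illustration only.

import numpy as np

SAMPLE_RATE = 16_000            # samples per second
FRAME = 320                     # 20 ms analysis frames
BURST_THRESHOLD = 1.0           # arbitrary; would be tuned per microphone

def trigger_puff():
    """Placeholder for hardware that releases a small puff of air
    against the side of the wearer's neck."""
    print("puff!")

def high_band_energy(frame):
    """Mean spectral power above 4 kHz for one frame."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    return float(np.mean(spectrum[freqs > 4000] ** 2))

def process(stream):
    """Scan successive frames; fire the actuator on aspiration-like bursts."""
    for start in range(0, len(stream) - FRAME, FRAME):
        if high_band_energy(stream[start:start + FRAME]) > BURST_THRESHOLD:
            trigger_puff()

# Demo on synthetic input: a second of silence with one burst of white
# noise standing in for the aspiration of a "pa".
audio = np.zeros(SAMPLE_RATE)
audio[8000:8320] = np.random.randn(320) * 0.5
process(audio)   # prints "puff!" once, for the noisy frame
```

A bare energy threshold like this would also fire on other high-frequency noise, such as the hiss of an "s," so it only gestures at where the tactile signal would come from; a usable aid would need a far more selective detector.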

Charlotte Reed, a senior research scientist at the Massachusetts Institute of Technology, specializes in the ways touch can be used to interpret speech; she studies deaf-blind people who learn the Tadoma method -- a way of learning to talk and to perceive speech by placing a hand on the neck and mouth of a speaker. "We know the auditory and tactile senses interact," Reed said.

Reed says this tactile idea is used in a machine called the "tactuator" to turn speech into something people feel. The aim is an aid for lipreading. Eventually, the information gleaned from the bulky, three-pronged device could feed software that turns a simple device like a cellphone into a prosthetic for deaf people: the phone's microphone would translate a speaker's voice into tactile signals that help a person understand someone as they lip-read. (Carolyn Y. Johnson, www.boston.com, November 26, 2009)
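The underlying signal-processing idea, turning a voice into something that can be felt, can be sketched in a few lines of Python. Here each frame's loudness is mapped to a vibration level; the set_vibration call is a hypothetical stand-in for an actuator interface, and the mapping is far cruder than anything the tactuator does.

```python
# Minimal sketch of a speech-to-tactile mapping: drive a vibration actuator
# with the amplitude envelope of incoming audio. set_vibration() is a
# hypothetical actuator interface, not a real phone API.

import numpy as np

SAMPLE_RATE = 16_000
FRAME = 160                     # 10 ms frames

def set_vibration(level):
    """Placeholder actuator call; level runs from 0.0 (off) to 1.0 (max)."""
    print(f"vibration level: {level:.2f}")

def speech_to_tactile(samples):
    """Map each frame's RMS amplitude to a vibration level."""
    peak = np.max(np.abs(samples)) or 1.0   # avoid dividing by zero
    for start in range(0, len(samples) - FRAME, FRAME):
        frame = samples[start:start + FRAME]
        rms = np.sqrt(np.mean(frame ** 2))
        set_vibration(min(1.0, rms / peak))

# Demo: a 440 Hz tone that fades in, standing in for a rising voice; the
# printed vibration levels rise with it.
t = np.linspace(0, 0.1, SAMPLE_RATE // 10, endpoint=False)
speech_to_tactile(np.sin(2 * np.pi * 440 * t) * (t / t.max()))
```

Loudness alone carries far less of speech than the multi-channel patterns the tactuator delivers, but even this crude envelope would convey rhythm and stress, which is some of what lip-readers are missing.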

What would be the worst sense to lose? Many people believe that losing sight would be the most devastating. But in reality, which would be worse -- losing your sense of sight or your sense of touch? Very little research has been done on the importance of touch relative to the other senses. These recent findings may redirect future investigation and give touch a new importance.




