Depending on how a mechanical device pulls and tugs areas around the mouth, the volunteer will hear and interpret speech sounds differently

By Erica Westly

Neuroscience textbooks typically portray the five senses as separate entities, but in the real world the senses frequently interact, as anyone who has tried to enjoy dinner with a stuffy nose can attest. Hearing and vision seem similarly connected, the most famous example being the “McGurk effect,” where visual cues, such as moving lips, affect how people hear speech. And now new research shows that touch can influence speech perception, too.

David Ostry, a neuroscientist with co-appointments at McGill University and the New Haven, Conn.–based speech center Haskins Laboratories, has spent years studying the relation between speech and the somatosensory system, the network of receptors in skin and muscle that report information on tactile stimuli to the brain. In his most recent study, published in the Proceedings of the National Academy of Sciences USA, he and two Haskins colleagues found that subjects heard words differently when their mouths were stretched into different positions. The results have implications for neuroscientists studying speech and hearing as well as for therapists looking for new ways to treat speech disorders.

In the study, a specially designed robotic device stretched the mouths of volunteers slightly up, down or backward while they listened to a computer-generated continuum of speech verbalizations that sounded like “head” or “had,” or something in between. When the subjects’ mouths were stretched upward, closer to the position needed to say “head,” they were more likely to hear the sounds as “head,” especially with the more ambiguous output. If the subjects’ mouths were stretched downward, as if to say “had,” they were more likely to hear “had,” even when the sounds being generated were closer to “head.” Stretching subjects’ mouths backward had no effect, implying a position-specific response. Moreover, the timing of the stretch had to match that of the sounds exactly to get an effect: the stretch altered speech perception only when it mimicked realistic vocalizations.

The concept of eliciting a cognitive response by manipulating the mouth is not entirely new. In 1988 psychologists found that they could improve subjects’ moods by having them clench a pen between their teeth, thereby forcing them to smile, and researchers have been conducting similar experiments on physical manipulation and perception ever since. But most of those experiments focus on emotional responses, which unfold over a longer timescale, whereas in Ostry’s speech study the results were nearly instantaneous, notes Asif Ghazanfar, a sensory researcher at Princeton University. “What this study is showing is that these things are happening very quickly—on the order of tens of milliseconds,” he says. “I think that’s really important because it emphasizes that the brain is not something separate from our bodies. You can’t point to events happening in one and not the other.”

Ostry’s study also has ties to a hypothesis from the 1960s called the motor theory of speech perception, which argues that the neural machinery associated with speech production is also involved with speech perception. Fernando Nottebohm, a neuroscientist at the Rockefeller University who uses songbirds as a model for human speech, believes Ostry’s study represents one of the few examples of direct evidence supporting this hypothesis. Ostry, however, cautions that the somatosensory system could modulate speech perception in several ways without involving the motor system.

Previous studies using neuroimaging and magnetic stimulation have suggested that the brain regions involved with auditory, motor and sensory processing overlap at some level. Precisely how these areas work together to modulate speech perception remains unclear, however. Ostry and his colleagues hope to help answer this question with follow-up work that inverts the experiment: instead of hearing a continuum of sound, subjects will endure a continuum of stretches to see if auditory input can influence what they feel. Ostry suspects that touch and hearing will go both ways in this context, which would mean not only could you hear with your mouth, but you could also feel with your ears.

Adding a Therapeutic Touch
Understanding how the skin around the mouth affects speech perception could lead to new methods to treat speech disorders. Traditionally the focus in speech therapy has been on the auditory component, says David Ostry of McGill University, but the mechanical and tactile aspects are crucial, too. “The somatosensory inputs play a role in both guiding speech production and speech learning, and now it’s clear they play a role in auditory perception, too,” Ostry explains, referring to his recent experiments. “It really identifies them as a potential conduit for therapeutic interventions.” Speech therapies with tactile components could especially help patients who, because of hearing loss or other reasons, have trouble hearing their speech mistakes, he says.

Note: This article was originally published with the title, “A Real Stretch.”


Source: Scientific American, 28 May 2009


