Monday, February 2, 2009

Lip reading involves two cortical mechanisms

It is well known that visual speech (lip reading) affects auditory perception of speech. But how? There are two main ideas. One, dominant among sensory neuroscientists, is that visual speech accesses auditory speech systems via cross-sensory integration; the STS is a favorite location in this respect. The other, dominant among speech scientists, particularly those with a motor-theory bent, is that visual speech accesses motor representations of the perceived gestures, which then influence perception.

A hot-off-the-press (well, actually still in press) paper in Neuroscience Letters by Kai Okada and yours truly proposes that both ideas are correct: there are two routes by which visual speech can influence auditory speech, a "direct" and dominant cross-sensory route involving the STS, and an "indirect" and less dominant route through sensory-motor circuits. The goal of our paper was to outline existing evidence in favor of a two-mechanism model and to test one of its predictions, namely that perceiving visual speech should activate speech-related sensory-motor networks, including our favorite area, Spt.

Short version of our findings: as predicted, viewing speech gestures (baseline = non-speech gestures) activated speech-related sensory-motor areas, including Spt as defined by a typical sensory-motor task (listen to and reproduce speech). We interpret this as evidence for a sensory-motor route through which visual speech can influence heard speech, possibly via some sort of motor-to-sensory prediction mechanism. Viewing speech also activated a much broader set of regions along the STS, which may reflect the more direct cross-sensory route.
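If you like toy models, here is a quick sketch, entirely mine and not from the paper, of how two such routes might jointly bias what you hear. It treats the direct cross-sensory route and the indirect motor-prediction route as two extra sources of evidence fused with the auditory signal as weighted log-likelihoods. The syllable labels, probabilities, and fusion weights are all invented for illustration.

```python
import numpy as np

# Toy sketch (not from the paper): fuse auditory evidence with two
# visual routes as weighted log-likelihoods. All numbers are invented.
syllables = ["ba", "ga", "da"]

# Auditory evidence alone: a noisy /ba/
auditory = np.log([0.55, 0.10, 0.35])

# Direct cross-sensory route (STS): the seen gesture looks like /ga/
direct_visual = np.log([0.05, 0.50, 0.45])

# Indirect sensory-motor route (Spt): a forward model predicts the
# sensory consequences of producing the seen gesture
motor_prediction = np.log([0.10, 0.45, 0.45])

# The hypothesized dominance of the direct route is modeled here
# simply as a larger fusion weight
w_direct, w_motor = 0.7, 0.3

fused = auditory + w_direct * direct_visual + w_motor * motor_prediction
print(syllables[int(np.argmax(fused))])  # prints "da", a McGurk-like fusion
```

In this particular sketch the fusion percept survives even if you zero out w_motor, which fits the idea that the direct route does most of the work; the point is only that a weaker second route can nudge the same computation.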

Have a look and let me know what you think!


Okada, K., & Hickok, G. (2009). Two cortical mechanisms support the integration of visual and auditory speech: A hypothesis and preliminary data. Neuroscience Letters. DOI: 10.1016/j.neulet.2009.01.060
