Neurosci Lett. 2017 Jun 9;651:237-241. doi: 10.1016/j.neulet.2017.05.024. Epub 2017 May 11.

Mismatch negativity (MMN) to speech sounds is modulated systematically by manual grip execution.

Author information

Department of Psychology and Logopedics, University of Helsinki, Finland.
Department of Psychology and Logopedics, University of Helsinki, Finland.
Department of Psychology and Logopedics, University of Helsinki, Finland; Cognitive Brain Research Unit, University of Helsinki, Finland.
Department of Modern Languages, University of Helsinki, Finland.


Abstract

Manual actions and speech are connected: for example, grip execution can influence simultaneous vocalizations and vice versa. Our previous studies show that the consonant [k] is associated with the power grip and the consonant [t] with the precision grip. Here we studied whether the interaction between speech sounds and grips could already operate at a pre-attentive stage of auditory processing, as reflected by the mismatch negativity (MMN) component of the event-related potential (ERP). Participants executed power and precision grips according to visual cues while listening to syllable sequences consisting of [ke] and [te] utterances. The grips modulated the MMN amplitudes to these syllables systematically: when the deviant was [ke], the MMN response was larger with a precision grip than with a power grip, and there was a converse trend when the deviant was [te]. These results suggest that manual gestures and speech can interact already at a pre-attentive level of auditory processing, and show, for the first time, that manual actions can systematically modulate the MMN.


KEYWORDS: Action; Action-perception; Gestures; MMN; Speech

[Indexed for MEDLINE]
