Neuroscience. 2017 Feb 20;343:157-164. doi: 10.1016/j.neuroscience.2016.09.023. Epub 2016 Sep 17.

Visual form predictions facilitate auditory processing at the N1.

Author information

The MARCS Institute, University of Western Sydney, Sydney, Australia.


Auditory-visual (AV) events often involve a leading visual cue (e.g. auditory-visual speech) that allows the perceiver to generate predictions about the upcoming auditory event. Electrophysiological evidence suggests that when an auditory event is predicted, processing is sped up, i.e., the N1 component of the ERP occurs earlier (N1 facilitation). However, it is not clear (1) whether N1 facilitation is based specifically on predictive processing rather than on multisensory integration more generally, and (2) which particular properties of the visual cue it is based on. The current experiment used artificial AV stimuli in which visual cues predicted but did not co-occur with auditory cues. Visual form cues (high and low salience) and the auditory-visual pairing were manipulated so that auditory predictions could be based on form and timing, or on timing only. The results showed that N1 facilitation occurred only for combined form and temporal predictions. These results suggest that faster auditory processing (as indicated by N1 facilitation) depends on predictive processing generated by a visual cue that clearly signals both what the auditory stimulus will be and when it will occur.
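The N1-facilitation effect described above amounts to comparing the latency of the most negative ERP deflection in a post-stimulus search window across conditions. A minimal sketch of that comparison is given below; the function name, search window, sampling rate, and synthetic waveforms are illustrative assumptions, not the paper's actual analysis pipeline.

```python
import numpy as np

def n1_peak_latency(erp, times, window=(0.07, 0.16)):
    """Return the latency (in seconds) of the most negative
    deflection (the N1) within the given search window."""
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.where(mask)[0]
    peak = idx[np.argmin(erp[idx])]
    return times[peak]

# Synthetic ERPs (illustrative only): a Gaussian negativity peaking
# at 110 ms for an unpredicted tone, shifted to 95 ms when a visual
# form cue predicts the sound (the hypothesized N1 facilitation).
times = np.arange(-0.1, 0.4, 0.001)  # seconds, 1 kHz sampling
unpredicted = -2e-6 * np.exp(-((times - 0.110) ** 2) / (2 * 0.012 ** 2))
predicted = -2e-6 * np.exp(-((times - 0.095) ** 2) / (2 * 0.012 ** 2))

shift = n1_peak_latency(unpredicted, times) - n1_peak_latency(predicted, times)
print(f"N1 facilitation: {shift * 1000:.0f} ms earlier when predicted")
```

In a real analysis the peak search would be run on baseline-corrected, condition-averaged epochs per participant, with the latency difference tested statistically across participants.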


Keywords: EEG; N1 latency; audiovisual; prediction

[Indexed for MEDLINE]
