J Neurosci. 2018 Jan 10;38(2):263-277. doi: 10.1523/JNEUROSCI.0322-17.2017. Epub 2017 Sep 15.

Behavioral, Modeling, and Electrophysiological Evidence for Supramodality in Human Metacognition.

Faivre N1,2,3, Filevich E4,5,6, Solovey G7,8, Kühn S4,9, Blanke O10,2,11.

Author information

1. Laboratory of Cognitive Neuroscience, Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology, Geneva 1202, Switzerland.
2. Center for Neuroprosthetics, Faculty of Life Sciences, Swiss Federal Institute of Technology, Geneva 1202, Switzerland.
3. Centre d'Economie de la Sorbonne, Centre National de la Recherche Scientifique, Unité Mixte de Recherche 8174, Paris 75647, France.
4. Department of Lifespan Psychology, Max Planck Institute for Human Development, Berlin 14195, Germany.
5. Department of Psychology, Humboldt Universität zu Berlin, 10099 Berlin, Germany.
6. Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany.
7. Instituto de Cálculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, 1428 Buenos Aires, Argentina.
8. Consejo Nacional de Investigaciones Científicas y Técnicas, 2290 Buenos Aires, Argentina.
9. Department of Psychiatry and Psychotherapy, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany.
10. Laboratory of Cognitive Neuroscience, Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology, Geneva 1202, Switzerland.
11. Department of Neurology, University Hospital Geneva, Geneva 1205, Switzerland.


Human metacognition, or the capacity to introspect on one's own mental states, has been characterized mostly through confidence reports in visual tasks. A pressing question is to what extent results from visual studies generalize to other domains. Answering this question allows us to determine whether metacognition operates through shared, supramodal mechanisms or through idiosyncratic, modality-specific mechanisms. Here, we report three new lines of evidence for decisional and postdecisional mechanisms arguing for the supramodality of metacognition. First, metacognitive efficiency correlated among auditory, tactile, visual, and audiovisual tasks. Second, confidence in an audiovisual task was best modeled using supramodal formats based on integrated representations of auditory and visual signals. Third, confidence in correct responses for visual and audiovisual tasks involved similar electrophysiological markers, associated with motor preparation preceding the perceptual judgment. We conclude that the supramodality of metacognition relies on supramodal confidence estimates and on decisional signals that are shared across sensory modalities.

SIGNIFICANCE STATEMENT Metacognitive monitoring is the capacity to access, report, and regulate one's own mental states. In perception, this allows rating our confidence in what we have seen, heard, or touched. Although metacognitive monitoring can operate on different cognitive domains, it remains unknown whether it involves a single supramodal mechanism common to multiple cognitive domains or modality-specific mechanisms idiosyncratic to each domain. Here, we bring evidence in favor of the supramodality hypothesis by showing that participants with high metacognitive performance in one modality are likely to perform well in other modalities. Based on computational modeling and electrophysiology, we propose that supramodality can be explained by the existence of supramodal confidence estimates and by the influence of decisional cues on confidence estimates.
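The metacognitive-efficiency measure referenced in the abstract (see the signal detection theory keyword) is conventionally the ratio meta-d'/d', where d' is type-1 perceptual sensitivity. Fitting meta-d' requires a full model of confidence-rating distributions, but the underlying d' computation is standard SDT. Below is a minimal sketch of it; the hit and false-alarm rates are hypothetical values for illustration only, not data from the study:

```python
# Standard signal detection theory type-1 sensitivity:
#   d' = z(hit rate) - z(false-alarm rate)
# where z is the inverse of the standard normal CDF.
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Type-1 sensitivity from hit and false-alarm rates (both in (0, 1))."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical example rates (not from the paper):
d = d_prime(0.80, 0.20)  # roughly 1.68
```

Metacognitive efficiency would then compare this type-1 sensitivity with a meta-d' estimate derived from confidence ratings; a ratio near 1 indicates that confidence tracks accuracy about as well as the perceptual evidence allows.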


KEYWORDS: EEG; audiovisual; confidence; metacognition; signal detection theory; supramodality

[Indexed for MEDLINE]