Neuroimage. 2011 May 15;56(2):843-9. doi: 10.1016/j.neuroimage.2010.05.084. Epub 2010 Jun 10.

Name that tune: decoding music from the listening brain.

Author information

1. Donders Institute for Brain, Cognition and Behavior: Centre for Cognition, Radboud University Nijmegen, Nijmegen, The Netherlands. r.schaefer@donders.ru.nl

Abstract

In the current study we used electroencephalography (EEG) to detect heard music from the brain signal, hypothesizing that the temporal structure of music makes it especially suitable for decoding perception from EEG signals. Excluding music with vocals, we classified the perception of seven different musical fragments of about three seconds, both within and across participants, using only time-domain information (the event-related potential, ERP). The best individual results were 70% correct in a seven-class problem using single trials; with multiple trials, we achieved 100% correct after six presentations of the stimulus. When classifying across participants, a maximum rate of 53% was reached, supporting a general representation of each musical fragment across participants. While the amplitude envelope of some music stimuli correlated well with the ERP, this was not true for all stimuli. Aspects of the stimulus that may contribute to the differences between the EEG responses to the pieces of music are discussed.
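The abstract does not specify the authors' exact pipeline, but the general approach it describes (seven-class decoding from time-domain ERPs, with accuracy improving as trials of the same stimulus are averaged) can be sketched as follows. Everything in this sketch, from the synthetic data and its dimensions to the choice of scikit-learn's LogisticRegression, is an illustrative assumption rather than the paper's method.

```python
# Minimal sketch of seven-class ERP classification with multi-trial
# averaging, in the spirit of the abstract. The synthetic data, all
# parameter values, and the classifier choice are illustrative
# assumptions, not the authors' actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_classes, trials_per_class = 7, 60   # seven musical fragments
n_channels, n_samples = 32, 384       # ~3 s of EEG at 128 Hz (assumed)

# Synthetic stand-in for epoched EEG: each fragment gets a weak
# class-specific time-domain template buried in noise.
templates = rng.normal(0, 1, (n_classes, n_channels, n_samples))
X = np.concatenate([
    0.3 * templates[c]
    + rng.normal(0, 1, (trials_per_class, n_channels, n_samples))
    for c in range(n_classes)
])
y = np.repeat(np.arange(n_classes), trials_per_class)

# Flatten channels x time into one feature vector per trial
# (pure time-domain information, as in the abstract).
X = X.reshape(len(X), -1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("single-trial accuracy:", clf.score(X_te, y_te))

# Multi-trial decoding: average k test trials of the same class before
# classifying. Averaging suppresses noise while preserving the
# stimulus-locked ERP, so accuracy rises with k.
def multitrial_accuracy(k):
    correct = total = 0
    for c in range(n_classes):
        trials = X_te[y_te == c]
        for i in range(0, len(trials) - k + 1, k):
            pred = clf.predict(trials[i:i + k].mean(axis=0, keepdims=True))
            correct += int(pred[0] == c)
            total += 1
    return correct / total

for k in (1, 3, 6):
    print(f"{k}-trial averaged accuracy:", multitrial_accuracy(k))
```

The multi-trial step mirrors the abstract's observation that accuracy climbs toward 100% after several presentations of a stimulus: repeated trials share the same stimulus-locked response, so their average has a higher signal-to-noise ratio than any single trial.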
