PLoS One. 2014 Jul 29;9(7):e103278. doi: 10.1371/journal.pone.0103278. eCollection 2014.

Implicit processing of visual emotions is affected by sound-induced affective states and individual affective traits.

Author information

1. Cognitive Brain Research Unit, Institute of Behavioral Sciences, University of Helsinki, Helsinki, Finland & Finnish Centre for Interdisciplinary Music Research, University of Helsinki, Helsinki, Finland; Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari, Bari, Italy.
2. Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari, Bari, Italy.
3. The Research Clinic for Functional Disorders and Psychosomatics, Aarhus University Hospital & Interacting Minds Centre, Aarhus University, Aarhus, Denmark.
4. Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari, Bari, Italy; pRED, Neuroscience DTA, Hoffman-La Roche, Ltd., Basel, Switzerland.
5. Cognitive Brain Research Unit, Institute of Behavioral Sciences, University of Helsinki, Helsinki, Finland & Finnish Centre for Interdisciplinary Music Research, University of Helsinki, Helsinki, Finland; Brain & Mind Laboratory, Department of Biomedical Engineering and Computational Science, Aalto University School of Science, Helsinki, Finland.

Abstract

The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable over time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined by either a) a therapeutic music sequence (MusiCure), b) a noise sequence, or c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.

PMID: 25072162
PMCID: PMC4114563
DOI: 10.1371/journal.pone.0103278
[Indexed for MEDLINE]
Free PMC Article
