Brain Res Cogn Brain Res. 2000 Jun;9(3):227-38.

Recognition of emotional prosody and verbal components of spoken language: an fMRI study.

Author information

  • 1Department of Psychiatry and Behavioral Sciences, University of Oklahoma Health Sciences Center, and Veterans Affairs Medical Center, Oklahoma City, OK 73104, USA.


This study examined the neural areas involved in the recognition of both emotional prosody and phonemic components of words expressed in spoken language using echo-planar, functional magnetic resonance imaging (fMRI). Ten right-handed males were asked to discriminate words based on either expressed emotional tone (angry, happy, sad, or neutral) or phonemic characteristics, specifically, initial consonant sound (bower, dower, power, or tower). Significant bilateral activity was observed in the detection of both emotional and verbal aspects of language when compared to baseline activity. We found that the detection of emotion compared with verbal detection resulted in significant activity in the right inferior frontal lobe. Conversely, the detection of verbal stimuli compared with the detection of emotion activated left inferior frontal lobe regions most significantly. Specific analysis of the anterior auditory cortex revealed increased right hemisphere activity during the detection of emotion compared to activity during verbal detection. These findings illustrate bilateral involvement in the detection of emotion in language while concomitantly showing significantly lateralized activity in both emotional and verbal detection, in both the temporal and frontal lobes.

[PubMed - indexed for MEDLINE]