Brain Res Cogn Brain Res. 2005 Feb;22(2):193-203.

Neural organization for recognition of grammatical and emotional facial expressions in deaf ASL signers and hearing nonsigners.

Author information

  • 1Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies 10010 North Torrey Pines Rd. La Jolla, CA 92037, USA.


Recognition of emotional facial expressions is universal for all humans, but signed language users must also recognize certain non-affective facial expressions as linguistic markers. fMRI was used to investigate the neural systems underlying recognition of these functionally distinct expressions, comparing deaf ASL signers and hearing nonsigners. Within the superior temporal sulcus (STS), activation for emotional expressions was right lateralized for the hearing group and bilateral for the deaf group. In contrast, activation within STS for linguistic facial expressions was left lateralized only for signers and only when linguistic facial expressions co-occurred with verbs. Within the fusiform gyrus (FG), activation was left lateralized for ASL signers for both expression types, whereas activation was bilateral for both expression types for nonsigners. We propose that left lateralization in FG may be due to continuous analysis of local facial features during on-line sign language processing. The results indicate that function in part drives the lateralization of neural systems that process human facial expressions.

[PubMed - indexed for MEDLINE]