Front Psychol. 2014 May 27;5:484. doi: 10.3389/fpsyg.2014.00484. eCollection 2014.

How sensory-motor systems impact the neural organization for language: direct contrasts between spoken and signed language.

Author information

  • 1Laboratory for Language and Cognitive Neuroscience, School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA.
  • 2Department of Psychology, University of Washington, Seattle, WA, USA; Department of Radiology, University of Washington, Seattle, WA, USA.
  • 3Department of Radiology, University of Washington, Seattle, WA, USA.

Abstract

To investigate the impact of sensory-motor systems on the neural organization for language, we conducted an H2(15)O-PET study of sign and spoken word production (picture-naming) and an fMRI study of sign and audio-visual spoken language comprehension (detection of a semantically anomalous sentence) with hearing bilinguals who are native users of American Sign Language (ASL) and English. Directly contrasting speech and sign production revealed greater activation in bilateral parietal cortex for signing, while speaking resulted in greater activation in bilateral superior temporal cortex (STC) and right frontal cortex, likely reflecting auditory feedback control. Surprisingly, the language production contrast revealed a relative increase in activation in bilateral occipital cortex for speaking. We speculate that greater activation in visual cortex for speaking may actually reflect cortical attenuation when signing, which functions to distinguish self-produced from externally generated visual input. Directly contrasting speech and sign comprehension revealed greater activation in bilateral STC for speech and greater activation in bilateral occipital-temporal cortex for sign. Sign comprehension, like sign production, engaged bilateral parietal cortex to a greater extent than spoken language. We hypothesize that posterior parietal activation in part reflects processing related to spatial classifier constructions in ASL and that anterior parietal activation may reflect covert imitation that functions as a predictive model during sign comprehension. The conjunction analysis for comprehension revealed that both speech and sign bilaterally engaged the inferior frontal gyrus (with more extensive activation on the left) and the superior temporal sulcus, suggesting an invariant bilateral perisylvian language system. We conclude that surface level differences between sign and spoken languages should not be dismissed and are critical for understanding the neurobiology of language.

KEYWORDS:

American Sign Language; PET; audio-visual English; bimodal bilinguals; fMRI

PMID: 24904497
PMCID: PMC4033845
DOI: 10.3389/fpsyg.2014.00484