Neuropsychologia. 2019 May 7;129:397-406. doi: 10.1016/j.neuropsychologia.2019.04.022. [Epub ahead of print]

A novel machine learning analysis of eye-tracking data reveals suboptimal visual information extraction from facial stimuli in individuals with autism.

Author information

1. Wrocław Faculty of Psychology, SWPS University of Social Sciences and Humanities in Wrocław, Wrocław, Poland. Electronic address: mkrol1@swps.edu.pl.
2. School of Social Sciences, University of Manchester, Manchester, United Kingdom.

Abstract

We propose a new method of quantifying the utility of visual information extracted from facial stimuli for emotion recognition. The stimuli are convolved with a Gaussian estimate of the fixation distribution, revealing more information in the facial regions the participant fixated on. Feeding this convolution to a machine-learning emotion recognition algorithm yields an error measure (between actual and predicted emotions) reflecting the quality of the extracted information. We recorded the eye-movements of 21 participants with autism and 23 age-, sex- and IQ-matched typically developing participants performing three facial analysis tasks: free-viewing, emotion recognition, and brow-mouth width comparison. In the emotion recognition task, fixations of participants with autism were positioned on lower areas of the faces and were less focused on the eyes compared to the typically developing group. Additionally, the utility of the information they extracted in the emotion recognition task was lower. Thus, the emotion recognition deficit typical in autism can be at least partly traced to the earliest stage of face processing, i.e. to the extraction of visual information via eye-fixations.
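For illustration, below is a minimal sketch in Python of the fixation-weighting step the abstract describes. The paper's exact kernel width, duration weighting, and classifier are not given here, so the sigma value, the duration-weighted impulses, and the scikit-learn-style classifier "clf" are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fixation_mask(shape, fixations, sigma=40.0):
        """Estimate a fixation density map: impulses at fixation points
        (here weighted by duration, an assumption) smoothed with a
        Gaussian kernel, then scaled to [0, 1]."""
        density = np.zeros(shape, dtype=float)
        for x, y, duration in fixations:
            density[int(y), int(x)] += duration
        density = gaussian_filter(density, sigma=sigma)
        peak = density.max()
        return density / peak if peak > 0 else density

    def reveal(image, mask):
        """Weight an H x W x 3 stimulus by the fixation density, so
        regions the participant looked at retain more information."""
        return image * mask[..., np.newaxis]

    # Hypothetical usage with a pre-trained emotion classifier "clf"
    # exposing a scikit-learn-style predict(); names are illustrative.
    # masked = reveal(face_image, fixation_mask(face_image.shape[:2], fixations))
    # error = float(clf.predict(masked.reshape(1, -1))[0] != true_label)

Averaging such per-trial errors over a participant's trials would give the kind of information-utility score the abstract compares between groups.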

KEYWORDS:

Autism spectrum disorder; Eye-tracking; Face emotion recognition; Face processing; Machine-learning
