IEEE J Biomed Health Inform. 2018 Mar;22(2):525-536. doi: 10.1109/JBHI.2017.2676878. Epub 2017 Mar 2.

Dynamic Multimodal Measurement of Depression Severity Using Deep Autoencoding.

Abstract

Depression is one of the most common psychiatric disorders worldwide, with over 350 million people affected. Current methods to screen for and assess depression depend almost entirely on clinical interviews and self-report scales. While useful, such measures lack objective, systematic, and efficient ways of incorporating behavioral observations that are strong indicators of depression presence and severity. Using dynamics of facial and head movement and vocalization, we trained classifiers to detect three levels of depression severity. Participants were a community sample diagnosed with major depressive disorder. They were recorded in clinical interviews (Hamilton Rating Scale for Depression, HRSD) at seven-week intervals over a period of 21 weeks. At each interview, they were scored by the HRSD as moderately to severely depressed, mildly depressed, or remitted. Logistic regression classifiers using leave-one-participant-out validation were compared for facial movement, head movement, and vocal prosody individually and in combination. Accuracy of depression severity measurement from facial movement dynamics was higher than that for head movement dynamics, and each was substantially higher than that for vocal prosody. Accuracy using all three modalities combined only marginally exceeded that of face and head combined. These findings suggest that automatic detection of depression severity from behavioral indicators in patients is feasible and that multimodal measures afford the most powerful detection.
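
To make the evaluation scheme concrete, the sketch below illustrates leave-one-participant-out validation of a logistic regression classifier, as described in the abstract. It is a minimal, hypothetical example using scikit-learn's LeaveOneGroupOut and LogisticRegression, with a synthetic feature matrix standing in for the extracted facial movement, head movement, and vocal prosody dynamics and with labels 0/1/2 standing in for the three HRSD severity levels; it is not the authors' implementation, and feature extraction is outside its scope.

    # Illustrative sketch only: leave-one-participant-out logistic regression.
    # All variable names and the synthetic data are hypothetical stand-ins.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneGroupOut
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: 30 participants x 3 interviews, 20 features per interview.
    n_participants, n_visits, n_features = 30, 3, 20
    X = rng.normal(size=(n_participants * n_visits, n_features))   # multimodal features
    y = rng.integers(0, 3, size=n_participants * n_visits)         # 0=remitted, 1=mild, 2=moderate/severe
    groups = np.repeat(np.arange(n_participants), n_visits)        # participant IDs

    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    logo = LeaveOneGroupOut()

    # Hold out all interviews of one participant per fold, train on the rest.
    correct = 0
    for train_idx, test_idx in logo.split(X, y, groups):
        clf.fit(X[train_idx], y[train_idx])
        correct += (clf.predict(X[test_idx]) == y[test_idx]).sum()

    print(f"Leave-one-participant-out accuracy: {correct / len(y):.3f}")

Grouping folds by participant ensures that no individual's recordings appear in both training and test sets, so reported accuracy reflects generalization to unseen participants rather than to unseen interviews of familiar ones.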

PMID: 28278485
PMCID: PMC5581737
DOI: 10.1109/JBHI.2017.2676878
[Indexed for MEDLINE]