The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software

Behav Res Methods. 2019 Apr;51(2):747-768. doi: 10.3758/s13428-018-1085-9.

Abstract

This article proposes an optical measurement of movement applied to data from video recordings of facial expressions of emotion. The approach offers a way to capture motion adapted from the film industry in which markers placed on the skin of the face can be tracked with a pattern-matching algorithm. The method records and postprocesses raw facial movement data (coordinates per frame) of distinctly placed markers and is intended for use in facial expression research (e.g., microexpressions) in laboratory settings. Due to the explicit use of specifically placed, artificial markers, the procedure offers the simultaneous measurement of several emotionally relevant markers in a (psychometrically) objective and artifact-free way, even for facial regions without natural landmarks (e.g., the cheeks). In addition, the proposed procedure is fully based on open-source software and is transparent at every step of data processing. Two worked examples demonstrate the practicability of the proposed procedure: In Study 1 (N = 39), the participants were instructed to show the emotions happiness, sadness, disgust, and anger, and in Study 2 (N = 113), they were asked to present both a neutral face and the emotions happiness, disgust, and fear. Study 2 involved the simultaneous tracking of 14 markers for approximately 12 min per participant with a time resolution of 33 ms. The measured facial movements corresponded closely to the assumptions of established measurement instruments (EMFACS, FACSAID, Friesen & Ekman, 1983; Ekman & Hager, 2002). In addition, the measurement was found to be very precise, with sub-second, sub-pixel, and sub-millimeter accuracy.
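The core idea of the abstract — locating a skin-mounted marker in each video frame with a pattern-matching algorithm and recording its coordinates per frame — can be sketched as template matching by sum of squared differences. This is an illustrative toy, not the blenderFace implementation: the frames, template, and function names below are assumptions, and real tracking (as in Blender's motion-tracking tools) operates on full-resolution video with sub-pixel refinement.

```python
# Toy sketch of marker tracking via pattern (template) matching.
# NOT the blenderFace pipeline; frames here are small grayscale grids
# (lists of lists of intensities) chosen purely for illustration.

def match_marker(frame, template):
    """Return (row, col) of the best match, minimizing sum of squared differences."""
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(len(frame) - th + 1):
        for c in range(len(frame[0]) - tw + 1):
            ssd = sum(
                (frame[r + i][c + j] - template[i][j]) ** 2
                for i in range(th)
                for j in range(tw)
            )
            if ssd < best_score:
                best_score, best_pos = ssd, (r, c)
    return best_pos

def track(frames, template):
    """One coordinate pair per frame -- the 'raw movement data' of one marker."""
    return [match_marker(frame, template) for frame in frames]
```

Running `track` over a sequence of frames yields the per-frame coordinate trajectory of a single marker; the published method tracks many such markers simultaneously (14 in Study 2) and postprocesses the trajectories.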

Keywords: Emotion expression; Measurement of facial movement; Open-source software; Optical measurement; Video data analysis.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Emotions*
  • Facial Expression*
  • Humans
  • Movement*
  • Software*
  • Video Recording*