Philos Trans R Soc Lond B Biol Sci. 2019 Apr 29;374(1771):20180026. doi: 10.1098/rstb.2018.0026.

Live human-robot interactive public demonstrations with automatic emotion and personality prediction.

Author information

1 Department of Computer Science and Technology, University of Cambridge, Cambridge CB3 0FD, UK.
2 Centre for Robotics Research, Department of Informatics, King's College London, London WC2R 2LS, UK.
3 Center for Autism Research, Philadelphia, PA, USA.

Abstract

Communication with humans is a multi-faceted phenomenon in which emotions, personality and non-verbal behaviours, as well as verbal behaviours, play a significant role, and human-robot interaction (HRI) technologies should respect this complexity to achieve efficient and seamless communication. In this paper, we describe the design and execution of five public demonstrations made with two HRI systems that aimed at automatically sensing and analysing human participants' non-verbal behaviour and predicting their facial action units, facial expressions and personality in real time while they interacted with a small humanoid robot. We provide an overview of the challenges faced, together with the lessons learned from those demonstrations, in order to better inform the science and engineering fields and to help design and build robots with more purposeful interaction capabilities. This article is part of the theme issue 'From social brains to social robots: applying neurocognitive insights to human-robot interaction'.

KEYWORDS:

affect; facial action units; facial expressions; personality; public demonstration; real-time human–robot interaction

PMID:
30853000
PMCID:
PMC6452249
[Available on 2020-04-29]
DOI:
10.1098/rstb.2018.0026
