Embodied cognition for autonomous interactive robots

Top Cogn Sci. 2012 Oct;4(4):759-72. doi: 10.1111/j.1756-8765.2012.01218.x. Epub 2012 Aug 14.

Abstract

In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and to support low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) spans an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction grounded in recent psychological and neurological findings.
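The abstract does not spell out the mechanism behind principle (c), but the idea of simulation-driven top-down perceptual biasing can be illustrated with a minimal sketch. The sketch below assumes a simple Bayesian fusion of a top-down prior, generated by internally simulating the robot's ongoing action, with bottom-up sensor likelihoods; all names (`simulate_expectation`, `perceive`, the percept labels) are hypothetical illustrations, not the authors' framework.

```python
# Hypothetical sketch of top-down perceptual biasing via internal
# simulation (principle c). Not the paper's implementation; all names
# and numbers are illustrative assumptions.

from dataclasses import dataclass

PERCEPTS = ["hand_open", "hand_closed", "object_offered"]

@dataclass
class Percept:
    label: str
    posterior: float

def simulate_expectation(current_action: str) -> dict[str, float]:
    """Internal simulation: given the robot's own ongoing action,
    predict which percepts are likely next (a top-down prior)."""
    if current_action == "reach_to_receive":
        return {"hand_open": 0.2, "hand_closed": 0.1, "object_offered": 0.7}
    # No informative simulation available: fall back to a flat prior.
    return {p: 1.0 / len(PERCEPTS) for p in PERCEPTS}

def perceive(bottom_up: dict[str, float], prior: dict[str, float]) -> Percept:
    """Fuse noisy bottom-up likelihoods with the simulated top-down
    prior (pointwise product, then normalize) and pick the winner."""
    unnorm = {p: bottom_up[p] * prior[p] for p in PERCEPTS}
    z = sum(unnorm.values())
    label = max(unnorm, key=unnorm.get)
    return Percept(label, unnorm[label] / z)

# Usage: the same ambiguous sensor evidence is disambiguated
# differently depending on the simulated action context.
likelihoods = {"hand_open": 0.35, "hand_closed": 0.33, "object_offered": 0.32}
print(perceive(likelihoods, simulate_expectation("reach_to_receive")))
print(perceive(likelihoods, simulate_expectation("idle")))
```

The design point, on this reading of the abstract, is that perception is not a passive bottom-up pipeline: the robot's own action repertoire, run as a simulation, continuously reshapes what it is primed to perceive.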

MeSH terms

  • Artificial Intelligence*
  • Cognition*
  • Electronic Data Processing
  • Humans
  • Neural Networks, Computer
  • Robotics*
  • User-Computer Interface