Proc SPIE Int Soc Opt Eng. 2016 May;9859. pii: 985906. doi: 10.1117/12.2224382.

Human-like object tracking and gaze estimation with PKD android.

Author information

1. University of Louisville, Louisville, KY 40292, USA.
2. University of North Texas Health Science Center, Fort Worth, TX 76107, USA.

Abstract

As the use of robots increases for tasks that require human-robot interaction, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on the Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which extend the robot's capacity for human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interaction and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking head orientation. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information, serving two objectives: to evaluate the performance of the object tracking system for PKD, and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues from humans.
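The core geometric step the abstract describes, combining an eye-in-head gaze direction from the eye tracker with a head orientation from motion capture to obtain a world-frame gaze vector, can be sketched as below. This is a minimal illustration, not the paper's implementation: the yaw-pitch-roll convention, the function names, and the assumption that the eye tracker reports a unit direction in the head frame are all hypothetical.

```python
import numpy as np

def head_rotation(yaw, pitch, roll):
    """Z-Y-X (yaw-pitch-roll) rotation matrix for the head pose, angles in radians.
    The actual angle convention used by the motion capture system may differ."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def gaze_in_world(R_head, eye_dir_head):
    """Rotate a gaze direction measured in the head frame into the world frame.
    eye_dir_head: 3-vector from the eye tracker (normalized here for safety)."""
    v = np.asarray(eye_dir_head, dtype=float)
    return R_head @ (v / np.linalg.norm(v))

# Example: head yawed 90 degrees left, eyes looking straight ahead in the head frame.
R = head_rotation(np.pi / 2, 0.0, 0.0)
gaze = gaze_in_world(R, [1.0, 0.0, 0.0])
```

In practice, intersecting the resulting world-frame gaze ray with the positions of the displayed objects would identify the tracked target; that step depends on the virtual reality system's scene geometry and is omitted here.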

KEYWORDS:

Gaze estimation; Humanoid robot; Object tracking
