AMIA Annu Symp Proc. 2006; 2006: 920.
PMCID: PMC1839426

A Real-Time Gesture Interface for Hands-Free Control of Electronic Medical Records

Abstract

Whether clinicians are reviewing digital radiologic images during a procedure or checking laboratory results on a clinical ward, computer keyboards and mice are potential sources of contamination during both sterile and non-sterile activities related to clinical care. The authors describe and demonstrate a live system prototype for hands-free, gesture-based control of an electronic medical record (EMR) system.

Objective

Contact with computer keyboards and mice poses a contamination risk to clinicians in both sterile and non-sterile environments. Surgeons in operating rooms and emergency physicians in trauma cases must often review multiple radiologic images while maintaining sterility. On clinical wards, up to 95% of computer keyboards and mice have been shown to be contaminated with potential bacterial pathogens [1]. In theory, a single computer mouse contaminated with the SARS virus could serve as a vector for infecting the employees of an entire nursing unit. The ability to effectively navigate and retrieve data from an EMR with hands-free modalities could allow clinicians to maintain sterility in surgical suites and mitigate the risk of infectious disease transmission to healthcare workers on medical wards.

Methods

A computer-vision-based gesture system was created that allows a gloved or non-gloved hand to control an EMR: selecting patients from a database, selecting and retrieving radiologic images or laboratory data, and adjusting image viewing parameters (such as degree of zoom and level of attenuation). Hands are tracked in a two-dimensional space in front of a digital video camera. Images are arranged in virtual panels forming a vertically standing image cylinder (a “Gibson cylinder”); the user’s viewpoint is the center of the cylinder looking out toward the panels. Movement of the hand in one of four directions triggers rotation or vertical movement of the cylinder, shifting images into and out of the user’s view. Hand gesture tracking is accomplished using a modified and enhanced version of the CAMSHIFT algorithm [2], which is based on color and motion cues. A state machine switches between user modes such as “zoom” and “rotate”. Three interface tasks were tested with twenty hospital employees: hand calibration for tracking, image navigation, and inducing “sleep” mode.
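As a rough illustration of this pipeline, the sketch below pairs a standard OpenCV CAMSHIFT tracker with a simple direction-to-command mapping. It is not the authors' implementation: their modified CAMSHIFT fuses color and motion cues and drives a fuller state machine and the Gibson-cylinder renderer, none of which are reproduced here; every function name and threshold below is an illustrative assumption.

# Illustrative sketch only (Python/OpenCV). Calibrates a hue histogram from a
# hand region, tracks the hand with CAMSHIFT, and maps dominant hand motion to
# hypothetical cylinder commands. Names and thresholds are assumptions, not
# taken from the paper.
import cv2
import numpy as np

def calibrate_hand(frame, roi):
    """Build a hue histogram from a user-supplied hand region (x, y, w, h)."""
    x, y, w, h = roi
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    # Ignore dim or desaturated pixels so skin/glove color dominates the model.
    mask = cv2.inRange(hsv, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
    hist = cv2.calcHist([hsv], [0], mask, [16], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def track_and_dispatch(cap, hist, window):
    """Track the hand with CAMSHIFT and emit commands for the image cylinder."""
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    mode = "rotate"                  # crude stand-in for the paper's mode state machine
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        box, window = cv2.CamShift(backproj, window, term)
        cx, cy = box[0]              # center of the tracked hand region
        if prev is not None:
            dx, dy = cx - prev[0], cy - prev[1]
            # Dominant motion direction drives the virtual image cylinder.
            if abs(dx) > abs(dy) and abs(dx) > 20:
                cmd = "rotate_right" if dx > 0 else "rotate_left"
            elif abs(dy) > 20:
                cmd = "shift_down" if dy > 0 else "shift_up"
            else:
                cmd = None
            if cmd:
                print(mode, cmd)     # hook an EMR image browser in here
        prev = (cx, cy)

# Example wiring (webcam, hand initially inside a 100x100 box near frame center):
# cap = cv2.VideoCapture(0)
# ok, first = cap.read()
# roi = (270, 190, 100, 100)
# track_and_dispatch(cap, calibrate_hand(first, roi), roi)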

Results

A prototype of the gesture control interface was implemented in a medical data-browser environment. Calibration time for the system averaged less than 10 seconds. Completion rates of 95% or greater were noted for all three interface tasks.

Discussion

Hands-free, gesture-based control of an EMR is a feasible alternative to keyboard- and mouse-based methods. Efforts are currently underway to deploy the gesture-based system in neurosurgical operating room suites, replacing the nursing personnel who currently hold bound volumes of images for surgeons to review during procedures.

References

1. Schultz M, Gill J, Zubairi S, Huber R, Gordin F. Bacterial contamination of computer keyboards in a teaching hospital. Infect Control Hosp Epidemiol. :302–3.
2. Quek FKH. Unencumbered Gestural Interaction. IEEE Multimedia. 1996:36–47.
