JAMA. 2009 Sep 23;302(12):1316-26. doi: 10.1001/jama.2009.1365.

Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review.

Author information

  • 1Department of Medicine, University of Pennsylvania Health System, 3701 Market St, Ste 640, Philadelphia, PA 19104, USA. jennifer.kogan@uphs.upenn.edu

Abstract

CONTEXT:

Direct observation of medical trainees with actual patients is important for performance-based clinical skills assessment. Multiple tools for direct observation are available, but their characteristics and outcomes have not been compared systematically.

OBJECTIVES:

To identify observation tools used to assess medical trainees' clinical skills with actual patients and to summarize the evidence of their validity and outcomes.

DATA SOURCES:

Electronic literature search of PubMed, ERIC, CINAHL, and Web of Science for English-language articles published between 1965 and March 2009 and review of references from article bibliographies.

STUDY SELECTION:

Included studies described a tool designed for direct observation of medical trainees' clinical skills with actual patients by educational supervisors. Tools used only in simulated settings or assessing surgical/procedural skills were excluded. Of 10 672 citations, 199 articles were reviewed and 85 met inclusion criteria.

DATA EXTRACTION:

Two authors independently abstracted studies using a modified Best Evidence Medical Education coding form to inform judgment of key psychometric characteristics. Differences were reconciled by consensus.

RESULTS:

A total of 55 tools were identified. Twenty-one tools were studied with students and 32 with residents or fellows. Two were used across the educational continuum. Most (n = 32) were developed for formative assessment. Rater training was described for 26 tools. Only 11 tools had validity evidence based on internal structure and relationship to other variables. Trainee or observer attitudes about the tool were the most commonly measured outcomes. Self-assessed changes in trainee knowledge, skills, or attitudes (n = 9) or objectively measured change in knowledge or skills (n = 5) were infrequently reported. The strongest validity evidence has been established for the Mini Clinical Evaluation Exercise (Mini-CEX).

CONCLUSION:

Although many tools are available for the direct observation of clinical skills, validity evidence and description of educational outcomes are scarce.

PMID: 19773567 [PubMed - indexed for MEDLINE]
