Surgery. 2005 Feb;137(2):141-7.

Assuring the reliability of resident performance appraisals: more items or more observations?

Author information

  • Department of Surgery, Southern Illinois University School of Medicine, PO Box 19638, Springfield, IL 62794-9638, USA. rwilliams@siumed.edu

Abstract

BACKGROUND:

The tendency to add items to resident performance rating forms has accelerated due to the new ACGME competency requirements. This study addresses the relative merits of adding items versus increasing the number of observations. The specific questions addressed are (1) what is the reliability of single items used to assess resident performance, (2) what effect does adding items have on reliability, and (3) how many observations are required to obtain reliable resident performance ratings.

METHODS:

Surgeon ratings of resident performance were collected for 3 years. The rating instrument had 3 single items representing clinical performance, professional behavior, and comparisons to other house staff. Reliability analyses were performed separately for each year, and variance components were pooled across years to compute overall reliability coefficients.
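To make the reliability calculation concrete, the following is a minimal sketch of how a reliability (generalizability) coefficient can be computed from pooled variance components in a residents-by-observations design. It is illustrative only: the formula is standard generalizability theory, but the function name and the numeric variance components are hypothetical, not values reported in the paper.

```python
# Illustrative sketch, not the authors' analysis. In a residents-by-observations
# design, reliability of a mean rating is the between-resident ("true score")
# variance divided by itself plus the error variance shrunk by averaging.

def g_coefficient(var_resident: float, var_error: float, n_obs: int) -> float:
    """Reliability of a resident's mean rating over n_obs observations."""
    return var_resident / (var_resident + var_error / n_obs)

# Hypothetical pooled variance components (invented for illustration):
var_resident = 0.30   # variance between residents (signal)
var_error = 1.20      # observation-level residual variance (noise)

for n in (1, 5, 10, 20, 40):
    print(f"{n:>2} observations -> reliability = "
          f"{g_coefficient(var_resident, var_error, n):.2f}")
```

With these invented components, reliability rises from 0.20 for a single observation to about 0.83 at 20 observations, which is the general pattern the study exploits: averaging over more observations shrinks error variance directly.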

RESULTS:

Single-item resident performance rating scales were equivalent to multiple-item scales using conventional reliability standards. Increasing the number of rating items had little effect on reliability. Increasing the number of observations had a much larger effect.

CONCLUSIONS:

Program directors should focus on increasing the number of observations per resident to improve performance sampling and the reliability of assessment. Increasing the number of rating items has little effect on reliability and is unlikely to assess the new ACGME competencies adequately.
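The asymmetry between adding items and adding observations can be illustrated with a two-facet (residents x items x observations) decision-study sketch. Everything below is an assumption for illustration: the variance components are invented to reflect the qualitative pattern the paper reports (occasion-to-occasion error dominating item-to-item error), not estimates from the study's data.

```python
# Hypothetical two-facet D-study sketch (residents x items x observations).
# Variance components are invented, not taken from the paper.

def g_coefficient(var_p, var_pi, var_po, var_res, n_items, n_obs):
    """Generalizability coefficient when each resident's score is the mean
    over n_items rating items and n_obs observed encounters."""
    error = var_pi / n_items + var_po / n_obs + var_res / (n_items * n_obs)
    return var_p / (var_p + error)

# Invented components in which occasion error dwarfs item error:
vc = dict(var_p=0.30, var_pi=0.02, var_po=0.90, var_res=0.10)

for n_items, n_obs in [(1, 5), (10, 5), (1, 20), (10, 20)]:
    g = g_coefficient(**vc, n_items=n_items, n_obs=n_obs)
    print(f"{n_items:>2} items x {n_obs:>2} obs -> reliability = {g:.2f}")
```

Under these assumed components, going from 1 to 10 items raises reliability only from 0.58 to 0.62, while going from 5 to 20 observations raises it from 0.58 to 0.81: when observation-level error dominates, only more observations buy meaningful reliability.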

PMID: 15674193 [PubMed - indexed for MEDLINE]