Can J Anaesth. 2001 Mar;48(3):225-33.

Validity and reliability of undergraduate performance assessments in an anesthesia simulator.

Author information

  • 1Department of Anesthesia, Sunnybrook and Women's College Health Sciences Centre, University of Toronto, Ontario, Canada. pam.morgan@utoronto.ca

Abstract

PURPOSE:

To examine the validity and reliability of performance assessment of undergraduate students using the anesthesia simulator as an evaluation tool.

METHODS:

After ethics approval and informed consent, 135 final-year medical students and 5 elective students participated in a videotaped simulator scenario with a Link-Med Patient Simulator (CAE-Link Corporation). Scenarios were based on published educational objectives of the undergraduate curriculum in anesthesia at the University of Toronto. During the simulator sessions, faculty followed a script guiding student interaction with the mannequin. Two faculty members independently viewed and evaluated each videotaped performance with a 25-point criterion-based checklist. Means and standard deviations of simulator-based marks were determined and compared with clinical and written evaluations received during the rotation. Internal consistency of the evaluation protocol was determined using inter-item and item-total correlations and correlations of specific simulator items to existing methods of evaluation.

RESULTS:

Mean reliability estimates for single and average paired assessments were 0.77 and 0.86, respectively. Means of simulator scores were low, and there was minimal correlation between the checklist and clinical marks (r = 0.13), the checklist and written marks (r = 0.19), and the clinical and written marks (r = 0.23). Inter-item and item-total correlations varied widely, and correlation between simulator items and existing evaluation tools was low.
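The reported jump from single-rater to averaged-pair reliability is consistent with the standard Spearman-Brown relationship, which predicts the reliability of a mean of k raters from the single-rater value. The abstract does not state which formula the authors used, so this is an illustrative check only:

```python
def spearman_brown(r_single, k=2):
    """Spearman-Brown prophecy: predicted reliability of the
    mean of k raters, given single-rater reliability r_single."""
    return k * r_single / (1 + (k - 1) * r_single)

# With the reported single-assessment reliability of 0.77 and two raters,
# the predicted averaged reliability is about 0.87, close to the
# reported 0.86 (the small gap is plausibly rounding in the inputs).
predicted = spearman_brown(0.77, k=2)
print(round(predicted, 2))
```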

CONCLUSIONS:

Simulator checklist scoring demonstrated acceptable reliability. Low correlation between different methods of evaluation may reflect reliability problems with the written and clinical marks, or that different aspects are being tested. The performance assessment demonstrated low internal consistency and further work is required.

PMID: 11305821 [PubMed - indexed for MEDLINE]