Acad Med. 2015 Feb;90(2):246-56. doi: 10.1097/ACM.0000000000000549.

Linking simulation-based educational assessments and patient-related outcomes: a systematic review and meta-analysis.

Author information

Dr. Brydges is assistant professor, Department of Medicine, University of Toronto, and scientist, Wilson Centre, University Health Network, Toronto, Ontario, Canada. Dr. Hatala is associate professor, Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada. Dr. Zendejas is a resident, Department of Surgery, Mayo Clinic College of Medicine, Rochester, Minnesota. Ms. Erwin is assistant professor of medical education, Mayo Clinic Libraries, Mayo Clinic College of Medicine, Rochester, Minnesota. Dr. Cook is professor of medicine and medical education and consultant, Division of General Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota.



PURPOSE: To examine the evidence supporting the use of simulation-based assessments as surrogates for patient-related outcomes assessed in the workplace.


METHOD: The authors systematically searched MEDLINE, EMBASE, Scopus, and key journals through February 26, 2013. They included original studies that assessed health professionals and trainees using simulation and then linked those scores with patient-related outcomes assessed in the workplace. Two reviewers independently extracted information on participants, tasks, validity evidence, study quality, patient-related and simulation-based outcomes, and magnitude of correlation. All correlations were pooled using random-effects meta-analysis.
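Pooling correlations under a random-effects model is typically done on the Fisher z scale, with between-study variance estimated by a method such as DerSimonian-Laird. The sketch below is an illustration of that general technique, not the authors' actual analysis code; the function name and the input data are hypothetical.

```python
import math

def pool_correlations(rs, ns):
    """Pool correlation coefficients with a DerSimonian-Laird
    random-effects model, working on the Fisher z scale.
    rs: per-study correlations; ns: per-study sample sizes."""
    # Fisher z-transform each correlation; sampling variance is 1/(n - 3).
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]
    # Fixed-effect (inverse-variance) estimate, needed for Cochran's Q.
    w = [1.0 / v for v in vs]
    sw = sum(w)
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sw
    # Cochran's Q and the DL estimate of between-study variance tau^2.
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)
    # Random-effects weights and pooled estimate on the z scale.
    w_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    # Back-transform the point estimate and 95% CI to the r scale.
    return (math.tanh(z_re),
            math.tanh(z_re - 1.96 * se),
            math.tanh(z_re + 1.96 * se))
```

For example, `pool_correlations([0.45, 0.55, 0.60], [30, 40, 25])` returns a pooled r with its confidence bounds, analogous in form to the pooled estimates reported in the Results.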


RESULTS: Of 11,628 potentially relevant articles, the 33 included studies enrolled 1,203 participants, including postgraduate physicians (n = 24 studies), practicing physicians (n = 8), medical students (n = 6), dentists (n = 2), and nurses (n = 1). The pooled correlation for provider behaviors was 0.51 (95% confidence interval [CI], 0.38 to 0.62; n = 27 studies); for time behaviors, 0.44 (95% CI, 0.15 to 0.66; n = 7); and for patient outcomes, 0.24 (95% CI, -0.02 to 0.47; n = 5). Most reported validity evidence was favorable, though studies often included only correlational evidence. Validity evidence of internal structure (n = 13 studies), content (n = 12), response process (n = 2), and consequences (n = 1) was reported less often. Three tools showed large pooled correlations and favorable (albeit incomplete) validity evidence.


CONCLUSIONS: Simulation-based assessments often correlate positively with patient-related outcomes. Although these surrogates are imperfect, tools with established validity evidence may replace workplace-based assessments for evaluating select procedural skills.

[Indexed for MEDLINE]
