J Surg Educ. 2018 Apr 11. pii: S1931-7204(18)30038-2. doi: 10.1016/j.jsurg.2018.03.009. [Epub ahead of print]

Evaluating Surgical Coaching: A Mixed Methods Approach Reveals More Than Surveys Alone.

Author information

1. Department of Surgery, Stanford School of Medicine, Stanford, California.
2. Department of Surgery, Beth Israel Deaconess Medical Center, Boston, Massachusetts; Center for Surgery and Public Health, Brigham and Women's Hospital, Boston, Massachusetts; Division of Pediatric Surgery, Ann & Robert H. Lurie Children's Hospital, Chicago, Illinois. Electronic address: yueyunghu@luriechildrens.org.
3. Center for Surgery and Public Health, Brigham and Women's Hospital, Boston, Massachusetts; Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Boston, Massachusetts; Department of Health Policy & Management, Harvard T. H. Chan School of Public Health, Boston, Massachusetts.
4. Center for Surgery and Public Health, Brigham and Women's Hospital, Boston, Massachusetts; Department of Surgery, Wisconsin Surgical Outcomes Research Program, University of Wisconsin Hospitals & Clinics, Madison, Wisconsin.
5. Center for Surgery and Public Health, Brigham and Women's Hospital, Boston, Massachusetts.
6. Center for Surgery and Public Health, Brigham and Women's Hospital, Boston, Massachusetts; Department of Health Policy & Management, Harvard T. H. Chan School of Public Health, Boston, Massachusetts; Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts.
7. Center for Surgery and Public Health, Brigham and Women's Hospital, Boston, Massachusetts; Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts.
8. Center for Surgery and Public Health, Brigham and Women's Hospital, Boston, Massachusetts; Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts; STRATUS Center for Medical Simulation, Brigham & Women's Hospital, Boston, Massachusetts.

Abstract

OBJECTIVE:

Traditionally, surgical educators have relied upon participant survey data for the evaluation of educational interventions. However, the ability of such subjective data to completely evaluate an intervention is limited. Our objective was to compare resident and attending surgeons' self-assessments of coaching sessions from surveys with independent observations from analysis of intraoperative and postoperative coaching transcripts.

DESIGN:

Senior residents were video-recorded operating. Each was then coached by the operative attending in a 1:1 video review session. Teaching points made in the operating room (OR) and in the post-OR coaching sessions were coded by independent observers using dialogue analysis and then compared using t-tests. Participants were surveyed regarding the degree of teaching dedicated to specific topics and perceived changes in teaching level, resident comfort, educational assessments, and feedback provision between the OR and the post-OR coaching sessions.
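A minimal sketch of the kind of paired comparison described above, assuming each resident contributes one coded teaching-point count per setting; the counts and variable names here are hypothetical placeholders, not data from the study.

```python
# Sketch: paired t-test comparing coded teaching points per resident
# in the OR versus the post-OR coaching session (hypothetical counts).
from scipy import stats

or_teaching_points = [4, 6, 5, 3, 7, 5, 4, 6, 5, 4]            # per-resident counts in the OR
coaching_teaching_points = [9, 11, 8, 7, 12, 10, 9, 11, 8, 9]   # counts in post-OR coaching

# Paired test, since the same resident is observed in both settings
t_stat, p_value = stats.ttest_rel(coaching_teaching_points, or_teaching_points)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```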

SETTING:

A single, large, urban, tertiary-care academic institution.

PARTICIPANTS:

Ten PGY-4 to PGY-5 general surgery residents and 10 attending surgeons.

RESULTS:

Although the reported experiences of the teaching and coaching sessions were similar between residents and faculty (Pearson correlation coefficient = 0.88), these reports differed significantly from the independent observations. Observers found that residents initiated a greater proportion of teaching points and had more educational needs assessments during coaching than in the OR. However, neither residents nor attendings reported a change between the 2 environments with regard to needs assessments or comfort with asking questions or making suggestions. The only metric on which residents, attendings, and observers agreed was the provision of feedback.
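A brief sketch of how resident-attending agreement on survey items might be quantified with a Pearson correlation, as reported above; the ratings below are hypothetical illustrations only.

```python
# Sketch: Pearson correlation between resident and attending survey ratings
# (hypothetical Likert-scale values, not study data).
from scipy import stats

resident_ratings = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4]
attending_ratings = [4, 5, 3, 5, 5, 4, 3, 4, 4, 4]

r, p_value = stats.pearsonr(resident_ratings, attending_ratings)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```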

CONCLUSIONS:

Participants' perspectives, although considered highly reliable by traditional metrics, rarely aligned with analysis of the associated transcripts from independent observers. Independent observation showed a distinct benefit of coaching in terms of frequency and type of learning points. These findings highlight the importance of seeking different perspectives, data sources, and methodologies when evaluating clinical education interventions. Surgical education can benefit from increased use of dialogue analyses performed by independent observers, which may represent a viewpoint distinct from that obtained by survey methodology.

KEYWORDS:

Medical Knowledge; Patient Care; Practice Based Learning and Improvement; dialogue analysis; educational assessment; nontechnical skill training; surgical coaching; surgical resident education; technical skill training
