AJR Am J Roentgenol. 2017 Aug;209(2):351-357. doi: 10.2214/AJR.16.17439. Epub 2017 May 24.

Development of a Standardized Kalamazoo Communication Skills Assessment Tool for Radiologists: Validation, Multisource Reliability, and Lessons Learned.

Author information

1 Department of Radiology, Boston Children's Hospital and Harvard Medical School, 300 Longwood Ave, Boston, MA 02115.
2 Institute for Professionalism and Ethical Practice, Boston Children's Hospital, Boston, MA.
3 Department of Medicine, Division of General Pediatrics, Boston Children's Hospital and Harvard Medical School, Boston, MA.
4 Simulator Program, Boston Children's Hospital, Boston, MA.
5 Department of Psychiatry, Harvard Medical School, Boston, MA.
6 Department of Radiology, University of Massachusetts Medical School, Worcester, MA.
7 Department of Medicine, Division of Adolescent Medicine, Boston Children's Hospital and Harvard Medical School, Boston, MA.
8 Cotting School, Lexington, MA.
9 Clinical Research Program, Biostatistics Core, Boston Children's Hospital, Boston, MA.
10 Department of Anesthesiology, Perioperative and Pain Medicine, Boston Children's Hospital and Harvard Medical School, Boston, MA.

Abstract

OBJECTIVE:

The purpose of this study was to develop and test a standardized communication skills assessment instrument for radiology.

MATERIALS AND METHODS:

The Delphi method was used to validate the Kalamazoo Communication Skills Assessment instrument for radiology by revising and achieving consensus on the 43 items of the preexisting instrument among an interdisciplinary team of experts consisting of five radiologists and four nonradiologists (two men, seven women). Reviewers assessed the applicability of the instrument to evaluating conversations between radiology trainees and trained actors portraying concerned parents in enactments about bad news, radiation risks, and diagnostic errors, which were video recorded during a communication workshop. Interrater reliability was assessed by using the revised instrument to rate a series of video-recorded enactments between trainees and actors in a hospital-based simulator center. Eight raters evaluated each of seven different video-recorded interactions between physicians and parent-actors.

RESULTS:

The final instrument contained 43 items. After three review rounds, 42 of 43 (98%) items had an average rating of relevant or very relevant for bad news conversations. All items were rated relevant or very relevant for conversations about error disclosure and radiation risk. Reliability and rater agreement measures were moderate. Intraclass correlation coefficients ranged from 0.07 to 0.58 (mean, 0.30; SD, 0.13; median, 0.30). Weighted kappa values ranged from 0.03 to 0.47 (mean, 0.23; SD, 0.12; median, 0.22). Ratings varied significantly among conversations (χ² = 1186, df = 6; p < 0.0001) and also varied significantly by viewing order, rater type, and rater sex.

CONCLUSION:

The adapted communication skills assessment instrument is highly relevant for radiology and has moderate interrater reliability. These findings have important implications for assessing the relational competencies of radiology trainees.

KEYWORDS:

communication assessment; communication competency; communication education; radiology; simulation

PMID: 28537754
DOI: 10.2214/AJR.16.17439
[Indexed for MEDLINE]
