AJR Am J Roentgenol. 2013 Jan;200(1):132-7. doi: 10.2214/AJR.12.9580.

Peer review comments augment diagnostic error characterization and departmental quality assurance: 1-year experience from a children's hospital.

Author information

Department of Radiology, Seattle Children's Hospital, 4800 Sand Point Way NE, Seattle, WA 98105, USA. riyer@uw.edu



OBJECTIVE. The objective of our study was to categorize radiologist peer review comments and evaluate their functions within the context of a comprehensive quality assurance (QA) program.


MATERIALS AND METHODS. All randomly entered radiology peer review comments at our institution were compiled over a 1-year period (January 1, 2011, through December 31, 2011). A commercially available Web-based software package was used to query the comments, which were then exported into a spreadsheet. Each comment was then placed into the single most appropriate category based on the consensus decision of two board-certified pediatric radiologists. The QA score associated with each comment was recorded.


RESULTS. A total of 427 peer review comments were evaluated. The majority of comments (85.9%) were entered voluntarily with QA scores of 1. A classification system was devised that augments traditional error classification. Seven broad comment categories were identified: errors of observation (25.5%), errors of interpretation (5.6%), inadequate patient data gathering (3.7%), errors of communication (9.6%), interobserver variability (21.3%), informational and educational feedback (23.0%), and complimentary (11.2%).


CONCLUSION. Comment-enhanced peer review expands traditional diagnostic error classification, may identify errors that were under-scored, provides continuous educational feedback for participants, and promotes a collegial environment.

[PubMed - indexed for MEDLINE]