Child Abuse Negl. 2017 Oct;72:140-146. doi: 10.1016/j.chiabu.2017.08.001. Epub 2017 Aug 10.

Inter-rater reliability of physical abuse determinations in young children with fractures.

Author information

Department of Orthopaedics and Rehabilitation, Yale University School of Medicine, 800 Howard Avenue, New Haven, CT 06510, USA.
Department of Pediatrics, Yale University School of Medicine, P.O. Box 208064, 333 Cedar Street, New Haven, CT 06520-8064, USA.
Yale Center for Medical Informatics, 300 George Street, Suite 501, New Haven, CT 06511, USA.
Department of Surgery, Greenville Health System, 701 Grove Road, Greenville, SC 29605, USA.
Department of Orthopaedics and Rehabilitation, Yale University School of Medicine, 800 Howard Avenue, New Haven, CT 06510, USA.
Department of Pediatrics, Connecticut Children's Medical Center, 282 Washington Street, Hartford, CT 06106, USA.
Department of Radiology and Biomedical Imaging, Yale University School of Medicine, P.O. Box 208042, 333 Cedar Street, New Haven, CT 06520-8042, USA.


As there is no "gold standard" for determining whether a fracture is caused by accident or abuse, agreement among medical providers is paramount. Using abstracted medical record data from 551 children <36 months of age presenting to a pediatric emergency department, we examined the extent of agreement between specialists who evaluate children with fractures for suspected abuse. To simulate clinical scenarios, two pediatric orthopaedists and two child abuse pediatricians (CAPs) reviewed the full abstraction and imaging, whereas the two pediatric radiologists reviewed a brief history and imaging. Each physician independently rated each case using a 7-point ordinal scale designed to distinguish accidental from abusive injuries. For any discrepancy in independent ratings, the two specialists discussed the case and came to a joint rating. We analyzed three types of agreement: (1) within specialties using independent ratings, (2) between specialties using joint ratings, and (3) between clinicians (orthopaedists and CAPs) with more versus less experience. Agreement between pairs of raters was assessed using Cohen's weighted kappa. Orthopaedists (κ=0.78) and CAPs (κ=0.67) had substantial within-specialty agreement, while radiologists (κ=0.53) had moderate agreement. Orthopaedists and CAPs had almost perfect between-specialty agreement (κ=0.81), while agreement was much lower between orthopaedists and radiologists (κ=0.37) and between CAPs and radiologists (κ=0.42). More-experienced clinicians had substantial between-specialty agreement (κ=0.80), whereas less-experienced clinicians had moderate agreement (κ=0.60). These findings suggest that the level of clinical detail a physician receives and his or her experience in the field have an impact on the level of agreement when evaluating fractures in young children.
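For readers unfamiliar with the statistic, Cohen's weighted kappa penalizes disagreements between two raters in proportion to their distance on an ordinal scale, so a one-step discrepancy on the 7-point scale counts less than a six-step discrepancy. The sketch below is an illustrative, self-contained implementation with linear disagreement weights; it is not the study's analysis code, and the example ratings are hypothetical.

```python
from typing import Sequence, List

def weighted_kappa(ratings_a: Sequence, ratings_b: Sequence,
                   categories: Sequence, weights: str = "linear") -> float:
    """Cohen's weighted kappa for two raters on an ordinal scale.

    kappa_w = 1 - (sum of weighted observed disagreement) /
                  (sum of weighted chance-expected disagreement)
    with disagreement weight w(i, j) = |i - j| / (k - 1) for linear
    weights, or its square for quadratic weights.
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(ratings_a)
    # Observed joint proportions: obs[i][j] = P(rater A gives i, rater B gives j).
    obs: List[List[float]] = [[0.0] * k for _ in range(k)]
    for a, b in zip(ratings_a, ratings_b):
        obs[idx[a]][idx[b]] += 1.0 / n
    # Marginal proportions for each rater.
    pa = [sum(row) for row in obs]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    def w(i: int, j: int) -> float:
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    observed = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Hypothetical example: two raters scoring five cases on a 7-point scale.
scale = [1, 2, 3, 4, 5, 6, 7]
kappa = weighted_kappa([1, 2, 7, 1, 4], [1, 3, 7, 2, 4], scale)
```

Under the conventional Landis–Koch benchmarks used in the abstract, values of 0.41–0.60 are read as moderate, 0.61–0.80 as substantial, and above 0.80 as almost perfect agreement.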


Keywords: Fracture; Inter-rater reliability; Physical child abuse

[Indexed for MEDLINE]
