Spine (Phila Pa 1976). 2005 Aug 15;30(16):E455-8.

Interrater reliability of scoring of pain drawings in a self-report health survey.

Author information

  • 1. Primary Care Sciences Research Centre, Keele University, Keele, Staffordshire ST5 5BG, United Kingdom.



STUDY DESIGN: Study of interrater reliability.


OBJECTIVES: To assess the interrater reliability of data from pain drawings scored by multiple raters and the consistency of the subsequent classification of cases of widespread pain.


SUMMARY OF BACKGROUND DATA: In large health surveys, pain drawings used to capture self-reported pain, and to classify cases of widespread pain, are often scored by several raters. The reliability of multiple-rater scoring of pain drawings has not been investigated.


METHODS: As part of a postal survey sent to adults 50 years and older, subjects were asked to shade their pain on a blank body manikin. The first 50 pain drawings in which respondents had shaded pain were selected for this study. Eight nonclinical staff were trained to score pain drawings using transparent templates divided into 50 body areas. Interrater reliability was assessed by comparing the scoring of "pain" or "no pain" for all 50 areas of each pain drawing.


RESULTS: Complete scoring agreement among all raters was observed for at least 78% of pain drawings across all body areas (kappa > 0.60). The raters had complete agreement in 42 of 50 areas in 90% or more of pain drawings. From the raters' scoring of pain areas, there was complete agreement on the presence or absence of widespread pain for 49 of 50 pain drawings (98% agreement, kappa = 0.98).


CONCLUSIONS: This study shows that multiple raters, with training and guidelines, can reliably score pain drawings, and that high consistency in the subsequent classification of cases of widespread pain can be obtained from such data.

[PubMed - indexed for MEDLINE]