J Am Med Inform Assoc. 1999 Mar-Apr;6(2):143-50.

A reliability study for evaluating information extraction from radiology reports.

Author information

  • Columbia University, New York, New York, USA. hripcsak@columbia.edu

Abstract

GOAL:

To assess the reliability of a reference standard for an information extraction task.

SETTING:

Twenty-four physician raters from two sites and two specialties judged whether clinical conditions were present based on reading chest radiograph reports.

METHODS:

Variance components, generalizability (reliability) coefficients, and the number of expert raters needed to generate a reliable reference standard were estimated.

RESULTS:

Per-rater reliability, averaged across conditions, was 0.80 (95% CI, 0.79-0.81). Reliability for the nine individual conditions ranged from 0.67 to 0.97; central line presence and pneumothorax were the most reliable, and pleural effusion (excluding CHF) and pneumonia the least. One to two raters were needed to achieve a reliability of 0.70, and six raters, on average, were required to achieve a reliability of 0.95. This was far more reliable than a previously published per-rater reliability of 0.19 on a more complex task. Differences between sites were attributable to changes in the condition definitions.
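The relationship between single-rater reliability and the number of raters needed can be illustrated with the classical Spearman-Brown prophecy formula, which coincides with the generalizability-theory result for a simple one-facet design. This is a minimal sketch of that calculation, not the authors' actual variance-components analysis; the study's "six raters on average" figure comes from averaging per-condition estimates, so the numbers below are illustrative only.

```python
import math

def spearman_brown(r_single, k):
    """Reliability of the mean of k raters, given single-rater reliability r_single."""
    return k * r_single / (1 + (k - 1) * r_single)

def raters_needed(r_single, target):
    """Smallest number of raters whose averaged judgments reach the target reliability."""
    k = target * (1 - r_single) / (r_single * (1 - target))
    return math.ceil(k)
```

For example, with the reported average per-rater reliability of 0.80, a single rater already exceeds the 0.70 threshold, while roughly five raters reach 0.95 under this simplified formula, consistent in spirit with the one-to-two and six-rater figures reported per condition.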

CONCLUSION:

In these evaluations, physician raters were able to judge the presence of clinical conditions very reliably from text reports. Once the reliability of a specific rater is confirmed, that rater could create a reference standard reliable enough to assess aggregate measures of a system. Six raters would be needed to create a reference standard sufficient to assess a system on a case-by-case basis. These results should help evaluators design future information extraction studies for natural language processors and other knowledge-based systems.

PMID: 10094067 [PubMed - indexed for MEDLINE]
PMCID: PMC61353