J Clin Epidemiol. 2009 Dec;62(12):1292-300. doi: 10.1016/j.jclinepi.2009.02.007. Epub 2009 May 17.

Differences between univariate and bivariate models for summarizing diagnostic accuracy may not be large.

Author information

  • Department of Medicine, Durham Veterans Affairs Medical Center, NC 27705, USA. david.simel@duke.edu

Abstract

OBJECTIVE:

Experts recommend random-effects bivariate logit-normal estimates of sensitivity and specificity, rather than directly summarized univariate likelihood ratios (LRs), for diagnostic test meta-analyses. We assessed whether bivariate measures might lead to different clinical conclusions than simpler univariate measures.

STUDY DESIGN:

From two articles that described the benefits of bivariate random effects measures, we reanalyzed results and compared outcomes to univariate random effects summary estimates of sensitivity, specificity, and LRs. We also reanalyzed data from two published clinical examination studies to assess differences in the two methods.

RESULTS:

The median difference between bivariate and univariate methods for sensitivity was 1.5% (range: 0-6%) and for specificity was 1.5% (range: 0-4%). Using a pretest probability of 50%, the median difference in posterior probability was 2.5% (interquartile range: 2.2-3.2%, overall range: 0-11%). For sparse data, continuity adjustment affected the differences. Adding 0.5 to each cell of studies containing at least one cell with zero patients provided the most consistent result.
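The posterior-probability comparison above rests on standard Bayes arithmetic: convert the pretest probability to odds, multiply by the likelihood ratio, and convert back. The sketch below illustrates that calculation, along with the 0.5 continuity adjustment the abstract describes for sparse 2x2 tables; the sensitivity/specificity values are illustrative assumptions, not the paper's data.

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios from sensitivity/specificity."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

def posttest_probability(pretest_prob, lr):
    """Apply a likelihood ratio to a pretest probability via odds."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * lr
    return posttest_odds / (1 + posttest_odds)

def continuity_correct(tp, fp, fn, tn):
    """Add 0.5 to every cell of a 2x2 table if any cell is zero --
    the adjustment the abstract found most consistent for sparse data."""
    if 0 in (tp, fp, fn, tn):
        return tuple(x + 0.5 for x in (tp, fp, fn, tn))
    return (tp, fp, fn, tn)

# Illustrative numbers (assumed): sensitivity 0.80, specificity 0.90,
# pretest probability 0.50 as used in the abstract's comparison.
lr_pos, lr_neg = likelihood_ratios(0.80, 0.90)   # LR+ = 8.0
p = posttest_probability(0.50, lr_pos)           # posterior ~0.89
```

With a pretest probability of 50% the odds are 1, so the posterior probability is simply LR / (1 + LR); small shifts in the summary LR therefore translate into the few-percentage-point posterior differences reported above.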

CONCLUSIONS:

Bivariate estimates of sensitivity and specificity generate summary LRs similar to those derived with univariate methods. Our empirical results suggest that recalculating LRs in published research is unlikely to produce dramatic changes as a function of the random-effects method chosen.
