J Clin Epidemiol. 2005 Jan;58(1):1-12.

A systematic review finds that diagnostic reviews fail to incorporate quality despite available tools.

Author information

1. Centre for Reviews and Dissemination, University of York, United Kingdom. penny.whiting@bristol.ac.uk

Abstract

BACKGROUND AND OBJECTIVE:

To review existing quality assessment tools for diagnostic accuracy studies and to examine to what extent quality was assessed and incorporated in diagnostic systematic reviews.

METHODS:

Electronic databases were searched for tools to assess the quality of studies of diagnostic accuracy or guides for conducting, reporting or interpreting such studies. The Database of Abstracts of Reviews of Effects (DARE; 1995-2001) was used to identify systematic reviews of diagnostic studies to examine the practice of quality assessment of primary studies.

RESULTS:

Ninety-one quality assessment tools were identified. Only two provided details of tool development, and only a small proportion provided any indication of the aspects of quality they aimed to assess. None of the tools had been systematically evaluated. We identified 114 systematic reviews, of which 58 (51%) had performed an explicit quality assessment and were further examined. The majority of reviews used more than one method of incorporating quality.

CONCLUSION:

Most tools to assess the quality of diagnostic accuracy studies do not start from a well-defined definition of quality. None has been systematically evaluated. The majority of existing systematic reviews fail to take differences in quality into account. Reviewers should consider quality as a possible source of heterogeneity.

PMID: 15649665
DOI: 10.1016/j.jclinepi.2004.04.008
[Indexed for MEDLINE]
