Semin Hematol. 2008 Jul;45(3):160-6. doi: 10.1053/j.seminhematol.2008.04.010.

Perfect study, poor evidence: interpretation of biases preceding study design.

Author information

1. Clinical and Molecular Epidemiology Unit, Department of Hygiene and Epidemiology, University of Ioannina School of Medicine and Biomedical Research Institute, Foundation for Research and Technology-Hellas, Ioannina, Greece. jioannid@cc.uoi.gr

Abstract

In the interpretation of research evidence, data that have been accumulated in a specific isolated study are typically examined. However, important biases may precede the study design. A study may be misleading, useless, or even harmful, even though it seems to be perfectly designed, conducted, analyzed, and reported. Some biases pertain to setting the wider research agenda and include poor scientific relevance, minimal clinical utility, or failure to consider prior evidence (non-consideration of prior evidence, biased consideration of prior evidence, or consideration of biased prior evidence). Other biases reflect issues in setting the specific research questions: examples include straw man effects, avoidance of head-to-head comparisons, head-to-head comparisons bypassing demonstration of effectiveness, overpowered studies, unilateral aims (focusing on benefits and neglecting harms), and the approach of the industry towards research as bulk advertisement (including ghost management of the literature). The concerted presence of such biases may have a multiplicative, detrimental impact on the scientific literature. These issues should be considered carefully when interpreting research results.

[Indexed for MEDLINE]
