Int J Technol Assess Health Care. 2006 Summer;22(3):288-94.

Case study of the comparison of data from conference abstracts and full-text articles in health technology assessment of rapidly evolving technologies: does it make a difference?

Author information

Liverpool Reviews and Implementation Group, Faculty of Medicine, University of Liverpool, UK.



OBJECTIVES: The aim of this study was to examine (i) the consistency of research findings as reported in conference abstracts and presentations versus subsequent full publications, (ii) the extent to which the methodological quality of trials can be judged from conference abstracts and presentations, and (iii) the effect of including or excluding data from these sources on the pooled effect estimates in a meta-analysis.


METHODS: This report is a case study of a selected health technology assessment review (TAR) of a rapidly evolving technology, in which a meta-analysis had identified and included trial data from conference abstracts and presentations.


RESULTS: The overall quality of reporting in conference abstracts and presentations was poor, especially in abstracts, with data reported incompletely or inconsistently. Inconsistencies arose most often between conference slide presentations and the data reported in published full-text articles. Sensitivity analyses indicated that using data only from published papers would not have altered the direction of any of the results compared with using both published and abstract data; however, the statistical significance of three of ten results would have changed. If conference abstracts and presentations had been excluded from the early analysis, both the direction of effect and the statistical significance would have changed in one result. The overall conclusions of the original analysis would not have been altered.


CONCLUSIONS: There are inconsistencies between data presented in conference abstracts/presentations and data reported in subsequent full publications, and these inconsistencies could affect the final assessment results. Data discrepancies identified across sources included in TARs should be highlighted, and their impact assessed and discussed. Sensitivity analyses should be carried out both with and without abstract/presentation data included in the analysis. Incomplete reporting in conference abstracts and presentations limits reviewers' ability to confidently assess the methodological quality of trials.

[Indexed for MEDLINE]