BMJ. Jul 10, 1999; 319(7202): 120.
PMCID: PMC1116199
Economics notes

Handling uncertainty in economic evaluation

Andrew Briggs, joint MRC/Anglia and Oxford region training fellow

It is increasingly common to find economic evaluations that report results in terms of cost per life year or cost per quality adjusted life year (QALY) grouped together for comparison in so called cost effectiveness league tables. It is worrying, however, that the rankings in such league tables are made on the basis of point estimates of cost effectiveness, without any consideration of the uncertainty inherent in their calculation. Consider the figure, where the point estimates for seven different healthcare interventions are plotted in increasing order of magnitude. What is immediately apparent is that the range of uncertainty (shown by the size of the bars around the point estimates) is such that a ranking completely different from that given by the order of the point estimates is possible. If policymakers are to be fully informed it is imperative that analysts estimate the uncertainty inherent in their results rather than simply presenting point estimates. These interval estimates should accompany point estimates when reproduced in league tables.

Uncertainty in economic evaluation is pervasive, entering the evaluative process at every stage. It is useful to distinguish uncertainty related to the data requirements of a study from uncertainty related to the process of evaluation. Uncertainty in the data requirements of a study arises through natural variation in populations, which means that estimates based on samples drawn from a population will always carry a level of uncertainty that is inversely related to sample size. Examples of uncertainty due to the evaluative process include the need to extrapolate when conducting evaluations, say from a clinical outcome measure (such as cholesterol lowering) to a health outcome measure (reduced morbidity and mortality due to heart disease); uncertainty related to generalising from the context of the study to other contexts and patient populations; and uncertainty related to the choice of analytical methods—for example, whether to include indirect costs in the analysis.2

The traditional method for handling uncertainty due to sampling variation in many forms of evaluation, most notably clinical evaluation, has been statistical analysis. Where patient specific resource use and health outcome data have been collected (for example, as part of a prospective clinical trial), statistical techniques have been developed to calculate confidence intervals around point estimates of cost effectiveness, although the methods required to estimate confidence limits for a ratio statistic are less straightforward than for many other statistics.3,4
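One common way round the awkwardness of a ratio statistic is to resample the patient-level data rather than rely on a closed-form formula. The sketch below shows a percentile bootstrap interval for an incremental cost effectiveness ratio; the function name and all the numbers are invented for illustration and are not taken from this paper or its references.

```python
import random
import statistics

def bootstrap_icer_ci(costs_new, effects_new, costs_old, effects_old,
                      n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for an incremental
    cost effectiveness ratio (mean cost difference / mean effect difference)."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_boot):
        # Resample each trial arm with replacement and recompute the ratio.
        cn = [rng.choice(costs_new) for _ in costs_new]
        en = [rng.choice(effects_new) for _ in effects_new]
        co = [rng.choice(costs_old) for _ in costs_old]
        eo = [rng.choice(effects_old) for _ in effects_old]
        d_cost = statistics.mean(cn) - statistics.mean(co)
        d_effect = statistics.mean(en) - statistics.mean(eo)
        if d_effect != 0:              # the ratio is undefined at zero effect
            ratios.append(d_cost / d_effect)
    ratios.sort()
    lower = ratios[int((alpha / 2) * len(ratios))]
    upper = ratios[int((1 - alpha / 2) * len(ratios)) - 1]
    return lower, upper
```

When the effect difference can cross zero in resamples, the ratio becomes unstable and a simple percentile interval can mislead, which is part of why methods for ratio statistics are described above as less straightforward than for many other statistics.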

In practice, however, few economic evaluations are conducted alongside clinical trials. Instead, data are more likely to be synthesised from several sources, including literature reviews, hospital records, and even clinical judgments. Hence, standard statistical methods cannot be used. Moreover, even when statistical analysis is possible, the remaining levels of uncertainty that are not related to sampling variation need quantifying. To do this, sensitivity analysis is used, which involves systematically examining the influence of the variables and assumptions used in an evaluation.5

The term sensitivity analysis encompasses several techniques, and it is useful to distinguish three approaches. A one way sensitivity analysis systematically examines the impact of each variable in the study by varying it across a plausible range of values while holding all other variables constant at their “best estimate” or baseline value. In contrast, an extreme scenario analysis involves setting each variable simultaneously to take its most optimistic (or pessimistic) value in order to generate a best (or worst) case scenario. In real life, of course, the components of an evaluation do not vary in isolation, nor are they perfectly correlated, so one way sensitivity analyses will probably underestimate, and extreme scenario analyses overestimate, the uncertainty associated with the results of economic evaluation. A third technique, known as probabilistic sensitivity analysis and based on a large number of Monte Carlo simulations, examines the effect on the results of an evaluation when the underlying variables are allowed to vary simultaneously across a plausible range according to predefined distributions. These probabilistic analyses may be expected to produce a more realistic interval.6
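The contrast between the three approaches can be made concrete with a toy two-parameter cost-per-QALY model. Every name, baseline value, range, and distribution below is invented for the sketch; a real probabilistic analysis would draw its distributions from the study data, as described in reference 6.

```python
import random

# Baseline values and plausible ranges for a toy cost-per-QALY model
# (all numbers are invented for illustration).
BASE = {"cost": 1000.0, "qaly_gain": 0.12}
RANGE = {"cost": (800.0, 1500.0), "qaly_gain": (0.05, 0.25)}

def ratio(params):
    """Cost per QALY for a given set of parameter values."""
    return params["cost"] / params["qaly_gain"]

def one_way(var, steps=5):
    """One way analysis: sweep a single input across its plausible
    range while holding every other input at its baseline value."""
    lo, hi = RANGE[var]
    results = []
    for i in range(steps):
        p = dict(BASE)
        p[var] = lo + (hi - lo) * i / (steps - 1)
        results.append(ratio(p))
    return min(results), max(results)

def probabilistic(n_sims=10000, seed=42):
    """Probabilistic analysis: draw all inputs simultaneously from
    triangular distributions and report a 95% interval for the ratio."""
    rng = random.Random(seed)
    sims = sorted(
        ratio({"cost": rng.triangular(*RANGE["cost"], BASE["cost"]),
               "qaly_gain": rng.triangular(*RANGE["qaly_gain"],
                                           BASE["qaly_gain"])})
        for _ in range(n_sims)
    )
    return sims[int(0.025 * n_sims)], sims[int(0.975 * n_sims) - 1]
```

Setting cost to 1500 and the QALY gain to 0.05 at the same time gives the extreme scenario worst case (30 000 per QALY in this toy model), which is wider than either the one way sweep or the probabilistic interval, illustrating the over- and underestimation described above.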

Figure: Point estimates of cost effectiveness together with estimates of the associated uncertainty1

Footnotes

These notes are edited by James Raftery (J.P.RAFTERY@bham.ac.uk)

This is the 9th note in the series

References

1. Petrou S, Malek M, Davey PG. The reliability of cost-utility estimates in cost-per-QALY league tables. PharmacoEconomics. 1993;3:345–353.
2. Briggs AH, Sculpher MJ, Buxton MJ. Uncertainty in the economic evaluation of health care technologies: the role of sensitivity analysis. Health Economics. 1994;3:95–104.
3. O’Brien BJ, Drummond MF, Labelle RJ, Willan A. In search of power and significance: issues in the design and analysis of stochastic cost-effectiveness studies in health care. Medical Care. 1994;32:150–163.
4. Briggs AH, Fenn P. Confidence intervals or surfaces? Uncertainty on the cost effectiveness plane. Health Economics. 1998;7:723–740.
5. Briggs A. Handling uncertainty in the results of economic evaluation. London: Office of Health Economics; 1995.
6. Manning WG, Fryback DG, Weinstein MC. Reflecting uncertainty in cost-effectiveness analysis. In: Gold MR, Siegel JE, Russell LB, Weinstein MC, editors. Cost effectiveness in health and medicine. New York: Oxford University Press; 1996.
