Appl Health Econ Health Policy. 2011 Jul 1;9(4):225-41. doi: 10.2165/11590480-000000000-00000.

Ordering errors, objections and invariance in utility survey responses: a framework for understanding who, why and what to do.

Author information

Schneider Institutes for Health Policy, Heller School for Social Policy and Management, Brandeis University, Waltham, MA 02454-9110, USA.



BACKGROUND: Utilities quantify the perceived quality of life associated with a health state. They are used to calculate QALYs, the outcome measure in cost-utility analysis. Generally measured through surveys of individuals, utilities often contain apparent or unapparent errors that can bias the resulting values and the QALYs calculated from them.
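For context, the arithmetic linking utilities to QALYs can be sketched as follows. This is a minimal illustration with hypothetical values, not data or code from the article: a QALY total multiplies each health state's utility weight (anchored at 1.0 for full health and 0.0 for death) by the time spent in that state.

```python
# Illustrative QALY arithmetic (hypothetical values, not from the article).
# Utilities are anchored at 1.0 (full health) and 0.0 (death);
# total QALYs = sum over health states of (utility * years in state).
def total_qalys(states):
    """states: list of (utility, years) pairs."""
    return sum(u * t for u, t in states)

# Five years at utility 0.8 followed by five years at utility 0.5:
print(total_qalys([(0.8, 5), (0.5, 5)]))  # 6.5
```

A biased utility propagates directly into this sum, which is why the error and objection responses discussed below matter for cost-utility analysis.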


OBJECTIVE: The aim of this study was to improve direct health utility elicitation methodology by identifying the types of survey responses that indicate errors and objections, and the reasons underlying them.


METHODS: We conducted a systematic review of the medical (PubMed), economics (EconLit) and psychology (PsycINFO) literature from 1975 through June 2010 for articles describing the types and frequency of errors and objections in directly elicited utility survey responses, and strategies to address these responses. Primary data were collected through an internet-based utility survey (standard gamble) of community members to identify responses that indicate error or objections. A qualitative telephone survey was conducted among a subset of respondents with these types of responses using an open-ended protocol to elicit their rationales.


RESULTS: A total of 11 papers specifically devoted to errors, objections and invariance in utility responses have been published since the mid-1990s. Error/objection responses can be broadly categorized into ordering errors (which include illogical and inconsistent responses) and objections/invariance (which include missing data, protest responses and refusals to trade time or risk in utility questions). Reported frequencies of respondents making ordering errors ranged from 5% to 100%, and up to 35% of respondents have been reported as objecting to the survey or task in some manner. Changes in the design, administration and analysis of surveys can address these potentially problematic responses. Survey data (n = 398) showed that individuals who provided invariant responses (n = 26) reported the lowest level of difficulty with the survey and often identified as religious (23% of invariant responders found the survey difficult vs 63% of all responders; 77% of invariant responders identified as religious compared with 56% of the entire sample; p < 0.05 for both). Respondents who provided illogical responses (n = 50) were less likely to be college educated (56% of illogical responders vs 73% of the entire sample; p < 0.05) and less likely to be confident in their responses (62% vs 75% of the entire sample; p < 0.05). Qualitative interviews (n = 42) following the survey revealed that the majority of ordering errors resulted from confusion, lack of attention or difficulty in responding to the survey, while invariant responses were often considered, thoughtful reactions to the premise of valuing health using the standard gamble task.
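The two response categories above can be made concrete with a small screening sketch. This is a hypothetical illustration (the function and the screening rules are ours, not taken from the study's instrument): an ordering error is flagged when a state the respondent ranked as worse receives a higher utility than a better-ranked state, and an invariant response is flagged when every state receives the same value, as in a blanket refusal to trade risk.

```python
# Hypothetical screen for problematic standard-gamble responses
# (rules and names are illustrative, not from the study's survey).
def classify(utilities_by_rank):
    """utilities_by_rank: utilities listed from best-ranked to worst-ranked
    health state. Returns a set of flags: 'ordering_error' and/or 'invariant'."""
    flags = set()
    # Ordering error: a worse-ranked state valued above a better-ranked one.
    if any(later > earlier for earlier, later in
           zip(utilities_by_rank, utilities_by_rank[1:])):
        flags.add("ordering_error")
    # Invariant: every state valued identically (e.g. a refusal to trade risk).
    if len(set(utilities_by_rank)) == 1:
        flags.add("invariant")
    return flags

print(classify([0.9, 0.6, 0.7]))  # {'ordering_error'}
print(classify([1.0, 1.0, 1.0]))  # {'invariant'}
```

As the qualitative findings suggest, a screen like this can only flag the response pattern; it cannot distinguish a confused respondent from a deliberate objector, which is why follow-up interviews were needed.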


CONCLUSIONS: Rationales for error/objection responses include difficulty in articulating preferences or misunderstanding of a complex survey task, but also thoughtful and considered protests against the task itself. Mechanisms to correct unintentional errors may be useful, but they cannot address intentional responses to elements of the measurement task. Identifying and analysing the prevalence of errors and objections in utility data sets is essential to understanding the accuracy and precision of utility estimates and the analyses that depend on them.

[Indexed for MEDLINE]
