NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Gottfried JA, editor. Neurobiology of Sensation and Reward. Boca Raton (FL): CRC Press; 2011.


Chapter 16: The Neurology of Value


16.1. INTRODUCTION

Many disorders of the central nervous system affect judgment and decision making. Along with other impairments of “executive function,” these symptoms have been challenging to understand within a classical neurological-localizationist framework, beyond a general link to the frontal lobes. Notwithstanding these difficulties, the fact that decision making deficits can emerge in neurological diseases provides a starting point for determining the neural circuits that underlie them. This method of enquiry has several advantages: from a clinical perspective, a better understanding of brain-behavior relationships can help in the diagnosis, prognosis, and treatment of disordered decision making. From a basic science perspective, this approach provides the capacity to test two main questions. The first concerns the behaviors themselves and the second the brain substrates of those behaviors.

Franz-Joseph Gall and his phrenologist colleagues were prepared to localize behaviors as complex as “benevolence” and “wit” to specific regions of the brain. Current views instead suggest that complex behaviors can be understood as depending on simpler component processes, which can in turn be localized to specific neural circuits, which interact to produce the complex behavior (Stuss and Alexander 2007). There are, in principle, many ways to dissect the complexity of decision making. Knowing how to carve up this complexity is perhaps the central challenge in these still-early days of the neuroscience of human decision making. If the aim is to provide a model that will be useful from a neurobiological perspective, then candidate component processes must relate in a meaningful way to the brain.

Studies of neurological patients can be particularly useful in testing whether putative component processes are, in fact, distinct (by showing, for example, that one process is impaired, and another spared, after brain damage). Furthermore, when such work is carried out in patients with defined brain injuries, inferences can be drawn about the brain substrates of the process in question. Thus, a neurological approach provides a useful window on decision making. This chapter will review work on the brain substrates of value-based decision making from this perspective, concentrating particularly on component process approaches. As we shall see, at least some of this work has addressed how “value” relates to choice objects in the world, and as such can be framed as a linkage of sensation and reward. Experimental studies of patients with focal brain injury aimed at delineating the neuroanatomical substrates of decision making will be the primary focus, but I will also touch on work examining the neurochemical modulation of choice. As mentioned, this basic science research also offers an interesting perspective on the mechanisms underlying the everyday difficulties of patients with dysfunction of systems important to decision and reward processing. I will return to this at the end of the chapter.

16.2. A SHORT HISTORY OF THE NEUROPSYCHOLOGY OF DECISION MAKING

There is now a substantial corpus of experimental work on decision making in patients with focal frontal lobe damage, and an emerging literature on the effects of dopaminergic and basal ganglia pathologies. However, perhaps the largest contribution of neurology to decision neuroscience has been to provide colorful anecdotes as introductory material for papers and talks. Indeed, it is remarkable to have come this far into the chapter without having mentioned Phineas Gage. This unfortunate nineteenth-century New Englander went from reliable construction foreman to existence proof of the importance of the frontal lobes in regulating behavior, after an industrial accident which sent a six-foot iron bar through his brain (Harlow 1868). Anecdotal reports of poor judgment after frontal injury continued in the ensuing century, culminating in an influential, detailed case study of the patient E.V.R. This patient was notable both for making poor choices and for having extreme difficulty in arriving at any decision at all, as sequelae of injury to orbitofrontal cortex (OFC) and ventromedial prefrontal cortex (PFC) (Eslinger and Damasio 1985). These adjacent sub-regions of PFC are commonly injured together, and will here be referred to together as the ventromedial frontal lobes (VMF; see Figure 16.1). Understanding the functions of this region has been a major focus of the experimental neuropsychological studies of decision making to date.

FIGURE 16.1. Sub-region of prefrontal cortex here designated as ventromedial frontal lobe (VMF), shown in darker gray on ventral (left panel) and partial coronal (right panel) sections on a three-dimensional view of the human brain.


A clinical anecdote is weak evidence, particularly when the symptoms relate to complex behaviors, but a strong stimulus for creative hypothesis generation. Framing the altered behavior of E.V.R., and of other patients with VMF damage, as a deficit in decision making has contributed to the development of novel experimental measures of learning and decision making (Bechara, Damasio, and Damasio 2000; Bechara et al. 1997). In turn, this has led to links being made between these clinical observations and animal research on learning and reinforcement, and (separately) with decision theory developed in behavioral economics and psychology. Together, these approaches have begun to shed light on the brain substrates of decision and value, and are providing a fresh perspective on the constellation of clinical symptoms that follow frontal lobe injury (Fellows 2007a; Murray, O’Doherty, and Schoenbaum 2007).

Phrenology notwithstanding, decision making does not take place in only one region of the brain. Indeed, there is every reason to believe that decisions draw upon an extensive network of subcortical and cortical structures, and also to expect that fundamental signals of reward, punishment, and expectancy have a wide influence on neural processes beyond what might be considered as strictly decision making. This chapter will begin with a review of what has been learned about the role of VMF in decision making, primarily drawing from patient lesion studies. This will also provide the opportunity to discuss a range of candidate component processes likely to be important in decision making. I will then turn to other neuroanatomical substrates of decision making, including amygdala, insula, and striatum, and other regions within PFC, and review what is known about their contributions to these processes in humans. Finally, I will briefly discuss what has been learned about the neurochemical substrates of decision making in human subjects, and then consider the clinical implications of this basic science literature.

16.3. VENTROMEDIAL FRONTAL LOBE DAMAGE AFFECTS DECISION MAKING

Patients with VMF damage may have changes in behavior, judgment, and insight, often unaccompanied by demonstrable deficits on standard neuropsychological measures. This disconnect relates less to the mysterious functions of this region and more to the relatively narrow focus of standard neuropsychological tests of frontal lobe function. In an effort to capture the decision making deficits of these patients, Bechara and colleagues at the University of Iowa developed a new experimental measure that has come to be called the Iowa gambling task (IGT). This task was an empirical effort to capture the complexity of decisions about reward, punishment, and risk. The task “worked” in the sense that patients with VMF damage performed differently from healthy participants, but its complexity has resulted in controversy about the basis of this performance deficit and, indeed, the basis of the abnormal performance of many populations tested since, ranging from drug addicts to individuals with personality disorders (Bechara et al. 1994, 1997, 2005; Dunn, Dalgleish, and Lawrence 2006; Fellows and Farah 2005a; Maia and McClelland 2004; Tomb et al. 2002).

The IGT involves choosing amongst four decks of cards, for a total of 100 trials. After each choice, the player receives feedback in the form of a (typically hypothetical) monetary win, and, on some trials, a monetary loss as well. The feedback contingencies are such that two of the decks provide larger wins on every trial, but occasionally even larger losses. The other two decks provide smaller wins, but even smaller losses, making these decks more advantageous overall. The order of the cards in each deck is fixed. Most healthy participants begin with a preference for the high-win decks, but gradually shift to choosing more often from the lower-win (but also lower-loss) decks. By contrast, patients with VMF damage tend to persist in choosing more often from the disadvantageous decks.
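The contingency structure just described can be sketched as a minimal simulation. The payoff magnitudes and loss probabilities below are illustrative assumptions in the spirit of the commonly described version of the task; the published task uses fixed card orders, not random draws.

```python
import random

# Illustrative IGT-style deck contingencies (assumed values, not the
# exact schedules of the published task):
# Decks A, B: large wins but larger occasional losses -> net -25/trial.
# Decks C, D: small wins and smaller losses -> net +25/trial.
DECKS = {
    "A": {"win": 100, "loss": 250, "p_loss": 0.5},
    "B": {"win": 100, "loss": 1250, "p_loss": 0.1},
    "C": {"win": 50, "loss": 50, "p_loss": 0.5},
    "D": {"win": 50, "loss": 250, "p_loss": 0.1},
}

def draw(deck, rng=random):
    """Return the net payoff of one card drawn from the named deck."""
    d = DECKS[deck]
    loss = d["loss"] if rng.random() < d["p_loss"] else 0
    return d["win"] - loss

def expected_value(deck):
    """Long-run expected net payoff per trial for a deck."""
    d = DECKS[deck]
    return d["win"] - d["p_loss"] * d["loss"]
```

With these (assumed) numbers, the high-win decks A and B have an expected value of about -25 per trial, and the low-win decks C and D about +25, reproducing the structure in which larger wins mask worse long-run outcomes.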

This impairment alone is interesting, but the initial work with this task took the question further by measuring autonomic responses, and by questioning participants at intervals about their explicit knowledge of the reinforcement contingencies. Healthy subjects seemed to be choosing from the better decks before they were able to explain why they were doing so, and also tended to have larger skin conductance responses prior to choosing from the disadvantageous decks as the task proceeded. By contrast, patients with VMF damage did not develop these discriminatory skin conductance responses prior to choosing and made disadvantageous choices even after they were able to report the task contingencies (Bechara et al. 1997). These data were interpreted as support for the somatic marker hypothesis of decision making, which proposes that risky decisions are taken based, in part, on embodied representations of that risk (colloquially, “gut feelings”), indexed by (or perhaps directly encoded by) autonomic changes that may not necessarily relate to conscious knowledge (Bechara, Damasio, and Damasio 2000; Damasio 1994; Dunn, Dalgleish, and Lawrence 2006).

Subsequent work has called into question many aspects of the interpretation of this experiment (reviewed in Dunn, Dalgleish, and Lawrence 2006). However, it is important not to throw the baby out with the bathwater: Patients with VMF damage are clearly impaired on the IGT and determining the mechanism underlying that impairment is likely to be informative. Furthermore, understanding the role of implicit learning from feedback (Pessiglione et al. 2008), and of autonomic signals (Critchley et al. 2003; Heims et al. 2004; Reekie et al. 2008), would seem important in developing a complete description of decision making, and of mapping that description onto the brain. Although the IGT was developed to test “decision making,” it is perhaps more useful to frame it as a learning task. I will first discuss a series of experiments that shed light on the mechanism that may underlie the learning deficits of VMF patients on the IGT, and then review the literature on the role of this area in simpler forms of learning from feedback, before returning to consider whether VMF plays a role in decision making beyond the roles it seems to play in learning.

16.4. VENTROMEDIAL FRONTAL LOBE IS CRITICAL FOR FLEXIBLY LEARNING FROM FEEDBACK

As far back as the 1960s, studies in animals had established that lesions to OFC disrupted specific forms of reinforcement learning, including reversal learning and extinction. Importantly, these tasks involve learning the reward (or punishment) value associated with stimuli—objects in the world. Macaques with OFC resections were typically able to learn to choose between two objects based on whether the object was paired with a (food) reward or non-reward, but made errors when the stimulus-reinforcement contingencies were changed (so-called reversal learning), or persisted in choosing previously rewarded objects after they were no longer followed by reward (i.e., under conditions of extinction) (Butter 1969; Izquierdo, Suda, and Murray 2004; Jones and Mishkin 1972). The same impairment in rapidly updating stimulus-reinforcement contingencies, captured by simple reversal learning tasks (with feedback in the form of points or play money), has also been demonstrated in humans with VMF damage, with the critical region likely to be medial OFC (Fellows and Farah 2003; Hornak et al. 2004; Rolls et al. 1994).
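The reversal structure described here can be captured schematically; the trial counts and option labels below are arbitrary illustrative choices, not the parameters of the cited studies.

```python
def reversal_task(choices, reversal_trial=20):
    """Score a sequence of choices ('A' or 'B') in a deterministic
    reversal task: 'A' is rewarded up to the reversal trial, after
    which the contingency flips and 'B' is rewarded. Returns a list
    of per-trial feedback (1 = reward, 0 = non-reward)."""
    feedback = []
    for t, choice in enumerate(choices):
        correct = "A" if t < reversal_trial else "B"
        feedback.append(1 if choice == correct else 0)
    return feedback
```

A perseverating chooser who sticks with 'A' throughout earns reward only on the pre-reversal trials — the error pattern described above after OFC lesions.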

Interestingly, the IGT contains a reversal learning requirement. Because the card order is fixed, players are initially lured into a preference for the disadvantageous decks; for the first several trials, these decks are, in fact, better choices. Patients with VMF damage develop this initial preference just like healthy subjects, but then seem to have difficulty overcoming this initial preference despite mounting losses. We hypothesized that this reversal learning feature of the IGT was critical for the impaired performance of VMF patients. We first established that VMF damage led to impaired reversal learning, tested with a very simple two-deck card game. We then administered a “shuffled” version of the IGT which eliminated the reversal learning requirement of that task, by making it clear in the initial trials that the high-win decks also hold large losses. Remarkably, patients with VMF damage performed this shuffled version as well as did healthy control subjects (Fellows and Farah 2005a). Furthermore, the extent to which their performance improved from the standard to the shuffled IGT was correlated with the extent of their reversal learning impairments as measured by the simple reversal learning task (Fellows and Farah 2003). These studies did not measure skin conductance responses, leaving the relationship between these findings and the somatic marker hypothesis an open question.

This experiment argues that VMF damage disrupts reversal learning, which in turn leads to poor IGT performance. However, it does not mean that the IGT is only a jumped up reversal learning task, nor does it follow that all impaired performance on the IGT is due to a fundamental reversal learning deficit. Clearly the IGT taps other processes. For example, as can be seen in Figure 16.2, at least some patients with PFC damage sparing VMF are also impaired on the IGT (Clark et al. 2003; Fellows and Farah 2005a; but see Bechara et al. 1998), even though such patients have no difficulty with simple reversal learning (Fellows and Farah 2003). Amygdala damage is also associated with poor IGT performance (Bechara et al. 1999), with more recent evidence suggesting that this phenomenon may be due to a role for amygdala in influencing value information in VMF (Hampton et al. 2007). At the least, the effects of damage to other brain regions on this task emphasize that IGT performance cannot be interpreted as a specific index of VMF function.

FIGURE 16.2. Mean number of choices from the disadvantageous decks over 100 trials in the classic version of the IGT (light bars) and the shuffled version (dark bars; see text for details) for patients with VMF damage, dorsal and/or lateral frontal lobe damage (D/LF), and healthy, demographically matched controls (CTL).


The critical role of VMF in reversal learning provides a simpler, more focused starting point for investigating the contributions of this region to learning from feedback, and so to decision making that relies on such learning. Despite its relative simplicity, reversal learning itself involves several component processes (Wheeler and Fellows 2008; Fellows 2007b, Murray, O’Doherty, and Schoenbaum 2007; Schoenbaum, Saddoris, and Stalnaker 2007). The specific pattern of reversal learning impairment shown by the patients we studied suggested that the difficulty relates to adjusting performance in response to unexpected negative feedback, as occurs on the critical reversal trial. Patients with VMF damage characteristically failed to change tack in response to that feedback. We wondered if this related to a specific deficit in learning from negative feedback, and examined that possibility using a probabilistic stimulus-reinforcement task that allowed us to separately probe learning from negative and positive feedback.

Learning in a probabilistic (as opposed to a fully deterministic) environment is more difficult, and patients with frontal lobe damage were, in general, less likely to reach criterion. Of those who were able to learn enough to move to the test phase of the task, patients with VMF damage showed the predicted selective deficit in learning from negative feedback. These patients learned to associate stimuli with positive feedback as well as controls, but were unable to learn to avoid the stimuli more often associated with negative feedback. Patients with frontal damage sparing VMF learned about the same from both forms of feedback, as did the healthy controls (Wheeler and Fellows 2008).

This finding argues for a critical role for VMF in learning from negative feedback and provides evidence that, at least in some contexts, learning from positive and negative feedback is dissociable. It leaves open what the specific role of VMF in this process might be. Is the learning itself occurring within VMF, or does VMF play a permissive role in facilitating or allowing such learning to occur in other neural systems (Stalnaker et al. 2007)? I will return to these questions after reviewing what is known about the role of VMF in decision contexts that do not involve learning.

16.5. VENTROMEDIAL FRONTAL LOBE DAMAGE DISRUPTS DECISION MAKING UNDER CERTAINTY AND UNDER RISK

A different perspective on VMF function has arisen from a literature focusing not on learning, but on economic value. The value of a stimulus (i.e., a decision option) is not an intrinsic property, but a product of cognition. To the dismay of normative economics theorists, even the value of money is changeable: Most of us would much rather have $10 today than the same amount in a year (Ainslie 2001). Value is not only affected by delay: it is influenced by other contextual factors, such as what else is available, and by the internal state of the organism. For example, the value of a slice of chocolate cake will depend on whether the other menu options are lemon tart or soda crackers, and will depend on whether you are hungry, or just finishing an all-you-can-eat chocolate binge. In this sense, value can be likened to more strictly sensory-perceptual properties of stimuli, such as color or size. While these are quantifiable in absolute terms, their perception can be influenced by context, such as surrounding colors, and fine-grained distinctions between different hues (or heights, etc.) are typically easier to establish in relative rather than absolute terms. Can value be understood as a higher-order sensory feature of objects? If so, is this only a superficial analogy, does it reflect analogous neural mechanisms (Rushworth, Mars, and Summerfield 2009), or is it carried in the same channels? Although single-unit work has shown that learned reward value of stimuli can be reflected in neural activity in posterior cortical regions also engaged in higher-order sensory processing (Sugrue, Corrado, and Newsome 2004), I am not aware of any work addressing whether value-related functions are disrupted in human subjects with disorders of sensory processing (a topic briefly touched on in Chapter 10 in this volume).

More work on the neural representations of value has been done in OFC, with several lines of evidence in animal models suggesting that neurons in this area encode the reinforcement value of stimuli (Rolls 2000; Schoenbaum et al. 2003; Padoa-Schioppa and Assad 2006, 2008). There are also functional imaging data supporting the view that OFC activity relates to current stimulus value in healthy human subjects (Elliott et al. 2003; O’Doherty et al. 2001; O’Doherty 2004; Plassmann, O’Doherty, and Rangel 2007; Small et al. 2001).

If neurons within VMF encode the value of stimuli, the next question might be: to what end? How are these value representations engaged to influence behavior? An obvious possibility is that these representations directly guide decisions. How do you choose between two good options? You check in with VMF, find out which one has the greatest value, and select it. In this formulation, VMF is functioning as a “value look-up table” (Schoenbaum, Saddoris, and Stalnaker 2007). The prediction that follows is that VMF damage should disrupt even the simplest forms of decision making that require such information. Simple preference judgments are a common example of such decisions. Chocolate or vanilla? Red wine or white? Such decisions would seem to involve the simplest sort of value comparison. Further, depending on the specific choice, they need not draw on recent learning, or involve consideration of risk or probability. As anyone who has stood in line at an ice cream stand can attest, such preference judgments can still be very difficult. Does VMF play a necessary role in such choices?

We addressed this question with a simple experiment, asking patients with VMF damage to indicate their preference between pairs of stimuli, in different categories: foods, colors, and famous people. While there is no right or wrong answer in such choices, we reasoned that the hypothesized degraded representation of stimulus value would lead to inconsistencies in these preferences. Applying the principle of transitivity, we tested the consistency of preferences across this series of choices. If subjects preferred A to B, and B to C, they should prefer A to C. When they chose C over A, this was considered an error (or in formal economic terms, an “irrational” choice). As predicted, patients with VMF damage (and not patients with frontal damage elsewhere) were more inconsistent in their preferences, an experimental demonstration of the clinical reports of “capriciousness” after frontal lobe damage that go back to Phineas Gage (Fellows and Farah 2007).
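The transitivity criterion is easy to state as code: given every pairwise choice, count the cyclic triples (A preferred to B, B to C, yet C to A). The function name and data format below are illustrative, not those of the original study.

```python
from itertools import permutations

def count_intransitive_triples(prefs):
    """prefs maps an ordered pair (x, y) to True if x was chosen
    over y. Counts unordered triples forming a preference cycle
    (x over y, y over z, yet z over x). Each cycle appears under
    three rotations of (x, y, z), hence the division by 3."""
    items = {i for pair in prefs for i in pair}
    violations = 0
    for x, y, z in permutations(items, 3):
        if prefs.get((x, y)) and prefs.get((y, z)) and prefs.get((z, x)):
            violations += 1
    return violations // 3
```

A fully consistent chooser yields zero; a noisier, degraded value representation of the kind hypothesized above would show up as a higher count of such cycles.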

A fundamental deficit in assigning relative value to stimuli may be the basis for the deficits in more complex forms of decision making after VMF damage reported in several other studies. One more complex form of choice under certainty is so-called multi-attribute decision making. Here, subjects must integrate information about options across multiple domains, such as the rent, location, and size of apartments, in order to make a choice. Even when all information is available, these can be very difficult decisions. In a process-focused study of such decision making, we found that VMF damage led to a very different strategy of information acquisition, and to different final choices, again compared to either non-VMF frontal damage, or to demographically matched control subjects. One explanation for the different strategy in those with VMF damage is that they were acquiring information in a way that minimized the need to make relative value comparisons (Fellows 2006).

If VMF damage disrupts choices under certainty, it is perhaps not surprising that it also disrupts choices made more difficult through considerations of risk. The somatic marker hypothesis suggests that there is something special about risky decision making, and the initial studies in this area argued that VMF was important in that “something special.” However, introducing either risk (known uncertainty) or ambiguity (unknown uncertainty) into decision making might be expected simply to increase the level of difficulty in determining value, enhancing a fundamental deficit in “valuing” in patients with VMF damage. An early effort to deconstruct the IGT led to the development of a task that explicitly tested decision making under (known) risk. Now called the Cambridge gambling task, this task offers gambles with varying (but explicitly provided) probabilities, and subjects choose how much money to stake on each trial. Although initial results were somewhat inconsistent (Clark et al. 2003; Mavaddat et al. 2000; Rogers et al. 1999), what would seem to be the definitive study using this task has demonstrated that VMF damage does not affect the ability to choose the higher probability option, but that such patients are less risk-averse than healthy subjects, in that they systematically bet more at every level of risk. Interestingly, they remain sensitive to differences in risk: although their bets are high, they do scale with probability (Clark et al. 2008). Other work, using quite different tasks, has also demonstrated less risk-aversion after ventral frontal damage (Hsu et al. 2005; Shiv et al. 2005). Risk in such tasks, as in life, involves potential losses, raising the possibility that these observations reflect a generic underweighting of losses (whether experienced or anticipated) after VMF damage, which would be consistent with the results of the learning experiments discussed above.

Other decisions involve comparing not simply the anticipated value of current options (what will be), but also the value of what might have been. In choice paradigms under uncertainty (i.e., between gambles) where subjects will learn the outcome of both their choice, and of the other, unchosen option, they often choose so as to minimize regret. Regret is an interesting response, because it involves projecting relative value: anticipating the value of a given option in light of the possible outcomes of other options. Even objectively positive outcomes are downgraded in subjective value when the subject learns that the alternative gamble, had they chosen it, would have provided an even better outcome (Mellers 2000). One intriguing study suggests that VMF damage disrupts decision making that involves consideration of regret: unlike healthy subjects, patients with such damage did not choose so as to minimize regret, and did not report emotional responses consistent with the experience of regret, although they were more pleased when they won rather than lost on the choice they did make (Camille et al. 2004).

Regret can be seen as a higher-order value comparison, in which the subject considers not only the predicted value of the chosen option in and of itself, but how that value will be affected by the outcome of the non-chosen option, given that they know they will learn that other outcome. Other higher-order value comparisons include envy, when the value of an outcome is compared against the (higher) value of an outcome received by someone else, and schadenfreude (gloating), when the subjective value of an outcome is increased by the knowledge that another person had a worse outcome. The ability to recognize these emotional states in hypothetical scenarios has been shown to be affected by VMF damage (Shamay-Tsoory, Tibi-Elhanany, and Aharon-Peretz 2007). The authors cast this deficit in a “theory-of-mind” framework, but (as they also suggest) it may be reasonable to interpret these emotions as requiring particularly difficult higher-order relative value estimates. A similar explanation might be advanced for the effects of VMF damage on moral decision making (Koenigs et al. 2007).

16.6. VALUE, EXPECTATION, AND LEARNING

It remains unclear whether the effects of VMF damage on stimulus-value learning (and particularly reversal learning) and on the ability to track relative stimulus value relate to a common underlying process. The region damaged in such patients is large enough to suppose that multiple processes may be affected. Furthermore, heterogeneity of damage, practical limitations in how accurately that damage can be ascertained with anatomical neuroimaging, and small sample sizes all constrain the ability to provide a definitive answer to this question. In principle, identifying patients who show dissociable patterns of performance on these two tasks would directly address this question, even if the anatomical substrates could not be definitively established. In practice, this assumes that the tasks are comparable in their indexing of the process (or processes) of interest. In fact, both the preference judgment and (to a lesser degree) the reversal learning tasks used to date are uncomfortably close to ceiling, making it difficult to make strong statements about this interesting question.

The alternative—that the two abilities relate to a single function of VMF—is worth considering. One model that could accommodate both effects is as follows: current anticipated value of an option is represented in OFC. This information is used to compare the (anticipated) values of multiple options in order to choose between them. It is also used to compare the value of an outcome after a choice with the anticipated value, in order to determine whether the value was equal to, less than, or greater than expected. Occurrence of a less than expected value (in particular) then engages learning mechanisms, both to update value projections for subsequent decisions, and perhaps to make a change of behavior more likely (Schoenbaum, Saddoris, and Stalnaker 2007). Such a formulation also aligns with hypotheses outlined in Chapter 15 in this volume.
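One way to make this model concrete is a delta-rule (prediction-error) update with separate learning rates for better- and worse-than-expected outcomes. The parameter names and values below are arbitrary illustrative choices; setting the negative-outcome rate toward zero mimics, very schematically, the selective deficit in learning from negative feedback described in Section 16.4.

```python
def update_value(anticipated, obtained, alpha_pos=0.2, alpha_neg=0.2):
    """Delta-rule update of an option's anticipated value.
    The prediction error (obtained - anticipated) is scaled by
    alpha_pos when the outcome is better than expected, and by
    alpha_neg when it is worse, then added to the old estimate."""
    error = obtained - anticipated
    alpha = alpha_pos if error >= 0 else alpha_neg
    return anticipated + alpha * error
```

With alpha_neg = 0, a worse-than-expected outcome leaves the stored value unchanged, so the learner never revises an inflated estimate — behavior reminiscent of the reversal-learning failures discussed above.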

16.7. VALUE AND TIME

Time is an important variable in decision making and can influence choice in several ways. Perhaps most obviously, time is a variable in choices that require delay of gratification. The ability to withhold a response to an immediately available reward, in order to gain a larger reward after a delay, is a classic test of self-control (Ainslie 2001; Mischel, Shoda, and Rodriguez 1989) and failure to withhold such responses is a classic clinical sign of frontal lobe dysfunction. Real-life decisions sometimes take this form, but often future considerations are less well defined. For example, rewards that are delayed are also often less certain, confounding time and probability. In decisions that involve repeated choice, time can be inextricably linked with learning.

Understanding the interplay between time and value requires clear operationalization of the time process in question. Temporal discounting is one well-studied construct meeting this criterion. Temporal discounting refers to the characteristic loss of (subjective) value of a reward as a function of the delay to its delivery. This loss of value can be measured in various ways and can typically be fitted with a hyperbolic function. Over short timescales (seconds), and with rewards that are actually experienced, temporal discounting has been observed in pigeons and rats. Similar temporal discounting effects are seen in adult humans, although this has most commonly been measured over much longer timescales (days to years), and with monetary rewards, whether real or hypothetical (Ainslie 2001). Temporal discounting measured in this way seems to have ecological validity. For example, groups known for real-life impulsive behaviors, such as drug addicts, discount the value of reward more steeply than do healthy control groups (Kirby, Petry, and Bickel 1999; Petry, Bickel, and Arnett 1998). Functional imaging studies have investigated the neural substrates of discounting, implicating both frontal and subcortical regions (Kable and Glimcher 2007; McClure et al. 2004).
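The hyperbolic form mentioned here is conventionally written V = A / (1 + kD), where A is the undiscounted amount, D the delay, and k an individual discount-rate parameter (steeper discounters have larger k). A minimal sketch, with illustrative units:

```python
def hyperbolic_value(amount, delay, k):
    """Subjective value of `amount` delivered after `delay`,
    under hyperbolic discounting with rate parameter k.
    Units of delay and k must match (e.g., days and 1/days)."""
    return amount / (1.0 + k * delay)
```

In practice, a subject's k is estimated by fitting this function to their choices between smaller-sooner and larger-later rewards; more impulsive discounting appears as a larger fitted k.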

We tested the hypothesis that VMF contributes to the ability to maintain subjective value despite delay, administering a standard temporal discounting measure to a group of patients with such damage, compared to both healthy control subjects and to patients with damage affecting lateral frontal lobes. We also included a non-frontal brain-injured control group, because we were concerned that the experience of a significant illness might, in and of itself, affect the interplay between time and reward. Somewhat to our surprise, we were unable to detect any systematic effect of brain injury, whether involving VMF or elsewhere, on temporal discounting rate. The hypothetical nature of the task may be one explanation for the null finding, although the same patients do have difficulties with other hypothetical tasks, including the preference judgments described above. Perhaps more importantly, a general “value representation” role for VMF would not predict a temporal direction to any deficit following VMF damage, at least if immediate and delayed reward values are both being represented in this region. Instead, one might predict an increase in inconsistent choices (leading to a “noisier” fit to the temporal discounting function, rather than a change in the slope of that function), due to a general degradation of the fidelity of value representations, just as was observed in the preference judgment experiment. This was not a planned outcome measure in the experiment just described, but would be worth testing in future work.

By contrast, VMF damage did affect a second measure of future thinking that would seem to have a bearing on decision making. Using a measure first developed to study “future orientation” as a personality trait, we asked how far into the future participants spontaneously projected themselves when asked to do so in a particular context. Healthy older subjects thought ahead in time almost 15 years, on average, whereas those with VMF damage considered much shorter time windows of 5–6 years. This foreshortening of future time was not solely a non-specific effect of brain injury or illness, in that it was significantly more marked in the VMF group than in the other brain-injured (and comparably disabled) control groups (Fellows and Farah 2005b).

Recent work has begun to ask interesting questions about the relation of memory to future thinking, showing, for example, that amnesia due to temporal lobe damage is associated with an impaired ability to imagine the future (reviewed in Schacter, Addis, and Buckner 2008). This work again highlights the role of learning and memory in decision making: we generate expectations about the future based on past experience, and the memory of that past experience frames both the content and the temporal window of decision making. Furthermore, it appears that common brain networks are involved in both remembering and predicting.

16.8. VALUE AND EMOTION

There appear to be close links between value and emotion. Value assessment may incorporate emotional responses, and, in social situations, assessing the likely value of a choice may rely on predicting or detecting emotional responses in others. Intriguingly, damage to VMF disrupts various emotional processes. Patients with VMF damage have an altered experience of complex emotions such as embarrassment (Beer et al. 2003, 2006). There is less consensus as to whether the experience of basic emotions is affected by such damage (Berlin, Rolls, and Kischka 2004; Hornak et al. 2003; Roberts et al. 2004; Gillihan et al. 2010). VMF damage can also disrupt the ability to recognize emotions in others, whether from voice or facial expression (Heberlein et al. 2008; Hornak et al. 2003). It is tempting to speculate that these various deficits are somehow linked and to wonder whether they may in turn be linked to deficits in decision making (Rolls 1999). However, as with preference and reinforcement learning, it may be that these abilities are linked only by anatomical coincidence (i.e., rely on nearby but distinct regions within VMF) rather than because they share the same component processes. More work is needed to resolve these issues. Demonstration of dissociations within individual patients may be particularly helpful in this regard, although this enterprise is complicated by a lack of consensus on the component processes of each of these complex abilities, and on the appropriate measures of these processes.

16.9. BEYOND VMF: DORSOMEDIAL PFC MAY LINK VALUE TO ACTIONS

There is mounting evidence from electrophysiological studies in monkeys (Amiez, Joseph, and Procyk 2006; Shidara and Richmond 2002), and from ERP and fMRI studies in humans (Brown and Braver 2007; Gehring and Willoughby 2002; Rushworth and Behrens 2008) that activity in dorsomedial PFC (anterior cingulate cortex [ACC] and adjacent medial PFC) also reflects reward value. While value can be readily conceived of as attached to a stimulus (that cake looks really good), it can also, somewhat less intuitively, be seen as a feature of an action (e.g., pulling the lever on a slot machine) (Kennerley et al. 2006; Rushworth and Behrens 2008). One view is that dorsomedial PFC is involved in decision making at the level of action; that it biases responses based on the anticipated value of the action. The values of stimulus and response are often confounded in real-life decisions, but experimental work suggests that the neural substrates of these two value representations are different, at least in animal models (Rushworth et al. 2007). Lesion experiments in monkeys indicate that stimulus-value and action-value learning can be dissociated, with ACC playing a critical role only in the latter (Rudebeck et al. 2008). In a task in which two different actions were possible, but only one was consistently associated with reward, lesioned animals did not sustain the rewarded action over multiple trials, suggesting a deficit in developing an integrated sense of the value of each action (Kennerley et al. 2006). Whether this region plays a necessary role in human decision making, and if so, the nature of that role, has yet to be examined systematically.

However, this region has been intensively studied in relation to conflict and error monitoring in humans (Botvinick 2007; Carter and van Veen 2007; Ridderinkhof et al. 2004). There has been conflicting evidence as to whether intact medial PFC is necessary for conflict monitoring (di Pellegrino, Ciaramelli, and Ladavas 2007; Fellows and Farah 2005c; Stuss et al. 2001). We have recently found that damage to this region disrupts error prediction in several tasks (Modirrousta and Fellows 2008a, 2008b), consistent with converging fMRI, ERP, and computational evidence (Brown and Braver 2005, 2008; Burle et al. 2008; Hester et al. 2005). This finding is relevant to understanding the performance of the ACC-lesioned monkeys described above, who could adjust their immediate response after feedback, but failed to develop a predictive model of action value based on experience. In many contexts, including the conflict tasks (such as the Stroop task) that have been the main focus of performance monitoring studies, action value is all-or-none, right or wrong. In decision contexts, it can be more nuanced (Amiez, Joseph, and Procyk 2005; Oliveira, McDonald, and Goodman 2007; Rushworth and Behrens 2008). One model of dmPFC function that could account for these varied findings is that this region represents the predicted value of an ongoing action in general (Mansouri, Tanaka, and Buckley 2009; Rushworth and Behrens 2008). Further work will be needed to test this possibility in humans, however.

16.10. THE ROLE OF THE INSULA IN REWARD AND DECISION

Insular cortex has long been associated with central autonomic control and interoception in general (Craig 2003), and with the emotional/motivational aspects of pain processing more specifically (Price 2000). This region is implicated in decision making tasks on the basis of functional imaging findings (Kuhnen and Knutson 2005; Paulus et al. 2003). There is some early evidence that insula plays a necessary role in certain forms of decision making. Pure insula lesions are quite rare, with damage typically also disrupting white matter tracts connecting frontal and parietal cortex, and cortex to basal ganglia, including amygdala, as well as variable encroachment on ventrolateral prefrontal, motor, somatosensory, and lateral temporal cortex. With these provisos in mind, damage involving the insula has been shown to increase the amount of money bet in the Cambridge gambling task, to a degree comparable to what was seen after VMF damage (Clark et al. 2008). In contrast to the effects of VMF damage, however, insula damage was associated with a loss of sensitivity to probability information; that is, the size of bets did not systematically scale with the likelihood of winning.

A second study relevant to understanding the potential role of insular cortex in decision and value processes examined the effects of acquired insula injury on addiction in smokers. While those with insula damage were no more likely to quit smoking than patients with damage elsewhere, insula damage was associated with a reduction in the subjective urge to smoke (i.e., craving) (Naqvi et al. 2007). More systematic study of the effects of insula damage is needed to clarify the mechanisms underlying the intriguing observations in both of these studies, and to understand the links between these findings and earlier work on the role of this area in mediating the emotional and behavioral relevance of physical pain (see Price 2000 for review).

16.11. STRIATAL MECHANISMS IN DECISION MAKING

There is little doubt that value-based learning involves striatal mechanisms, based on an abundance of evidence from animal studies (Smith et al. 2009) and functional neuroimaging in humans (Izuma, Saito, and Sadato 2008; Knutson and Cooper 2005; Pessiglione et al. 2008; Seymour et al. 2007), and at least one recent study showing that striatal damage disrupts feedback-driven learning, particularly reversal learning (Bellebaum et al. 2008).

Patients with Parkinson’s disease (PD) are perhaps the most commonly studied proxy for striatal dysfunction in humans. Dopamine denervation in the striatum (particularly the putamen) is the hallmark of this degenerative disorder. However, pathological findings are heterogeneous across patients, and, over time, other monoaminergic systems and the cortex itself are also variably affected. Differences between PD and healthy participants therefore may not reflect (only) dopamine-striatal dysfunction. Studies that use a within-subject design, comparing a single group of patients on and off dopamine replacement therapy, can be interpreted with more confidence: observed effects are likely due to dopamine, although they may not be due to dopamine acting specifically in the striatum. One such study showed that dopamine modulates reinforcement learning. Patients with PD learned better from positive feedback when on their dopamine replacement therapy and better from negative feedback when off (Frank, Seeberger, and O’Reilly 2004). This pattern was predicted by a neurocomputational model of dopamine actions in the striatum (Frank 2005). Other work is also consistent with a role for dopamine in striatally mediated trial-and-error learning in patients with PD (see Shohamy et al. 2008 for review).
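The asymmetry just described can be illustrated with a toy reinforcement-learning sketch. This is a deliberate simplification, not Frank's full basal-ganglia model: a two-armed bandit learner with separate learning rates for positive and negative prediction errors, with the two parameter settings standing in, very loosely, for the "on" and "off" medication states.

```python
import random

def run_learner(alpha_gain, alpha_loss, p_reward_a=0.8, n_trials=1000, seed=1):
    """Two-armed bandit with asymmetric learning rates: alpha_gain applies to
    positive prediction errors, alpha_loss to negative ones. A toy abstraction
    of dopamine's proposed asymmetric effect on learning from feedback."""
    rng = random.Random(seed)
    q = [0.0, 0.0]  # value estimates for actions A and B
    for _ in range(n_trials):
        a = 0 if q[0] >= q[1] else 1          # greedy choice...
        if rng.random() < 0.1:                # ...with occasional exploration
            a = rng.randrange(2)
        p = p_reward_a if a == 0 else 1.0 - p_reward_a
        r = 1.0 if rng.random() < p else 0.0  # probabilistic binary feedback
        delta = r - q[a]                      # reward prediction error
        lr = alpha_gain if delta > 0 else alpha_loss
        q[a] += lr * delta                    # asymmetric update
    return q

# Gain-weighted learner ("on" state) vs. loss-weighted learner ("off" state)
q_on = run_learner(alpha_gain=0.3, alpha_loss=0.05)
q_off = run_learner(alpha_gain=0.05, alpha_loss=0.3)
```

Both learners come to prefer the richer arm, but the gain-weighted learner settles on a much higher value estimate for it, while the loss-weighted learner's estimates are dragged down by negative feedback: the same environment yields different learned values depending on which prediction errors carry more weight.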

There has been interesting, albeit limited, work on the contributions of other neurochemicals to aspects of reinforcement learning and decision making. It has been shown, for example, that pharmacological manipulation of serotonin (but not noradrenaline) signaling in healthy subjects can influence learning from probabilistic feedback (Chamberlain et al. 2006). Other work has suggested that serotonin modulates learning from negative feedback specifically (Cools, Robinson, and Sahakian 2008).

16.12. AMYGDALA AND VALUE

There is evidence that the amygdala is critical for at least some forms of decision making. The amygdala has close, bi-directional connections with orbitofrontal cortex (Ghashghaei and Barbas 2002), and there is evidence that both flexible stimulus-reinforcement learning and classical conditioning rely critically on interactions between these two regions in animal models (Baxter et al. 2000; Roberts, Reekie, and Braesicke 2007; Schoenbaum and Roesch 2005; LeDoux 2003; Seymour and Dolan 2008). Interactions between amygdala and VMF have been demonstrated in human subjects during reversal learning (Hampton et al. 2007). Lesions to the amygdala were reported to impair IGT performance in humans, and this structure was proposed as important for signaling somatic states in the somatic marker hypothesis (Bechara, Damasio, and Damasio 2003; Bechara et al. 1999). For the moment, the role of amygdala in human decision making has not been specified in much detail.

16.13. HEDONICS AND DECISION MAKING

The chapter so far has largely treated value as a concept, rather than as a subjective experience. While decisions can, in principle, be guided by dry economic analysis, everyday experience suggests that many aspects of value are “felt”: pleasure and pain can be anticipated in vivid detail, risky choices can summon strong emotional responses, complete with autonomic changes, and near-misses can be played and re-played in the imagination. Further, one may be more or less explicitly aware of value information. Embodied and implicit aspects of value anticipation and experience have been important in neuroscience theories of human decision making, perhaps most prominently in the somatic marker hypothesis (Damasio 1994). Here, I will briefly discuss work related to this theme.

What are the neural substrates of choices made based on “hunches” or “gut feelings,” and are these distinct from more explicit aspects of decision making? This has been a much-debated feature of the IGT (Bechara et al. 2005; Dunn, Dalgleish, and Lawrence 2006; Maia and McClelland 2004, 2005). One way to frame this issue is to ask: can implicit representations of value guide choice? Work on implicit learning suggests that the general answer is yes (Yin and Knowlton 2006). Indeed, neural circuits involved in reinforcement learning (striatum, amygdala) can encode value, and influence choice, in the absence of explicit awareness of either the value estimate or, remarkably, even of the stimulus (Pessiglione et al. 2008). A related question, whether autonomic signals guide choice, also remains unsettled (Bechara et al. 1997; Heims et al. 2004; North and O’Carroll 2001; Tomb et al. 2002), although it seems that appetitive autonomic responses may be altered by OFC lesions without necessarily affecting behavior in marmosets (Reekie et al. 2008), and dissociation between autonomic and cognitive performance has been reported after ACC damage in humans (Critchley et al. 2003).

A third issue is the relation between value-guided choice and the subjective experience of value. In principle, lesion studies might be able to shed light on this question. For example, do lesions to VMF sufficient to disrupt value-based decision making also disrupt emotional experience related to value? Unsurprisingly, the answer turns out to be complicated. Part of this complexity is a measurement problem. What aspects of emotion should be affected and how should these be captured experimentally? As reviewed above (and see Hornak et al. 2003), damage to VMF does appear to disrupt some aspects of emotional experience, including complex emotions like embarrassment or regret, but other aspects of emotional experience, such as transiently evoked happiness and sadness, remain intact (Gillihan et al. 2010). If these patients are not “emotionally numb,” do they experience pleasure? Again, there is the problem of how to measure pleasure. Anecdotally, such patients do not typically report anhedonia in clinical interviews (in contrast to patients with major depression, for example, in whom anhedonia is often a prominent feature). Subjective report of hedonic experience, for example as measured by self-report questionnaire (e.g., Snaith-Hamilton Pleasure Scale), is typically within the normal range (Fellows, unpublished data). However, self-report is problematic in general because of demand effects, and a particular problem in such patients, who may lack insight into their subjective state.

16.14. VALUE AGNOSIA OR APRAXIA OF CHOICE?

The findings reviewed in this chapter suggest a network of brain regions that, in various combinations, contribute to decision making. Perhaps the strongest evidence is for a critical role for VMF in decision making and flexible stimulus-reinforcement learning. More speculatively, dorsomedial PFC may be involved in linking value to action, with the contributions of other regions such as insula, amygdala, and striatum likely, although as yet less well-specified in humans. This body of work, together with converging evidence reviewed elsewhere in this volume, makes a strong case for the plausibility of a “neurology of value.” Indeed, the mechanisms that track the value of decision options, whether stimuli or actions, make an important link between perception and action. The frontal lobes in general are classically considered important for “goal-directed behavior.” In the lab, goals are often assigned by the experimenter. In real life, we must judge for ourselves what is worth pursuing. This fundamental aspect of choice can rest on very basic “good” or “bad” determinations, on more subtle “better”/“best” judgments, or on very abstract “as good as it gets around here, and better than what the other guy got” or perhaps “better than I deserve” estimations.

From a clinical point of view, disruptions of these mechanisms may yield a complete inability to decide, but more commonly seem to lead to “wrong” choices. Given the inconsistencies, biases, and irrationality that mark our everyday decisions at the best of times (Kahneman 2003; Kahneman and Tversky 1979; Tversky 1969), a neurologist might be hard-pressed to distinguish clinical impairments in decision making. The tongue-in-cheek definition of “poor judgment” as when the patient’s decision runs contrary to the doctor’s recommendation is not much help. Nevertheless, patients with brain injury may show a variety of deficits that can be framed as impaired judgment, altered motivation, or changes in personality. The latter encompasses many symptoms, but may include decisions that are “out of character.”

Can the cognitive neuroscience findings on the neural substrates of decision making reviewed in this chapter be applied to a better understanding of these clinical presentations? This basic science has the potential to offer both novel conceptual frameworks from which to identify component processes of motivation, evaluation, and decision, and the empirical tools to measure these processes. Conceptually, impairments of decision making have been considered under the umbrella of executive dysfunction. Presumably, this is partly because they occur after frontal lobe injury, and partly because decision making would seem to be closely related to other complex, higher-order cognitive abilities such as those involved in abstraction, planning, and problem solving. However, the analysis presented here builds from relatively simple concepts of reward and punishment influencing basic aspects of learning and choice in dynamic environments. This raises alternative perspectives on clinical impairments of decision making. For example, the value of a stimulus can be seen as a higher-order “sensory” property, combining information from primary sensory modalities with learned reward (or punishment) information, interpreted in the organism’s current context. Considered in this way, disruption of evaluation can be conceived of as an impairment of higher-order sensory processing, i.e., an agnosia (see Chapter 10 in this volume). Such patients can perceive and identify an object, but fail to determine its (current, relative) worth, which will necessarily impair decision making about it.

A similar analysis can be applied, albeit more tentatively, to actions: actions must be linked to value to be adaptive. It is not enough to know “how” to carry out a higher-order behavior; there must also be an authentic “why” driving that behavior. Degraded links between action and value could lead to the kind of purposeless, environmentally triggered but contextually inappropriate behavior that is a hallmark of some forms of frontal damage. Again, existing clinical terminology might be adapted to capture this syndrome. The inability to perform a complex action despite intact basic motor function is termed apraxia. Such patients may have difficulty correctly manipulating tools, for example, or difficulty showing how to manipulate tools in the absence of direct access to those tools. Disorders in which actions are not guided by value can be considered as an “apraxia of choice,” or “value apraxia.” A patient so afflicted might manipulate the tool correctly, but employ it for some contextually inappropriate, i.e., valueless, purpose.

I am loath to burden cognitive neurology with further variations of either apraxia or agnosia, both terms that can confuse as much as they clarify. However, I do think that the conceptualization of value as a higher-order sensory property, or as a factor influencing action selection, can be a useful heuristic in parsing the component processes of decision making. The experimental neuropsychology reviewed in this chapter, and the field of decision neuroscience more generally, is making inroads into identifying component processes of decision making and their neural substrates. Importantly, in many cases these components can be traced to much simpler aspects of sensory processing and reinforcement learning, both relatively well-understood in animal models. This framework helps to “rescue” decision making from the murky domain of complex executive function, providing more traction for understanding this aspect of goal-directed human behavior. Still very much a nascent field, this area of study has the potential to provide novel insights into human behavior, and into disorders of human behavior common in both neurology and psychiatry.

ACKNOWLEDGMENTS

Martha Farah, Elizabeth Wheeler, and Michael Frank contributed to the work discussed here. This research would not be possible without the generous participation of patients, their families, and the clinicians who care for them. I acknowledge operating support from NIH (NIDA R21DA022630), CIHR (MOP-77583), and the Parkinson Society of Canada, and salary support from the CIHR’s Clinician-Scientist program and the Killam Trust.

REFERENCES

  1. Ainslie G. Breakdown of Will. 2001 Cambridge: Cambridge University Press.
  2. Amiez C., Joseph J. P., Procyk E. Anterior cingulate error-related activity is modulated by predicted reward. Eur J Neurosci. 2005;21:3447–52. [PMC free article: PMC1913346] [PubMed: 16026482]
  3. Amiez C., Joseph J. P., Procyk E. Reward encoding in the monkey anterior cingulate cortex. Cereb Cortex. 2006;16:1040–55. [PMC free article: PMC1913662] [PubMed: 16207931]
  4. Baxter M. G., Parker A., Lindner C. C., Izquierdo A. D., Murray E. A. Control of response selection by reinforcer value requires interaction of amygdala and orbital prefrontal cortex. J Neurosci. 2000;20:4311–19. [PubMed: 10818166]
  5. Bechara A., Damasio A. R., Damasio H., Anderson S. W. Insensitivity to future consequences following damage to human prefrontal cortex. Cognition. 1994;50:7–15. [PubMed: 8039375]
  6. Bechara A., Damasio H., Damasio A. R. Emotion, decision making and the orbitofrontal cortex. Cereb Cortex. 2000;10:295–307. [PubMed: 10731224]
  7. Bechara A., Damasio H., Damasio A. R. Role of the amygdala in decision-making. Ann N Y Acad Sci. 2003;985:356–69. [PubMed: 12724171]
  8. Bechara A., Damasio H., Damasio A. R., Lee G. P. Different contributions of the human amygdala and ventromedial prefrontal cortex to decision-making. J Neurosci. 1999;19:5473–81. [PubMed: 10377356]
  9. Bechara A., Damasio H., Tranel D., Anderson S. W. Dissociation of working memory from decision making within the human prefrontal cortex. J Neurosci. 1998;18:428–37. [PubMed: 9412519]
  10. Bechara A., Damasio H., Tranel D., Damasio A. R. Deciding advantageously before knowing the advantageous strategy. Science. 1997;275:1293–95. [PubMed: 9036851]
  11. Bechara A., Damasio H., Tranel D., Damasio A. R. The Iowa Gambling Task and the somatic marker hypothesis: Some questions and answers. Trends Cogn Sci. 2005;9:159–62. discussion 162–64. [PubMed: 15808493]
  12. Beer J. S., Heerey E. A., Keltner D., Scabini D., Knight R. T. The regulatory function of self-conscious emotion: Insights from patients with orbitofrontal damage. J Pers Soc Psychol. 2003;85:594–604. [PubMed: 14561114]
  13. Beer J. S., John O. P., Scabini D., Knight R. T. Orbitofrontal cortex and social behavior: Integrating self-monitoring and emotion-cognition interactions. J Cogn Neurosci. 2006;18:871–79. [PubMed: 16839295]
  14. Bellebaum C., Koch B., Schwarz M., Daum I. Focal basal ganglia lesions are associated with impairments in reward-based reversal learning. Brain. 2008;131:829–41. [PubMed: 18263624]
  15. Berlin H. A., Rolls E. T., Kischka U. Impulsivity, time perception, emotion and reinforcement sensitivity in patients with orbitofrontal cortex lesions. Brain. 2004 [PubMed: 14985269]
  16. Berridge K. C. The debate over dopamine’s role in reward: The case for incentive salience. Psychopharmacology (Berl) 2007;191:391–431. [PubMed: 17072591]
  17. Botvinick M. M. Conflict monitoring and decision making: Reconciling two perspectives on anterior cingulate function. Cogn Affect Behav Neurosci. 2007;7:356–66. [PubMed: 18189009]
  18. Brown J. W., Braver T. S. Learned predictions of error likelihood in the anterior cingulate cortex. Science. 2005;307:1118–21. [PubMed: 15718473]
  19. Brown J. W., Braver T. S. Risk prediction and aversion by anterior cingulate cortex. Cogn Affect Behav Neurosci. 2007;7:266–77. [PubMed: 18189000]
  20. Brown J. W., Braver T. S. A computational model of risk, conflict, and individual difference effects in the anterior cingulate cortex. Brain Res. 2008;1202:99–108. [PMC free article: PMC2322871] [PubMed: 17707352]
  21. Burle B., Roger C., Allain S., Vidal F., Hasbroucq T. Error negativity does not reflect conflict: A reappraisal of conflict monitoring and anterior cingulate cortex activity. J Cogn Neurosci. 2008;20:1637–55. [PubMed: 18345992]
  22. Butter C. Perseveration in extinction and in discrimination reversal tasks following selective frontal ablations in macaca mulatta. Physiol Behav. 1969;4:163–71.
  23. Camille N., Coricelli G., Sallet J., Pradat-Diehl P., Duhamel J. R., Sirigu A. The involvement of the orbitofrontal cortex in the experience of regret. Science. 2004;304:1167–70. [PubMed: 15155951]
  24. Carter C. S., van Veen V. Anterior cingulate cortex and conflict detection: An update of theory and data. Cogn Affect Behav Neurosci. 2007;7:367–79. [PubMed: 18189010]
  25. Chamberlain S. R., Muller U., Blackwell A. D., Clark L., Robbins T. W., Sahakian B. J. Neurochemical modulation of response inhibition and probabilistic learning in humans. Science. 2006;311:861–63. [PMC free article: PMC1867315] [PubMed: 16469930]
  26. Clark L., Bechara A., Damasio H., Aitken M. R., Sahakian B. J., Robbins T. W. Differential effects of insular and ventromedial prefrontal cortex lesions on risky decision-making. Brain. 2008;131:1311–22. [PMC free article: PMC2367692] [PubMed: 18390562]
  27. Clark L., Manes F., Antoun N., Sahakian B. J., Robbins T. W. The contributions of lesion laterality and lesion volume to decision-making impairment following frontal lobe damage. Neuropsychologia. 2003;41:1474–83. [PubMed: 12849765]
  28. Cools R., Robinson O. J., Sahakian B. Acute tryptophan depletion in healthy volunteers enhances punishment prediction but does not affect reward prediction. Neuropsychopharmacology. 2008;33:2291–99. [PubMed: 17940553]
  29. Craig A. D. Interoception: The sense of the physiological condition of the body. Curr Opin Neurobiol. 2003;13:500–5. [PubMed: 12965300]
  30. Critchley H. D., Mathias C. J., Josephs O., O’Doherty J., Zanini S., Dewar B. K., et al. Human cingulate cortex and autonomic control: Converging neuroimaging and clinical evidence. Brain. 2003;126:2139–52. [PubMed: 12821513]
  31. Damasio A. R. Descartes’ Error: Emotion, Reason, and the Human Brain. 1994 New York: Avon Books.
  32. di Pellegrino G., Ciaramelli E., Ladavas E. The regulation of cognitive control following rostral anterior cingulate cortex lesion in humans. J Cogn Neurosci. 2007;19:275–86. [PubMed: 17280516]
  33. Dunn B. D., Dalgleish T., Lawrence A. D. The somatic marker hypothesis: A criticial evaluation. Neurosci Biobehav Rev. 2006;30:239–71. [PubMed: 16197997]
  34. Elliott R., Newman J. L., Longe O. A., Deakin J. F. Differential response patterns in the striatum and orbitofrontal cortex to financial reward in humans: A parametric functional magnetic resonance imaging study. J Neurosci. 2003;23:303–7. [PubMed: 12514228]
  35. Eslinger P. J., Damasio A. R. Severe disturbance of higher cognition after bilateral frontal lobe ablation: Patient EVR. Neurology. 1985;35:1731–41. [PubMed: 4069365]
  36. Fellows L. K. Deciding how to decide: Ventromedial frontal lobe damage affects information acquisition in multi-attribute decision making. Brain. 2006;129:944–52. [PubMed: 16455794]
  37. Fellows L. K. Advances in understanding ventromedial prefrontal function: The accountant joins the executive. Neurology. 2007a;68:991–95. [PubMed: 17389302]
  38. Fellows L. K. The role of orbitofrontal cortex in decision making: A component process account. Ann N Y Acad Sci. 2007b;1121:421–30. [PubMed: 17846161]
  39. Fellows L. K., Farah M. J. Ventromedial frontal cortex mediates affective shifting in humans: Evidence from a reversal learning paradigm. Brain. 2003;126:1830–37. [PubMed: 12821528]
  40. Fellows L. K., Farah M. J. Different underlying impairments in decision-making following ventromedial and dorsolateral frontal lobe damage in humans. Cereb Cortex. 2005a;15:58–63. [PubMed: 15217900]
  41. Fellows L. K., Farah M. J. Dissociable elements of human foresight: A role for the ventromedial frontal lobes in framing the future, but not in discounting future rewards. Neuropsychologia. 2005b;43:1214–21. [PubMed: 15817179]
  42. Fellows L. K., Farah M. J. Is anterior cingulate cortex necessary for cognitive control? Brain. 2005c;128:788–96. [PubMed: 15705613]
  43. Fellows L. K., Farah M. J. The role of ventromedial prefrontal cortex in decision making: Judgment under uncertainty or judgment per se? Cereb Cortex. 2007;17:2669–74. [PubMed: 17259643]
  44. Fellows L. K., Heberlein A. S., Morales D. A., Shivde G., Waller S., Wu D. H. Method matters: An empirical study of impact in cognitive neuroscience. J Cogn Neurosci. 2005;17:850–58. [PubMed: 15969904]
  45. Frank M. J. Dynamic dopamine modulation in the basal ganglia: A neurocomputational account of cognitive deficits in medicated and nonmedicated Parkinsonism. J Cogn Neurosci. 2005;17:51–72. [PubMed: 15701239]
  46. Frank M. J., Seeberger L. C., O’Reilly R. C. By carrot or by stick: Cognitive reinforcement learning in parkinsonism. Science. 2004;306:1940–43. [PubMed: 15528409]
  47. Gehring W. J., Willoughby A. R. The medial frontal cortex and the rapid processing of monetary gains and losses. Science. 2002;295:2279–82. [PubMed: 11910116]
  48. Ghashghaei H. T., Barbas H. Pathways for emotion: Interactions of prefrontal and anterior temporal pathways in the amygdala of the rhesus monkey. Neuroscience. 2002;115:1261–79. [PubMed: 12453496]
  49. Gillihan S. J., Xia C., Padon A. A., Heberlein A. S., Farah M. J., Fellows L. K. Contrasting roles for lateral and ventromedial prefrontal cortex in transient and dispositional affective experience. Soc Cog Affect Neurosci. 2010 ePub ahead of print 10 May 2010. [PMC free article: PMC3023090] [PubMed: 20460300]
  50. Hampton A. N., Adolphs R., Tyszka M. J., O’Doherty J. P. Contributions of the amygdala to reward expectancy and choice signals in human prefrontal cortex. Neuron. 2007;55:545–55. [PubMed: 17698008]
  51. Harlow J. M. Passage of an iron rod through the head. Pub Mass Med Soc. 1868;2
  52. Heberlein A. S., Padon A. A., Gillihan S. J., Farah M. J., Fellows L. K. Ventromedial frontal lobe plays a critical role in facial emotion recognition. J Cogn Neurosci. 2008;20:721–33. [PubMed: 18052791]
  53. Heims H. C., Critchley H. D., Dolan R., Mathias C. J., Cipolotti L. Social and motivational functioning is not critically dependent on feedback of autonomic responses: Neuropsychological evidence from patients with pure autonomic failure. Neuropsychologia. 2004;42:1979–88. [PubMed: 15381028]
  54. Hester R., Foxe J. J., Molholm S., Shpaner M., Garavan H. Neural mechanisms involved in error processing: A comparison of errors made with and without awareness. Neuroimage. 2005;27:602–8. [PubMed: 16024258]
  55. Hornak J., Bramham J., Rolls E. T., Morris R. G., O’Doherty J., Bullock P. R., et al. Changes in emotion after circumscribed surgical lesions of the orbitofrontal and cingulate cortices. Brain. 2003;126:1691–1712. [PubMed: 12805109]
  56. Hornak J., O’Doherty J., Bramham J., Rolls E. T., Morris R. G., Bullock P. R., et al. Reward-related reversal learning after surgical excisions in orbito-frontal or dorsolateral prefrontal cortex in humans. J Cogn Neurosci. 2004;16:463–78. [PubMed: 15072681]
  57. Hsu M., Bhatt M., Adolphs R., Tranel D., Camerer C. F. Neural systems responding to degrees of uncertainty in human decision-making. Science. 2005;310:1680–83. [PubMed: 16339445]
  58. Izquierdo A., Murray E. A. Selective bilateral amygdala lesions in rhesus monkeys fail to disrupt object reversal learning. J Neurosci. 2007;27:1054–62. [PubMed: 17267559]
  59. Izquierdo A., Suda R. K., Murray E. A. Bilateral orbital prefrontal cortex lesions in rhesus monkeys disrupt choices guided by both reward value and reward contingency. J Neurosci. 2004;24:7540–48. [PubMed: 15329401]
  60. Izuma K., Saito D. N., Sadato N. Processing of social and monetary rewards in the human striatum. Neuron. 2008;58:284–94. [PubMed: 18439412]
  61. Jones B., Mishkin M. Limbic lesions and the problem of stimulus–reinforcement associations. Exp Neurol. 1972;36:362–77. [PubMed: 4626489]
  62. Kable J. W., Glimcher P. W. The neural correlates of subjective value during intertemporal choice. Nat Neurosci. 2007;10:1625–33. [PMC free article: PMC2845395] [PubMed: 17982449]
  63. Kahneman D. A perspective on judgment and choice: Mapping bounded rationality. Am Psychol. 2003;58:697–720. [PubMed: 14584987]
  64. Kahneman D., Tversky A. Prospect theory: An analysis of decisions under risk. Econometrica. 1979;47:263–91.
  65. Kennerley S. W., Walton M. E., Behrens T. E., Buckley M. J., Rushworth M. F. Optimal decision making and the anterior cingulate cortex. Nat Neurosci. 2006;9:940–47. [PubMed: 16783368]
  66. Kirby K. N., Petry N. M., Bickel W. K. Heroin addicts have higher discount rates for delayed rewards than non-drug-using controls. J Exp Psychol: Gen. 1999;128:78–87. [PubMed: 10100392]
  67. Knutson B., Cooper J. C. Functional magnetic resonance imaging of reward prediction. Curr Opin Neurol. 2005;18:411–17. [PubMed: 16003117]
  68. Koenigs M., Young L., Adolphs R., Tranel D., Cushman F., Hauser M., et al. Damage to the prefrontal cortex increases utilitarian moral judgements. Nature. 2007;446:908–11. [PMC free article: PMC2244801] [PubMed: 17377536]
  69. Koob G. F., Ahmed S. H., Boutrel B., Chen S. A., Kenny P. J., Markou A., et al. Neurobiological mechanisms in the transition from drug use to drug dependence. Neurosci Biobehav Rev. 2004;27:739–49. [PubMed: 15019424]
  70. Kuhnen C. M., Knutson B. The neural basis of financial risk taking. Neuron. 2005;47:763–70. [PubMed: 16129404]
  71. LeDoux J. The emotional brain, fear, and the amygdala. Cell Mol Neurobiol. 2003;23:727–38. [PubMed: 14514027]
  72. Maia T. V., McClelland J. L. A reexamination of the evidence for the somatic marker hypothesis: What participants really know in the Iowa gambling task. Proc Natl Acad Sci USA. 2004;101:16075–80. [PMC free article: PMC528759] [PubMed: 15501919]
  73. Maia T. V., McClelland J. L. The somatic marker hypothesis: Still many questions but no answers: Response to Bechara et al. Trends Cogn Sci. 2005;9:162–64.
  74. Mansouri F. A., Tanaka K., Buckley M. J. Conflict-induced behavioural adjustment: A clue to the executive functions of the prefrontal cortex. Nat Rev Neurosci. 2009;10:141–52. [PubMed: 19153577]
  75. Mavaddat N., Kirkpatrick P. J., Rogers R. D., Sahakian B. J. Deficits in decision-making in patients with aneurysms of the anterior communicating artery. Brain. 2000;123(10):2109–17. [PubMed: 11004127]
  76. McClure S. M., Laibson D. I., Loewenstein G., Cohen J. D. Separate neural systems value immediate and delayed monetary rewards. Science. 2004;306:503–7. [PubMed: 15486304]
  77. Mellers B. A. Choice and the relative pleasure of consequences. Psychol Bull. 2000;126:910–24. [PubMed: 11107882]
  78. Mischel W., Shoda Y., Rodriguez M. I. Delay of gratification in children. Science. 1989;244:933–38. [PubMed: 2658056]
  79. Modirrousta M., Fellows L. K. Dorsal medial prefrontal cortex plays a necessary role in rapid error prediction in humans. J Neurosci. 2008a;28:14000–5. [PMC free article: PMC2642611] [PubMed: 19091989]
  80. Modirrousta M., Fellows L. K. Medial prefrontal cortex plays a critical and selective role in ‘feeling of knowing’ meta-memory judgments. Neuropsychologia. 2008b;46:2958–65. [PubMed: 18606176]
  81. Murray E. A., O’Doherty J. P., Schoenbaum G. What we know and do not know about the functions of the orbitofrontal cortex after 20 years of cross-species studies. J Neurosci. 2007;27:8166–69. [PMC free article: PMC2630163] [PubMed: 17670960]
  82. Naqvi N. H., Rudrauf D., Damasio H., Bechara A. Damage to the insula disrupts addiction to cigarette smoking. Science. 2007;315:531–34. [PMC free article: PMC3698854] [PubMed: 17255515]
  83. North N. T., O’Carroll R. E. Decision making in patients with spinal cord damage: Afferent feedback and the somatic marker hypothesis. Neuropsychologia. 2001;39:521–24. [PubMed: 11254934]
  84. O’Doherty J., Kringelbach M. L., Rolls E. T., Hornak J., Andrews C. Abstract reward and punishment representations in the human orbitofrontal cortex. Nat Neurosci. 2001;4:95–102. [PubMed: 11135651]
  85. O’Doherty J. P. Reward representations and reward-related learning in the human brain: Insights from neuroimaging. Curr Opin Neurobiol. 2004;14:769–76. [PubMed: 15582382]
  86. Oliveira F. T., McDonald J. J., Goodman D. Performance monitoring in the anterior cingulate is not all error related: Expectancy deviation and the representation of action-outcome associations. J Cogn Neurosci. 2007;19:1994–2004. [PubMed: 17892382]
  87. Padoa-Schioppa C., Assad J. A. Neurons in the orbitofrontal cortex encode economic value. Nature. 2006;441:223–26. [PMC free article: PMC2630027] [PubMed: 16633341]
  88. Padoa-Schioppa C., Assad J. A. The representation of economic value in the orbitofrontal cortex is invariant for changes of menu. Nat Neurosci. 2008;11:95–102. [PMC free article: PMC2646102] [PubMed: 18066060]
  89. Paulus M. P., Rogalsky C., Simmons A., Feinstein J. S., Stein M. B. Increased activation in the right insula during risk-taking decision making is related to harm avoidance and neuroticism. Neuroimage. 2003;19:1439–48. [PubMed: 12948701]
  90. Pessiglione M., Petrovic P., Daunizeau J., Palminteri S., Dolan R. J., Frith C. D. Subliminal instrumental conditioning demonstrated in the human brain. Neuron. 2008;59:561–67. [PMC free article: PMC2572733] [PubMed: 18760693]
  91. Pessoa L. On the relationship between emotion and cognition. Nat Rev Neurosci. 2008;9:148–58. [PubMed: 18209732]
  92. Petry N. M., Bickel W. K., Arnett M. Shortened time horizons and insensitivity to future consequences in heroin addicts. Addiction. 1998;93:729–38. [PubMed: 9692271]
  93. Plassmann H., O’Doherty J., Rangel A. Orbitofrontal cortex encodes willingness to pay in everyday economic transactions. J Neurosci. 2007;27:9984–88. [PubMed: 17855612]
  94. Price D. D. Psychological and neural mechanisms of the affective dimension of pain. Science. 2000;288:1769–72. [PubMed: 10846154]
  95. Reekie Y. L., Braesicke K., Man M. S., Roberts A. C. Uncoupling of behavioral and autonomic responses after lesions of the primate orbitofrontal cortex. Proc Natl Acad Sci USA. 2008;105:9787–92. [PMC free article: PMC2447863] [PubMed: 18621690]
  96. Ridderinkhof K. R., Ullsperger M., Crone E. A., Nieuwenhuis S. The role of the medial frontal cortex in cognitive control. Science. 2004;306:443–47. [PubMed: 15486290]
  97. Roberts A. C., Reekie Y., Braesicke K. Synergistic and regulatory effects of orbitofrontal cortex on amygdala-dependent appetitive behavior. Ann N Y Acad Sci. 2007;1121:297–319. [PubMed: 17698997]
  98. Roberts N. A., Beer J. S., Werner K. H., Scabini D., Levens S. M., Knight R. T., et al. The impact of orbital prefrontal cortex damage on emotional activation to unanticipated and anticipated acoustic startle stimuli. Cogn Affect Behav Neurosci. 2004;4:307–16. [PubMed: 15535166]
  99. Rogers R. D., Everitt B.J., Baldacchino A., Blackshaw A. J., Swainson R., Wynne K., et al. Dissociable deficits in the decision-making cognition of chronic amphetamine abusers, opiate abusers, patients with focal damage to prefrontal cortex, and tryptophan-depleted normal volunteers: Evidence for monoaminergic mechanisms. Neuropsychopharmacology. 1999;20:322–39. [PubMed: 10088133]
  100. Rolls E. T. The Brain and Emotion. Oxford: Oxford University Press; 1999.
  101. Rolls E. T. The orbitofrontal cortex and reward. Cereb Cortex. 2000;10:284–94. [PubMed: 10731223]
  102. Rolls E. T., Hornak J., Wade D., McGrath J. Emotion-related learning in patients with social and emotional changes associated with frontal lobe damage. J Neurol Neurosurg Psychiatry. 1994;57:1518–24. [PMC free article: PMC1073235] [PubMed: 7798983]
  103. Rorden C., Karnath H. O. Using human brain lesions to infer function: A relic from a past era in the fMRI age? Nat Rev Neurosci. 2004;5:813–19. [PubMed: 15378041]
  104. Rudebeck P. H., Behrens T. E., Kennerley S. W., Baxter M. G., Buckley M. J., Walton M. E., et al. Frontal cortex subregions play distinct roles in choices between actions and stimuli. J Neurosci. 2008;28:13775–85. [PubMed: 19091968]
  105. Rudebeck P. H., Murray E. A. Amygdala and orbitofrontal cortex lesions differentially influence choices during object reversal learning. J Neurosci. 2008;28:8338–43. [PMC free article: PMC2556079] [PubMed: 18701696]
  106. Rushworth M. F., Behrens T. E. Choice, uncertainty and value in prefrontal and cingulate cortex. Nat Neurosci. 2008;11:389–97. [PubMed: 18368045]
  107. Rushworth M. F., Behrens T. E., Rudebeck P. H., Walton M. E. Contrasting roles for cingulate and orbitofrontal cortex in decisions and social behaviour. Trends Cogn Sci. 2007;11:168–76. [PubMed: 17337237]
  108. Rushworth M. F., Mars R. B., Summerfield C. General mechanisms for making decisions? Curr Opin Neurobiol. 2009;19:75–83. [PubMed: 19349160]
  109. Schacter D. L., Addis D. R., Buckner R. L. Episodic simulation of future events: Concepts, data, and applications. Ann N Y Acad Sci. 2008;1124:39–60. [PubMed: 18400923]
  110. Schoenbaum G., Roesch M. Orbitofrontal cortex, associative learning, and expectancies. Neuron. 2005;47:633–36. [PMC free article: PMC2628809] [PubMed: 16129393]
  111. Schoenbaum G., Saddoris M. P., Stalnaker T. A. Reconciling the roles of orbitofrontal cortex in reversal learning and the encoding of outcome expectancies. Ann N Y Acad Sci. 2007;1121:320–35. [PMC free article: PMC2430624] [PubMed: 17698988]
  112. Schoenbaum G., Setlow B., Saddoris M. P., Gallagher M. Encoding predicted outcome and acquired value in orbitofrontal cortex during cue sampling depends upon input from basolateral amygdala. Neuron. 2003;39:855–67. [PubMed: 12948451]
  113. Seymour B., Daw N., Dayan P., Singer T., Dolan R. Differential encoding of losses and gains in the human striatum. J Neurosci. 2007;27:4826–31. [PMC free article: PMC2630024] [PubMed: 17475790]
  114. Seymour B., Dolan R. Emotion, decision making, and the amygdala. Neuron. 2008;58:662–71. [PubMed: 18549779]
  115. Shamay-Tsoory S. G., Tibi-Elhanany Y., Aharon-Peretz J. The green-eyed monster and malicious joy: The neuroanatomical bases of envy and gloating (schadenfreude). Brain. 2007;130:1663–78. [PubMed: 17525143]
  116. Shidara M., Richmond B. J. Anterior cingulate: Single neuronal signals related to degree of reward expectancy. Science. 2002;296:1709–11. [PubMed: 12040201]
  117. Shiv B., Loewenstein G., Bechara A., Damasio H., Damasio A. R. Investment behavior and the negative side of emotion. Psychol Sci. 2005;16:435–39. [PubMed: 15943668]
  118. Shohamy D., Myers C. E., Kalanithi J., Gluck M. A. Basal ganglia and dopamine contributions to probabilistic category learning. Neurosci Biobehav Rev. 2008;32:219–36. [PMC free article: PMC2705841] [PubMed: 18061261]
  119. Small D. M., Zatorre R. J., Dagher A., Evans A. C., Jones-Gotman M. Changes in brain activity related to eating chocolate: From pleasure to aversion. Brain. 2001;124:1720–33. [PubMed: 11522575]
  120. Smith K. S., Tindell A. J., Aldridge J. W., Berridge K. C. Ventral pallidum roles in reward and motivation. Behav Brain Res. 2009;196:155–67. [PMC free article: PMC2606924] [PubMed: 18955088]
  121. Stalnaker T. A., Franz T. M., Singh T., Schoenbaum G. Basolateral amygdala lesions abolish orbitofrontal-dependent reversal impairments. Neuron. 2007;54:51–58. [PubMed: 17408577]
  122. Stuss D. T., Alexander M. P. Is there a dysexecutive syndrome? Philos Trans R Soc Lond B Biol Sci. 2007;362:901–15. [PMC free article: PMC2430005] [PubMed: 17412679]
  123. Stuss D. T., Floden D., Alexander M. P., Levine B., Katz D. Stroop performance in focal lesion patients: Dissociation of processes and frontal lobe lesion location. Neuropsychologia. 2001;39:771–86. [PubMed: 11369401]
  124. Sugrue L. P., Corrado G. S., Newsome W. T. Matching behavior and the representation of value in the parietal cortex. Science. 2004;304:1782–87. [PubMed: 15205529]
  125. Tomb I., Hauser M., Deldin P., Caramazza A. Do somatic markers mediate decisions on the gambling task? Nat Neurosci. 2002;5:1103–4; author reply 1104. [PubMed: 12403997]
  126. Tversky A. Intransitivity of preferences. Psychol Rev. 1969;76:31–48.
  127. Wheeler E. Z., Fellows L. K. The human ventromedial frontal lobe is critical for learning from negative feedback. Brain. 2008;131:1323–31. [PubMed: 18344561]
  128. Yin H. H., Knowlton B. J. The role of the basal ganglia in habit formation. Nat Rev Neurosci. 2006;7:464–76. [PubMed: 16715055]
Copyright © 2011 by Taylor and Francis Group, LLC.