NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Gottfried JA, editor. Neurobiology of Sensation and Reward. Boca Raton (FL): CRC Press/Taylor & Francis; 2011.


Chapter 15. Orbitofrontal Cortex and Outcome Expectancies: Optimizing Behavior and Sensory Perception



Orbitofrontal cortex has long been associated with adaptive, flexible behavior. Indeed, the argument has been made that the ability of humans to adapt so rapidly to changing circumstances is linked, in part, to the expansion of this and other prefrontal regions. The association between the orbitofrontal cortex and adaptive behavior is apparent in Dr. John Harlow's account (Harlow 1868) of the erratic, inflexible, stimulus-bound behavior of Phineas Gage, who reportedly suffered extensive damage to the orbital prefrontal regions (Damasio et al. 1994). Since then, increasingly refined experimental work has demonstrated repeatedly that damage to the orbitofrontal region, a set of loosely defined areas in the prefrontal regions overlying the orbits (Price 2007), impairs the ability of animals and humans to rapidly change their behavior in the face of changing contingencies and unexpected outcomes (see Chapter 16 in this volume for further considerations of this topic).

Two dominant hypotheses have been advanced to explain the role of orbitofrontal cortex in adaptive behavior. The first was that orbitofrontal cortex is critical for inhibiting inappropriate or incorrect responses. The second was that orbitofrontal cortex supports rapid changes in behavior because it serves as a rapidly flexible encoder of associative information. By this account, orbitofrontal cortex is faster at learning new information than other brain areas and thereby drives selection of the correct response or, perhaps, inhibits selection of the incorrect response.

More recently the orbitofrontal cortex has been shown to be critical to signaling of outcome expectancies—signals concerning the characteristics, features, and specific value of particular outcomes that are predicted by cues (and perhaps responses; though see Ostlund and Balleine 2007b) in the environment (Schoenbaum and Roesch 2005). Here we will argue that this function provides a better explanation for the role of the orbitofrontal cortex in adaptive behavior than either of the established hypotheses. First, we will review data from reversal learning tasks, showing that orbitofrontal cortex is critical for changing behavior in the face of unexpected outcomes. Then, we will provide a brief overview of the two dominant hypotheses, followed by data that directly contradict both accounts. Thereafter, we will review more recent evidence that orbitofrontal cortex is critical to signaling information about expected outcomes. As we will show, these signals are prominent in the neural activity and BOLD response in orbitofrontal cortex, and their role in guiding behavior is evident in deficits caused by orbitofrontal damage in a variety of behavioral settings in which outcomes must be used to guide normal behavior, even when contingencies are not changing. We will suggest that these same signals are also necessary for the detection of prediction errors when contingencies are changing, thereby facilitating changes in associative representations in other brain areas and, ultimately, behavior. Finally, we will suggest that expectancy signals in orbitofrontal cortex might also impact early sensory regions, optimizing behavior through context-dependent firing and recall of environmental cues predicting future reward.


The role of orbitofrontal cortex in adaptive behavior is typically assessed experimentally using tasks that incorporate reversal learning. In these tasks, a subject is first trained that responding under one circumstance leads to reward and that responding under another circumstance will lead to non-reward or punishment. After stable responding is established, the contingencies or associations are reversed, and the subject must switch or reverse responding. This can be done in rats using a simple odor discrimination task. Rats are trained to sample odor cues at a centrally located port and then visit a nearby fluid well. Rats learn that one odor predicts sucrose, while a different odor predicts quinine. Rats like sucrose but want to avoid quinine, and so they learn to discriminate between the two odor cues, making “go” responses after sampling the positive, sucrose-predicting cue and withholding that response, called a “no-go,” after sampling the negative, quinine-predicting cue. After they have learned this discrimination to a 90% performance criterion and are responding stably at that level, we can assess their ability to rapidly reverse their responses by switching the odor-outcome associations.

As illustrated in Figure 15.1, lesions of the orbitofrontal cortex cause a specific impairment in the ability of rats to rapidly reverse responding in this setting (Schoenbaum et al. 2003a). Lesions encompassed the dorsal bank of the rhinal sulcus, including the laterally located orbital and agranular insular regions. These areas have a pattern of connectivity with sensory regions, mediodorsal thalamus, basolateral amygdala, and ventral striatum that approximates that of orbital regions in primate species (Schoenbaum and Setlow 2001). Lesions did not include medial orbital areas, which have a different pattern of connectivity. Importantly, these lesions had no effect on the ability of the rats to acquire the odor problems (Figure 15.1a). Thus orbitofrontal-lesioned rats were able to detect and discriminate between the odor cues, were similarly motivated to respond for sucrose and to avoid quinine, and were able to modify responding when first introduced to a negative odor cue. However, these rats were markedly slower to modify responding when the odor-outcome associations of the final odor pair were reversed (Figure 15.1b); they required approximately twice as many trials as controls to relearn this discrimination after the first reversal, and this deficit reemerged when the problem was re-reversed. This is the classic reversal learning deficit that is associated with orbitofrontal damage. This same deficit has been shown repeatedly by a host of different labs over the past 50 years across species, designs, and learning materials (Teitelbaum 1964; Butter 1969; Jones and Mishkin 1972; Rolls et al. 1994; Bechara et al. 1997; Meunier, Bachevalier, and Mishkin 1997; Schoenbaum et al. 2002; Chudasama and Robbins 2003; Fellows and Farah 2003; McAlonan and Brown 2003; Hornak et al. 2004; Izquierdo, Suda, and Murray 2004; Pais-Vieira, Lima, and Galhardo 2007; Bissonette et al. 2008; Reekie et al. 2008). 
Indeed this may be one of the more reliable brain-behavior relationships of which we are aware, rivaling even that of hippocampus and declarative memory in its reproducibility and robustness.

FIGURE 15.1. Effect of orbitofrontal lesions on rapid reversal of trained responses. Rats were trained to sample odors at a central port and then respond at a nearby fluid well. In each odor problem, one odor predicted sucrose and a second quinine. Rats had to learn (more...)


Such a clear association between a particular brain area and a particular cognitive function begs an explanation. The search for an explanation has been dominated by two proposals. In this section, we will review these proposals and then discuss evidence that directly contradicts them.

15.3.1. Orbitofrontal Cortex as Inhibitor of Responding

The first and perhaps still most prevalent proposal is that orbitofrontal cortex is critical for adaptive behavior generally—and reversal learning in particular—because it plays a fundamental role in inhibiting inappropriate responses. This idea has a long history, going back at least 135 years to writings by David Ferrier (1876), who proposed, based on experimental work in dogs and monkeys, that the frontal lobes might be critical to suppressing motor acts in favor of attention and planning. Though also associated generally with prefrontal cortex (Mishkin 1964), the function of response inhibition has become more closely associated with orbitofrontal function in the past several decades, because it provides a very attractive description of the general and even specific behavioral effects of orbitofrontal damage (Fuster 1997). For example, this explanation fits well with the “orbitofrontal syndrome,” which is typically described as a constellation of symptoms including impulsive, disinhibited, and perseverative responding (Damasio 1994), and of course it also provides an excellent description of reversal learning deficits (Teitelbaum 1964; Butter 1969; Jones and Mishkin 1972; Rolls et al. 1994; Bechara et al. 1997; Meunier, Bachevalier, and Mishkin 1997; Schoenbaum et al. 2002; Chudasama and Robbins 2003; Fellows and Farah 2003; McAlonan and Brown 2003; Hornak et al. 2004; Izquierdo, Suda, and Murray 2004; Pais-Vieira, Lima, and Galhardo 2007; Bissonette et al. 2008; Reekie et al. 2008) as well as deficits in detour-reaching and stop-signal tasks (Wallis et al. 2001; Dillon and Pizzagalli 2007; Eagle et al. 2008; Torregrossa, Quinn, and Taylor 2008).

However, this idea does not have very good predictive power outside of these settings. In fact, it is quite easy to find behaviors that demand response inhibition yet are unaffected by orbitofrontal damage. For example, in many of the reversal studies described earlier, orbitofrontal-lesioned subjects are able to successfully inhibit the same responses during initial learning that they have such difficulty inhibiting after reversal (Rolls et al. 1994; Bechara et al. 1997; Meunier, Bachevalier, and Mishkin 1997; Schoenbaum et al. 2002; Pais-Vieira, Lima, and Galhardo 2007). This includes our setting, in which rats have to inhibit a strongly pre-trained response at the fluid well during initial discrimination learning (Figure 15.1a) (Schoenbaum et al. 2003a). Before they are ever trained to withhold responding, rats in our studies typically receive 500–1000 shaping trials in which they sample an odorized air stream and then respond for reward. Yet when a new odor cue that predicts a negative outcome is introduced, they learn to inhibit this highly trained response at the same rate as controls, and they show the same improvement over several different odor discrimination problems. These data indicate that orbitofrontal cortex is not necessary, generally, for inhibiting pre-trained responses.

Similarly, orbitofrontal cortex is not required for inhibiting “pre-potent” or innate response tendencies. For example, in reinforcer devaluation tasks, animals with orbitofrontal lesions are readily able to withhold the selection or consumption of food that has been paired with illness or fed to satiety (Gallagher, McMahan, and Schoenbaum 1999; Baxter et al. 2000; Pickens et al. 2003; Izquierdo, Suda, and Murray 2004; Pickens et al. 2005; Burke et al. 2008). This ability can also be demonstrated in a reversal setting, as in a study by Chudasama, Kralik, and Murray (2007). In this study, monkeys were allowed to choose between different-sized peanut rewards. To receive the larger amount, they had to select the smaller one. Thus the monkeys had to switch or reverse their innate bias toward selecting the reward that they wanted. Monkeys with orbitofrontal damage learned to do so just as well as controls. These data also indicate that orbitofrontal cortex is not necessary for inhibiting innate responses. Thus, while the inability to inhibit inappropriate responses is an effect or symptom of orbitofrontal damage that is evident in a number of settings, including reversals, this description does not provide an adequate all-inclusive definition of the underlying function that orbitofrontal cortex contributes to behavior.

15.3.2. Orbitofrontal Cortex as Flexible Encoder of Associative Information

More recently it has been proposed that the orbitofrontal cortex is critical to adaptive behavior because it is a rapidly flexible encoder of associative information, particularly for associations between cues and appetitive and aversive outcomes (Rolls 1996). According to this hypothesis, the orbitofrontal cortex is better than other brain areas at rapidly encoding the new, correct associations and contributes to adaptive behavior by signaling this information to other areas, thereby driving selection of the correct response or, perhaps, inhibiting selection of the incorrect one.

This idea has its roots in single-unit recording studies reporting that cue-evoked neural activity in orbitofrontal cortex rapidly reflects associative information, under normal circumstances and also during reversal learning. This was first reported by Rolls and colleagues (Thorpe, Rolls, and Maddison 1983), who noted that single-units in monkey orbitofrontal cortex would often change their firing to objects, such as syringes, when their association with reward was changed, say by filling them with aversive saline rather than rewarding juices. The change in firing occurred rapidly after the monkeys experienced the unexpected outcome. Rolls has also reported similar reversal of associative encoding in single-units recorded in a visual discrimination task (Rolls et al. 1996). We’ve seen similar correlates in orbitofrontal neurons recorded in rats learning to reverse odor discriminations (Schoenbaum et al. 1999, 2003b; Stalnaker et al. 2006).

The prominent reversal of associative correlates in these orbitofrontal neurons during reversal learning and the poor reversal performance caused by orbitofrontal damage are clearly consistent with the idea that this area supports adaptive behavior because it is particularly efficient at learning new associative information. According to this proposal, orbitofrontal cortex acts as a rapidly flexible associative look-up table, deciphering the outcome associated with a particular cue after reversal more rapidly and with greater accuracy than any other brain region. This proposal has enormous explanatory power for these data; yet like the response inhibition proposal above, it is not consistent with more recent evidence that directly tests its predictions.

For example, if orbitofrontal cortex is an associative look-up table, one might expect reversal of encoding to be a dominant feature across an ensemble of orbitofrontal neurons. Yet a broader consideration of the neural correlates in orbitofrontal cortex reveals that this is not true (Stalnaker et al. 2006). This is illustrated in Figure 15.2, which shows the average response of all cue-selective neurons during a reversal. As a group, these neurons do not reverse or recode the associations across reversal. The populations fail to reverse even though about 25% of the neurons in these populations do reverse encoding. The reason for this is apparent in the inset in Figure 15.2, which plots an index of cue-selectivity for each neuron before and after reversal. This plot shows that there is no relationship between cue-selectivity before and after reversal in orbitofrontal neurons. Thus reversal of encoding in orbitofrontal neurons can only be demonstrated by cherry-picking neurons that reverse from the overall population. This result is clearly inconsistent with the view of this region as a particularly flexible associative learning area, as suggested by reports focusing on reversal correlates in single-units.
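The population analysis just described, correlating each neuron's cue-selectivity before reversal with its selectivity after reversal, can be sketched as follows. This is only an illustrative toy: the index definition and all firing rates here are invented for demonstration, not taken from the study.

```python
import numpy as np

def cue_selectivity(rate_odor1, rate_odor2):
    """Signed selectivity index: +1 = fires only to odor 1, -1 = fires only to odor 2."""
    total = rate_odor1 + rate_odor2
    return 0.0 if total == 0 else (rate_odor1 - rate_odor2) / total

# Hypothetical mean firing rates (spikes/s) for a few cue-selective neurons,
# measured before and after reversal of the odor-outcome contingencies.
pre = [cue_selectivity(12.0, 3.0), cue_selectivity(2.0, 9.0), cue_selectivity(8.0, 8.0)]
post = [cue_selectivity(5.0, 6.0), cue_selectivity(10.0, 2.0), cue_selectivity(1.0, 7.0)]

# A truly flexible "look-up table" region should show a strong negative
# correlation between pre- and post-reversal indices (selective neurons flip
# sign); a correlation near zero means the population as a whole did not
# recode the associations, even if some individual neurons reversed.
r = np.corrcoef(pre, post)[0, 1]
```

On this logic, the scatter in the Figure 15.2 inset corresponds to an r near zero for orbitofrontal cortex, whereas the basolateral amygdala data discussed below correspond to a significantly negative r.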

FIGURE 15.2. (See Color Insert) Flexibility of associative encoding in orbitofrontal cortex. Population response of neurons in orbitofrontal cortex identified as cue-selective during learning. Average activity per neuron is shown, synchronized to odor onset, during (more...)

This view is also contradicted by the relationship between reversal of encoding in orbitofrontal neurons and reversal performance. If orbitofrontal cortex were driving performance due to an ability to rapidly encode the new associations, then one should expect reversal of encoding to be associated with rapid reversal learning, whereas a failure to reverse encoding should be associated with slower learning. However, we find exactly the opposite relationship (Stalnaker et al. 2006): rats acquire reversals significantly more slowly when we observe reversal of encoding in orbitofrontal neurons.

And finally, we now know that orbitofrontal cortex is far from unique in showing reversal of associative encoding; cue-selective neurons in many other brain regions also reverse firing during reversal learning, and they do so much more rapidly and in greater proportions. This is particularly evident in the basolateral amygdala, where the majority of the cue-selective neurons—55%–60%—reverse firing (Schoenbaum et al. 1999; Saddoris, Gallagher, and Schoenbaum 2005; Stalnaker et al. 2007b). This is illustrated by the population responses in Figure 15.3, which switch cue-selectivity, and by the inset, which shows a significant inverse correlation between cue-selectivity before and after reversal in basolateral amygdala neurons. Similar results have also been reported in a Pavlovian reversal task in primates (Paton et al. 2006).

FIGURE 15.3. (See Color Insert) Flexibility of associative encoding in basolateral amygdala. Population response of neurons in basolateral amygdala identified as cue-selective during learning. Average activity per neuron is shown, synchronized to odor onset, during (more...)


If orbitofrontal cortex is not critical for adaptive behavior either due to a special role in response inhibition or because this area is faster than other regions at encoding new associative information, then why is orbitofrontal cortex necessary for adaptive behavior? What underlying function does orbitofrontal cortex provide that facilitates changes in behavior when outcomes are not as expected?

In the next two sections, we will suggest that this underlying function involves signaling what have been called outcome expectancies—literally predictions about the characteristics, features, and specific value of outcomes that the animal expects to receive, given particular circumstances and cues in the environment (Schoenbaum and Roesch 2005). In this section, we will show that these signals are prominent in neural activity and BOLD responses in orbitofrontal cortex and argue that the importance of these signals is evident in deficits caused by orbitofrontal damage in a variety of behavioral settings in which outcomes must be used to guide normal behavior, even when contingencies are not changing. In the next section, we will suggest that these same signals are also necessary for the detection of prediction errors when contingencies are changing, thereby facilitating changes in associative representations in other brain areas and, ultimately, behavior.

15.4.1. Neural Correlates

Signals reflecting outcome expectancies are prominent in neural activity in orbitofrontal cortex. In addition to the cue-selective activity described above, orbitofrontal neurons also tend to fire in advance of meaningful events in trial-based tasks. Indeed this may be the most striking feature, overall, of neural activity in orbitofrontal cortex. Orbitofrontal neurons exhibit firing correlated with every event in the trial, and typically this activity is not triggered by these events but rather increases in anticipation of them (Schoenbaum and Eichenbaum 1995; Lipton, Alvarez, and Eichenbaum 1999; Ramus and Eichenbaum 2000). In other words, event-related activity in orbitofrontal neurons anticipates these predictable and presumably value-laden events.

Such anticipatory activity is particularly strong prior to delivery of primary rewarding or aversive outcomes. We see this during discrimination learning, when distinct populations of neurons in orbitofrontal cortex develop selective firing prior to delivery of sucrose or quinine (Schoenbaum, Chiba, and Gallagher 1998). Often these neurons initially fire to one or the other outcome and then come to fire in anticipation of that outcome and, later, to cues that predict the outcome. Unlike reward-responsive dopamine neurons, which also transfer activity to predictive events (Montague, Dayan, and Sejnowski 1996; Hollerman and Schultz 1998; Waelti, Dickinson, and Schultz 2001; Bayer and Glimcher 2005; Pan et al. 2005; Roesch, Calu, and Schoenbaum 2007), these neurons do not stop firing to the rewards (Schoenbaum et al. 2003b; Stalnaker et al. 2006). As a result, their activity is not well described as a prediction error signal (see also later discussion below and Chapter 14); rather, activation of these neurons during progressively earlier periods in the trial is better explained as a representation of the actual outcome. Interestingly, this activity develops independently of choice performance, which is not orbitofrontal dependent, but instead develops in concert with changes in response latencies that seem to reflect active anticipation of particular outcomes.

Similar correlates have also been reported by a number of other investigators in rats, monkeys, and humans, in both single-unit activity and in BOLD response (Schoenbaum, Chiba, and Gallagher 1998; Tremblay and Schultz 1999; Schultz, Tremblay, and Hollerman 2000; Roesch, Taylor, and Schoenbaum 2006; van Duuren et al. 2007). For example, Schultz and colleagues have shown that orbitofrontal neurons exhibit outcome-expectant activity during a visual delayed response task (Tremblay and Schultz 1999). Monkeys were trained to respond to visual cues to obtain different food rewards. As illustrated by the single-unit example in Figure 15.4, many orbitofrontal neurons exhibited differential activity after responding, in anticipation of presentation of a particular food item. The monkey expected that food and the neuron’s activity reflected that expectation. This anticipatory activity did not simply reflect the physical or sensory attributes of the expected food reward, but rather reflected something about the value the monkey placed on this item. This was revealed by recording from the same neurons across blocks in which the relative preference for the different food rewards shifted. Outcome-expectant activity often changed to reflect changes in the relative value of a particular food across blocks. Although transitive encoding of the value of multiple rewards in orbitofrontal cortex has recently been shown in a randomized design (Padoa-Schioppa and Assad 2008), Schultz’s original report is important because by reliably altering the relative values of available rewards, it provides an excellent demonstration that these anticipatory signals reflect the animal’s judgment regarding the value of the expected outcome.

FIGURE 15.4. Outcome-expectant neural activity in monkey orbitofrontal cortex. Monkeys were trained to respond after presentation of visual cues to obtain different food rewards. The visual items each predicted a particular food, for which the monkeys had different (more...)

Notably, although anticipatory firing has also been observed in other areas (Watanabe 1996; Schoenbaum, Chiba, and Gallagher 1998; Hikosaka and Watanabe 2000; Tremblay and Schultz 2000; Shidara and Richmond 2002; Wallis and Miller 2003; Hikosaka and Watanabe 2004; Sugase-Miyamoto and Richmond 2005; Paton et al. 2006), studies that have compared such activity between different brain areas have found that it emerges first in orbitofrontal cortex (Schoenbaum, Chiba, and Gallagher 1998), and outcome-expectant activity in other areas—basolateral amygdala specifically—is reduced to near chance levels by even unilateral lesions of orbitofrontal cortex (Saddoris, Gallagher, and Schoenbaum 2005). This suggests that orbitofrontal neurons are, to some extent, constructing these representations based on afferent input, rather than simply receiving this information from other areas. Indeed, we would suggest that firing in anticipation of outcomes during delays or uncued periods of a task is just a special example of a more general function that orbitofrontal cortex fulfills, even during other periods in the task.

15.4.2. Behavioral Correlates

Of course, the neural firing patterns described above are only correlates of behavior. Alone they cannot provide conclusive evidence that orbitofrontal cortex is critical for signaling information about expected outcomes. For that one must rely on behavioral studies involving lesions or other manipulations. If orbitofrontal cortex is in fact critical for signaling outcome expectancies, then manipulations that disrupt or alter output from this brain area should preferentially disrupt behaviors that depend on information about outcomes.

This turns out to be an excellent description of many, if not most, orbitofrontal-dependent behaviors. A full accounting of the entire list is beyond the scope of this review; however, to illustrate we will describe a few examples in detail in addition to providing a list of others. The first comes from the odor discrimination task described above. At the same time rats are learning to withhold responding to an odor that predicts a negative outcome, they also show changes in the speed or latency at which they respond to the fluid well; they begin to respond faster after sampling a positive-predictive odor, as if expecting sucrose, and slower after sampling a negative-predictive odor, as if expecting quinine (Schoenbaum et al. 2003a). The difference in their response latencies becomes significant before accurate choice performance emerges and, as noted earlier, develops at the same time that neurons in the orbitofrontal cortex begin to show differential firing in anticipation of sucrose or quinine (Schoenbaum, Chiba, and Gallagher 1998). Furthermore, response latencies have been shown to be particularly sensitive to information about outcomes (Sage and Knowlton 2000). Manipulations affecting orbitofrontal cortex selectively abolish differential latency changes in our task and affect latency changes in other settings (Bohn, Giertler, and Hauber 2003a, 2003b; Schoenbaum et al. 2003a).

A second and perhaps more conclusive example showing a critical role for orbitofrontal cortex in the use of information about expected outcomes comes from studies using Pavlovian reinforcer devaluation (Holland and Rescorla 1975; Holland and Straub 1979). In these studies, an animal is trained that a cue predicts a particular reward. Subsequently, the value of the reward is reduced by pairing it with illness or selective satiation, and then the animal’s ability to access and use that new value to guide the learned responding is assessed by presenting the cue by itself. Normal animals show reduced responding to the predictive cue, reflecting their ability to access and use cue-evoked representations of the reward and its current value. Monkeys and rats with orbitofrontal lesions fail to show this normal effect of devaluation (Gallagher, McMahan, and Schoenbaum 1999; Izquierdo, Suda, and Murray 2004; Machado and Bachevalier 2007); although these animals stop consuming the devalued or satiated food, they continue to respond to the cue that predicts that food to the same extent as non-devalued controls. Critically, the deficit caused by orbitofrontal damage is evident even when orbitofrontal cortex is present for the initial training and for devaluation (Pickens et al. 2003, 2005); thus the deficit seems to reflect a critical role for orbitofrontal cortex in mobilizing and using the learned information about the value of the expected outcome to guide or influence responding. This observation is consistent with data in monkeys and humans that neural activity in orbitofrontal cortex changes in real time as a result of satiation (Critchley and Rolls 1996; O’Doherty et al. 2000; Gottfried, O’Doherty, and Dolan 2003). This distinguishes the role of orbitofrontal cortex in this setting from that of other regions, like amygdala or mediodorsal thalamus, which seem to be necessary during the earlier phases (Hatfield et al. 1996; Malkova, Gaffan, and Murray 1997; Pickens et al. 2003; Wellmann, Gale, and Malkova 2005; Mitchell, Browning, and Baxter 2007).

A third example comes from work by Balleine and colleagues using Pavlovian-to-instrumental transfer. Transfer occurs when animals are independently trained to associate a cue with reward (Pavlovian phase) and to associate an instrumental response such as a lever press with reward (instrumental phase), after which they show an increased rate of instrumental responding in the presence of the Pavlovian cue (Estes 1948; Holland 2004). There are two forms of this transfer: a general form, observed when the rewards predicted by the cue and response are different, and a specific form, observed when the reward is the same in both cases. The specific form is thought to depend on the ability of the cue to evoke a representation of the particular reward linked to the instrumental response. Consistent with the proposal that orbitofrontal cortex is critical for such outcome signaling, Balleine and colleagues have reported that specific transfer is particularly sensitive to orbitofrontal lesions (Ostlund and Balleine 2007a). Again the orbitofrontal cortex, unlike areas such as basolateral amygdala (Corbit and Balleine 2005), appears to be required specifically at the time the information must be used to guide responding, since only post-training lesions were effective.

These examples indicate that one fundamental role of neural activity in orbitofrontal cortex is to signal information about expected outcomes to other brain areas. Other examples of orbitofrontal-dependent behaviors further corroborate this account. These would include delay discounting (Mobini et al. 2002; Kheramin et al. 2003; Winstanley et al. 2004), conditioned reinforcement and other second-order behaviors (Cousens and Otto 2003; Hutcheson and Everitt 2003; Pears et al. 2003; Burke et al. 2008), Pavlovian approach behaviors (Chudasama and Robbins 2003), the enhancement of discriminative responding by different outcomes (McDannald et al. 2005), and even cognitive and affective processes currently under investigation in humans, such as regret and counterfactual reasoning (Camille et al. 2004). In each case, normal performance would require the ability to signal, in real time, information about the features, characteristics, and values of outcomes predicted by cues and circumstances in the environment.


Of course, the orbitofrontal-dependent behaviors described above differ from adaptive behavior as assessed by reversal learning or related tasks, since they generally do not involve changes in established associations or response contingencies. For example, orbitofrontal inactivation impairs changes in cue-evoked responding after devaluation, even though nothing about the underlying associations has changed; only the value of the outcome has been altered. Similarly, damage to orbitofrontal cortex disrupts transfer, even when lesions are made after the rats have learned the cue-reward and response-reward associations underlying this phenomenon. These data indicate that signals regarding expected outcomes from orbitofrontal cortex influence judgment and decision making.

So how does signaling of expected outcomes help explain why orbitofrontal cortex is so important for modifying behavior when contingencies are changing? One possibility is that these signals may also be critical for driving the updating of associative representations in other brain regions in the face of unexpected outcomes, particularly in subcortical areas, such as striatum and amygdala, which are strongly implicated in associative learning processes. According to classical learning theory (Rescorla and Wagner 1972), and its more modern cousin reinforcement learning (Sutton and Barto 1998), associative learning is driven by prediction errors (for further details see Chapter 14). A prediction error (δ) is calculated from the difference between the value of the outcome that is predicted by actions and cues in the environment (V) and the value of the outcome that is actually received (λ), according to the equation δ = c(λ − V), where c reflects processes like surprise or attention, which can influence the rate of learning. There is now strong evidence that these prediction errors are signaled by phasic activity in midbrain dopamine neurons, as well as afferent regions such as habenula; such phasic firing is proposed to act as a teaching signal to stamp in associative representations in areas like striatum and amygdala. If signaling of expected outcomes by orbitofrontal cortex were also contributing to calculation of these prediction errors, essentially providing information necessary to compute V, then that would explain why orbitofrontal damage disrupts changes in behavior when contingencies are altered.
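The error-driven update just described can be sketched in a few lines. This is a toy illustration of the equation in the text; the learning rate and trial values are assumptions for demonstration, not parameters from the chapter.

```python
def prediction_error(v, lam, c=1.0):
    """delta = c * (lambda - V): mismatch between received and predicted outcome value."""
    return c * (lam - v)

def update_value(v, lam, alpha=0.2, c=1.0):
    """One trial of error-driven learning: V <- V + alpha * delta."""
    return v + alpha * prediction_error(v, lam, c)

# Acquisition: the cue reliably predicts reward (lambda = 1), so positive
# prediction errors drive the cue's value V up toward 1.
v = 0.0
for _ in range(20):
    v = update_value(v, lam=1.0)

# Reversal: the cue now predicts nothing (lambda = 0). Negative prediction
# errors, which on the account above depend on an intact orbitofrontal
# expectancy signal supplying V, drive the value back down toward 0.
for _ in range(20):
    v = update_value(v, lam=0.0)
```

Note that if the expectancy term V were unavailable (as proposed for orbitofrontal damage), λ − V could not register the omitted reward as surprising, and the stored associations downstream would change only slowly, mirroring the reversal deficit.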

This proposal makes a number of testable predictions. For starters, it predicts that changes in associative representations in downstream regions, such as amygdala, should be dependent on orbitofrontal cortex. Consistent with this prediction, we have reported that associative encoding in basolateral amygdala in our reversal task is markedly less flexible in orbitofrontal-lesioned rats (Figure 15.5) than that in controls (Figure 15.3) (Saddoris, Gallagher, and Schoenbaum 2005). Neurons were less likely to become selective for predictive cues with training, and those that were present failed to reverse their cue-selectivity. Moreover, this downstream inflexibility appears to be the proximal cause of the orbitofrontal-dependent reversal deficit, since lesions or inactivation of basolateral amygdala abolishes the reversal deficit caused by orbitofrontal lesions (Stalnaker et al. 2007a). Since recoding of associations in orbitofrontal cortex lags recoding in basolateral amygdala (Schoenbaum et al. 1999), these results cannot be easily explained as reflecting rapid flexibility in orbitofrontal cortex. However they are fully consistent with the proposal that signaling of the old associations by orbitofrontal neurons facilitates encoding of the new associations by downstream areas and thereby changes behavior. Indeed, this is consistent with our report that better reversal performance is observed when cue-selectivity in orbitofrontal cortex fails to reverse (Stalnaker et al. 2006). Under our proposal, the explanation for this finding would be that when orbitofrontal neurons continue to encode pre-reversal associations, negative prediction error signals are facilitated, leading to faster learning.

FIGURE 15.5. Flexibility of associative encoding in basolateral amygdala depends on input from orbitofrontal cortex.


Population response of cue-selective neurons in basolateral amygdala in rats with ipsilateral lesions of orbitofrontal cortex. Average activity per (more...)

Other data consistent with this proposal come from a Pavlovian over-expectation task (Rescorla 1970). In this task, rats are first trained that several Pavlovian cues are each independent predictors of reward. Subsequently, two of the previously trained cues are presented together in compound, followed by the same reward. When the effect of this compound training on responding to the individual cues is assessed later in a probe test, a spontaneous reduction in responding is found. This reduced responding is thought to result from the violation of summed expectations for reward during compound training. That is, animals expect to get double the reward after the compound cues, whereas they actually get only the usual reward. Notably, unlike reversal learning, nothing about the outcome is changed. Instead, prediction errors are induced by directly manipulating the animals’ expectations for reward. Furthermore, the learning induced by these manipulations can be dissociated from the use of the newly acquired information, since the former occurs during compound training and the latter in the probe test.
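Under the Rescorla-Wagner summation rule, this design can be simulated directly; the function name, parameter values, and trial counts below are illustrative assumptions, not fits to the actual experiment.

```python
# Illustrative simulation of the over-expectation design under the
# Rescorla-Wagner summation rule: the compound's prediction is the sum
# of the individual cue values, so pairing the compound with a single
# reward yields a negative prediction error shared by both cues.

def train(V, cues, lam, c=0.2, n_trials=20):
    for _ in range(n_trials):
        delta = c * (lam - sum(V[cue] for cue in cues))  # summed prediction
        for cue in cues:
            V[cue] += delta                              # error shared by cues
    return V

V = {"A": 0.0, "B": 0.0}
train(V, ["A"], lam=1.0)        # cue A alone comes to predict one reward
train(V, ["B"], lam=1.0)        # cue B alone comes to predict one reward
train(V, ["A", "B"], lam=1.0)   # compound: expected 2 rewards, delivered 1
# V["A"] and V["B"] end below their pre-compound values, paralleling the
# reduced responding to the individual cues in the probe test.
```

Note that the individual cue values fall even though each cue, presented alone, was never followed by anything other than its usual reward; only the summed expectation was violated.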

Using this task, we have found that reversible inactivation of orbitofrontal cortex during compound training prevents the later reduction in responding to the individual cues (Figure 15.6a, b) (Takahashi et al. 2009). This result cannot be explained as a simple deficit in using associative information encoded by orbitofrontal neurons to guide behavior, because orbitofrontal cortex is fully functional at the time the behavior is assessed. Instead, it indicates that signals from orbitofrontal cortex are required for learning, which presumably occurs in other regions. Consistent with this idea, blockade of orbitofrontal NMDA receptors in a separate group of rats during compound training had no effect on over-expectation. The most coherent interpretation of these data is that orbitofrontal cortex contributes to teaching signals that update representations in other areas.

FIGURE 15.6. Effect of bilateral inactivation of ventral tegmental area, or contralateral inactivation of orbitofrontal cortex and ventral tegmental area, on changes in behavior after over-expectation.


Rats in all groups conditioned normally and maintained responding (more...)

Interestingly, it has been suggested that orbitofrontal neurons might directly signal prediction errors (Schultz and Dickinson 2000; Rolls and Grabenhorst 2008). This proposal is supported by a number of brain imaging studies that have reported that neural activity in OFC, as reflected in BOLD signal, is correlated with errors in reward prediction (Nobre et al. 1999; Berns et al. 2001; O’Doherty et al. 2003; Dreher, Kohn, and Berman 2006; Tobler et al. 2006). For example, BOLD signal in regions within OFC increases abruptly when expectations for reward are not met (Nobre et al. 1999) and this signal conforms with formal learning theory predictions in a blocking paradigm (Tobler et al. 2006). Thus, OFC might contribute to learning during over-expectation if it were directly signaling reward prediction errors.

However, data from single-unit recording studies are not consistent with this proposal. Aside from anecdotal reports (Thorpe, Rolls, and Maddison 1983; Ramus and Eichenbaum 2000; Feierstein et al. 2006), there is very little evidence for appreciable error encoding by orbitofrontal neurons. Moreover, a direct comparison of prediction error correlates in dopamine neurons, which have been clearly demonstrated to signal reward prediction errors, against those in orbitofrontal neurons, shows unambiguously that the two areas do not signal this information similarly (Takahashi et al. 2009). As expected, dopamine neurons increased firing in response to unexpected reward and suppressed firing on reward omission (Figure 15.7a). These changes were inversely correlated, and the loss of activity to reward was also inversely correlated with the development of activity to predictive cues. However none of these features were evident in the activity of reward-responsive (or any other) orbitofrontal neurons. Instead these neurons actually tended to fire more to an expected reward. This is because the normal reward response is essentially shifted backward in time (Figure 15.7b), so that the orbitofrontal response develops in anticipation of the expected reward, while the dopamine neuron response develops after an unexpected reward. Activity shows a similar time course prior to reward on omission trials. Thus activity in orbitofrontal neurons signals reward predictions and not reward prediction errors.

FIGURE 15.7. Activity in ventral tegmental area dopamine neurons and orbitofrontal neurons in response to unexpected and expected reward delivery and omission of an expected reward.


As expected, dopamine neurons (a) exhibited greater firing in response to unexpected (more...)

The relationship between signaling of reward predictions by orbitofrontal neurons and reward prediction errors by dopamine neurons is consistent with the proposal that the two areas interact in calculating errors; when activity before reward in orbitofrontal cortex is low, consistent with low reward expectations, activity after reward in the dopamine neurons is high (and vice versa). Accordingly, inactivation of ventral tegmental area during compound training in the over-expectation task abolishes the normal decline in responding, as does unilateral inactivation of orbitofrontal cortex in one hemisphere and ventral tegmental area in the contralateral hemisphere (Figure 15.6c). These results, together, argue strongly that information regarding expected outcomes signaled by orbitofrontal neurons contributes to calculation of reward prediction errors signaled by dopamine neurons.


Above we have argued that information regarding expected outcomes in orbitofrontal neurons influences prediction errors and associative learning in downstream areas more closely tied to decision making. However orbitofrontal cortex might also work upstream, modulating activity in more sensory-related structures such as piriform cortex. Piriform cortex, the largest of the olfactory cortical areas, is part of a bidirectional system involved in processing olfactory information (Haberly 2001). On one hand, piriform cortex, in particular anterior regions, has strong reciprocal connections with olfactory bulb and has traditionally been thought of as primary olfactory cortex. On the other hand, piriform cortex also receives substantial descending input from downstream brain areas, including orbitofrontal cortex and amygdala (Johnson et al. 2000; Majak et al. 2004). This relationship suggests that piriform cortex might not function solely as a primary sensory region, but also as an association cortex, integrating incoming olfactory information with descending input from higher-order association areas such as OFC (also see Chapter 5 in this volume for further details).

Consistent with this notion we have shown that activity to odor cues in anterior and posterior piriform cortex during performance of our go/no-go discrimination task (Roesch, Stalnaker, and Schoenbaum 2006, 2007) was indeed much more associative in nature than would be expected from a purely sensory region. For example, although odor-selective activity was present in many neurons in anterior piriform cortex, this activity was typically modulated by learning or reversal of simple odor discriminations. Interestingly, the prevalence of such correlates increased as we moved posteriorly in piriform cortex. This is consistent with the connectivity of anterior versus posterior areas with limbic structures and likely reflects strong input from amygdala (Datiche and Cattarelli 1996; Johnson et al. 2000; Majak et al. 2004; Illig 2005).

So what functions do interactions between orbitofrontal cortex, amygdala, and piriform cortex play in associative learning? Clearly, during learning, piriform cortex serves as a way station to orbitofrontal cortex and other areas, signaling sensory features of odor cues. Notably, pure sensory encoding in piriform cortex is unique among the areas we have recorded from, including orbitofrontal cortex, amygdala, and ventral striatum. However, this interaction is bidirectional; after learning, orbitofrontal cortex might actively modulate afferent input to piriform cortex (cf. Cohen et al. 2008). This might prepare neurons to recognize and respond to expected odors more rapidly, or even bias particular neurons to fire in certain contexts (e.g., after learning or after reversal) but not others. Orbitofrontal input might even initiate activity in piriform cortex in the absence of any odor, allowing for recall of odors and odor-related associations (Haberly 2001). In either case, the strong reciprocal connections between these two areas (Johnson et al. 2000; Haberly 2001; Illig 2005) are likely to support appropriate behavioral responding to olfactory cues in the service of optimizing behavior. Of course, future work disrupting feedback from orbitofrontal cortex will be necessary to test whether this input influences encoding in piriform cortex as we propose.


We have discussed evidence for the involvement of the orbitofrontal cortex in adapting behavior in the face of unexpected outcomes. Current evidence contradicts long-held ideas that this role reflects response inhibition or rapid flexibility of associative encoding. Instead we have suggested that it reflects contributions of orbitofrontal signaling to changing associative representations in other brain regions, mediated indirectly through support of prediction-error signaling by systems such as the midbrain dopamine neurons. This proposal is consistent with neurophysiological and behavioral studies published in the past decade revealing that orbitofrontal cortex is fundamentally critical to signaling outcome expectancies. These signals facilitate judgment about expected outcomes to guide behavior, even when contingencies are unchanged. Our proposal simply extends this idea to suggest that these same signals also facilitate learning when contingencies are changing.

This proposal bears striking similarities to ideas regarding the role of ventral striatum in so-called actor-critic models of reinforcement learning (Barto, Sutton, and Anderson 1983; Sutton and Barto 1998; O’Doherty et al. 2004). In these proposals, it is ventral striatum that is hypothesized to provide the information necessary to compute the “state value,” or V, which is required for calculating reward prediction errors. We would suggest that orbitofrontal cortex plays a similar role, not instead of but in addition to contributions from ventral striatum and likely other regions. Indeed, much as there are multiple memory systems, there are likely to be multiple, parallel critics, each providing a particular type of information relevant to computing V. With regard to orbitofrontal cortex and ventral striatum, one possibility is that output from orbitofrontal cortex provides information regarding the value of the specific outcome that is predicted by Pavlovian cues in the environment. This is consistent with its proposed role in associative learning from devaluation and other tasks and also with recent data from our lab showing that orbitofrontal lesions prevent unblocking when one equally valued outcome is switched for another (Burke et al. 2008). At the same time, ventral striatum might provide information regarding the general affect or emotion that has been associated with the same predictive cue, again consistent with the clear involvement of this area in behavioral tasks, such as Pavlovian-to-instrumental transfer, that require such general affective properties (so-called motivational habits). One might also postulate the existence of an instrumental critic, perhaps in medial prefrontal regions, providing information about values predicted by actions (Valentin, Dickinson, and O’Doherty 2007).
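The actor-critic architecture referred to above can be illustrated with a toy example, reduced here to a single state with two actions; all names, learning rates, and trial counts are illustrative assumptions, in the spirit of Barto, Sutton, and Anderson (1983) rather than a model of any specific experiment.

```python
# Toy actor-critic, reduced to a single state with two actions; names
# and learning rates are illustrative. The critic's value estimate V
# supplies the prediction error, and that same error trains the actor,
# analogous to the chapter's proposal that outcome-expectancy signals
# feed the dopaminergic teaching signal.

import math
import random

def run_actor_critic(n_trials=2000, alpha_v=0.1, alpha_p=0.1, seed=0):
    random.seed(seed)
    V = 0.0                            # critic: expected reward in this state
    prefs = {"a": 0.0, "b": 0.0}       # actor: action preferences
    rewards = {"a": 1.0, "b": 0.0}     # action "a" is the rewarded option
    for _ in range(n_trials):
        # softmax action selection from the actor's preferences
        p_a = math.exp(prefs["a"]) / (math.exp(prefs["a"]) + math.exp(prefs["b"]))
        action = "a" if random.random() < p_a else "b"
        delta = rewards[action] - V        # prediction error from the critic's V
        V += alpha_v * delta               # critic update
        prefs[action] += alpha_p * delta   # actor update with the same error
    return V, prefs

V, prefs = run_actor_critic()  # the actor comes to prefer action "a"
```

The point of the sketch is the division of labor: the critic's only job is to supply V for the error computation, which is the role the chapter assigns, in parallel, to orbitofrontal cortex and ventral striatum.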

Inputs from these areas might converge on midbrain areas, either directly or indirectly. For example, ventral striatum receives input from orbitofrontal cortex; thus ventral striatum could function as a super-critic, integrating information regarding the general affective and outcome-specific properties of Pavlovian cues and sending it off to be used in error signaling. Alternatively, orbitofrontal cortex also projects to other areas, including directly to the midbrain, and could thereby bypass ventral striatum. Indeed, both pathways may be utilized to influence learning in subtly different ways.

Importantly, all of these ideas are eminently testable. fMRI and single-unit studies, combined with lesions and inactivation, and behavioral tasks that manipulate expectancies based on the specific and general values of actions and Pavlovian cues, can directly address, either confirming or invalidating, hypotheses regarding these circuits. We believe there is already substantial experimental support for the simple ideas we have laid out here. However further work holds the potential to sketch out this interesting circuit and its functions in much more detail. This work will surely provide evidence contrary to our proposal, but it holds the promise of creating a truly useful framework for how brain circuits implement the simple associative learning processes that help us navigate our ever-changing world.


  1. Barto A. G., Sutton R. S., Anderson C. W. Neuron-like adaptive elements that can solve difficult learning control problems. IEEE Transactions on Systems, Man, and Cybernetics. 1983;13:834–46.
  2. Baxter M. G., Parker A., Lindner C. C. C., Izquierdo A. D., Murray E. A. Control of response selection by reinforcer value requires interaction of amygdala and orbitofrontal cortex. Journal of Neuroscience. 2000;20:4311–19. [PubMed: 10818166]
  3. Bayer H. M., Glimcher P. W. Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron. 2005;47:129–41. [PMC free article: PMC1564381] [PubMed: 15996553]
  4. Bechara A., Damasio H., Tranel D., Damasio A. R. Deciding advantageously before knowing the advantageous strategy. Science. 1997;275:1293–94. [PubMed: 9036851]
  5. Berns G. S., McClure S. M., Pagnoni G., Montague P. R. Predictability modulates human brain response to reward. Journal of Neuroscience. 2001;21:2793–98. [PubMed: 11306631]
  6. Bissonette G. B., Martins G. J., Franz T. M., Harper E. S., Schoenbaum G., Powell E. M. Double dissociation of the effects of medial and orbital prefrontal cortical lesions on attentional and affective shifts in mice. Journal of Neuroscience. 2008;28:11124–30. [PMC free article: PMC2657142] [PubMed: 18971455]
  7. Bohn I., Giertler C., Hauber W. Orbital prefrontal cortex and guidance of instrumental behavior in rats under reversal conditions. Behavioral Brain Research. 2003a;143:49–56. [PubMed: 12842295]
  8. Bohn I., Giertler C., Hauber W. NMDA receptors in the rat orbital prefrontal cortex are involved in guidance of instrumental behavior under reversal conditions. Cerebral Cortex. 2003b;13:968–76. [PubMed: 12902396]
  9. Burke K. A., Franz T. M., Miller D. N., Schoenbaum G. The role of orbitofrontal cortex in the pursuit of happiness and more specific rewards. Nature. 2008;454:340–44. [PMC free article: PMC2727745] [PubMed: 18563088]
  10. Butter C. M. Perseveration and extinction in discrimination reversal tasks following selective frontal ablations in Macaca mulatta. Physiology and Behavior. 1969;4:163–71.
  11. Camille N., Coricelli G., Sallet J., Pradat-Diehl P., Duhamel J. -R., Sirigu A. The involvement of the orbitofrontal cortex in the experience of regret. Science. 2004;304:1168–70. [PubMed: 15155951]
  12. Chudasama Y., Kralik J. D., Murray E. A. Rhesus monkeys with orbital prefrontal cortex lesions can learn to inhibit prepotent responses in the reversed reward contingency task. Cerebral Cortex. 2007;17:1154–59. [PubMed: 16774961]
  13. Chudasama Y., Robbins T. W. Dissociable contributions of the orbitofrontal and infralimbic cortex to pavlovian autoshaping and discrimination reversal learning: Further evidence for the functional heterogeneity of the rodent frontal cortex. Journal of Neuroscience. 2003;23:8771–80. [PubMed: 14507977]
  14. Cohen Y., Reuveni I., Barkai E., Maroun M. Olfactory learning-induced long-lasting enhancement of descending and ascending synaptic transmission to the piriform cortex. Journal of Neuroscience. 2008;28:6664–69. [PubMed: 18579740]
  15. Corbit L. H., Balleine B. W. Double dissociation of basolateral and central amygdala lesions on the general and outcome-specific forms of pavlovian-instrumental transfer. Journal of Neuroscience. 2005;25:962–70. [PubMed: 15673677]
  16. Cousens G. A., Otto T. Neural substrates of olfactory discrimination learning with auditory secondary reinforcement. I. Contributions of the basolateral amygdaloid complex and orbitofrontal cortex. Integrative Physiological and Behavioral Science. 2003;38:272–94. [PubMed: 15119378]
  17. Critchley H. D., Rolls E. T. Hunger and satiety modify the responses of olfactory and visual neurons in the primate orbitofrontal cortex. Journal of Neurophysiology. 1996;75:1673–86. [PubMed: 8727405]
  18. Damasio A. R. Descartes’ Error. New York: Putnam; 1994.
  19. Damasio A. R., Grabowski T., Frank R., Galaburda A. M., Damasio A. R. The return of Phineas Gage: Clues about the brain from the skull of a famous patient. Science. 1994;264:1102–5. [PubMed: 8178168]
  20. Datiche F., Cattarelli M. Reciprocal and topographic connections between the piriform and prefrontal cortices in the rat: A tracing study using the B subunit of the cholera toxin. Brain Research Bulletin. 1996;41:391–98. [PubMed: 8973845]
  21. Dillon D. G., Pizzagalli D. A. Inhibition of action, thought, and emotion: A selective neurobiological review. Applied and Preventive Psychology. 2007;12:99–114. [PMC free article: PMC2396584] [PubMed: 19050749]
  22. Dreher J. C., Kohn P., Berman K. F. Neural coding of distinct statistical properties of reward information in humans. Cerebral Cortex. 2006;16:561–73. [PubMed: 16033924]
  23. Eagle D. M., Baunez C., Hutcheson D. M., Lehmann O., Shah A. P., Robbins T. W. Stop-signal reaction-time task performance: Role of prefrontal cortex and subthalamic nucleus. Cerebral Cortex. 2008;18:178–88. [PubMed: 17517682]
  24. Estes W. K. Discriminative conditioning 2. Effects of a Pavlovian conditioned stimulus upon a subsequently established operant response. Journal of Experimental Psychology. 1948;38:173–77. [PubMed: 18913666]
  25. Feierstein C. E., Quirk M. C., Uchida N., Sosulski D. L., Mainen Z. F. Representation of spatial goals in rat orbitofrontal cortex. Neuron. 2006;51:495–507. [PubMed: 16908414]
  26. Fellows L. K., Farah M. J. Ventromedial frontal cortex mediates affective shifting in humans: Evidence from a reversal learning paradigm. Brain. 2003;126:1830–37. [PubMed: 12821528]
  27. Ferrier D. The Functions of the Brain. New York: GP Putnam’s Sons; 1876.
  28. Fuster J. M. The Prefrontal Cortex. 3rd edn. New York: Lippincott-Raven; 1997.
  29. Gallagher M., McMahan R. W., Schoenbaum G. Orbitofrontal cortex and representation of incentive value in associative learning. Journal of Neuroscience. 1999;19:6610–14. [PubMed: 10414988]
  30. Gottfried J. A., O’Doherty J., Dolan R. J. Encoding predictive reward value in human amygdala and orbitofrontal cortex. Science. 2003;301:1104–7. [PubMed: 12934011]
  31. Haberly L. B. Parallel-distributed processing in olfactory cortex: New insights from morphological and physiological analysis of neuronal circuitry. Chemical Senses. 2001;26:551–76. [PubMed: 11418502]
  32. Harlow J. M. Recovery after passage of an iron bar through the head. Publications of the Massachusetts Medical Society. 1868;2:329–46.
  33. Hatfield T., Han J. S., Conley M., Gallagher M., Holland P. Neurotoxic lesions of basolateral, but not central, amygdala interfere with Pavlovian second-order conditioning and reinforcer devaluation effects. Journal of Neuroscience. 1996;16:5256–65. [PubMed: 8756453]
  34. Hikosaka K., Watanabe M. Delay activity of orbital and lateral prefrontal neurons of the monkey varying with different rewards. Cerebral Cortex. 2000;10:263–71. [PubMed: 10731221]
  35. Hikosaka K., Watanabe M. Long- and short-range reward expectancy in the primate orbitofrontal cortex. European Journal of Neuroscience. 2004;19:1046–54. [PubMed: 15009152]
  36. Holland P. C. Relations between Pavlovian-Instrumental transfer and reinforcer devaluation. Journal of Experimental Psychology: Animal Behavior Processes. 2004;30:104–17. [PubMed: 15078120]
  37. Holland P. C., Rescorla R. A. The effects of two ways of devaluing the unconditioned stimulus after first and second-order appetitive conditioning. Journal of Experimental Psychology: Animal Behavior Processes. 1975;1:355–63. [PubMed: 1202141]
  38. Holland P. C., Straub J. J. Differential effects of two ways of devaluing the unconditioned stimulus after Pavlovian appetitive conditioning. Journal of Experimental Psychology: Animal Behavior Processes. 1979;5:65–78. [PubMed: 528879]
  39. Hollerman J. R., Schultz W. Dopamine neurons report an error in the temporal prediction of reward during learning. Nature Neuroscience. 1998;1:304–9. [PubMed: 10195164]
  40. Hornak J., O’Doherty J., Bramham J., Rolls E. T., Morris R. G., Bullock P. R., Polkey C. E. Reward-related reversal learning after surgical excisions in orbito-frontal or dorsolateral prefrontal cortex in humans. Journal of Cognitive Neuroscience. 2004;16:463–78. [PubMed: 15072681]
  41. Hutcheson D. M., Everitt B. J. The effects of selective orbitofrontal cortex lesions on the acquisition and performance of cue-controlled cocaine seeking in rats. Annals of the New York Academy of Science. 2003;1003:410–11. [PubMed: 14684474]
  42. Illig K. R. Projections from orbitofrontal cortex to anterior piriform cortex in the rat suggest a role in olfactory information processing. Journal of Comparative Neurology. 2005;488:224–31. [PMC free article: PMC1360190] [PubMed: 15924345]
  43. Izquierdo A. D., Suda R. K., Murray E. A. Bilateral orbital prefrontal cortex lesions in rhesus monkeys disrupt choices guided by both reward value and reward contingency. Journal of Neuroscience. 2004;24:7540–48. [PubMed: 15329401]
  44. Johnson D. M., Illig K. R., Behan M., Haberly L. B. New features of connectivity in piriform cortex visualized by intracellular injection of pyramidal cells suggest that “primary” olfactory cortex functions like “association” cortex in other sensory systems. Journal of Neuroscience. 2000;20:6974–82. [PubMed: 10995842]
  45. Jones B., Mishkin M. Limbic lesions and the problem of stimulus-reinforcement associations. Experimental Neurology. 1972;36:362–77. [PubMed: 4626489]
  46. Kheramin S., Body S., Ho M. -Y., Velazquez-Martinez D. N., Bradshaw C. M., Szabadi E., Deakin J. F. W., Anderson I. M. Role of the orbital prefrontal cortex in choice between delayed and uncertain reinforcers: A quantitative analysis. Behavioral Processes. 2003;64:239–50. [PubMed: 14580695]
  47. Lipton P. A., Alvarez P., Eichenbaum H. Crossmodal associative memory representations in rodent orbitofrontal cortex. Neuron. 1999;22:349–59. [PubMed: 10069340]
  48. Machado C. J., Bachevalier J. The effects of selective amygdala, orbital frontal cortex or hippocampal formation lesions on reward assessment in nonhuman primates. European Journal of Neuroscience. 2007;25:2885–2904. [PubMed: 17561849]
  49. Majak K., Ronkko S., Kemppainen S., Pitkanen A. Projections from the amygdaloid complex to the piriform cortex: A PHA-L study in the rat. Journal of Comparative Neurology. 2004;476:414–28. [PubMed: 15282713]
  50. Malkova L., Gaffan D., Murray E. A. Excitotoxic lesions of the amygdala fail to produce impairment in visual learning for auditory secondary reinforcement but interfere with reinforcer devaluation effects in rhesus monkeys. Journal of Neuroscience. 1997;17:6011–20. [PubMed: 9221797]
  51. McAlonan K., Brown V. J. Orbital prefrontal cortex mediates reversal learning and not attentional set shifting in the rat. Behavioral Brain Research. 2003;146:97–103. [PubMed: 14643463]
  52. McDannald M. A., Saddoris M. P., Gallagher M., Holland P. C. Lesions of orbitofrontal cortex impair rats’ differential outcome expectancy learning but not conditioned stimulus-potentiated feeding. Journal of Neuroscience. 2005;25:4626–32. [PMC free article: PMC1201522] [PubMed: 15872110]
  53. Meunier M., Bachevalier J., Mishkin M. Effects of orbital frontal and anterior cingulate lesions on object and spatial memory in rhesus monkeys. Neuropsychologia. 1997;35:999–1015. [PubMed: 9226661]
  54. Mishkin M. Perseveration of central sets after frontal lesions in monkeys. In: Warren J. M., Akert K., editors. The Frontal Granular Cortex and Behavior. New York: McGraw-Hill; 1964. pp. 219–41.
  55. Mitchell A. S., Browning P. G., Baxter M. G. Neurotoxic lesions of the medial mediodorsal nucleus of the thalamus disrupt reinforcer devaluation effects in rhesus monkeys. Journal of Neuroscience. 2007;27:11289–95. [PMC free article: PMC2242856] [PubMed: 17942723]
  56. Mobini S., Body S., Ho M. -Y., Bradshaw C. M., Szabadi E., Deakin J. F. W., Anderson I. M. Effects of lesions of the orbitofrontal cortex on sensitivity to delayed and probabilistic reinforcement. Psychopharmacology. 2002;160:290–98. [PubMed: 11889498]
  57. Montague P. R., Dayan P., Sejnowski T. J. A framework for mesencephalic dopamine systems based on predictive hebbian learning. Journal of Neuroscience. 1996;16:1936–47. [PubMed: 8774460]
  58. Nobre A. C., Coull J. T., Frith C. D., Mesulam M. M. Orbitofrontal cortex is activated during breaches of expectation in tasks of visual attention. Nature Neuroscience. 1999;2:11–12. [PubMed: 10195173]
  59. O’Doherty J. P., Dayan P., Friston K., Critchley H., Dolan R. J. Temporal difference models and reward-related learning in the human brain. Neuron. 2003;38:329–37. [PubMed: 12718865]
  60. O’Doherty J., Dayan P., Schultz J., Deichmann R., Friston K. J., Dolan R. J. Dissociable roles of ventral and dorsal striatum in instrumental conditioning. Science. 2004;304:452–54. [PubMed: 15087550]
  61. O’Doherty J., Rolls E. T., Francis S., Bowtell R., McGlone F., Kobal G., Renner B., Ahne G. Sensory-specific satiety-related olfactory activation of the human orbitofrontal cortex. Neuroreport. 2000;11:893–97. [PubMed: 10757540]
  62. Ostlund S. B., Balleine B. W. Orbitofrontal cortex mediates outcome encoding in Pavlovian but not instrumental learning. Journal of Neuroscience. 2007a;27:4819–25. [PubMed: 17475789]
  63. Ostlund S. B., Balleine B. W. The contribution of orbitofrontal cortex to action selection. Annals of the New York Academy of Science. 2007b;1121:174–92. [PubMed: 17872392]
  64. Padoa-Schioppa C., Assad J. A. The representation of economic value in the orbitofrontal cortex is invariant for changes in menu. Nature Neuroscience. 2008;11:95–102. [PMC free article: PMC2646102] [PubMed: 18066060]
  65. Pais-Vieira M., Lima D., Galhardo V. Orbitofrontal cortex lesions disrupt risk assessment in a novel serial decision-making task for rats. Neuroscience. 2007;145:225–31. [PubMed: 17204373]
  66. Pan W. -X., Schmidt R., Wickens J. R., Hyland B. I. Dopamine cells respond to predicted events during classical conditioning: Evidence for eligibility traces in the reward-learning network. Journal of Neuroscience. 2005;25:6235–42. [PubMed: 15987953]
  67. Paton J. J., Belova M. A., Morrison S. E., Salzman C. D. The primate amygdala represents the positive and negative value of visual stimuli during learning. Nature. 2006;439:865–70. [PMC free article: PMC2396495] [PubMed: 16482160]
  68. Pears A., Parkinson J. A., Hopewell L., Everitt B. J., Roberts A. C. Lesions of the orbitofrontal but not medial prefrontal cortex disrupt conditioned reinforcement in primates. Journal of Neuroscience. 2003;23:11189–11201. [PubMed: 14657178]
  69. Pickens C. L., Saddoris M. P., Gallagher M., Holland P. C. Orbitofrontal lesions impair use of cue-outcome associations in a devaluation task. Behavioral Neuroscience. 2005;119:317–22. [PMC free article: PMC1201523] [PubMed: 15727536]
  70. Pickens C. L., Setlow B., Saddoris M. P., Gallagher M., Holland P. C., Schoenbaum G. Different roles for orbitofrontal cortex and basolateral amygdala in a reinforcer devaluation task. Journal of Neuroscience. 2003;23:11078–84. [PubMed: 14657165]
  71. Price J. L. Definition of the orbital cortex in relation to specific connections with limbic and visceral structures and other cortical regions. Annals of the New York Academy of Science. 2007;1121:54–71. [PubMed: 17698999]
  72. Ramus S. J., Eichenbaum H. Neural correlates of olfactory recognition memory in the rat orbitofrontal cortex. Journal of Neuroscience. 2000;20:8199–8208. [PubMed: 11050143]
  73. Reekie Y. L., Braesicke K., Man M. S., Roberts A. C. Uncoupling of behavioral and autonomic responses after lesions of the primate orbitofrontal cortex. Proceedings of the National Academy of Sciences. 2008;105:9787–92. [PMC free article: PMC2447863] [PubMed: 18621690]
  74. Rescorla R. A. Reduction in the effectiveness of reinforcement after prior excitatory conditioning. Learning and Motivation. 1970;1:372–81.
  75. Rescorla R. A., Wagner A. R. A theory of Pavlovian conditioning: Variations in the effectiveness of reinforcement and nonreinforcement. In: Black A. H., Prokasy W. F., editors. Classical Conditioning II: Current Research and Theory. New York: Appleton-Century-Crofts; 1972. pp. 64–99.
  76. Roesch M. R., Calu D. J., Schoenbaum G. Dopamine neurons encode the better option in rats deciding between differently delayed or sized rewards. Nature Neuroscience. 2007;10:1615–24. [PMC free article: PMC2562672] [PubMed: 18026098]
  77. Roesch M. R., Stalnaker T. A., Schoenbaum G. Associative encoding in anterior piriform cortex versus orbitofrontal cortex during odor discrimination and reversal learning. Cerebral Cortex. 2006;17:643–52. [PMC free article: PMC2396586] [PubMed: 16699083]
  79. Roesch M. R., Taylor A. R., Schoenbaum G. Encoding of time-discounted rewards in orbitofrontal cortex is independent of value representation. Neuron. 2006;51:509–20. [PMC free article: PMC2561990] [PubMed: 16908415]
  80. Rolls E. T. The orbitofrontal cortex. Philosophical Transactions of the Royal Society of London B. 1996;351:1433–43. [PubMed: 8941955]
  81. Rolls E. T., Critchley H. D., Mason R., Wakeman E. A. Orbitofrontal cortex neurons: Role in olfactory and visual association learning. Journal of Neurophysiology. 1996;75:1970–81. [PubMed: 8734596]
  82. Rolls E. T., Grabenhorst F. The orbitofrontal cortex and beyond: From affect to decision-making. Progress in Neurobiology. 2008;86:216–44. [PubMed: 18824074]
  83. Rolls E. T., Hornak J., Wade D., McGrath J. Emotion-related learning in patients with social and emotional changes associated with frontal lobe damage. Journal of Neurology, Neurosurgery, and Psychiatry. 1994;57:1518–24. [PMC free article: PMC1073235] [PubMed: 7798983]
  84. Saddoris M. P., Gallagher M., Schoenbaum G. Rapid associative encoding in basolateral amygdala depends on connections with orbitofrontal cortex. Neuron. 2005;46:321–31. [PubMed: 15848809]
  85. Sage J. R., Knowlton B. J. Effects of US devaluation on win-stay and win-shift radial maze performance in rats. Behavioral Neuroscience. 2000;114:295–306. [PubMed: 10832791]
  86. Schoenbaum G., Chiba A. A., Gallagher M. Orbitofrontal cortex and basolateral amygdala encode expected outcomes during learning. Nature Neuroscience. 1998;1:155–59. [PubMed: 10195132]
  87. Schoenbaum G., Chiba A. A., Gallagher M. Neural encoding in orbitofrontal cortex and basolateral amygdala during olfactory discrimination learning. Journal of Neuroscience. 1999;19:1876–84. [PubMed: 10024371]
  88. Schoenbaum G., Eichenbaum H. Information coding in the rodent prefrontal cortex. I. Single-neuron activity in orbitofrontal cortex compared with that in pyriform cortex. Journal of Neurophysiology. 1995;74:733–50. [PubMed: 7472378]
  89. Schoenbaum G., Nugent S., Saddoris M. P., Setlow B. Orbitofrontal lesions in rats impair reversal but not acquisition of go, no-go odor discriminations. Neuroreport. 2002;13:885–90. [PubMed: 11997707]
  90. Schoenbaum G., Roesch M. R. Orbitofrontal cortex, associative learning, and expectancies. Neuron. 2005;47:633–36. [PMC free article: PMC2628809] [PubMed: 16129393]
  91. Schoenbaum G., Setlow B. Integrating orbitofrontal cortex into prefrontal theory: Common processing themes across species and subdivision. Learning and Memory. 2001;8:134–47. [PubMed: 11390633]
  92. Schoenbaum G., Setlow B., Nugent S. L., Saddoris M. P., Gallagher M. Lesions of orbitofrontal cortex and basolateral amygdala complex disrupt acquisition of odor-guided discriminations and reversals. Learning and Memory. 2003a;10:129–40. [PMC free article: PMC196660] [PubMed: 12663751]
  93. Schoenbaum G., Setlow B., Saddoris M. P., Gallagher M. Encoding predicted outcome and acquired value in orbitofrontal cortex during cue sampling depends upon input from basolateral amygdala. Neuron. 2003b;39:855–67. [PubMed: 12948451]
  94. Schultz W., Dickinson A. Neuronal coding of prediction errors. Annual Review of Neuroscience. 2000;23:473–500. [PubMed: 10845072]
  95. Schultz W., Tremblay L., Hollerman J. R. Reward processing in primate orbitofrontal cortex and basal ganglia. Cerebral Cortex. 2000;10:272–83. [PubMed: 10731222]
  96. Shidara M., Richmond B. J. Anterior cingulate: Single neuronal signals related to degree of reward expectancy. Science. 2002;296:1709–11. [PubMed: 12040201]
  97. Stalnaker T. A., Franz T. M., Singh T., Schoenbaum G. Basolateral amygdala lesions abolish orbitofrontal-dependent reversal impairments. Neuron. 2007a;54:51–58. [PubMed: 17408577]
  98. Stalnaker T. A., Roesch M. R., Franz T. M., Burke K. A., Schoenbaum G. Abnormal associative encoding in orbitofrontal neurons in cocaine-experienced rats during decision-making. European Journal of Neuroscience. 2006;24:2643–53. [PMC free article: PMC2391072] [PubMed: 17100852]
  99. Stalnaker T. A., Roesch M. R., Franz T. M., Calu D. J., Singh T., Schoenbaum G. Cocaine-induced decision-making deficits are mediated by miscoding in basolateral amygdala. Nature Neuroscience. 2007b;10:949–51. [PMC free article: PMC2562677] [PubMed: 17603478]
  100. Sugase-Miyamoto Y., Richmond B. J. Neuronal signals in the monkey basolateral amygdala during reward schedules. Journal of Neuroscience. 2005;25:11071–83. [PubMed: 16319307]
  101. Sutton R. S., Barto A. G. Reinforcement Learning: An Introduction. Cambridge, MA: MIT Press; 1998.
  102. Takahashi Y., Roesch M. R., Stalnaker T. A., Haney R. Z., Calu D. J., Taylor A. R., Burke K. A., Schoenbaum G. The orbitofrontal cortex and ventral tegmental area are necessary for learning from unexpected outcomes. Neuron. 2009;62:269–80. [PMC free article: PMC2693075] [PubMed: 19409271]
  103. Teitelbaum H. A comparison of effects of orbitofrontal and hippocampal lesions upon discrimination learning and reversal in the cat. Experimental Neurology. 1964;9:452–62. [PubMed: 14188532]
  104. Thorpe S. J., Rolls E. T., Maddison S. The orbitofrontal cortex: Neuronal activity in the behaving monkey. Experimental Brain Research. 1983;49:93–115. [PubMed: 6861938]
  105. Tobler P. N., O’Doherty J., Dolan R. J., Schultz W. Human neural learning depends on reward prediction errors in the blocking paradigm. Journal of Neurophysiology. 2006;95:301–10. [PMC free article: PMC2637603] [PubMed: 16192329]
  106. Torregrossa M. M., Quinn J. J., Taylor J. R. Impulsivity, compulsivity, and habit: The role of the orbitofrontal cortex revisited. Biological Psychiatry. 2008;63:253–55. [PMC free article: PMC2265211] [PubMed: 18194683]
  107. Tremblay L., Schultz W. Relative reward preference in primate orbitofrontal cortex. Nature. 1999;398:704–8. [PubMed: 10227292]
  108. Tremblay L., Schultz W. Modifications of reward expectation-related neuronal activity during learning in primate orbitofrontal cortex. Journal of Neurophysiology. 2000;83:1877–85. [PubMed: 10758099]
  109. Valentin V. V., Dickinson A., O’Doherty J. P. Determining the neural substrates of goal-directed learning in the human brain. Journal of Neuroscience. 2007;27:4019–26. [PubMed: 17428979]
  110. van Duuren E., Nieto-Escamez F. A., Joosten R. N. J. M. A., Visser R., Mulder A. B., Pennartz C. M. A. Neural coding of reward magnitude in the orbitofrontal cortex during a five-odor discrimination task. Learning and Memory. 2007;14:446–56. [PMC free article: PMC1896094] [PubMed: 17562896]
  111. Waelti P., Dickinson A., Schultz W. Dopamine responses comply with basic assumptions of formal learning theory. Nature. 2001;412:43–48. [PubMed: 11452299]
  112. Wallis J. D., Dias R., Robbins T. W., Roberts A. C. Dissociable contributions of the orbitofrontal and lateral prefrontal cortex of the marmoset to performance on a detour reaching task. European Journal of Neuroscience. 2001;13:1797–1808. [PubMed: 11359531]
  113. Wallis J. D., Miller E. K. Neuronal activity in primate dorsolateral and orbital prefrontal cortex during performance of a reward preference task. European Journal of Neuroscience. 2003;18:2069–81. [PubMed: 14622240]
  114. Watanabe M. Reward expectancy in primate prefrontal neurons. Nature. 1996;382:629–32. [PubMed: 8757133]
  115. Wellman L. L., Gale K., Malkova L. GABAA-mediated inhibition of basolateral amygdala blocks reward devaluation in macaques. Journal of Neuroscience. 2005;25:4577–86. [PubMed: 15872105]
  116. Winstanley C. A., Theobald D. E. H., Cardinal R. N., Robbins T. W. Contrasting roles of basolateral amygdala and orbitofrontal cortex in impulsive choice. Journal of Neuroscience. 2004;24:4718–22. [PubMed: 15152031]
Copyright © 2011 by Taylor and Francis Group, LLC.
Bookshelf ID: NBK92778PMID: 22593899

