Author manuscript; available in PMC: 2017 Jan 12.
Published in final edited form as: Eur J Neurosci. 2015 Jul 30;42(5):2224–2233. doi: 10.1111/ejn.13009

Hippocampal interplay with the nucleus accumbens is critical for decisions about time

Andrew R Abela 1, Yiran Duan 1, Yogita Chudasama 1
PMCID: PMC5233438  CAMSID: CAMS6480  PMID: 26121594

Abstract

Human cognition depends upon the capacity to make decisions in the present that bear upon outcomes in the future. The nucleus accumbens, a recipient of direct projections from both the hippocampus and orbitofrontal cortex, is known to contribute to these aspects of decision-making. Here we demonstrate that interaction of the nucleus accumbens with the hippocampus, but not the orbitofrontal cortex, is critical in shaping decisions that involve time trade-offs. Compared with controls, rats with a disrupted hippocampal–accumbens interaction were strongly biased toward choosing stimuli that led to small and immediate food rewards over large and delayed ones. We show that this pattern of behavior cannot be ascribed to the impaired representation of stimulus value, the incapacity to wait, or a general disruption of decision-making. These results identify a hippocampal–accumbens circuit that may underlie a range of problems in which daily decisions are marked by a shift toward immediate gratification.

Keywords: decision-making, disconnection, orbitofrontal cortex, rat

Introduction

One of the most important aspects of human cognition is the capacity to make optimal decisions in the face of multiple alternative actions and with the benefit of past experiences. Converging evidence from both animal and human studies implicates the nucleus accumbens (NAc) as being critical for such decisions, particularly when the decisions involve a trade-off between costs and long-term benefits. Neural activity in the rat NAc appears to encode the internal costs weighed against obtaining a large reward (Day et al., 2010, 2011). Furthermore, when the NAc is damaged, rats are unwilling to wait for a larger pay-off (Cardinal et al., 2001; Pothuizen et al., 2005) or take risks for uncertain rewards (Cardinal & Howes, 2005; Acheson et al., 2006).

Recently, the hippocampus has also been associated with choices that involve waiting. Its contribution may relate directly to the representation of time itself, as evidence suggests that the hippocampus is specialised to represent past, and possibly future events (Ferbinteanu & Shapiro, 2003; Johnson et al., 2007; Schacter et al., 2007; Peters & Büchel, 2010). In laboratory settings, patients with hippocampal pathology make choices that are random or poorly suited for developing strategies that benefit them in the long term (Gleichgerrcht et al., 2010; Kwan et al., 2012). Analogous deficits emerge from experiments in rats; partial or complete lesions of the hippocampus elicit an unwillingness to make choices that require waiting, even for high-valued rewards (Cheung & Cardinal, 2005; McHugh et al., 2008; Mariano et al., 2009; Abela & Chudasama, 2013).

The orbitofrontal cortex (OFC) also contributes to some underlying aspects of decision-making, such as updating the value of an expected reward (Schoenbaum et al., 2003; Izquierdo et al., 2004; Burton et al., 2014). In some experiments, the removal or inactivation of this structure disrupts the natural inclination of rats to wait for larger rewards (Kheramin et al., 2002; Rudebeck et al., 2006; Zeeb et al., 2010). Others suggest that it is more important for decisions that do not involve the representation of future events (Winstanley et al., 2004; Mariano et al., 2009; Abela & Chudasama, 2013; Stopper et al., 2014), which has also been the conclusion of at least one study in humans (Fellows & Farah, 2005).

The NAc, OFC and ventral portion of the hippocampus (vHC) are anatomically interconnected in a manner that can provide insight into the operation of decision-making circuits. Notably, both the vHC and OFC send excitatory projections to the NAc (Kelley & Domesick, 1982; Brog et al., 1993) and are thus able to influence its activity. This anatomical convergence raises the question of whether the NAc might receive different types of information from the two structures, and whether the interaction of the NAc with the vHC or OFC might be more important for certain types of decisions. Here we address how these two structures contribute to decision-making via their interaction with the NAc using a disconnection paradigm in rats. We show that the interaction of the NAc with the vHC, but not the OFC, is essential for decisions that weigh the cost of waiting for a positive outcome.

Materials and methods

Subjects

The subjects were male Long–Evans rats (Charles River, LaSalle, QC, Canada), pair-housed in a temperature-controlled room (22 °C) under diurnal conditions (12 h light/12 h dark). The rats were food-restricted and maintained at 85% of their free-feeding weight. The rats weighed 200–225 g at the start of behavioral training. The McGill University Animal Care Committee approved all experimental procedures, in accordance with the guidelines of the Canadian Council on Animal Care.

Apparatus

All behavioral training and testing were conducted in four automated, operant touchscreen testing chambers (Lafayette Instruments, Lafayette, IN, USA). Each chamber was equipped with: (i) a house-light, (ii) a food magazine fitted with a light-emitting diode and photocells to detect food collection entries, (iii) a pellet dispenser that delivered 45-mg dustless precision sucrose pellets (Ren’s Pet Depot, ON, Canada), and (iv) a 12″×12″ touch-sensitive monitor (Elo Touch Solutions, USA) programmed to present computer graphic stimuli. Visually identical geometric shapes (Fig. 1) were displayed on the left and right side of the touchscreen. A black Plexiglas mask was attached to the front of the screen at approximately 0.6″ from the surface of the display to restrict the rats’ access to the visual stimuli through a left and right response window (2.05″×2.05″). The apparatus and online data collection for each chamber were controlled using the Whisker control system (Cardinal & Aitken, 2010).

Fig. 1.


Diagram to illustrate the touchscreen apparatus with visual stimuli in the two decision-making tasks. (A) Delay discounting task. The rats chose between two identical white squares whose positions indicated differences in reward size and delay. A nosepoke touch response to the right square resulted in the immediate delivery of a small one-pellet reward, whereas a response to the left square delivered a large four-pellet reward after a delay. The delay to reward was progressively increased across blocks. The length of each trial was 70 s regardless of the rat’s choice of stimulus. (B) Probability discounting task. The rat chose between a different pair of identical stimuli, in this case resembling ‘pacman’. This time the position of the stimuli indicated differences in reward size and uncertainty, i.e. a response to the right stimulus always delivered a small one-pellet reward, whereas a response to the left stimulus delivered the large four-pellet reward according to a predetermined probability. The probability of the large reward delivery was progressively decreased every 10 sessions of testing. Each trial lasted for 40 s. In both tasks, the side of the large reward stimulus (left or right) was counterbalanced between rats.

Surgery

The rats were anesthetised with isoflurane gas (4–5% induction; 1–3% maintenance) and secured in a stereotaxic frame (incisor bar, −3.0 mm). The scalp was retracted to expose the skull, and craniotomies were made directly above the target region of the brain. A 1-μL microsyringe (SGE, Canadian Life Science, ON, Canada) was used to administer bilateral injections of 0.09 M N-methyl-D-aspartic acid (Sigma-Aldrich, Canada), dissolved in 0.9% saline (pH 7.0–7.2). Each injection was made over 2 min, and the injector was left in place for an additional 2 min for dispersion before the syringe was retracted. Injection coordinates were taken from the atlas of Paxinos & Watson (2005). Table 1 provides the coordinates for each lesion. Dorso-ventral readings were measured with reference to the dural surface. The combined surgical manipulations (i.e. disconnection lesion, ipsilateral lesion) are schematised in Fig. 2. The side on which the lesions were made (left or right hemisphere) was counterbalanced for all groups. Rats that received sham control surgery were exposed to the same surgical manipulations but received injections of saline in place of N-methyl-D-aspartic acid. Following surgery, rats were given injections of carprofen (analgesic; 5 mg/kg, s.c.) and Tribrissen (antibiotic; 0.125 mL/kg, s.c.) for 3 days. During recovery, rats were monitored in a recovery cage that was devoid of extraneous sensory stimulation (e.g. bright lights, loud noise). Postoperatively, rats were allowed to recover from surgery for at least 1 week, with food available ad libitum. Following the recovery period, rats were food-restricted and maintained at 85% of their free-feeding weight for the remainder of the experiment.

Table 1.

Stereotaxic coordinates for vHC, OFC and NAc lesions

Region  Stereotaxic coordinates (mm)    Volume of neurotoxin (μL)
vHC     AP, −4.6; ML, −5.0; DV, −6.7    0.4
        AP, −4.7; ML, −4.4; DV, −6.7    0.5
        AP, −4.8; ML, −4.6; DV, −7.5    0.5
OFC     AP, +3.7; ML, −0.7; DV, −3.4    0.3
        AP, +3.2; ML, −0.7; DV, −3.4    0.3
NAc     AP, +2.1; ML, −1.6; DV, −6.5    0.3
        AP, +1.4; ML, −1.2; DV, −6.4    0.3

AP, anterior–posterior; DV, dorsal–ventral; ML, medial–lateral.

Fig. 2.


Schematic illustration of experimental design. (A) Left panel: a three-dimensional view of the rat brain localising the three brain regions of interest, i.e. the OFC shaded in red, the NAc in green, and the vHC in yellow. Right panel: anatomical connections between these structures. Both the OFC and vHC send projections to the NAc. The OFC also receives input from the vHC. All of these connections are unidirectional; none is reciprocated. (B) Following combined contralateral lesions of the NAc and vHC in opposite hemispheres (i.e. disconnection), all direct and indirect connections between these structures are abolished in both hemispheres. Following combined unilateral lesions of the NAc and vHC in the same hemisphere (i.e. ipsilateral), all connections to and from these structures are disrupted in one hemisphere only, whereas the other hemisphere remains entirely intact. The same lesions are schematised in (C), this time illustrating the disconnection and ipsilateral lesions for the NAc and OFC, relative to the control. Red crosses, lesioned area; solid lines, intact intrahemispheric projections; gray lines, disrupted intrahemispheric projections. dHC, dorsal hippocampus. Three-dimensional figures adapted from the Adult (P80) Wistar rat brain template of Calabrese et al. (2013).

Behavioral procedure

The training and testing procedure has been previously described in full (Abela & Chudasama, 2013, 2014). Following habituation to the testing chamber, rats were trained to make a nosepoke touch response to a white square (2″×2″) that was presented on the left or right side of the screen. A nosepoke touch response to the white square was rewarded with a single sucrose pellet. Rats were ready for surgery when they were able to obtain 50 reward pellets within a 20-min session (~5 days). After the rats had recovered from surgery, they were retrained to touch the screen until they achieved the same criterion as before surgery (~2 days). The rats were then tested on two behavioral tasks. In the delay discounting task, the rats chose between two identical white squares located on the left and right side of the touchscreen. Their position indicated differences in reward size and delay. Responses to one square (e.g. the right) resulted in the immediate delivery of a small, one-pellet reward, whereas responses to the other square (e.g. the left) resulted in the delivery of a large, four-pellet reward after a delay (Fig. 1A). The side (left or right) of the large reward stimulus was counterbalanced between subjects. Each session consisted of four blocks of 12 trials. Each block began with two ‘forced choice’ trials in which either the left or right stimulus was presented to demonstrate the outcome associated with that stimulus. The remaining 10 trials were ‘free choice’ trials in which the rats could choose between both stimuli. Each trial lasted for 70 s regardless of the rat’s choice of stimulus. This ensured that the subsequent trial did not occur sooner if the small, immediate reward was chosen in the previous trial.

The rats were initially trained to discriminate between the two reward sizes in the absence of delays. When rats displayed a preference for the large reward stimulus in >80% of trials (~2 days), the delay to delivery of the large reward was progressively increased in each block within a session (0, 8, 16 and 32 s). The trial onset was signaled by illumination of the food magazine and houselight. A nosepoke entry into the food magazine triggered the presentation of the stimuli, which remained on the touchscreen for 10 s. Choice behavior was measured as the total number of choices of the large reward per delay (maximum of 10 responses). Failure to make a response within 10 s was recorded as an omission, and the box returned to its intertrial interval state, in which all lights were extinguished, until the next trial. Trials were repeated until rats received the full set of trials for each block. The response latency was the time between stimulus presentation and the rat’s response. Following a response, and after the rat had retrieved its food reward, the box went into an intertrial interval state until the next trial. The food magazine was illuminated during the delays. The food collection latency was the time between pellet delivery and pellet collection. Rats were tested until stable choice performance was observed for five consecutive sessions. This took, on average, 20 sessions (range 7–30). This training allowed the animals to become familiar with the task contingencies (see Fig. S1). The last five sessions of training were used for analysis. Stable performance was confirmed by subjecting the data to a repeated-measures ANOVA, which revealed a main effect of delay and no main effect of session across the five consecutive sessions.
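The preference shift that this procedure is designed to measure is commonly described by Mazur's hyperbolic discounting model, V = A/(1 + kD), although the present study does not fit discounting curves. The sketch below is our illustration only; the discount rate k is a hypothetical value chosen to show how such a model predicts a preference reversal within the range of delays used here (0–32 s):

```python
# Illustration (not the authors' analysis): Mazur's hyperbolic discounting
# model, V = A / (1 + k*D), applied to the two options in the delay
# discounting task. The discount rate k is hypothetical.

def discounted_value(amount, delay, k):
    """Subjective value of `amount` pellets delivered after `delay` seconds."""
    return amount / (1.0 + k * delay)

k = 0.25  # hypothetical discount rate (per second)
v_small = discounted_value(1, 0, k)  # small reward, always immediate
for delay in (0, 8, 16, 32):
    v_large = discounted_value(4, delay, k)
    choice = "large" if v_large > v_small else "small"
    print(f"delay = {delay:2d} s  V(large) = {v_large:.2f}  -> choose {choice}")
```

With this hypothetical k, the discounted value of the four-pellet reward falls below that of the immediate one-pellet reward between the 8- and 16-s blocks, mirroring the range over which control rats typically switch their preference.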

During the equal delay procedure, the small and large rewards were delayed by the same amount, and the delay increased in each block of the session as in the standard version of the task. All other task parameters remained the same, including the duration of the delays. Sessions 1–5 allowed the animals to become familiar with the task contingencies. Stable discounting behavior was confirmed with a main effect of delay and no effect of session in sessions 6–10, which were used for analysis. Rats were then returned to the standard version of the delay discounting task until their rate of discounting restabilised (~3 days).

Rats were also tested on a task in which the probability of reward was manipulated, i.e. the probability discounting task. Here, rats chose between a different pair of identical visual stimuli, located on the left and right side of the screen (Fig. 1B). Responses to the right stimulus always resulted in the delivery of a small, one-pellet reward. In contrast, a response to the left stimulus resulted in the delivery of a large, four-pellet reward, according to a predetermined probability. The side of the large reward stimulus (left or right) was counterbalanced between subjects, and remained in the same location for the entire experiment for each rat. Each session consisted of 40 trials. The first 30 trials of each session were ‘forced choice’ trials (15 per side), in which only the left or right stimulus was presented to demonstrate the outcome associated with each stimulus. The left/right presentation of each stimulus followed a pseudorandom sequence such that each stimulus was not presented for more than two consecutive trials. The subsequent 10 trials were ‘free choice’ trials, in which the rats chose freely between both stimuli. Each trial lasted for 40 s. The testing was divided into four phases. In phase 1, the probability of large reward delivery was set at 1 (i.e. fully certain), and all rats chose the large reward on more than 80% of trials for the last five sessions. In phases 2, 3 and 4, the probability of large reward delivery was progressively decreased to 1/3, 1/9, and 1/15, respectively. Rats were tested on each phase for 10 sessions. The first five sessions in each phase ensured that the rats had been exposed to the new probability. The free choice trials from the last five sessions of each phase were analysed with a repeated-measures ANOVA to obtain a stable measure of their choice performance.
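As a point of reference (our calculation, not part of the published analysis), the expected value of the uncertain large reward can be compared with the certain small reward across the four probability phases:

```python
# Expected-value comparison for the probability discounting task:
# a certain 1-pellet reward vs. a 4-pellet reward delivered with
# probability p (the four phase probabilities used in the task).
SMALL, LARGE = 1, 4
for p in (1, 1/3, 1/9, 1/15):
    ev_large = LARGE * p
    better = "large" if ev_large > SMALL else "small"
    print(f"p = {p:.3f}  EV(large) = {ev_large:.2f}  EV(small) = 1.00  -> {better}")
```

On expected value alone, the large reward remains the better option at P = 1 and P = 1/3 but not at P = 1/9 or P = 1/15, so a shift toward the small, certain reward at the lower probabilities is consistent with risk-neutral valuation.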

To avoid potential testing-order effects, half of the rats were tested on the delay version of the task first (including the equal delay procedure), whereas the other half were tested on the probability task first.

Histology

After behavioral testing was complete, rats were perfused transcardially with 0.9% saline followed by 4% paraformaldehyde in 0.9% saline. After cryoprotection in 20% sucrose solution, the brains were sectioned on a cryostat at 40-μm thickness. Every other section was mounted on glass slides and stained with cresyl violet. These sections were used to determine the location and extent of the lesions with reference to standardised anatomical sections of the rat brain (Paxinos & Watson, 2005).

Data analysis

Data for each variable were analysed with a repeated-measures ANOVA using PASW statistical software, version 20 (SPSS, Chicago, IL, USA). Sphericity was assessed using Mauchly’s test. In cases that violated this assumption of the ANOVA, the F ratio was tested against more conservative degrees of freedom using the Greenhouse–Geisser correction. Each main effect that reached significance (P < 0.05) was further scrutinised by comparing individual group means using Fisher’s least significant difference (LSD) test. The between-subjects factor (lesion) had three levels: sham, ipsilateral and disconnection. For the delay discounting task, data were analysed according to the within-subjects factor of delay (four levels: 0, 8, 16, and 32 s). For the probability version of the task, data were analysed according to the within-subjects factor of probability (four levels: 1, 1/3, 1/9, and 1/15). The order of testing was counterbalanced across tasks such that half of the rats were tested on the delay task first, whereas the other half were tested on the probability task first. To test for effects of task order, we analysed choice behavior separately for each group with task order as the between-subjects factor (two levels: delay or probability task first) and a within-subjects factor of delay or probability (four levels).
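For readers who wish to reproduce the core analysis outside PASW, a minimal sketch of a one-way repeated-measures ANOVA is given below (Python with NumPy/SciPy; the data are synthetic, and the Greenhouse–Geisser correction and post hoc LSD comparisons are omitted):

```python
import numpy as np
from scipy import stats

def rm_anova(X):
    """One-way repeated-measures ANOVA on an (n_subjects, k_conditions)
    array. Returns (F, df1, df2, p) for the main effect of condition."""
    n, k = X.shape
    grand = X.mean()
    ss_cond = n * ((X.mean(axis=0) - grand) ** 2).sum()    # between conditions
    ss_subj = k * ((X.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_err = ((X - grand) ** 2).sum() - ss_cond - ss_subj  # residual
    df1, df2 = k - 1, (n - 1) * (k - 1)
    F = (ss_cond / df1) / (ss_err / df2)
    return F, df1, df2, stats.f.sf(F, df1, df2)

# Synthetic example: 8 rats x 4 delays, choices of the large reward (max 10)
rng = np.random.default_rng(0)
means = np.array([9.5, 8.0, 5.0, 2.0])  # hypothetical per-delay group means
X = np.clip(means + rng.normal(0, 1, (8, 4)), 0, 10)
F, df1, df2, p = rm_anova(X)
print(f"F({df1},{df2}) = {F:.2f}, p = {p:.3g}")  # strong main effect of delay
```

For k = 2 conditions this reduces to the square of the paired t statistic, which provides a convenient sanity check on the implementation.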

In parametric designs of this kind, some rats do not display normal discounting behavior; for example, rats sometimes form persistent side biases that lead to nearly 100% choice of the small or large reward, irrespective of delay or probability. Of a total of 50 rats, two rats in the vHC/NAc disconnection study (one sham, one ipsilateral) and three rats in the OFC/NAc disconnection study (three ipsilateral) did not display normal discounting behavior in either the delay or the probability version of the task. These rats were excluded from the corresponding analyses.

Results

We applied a well-established automated testing method in rats to study the contribution of hippocampal–accumbens and orbitofrontal–accumbens interactions to two types of decision-making (Fig. 1). The first paradigm involved a trade-off between reward size and the delay of its delivery (known as delay discounting), whereas the second involved a trade-off between reward size and the probability of its delivery (known as probability discounting). Rats were divided into multiple experimental groups (Fig. 2), consisting of vHC/NAc disconnections (lesions in opposite hemispheres), vHC/NAc ipsilateral controls (lesions in the same hemisphere), OFC/NAc disconnections, OFC/NAc ipsilateral controls, and concurrent surgical sham cohorts. The NAc receives unidirectional input from both the vHC and OFC (Fig. 2A). We used a disconnection lesion approach, in which the NAc was ablated in one hemisphere and either the vHC or OFC was ablated in the opposite hemisphere, to investigate the relative roles of these inputs in the two paradigms. This preparation abolishes all direct and indirect intrahemispheric interactions between the two structures. In each case, the disconnection group was compared with both an ‘ipsilateral’ lesion control group, which received lesions to both structures in the same hemisphere, and a surgical sham control group (Fig. 2B and C). All experimental and control groups successfully learned to indicate their preference for the large reward stimulus by making a nosepoke to a touch-sensitive screen, but differed in their performance when faced with more complex decisions. The following sections describe, in turn, the contribution of vHC/NAc interactions and of OFC/NAc interactions to two different types of decisions involving trade-offs about the reward outcome.

Hippocampal–accumbens disconnection lowers tolerance for delays

We first investigated the effect of combined vHC and NAc lesions. To test the role of this circuitry in decisions involving time, we employed the paradigm of delay discounting. In this test, a nosepoke touch response to one of two stimuli resulted in the immediate delivery of a small reward (one sucrose pellet), whereas a response to the other stimulus resulted in a much larger reward (four sucrose pellets) that was delivered after a delay (Fig. 1A). Thus, the rat faced a decision about how long it was willing to wait for a larger reward. By increasing the delay across blocks, we were able to establish how time influenced the rats’ choice of stimulus.

We compared the performance of groups with disconnection lesions (n = 8), ipsilateral lesions (n = 7), and sham control surgery (n = 11). The extents of the vHC and NAc lesions for the disconnection and ipsilateral groups are schematised in Fig. 3A and B. The NAc lesion produced neuronal loss in both core and shell regions between anterior–posterior (AP) level +2.52 mm and AP +1.4 mm. The vHC lesion was also extensive, extending rostrocaudally from AP −4.68 mm to AP −5.60 mm, and included the cornu ammonis fields (CA1–CA3), dentate gyrus, and ventral subiculum. Typically, the vHC lesion encroached on the ventral tip of the dorsal hippocampus, but adjacent structures such as the amygdala and the entorhinal and perirhinal cortices were all spared.

Fig. 3.


The top panel shows a histological analysis of the vHC and NAc lesions. (A) Schematic lesion reconstructions superimposed on coronal rat brain sections depicting the extent of the NAc (left hemisphere) and vHC (right hemisphere) lesions in the disconnection lesion group (red), and (B) the extent of the NAc (left hemisphere) and vHC (left hemisphere) lesions in the ipsilateral lesion group (green). Regions that appear darker indicate greater overlap in the damage present among different rats. Numbers represent the location of sections relative to bregma according to Paxinos & Watson (2005). The bottom panels show the impact of delay and uncertainty on choice of the large reward stimulus in animals with vHC/NAc disconnection lesions (Disc) compared with animals with ipsilateral lesions (Ipsi) and sham control surgery (Sham). (C) Average choice of large reward for each delay to reward delivery, displayed for each test condition: when no delays were present (No delays), when only the large reward was delayed (Delays I), when delays for the small and large reward were equal (Equal delays), and when the large reward was delayed again (Delays II). (D) Choice of large reward as a function of reward uncertainty across 10 sessions for each probability (P = 1, 1/3, 1/9, and 1/15). The last five sessions for each probability were analysed separately (gray shading). *P < 0.05 relative to the sham and ipsilateral vHC/NAc groups across delays. All error bars indicate SEM.

The behavioral data are illustrated in Fig. 3C and D. In the absence of any delays (Fig. 3C, No delays), all cohorts of rats readily chose the large reward stimulus (F2,23 = 2.07, P > 0.05), and responded with similar reaction times [means (± SEM): disconnection, 2.0 s (± 0.3); ipsilateral, 2.2 s (± 0.3); shams, 2.4 s (± 0.2)]. Thus, neither the disconnection nor the ipsilateral control lesion affected the animals’ capacity to discriminate the size of the delivered reward, interpret its value, associate it with a particular response, or execute the nosepoke response. We next systematically increased the delay to the large reward in order to evaluate the tolerance for waiting (Fig. 3C, Delays I). With this manipulation, all rats shifted their preference toward the small, immediate reward (F2,39 = 139.72, P < 0.001). The ipsilateral and sham control rats were willing to endure several seconds of delay in exchange for receipt of the larger reward, and then gradually shifted their responses to the immediate reward stimulus as the delay reached 16 s. This behavioral pattern is typical in animals, which, like humans, treat long time delays as a subjective cost that offsets the benefits of receiving a large reward (Ainslie, 1975). Importantly, because the length of the trial was held constant across delays, irrespective of the animal’s choice, selection of the small reward could not be used as a strategy to initiate the subsequent trial more quickly.

In the animals with the vHC/NAc disconnection lesion, the tolerance for delay was much lower than in the controls (F2,23 = 5.83, P < 0.01), with an interaction between delay and lesion (F3,39 = 2.73, P = 0.05); that is, there was a significant shift toward choosing the small, immediate reward even at the shortest delay (see Fig. 3C, Delays I, red line) (Fisher’s LSD, P < 0.01). Thus, unlike the sham and ipsilateral control cohorts, the disconnection animals were unwilling to wait even 8 s to receive four times the reward amount. This main effect emerged over the course of training (see Fig. S1), suggesting that rats with the vHC/NAc disconnection were highly sensitive to the delays. Despite their normal response times at the zero-second delay, they appeared to respond faster after the delays were introduced [mean (± SEM): disconnection, 2.2 s (± 0.2); ipsilateral, 3.2 s (± 0.3); shams, 3.0 s (± 0.2); F2,23 = 2.876, P = 0.077], as if they were deciding in haste. However, such behavior did not account for their biased choices. In fact, in a subsequent set of control experiments, the same animals reliably chose the stimulus indicating the large reward when both the small and large reward options were equally delayed, although this bias declined at the longest delay (F2,49 = 78.52, P < 0.001; Fig. 3C, Equal delays). Nonetheless, the groups did not differ in this regard (F2,23 = 1.10, P > 0.05), indicating that the animals with disconnection lesions were capable of waiting to some extent. In this context, it is worth noting that human subjects discount large rewards less than normal when both the small and large monetary reward options are associated with long delays (Green et al., 2005). Thus, just like our rats in the equal delays condition, even humans fail to show a consistent preference for the large reward option when they must wait for either choice.
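The logic of the equal delays control can also be expressed in terms of hyperbolic discounting, V = A/(1 + kD): delaying both options by the same amount scales both subjective values by the same factor, so the large reward should remain preferred. A brief sketch (our illustration, with a hypothetical discount rate k):

```python
# Sketch of the equal delays logic under hyperbolic discounting
# (our illustration; k is a hypothetical discount rate, per second).
k = 0.25
for D in (0, 8, 16, 32):
    v_small = 1 / (1 + k * D)   # one pellet, delayed by D
    v_large = 4 / (1 + k * D)   # four pellets, delayed by the same D
    assert abs(v_large / v_small - 4.0) < 1e-12  # ratio independent of D
    print(f"D = {D:2d} s  V(small) = {v_small:.2f}  V(large) = {v_large:.2f}")
```

Because the ratio V(large)/V(small) is independent of the shared delay, a persistent bias toward the small reward in this condition would have pointed to a valuation deficit rather than delay sensitivity; the disconnection group showed no such bias.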

The equal delays procedure confirmed that animals with hippocampal–accumbens disconnection lesions, like the other groups, were able to learn and remember normally the stimulus associated with a delayed, large reward. When the same animals were then returned to the choice situation in which they had to wait for the large reward only (Fig. 3C, Delays II), the group difference re-emerged (F2,23 = 4.89, P < 0.05) as the choice of the large reward declined as a function of delay (F2,48 = 137.16, P < 0.001). There was no interaction between delay and lesion (F4,48 = 2.16, P > 0.05). Again, the strong bias toward the small, immediate reward was specific to the animals with the vHC/NAc disconnection relative to the other two groups (Fisher’s LSD, P < 0.05). Moreover, other aspects of performance, such as trial omissions, speed of response and food collection latencies, were all in the normal range (see Table 2). As shown in Table 2, animals in all groups omitted more trials during the equal delay procedure, presumably because waiting for either a large or small reward was equally demotivating, especially at the long delays. Even here, however, these animals still chose the optimal large reward option on at least 60% of trials, on average, at the longest delay (Fig. 3C, Equal delays), compared with less than 15% when only the large reward was delayed (Fig. 3C, Delays I and Delays II).

Table 2.

Mean latencies (s) and omissions during performance on the delay and probability discounting tasks

                            Delays I               Equal delays           Delays II              Probability discounting
                            NAc/vHC    NAc/OFC     NAc/vHC    NAc/OFC     NAc/vHC    NAc/OFC     NAc/vHC    NAc/OFC
Response latency (s ± SEM)
 Sham                       2.8 (±0.2) 2.5 (±0.1)  3.3 (±0.2) 3.1 (±0.2)  2.5 (±0.2) 2.3 (±0.2)  2.4 (±0.2) 2.0 (±0.1)
 Ipsilateral                3.0 (±0.3) 2.3 (±0.2)  3.6 (±0.3) 3.1 (±0.3)  3.1 (±0.4) 2.6 (±0.4)  2.0 (±0.2) 2.2 (±0.1)
 Disconnection              2.1 (±0.2) 3.1 (±0.3)  3.2 (±0.3) 3.6 (±0.3)  2.4 (±0.3) 3.0 (±0.3)  2.0 (±0.1) 2.2 (±0.2)
Food collection latency (s ± SEM)
 Sham                       1.6 (±0.1) 1.5 (±0.1)  1.6 (±0.2) 2.0 (±0.3)  1.5 (±0.2) 1.7 (±0.2)  1.3 (±0.0) 1.3 (±0.1)
 Ipsilateral                1.8 (±0.2) 1.5 (±0.1)  1.7 (±0.2) 2.0 (±0.2)  1.6 (±0.3) 1.7 (±0.3)  1.2 (±0.1) 1.3 (±0.1)
 Disconnection              1.5 (±0.1) 1.5 (±0.1)  1.7 (±0.3) 1.7 (±0.3)  1.4 (±0.2) 1.6 (±0.3)  1.4 (±0.1) 1.4 (±0.1)
Omissions (± SEM)
 Sham                       1.2 (±0.3) 1.5 (±0.3)  1.5 (±0.3) 6.4 (±1.0)  1.3 (±0.7) 1.1 (±0.7)  0.7 (±0.2) 0.1 (±0.3)
 Ipsilateral                1.6 (±0.5) 0.9 (±0.3)  0.9 (±0.3) 6.6 (±1.3)  1.7 (±0.6) 1.2 (±0.6)  1.0 (±0.4) 1.2 (±0.4)
 Disconnection              1.4 (±0.6) 1.1 (±0.2)  1.1 (±0.2) 5.6 (±1.4)  0.9 (±0.3) 0.4 (±0.3)  0.5 (±0.2) 1.0 (±0.4)

NAc/OFC, NAc and orbitofrontal cortex groups; NAc/vHC, NAc and vHC lesion groups.

Mean values are collapsed across delay (0–32 s) and probability (P = 1−1/15).

We compared this result with another type of decision trade-off, in this case where the reward was always immediate but the large reward was uncertain in its delivery. In this paradigm, often referred to as probability discounting, we systematically decreased the probability of receiving the large reward (Fig. 1B). Rats were tested on each probability for 10 sessions (Fig. 3D). The first five sessions ensured that the rats learned the response–outcome contingency at a given probability through experience. This experience occurred through a high number of ‘forced choice’ trials for each probability (30 in total, 15 per stimulus; see Materials and methods), so that the animal could accurately estimate the probability of reward given its infrequent occurrence (see also, Mobini et al., 2002; Nasrallah et al., 2009, 2011; Mai & Hauber, 2012). Thus, unlike previous studies that include few forced choice trials irrespective of probability (e.g. St Onge & Floresco, 2010; St Onge et al., 2010; Mendez et al., 2012, 2013), we did not attempt to compare different reward probabilities within a single session (see also, Abela & Chudasama, 2013). Instead, we allowed choice behavior to stabilise over 10 consecutive sessions for each probability and analysed sessions 6–10 (Fig. 3D, light gray shaded blocks).

When the probability of receiving the reward was always certain (Fig. 3D, P = 1), all rats preferred the large reward option (F2,22 = 2.14, P > 0.05). Although the disconnection group deviated from the sham control group at the 1/3 reward probability (Fig. 3D, P = 1/3, compare red and blue points), this deviation was shared with the ipsilateral group, indicating that the somewhat more conservative behavior of the operated animals was manifest even when one hemisphere was intact. Nevertheless, the animals still chose the large reward most of the time at this probability; there were no group differences (F2,22 = 2.86, P > 0.05), nor was there an interaction between session and lesion (F5,51 = 0.39, P > 0.05). As the large reward became increasingly uncertain (Fig. 3D, P = 1/9 and P = 1/15), the disconnection group readily shifted their preference toward the small certain reward, an effect that did not differ from the sham or ipsilateral control groups (P = 1/9, F2,22 = 0.984, P > 0.05; P = 1/15, F2,22 = 0.269, P > 0.05). Again, there was no interaction between session and lesion (all P > 0.05). Finally, the choice of the large reward was not affected by the order of testing, i.e. performance within each group did not differ according to whether the animals were tested on the delay or the probability task first (all P > 0.05).
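The shift in preference at the lower probabilities is consistent with simple expected-value arithmetic. As an illustrative sketch (assuming, as stated in the Discussion, that the large reward is four times the size of the small one, and normalising the small certain reward to 1 unit):

```python
# Expected value of each option at the probabilities used in the task.
# Assumption: the large reward is 4x the small reward; the small reward
# is always certain, so its expected value is fixed at 1 unit.
LARGE, SMALL = 4, 1

for p in (1, 1/3, 1/9, 1/15):
    ev_large = LARGE * p
    better = "large" if ev_large > SMALL else "small"
    print(f"P = {p:.3f}: EV(large) = {ev_large:.2f} vs EV(small) = {SMALL} -> {better}")
```

At P = 1 and P = 1/3 the large option still maximises expected value (4 and ≈1.33 units), whereas at P = 1/9 and P = 1/15 (≈0.44 and ≈0.27 units) the small certain reward does, matching the probabilities at which all groups switched preference.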

The results of these two tests together suggest that the hippocampal–accumbens interaction is critical for certain types of decisions that involve trade-offs, and particularly those that involve time. We next compared these results with similar experiments in which we investigated the interaction between the OFC and NAc in the same tasks.

No effect of orbitofrontal–accumbens disconnection on trade-offs involving time or certainty

We repeated the experiments in new cohorts in which the OFC, rather than the vHC, was ablated. Like the vHC, the OFC has been implicated in decision-making and projects directly to the NAc, so the interplay between these structures is also likely to be important for behavior. Using the same two decision-making tasks, we compared animals with OFC/NAc disconnection lesions (n = 8) with animals that received ipsilateral control lesions of the same structures (n = 8) or sham control surgery (n = 8).

Schematic lesion reconstructions depicting the extent of the NAc and OFC lesions in the disconnection and ipsilateral lesion groups are illustrated in Fig. 4A and B. The NAc lesion was as intended, causing neuronal loss in the shell and core regions rostrocaudally from AP +2.4 to AP +1.4 mm from bregma. In the OFC, the lesion extended rostrocaudally from AP +5.7 to AP +2.52 mm from bregma. At its most rostral extent, the lesion spared the medial orbital region in nine animals (five in the disconnection group, four in the ipsilateral group); in all other cases it included the medial, ventral and lateral orbital regions. As shown in section AP +4.7, the lesion encroached on the ventral tip of the rostral prelimbic cortex in some animals. Otherwise, all neighboring areas, including the infralimbic, prelimbic and dorsal peduncular cortices, were spared.

Fig. 4.

Fig. 4

The top panel shows the extent of OFC and NAc lesions. (A) Schematic lesion reconstructions superimposed on coronal rat brain sections depicting the extent of the NAc (left hemisphere) and OFC (right hemisphere) lesions in the disconnection lesion group (orange), and (B) the NAc (left hemisphere) and OFC (left hemisphere) lesions in the ipsilateral lesion group in which both lesions were made to the same hemisphere (purple). Greater overlap in lesions amongst animals is indicated by darker shading. Numbers represent the location of sections relative to bregma according to Paxinos & Watson (2005). The bottom two panels show the impact of delay and uncertainty on choice of the large reward stimulus in animals with OFC and NAc disconnection lesions (Disc) compared with animals with ipsilateral lesions (Ipsi) and sham controls (Sham). (C) Average percentage choice of large reward for each delay to reward delivery, displayed for each test condition: when no delays were present (No delays), when only the large reward was delayed (Delays I), when delays for the small and large reward were equal (Equal delays), and when the large reward was delayed again (Delays II). (D) The choice of large reward as a function of reward uncertainty for 10 sessions at each probability (P = 1, 1/3, 1/9, and 1/15). The last five sessions at each probability were analysed separately (gray shading). All error bars indicate SEM.

The behavioral data in Fig. 4C and D demonstrate that, in contrast to the vHC/NAc disconnection group, none of the OFC/NAc lesion groups was affected in their decision-making on the same tasks. In the delay discounting task, a typical pattern of choice behavior was observed: all rats preferred the large reward when it was delivered immediately (Fig. 4C, No delays; F2,21 = 1.65, P > 0.05), but shifted their preference towards the small immediate reward as the delay to the large reward progressively increased (Fig. 4C, Delays I; F2,38 = 136.70, P < 0.001). The groups did not differ (F2,21 = 0.34, P > 0.05), nor was there a significant delay by lesion interaction (F4,38 = 0.428, P > 0.05). Thus, the disconnection group behaved similarly to the controls, indicating that the OFC interaction with the NAc affects neither the capacity to represent the value of delayed rewards nor the generation of decisions regarding trade-offs involving time. When the trade-off was removed (Fig. 4C, Equal delays), the preference for the large reward was maintained (F2,39 = 75.65, P < 0.001) and, again, there was no group difference (F2,21 = 1.17, P > 0.05) and no interaction (F4,39 = 0.957, P > 0.05). When the large reward was delayed again (Fig. 4C, Delays II), the OFC/NAc group continued to show normal choice behavior; they reduced their preference for the large reward with increasing delays (F2,49 = 167.03, P < 0.001), just like the control groups (F2,21 = 0.61, P > 0.05). These findings suggest that, whereas the vHC/NAc interaction contributes to the animals’ tolerance for delay, the OFC/NAc interaction does not.

Contrary to our expectations, the OFC/NAc disconnection group also responded normally to manipulations of reward uncertainty (Fig. 4D). Because previous work has pointed to both the OFC (Kheramin et al., 2002; Abela & Chudasama, 2013) and the NAc (Cardinal & Howes, 2005; Acheson et al., 2006) as being important for decisions that involve uncertain outcomes, we anticipated that disconnecting these two structures would affect probability discounting. However, our results demonstrated that this was not the case. When the reward was certain (P = 1), all rats chose the large reward (F2,20 = 1.12, P > 0.05). When the large reward was only slightly uncertain (P = 1/3), all rats maintained their preference for the large reward (F2,20 = 0.07, P > 0.05). As the large reward became increasingly unlikely (P = 1/9 and P = 1/15), the rats switched their preference to the small reward, but there was no main effect of group at either probability (P > 0.05). For all probabilities tested, there was no interaction between session and lesion (all P > 0.05). Finally, the animals’ choice of the large reward within each group was not affected by the order of testing (all P > 0.05), and all other general aspects of performance, such as omissions and latencies to respond and to collect food, were within the normal range (see Table 2).

Discussion

In this study we present evidence that certain types of decision-making depend on the interaction of the NAc and vHC. This interaction appears to be particularly important for decisions that involve time. When rats were required to weigh the cost of waiting against the receipt of a large reward in the future, disruption of the vHC/NAc interaction shifted their decision-making. Rats without a functioning vHC/NAc circuit in either hemisphere were unwilling to wait, even for a few seconds, for a food reward that was four times the size of the small reward.
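This behavioral shift can be conceptualised with the standard hyperbolic discounting model, V = A/(1 + kD) (cf. Ainslie, 1975), in which a larger discount rate k causes even short delays to collapse the subjective value of the large reward. The sketch below is purely illustrative: the k values and the 8-s delay are assumptions for the example, not parameters fitted to our data.

```python
def discounted_value(amount, delay_s, k):
    """Hyperbolic discounting: subjective value V = A / (1 + k * D)."""
    return amount / (1 + k * delay_s)

LARGE, SMALL, DELAY = 4, 1, 8  # large reward 4x small; 8-s delay, small reward immediate

for k, label in ((0.1, "shallow discounter (control-like)"),
                 (1.0, "steep discounter (disconnection-like)")):
    v_large = discounted_value(LARGE, DELAY, k)
    choice = "large delayed" if v_large > SMALL else "small immediate"
    print(f"k = {k}: V(large) = {v_large:.2f} -> chooses {choice}  [{label}]")
```

With k = 0.1 the delayed large reward retains more subjective value than the immediate small one (4/1.8 ≈ 2.22 vs. 1), whereas with k = 1.0 it does not (4/9 ≈ 0.44 vs. 1), mirroring the disconnection group's refusal to wait even a few seconds.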

In interpreting these results, it is important to point out that our disconnection paradigm involved lesions in contralateral hemispheres but no commissural transection, a procedure that is rare and difficult in rats because it often causes unwanted collateral damage (Dunnett et al., 2005). Thus, projections through the commissures could, in principle, preserve some function between the lesioned areas through interhemispheric interactions, potentially complicating the interpretation of null results. However, the robust shifts in delay discounting behavior that we observed in the animals with vHC/NAc crossed lesions were not affected by this possibility. Moreover, the absence of any effect in the ipsilateral-lesioned control group argues strongly that the intrahemispheric interaction between the hippocampus and NAc is critical for this type of decision-making. Considering the potential nature of this functional interaction more specifically, the most obvious candidate is the known anatomical projection from the vHC to the NAc in each hemisphere. Although this interpretation cannot be definitively proven with these methods, the qualitatively similar deficit following bilateral lesions of the vHC or NAc (Cardinal et al., 2001; Acheson et al., 2006; McHugh et al., 2008; Abela & Chudasama, 2013) is also consistent with this possibility.

The direct projections from the vHC to the NAc are thought to exert an important influence on behavior through the control of dopamine levels in the NAc. One obvious question is whether the observed behavioral shift is primarily a product of dopamine dysregulation and its effect on decision-making. There is evidence that disruption of either the NAc or the vHC can alter the firing of dopamine neurons in the ventral tegmental area of the midbrain (Grace et al., 2007). Moreover, within the NAc, vHC projections converge with dopaminergic terminals from the ventral tegmental area (Totterdell & Smith, 1989; Sesack & Pickel, 1990). Electrophysiological responses of neurons in the NAc signal the value of future choices (Roesch et al., 2009; Day et al., 2010, 2011), and blocking glutamate receptors at various points in the vHC/NAc/ventral tegmental area circuit (Blaha et al., 1997; Legault et al., 2000; Taepavarapruk et al., 2000) alters dopamine neuromodulation in the NAc (Yang & Mogenson, 1984; DeFrance et al., 1985; O’Donnell & Grace, 1998; Floresco et al., 2001). The direct vHC projections to the NAc can influence dopamine levels there by at least two mechanisms: they can trigger the direct release of dopamine from dopaminergic boutons, and they can alter the activity of the ventral tegmental area indirectly by modulating NAc projection neurons. Thus, the anatomical and physiological evidence points to dopamine dysregulation as a likely consequence of our vHC/NAc disconnection manipulation.

At the same time, studies that have explicitly examined the role of dopamine levels in experiments involving time trade-offs have failed to find that such dysregulation impacts decision-making. For example, depletion of local dopamine levels in the NAc does not affect time-discounting judgments in decision-making tasks (Winstanley et al., 2005), nor does it affect decision-making in other settings (Walton et al., 2009; see also Mai & Hauber, 2012). This finding is difficult to reconcile with the proposition that the vHC/NAc interaction underlying the deficits we observed acts chiefly through the regulation of dopamine. It is worth noting that reduced dopamine utilisation within the OFC has been shown to significantly impact decisions that require a time investment (Kheramin et al., 2004; Winstanley et al., 2006). However, our finding that OFC/NAc disconnection had no effect on decision-making further suggests that the main effect of the vHC/NAc crossed lesions was not due to dopamine dysregulation per se.

The amygdala is another potential component of the intrahemispheric circuit involving the NAc and vHC, and may thus also have contributed to the observed results. The basolateral amygdala, which also projects strongly to the NAc (Kelley et al., 1982; Robinson & Beart, 1988), shares reciprocal connections with both the vHC and OFC (Brog et al., 1993; Pitkänen et al., 2000; Aggleton et al., 2015), and is thought to contribute to certain types of decisions, particularly those involving value and emotion (Izquierdo et al., 2004). As lesions of the amygdala disrupt optimal choice selection in time discounting tasks (Winstanley et al., 2004), it is plausible that its interaction with these structures played an important role in the vHC/NAc disconnection result. However, this interpretation would need to explain why rats with OFC/NAc disconnections were unimpaired on either decision-making task. The idea that optimal choice selection in those cases stemmed from a preserved interaction between the amygdala and vHC remains an open possibility and warrants further investigation. We can, however, rule out the possibility that the observed disconnection effects resulted from disruption of input from the vHC to the OFC, as this interaction was intact in one hemisphere in each group. Thus, the weight of the evidence points towards a pivotal role of the vHC, and its direct innervation of the NAc, in shaping decisions that involve time.

One notable finding in our study was the overall lack of impairment following the OFC and NAc disconnections. Several studies have indicated that these two structures contribute to decision-making by encoding or updating the value of expected rewards (Schoenbaum et al., 2002; Izquierdo et al., 2004; Roesch & Olson, 2004; Kable & Glimcher, 2007; Roesch et al., 2009), and both structures are sensitive to the effects of reward uncertainty (Cardinal et al., 2001; Kheramin et al., 2002; Mobini et al., 2002; Abela & Chudasama, 2013; Stopper et al., 2014; but see St Onge & Floresco, 2010). Our OFC/NAc disconnection did not cause any change in choice behavior for this type of decision, suggesting that, although both structures may independently contribute to such decisions, their intrahemispheric interaction is not critical. It is relevant that rats with OFC lesions show variable effects on discounting tasks, and this might depend on subtle differences in task design. For example, unlike previous studies (e.g. St Onge & Floresco, 2010; St Onge et al., 2010), we chose to include a low probability to assess choice behavior under conditions of low likelihood of reward (i.e. 0.067 or 1/15), necessitating a minimum of 30 forced choice trials (15 per stimulus). Consequently, we did not attempt to compare different reward probabilities across blocks within a single session. Instead, we allowed choice behavior to stabilise at one probability before moving on to the next. This ensured that the rat received sufficient exposure to the contingencies to estimate the probability of reward accurately despite its infrequent occurrence (see also Mobini et al., 2002; Nasrallah et al., 2009, 2011; Mai & Hauber, 2012). Our design also prevents satiation from shifting the animal’s motivational state (Mai et al., 2012).
Nevertheless, the difference between across-block and within-session shifts does not detract from the conceptual finding that circuits involving the NAc and vHC, but not the OFC, shape decisions about time. The observation that bilateral OFC lesions or inactivations do not invariably produce risky choice, irrespective of design (St Onge & Floresco, 2010; Abela & Chudasama, 2013), is consistent with this idea.

It is also possible that the testing procedure used in the current study was insensitive to the OFC/NAc disconnection, making conclusions about the role of these structures in probability discounting more difficult. We think this unlikely because we have previously shown, using identical testing procedures, that only bilateral OFC lesions reduce tolerance to uncertainty (Abela & Chudasama, 2013), a finding also reported by others (e.g. Mobini et al., 2002). Our data suggest that the behavioral differences following OFC or NAc lesions should be considered in terms of their interacting effects within a broader anatomical circuitry.

Time or probability discounting is one aspect of decision-making, but there is evidence for decisions in which the trade-off involves effort or reward type, rather than reward magnitude, and the final choice may therefore draw upon areas implicated in other types of decision-making (Izquierdo et al., 2004; Floresco & Ghods-Sharifi, 2007). Decision-making has also been assessed in situations in which two possible responses are in conflict: approach toward an attractive food versus avoidance of a feared object, such as a snake. Such emotional decision-making relies heavily on the OFC and amygdala (Izquierdo et al., 2005) as well as the hippocampus (McNaughton & Gray, 2000; Chudasama et al., 2008, 2009). Moreover, disrupting the interaction between the OFC and amygdala impairs the appropriate assessment of reward value when making choices associated with different outcomes (Baxter et al., 2000; Zeeb & Winstanley, 2013). Further studies are required to refine our knowledge of the interactions between these structures within a broader decision-making circuitry, and of how they enable the animal to overcome a variety of costs to maximise long-term benefits.

Impairments in decision-making are a hallmark of virtually all human neuropsychiatric disorders as well as many neurological patient groups. They also underlie a range of social problems in which daily decisions, such as drug use, gambling or credit card binges, are marked by a shift toward immediate gratification. Our identification of the hippocampal–accumbens circuit as critical for decisions about time is important for understanding how the brain uses the representation of temporal structure in the hippocampus to arrive at decisions that involve waiting. As such, these results provide insight into the nature of dysfunctional neural circuits that affect a range of human afflictions in which decision-making is affected, and suggest that the hippocampal circuitry is a potentially valuable therapeutic target.

Supplementary Material

Figure_S1

Acknowledgments

This work was supported by grants from the Canadian Institute of Health Research (CIHR 102507) and the Canadian Foundation for Innovation (CFI 14033) awarded to Y.C. A.R.A. was supported by an Alexander Graham Bell Canada Graduate Scholarship from the Natural Sciences and Engineering Research Council of Canada. Y.C. is a member of the Center for Studies in Behavioral Neurobiology (CSBN). We thank André St-Jacques for help with behavioral testing. A.R.A is now at the Department of Neuroscience, Centre for Addiction and Mental Health, Toronto, Canada. The authors have no conflicts of interest to declare.

Abbreviations

AP, anterior–posterior; NAc, nucleus accumbens; OFC, orbitofrontal cortex; vHC, ventral hippocampus

Footnotes

Supporting Information

Additional supporting information can be found in the online version of this article:

Fig. S1. Temporal discounting training data for 10 days (two blocks of five sessions) compared with final block of stable performance (i.e. Delays I).

References

  1. Abela AR, Chudasama Y. Dissociable contributions of the ventral hippocampus and orbitofrontal cortex to decision-making with a delayed or uncertain outcome. Eur J Neurosci. 2013;37:640–647. doi: 10.1111/ejn.12071.
  2. Abela AR, Chudasama Y. Noradrenergic α2A-receptor stimulation in the ventral hippocampus reduces impulsive decision-making. Psychopharmacology. 2014;231:521–531. doi: 10.1007/s00213-013-3262-y.
  3. Acheson A, Farrar AM, Patak M, Hausknecht KA, Kieres AK, Choi S, de Wit H, Richards JB. Nucleus accumbens lesions decrease sensitivity to rapid changes in the delay to reinforcement. Behav Brain Res. 2006;173:217–228. doi: 10.1016/j.bbr.2006.06.024.
  4. Aggleton JP, Wright NF, Rosene DL, Saunders RC. Complementary patterns of direct amygdala and hippocampal projections to the macaque prefrontal cortex. Cereb Cortex. 2015. doi: 10.1093/cercor/bhv019. [Epub ahead of print]
  5. Ainslie G. Specious reward: a behavioral theory of impulsiveness and impulse control. Psychol Bull. 1975;82:463–496. doi: 10.1037/h0076860.
  6. Baxter MG, Parker A, Lindner CCC, Izquierdo AD, Murray EA. Control of response selection by reinforcer value requires interaction of amygdala and orbital prefrontal cortex. J Neurosci. 2000;20:4311–4319. doi: 10.1523/JNEUROSCI.20-11-04311.2000.
  7. Blaha CD, Yang CR, Floresco SB, Barr AM, Phillips AG. Stimulation of the ventral subiculum of the hippocampus evokes glutamate receptor-mediated changes in dopamine efflux in the rat nucleus accumbens. Eur J Neurosci. 1997;9:902–911. doi: 10.1111/j.1460-9568.1997.tb01441.x.
  8. Brog JS, Salyapongse A, Deutch AY, Zahm DS. The patterns of afferent innervation of the core and shell in the “accumbens” part of the rat ventral striatum: immunohistochemical detection of retrogradely transported fluoro-gold. J Comp Neurol. 1993;338:255–278. doi: 10.1002/cne.903380209.
  9. Burton AC, Kashtelyan V, Bryden DW, Roesch MR. Increased firing to cues that predict low-value reward in the medial orbitofrontal cortex. Cereb Cortex. 2014;24:3310–3321. doi: 10.1093/cercor/bht189.
  10. Calabrese E, Badea A, Watson C, Johnson GA. A quantitative magnetic resonance histology atlas of postnatal rat brain development with regional estimates of growth and variability. NeuroImage. 2013;71:196–206. doi: 10.1016/j.neuroimage.2013.01.017.
  11. Cardinal RN, Aitken MRF. Whisker: a client-server high-performance multimedia research control system. Behav Res Methods. 2010;42:1059–1071. doi: 10.3758/BRM.42.4.1059.
  12. Cardinal RN, Howes NJ. Effects of lesions of the nucleus accumbens core on choice between small certain rewards and large uncertain rewards in rats. BMC Neurosci. 2005;6:37. doi: 10.1186/1471-2202-6-37.
  13. Cardinal RN, Pennicott DR, Sugathapala CL, Robbins TW, Everitt BJ. Impulsive choice induced in rats by lesions of the nucleus accumbens core. Science. 2001;292:2499–2501. doi: 10.1126/science.1060818.
  14. Cheung THC, Cardinal RN. Hippocampal lesions facilitate instrumental learning with delayed reinforcement but induce impulsive choice in rats. BMC Neurosci. 2005;6:36. doi: 10.1186/1471-2202-6-36.
  15. Chudasama Y, Wright KS, Murray EA. Hippocampal lesions in rhesus monkeys disrupt emotional responses but not reinforcer devaluation effects. Biol Psychiat. 2008;63:1084–1091. doi: 10.1016/j.biopsych.2007.11.012.
  16. Chudasama Y, Izquierdo A, Murray EA. Distinct contributions of the amygdala and hippocampus to fear expression. Eur J Neurosci. 2009;30:2327–2337. doi: 10.1111/j.1460-9568.2009.07012.x.
  17. Day JJ, Jones JL, Wightman RM, Carelli RM. Phasic nucleus accumbens dopamine release encodes effort- and delay-related costs. Biol Psychiat. 2010;68:306–309. doi: 10.1016/j.biopsych.2010.03.026.
  18. Day JJ, Jones JL, Carelli RM. Nucleus accumbens neurons encode predicted and ongoing reward costs in rats. Eur J Neurosci. 2011;33:308–321. doi: 10.1111/j.1460-9568.2010.07531.x.
  19. DeFrance JF, Marchand JF, Sikes RW, Chronister RB, Hubbard JI. Characterization of fimbria input to nucleus accumbens. J Neurophysiol. 1985;54:1553–1567. doi: 10.1152/jn.1985.54.6.1553.
  20. Dunnett SB, Meldrum A, Muir JL. Frontal-striatal disconnection disrupts cognitive performance of the frontal-type in the rat. Neuroscience. 2005;135:1055–1065. doi: 10.1016/j.neuroscience.2005.07.033.
  21. Fellows LK, Farah MJ. Dissociable elements of human foresight: a role for the ventromedial frontal lobes in framing the future, but not in discounting future rewards. Neuropsychologia. 2005;43:1214–1221. doi: 10.1016/j.neuropsychologia.2004.07.018.
  22. Ferbinteanu J, Shapiro ML. Prospective and retrospective memory coding in the hippocampus. Neuron. 2003;40:1227–1239. doi: 10.1016/s0896-6273(03)00752-9.
  23. Floresco SB, Ghods-Sharifi S. Amygdala-prefrontal cortical circuitry regulates effort-based decision making. Cereb Cortex. 2007;17:251–260. doi: 10.1093/cercor/bhj143.
  24. Floresco SB, Todd CL, Grace AA. Glutamatergic afferents from the hippocampus to the nucleus accumbens regulate activity of ventral tegmental area dopamine neurons. J Neurosci. 2001;21:4915–4922. doi: 10.1523/JNEUROSCI.21-13-04915.2001.
  25. Gleichgerrcht E, Ibáñez A, Roca M, Torralva T, Manes F. Decision-making cognition in neurodegenerative diseases. Nat Rev Neurol. 2010;6:611–623. doi: 10.1038/nrneurol.2010.148.
  26. Grace AA, Floresco SB, Goto Y, Lodge DJ. Regulation of firing of dopaminergic neurons and control of goal-directed behaviors. Trends Neurosci. 2007;30:220–227. doi: 10.1016/j.tins.2007.03.003.
  27. Green L, Myerson J, Macaux EW. Temporal discounting when the choice is between two delayed rewards. J Exp Psychol. 2005;31:1121–1133. doi: 10.1037/0278-7393.31.5.1121.
  28. Izquierdo A, Suda RK, Murray EA. Bilateral orbital prefrontal cortex lesions in rhesus monkeys disrupt choices guided by both reward value and reward contingency. J Neurosci. 2004;24:7540–7548. doi: 10.1523/JNEUROSCI.1921-04.2004.
  29. Izquierdo A, Suda RK, Murray EA. Comparison of the effects of bilateral orbital prefrontal cortex lesions and amygdala lesions on emotional responses in rhesus monkeys. J Neurosci. 2005;25:8534–8542. doi: 10.1523/JNEUROSCI.1232-05.2005.
  30. Johnson A, van der Meer MAA, Redish AD. Integrating hippocampus and striatum in decision-making. Curr Opin Neurobiol. 2007;17:692–697. doi: 10.1016/j.conb.2008.01.003.
  31. Kable JW, Glimcher PW. The neural correlates of subjective value during intertemporal choice. Nat Neurosci. 2007;10:1625–1633. doi: 10.1038/nn2007.
  32. Kelley AE, Domesick VB. The distribution of the projection from the hippocampal formation to the nucleus accumbens in the rat: an anterograde- and retrograde-horseradish peroxidase study. Neuroscience. 1982;7:2321–2335. doi: 10.1016/0306-4522(82)90198-1.
  33. Kelley AE, Domesick VB, Nauta WJ. The amygdalostriatal projection in the rat – an anatomical study by anterograde and retrograde tracing methods. Neuroscience. 1982;7:615–630. doi: 10.1016/0306-4522(82)90067-7.
  34. Kheramin S, Body S, Mobini S, Ho MY, Velázquez-Martinez DN, Bradshaw CM, Szabadi E, Deakin JFW, Anderson IM. Effects of quinolinic acid-induced lesions of the orbital prefrontal cortex on inter-temporal choice: a quantitative analysis. Psychopharmacology. 2002;165:9–17. doi: 10.1007/s00213-002-1228-6.
  35. Kheramin S, Body S, Ho MY, Velázquez-Martinez DN, Bradshaw CM, Szabadi E, Deakin JFW, Anderson IM. Effects of orbital prefrontal cortex dopamine depletion on inter-temporal choice: a quantitative analysis. Psychopharmacology. 2004;175:206–214. doi: 10.1007/s00213-004-1813-y.
  36. Kwan D, Craver CF, Green L, Myerson J, Boyer P, Rosenbaum RS. Future decision-making without episodic mental time travel. Hippocampus. 2012;22:1215–1219. doi: 10.1002/hipo.20981.
  37. Legault M, Rompré PP, Wise RA. Chemical stimulation of the ventral hippocampus elevates nucleus accumbens dopamine by activating dopaminergic neurons of the ventral tegmental area. J Neurosci. 2000;20:1635–1642. doi: 10.1523/JNEUROSCI.20-04-01635.2000.
  38. Mai B, Hauber W. Intact risk-based decision making in rats with prefrontal or accumbens dopamine depletion. Cogn Affect Behav Ne. 2012;12:719–729. doi: 10.3758/s13415-012-0115-9.
  39. Mai B, Sommer S, Hauber W. Motivational states influence effort-based decision making in rats: the role of dopamine in the nucleus accumbens. Cogn Affect Behav Ne. 2012;12:74–84. doi: 10.3758/s13415-011-0068-4.
  40. Mariano TY, Bannerman DM, McHugh SB, Preston TJ, Rudebeck PH, Rudebeck SR, Rawlins JNP, Walton ME, Rushworth MFS, Baxter MG, Campbell TG. Impulsive choice in hippocampal but not orbitofrontal cortex-lesioned rats on a nonspatial decision-making maze task. Eur J Neurosci. 2009;30:472–484. doi: 10.1111/j.1460-9568.2009.06837.x.
  41. McHugh SB, Campbell TG, Taylor AM, Rawlins JNP, Bannerman DM. A role for dorsal and ventral hippocampus in inter-temporal choice cost-benefit decision making. Behav Neurosci. 2008;122:1–8. doi: 10.1037/0735-7044.122.1.1.
  42. McNaughton N, Gray JA. Anxiolytic action on the behavioural inhibition system implies multiple types of arousal contribute to anxiety. J Affect Disorders. 2000;61:161–176. doi: 10.1016/s0165-0327(00)00344-x.
  43. Mendez IA, Gilbert RJ, Bizon JL, Setlow B. Effects of acute administration of nicotinic and muscarinic cholinergic agonists and antagonists on performance in different cost-benefit decision making tasks in rats. Psychopharmacology. 2012;224:489–499. doi: 10.1007/s00213-012-2777-y.
  44. Mendez IA, Damborsky JC, Winzer-Serhan UH, Bizon JL, Setlow B. α4β2 and α7 nicotinic acetylcholine receptor binding predicts choice preference in two cost-benefit decision-making tasks. Neuroscience. 2013;230:121–131. doi: 10.1016/j.neuroscience.2012.10.067.
  45. Mobini S, Body S, Ho M, Bradshaw C, Szabadi E, Deakin J, Anderson I. Effects of lesions of the orbitofrontal cortex on sensitivity to delayed and probabilistic reinforcement. Psychopharmacology. 2002;160:290–298. doi: 10.1007/s00213-001-0983-0.
  46. Nasrallah NA, Yang TWH, Bernstein IL. Long-term risk preference and suboptimal decision making following adolescent alcohol use. Proc Natl Acad Sci USA. 2009;106:17600–17604. doi: 10.1073/pnas.0906629106.
  47. Nasrallah NA, Clark JJ, Collins AL, Akers CA, Phillips PE, Bernstein IL. Risk preference following adolescent alcohol use is associated with corrupted encoding of costs but not rewards by mesolimbic dopamine. Proc Natl Acad Sci USA. 2011;108:5466–5471. doi: 10.1073/pnas.1017732108.
  48. O’Donnell P, Grace AA. Phencyclidine interferes with the hippocampal gating of nucleus accumbens neuronal activity in vivo. Neuroscience. 1998;87:823–830. doi: 10.1016/s0306-4522(98)00190-0.
  49. Paxinos G, Watson C. The Rat Brain in Stereotaxic Coordinates. 5th edn. Elsevier Academic Press; New York: 2005.
  50. Peters J, Büchel C. Episodic future thinking reduces reward delay discounting through an enhancement of prefrontal-mediotemporal interactions. Neuron. 2010;66:138–148. doi: 10.1016/j.neuron.2010.03.026.
  51. Pitkänen A, Pikkarainen M, Nurminen N, Ylinen A. Reciprocal connections between the amygdala and the hippocampal formation, perirhinal cortex, and postrhinal cortex in rat. A review. Ann NY Acad Sci. 2000;911:369–391. doi: 10.1111/j.1749-6632.2000.tb06738.x.
  52. Pothuizen HHJ, Jongen-Rêlo AL, Feldon J, Yee BK. Double dissociation of the effects of selective nucleus accumbens core and shell lesions on impulsive-choice behaviour and salience learning in rats. Eur J Neurosci. 2005;22:2605–2616. doi: 10.1111/j.1460-9568.2005.04388.x.
  53. Robinson TG, Beart PM. Excitant amino acid projections from rat amygdala and thalamus to nucleus accumbens. Brain Res Bull. 1988;20:467–471. doi: 10.1016/0361-9230(88)90136-0.
  54. Roesch MR, Olson CR. Neuronal activity related to reward value and motivation in primate frontal cortex. Science. 2004;304:307–310. doi: 10.1126/science.1093223.
  55. Roesch MR, Singh T, Brown PL, Mullins SE, Schoenbaum G. Ventral striatal neurons encode the value of the chosen action in rats deciding between differently delayed or sized rewards. J Neurosci. 2009;29:13365–13376. doi: 10.1523/JNEUROSCI.2572-09.2009.
  56. Rudebeck PH, Walton ME, Smyth AN, Bannerman DM, Rushworth MFS. Separate neural pathways process different decision costs. Nat Neurosci. 2006;9:1161–1168. doi: 10.1038/nn1756.
  57. Schacter DL, Addis DR, Buckner RL. Remembering the past to imagine the future: the prospective brain. Nat Rev Neurosci. 2007;8:657–661. doi: 10.1038/nrn2213. [DOI] [PubMed] [Google Scholar]
  58. Schoenbaum G, Nugent SL, Saddoris MP, Setlow B. Orbitofrontal lesions in rats impair reversal but not acquisition of go, no-go odor discriminations. NeuroReport. 2002;13:885–890. doi: 10.1097/00001756-200205070-00030. [DOI] [PubMed] [Google Scholar]
  59. Schoenbaum G, Setlow B, Saddoris MP, Gallagher M. Encoding predicted outcome and acquired value in orbitofrontal cortex during cue sampling depends upon input from basolateral amygdala. Neuron. 2003;39:855–867. doi: 10.1016/s0896-6273(03)00474-4. [DOI] [PubMed] [Google Scholar]
  60. Sesack SR, Pickel VM. In the rat medial nucleus accumbens, hippocampal and catecholaminergic terminals converge on spiny neurons and are in apposition to each other. Brain Res. 1990;527:266–279. doi: 10.1016/0006-8993(90)91146-8. [DOI] [PubMed] [Google Scholar]
  61. St Onge JR, Floresco SB. Prefrontal cortical contribution to risk-based decision making. Cereb Cortex. 2010;20:1816–1828. doi: 10.1093/cercor/bhp250. [DOI] [PubMed] [Google Scholar]
  62. St Onge JR, Chiu YC, Floresco SB. Differential effects of dopaminergic manipulations on risky choice. Psychopharmacology. 2010;211:209–221. doi: 10.1007/s00213-010-1883-y. [DOI] [PubMed] [Google Scholar]
  63. Stopper CM, Green EB, Floresco SB. Selective involvement by the medial orbitofrontal cortex in biasing risky, but not impulsive, choice. Cereb Cortex. 2014;24:154–162. doi: 10.1093/cercor/bhs297. [DOI] [PubMed] [Google Scholar]
  64. Taepavarapruk P, Floresco SB, Phillips AG. Hyperlocomotion and increased dopamine efflux in the rat nucleus accumbens evoked by electrical stimulation of the ventral subiculum: role of ionotropic glutamate and dopamine D1 receptors. Psychopharmacology. 2000;151:242–251. doi: 10.1007/s002130000376. [DOI] [PubMed] [Google Scholar]
  65. Totterdell S, Smith AD. Convergence of hippocampal and dopaminergic input onto identified neurons in the nucleus accumbens of the rat. J Chem Neuroanat. 1989;2:285–298. [PubMed] [Google Scholar]
  66. Walton ME, Groves J, Jennings KA, Croxson PL, Sharp T, Rushworth MFS, Bannerman DM. Comparing the role of the anterior cingulate cortex and 6-hydroxydopamine nucleus accumbens lesions on operant effort-based decision making. Eur J Neurosci. 2009;29:1678–1691. doi: 10.1111/j.1460-9568.2009.06726.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  67. Winstanley CA, Theobald DEH, Cardinal RN, Robbins TW. Contrasting roles of basolateral amygdala and orbitofrontal cortex in impulsive choice. J Neurosci. 2004;24:4718–4722. doi: 10.1523/JNEUROSCI.5606-03.2004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  68. Winstanley CA, Theobald DEH, Dalley JW, Robbins TW. Interactions between serotonin and dopamine in the control of impulsive choice in rats: therapeutic implications for impulse control disorders. Neuropsychopharmacology. 2005;30:669–682. doi: 10.1038/sj.npp.1300610. [DOI] [PubMed] [Google Scholar]
  69. Winstanley CA, Theobald DEH, Dalley JW, Cardinal RN, Robbins TW. Double dissociation between serotonergic and dopaminergic modulation of medial prefrontal and orbitofrontal cortex during a test of impulsive choice. Cereb Cortex. 2006;16:106–114. doi: 10.1093/cercor/bhi088. [DOI] [PubMed] [Google Scholar]
  70. Yang CR, Mogenson GJ. Electrophysiological responses of neurones in the nucleus accumbens to hippocampal stimulation and the attenuation of the excitatory responses by the mesolimbic dopaminergic system. Brain Res. 1984;324:69–84. doi: 10.1016/0006-8993(84)90623-1. [DOI] [PubMed] [Google Scholar]
  71. Zeeb FD, Winstanley CA. Functional disconnection of the orbitofrontal cortex and basolateral amygdala impairs acquisition of a rat gambling task and disrupts animals’ ability to alter decision-making behavior after reinforcer devaluation. J Neurosci. 2013;33:6434–6443. doi: 10.1523/JNEUROSCI.3971-12.2013. [DOI] [PMC free article] [PubMed] [Google Scholar]
  72. Zeeb FD, Floresco SB, Winstanley CA. Contributions of the orbitofrontal cortex to impulsive choice: interactions with basal levels of impulsivity, dopamine signalling, and reward-related cues. Psychopharmacology. 2010;211:87–98. doi: 10.1007/s00213-010-1871-2. [DOI] [PubMed] [Google Scholar]

Supplementary Materials

Figure_S1
