Abstract
Intertemporal choices are common and consequential to private and public life. Thus, there is considerable interest in understanding the neural basis of intertemporal decision making. In this minireview, we briefly describe conceptual and psychological perspectives on intertemporal choice and then provide a comprehensive evaluation of the neural structures and signals that comprise the underlying cortico-limbic-striatal circuit. Even though great advances have been made, our understanding of the neurobiology of intertemporal choice is still in its infancy because of the complex and dynamic nature of this form of decision making. We close by briefly discussing recommendations for future research on intertemporal choice.
Keywords: decision making, intertemporal choice, neural processes, psychological processes, rodents
1.0 Introduction
Among the countless behaviors studied by neuroscientists, value-based decision making engenders great fascination because of the ease with which people can think of examples from their own lives: routine decisions like selecting what/when to eat as well as more profound decisions like choosing whether to pursue an advanced degree or determining the right time to invest in a home. Such examples highlight the fact that many of the value-based decisions humans and other animals regularly face are intertemporal choices, decisions between options available at different times.
Individuals across species respond to intertemporal choices by delay discounting, a phenomenon by which their subjective valuation of a reward declines as the delay to its receipt increases (Ainslie, 1975; Mazur, 1997). Not only do individuals prefer an immediately available reward over a delayed reward when given the choice between equally-sized reward options, but their preference for an objectively larger reward is also reduced when its receipt is delayed relative to the smaller reward. Even though delay discounting is the norm, the rate at which individuals discount future reward varies greatly, with high discounting rates correlating with a variety of outcomes, including psychiatric diseases and behavioral disorders (Koffarnus, Jarmolowicz, Mueller, & Bickel, 2013), college GPA (Kirby, Winston, & Santiesteban, 2005), texting while driving (Hayashi, Russo, & Wirth, 2015), environmental investment (Hardisty & Weber, 2009), and attitudes toward social policy (Weatherly, Plumm, & Derenne, 2011).
Because intertemporal choices are common and consequential to private and public life, there is great interest in enumerating the processes that underpin normal and pathological intertemporal choices. While human studies are useful for identifying neural circuits and psychological processes involved in intertemporal decisions, rodent studies are able to extend those findings by disentangling the contributions of specific brain structures and systems. Thus, the aim of this minireview is to present a framework for understanding and advancing intertemporal choice research using rodent models. We begin by briefly describing conceptual and psychological perspectives on intertemporal choice. Then, we summarize rodent studies in order to review and evaluate the neural contributions of brain structures and systems. Finally, we close by highlighting open questions and making recommendations for future study.
2.0 Measuring intertemporal choice
Even though it is widely recognized that the goal of value-based decision making is to maximize value, it is difficult to predict an individual’s intertemporal choice because valuation is idiosyncratic and driven by the individual’s delay discounting rate. Thus, to study intertemporal decision making, researchers have developed laboratory tasks to measure individual delay discounting rates. The tasks, often called delay discounting tasks, record subjects’ preferences during a series of choices between small rewards available after little/no delay (usually immediately) and larger rewards preceded by a range of delays. During these tasks, human subjects are asked about their preferences during novel choices between food or money rewards paired with long delays (weeks, months, or years), whereas rodent subjects are well-trained to learn and express preferences for choices between food rewards associated with delays on the order of seconds. (In the rodent tasks, choices are typically presented with fixed delays in a prescribed order, i.e., increasing from 0 seconds, but there are versions with mixed delays and versions that allow rodents’ choices to determine the order.) From these tasks, subjects’ choice behavior (% choice of the large, delayed reward) is used to calculate delay discounting rates and identify decision making tendencies (Figure 1A). Individuals with rates higher than the group mean are identified as ‘impulsive’, whereas individuals with lower rates are identified as ‘patient’. Here, we use the term impulsive to refer only to impulsive choices and not impulsive actions.
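The discounting curves derived from these tasks are typically well fit by Mazur's (1997) hyperbolic model, V = A/(1 + kD), where A is reward amount, D is delay, and a single free parameter k captures an individual's discounting rate. As a minimal illustration (the numbers below are hypothetical, not data from any cited study), k can be recovered directly from an indifference point:

```python
def hyperbolic_value(amount, delay, k):
    """Mazur's (1997) hyperbolic model: subjective value = amount / (1 + k * delay)."""
    return amount / (1.0 + k * delay)

def k_from_indifference(small, large, indifference_delay):
    """At indifference, small (immediate) = large / (1 + k * D); solve for k."""
    return (large / small - 1.0) / indifference_delay

# Hypothetical example: a rat is indifferent between 1 pellet now
# and 4 pellets after 10 s. Higher k means steeper discounting
# (more 'impulsive'); lower k means more 'patient'.
k = k_from_indifference(small=1.0, large=4.0, indifference_delay=10.0)
print(k)  # ~0.3 per second
```

In practice, k is estimated by fitting the hyperbolic curve to choice proportions across many delays rather than from a single indifference point, but the logic is the same.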
Despite the differences in task structure, rodent delay discounting tasks are designed to measure the same process as human tasks. Although some have argued rodent tasks are not appropriate models for extending human research (see Hayden, 2015 and Killeen, 2011 for critical analyses), the similarities in results between species have justified their continued use. Not only do measures of human and rodent delay discounting rates both remain stable over time if measured under the same circumstances (Anokhin, Golosheykin, Grant, & Heath, 2011; Kirby, 2009; J. McClure, Podos, & Richardson, 2014), but there is significant overlap between the psychological and neural processes that have been identified using these tasks in both species (see fourth and fifth sections).
3.0 Conceptual basis of intertemporal choice
In light of how intertemporal decision making is measured, it is tempting to view choice as a single, standalone behavior. But it is more accurate to view it as a complex, multistep behavioral process. The conceptual model that best captures this divides the decision making process into five subprocesses that occur every time an animal encounters an intertemporal choice: decision representation, subjective valuation, action selection, outcome evaluation, and learning/updating (Rangel, Camerer, & Montague, 2008; Figure 1B). According to the model, animals must first represent the decision problem by determining the number and features (actions, delays, reward size) of the available options. Then, they integrate the feature information (reward and delay) to assign each option a subjective value, and next, they use that valuation to choose and perform the action(s) associated with the most valuable option. Finally, they compare the experienced value against their expected value and then update their decision representations, valuations, and choices based on what has or has not changed externally (actions, cues, delays, rewards, etc.) or internally (hunger, thirst, affect, etc.). By outlining the entire decision making process, the model expands our conception of intertemporal choice beyond just the choice itself to include an interconnected web of behavioral subprocesses.
Even though the subprocesses were initially proposed to highlight testable computational variables for neurobiological experiments, they also implicate psychological variables to be tested in behavioral experiments. Specifically, the fact that each subprocess is likely linked to numerous psychological processes (including learning, memory, perception, and motivation-related processes) raises the possibility that some or many of those processes are critical for intertemporal choice.
4.0 Psychological basis of intertemporal choice
To empirically determine which psychological variables support intertemporal choice, experimenters have taken two approaches. The first is to identify scenarios and manipulations that change (increase or decrease) delay discounting rates; the second is to identify correlations between psychological measures and delay discounting rates. To date, those approaches have been applied frequently to humans but only rarely to rodents, such that many psychological variables have been identified in human subjects (for excellent reviews, see Koffarnus et al., 2013; Lempert & Phelps, 2016; Peters & Büchel, 2011) and very few in rodent subjects. To hew to the focus of this minireview, in this section we discuss the three psychological variables that have been studied in both human and rodent subjects: timing, affect, and working memory. Not only do these examples illustrate how psychological variables can influence choice, but they also corroborate our claim that current rodent tasks can accurately model psychological aspects of human intertemporal choice.
First, timing or temporal perception has proven critical for human and rodent intertemporal choice. In both species, the more imprecise an individual’s timing ability, the higher their delay discounting rate (Baumann & Odum, 2012; Marshall, Smith, & Kirkpatrick, 2014; McClure et al., 2014). Consistent with that, when human subjects were directly trained to time more precisely, their discounting rates decreased (Peters & Büchel, 2011). Such observations indicate that imprecise timing of delays during the decision representation subprocess results in a higher discount rate when delay is integrated with reward during the valuation subprocess. This may also explain why many of the same human disorders associated with timing dysfunction are also associated with high discounting rates (Allman & Meck, 2012; Berlin, Rolls, & Kischka, 2004; Meck, 2005; Rubia, Halari, Christakou, & Taylor, 2009).
In addition to timing, affect can also be manipulated to reduce delay discounting across species. Human gamblers exhibited diminished discounting when large, delayed rewards were offered in the context of a positively conditioned cue (Dixon & Holton, 2009), just as rats’ discounting rates were lower when they were routinely trained in a task with a cue present during the delays that had acquired the positive properties of a conditioned reinforcer (Cardinal, Robbins, & Everitt, 2000; Zeeb, Floresco, & Winstanley, 2010). In both cases, the contextual cues alter delay discounting by changing the subjects’ affective responses to the delayed reward during the learning/updating subprocess. The presence of the positively associated cues strengthens reward associations and activates learned behavioral patterns, which together make the delayed option more valuable and more likely to be pursued.
Lastly, researchers have also successfully linked working memory to intertemporal choice in humans and rodents. Not only has working memory capacity been shown to negatively correlate with delay discounting in humans and rats (Renda, Stein, & Madden, 2015; Shamosh et al., 2008), but manipulating humans’ working memory by taxing or training it led to increased and decreased discounting rates, respectively (Hinson, Jameson, & Whitney, 2003; Wesley & Bickel, 2014). There is currently no consensus on the mechanism by which working memory impacts delay discounting, but one possibility is that working memory is required during the learning/updating subprocess to maintain active representations of actions across delays so that action-outcome associations are formed and values are properly learned and updated.
As mentioned, human research has identified additional psychological variables that have yet to receive attention from rodent researchers. Specifically, contextual and framing manipulations have shown that attention, feature (delay, reward) salience, and recent choice history play a role in intertemporal choice; while manipulations that changed the state of the decision maker have revealed the importance of motivation, mood, and stress. Currently, there is no evidence to suggest a relative weighting, in terms of importance, for the diverse set of psychological variables that have been identified. Rather, as a group, they deepen our understanding of the web of subprocesses that comprise the intertemporal decision making process (Figure 1B) and encourage the use of psychological approaches to treat pathological intertemporal choice.
5.0 Neural basis of intertemporal choice
To fully understand intertemporal choice, a complete functional characterization of the neural circuitry that underpins the process is also essential. First, the critical nodes of the neural circuit should be identified; and second, the contribution(s) of those structures/systems (both the computations or values they signal and the behavioral subprocesses they support) should be catalogued.
The initial work to understand the neural basis of intertemporal choice used fMRI to determine which structures were active when healthy human subjects made intertemporal decisions. These studies found BOLD activation of a large-scale neural network that included the prefrontal cortex, amygdala, and ventral striatum (Kable & Glimcher, 2007; McClure, Ericson, Laibson, Loewenstein, & Cohen, 2007; McClure, Laibson, Loewenstein, & Cohen, 2004). Beyond identifying a possible circuit, the observed activity patterns also motivated two interesting models of value encoding during intertemporal decisions. The first, the β-δ model, suggested that the network influenced choice through the relative activation of separate, distinct substructures that responded to either immediate or delayed options, with immediate-choice-promoting activity in limbic structures (β) and delayed-choice-promoting activity in prefrontal structures (δ) (McClure et al., 2007, 2004). The second model instead found correlates of subjective values (single, scaled values calculated from subjects’ choices) within both limbic and prefrontal structures (Kable & Glimcher, 2007). Those two models highlight a fundamental question about how value is computed in the brain during intertemporal choices: are dual values integrated at the network level, are subjective values represented within structures throughout the network, or is some combinatorial approach at play? (Peters & Büchel, 2011).
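The contrast between the two models can be made concrete. In the β-δ (quasi-hyperbolic) account, any delayed reward suffers an extra one-time penalty β on top of exponential discounting δ^D, whereas the single subjective-value account fits choices with one hyperbolic curve. A toy sketch (all parameter values and reward amounts below are illustrative, not fitted to any cited dataset):

```python
def beta_delta_value(amount, delay, beta=0.7, delta=0.95):
    """Quasi-hyperbolic (beta-delta) valuation: immediate rewards escape the
    extra beta penalty that applies to any delayed reward."""
    return amount if delay == 0 else beta * (delta ** delay) * amount

def subjective_value(amount, delay, k=0.1):
    """Single hyperbolic subjective value of the kind correlated with
    BOLD activity by Kable & Glimcher (2007)."""
    return amount / (1.0 + k * delay)

# The beta penalty produces the classic preference reversal:
# a reward of 10 now beats 15 in 5 days, but once a common 30-day
# front-end delay is added, the larger-later option wins.
assert beta_delta_value(10, 0) > beta_delta_value(15, 5)
assert beta_delta_value(10, 30) < beta_delta_value(15, 35)
```

Hyperbolic discounting produces the same kind of reversal through its curvature alone, which is why behavior often cannot distinguish the two accounts and neural data were brought to bear on the question.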
Human studies have not been able to fully answer that question, nor determine whether those signals are necessary for normal levels of choice, because such experiments require more invasive techniques. But rodent models have begun to shed some light on the answers. Targeted neural manipulations (lesions and inactivations) have been used to map the circuit (Figure 2). Additionally, single-unit recordings have been used to assess which values are computed and signaled in the brain and to characterize neural signals in terms of their likeliest behavioral/psychological contributions.
5.1 Prefrontal cortex monitors and adjusts intertemporal choices
5.1.1 Orbitofrontal cortex (OFC)
The OFC is the region of the prefrontal cortex (PFC) that has been most extensively studied for its role in intertemporal choice. Although humans with OFC lesions consistently exhibit impulsive choices (Bechara, Tranel, & Damasio, 2000; Berlin et al., 2004; Sellitto, Ciaramelli, & di Pellegrino, 2010), there are discrepancies among the reported effects of manipulating the rodent OFC. Following lesion or inactivation of the whole OFC, rodents’ discounting increased (Mobini et al., 2002; Rudebeck, Walton, Smyth, Bannerman, & Rushworth, 2006), decreased (Kheramin et al., 2002; Mar, Walker, Theobald, Eagle, & Robbins, 2011; Winstanley, 2004), or did not change (Abela & Chudasama, 2013; Churchwell, Morris, Heurtelou, & Kesner, 2009; Jo, Kim, Lee, & Jung, 2013; Mariano et al., 2009; Moschak & Mitchell, 2014). The reasons for the inconsistencies are not fully known, but one possibility is that the OFC was differentially engaged depending on the experimental design. Specifically, differences might have arisen because rats encountered a limited number of delays (Churchwell et al., 2009); a mixed delay order (Jo et al., 2013); no pre-operative training (Abela & Chudasama, 2013); lengthy post-operative training (Mar et al., 2011); a cue during the delay-to-reinforcement (Mariano et al., 2009; Zeeb et al., 2010); or interactions with their basal levels of impulsivity (Zeeb et al., 2010). A second possibility is that the medial and lateral portions of the OFC exert heterogeneous influence on intertemporal choice. Mar et al. (2011) showed that medial OFC lesions decreased rodent discounting while lateral OFC lesions increased it; however, it took 6 days of post-lesion retraining for those effects to become observable, and inactivation of the medial OFC by another group failed to change rodent discounting rates (Stopper, Green, & Floresco, 2014).
Single-unit electrophysiological recordings have furthered our understanding of why the OFC is necessary for normal intertemporal decisions by pinpointing OFC signals that likely underpin its contribution. Specifically, two types of OFC reward signals were recorded in rats during a chamber-based intertemporal choice task in which an odor cue and the spatial location of a reward indicated whether the delay preceding the reward would be short or long. The majority of OFC neurons exhibited reward-evoked firing that reflected the delay-discounted reward value, meaning they fired more strongly to the more preferred, immediate rewards than to the delayed rewards (immediate-preferring neurons); a second, smaller subset of neurons showed sustained increases in firing during the delay to the large reward, i.e., a larger response to delayed rewards (delay-preferring neurons) (Roesch, Taylor, & Schoenbaum, 2006). However, neither response type could be categorized as encoding a common value because neither also responded differentially to reward size (i.e., immediate-preferring neurons did not fire more for the large reward when it was offered without delay). The authors concluded that the two reward representations support two separate processes: the delay-discounted signal broadcasts the reduced value of delayed rewards and promotes choice of the immediate reward, while the ramping signal represents an outcome expectancy signal that tracks the delay, facilitates and updates associative representations and outcome predictions in other brain areas, and promotes choice of the delayed reward (Roesch, Calu, Burke, & Schoenbaum, 2007). In support of the suggestion that OFC neurons signal outcome expectancy during intertemporal decisions, Stott and Redish (2014) observed that OFC neuron ensembles signaled covert reward representations just after the choice point of a spatial adjusted delay discounting task.
Consistent with the previously mentioned human study showing that the degree of OFC population activation correlated with choice of delayed rewards (McClure et al., 2004), there is evidence to suggest that the two reward representations in the rodent OFC work together to value options and guide choice. When the two reward signals were identified, the overall OFC representation favored the immediate rewards (the majority of the recorded cells were immediate-preferring) and was accompanied by a behavioral preference for the immediate rewards. If the overall OFC reward representation is responsible for guiding choice, then neural manipulations that change choices during intertemporal decisions probably do so by changing the balance of the two reward representations in the OFC (increasing or decreasing the proportion of delay-preferring reward representations). This hypothesis has been tested once, by recording OFC neurons in aged rats that are believed to have OFC dysfunction because they exhibit age-related changes in OFC-mediated behaviors. Consistent with the hypothesis, the aged rats displayed an increased preference for delayed rewards that was accompanied by an increased percentage of delay-preferring neurons and no change in the overall percentage of delay-sensitive reward neurons (Roesch, Bryden, Cerri, Haney, & Schoenbaum, 2012).
5.1.2 Medial prefrontal cortex (mPFC)
The mPFC is a region of the PFC that has been less studied during intertemporal choice because it does not appear necessary for valuation/choice during delay discounting tasks. Although several human imaging studies have observed activation of the mPFC during intertemporal choice tasks (Ballard & Knutson, 2009; Peters & Büchel, 2011), temporarily silencing the mPFC in rodents does not affect discounting during a within-session delay discounting task (Feja & Koch, 2014). When the mPFC was instead lesioned, animals’ discounting curves were flattened: their choice of the large reward was reduced at short delays and increased at long delays (Cardinal, Pennicott, Lakmali, Robbins, & Everitt, 2001). At first glance, that might seem attributable to a change in discounting; but given that the lesion also failed to impact reward discrimination, and that temporary mPFC inactivation had no effect on discounting, it is more likely that the mPFC contributes to timing or working memory rather than valuation. Since the mPFC is known to track interval durations (Kim, Ghim, Lee, & Jung, 2013; Meck, Church, Wenk, & Olton, 1987) and to support tolerance of delays during reaction time tasks (Narayanan, Horst, & Laubach, 2006), it is probable that the mPFC monitors delay information that in turn helps animals determine when to pursue rewards. Additionally, there is a rich literature implicating the mPFC in working memory processes that support the linking of actions with outcomes over delays (Yang, Shi, Wang, Peng, & Li, 2014).
One explanation for the discrepancy between the lesion and inactivation data concerns temporal memory. Animals with the mPFC temporarily inactivated during a single test session can rely on well-established delay-reward associations, which remain supported by stored temporal memory information from other neural systems (Matell & Meck, 2000). In contrast, animals with a lesioned mPFC, whose delay-reward associations are likely inappropriately updated during retraining (probably lengthened, according to the effects of mPFC lesions during interval timing tasks (Meck et al., 1987)), no longer base their behavior on accurate temporal information. Consistent with a timing contribution, temporary disconnection of the mPFC and the basolateral amygdala (BLA) reduced waiting tolerance for a single 15-s delayed reward on a T-maze choice task during which the small, immediate reward was constantly available (Churchwell et al., 2009). Finally, another explanation for the discrepancy may be procedural. Lesioning the mPFC is known to impair reversal learning (Salazar, White, Lacroix, Feldon, & White, 2004), so it may be that the adjustment or updating required between days (from the longest delay block back to the zero delay block) is too extreme to accomplish without an active mPFC and inadvertently drives down preference for the large reward during the first block.
5.2 Amygdala-ventral striatal circuits promote choice of large, delayed rewards
5.2.1 Basolateral amygdala (BLA)
Even though the amygdala is primarily studied for processing negative emotions and linking environmental stimuli to aversive sensory experiences, the amygdala, and the BLA in particular, is also known to play an essential role in positive emotions and reward-related behavior (Baxter & Murray, 2002). Thus, it is not surprising that the BLA is required for intertemporal choice. Specifically, the BLA appears to bias choice toward the large, delayed reward, because BLA-lesioned rodents shifted their choice away from the large reward during a within-session delay discounting task (Winstanley, 2004). Their increased discounting was not attributable to an inability to discriminate reward sizes or to remember the stimulus (lever)-reward association, for two reasons: first, BLA-lesioned rodents in the same study retained their preference for the large reward when it was immediately available; and second, previous studies have shown the BLA is not necessary for appetitive Pavlovian conditioning (Parkinson, Robbins, & Everitt, 2000).
However, because BLA recordings have yet to be performed during a rodent delay discounting task, the precise role of the BLA in intertemporal choice remains unknown. The most compelling hypothesis is that the BLA contributes through its role in assigning incentive or motivational significance to instrumental cues, just as it does in other tasks. BLA neurons recorded in rodents during instrumental tasks exhibited cue- and delay-related activity that responded differentially based on the motivational significance of the stimuli (Schoenbaum, Chiba, & Gallagher, 1998). Additionally, lesion studies have found that the BLA is required for animals to adjust their instrumental responses to conditioned cues when faced with changes in affective information, such as conditioned reinforcers, reinforcer devaluation, or extinction learning (Hatfield, Han, Conley, Gallagher, & Holland, 1996; Hitchcott & Phillips, 1998; McLaughlin & Floresco, 2007; Schoenbaum, Roesch, & Stalnaker, 2009). Therefore, BLA-lesioned animals probably decreased their choice of the large reward because they no longer had access to incentive signals that would normally invigorate them to pursue those rewards despite the associated delays.
5.2.2 Nucleus accumbens (NAc)
The role of the NAc in intertemporal choice has received more attention than that of the BLA. Time and again, irrespective of the task used to measure delay discounting, rats with NAc core lesions displayed increased discounting that was not linked to altered reward sensitivity or satiety (Bezzina et al., 2007; Cardinal et al., 2001; da Costa Araujo et al., 2009; Valencia-Torres et al., 2012). Curiously, when the NAc core was instead temporarily inactivated during the within-session version of the delay discounting task, the effect was the opposite: rats exhibited decreased discounting (Moschak & Mitchell, 2014). While it is possible that temporary silencing of the NAc core affects choice differently than permanent lesions do, it is more likely that this result is an outlier, attributable to the fact that the dose of the GABA agonist used was low enough to, in the authors’ own words, only ‘partially’ inactivate the structure. Overall, the data indicate that the NAc works with the BLA and OFC to promote choice of the large reward. Evidence for the OFC-NAc projection’s involvement comes from a disconnection study showing that serial information transfer between the OFC and the NAc core is required to maintain normal levels of delayed reward choices and avoid excessive discounting (Bezzina et al., 2008).
Recording studies have observed several neural responses within the NAc that further detail its contribution to intertemporal choice. Population-level analyses of the NAc across species have identified reward representations that likely bias choice during intertemporal choices. Human fMRI studies found subjective value correlates as well as activity that was predictive of the choice of immediate rewards (Ballard & Knutson, 2009; Kable & Glimcher, 2007; McClure et al., 2004), while more recent recordings of rodent NAc ensembles at the choice point of a spatial adjusted delay task uncovered signals that combined reward value and chosen action: reward representations that not only distinguished reward size but also preferentially responded to the chosen reward over the unchosen reward (Stott & Redish, 2014). Integration of value and action information has also been seen at the single-cell level in the NAc. In a second rodent study, performed during a chamber-based intertemporal choice task with odor cues associated with short or long delayed rewards, NAc cells exhibited cue-evoked firing that was both reflective of delay-discounted value and sensitive to whether the cue-associated action was performed; for example, they fired more strongly to cues associated with more preferred, immediate rewards, but only when the nosepoke action was performed in pursuit of the immediate reward (Roesch, Singh, Brown, Mullins, & Schoenbaum, 2009). In addition to responding preferentially to cues whose associated rewards were obtained, the cue responses were also correlated with the speed of the nosepoke action. Thus, the study directly demonstrated how an action-dependent cue signal in the NAc core can trigger downstream motor targets and encourage the animal to perform actions in pursuit of a given reward.
Said another way, the study is a great example of how the NAc functions as a ‘limbic-motor interface’ able to integrate reward, value, and cue information during intertemporal choice tasks (Mogenson, Jones, & Yim, 1980).
5.3 Hippocampus provides temporal and prospective information
The final cortico-limbic structure currently implicated in intertemporal choice is the hippocampus (HPC). In several separate within-session delay discounting experiments, HPC lesions caused rats to decrease their choice of the large, delayed reward (Abela & Chudasama, 2013; Cheung & Cardinal, 2005; Mariano et al., 2009; McHugh, Campbell, Taylor, Rawlins, & Bannerman, 2008). The increased discounting was not specific to either subregion, as lesions of both the dorsal and ventral HPC produced the same impairment (McHugh et al., 2008); nor could the effect be interpreted as stemming from deficits in magnitude or spatial discrimination or, by extension, memory (Mariano et al., 2009). A slightly different, yet complementary effect was observed in the spatial adjusted delay task: even though HPC-lesioned animals were eventually able to titrate around a similar indifference delay as controls, their training took much longer and their preference for the large reward was much more variable than that of controls (Bett, Murdoch, Wood, & Dudchenko, 2015). Finally, HPC-lesioned rodents faced with a choice between an immediate, uncertain reward and a delayed, certain reward were less tolerant of delayed rewards, shifting their preference from certain reinforcement to less certain but immediate reinforcement (Rawlins, Feldon, & Butt, 1985). Taken together, the lesion data suggest that the HPC is required for animals to tolerate delays in order to obtain larger rewards.
A precise functional characterization of HPC neurons during a rodent delay discounting task has yet to be performed, but based on related observations, it seems likely that the HPC contributes both temporal and prospective information to intertemporal choice. Evidence for the former role comes from the fact that HPC neurons in both primates and rodents encode the duration of elapsed time (MacDonald, Lepage, Eden, & Eichenbaum, 2011; Naya & Suzuki, 2011). That said, HPC lesions failed to affect timing behavior in some studies (Dietrich & Allen, 1998; Dietrich, Allen, & Bunnell, 1997; Port, Romano, Steinmetz, Mikhail, & Patterson, 1986; Rawlins, Winocur, & Gray, 1983), while others reported that lesions to the HPC or fimbria/fornix shortened duration estimates (Meck, 1988; Meck, Church, & Olton, 2013; Olton, Meck, & Church, 1987); thus, it is not clear what the exact role of the HPC in timing behavior is, or whether it is even required (Yin & Troger, 2011).
The role of the HPC in prospection is more straightforward, stemming from complementary human and rodent studies. Behavioral psychologists have long considered the HPC an important component of the imagery and prospection network because it is engaged when individuals ‘self-project’ or ‘mentally time travel’ to imagine or predict future events like reinforcement (Peters & Büchel, 2011; Peters & Büchel, 2010). In fact, when such ‘future thinking’ was stimulated by a behavioral intervention that encouraged episodic prospection during a human intertemporal choice task, subjects discounted less and their fMRI BOLD signals revealed enhanced PFC-HPC coupling (Peters & Büchel, 2010). Additionally, at decision points in a maze-based decision task, rodent HPC neuron ensembles are known to show forward-sweeping spatial representations that 1) are sensitive to task demands and experience and 2) likely signal the potential forward paths and associated actions and choices (Johnson & Redish, 2007; van der Meer, Johnson, Schmitzer-Torbert, & Redish, 2010).
5.4 The dopamine system transmits expected value information throughout the decision circuit
The dopamine (DA) system is best known for its ability to broadcast reward information throughout the brain, including to the structures described above. DA neurons fire in response to predictive cues and encode the magnitude of future rewards (Schultz, 2006). In fact, direct stimulation of DA neurons can invigorate reward-seeking behavior (Phillips, Stuber, Heien, Wightman, & Carelli, 2003), just as inactivation of DA neurons can impair animals' ability to respond to reward-predicting cues (Yun, Wakabayashi, Fields, & Nicola, 2004). Beyond responding to cues that predict rewards, DA neurons continue to track reward information: they fire at higher rates for greater-than-expected rewards and at lower rates for smaller-than-expected rewards, a phenomenon called reward prediction error signaling (Schultz, 2006). Finally, recordings made during two rodent delay discounting tasks showed that DA neurons' cue responses integrate delay information and signal expected value: in this case, a single value signal that encodes both reward size and delay-discounted value (Day, Jones, Wightman, & Carelli, 2010; Roesch, Calu, & Schoenbaum, 2007). Thus, in addition to contributing to outcome evaluation, the phasic responses of DA neurons provide expected value information that is transmitted throughout the decision circuit, where it can update outcome predictions, modulate other decision signals, and support normal levels of delay discounting.
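The two computations attributed to DA neurons above can be made concrete with a minimal sketch. This is an illustration, not a model from any of the cited studies: subjective value is assumed to decline hyperbolically with delay (the standard form V = A/(1 + kD), with a hypothetical discount rate k), and the prediction error is reduced to a simple received-minus-expected difference.

```python
def discounted_value(amount, delay, k=0.1):
    """Hyperbolic subjective value of a delayed reward; k is an assumed discount rate."""
    return amount / (1.0 + k * delay)

def reward_prediction_error(received, expected):
    """Positive when the obtained reward exceeds expectation, negative when it falls short."""
    return received - expected

# With a high enough discount rate, a cue predicting 4 units after a 10-s delay
# carries less expected value than one predicting 2 units immediately.
delayed_value = discounted_value(4.0, 10.0, k=0.5)    # 4 / (1 + 0.5*10) ≈ 0.667
immediate_value = discounted_value(2.0, 0.0, k=0.5)   # 2.0

# If the animal then receives a full unit of reward, the prediction error is positive,
# consistent with elevated phasic DA firing for better-than-expected outcomes.
rpe = reward_prediction_error(received=1.0, expected=delayed_value)
```

The point of the sketch is that a single cue response can carry both magnitude and delay information, because both enter the same scalar value.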
Pharmacological manipulations confirm that DA neurotransmission also influences intertemporal choice through tonic signaling. Blocking DA receptors with the antagonist flupenthixol caused animals to increase discounting, whereas stimulating DA receptors with amphetamine or a DA reuptake inhibitor decreased delay discounting (Floresco, Tse, & Ghods-Sharifi, 2008; Winstanley, Dalley, Theobald, & Robbins, 2003). Since systemic injection of a D1, but not a D2, antagonist also increases discounting, it is likely that DA biases choice toward the delayed reward through global D1 receptor activation (van Gaalen, van Koten, Schoffelmeer, & Vanderschuren, 2006). Additionally, through its capacity as a neuromodulator, the DA system can influence important signals and computations locally by modifying activity within single structures. So far, it has been reported that DA input into the OFC, but not the NAc, is necessary for normal levels of choice, as eliminating DA input with 6-hydroxydopamine in the OFC, but not the NAc, caused rats to decrease choice of the delayed reward (Kheramin et al., 2004; Winstanley, Theobald, Dalley, & Robbins, 2005). Finally, two experiments found that blocking both D1 and D2 receptors in the OFC and mPFC can increase discounting (Loos et al., 2010; Zeeb et al., 2010).
5.5 A cortico-limbic-striatal circuit supports normal intertemporal choice
Human research suggests that a neural network involving the prefrontal cortex, amygdala, and ventral striatum is engaged during intertemporal choices. The use of targeted neural manipulations and single-unit recordings in rodent models has confirmed that those structures, as well as the hippocampus and dopamine system, are indeed critical nodes of the decision circuit (Figure 2). Furthermore, the diversity of single-cell responses recorded thus far suggests that many values are computed at choice, favoring a combinatorial valuation model rather than the dual or subjective value models championed by some human researchers. Although much more work is needed to characterize the single-cell and population activity of every node throughout the decision making process (rather than focusing on choice points), the current data set contains enough information for us to begin linking brain structures to psychological variables and populating our conceptual model. Our current conception of how and when different nodes of the circuit are engaged during intertemporal choices is as follows: relying on learned associations and values stored in a similar neural circuit, HPC cells are recruited to assist with decision representation, both by timing delay lengths and by sweeping forward in time to shape accurate outcome expectations; next, DA and BLA cells compute subjective values, and NAc cells help transform those value signals into meaningful action values; finally, once the outcomes are obtained, DA neurons evaluate expectation errors and update values with the help of OFC expectancy signals.
6.0 Summary and recommendations for future rodent intertemporal choice studies
Intertemporal choices are ubiquitous and consequential to human life, and this has motivated an interest in delineating the complex processes that underpin normal and pathological intertemporal decisions. While the use of human and rodent models of intertemporal choice has permitted some advancement of our understanding of those processes, the field is still in its infancy. To clarify the problem, we have advocated for a broader conception of intertemporal choice, one that acknowledges it as a multistep decision making process rather than a standalone behavior, and have 1) drawn connections between neural and psychological processes and 2) speculated about when/how those processes are engaged during decision making. In this section, we briefly discuss our recommendations for future studies.
First, we recommend that researchers continue to investigate the similarities and differences between the human and rodent intertemporal choice tasks used today. As mentioned above, an important question is whether rodent tasks are truly comparable to human tasks (and by extension real-life experiences), in which choices are typically varied and unpredictable. The question is premised on the suspicion that current tasks, although intended to model the same behavior by generating measures of delay discounting, actually model distinct behavioral processes that engage different psychological and neural processes. It should therefore be addressed by systematically changing rodent task parameters to better approximate human choice tasks (fixed, adjusted, or intermixed delay presentations) and assessing 1) whether the same neural circuit and neurotransmitters are required during the modified tasks and 2) whether the same correlations between delay discounting and psychological measures are observed.
Next, we encourage further efforts to interpret the activity and computations of the circuit at both the single-cell and population levels. As evidenced by the literature discussed here, neural recordings have thus far focused mostly on how value is computed at the time of choice, with some attention to the time of the outcome. Such a narrow focus, however, might limit our understanding of which structures and signals are involved at each stage of the decision making process. Once the signals are more thoroughly mapped in time, it will be possible to apply optogenetic techniques to assess the necessity of particular signals and identify redundancies in the system.
Third, more attention should be paid to the role of neurotransmitters. In addition to DA, the neurotransmitters serotonin, norepinephrine, and acetylcholine are anatomically and functionally positioned to influence the decision circuit. Importantly, each is implicated in the etiology and treatment of neuropsychiatric disorders characterized by impulsive choice and is known to subserve a unique array of psychological processes. Yet few studies have explored the contribution of their global and local neurotransmission (global: (Mobini et al., 2000; Winstanley et al., 2005) for serotonin; (van Gaalen et al., 2006) for norepinephrine; and (Mendez, Gilbert, Bizon, & Setlow, 2012; Scattoni, Adriani, Calamandrei, Laviola, & Ricceri, 2006; Tian, Qin, Sun, Li, & Wei, 2016) for acetylcholine; local: only serotonin (Darna et al., 2015)). Such experiments may inform our understanding of those neuropsychiatric disorders and/or lead to pharmacological treatments for improving intertemporal choices.
Our final recommendation is that the field focus on extending our understanding of the dynamic interplay between the structures of the circuit. Multi-site recordings and inactivations (disconnection studies via pharmacologic, chemogenetic, or optogenetic means) should be undertaken to 1) map the direction and hierarchy of signal transfer through projections between structures and 2) catalog how signals work alone and in concert to encode the computations necessary for normal levels of intertemporal choice.
Highlights.
Conceptual and psychological insights into intertemporal choice
Functional roles proposed for cortico-limbic-striatal circuit involved in intertemporal choice
Justifications for rodent models of intertemporal choice
Acknowledgments
Supported by 5T32GM007108-40 to WCF, and NIMH grant MH58755 to SJYM
References
- Abela AR, Chudasama Y. Dissociable contributions of the ventral hippocampus and orbitofrontal cortex to decision-making with a delayed or uncertain outcome. European Journal of Neuroscience. 2013;37(4):640–647. doi: 10.1111/ejn.12071. http://doi.org/10.1111/ejn.12071. [DOI] [PubMed] [Google Scholar]
- Ainslie G. Specious reward: a behavioral theory of impulsiveness and impulse control. Psychological Bulletin. 1975;82(4):463–496. doi: 10.1037/h0076860. http://doi.org/10.1037/h0076860. [DOI] [PubMed] [Google Scholar]
- Allman MJ, Meck WH. Pathophysiological distortions in time perception and timed performance. Brain. 2012;135(3):656–677. doi: 10.1093/brain/awr210. http://doi.org/10.1093/brain/awr210. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Anokhin AP, Golosheykin S, Grant JD, Heath AC. Heritability of delay discounting in adolescence: A longitudinal twin study. Behavior Genetics. 2011;41(2):175–183. doi: 10.1007/s10519-010-9384-7. http://doi.org/10.1007/s10519-010-9384-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ballard K, Knutson B. Dissociable neural representations of future reward magnitude and delay during temporal discounting. NeuroImage. 2009;45(1):143–150. doi: 10.1016/j.neuroimage.2008.11.004. http://doi.org/10.1016/j.neuroimage.2008.11.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Baumann AA, Odum AL. Impulsivity, risk taking, and timing. Behavioural Processes. 2012;90(3):408–414. doi: 10.1016/j.beproc.2012.04.005. http://doi.org/10.1016/j.beproc.2012.04.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Baxter MG, Murray EA. The amygdala and reward. Nature Reviews Neuroscience. 2002;3(7):563–573. doi: 10.1038/nrn875. http://doi.org/10.1038/nrn875. [DOI] [PubMed] [Google Scholar]
- Bechara A, Tranel D, Damasio H. Characterization of the decision-making deficit of patients with ventromedial prefrontal cortex lesions. Brain. 2000;123(11):2189–2202. doi: 10.1093/brain/123.11.2189. http://doi.org/10.1093/brain/123.11.2189. [DOI] [PubMed] [Google Scholar]
- Berlin HA, Rolls ET, Kischka U. Impulsivity, time perception, emotion and reinforcement sensitivity in patients with orbitofrontal cortex lesions. Brain. 2004;127(5):1108–1126. doi: 10.1093/brain/awh135. http://doi.org/10.1093/brain/awh135. [DOI] [PubMed] [Google Scholar]
- Bett D, Murdoch LH, Wood ER, Dudchenko PA. Hippocampus, Delay discounting, And vicarious trial-and-error. Hippocampus. 2015;25(5):643–654. doi: 10.1002/hipo.22400. http://doi.org/10.1002/hipo.22400. [DOI] [PubMed] [Google Scholar]
- Bezzina G, Body S, Cheung THC, Hampson CL, Bradshaw CM, Szabadi E, … Deakin JFW. Effect of disconnecting the orbital prefrontal cortex from the nucleus accumbens core on inter-temporal choice behaviour: A quantitative analysis. Behavioural Brain Research. 2008;191(2):272–279. doi: 10.1016/j.bbr.2008.03.041. http://doi.org/10.1016/j.bbr.2008.03.041. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bezzina G, Cheung THC, Asgari K, Hampson CL, Body S, Bradshaw CM, … Anderson IM. Effects of quinolinic acid-induced lesions of the nucleus accumbens core on inter-temporal choice: A quantitative analysis. Psychopharmacology. 2007;195(1):71–84. doi: 10.1007/s00213-007-0882-0. http://doi.org/10.1007/s00213-007-0882-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cardinal RN, Pennicott DR, Lakmali C, Robbins TW, Everitt BJ. Impulsive choice induced in rats by lesions of the nucleus accumbens core. Science. 2001;292(5526):2499–2501. doi: 10.1126/science.1060818. http://doi.org/10.1126/science.1060818. [DOI] [PubMed] [Google Scholar]
- Cardinal RN, Robbins TW, Everitt BJ. The effects of d-amphetamine, chlordiazepoxide, α-flupenthixol and behavioural manipulations on choice of signalled and unsignalled delayed reinforcement in rats. Psychopharmacology. 2000;152(4):362–375. doi: 10.1007/s002130000536. http://doi.org/10.1007/s002130000536. [DOI] [PubMed] [Google Scholar]
- Cheung TH, Cardinal RN. Hippocampal lesions facilitate instrumental learning with delayed reinforcement but induce impulsive choice in rats. BMC Neuroscience. 2005;6(1):36. doi: 10.1186/1471-2202-6-36. http://doi.org/10.1186/1471-2202-6-36. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Churchwell JC, Morris AM, Heurtelou NM, Kesner RP. Interactions between the prefrontal cortex and amygdala during delay discounting and reversal. Behavioral Neuroscience. 2009;123(6):1185–96. doi: 10.1037/a0017734. http://doi.org/10.1037/a0017734. [DOI] [PMC free article] [PubMed] [Google Scholar]
- da Costa Araujo S, Body S, Hampson CL, Langley RW, Deakin JFW, Anderson IM, … Szabadi E. Effects of lesions of the nucleus accumbens core on intertemporal choice: Further observations with an adjusting-delay procedure. Behavioural Brain Research. 2009;202(2):272–277. doi: 10.1016/j.bbr.2009.04.003. http://doi.org/10.1016/j.bbr.2009.04.003. [DOI] [PubMed] [Google Scholar]
- Darna M, Chow JJ, Yates JR, Charnigo RJ, Beckmann JS, Bardo MT, Dwoskin LP. Role of serotonin transporter function in rat orbitofrontal cortex in impulsive choice. Behavioural Brain Research. 2015;293:134–42. doi: 10.1016/j.bbr.2015.07.025. http://doi.org/10.1016/j.bbr.2015.07.025. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Day JJ, Jones JL, Wightman RM, Carelli RM. Phasic nucleus accumbens dopamine release encodes effort- and delay-related costs. Biological Psychiatry. 2010;68(3):306–309. doi: 10.1016/j.biopsych.2010.03.026. http://doi.org/10.1016/j.biopsych.2010.03.026. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dietrich A, Allen JD. Functional dissociation of the prefrontal cortex and the hippocampus in timing behavior. Behavioral Neuroscience. 1998;112(5):1043–7. doi: 10.1037//0735-7044.112.5.1043. http://doi.org/10.1037/0735-7044.112.5.1043. [DOI] [PubMed] [Google Scholar]
- Dietrich A, Allen JD, Bunnell BN. Is the hippocampus involved in temporal discrimination and the memory of short intervals? International Journal of Neuroscience. 1997;90(3–4):255–269. doi: 10.3109/00207459709000642. http://doi.org/10.3109/00207459709000642. [DOI] [PubMed] [Google Scholar]
- Dixon MR, Holton B. Altering the Magnitude of Delay Discounting By Pathological Gamblers. Journal of Applied Behavior Analysis. 2009;42(2):269–275. doi: 10.1901/jaba.2009.42-269. http://doi.org/10.1901/jaba.2009.42-269. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Feja M, Koch M. Ventral medial prefrontal cortex inactivation impairs impulse control but does not affect delay-discounting in rats. Behavioural Brain Research. 2014;264:230–239. doi: 10.1016/j.bbr.2014.02.013. http://doi.org/10.1016/j.bbr.2014.02.013. [DOI] [PubMed] [Google Scholar]
- Floresco SB, Tse MTL, Ghods-Sharifi S. Dopaminergic and glutamatergic regulation of effort- and delay-based decision making. Neuropsychopharmacology : Official Publication of the American College of Neuropsychopharmacology. 2008;33(8):1966–79. doi: 10.1038/sj.npp.1301565. http://doi.org/10.1038/sj.npp.1301565. [DOI] [PubMed] [Google Scholar]
- Hardisty DJ, Weber EU. Discounting future green: Money versus the environment. Journal of Experimental Psychology. General. 2009;138(3):329–340. doi: 10.1037/a0016433. http://doi.org/10.1037/a0016433. [DOI] [PubMed] [Google Scholar]
- Hatfield T, Han JS, Conley M, Gallagher M, Holland P. Neurotoxic lesions of basolateral, but not central, amygdala interfere with Pavlovian second-order conditioning and reinforcer devaluation effects. The Journal of Neuroscience : The Official Journal of the Society for Neuroscience. 1996;16(16):5256–5265. doi: 10.1523/JNEUROSCI.16-16-05256.1996. http://doi.org/http://www.jneurosci.org/content/16/16/5256. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hayashi Y, Russo CT, Wirth O. Texting while driving as impulsive choice: A behavioral economic analysis. Accident Analysis and Prevention. 2015;83:182–189. doi: 10.1016/j.aap.2015.07.025. http://doi.org/10.1016/j.aap.2015.07.025. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hayden BY. Time discounting and time preference in animals: A critical review. Psychonomic Bulletin & Review. 2015 doi: 10.3758/s13423-015-0879-3. http://doi.org/10.3758/s13423-015-0879-3. [DOI] [PubMed]
- Hinson JM, Jameson TL, Whitney P. Impulsive decision making and working memory. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2003;29(2):298–306. doi: 10.1037/0278-7393.29.2.298. http://doi.org/10.1037/0278-7393.29.2.298. [DOI] [PubMed] [Google Scholar]
- Hitchcott PK, Phillips GD. Double dissociation of the behavioural effects of R(+) 7-OH-DPAT infusions in the central and basolateral amygdala nuclei upon Pavlovian and instrumental conditioned appetitive behaviours. Psychopharmacology. 1998;140(4):458–469. doi: 10.1007/s002130050790. http://doi.org/10.1007/s002130050790. [DOI] [PubMed] [Google Scholar]
- Ho MY, Mobini S, Chiang TJ, Bradshaw CM, Szabadi E. Theory and method in the quantitative analysis of “impulsive choice” behaviour: Implications for psychopharmacology. Psychopharmacology. 1999;146(4):362–372. doi: 10.1007/pl00005482. http://doi.org/10.1007/PL00005482. [DOI] [PubMed] [Google Scholar]
- Jo S, Kim KU, Lee D, Jung MW. Effect of orbitofrontal cortex lesions on temporal discounting in rats. Behavioural Brain Research. 2013;245:22–28. doi: 10.1016/j.bbr.2013.02.014. http://doi.org/10.1016/j.bbr.2013.02.014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Johnson A, Redish AD. Neural ensembles in CA3 transiently encode paths forward of the animal at a decision point. The Journal of Neuroscience : The Official Journal of the Society for Neuroscience. 2007;27(45):12176–12189. doi: 10.1523/JNEUROSCI.3761-07.2007. http://doi.org/10.1523/JNEUROSCI.3761-07.2007. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kable JW, Glimcher PW. The neural correlates of subjective value during intertemporal choice. Nature Neuroscience. 2007;10(12):1625–1633. doi: 10.1038/nn2007. http://doi.org/10.1038/nn2007. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kheramin S, Body S, Ho MY, Velázquez-Martinez DN, Bradshaw CM, Szabadi E, … Anderson IM. Effects of orbital prefrontal cortex dopamine depletion on inter-temporal choice: A quantitative analysis. Psychopharmacology. 2004;175(2):206–214. doi: 10.1007/s00213-004-1813-y. http://doi.org/10.1007/s00213-004-1813-y. [DOI] [PubMed] [Google Scholar]
- Kheramin S, Body S, Mobini S, Ho MY, Velázquez-Martinez DN, Bradshaw CM, … Anderson IM. Effects of quinolinic acid-induced lesions of the orbital prefrontal cortex on inter-temporal choice: A quantitative analysis. Psychopharmacology. 2002;165(1):9–17. doi: 10.1007/s00213-002-1228-6. http://doi.org/10.1007/s00213-002-1228-6. [DOI] [PubMed] [Google Scholar]
- Killeen PR. Models of trace decay, eligibility for reinforcement, and delay of reinforcement gradients, from exponential to hyperboloid. Behavioural Processes. 2011;87(1):57–63. doi: 10.1016/j.beproc.2010.12.016. http://doi.org/10.1016/j.beproc.2010.12.016. [DOI] [PubMed] [Google Scholar]
- Kim J, Ghim JW, Lee JH, Jung MW. Neural Correlates of Interval Timing in Rodent Prefrontal Cortex. The Journal of Neuroscience : The Official Journal of the Society for Neuroscience. 2013;33(34):13834–13847. doi: 10.1523/JNEUROSCI.1443-13.2013. http://doi.org/10.1523/JNEUROSCI.1443-13.2013. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kirby KN. One-year temporal stability of delay-discount rates. Psychonomic Bulletin & Review. 2009;16(3):457–462. doi: 10.3758/PBR.16.3.457. http://doi.org/10.3758/PBR.16.3.457. [DOI] [PubMed] [Google Scholar]
- Kirby KN, Winston GC, Santiesteban M. Impatience and grades: Delay-discount rates correlate negatively with college GPA. Learning and Individual Differences. 2005;15(3):213–222. http://doi.org/10.1016/j.lindif.2005.01.003. [Google Scholar]
- Koffarnus MN, Jarmolowicz DP, Mueller ET, Bickel WK. Changing delay discounting in the light of the competing neurobehavioral decision systems theory: A review. Journal of the Experimental Analysis of Behavior. 2013;99(1):32–57. doi: 10.1002/jeab.2. http://doi.org/10.1002/jeab.2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lempert KM, Phelps EA. The Malleability of Intertemporal Choice. Trends in Cognitive Sciences. 2016;20(1):64–74. doi: 10.1016/j.tics.2015.09.005. http://doi.org/10.1016/j.tics.2015.09.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Loos M, Pattij T, Janssen MCW, Counotte DS, Schoffelmeer ANM, Smit AB, … Van Gaalen MM. Dopamine receptor D1/D5 gene expression in the medial prefrontal cortex predicts impulsive choice in rats. Cerebral Cortex. 2010;20(5):1064–1070. doi: 10.1093/cercor/bhp167. http://doi.org/10.1093/cercor/bhp167. [DOI] [PubMed] [Google Scholar]
- MacDonald CJ, Lepage KQ, Eden UT, Eichenbaum H. Hippocampal “time cells” bridge the gap in memory for discontiguous events. Neuron. 2011;71(4):737–749. doi: 10.1016/j.neuron.2011.07.012. http://doi.org/10.1016/j.neuron.2011.07.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mar AC, Walker ALJ, Theobald DE, Eagle DM, Robbins TW. Dissociable effects of lesions to orbitofrontal cortex subregions on impulsive choice in the rat. The Journal of Neuroscience : The Official Journal of the Society for Neuroscience. 2011;31(17):6398–404. doi: 10.1523/JNEUROSCI.6620-10.2011. http://doi.org/10.1523/JNEUROSCI.6620-10.2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mariano TY, Bannerman DM, McHugh SB, Preston TJ, Rudebeck PH, Rudebeck SR, … Campbell TG. Impulsive choice in hippocampal but not orbitofrontal cortex-lesioned rats on a nonspatial decision-making maze task. European Journal of Neuroscience. 2009;30(3):472–484. doi: 10.1111/j.1460-9568.2009.06837.x. http://doi.org/10.1111/j.1460-9568.2009.06837.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Marshall AT, Smith AP, Kirkpatrick K. Mechanisms of impulsive choice: I. Individual differences in interval timing and reward processing. Journal of the Experimental Analysis of Behavior. 2014;102(1):86–101. doi: 10.1002/jeab.88. http://doi.org/10.1002/jeab.88. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Matell MS, Meck WH. Neuropsychological mechanisms of interval timing behavior. BioEssays. 2000;22(1):94–103. doi: 10.1002/(SICI)1521-1878(200001)22:1<94::AID-BIES14>3.0.CO;2-E. http://doi.org/10.1002/(SICI)1521-1878(200001)22:1<94::AID-BIES14>3.0.CO;2-E. [DOI] [PubMed] [Google Scholar]
- Mazur JE. Choice, delay, probability, and conditioned reinforcement. Animal Learning & Behavior. 1997;25(2):131–147. http://doi.org/10.3758/BF03199051. [Google Scholar]
- McClure J, Podos J, Richardson HN. Isolating the delay component of impulsive choice in adolescent rats. Frontiers in Integrative …. 2014;8(JAN):1–9. doi: 10.3389/fnint.2014.00003. http://doi.org/10.3389/fnint.2014.00003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- McClure SM, Ericson KM, Laibson DI, Loewenstein G, Cohen JD. Time Discounting for Primary Rewards. J Neurosci. 2007;27(21):5796–5804. doi: 10.1523/JNEUROSCI.4246-06.2007. http://doi.org/10.1523/JNEUROSCI.4246-06.2007. [DOI] [PMC free article] [PubMed] [Google Scholar]
- McClure SM, Laibson DI, Loewenstein G, Cohen JD. Separate neural systems value immediate and delayed monetary rewards. Science (New York, NY) 2004;306(5695):503–507. doi: 10.1126/science.1100907. http://doi.org/10.1126/science.1100907. [DOI] [PubMed] [Google Scholar]
- McHugh SB, Campbell TG, Taylor AM, Rawlins JNP, Bannerman DM. A role for dorsal and ventral hippocampus in inter-temporal choice cost-benefit decision making. Behavioral Neuroscience. 2008;122(1):1–8. doi: 10.1037/0735-7044.122.1.1. http://doi.org/10.1037/0735-7044.122.1.1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- McLaughlin RJ, Floresco SB. The role of different subregions of the basolateral amygdala in cue-induced reinstatement and extinction of food seeking. Neuroscience. 2007;146:1484–1494. doi: 10.1016/j.neuroscience.2007.03.025. http://doi.org/10.1016/j.neuroscience.2007.03.025. [DOI] [PubMed] [Google Scholar]
- Meck WH. Hippocampal function is required for feedback control of an internal clock’s criterion. Behavioral Neuroscience. 1988;102(1):54–60. doi: 10.1037//0735-7044.102.1.54. http://doi.org/10.1037/0735-7044.102.1.54. [DOI] [PubMed] [Google Scholar]
- Meck WH. Neuropsychology of timing and time perception. Brain and Cognition. 2005;58(1):1–8. doi: 10.1016/j.bandc.2004.09.004. http://doi.org/10.1016/j.bandc.2004.09.004. [DOI] [PubMed] [Google Scholar]
- Meck WH, Church RM, Olton DS. Hippocampus, time, and memory. Behavioral Neuroscience. 2013;127(5):655–668. doi: 10.1037/a0034188. http://doi.org/10.1037/a0034188. [DOI] [PubMed] [Google Scholar]
- Meck WH, Church RM, Wenk GL, Olton DS. Nucleus basalis magnocellularis and medial septal area lesions differentially impair temporal memory. The Journal of Neuroscience : The Official Journal of the Society for Neuroscience. 1987;7(11):3505–3511. doi: 10.1523/JNEUROSCI.07-11-03505.1987. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mendez IA, Gilbert RJ, Bizon JL, Setlow B. Effects of acute administration of nicotinic and muscarinic cholinergic agonists and antagonists on performance in different cost-benefit decision making tasks in rats. Psychopharmacology. 2012;224(4):489–499. doi: 10.1007/s00213-012-2777-y. http://doi.org/10.1007/s00213-012-2777-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mobini S, Body S, Ho MY, Bradshaw C, Szabadi E, Deakin J, Anderson I. Effects of lesions of the orbitofrontal cortex on sensitivity to delayed and probabilistic reinforcement. Psychopharmacology. 2002;160(3):290–298. doi: 10.1007/s00213-001-0983-0. http://doi.org/10.1007/s00213-001-0983-0. [DOI] [PubMed] [Google Scholar]
- Mobini S, Chiang TJ, Al-Ruwaitea ASA, Ho MY, Bradshaw CM, Szabadi E. Effect of central 5-hydroxytryptamine depletion on inter-temporal choice: A quantitative analysis. Psychopharmacology. 2000;149(3):313–318. doi: 10.1007/s002130000385. http://doi.org/10.1007/s002130000385. [DOI] [PubMed] [Google Scholar]
- Mogenson GJ, Jones DL, Yim CY. From motivation to action: Functional interface between the limbic system and the motor system. Progress in Neurobiology. 1980;14(2–3):69–97. doi: 10.1016/0301-0082(80)90018-0. http://doi.org/10.1016/0301-0082(80)90018-0. [DOI] [PubMed] [Google Scholar]
- Morris SE, Cuthbert BN. Research domain criteria: Cognitive systems, neural circuits, and dimensions of behavior. Dialogues in Clinical Neuroscience. 2012;14(1):29–37. doi: 10.31887/DCNS.2012.14.1/smorris. http://doi.org/10.31887/DCNS.2012.14.1/smorris. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Moschak TM, Mitchell SH. Partial inactivation of nucleus accumbens core decreases delay discounting in rats without affecting sensitivity to delay or magnitude. Behavioural Brain Research. 2014;268:159–168. doi: 10.1016/j.bbr.2014.03.044. http://doi.org/10.1016/j.bbr.2014.03.044. [DOI] [PMC free article] [PubMed]
- Narayanan NS, Horst NK, Laubach M. Reversible inactivations of rat medial prefrontal cortex impair the ability to wait for a stimulus. Neuroscience. 2006;139(3):865–876. doi: 10.1016/j.neuroscience.2005.11.072. http://doi.org/10.1016/j.neuroscience.2005.11.072. [DOI] [PubMed] [Google Scholar]
- Naya Y, Suzuki WA. Integrating what and when across the primate medial temporal lobe. Science (New York, NY) 2011;333(6043):773–6. doi: 10.1126/science.1206773. http://doi.org/10.1126/science.1206773. [DOI] [PubMed] [Google Scholar]
- Olton DS, Meck WH, Church RM. Separation of hippocampal and amygdaloid involvement in temporal memory dysfunctions. Brain Research. 1987;404(1–2):180–188. doi: 10.1016/0006-8993(87)91369-2. http://doi.org/10.1016/0006-8993(87)91369-2. [DOI] [PubMed] [Google Scholar]
- Parkinson JA, Robbins TW, Everitt BJ. Dissociable roles of the central and basolateral amygdala in appetitive emotional learning. European Journal of Neuroscience. 2000;12:405–413. doi: 10.1046/j.1460-9568.2000.00960.x. [DOI] [PubMed] [Google Scholar]
- Peters J, Büchel C. The neural mechanisms of inter-temporal decision-making: Understanding variability. Trends in Cognitive Sciences. 2011;15(5):227–239. doi: 10.1016/j.tics.2011.03.002. http://doi.org/10.1016/j.tics.2011.03.002. [DOI] [PubMed] [Google Scholar]
- Peters J, Büchel C. Episodic Future Thinking Reduces Reward Delay Discounting through an Enhancement of Prefrontal-Mediotemporal Interactions. Neuron. 2010;66(1):138–148. doi: 10.1016/j.neuron.2010.03.026. http://doi.org/10.1016/j.neuron.2010.03.026. [DOI] [PubMed] [Google Scholar]
- Phillips PE, Stuber GD, Heien ML, Wightman RM, Carelli RM. Subsecond dopamine release promotes cocaine seeking. Nature. 2003;422(6932):614–618. doi: 10.1038/nature01476. http://doi.org/10.1038/nature01476. [DOI] [PubMed] [Google Scholar]
- Port RL, Romano AG, Steinmetz JE, Mikhail AA, Patterson MM. Retention and Acquisition of Classical Trace Conditioned Responses by Rabbits With Hippocampal Lesions. Behavioral Neuroscience. 1986;100(5):745–752. doi: 10.1037//0735-7044.100.5.745. http://doi.org/10.1037/0735-7044.100.5.745. [DOI] [PubMed] [Google Scholar]
- Rangel A, Camerer C, Montague PR. A framework for studying the neurobiology of value-based decision making. Nature Reviews Neuroscience. 2008;9(7):545–556. doi: 10.1038/nrn2357. http://doi.org/10.1038/nrn2357. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rawlins JNP, Feldon J, Butt S. The effects of delaying reward on choice preference in rats with hippocampal or selective septal lesions. Behavioral Brain Research. 1985;15:191–203. doi: 10.1016/0166-4328(85)90174-3. [DOI] [PubMed] [Google Scholar]
- Rawlins JN, Winocur G, Gray JA. The hippocampus, collateral behavior, and timing. Behavioral Neuroscience. 1983;97(6):857–872. doi: 10.1037//0735-7044.97.6.857. http://doi.org/10.1037/0735-7044.97.6.857. [DOI] [PubMed] [Google Scholar]
- Renda CR, Stein JS, Madden GJ. Working-Memory Training: Effects on Delay Discounting in Male Long Evans Rats. Journal of the Experimental Analysis of Behavior. 2015;103(1):50–61. doi: 10.1002/jeab.115. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Roesch MR, Bryden DW, Cerri DH, Haney ZR, Schoenbaum G. Willingness to wait and altered encoding of time-discounted reward in the orbitofrontal cortex with normal aging. The Journal of Neuroscience. 2012;32(16):5525–5533. doi: 10.1523/JNEUROSCI.0586-12.2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Roesch MR, Calu DJ, Burke KA, Schoenbaum G. Should I stay or should I go? Transformation of time-discounted rewards in orbitofrontal cortex and associated brain circuits. Annals of the New York Academy of Sciences. 2007;1104:21–34. doi: 10.1196/annals.1390.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Roesch MR, Calu DJ, Schoenbaum G. Dopamine neurons encode the better option in rats deciding between differently delayed or sized rewards. Nature Neuroscience. 2007;10(12):1615–1624. doi: 10.1038/nn2013. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Roesch MR, Singh T, Brown PL, Mullins SE, Schoenbaum G. Ventral striatal neurons encode the value of the chosen action in rats deciding between differently delayed or sized rewards. The Journal of Neuroscience. 2009;29(42):13365–13376. doi: 10.1523/JNEUROSCI.2572-09.2009. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Roesch MR, Taylor AR, Schoenbaum G. Encoding of time-discounted rewards in orbitofrontal cortex is independent of value representation. Neuron. 2006;51(4):509–520. doi: 10.1016/j.neuron.2006.06.027. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rubia K, Halari R, Christakou A, Taylor E. Impulsiveness as a timing disturbance: Neurocognitive abnormalities in attention-deficit hyperactivity disorder during temporal processes and normalization with methylphenidate. Philosophical Transactions of the Royal Society B. 2009;364(1525):1919–1931. doi: 10.1098/rstb.2009.0014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rudebeck PH, Walton ME, Smyth AN, Bannerman DM, Rushworth MFS. Separate neural pathways process different decision costs. Nature Neuroscience. 2006;9(9):1161–1168. doi: 10.1038/nn1756. [DOI] [PubMed] [Google Scholar]
- Salazar RF, White W, Lacroix L, Feldon J, White IM. NMDA lesions in the medial prefrontal cortex impair the ability to inhibit responses during reversal of a simple spatial discrimination. Behavioural Brain Research. 2004;152:413–424. doi: 10.1016/j.bbr.2003.10.034. [DOI] [PubMed] [Google Scholar]
- Scattoni ML, Adriani W, Calamandrei G, Laviola G, Ricceri L. Long-term effects of neonatal basal forebrain cholinergic lesions on radial maze learning and impulsivity in rats. Behavioural Pharmacology. 2006;17(5–6):517–524. doi: 10.1097/00008877-200609000-00018. [DOI] [PubMed] [Google Scholar]
- Schoenbaum G, Chiba AA, Gallagher M. Orbitofrontal cortex and basolateral amygdala encode expected outcomes during learning. Nature Neuroscience. 1998;1(2):155–159. doi: 10.1038/407. [DOI] [PubMed] [Google Scholar]
- Schoenbaum G, Roesch MR, Stalnaker TA. A new perspective on the role of the orbitofrontal cortex in adaptive behaviour. Nature Reviews Neuroscience. 2009;10(12):885–892. doi: 10.1038/nrn2753. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schultz W. Behavioral theories and the neurophysiology of reward. Annual Review of Psychology. 2006;57:87–115. doi: 10.1146/annurev.psych.56.091103.070229. [DOI] [PubMed] [Google Scholar]
- Sellitto M, Ciaramelli E, di Pellegrino G. Myopic discounting of future rewards after medial orbitofrontal damage in humans. The Journal of Neuroscience. 2010;30(49):16429–16436. doi: 10.1523/JNEUROSCI.2516-10.2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shamosh NA, DeYoung CG, Green AE, Reis DL, Johnson MR, Conway A, … Gray JR. Individual differences in delay discounting. Psychological Science. 2008;19(9):904–911. doi: 10.1111/j.1467-9280.2008.02175.x. [DOI] [PubMed] [Google Scholar]
- Stopper CM, Green EB, Floresco SB. Selective involvement by the medial orbitofrontal cortex in biasing risky, but not impulsive, choice. Cerebral Cortex. 2014;24(1):154–162. doi: 10.1093/cercor/bhs297. [DOI] [PubMed] [Google Scholar]
- Stott JJ, Redish AD. A functional difference in information processing between orbitofrontal cortex and ventral striatum during decision-making behaviour. Philosophical Transactions of the Royal Society B. 2014. doi: 10.1098/rstb.2013.0472. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Tian L, Qin X, Sun J, Li X, Wei L. Differential effects of co-administration of oxotremorine with SCH 23390 on impulsive choice in high-impulsive rats and low-impulsive rats. Pharmacology, Biochemistry and Behavior. 2016;142:56–63. doi: 10.1016/j.pbb.2016.01.001. [DOI] [PubMed] [Google Scholar]
- Valencia-Torres L, Olarte-Sánchez CM, Da Costa Araújo S, Body S, Bradshaw CM, Szabadi E. Nucleus accumbens and delay discounting in rats: Evidence from a new quantitative protocol for analysing inter-temporal choice. Psychopharmacology. 2012;219(2):271–283. doi: 10.1007/s00213-011-2459-1. [DOI] [PubMed] [Google Scholar]
- Van der Meer MAA, Johnson A, Schmitzer-Torbert NC, Redish AD. Triple dissociation of information processing in dorsal striatum, ventral striatum, and hippocampus on a learned spatial decision task. Neuron. 2010;67(1):25–32. doi: 10.1016/j.neuron.2010.06.023. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Van Gaalen MM, van Koten R, Schoffelmeer ANM, Vanderschuren LJMJ. Critical involvement of dopaminergic neurotransmission in impulsive decision making. Biological Psychiatry. 2006;60(1):66–73. doi: 10.1016/j.biopsych.2005.06.005. [DOI] [PubMed] [Google Scholar]
- Weatherly JN, Plumm KM, Derenne A. Delay discounting and social policy issues. The Psychological Record. 2011;61(4):527–546. Retrieved from http://opensiuc.lib.siu.edu/tpr/vol61/iss4/2. [Google Scholar]
- Wesley MJ, Bickel WK. Remember the future II: Meta-analyses and functional overlap of working memory and delay discounting. Biological Psychiatry. 2014;75(6):435–448. doi: 10.1016/j.biopsych.2013.08.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Winstanley CA. Contrasting roles of basolateral amygdala and orbitofrontal cortex in impulsive choice. The Journal of Neuroscience. 2004;24(20):4718–4722. doi: 10.1523/JNEUROSCI.5606-03.2004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Winstanley CA, Dalley JW, Theobald DEH, Robbins TW. Global 5-HT depletion attenuates the ability of amphetamine to decrease impulsive choice on a delay-discounting task in rats. Psychopharmacology. 2003;170(3):320–331. doi: 10.1007/s00213-003-1546-3. [DOI] [PubMed] [Google Scholar]
- Winstanley CA, Theobald DEH, Dalley JW, Robbins TW. Interactions between serotonin and dopamine in the control of impulsive choice in rats: Therapeutic implications for impulse control disorders. Neuropsychopharmacology. 2005;30(4):669–682. doi: 10.1038/sj.npp.1300610. [DOI] [PubMed] [Google Scholar]
- Yang ST, Shi Y, Wang Q, Peng JY, Li BM. Neuronal representation of working memory in the medial prefrontal cortex of rats. Molecular Brain. 2014;7(1):61. doi: 10.1186/s13041-014-0061-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yin B, Troger AB. Exploring the 4th dimension: Hippocampus, time, and memory revisited. Frontiers in Integrative Neuroscience. 2011;5:1–5. doi: 10.3389/fnint.2011.00036. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yun IA, Wakabayashi KT, Fields HL, Nicola SM. The ventral tegmental area is required for the behavioral and nucleus accumbens neuronal firing responses to incentive cues. The Journal of Neuroscience. 2004;24(12):2923–2933. doi: 10.1523/JNEUROSCI.5282-03.2004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zeeb FD, Floresco SB, Winstanley CA. Contributions of the orbitofrontal cortex to impulsive choice: Interactions with basal levels of impulsivity, dopamine signalling, and reward-related cues. Psychopharmacology. 2010;211(1):87–98. doi: 10.1007/s00213-010-1871-2. [DOI] [PubMed] [Google Scholar]