Abstract
Dopamine contributes to cognitive control through well-established effects in both the striatum and cortex. Although earlier work emphasized dopamine's impact on cognitive control capacity, more recent work suggests that striatal dopamine may also impact cognitive motivation. We consider the emerging perspective that striatal dopamine boosts control by making people more sensitive to the benefits versus the costs of cognitive effort and discuss how this sensitivity might impact competition between controlled and prepotent actions. We propose that dopamine signaling in distinct cortico-striatal subregions mediates different kinds of cost-benefit tradeoffs and also discuss mechanisms for the local control of dopamine release, enabling selectivity among cortico-striatal circuits. In so doing, we show how this cost-benefit mosaic can reconcile seemingly conflicting findings about the impact of dopamine signaling on cognitive control.
Keywords: dopamine, cognitive control, cortico-striatal circuit, motivation, cognitive effort, working memory
Cognitive control, motivation, and dopamine
Dopamine has long been implicated in cognitive control and working memory, with clear relevance for Parkinson’s disease, ADHD, schizophrenia, and depression [1–3]. Dopaminergic drugs can remediate control deficits, but their mechanisms of action remain unclear. Their cognition-enhancing effects are commonly ascribed to prefrontal cortex modulation, where the slower time constants of dopamine clearance and degradation are well-suited to support temporally-protracted working memory maintenance [4–7]. Yet, dopamine also plays a complementary role in the striatum where fast dynamics shape working memory gating and task shifting [7–13].
While prior work has implicated cortical and striatal dopamine in cognitive control capacity, another, previously under-appreciated, possibility is that striatal dopamine regulates the willingness to exert control [3,14–18]. This perspective fits with a large body of work demonstrating striatal dopamine’s motivational effects on the learning and performance of motor tasks [19–23] and growing evidence that, like physical effort, cognitive control is costly and requires motivation [16–18,24–33].
Here, we consider the hypothesis that striatal dopamine boosts willingness to exert cognitive control by increasing sensitivity to the benefits versus the costs of effort across a range of domains from motor to cognitive actions. We start with a brief synopsis of the historical perspective implicating prefrontal and striatal dopamine in cognitive control. Next, we highlight studies demonstrating the impact of dopaminergic drugs on cost-benefit decision-making about control. In the final section, we consider the hypothesis that dopamine signaling is not homogeneous across the striatum but is instead characterized by rich spatiotemporal dynamics impacting different kinds of decision variables across striatal sub-regions. This more heterogeneous account has the potential to explain a host of findings not accommodated by monolithic theories of striatal dopamine signaling, while still retaining a core cost-benefit interpretation in any given sub-region.
A brief history of dopamine and cognitive control
Cognitive control generally refers to the ability to override prepotent responses with flexible, rule-guided behavior. It requires working memory to update and maintain rules and information about context [34]. Adaptive behavior arises from hierarchical mechanisms by which higher-level task rules maintained in rostral prefrontal cortex bias lower-level, task-relevant processing pathways in more caudal sensorimotor regions. Experimental data [4,6] and computational models [5] implicating prefrontal dopamine in the stability of working memory representations of rules thereby also implicate cortical dopamine in the cognitive control of responses. For example, early connectionist models centered on aberrant cortical dopamine function to explain dysfunctional rule representations in schizophrenia, e.g. [35].
Concurrently, striatal dopamine was implicated by the recognition that fronto-striatal interactions are well suited for governing what should be gated into (and out of) working memory, and in which contexts, to maximize rewards and minimize punishments [8,36]. To drive adaptive behavior, the prefrontal cortex maintains task-relevant information to bias sensorimotor processing and acts in concert with the striatum to gate working memory contents based on expected reward [7,12,36–38]. This account of striatal dopamine in working memory gating builds on its analogous role in adaptive motor behavior and helps explain a variety of findings linking striatal dopamine to working memory operations and cognitive flexibility [8–10,13,39–43].
Cognitive control is motivated
According to early models, cognitive control is recruited automatically in response to conflict or errors [44,45]. These models predict that control is engaged in proportion to task demands, to maximize expected reward. When response conflict is perceived, for example, a control rule will be gated into working memory and will exert top-down influence over lower-level pathways, prioritizing task-relevant over task-irrelevant information. In contrast, an emerging perspective posits that cognitive control is inherently costly and must be motivated [16–18,24–33,46]. Thus, rather than being reflexive, control results from a kind of cost-benefit decision-making, balancing factors like control efficacy and reward magnitude against cognitive effort costs [28,47].
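To make this cost-benefit framing concrete, the sketch below illustrates an expected-value-of-control-style computation in which a control intensity is chosen to maximize reward, scaled by efficacy, minus an effort cost. It is our own toy illustration, not the implementation of [28] or [47]; the linear efficacy term, the quadratic cost, and all parameter values are assumptions.

```python
import numpy as np

def expected_value_of_control(intensity, reward, efficacy, cost_scale=1.0):
    """Toy expected value of control: benefits grow with how much the allocated
    control improves performance; costs grow (convexly, by assumption) with intensity."""
    p_success = efficacy * intensity      # chance the controlled response wins out
    benefit = reward * p_success          # expected payoff of exerting control
    cost = cost_scale * intensity ** 2    # cognitive effort cost
    return benefit - cost

# Pick the control intensity (0 = none, 1 = maximal) with the highest net value.
intensities = np.linspace(0.0, 1.0, 101)
values = [expected_value_of_control(i, reward=2.0, efficacy=0.9) for i in intensities]
print(f"optimal control intensity: {intensities[int(np.argmax(values))]:.2f}")
```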
Why cognitive control is costly remains unresolved. Unlike physical effort, control does not incur clear metabolic costs [48,49], though the brain may be sensitive to the accumulation of metabolic by-products of control processes [50]. An alternative class of explanations argues that control is normatively costly because it biases responses that are unpracticed and uncommon relative to well-trained habits with more predictable returns; in a sense, control is costly because it is more risky [51]. Yet even when we know we can perform tasks well, using control for one set of tasks prevents us from performing others. In other words, control incurs opportunity costs [49]. Accordingly, control should be curtailed when opportunity costs rise [52,53]. Consistent with opportunity cost models, control is diminished when average reward rates are high (opportunity costs increase with the payoff per unit action), or when cognitive capacity is low (each unit of allocation accounts for a larger fraction of available resources) [54,55]. Conversely, withholding resources is beneficial because it affords greater flexibility for task-switching as alternative opportunities arise [53]. This tradeoff could explain why people avoid task engagement more in environments prioritizing flexibility over stability (in a recent preprint: [56]).
Although the basis of cognitive effort costs remains unresolved, there is nevertheless consensus that a cost function governs control allocation. Indeed, people largely choose lower over higher task demands, all else being equal [25,57,58] (though individual differences matter [32]), and discount goals as a function of rising demands [26,27,30,31,33,59]. Rewards are discounted, for example, when received in the context of higher demands for working memory [60], task switching [61], or response conflict [62,63].
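Such demand-based discounting is often summarized with a simple subjective-value expression in which the offered reward is reduced by a cost that grows with demand. The linear form and the discount rate below are illustrative assumptions, not parameters estimated in the cited studies.

```python
def subjective_value(reward, demand_level, k=0.35):
    """Toy effort discounting: each additional demand level (e.g., one more
    item of working memory load) removes a fixed fraction of the offered reward."""
    return reward * (1.0 - k * demand_level)

# A larger reward offered at high load can be worth less, subjectively,
# than a smaller reward offered with no load.
print(subjective_value(reward=4.0, demand_level=2))  # 4 * (1 - 0.70) = 1.2
print(subjective_value(reward=2.0, demand_level=0))  # 2.0
```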
A cost function implies that incentives could promote control by offsetting costs. There is ample evidence that incentives enhance various forms of control [16,28] from proactive response preparation [64,65], to task switching [66], to distractor resistance [19,42]. Notably, incentive effects vary as a function of genes and drugs that affect striatal dopamine transmission [19,37,43,63,67,68].
Dopamine impacts cost-benefit decisions about cognitive effort
The performance-enhancing effects of incentives and dopamine drugs in cognitive control tasks have traditionally been understood to reflect heightened capacity. For example, more striatal dopamine facilitates flexible gating [43,66]. An alternative explanation is that incentives and concomitant dopamine release increase motivation to expend cognitive effort [17,18]. Testing whether motivation is altered requires asking whether dopamine impacts explicit decisions about cognitive effort. Explicit economic choices provide direct evidence about motivation, while avoiding the pitfalls inherent in inferring cognitive effort from task performance, which is jointly determined by capacity, motivation, and controllability [69].
To test the hypothesis that striatal dopamine promotes motivation, we combined dopamine synthesis capacity imaging ([18F]fluoro-dopa PET) with methylphenidate in a study of explicit economic choices [70]. First, we asked participants to choose between high-load working memory tasks for more money, or low-load tasks for less money. We found that people with lower dopamine synthesis capacity in the caudate nucleus were less willing to accept high-cost, high-benefit offers (Fig. 1b). We also found that methylphenidate increased willingness to expend effort, but more so for those with low dopamine synthesis capacity, implicating striatal dopamine (in the caudate nucleus, Fig. 1a,b) in cognitive motivation.
Figure 1. Striatal dopamine and cognitive effort selection in two human pharmaco-imaging studies.
a) Two effort-based decision-making tasks ([70]: red frame, [96]: blue frame) and a mask image of striatal sub-regions including the caudate nucleus (red), the putamen (green) and the ventral striatum (blue). On placebo, higher dopamine synthesis capacity, b) in the caudate nucleus, predicts greater selection of high-effort, high-benefit versus low-effort, low-benefit offers to perform working memory tasks for money and, c) in the ventral striatum, predicts greater selection of free time for less money versus performance of a visual working memory task for more money. Methylphenidate increases selection of cognitive effort, selectively among b) those with low dopamine synthesis capacity in the caudate nucleus and c) those with high dopamine synthesis capacity in the ventral striatum, implying that dopamine signaling in different striatal sub-regions will have distinct effects on different types of decisions about cognitive effort. d) Dopamine synthesis capacity predicts higher effort selection by increasing sensitivity to the relative benefits, and decreasing sensitivity to the relative costs, of a hard task for more money versus an easy task for less money, as qualitatively predicted by a computational model of striatal dopamine [71].
Next, we tested the more specific hypothesis that dopamine alters cost-benefit tradeoffs [71]. We examined eye gaze patterns (to track attention to cost or benefit information) and choice behavior as participants decided between offers tailored to their individual reward-effort tradeoffs. Consistent with our hypothesis, we found that dopamine (correlationally via PET and causally via pharmacology) both increased sensitivity to benefits and decreased sensitivity to the costs of cognitive effort. Furthermore, we found that empirical choice patterns could be predicted from simulations of a computational model [71] of striatal dopamine’s effects in cost-benefit action selection (Fig. 1d). Note that our inference about dopamine promoting motivation is based on the simplistic assumption that higher dopamine synthesis capacity and dopamine transporter blockade amplify post-synaptic dopamine signaling. Future studies with more temporally-resolved methods (e.g. [72]) could elucidate the precise dopamine dynamics regulating cognitive motivation.
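The flavor of that model's choice rule can be conveyed with a brief, simplified sketch: each option carries separate "benefit" (D1-like) and "cost" (D2-like) weights, and dopamine asymmetrically scales their contributions before a softmax selects an option. The reciprocal gains, the beta value, and the two-option framing below are our illustrative assumptions rather than the full OpAL equations of [71].

```python
import numpy as np

def choice_prob(benefit_w, cost_w, dopamine=1.0, beta=3.0):
    """Sketch of an OpAL-style choice rule: dopamine scales the gain on
    benefit (D1-like) weights up and on cost (D2-like) weights down, and
    a softmax converts net values into choice probabilities."""
    g_gain, n_gain = dopamine, 1.0 / dopamine     # assumed reciprocal gains
    net = g_gain * np.asarray(benefit_w) - n_gain * np.asarray(cost_w)
    expv = np.exp(beta * (net - net.max()))
    return expv / expv.sum()

# Option 0: high benefit, high effort cost; option 1: low benefit, low cost.
benefits, costs = [1.0, 0.4], [0.8, 0.2]
print(choice_prob(benefits, costs, dopamine=0.7))  # low dopamine: prefers the easy option
print(choice_prob(benefits, costs, dopamine=1.5))  # high dopamine: shifts toward the hard option
```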
Our results help explain the findings from other recent human pharmacology studies examining explicit choices. In one study, methylphenidate increased selection of high versus low task-switching demands in some participants [58]. Critically, the drug effect depended on trait impulsivity, elsewhere linked with striatal dopamine function. In another study, Parkinson’s disease patients were more likely than controls to choose lower attentional demands for lower rewards over higher demands for higher rewards [73]. Yet, patients who took their Parkinson’s medications matched controls’ preferences for high-cost, high-benefit options. Our results suggest that such effects may reflect increased striatal dopamine signaling, which led patients to weight benefits (e.g. of bonus points) more strongly relative to costs (e.g. of control demands).
Our study indicates that dopamine is sufficient to mediate cognitive motivation, which was hitherto unclear given the possibility that other neurotransmitters might do so. Noradrenaline, for example, has been implicated in multiple aspects of physical effort, including effort selection and force production [74,75], and provides an alternative explanation of medication effects in the previous Parkinson’s disease work. Acetylcholine has also been proposed to explain why psychostimulants alter cognitive effort choice in rats [76], yet neither noradrenaline- nor dopamine-selective agents impacted choice in the same task [77]. On the other hand, it is possible that dopamine drugs did not alter rodents’ preferences because offers were not sufficiently close to indifference to detect a systematic effect across animals. In our study, dopamine’s impact on sensitivity to costs and benefits was most apparent on trials when offers were closest to participants’ individual indifference points.
Our finding implicating striatal dopamine does not rule out an effect of cortical dopamine on increasing sensitivity to benefits versus costs, though it is less clear how cortical dopamine could mediate such an effect. By contrast, our results accord with an extensive literature implicating striatal dopamine in promoting effort [15,20,23,78,79] and in weighting benefits versus costs [71,80,81]. Namely, our results are consistent with a canonical model whereby instrumental action learning and selection are mediated by two opposing sets of neurons that coordinate striatal output, respectively expressing dopamine receptor subtypes that make the cells more (D1) or less (D2) sensitive to cortical inputs [8,82,83]. Given that unexpected rewards (positive prediction errors) are signaled by phasic increases, and unexpected punishments (negative prediction errors) by phasic decreases in dopamine release [84], the cortico-striatal synapses of D1 and D2 neurons will, in the course of learning, come to reflect reward and punishment statistics, respectively. Thus, dopamine signals train cortico-striatal synapses onto D1 and D2 neurons to reflect the expected benefits and costs of actions, respectively [71,81,85–87].
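A minimal sketch of that learning scheme, assuming a standard reward prediction error and simple rectified updates (the learning rates and update form are illustrative, not the specific equations of [71] or [85–87]):

```python
def update_opponent_weights(benefit_w, cost_w, critic_v, reward,
                            lr_actor=0.1, lr_critic=0.1):
    """D1-like 'benefit' weights grow with positive prediction errors
    (phasic dopamine bursts); D2-like 'cost' weights grow with negative
    prediction errors (phasic dips)."""
    rpe = reward - critic_v                      # reward prediction error
    critic_v += lr_critic * rpe                  # critic tracks expected value
    benefit_w += lr_actor * max(rpe, 0.0)        # bursts strengthen the D1 pathway
    cost_w += lr_actor * max(-rpe, 0.0)          # dips strengthen the D2 pathway
    return benefit_w, cost_w, critic_v

# Repeatedly choosing an action that pays off only half the time
# leaves traces of both its benefits and its costs.
b, c, v = 0.0, 0.0, 0.0
for reward in [1, 0, 1, 0, 1, 0]:
    b, c, v = update_opponent_weights(b, c, v, reward)
print(round(b, 3), round(c, 3), round(v, 3))
```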
There is some question over whether dopamine cell firing and dopamine release encode effort costs strictly defined in terms of the metabolic demands of physical exertion [88] (see also [89]). Nevertheless, many factors associated with cognitive and physical effort demands (risk, delays, etc.) dampen phasic responses to reward cues [21,81,90]. Phasic dopamine release in the rat ventral striatum, for example, encodes the net subjective value of reward magnitude, discounted by either the delay to reward or the number of lever presses required, as revealed by fast-scan cyclic voltammetry [81]. Critically, optogenetic stimulation of dopamine neurons can influence the degree to which animals later choose to work for reward, consistent with a causal role for dopamine in training cortico-striatal synapses to reflect physical effort costs and benefits [81].
There is also indirect evidence that dopamine mediates learning about cognitive effort costs. For example, human fMRI studies reveal that, while learning about cognitive task demands, the brain computes the kinds of reward prediction errors conveyed by phasic dopamine signals in other domains [32,91]. Also, people discount rewards received in the context of cognitive demands [61,63], and this effect is modulated both by the D2 receptor agonist cabergoline and by individual differences in a gene (DARPP-32) linked with the balance between the striatal D1 and D2 pathways [63]. These data are consistent with our hypothesis that striatal dopamine signaling shapes cortico-striatal synapses to reflect the relative benefits and costs of cognitive effort.
Effort costs, opportunity costs, and dopamine
Our decision-making study described above indicates that striatal dopamine can reduce sensitivity to cognitive effort costs. Yet, another hypothesis proposes that striatal dopamine encodes an average rate of reward that signals an opportunity cost of time [92–94]. Critically, because control is slow relative to fast and efficient habits, greater opportunity costs (corresponding with greater environmental richness) may decrease, rather than increase, preference for control [52]. Indeed, reward rate manipulations increasing opportunity costs can make people faster and less accurate in cognitive tasks, consistent with a down-regulation of cognitive control [55].
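The logic can be illustrated with a toy calculation in which each second spent on a slow, controlled strategy is charged at the environment's average reward rate, in the spirit of the opportunity-cost-of-time account [92]. The payoffs, durations, and linear charge below are assumptions chosen purely for illustration.

```python
def net_value(payoff, duration, avg_reward_rate):
    """Charge each second on task at the environment's average reward rate,
    i.e., the opportunity cost of time."""
    return payoff - avg_reward_rate * duration

# Controlled responding earns more but takes longer than the habit.
controlled = dict(payoff=1.0, duration=2.0)
habitual = dict(payoff=0.6, duration=0.5)

for rate in (0.1, 0.4):  # lean versus rich environment
    print(rate,
          net_value(**controlled, avg_reward_rate=rate),
          net_value(**habitual, avg_reward_rate=rate))
# In the rich environment the habit's net value overtakes controlled responding,
# so higher opportunity costs should bias behavior away from slow control.
```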
Could dopamine signaling mediate an opportunity cost effect, biasing fast prepotent action over slow control? Dopamine signaling of opportunity costs was originally proposed to increase behavioral vigor, rather than to bias selection among competing actions [92]. This focus on behavioral vigor is echoed in recent accounts proposing that striatal dopamine release primarily determines action latency (e.g. [95]). Nevertheless, as noted above, there is considerable evidence that striatal dopamine can also determine which action is chosen (i.e., reward-maximizing or punishment-minimizing) and not just how fast it is executed. Thus, dopamine may plausibly bias cost-benefit selection among competing opportunities.
To test the hypothesis that dopamine release can cause people to avoid (rather than engage) demanding tasks in response to high opportunity costs, we conducted a study asking participants to choose greater reward for more time on task or less reward for unconstrained free time, and again used methylphenidate to manipulate dopamine and PET to measure dopamine synthesis capacity [96]. Mirroring our first study, we found that methylphenidate increased high-effort selection and that drug status interacted with individual differences in dopamine synthesis capacity. This pattern supports the general conclusion that striatal dopamine can increase willingness to work for reward, even with respect to unconstrained opportunities.
There were, however, key differences. First, in contrast to our first study, high rather than low dopamine synthesis capacity individuals avoided effort the most, and were most sensitive to the effects of methylphenidate on choice (Fig. 1c). Second, these individual difference effects depended primarily on ventral rather than dorsal striatal dopamine. It is possible that these differences reflect distinct computations mediated by dopamine signaling in distinct striatal subregions [97–101]. Namely, we speculate that in our second study, individuals were differentiated chiefly by ventral striatal dopamine reflecting opportunity costs, because they were tasked with deciding between work and unconstrained opportunities reflecting the expected value of the state at large [71,102]. Conversely, in the first study, individuals were differentiated chiefly by sensitivity to effort cost-benefit tradeoffs, and by dorsal striatal dopamine, because they were tasked with weighing fully explicit costs and benefits of opponent actions (cf. [98]).
Our inference that the functional impact of dopamine signaling depends on striatal subregion accords generally with conclusions from recent rodent imaging work showing that striatal subregions exhibit spatiotemporal variations in dopamine release as a function of task demands [99]. In particular, dopamine release is preferentially directed toward dorsomedial regions during instrumental but not Pavlovian tasks, allowing the animal to adapt behavior to changing task demands [99]. Similarly, in human imaging studies, striatal reward prediction errors are amplified in those subregions most related to a given task structure [103]. Collectively, such results motivate a mosaic model, which we will turn to in the next section, by which dopamine signaling impacts the weighting of benefits versus costs of different dimensions in distinct striatal subregions.
Our results converge with foraging studies [93,94] toward the inference that (ventral) striatal dopamine can shift the balance in favor of disengagement in the context of high opportunity costs (when the average expected benefit of alternative opportunities is higher). Yet, this inference is tempered by the results of another human pharmacology study implying limits on the kinds of tradeoffs that dopamine can alter with respect to the opportunity cost of time [23]. Namely, participants do not squeeze a handgrip harder to save opportunity costs (time-on-task) on levodopa versus placebo, even though levodopa causes them to squeeze harder for larger rewards. These results are consistent with the broader hypothesis that dopamine can sensitize people to the benefits versus costs of actions, even if it does not necessarily influence the weighting of opportunity costs [23]. It is possible that a functional segregation across striatal subregions, which we turn to next, also implies limits on the degree to which dopamine can mediate tradeoffs between different commodities represented in distinct subregions (see Box 1 for a discussion).
Box 1. Dopamine’s Effects on Choice versus Performance.
Dopamine appears to sensitize people to opportunity costs in our task [96] and in foraging studies [93,94], but not to impact the tradeoff between handgrip force and opportunity costs in another pharmacology study [23]. Could the differing results reflect a distinction between explicit choices (e.g. between high cost/benefit and low cost/benefit options) and performance (e.g. force used to squeeze a handgrip), such that the effect of dopaminergic medication on one does not predict an effect on the other? Evidence against this explanation comes from a study in which patients with Parkinson’s disease, off medication, both chose high-effort options less often and exerted less force while squeezing [79]. Critically, computational modeling revealed that both effects of dopamine depletion, on choices and on performance, could be explained by a common cause, namely a reduction in sensitivity to reward benefits [79]. Similarly, in our own study [70], dopamine drugs tended to increase saccade velocities more for those participants who also showed a greater increase in high-effort choice. Thus, explicit choices and performance appear to provide overlapping information about dopamine-dependent motivation.
Differences in the nature of the costs being weighed, such as time versus effort costs, could be another reason why levodopa did not cause people to squeeze harder to save opportunity costs. Our overarching hypothesis is that dopamine biases different kinds of tradeoffs in distinct striatal subregions. Spatial heterogeneity provides specificity, but may also impose limits on the kinds of tradeoffs that dopamine can mediate across different subregions. Perhaps, for example, the opportunity costs of time are mediated by ventral striatal dopamine [21,96], while choices between greater or lesser handgrip force are mediated by dorsal striatal regions, which play a key role in mapping effort costs to motor kinematics [120]. If the two commodities are represented in distinct circuits, a global pharmacological boost may make people more sensitive to the benefits of alternative opportunities in one region and less sensitive to effort costs in another, without impacting how the two trade off. Accordingly, dopamine might not cause people to squeeze harder to save time, but it could cause them to squeeze faster.
A mosaic of striatal dopamine cost/benefit calculations across subregions
Collectively, effort choice studies imply that striatal dopamine can promote cognitive effort by making people more sensitive to effort benefits and less sensitive to effort costs (Fig. 2a). On the other hand, they also suggest that dopamine may signal opportunity costs and thus shift the balance in the opposite direction, away from effortful cognitive control when the average value of alternative opportunities is higher.
Figure 2. Dopamine shapes both the learning and performance of action policies in cortico-striatal-thalamic circuits.
a) Specifically, dopamine binding at D1 versus D2 receptors respectively increases and decreases striatal neuron excitability to cortical drive. Phasic reward prediction error signaling thus shapes synapses at D1- versus D2-receptor expressing neurons to reflect the benefits versus the costs of actions, respectively. In turn, high dopamine release can also instantaneously amplify benefit versus cost information encoded in cortico-striatal synaptic weights at the time of choice. Adapted from [119]. b) Midbrain dopamine projections shape the learning and performance of cognitive and motor actions in a conserved, hierarchically-structured, cortico-striatal architecture for action selection [14,37,38]. Dopamine projections to the prefrontal cortex modulate the stability of recurrent working memory representations. Abstract, higher-level goal representations in more rostral cortex bias increasingly concrete, lower-level representations, including specific cognitive and motor actions, in more caudal cortex. At the highest level of abstraction, this could include selection of effortful cognitive tasks themselves. Working memory contents are also determined by gating policies in the striatum, where dopamine both trains policies and alters their expression, by modulating sensitivity to afferent cortical projections. A reciprocal, cortico-striatal circuit is replicated across the dorsal frontal cortex and dorsal striatum, thus constituting a spiraling, dopamine-modulated pathway for hierarchical action selection. Abbreviations: DS – dorsal striatum, VS – ventral striatum, VTA – ventral tegmental area, SNc – substantia nigra pars compacta, lPFC – lateral prefrontal cortex, ACC – anterior cingulate cortex, DA – dopamine.
Striatal dopamine has been shown to undermine control in a number of other contexts as well. For example, it can promote working memory flexibility, which is detrimental when tasks demand stability to resist distractors [104]. Robust dopamine-mediated incentive signaling can also undermine performance in control-demanding tasks among people with higher baseline dopamine function, even when the same incentives are beneficial for healthy adults with lower baseline dopamine function [105] or for Parkinson’s patients [106]. Large, dopamine-mediated incentive signals may also amplify response vigor, undermining fine motor control [107].
A mechanistic account of such effects is that striatal dopamine sensitizes D1- versus D2-receptor-expressing neurons diffusely, lowering the threshold for gating multiple candidate actions and thereby potentiating not only costly control but also the cheap habits that control was intended to override [71,108]. Moreover, the degree to which dopamine potentiates habits versus control may itself depend on the strength and availability of the proposed habit relative to control rules retrieved from long-term memory [108]. This prediction stems from a core feature of the competition: control is slow relative to fast, readily available habits.
How, then, could dopamine selectively promote control signals over habits, or prioritize alternative opportunities in rich environments? A complementary proposal, which we consider here, is that the consequences of dopamine signaling will depend on where and when dopamine is released in the striatum. An architecture of spiraling cortico-striatal loops implies heterogeneity in the information processed across different subregions [14,37,38,109] (Fig. 2b). Thus, dopamine signaling in distinct subregions is proposed to mediate different kinds of cost-benefit tradeoffs [109]: e.g. impacting sensitivity to the opportunity costs of time in the ventral striatum, or willingness to expend cognitive effort in the dorsomedial striatum [101]. Also, circuits are hierarchically nested in the sense that task-sets gated in more medial and anterior prefrontal cortex can contextualize lower-level action selection in lateral and posterior cortex [36,38]. For example, ventral striatum-anterior cingulate cortex circuits may select relevant cost-benefit variables which can then constrain the selection of task representations in dorsal striatum-lateral prefrontal cortex circuits [110]. Accordingly, cortico-striatal interactions could affect cost-benefit selection based on whether it is worth it to 1) engage in a Stroop task relative to other potential opportunities, and then 2) gate rules that direct attention to the Stroop color dimension over the text dimension, rules which become more beneficial once the Stroop task is engaged.
Crucially, there are mechanisms for local control of dopamine release, enabling selectivity among cortico-striatal circuits [22,83,99–101]. A key mechanism involves transient suppression of tonically active cholinergic interneurons, distributed throughout the striatum, thereby gating release from local dopamine terminals [83,100,101] and thereby both the learning and performance of specific motivated behaviors [111,112]. These cholinergic interneurons are, in turn, partly driven by projections from prefrontal cortex, suggesting access to rich, high-level information that can shape their activity selectively with respect to the animal’s beliefs about its state [113].
The functional consequences of local control were demonstrated in a recent study in which axonal activity and dopamine release were imaged across the rat dorsal striatum [99]. Specifically, imaging revealed “waves” of release which, by virtue of their spatiotemporal mapping, selectively promote different behavioral controllers (e.g. Pavlovian versus instrumental control) in different striatal subregions [99]. Data were consistent with a model whereby regions in which dopamine was signaled sooner after reward cues became more strongly associated with reward, and the control signals they processed were more likely to govern subsequent behavior. Their results also had direct relevance for how animals learn to exert physical effort: waves originating in dorsomedial striatum predicted greater subsequent effort (when effort is instrumental toward achieving rewards), while waves originating in dorsolateral striatum predicted less (when effort is not related to rewards, in a Pavlovian context).
Extending this notion, we propose that spatiotemporally precise dopamine signaling to distinct striatal regions supports the learning and performance of complex control-demanding tasks. For example, dopamine release may preferentially target dorsal striatal regions interconnected with anterior prefrontal cortex when gating decisions are made at the level of a task [114] that needs to be sustained for the duration of its execution [38]. Conversely, dopamine release may be targeted to regions interconnected with lateral and posterior prefrontal cortex when effort cost/benefit-based gating needs to be conducted to support local task operations [38], such as working memory updating and manipulation, when flexibility is required [43,104]. Concurrently, dopamine release in the ventral striatum tracking the value of the current state [21,96,109] could promote decisions about whether and how vigorously to persist with effort in the current task [20,115] or to switch to alternative actions when their average value is high. Future work is needed to elucidate the extent to which the spatiotemporal dynamics of dopamine release have the resolution to support complex cognitive control demands (e.g., model-based goal pursuit; see Box 2).
Box 2. Striatal Dopamine Signaling and Model-Based versus Model-Free Deliberation.
Model-based control relies heavily on working memory that is both flexibly updated and stably maintained over protracted intervals, and it typically competes with cheap and fast model-free associations. Yet, there are a number of reasons to think that dopamine dynamics have the requisite complexity and precision to support model-based deliberation. First, phasic dopamine signals appear to train model-based associations and could thus shape internal models of the environment and state transitions [121]. Second, because model-based deliberation is effort-costly [122], dopamine release may offset costs, thereby promoting model-based versus model-free choice. Consistent with this hypothesis, both higher striatal dopamine synthesis capacity [123] and administration of the dopamine precursor l-dopa [124] shift the balance toward model-based over model-free choice, though this may reflect a reduced weighting of model-free action values [124]. Third, sustained dopamine ramping, e.g. [21,22,99], operates at timescales that could promote protracted deliberation and could, in principle, be driven by hierarchically inferred progress [125] in model-based deliberation. Critically, dopamine release reflects an animal’s beliefs about its present state [126], and dopamine ramps likewise depend on an animal’s beliefs about progress inferred from an internal model of its position with respect to goal states [127]. Ramps may reflect an accumulation [128] of dopamine-mediated “pseudoreward” prediction errors [129] which could, in principle, sustain protracted deliberation. Fourth, although very high dopamine release (e.g. at the pinnacle of a ramp) could plausibly produce detrimental flexibility, goal progress itself increases the likelihood of on- versus off-task gating, given that goal state associations will be increasingly salient as goals near [18]. Moreover, striatal dopamine can facilitate re-engagement with sequences of instrumental motor behavior [130], suggesting that dopamine would also promote re-engagement with sequences of deliberative cognition following distraction. Future work should test these predictions about whether and how dopamine could promote and sustain protracted, model-based deliberation.
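One common way to formalize this balance, offered here purely as an illustration, blends model-based and model-free action values with a mixture weight. The assumption that dopamine raises that weight is our gloss on the findings of [123,124], not their fitted models, and all names and parameter values below are arbitrary.

```python
def blended_value(q_model_based, q_model_free, w):
    """Weighted mixture of model-based and model-free action values;
    w near 1 means deliberative, model-based control dominates."""
    return w * q_model_based + (1.0 - w) * q_model_free

def mixture_weight(baseline_w=0.4, dopamine=1.0, slope=0.3):
    """Assumed mapping: higher dopamine offsets the effort cost of planning
    and so increases the model-based weight (clipped to [0, 1])."""
    return min(1.0, max(0.0, baseline_w + slope * (dopamine - 1.0)))

# The two systems disagree about an action; dopamine shifts the verdict.
for da in (0.5, 1.0, 1.5):
    w = mixture_weight(dopamine=da)
    print(da, round(w, 2), round(blended_value(q_model_based=1.0, q_model_free=-0.5, w=w), 2))
```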
Notably, this cost-benefit mosaic account can reconcile seemingly conflicting findings regarding dopamine’s effects on motor vigor. While higher dopamine release has traditionally been associated with greater vigor [92], recent work suggests that dopamine dynamics afford considerable selectivity for precise and bidirectional control. During learning, dopamine signals can train either slower or faster responding, whichever better predicts reward [116]. Moreover, studies manipulating reward contingencies suggest that dopamine can up-regulate vigor when reward depends on it, and suppress vigor when it does not [117,118]. In a saccade task, for example, Parkinson’s patients off dopamine medications showed faster saccades for larger incentives, whether incentives were performance-contingent or delivered randomly [118]. Yet on dopamine medications, patients, like healthy controls, only increased speed for rewards contingent on fast saccades, and were slower to respond when rewards were not contingent on speed. Thus, dopamine can both promote vigor that is instrumental and suppress vigor that is not goal-directed. These action-selective effects are generally consistent with the framework we advance here, whereby dopamine modulates sensitivity to benefits over costs of competing actions, respecting goal hierarchies, rather than producing indiscriminate vigor. Such selectivity likely reflects rich spatiotemporal dynamics of dopamine release in striatal sub-regions governing varying levels of behavioral control.
Concluding remarks
The work reviewed here supports the view that cognitive control is costly and regulated by cost-benefit decision-making. It implicates dopamine in promoting control by increasing sensitivity to the benefits versus the costs of cognitive actions, consistent with well-established effects on physical effort and with a conserved cortico-striatal architecture governing both cognitive and motor action selection. It also suggests that dopamine signaling could have different consequences in different striatal sub-regions, impacting the weighting of costs and benefits across a hierarchy of behavioral control. Future work should investigate the spatiotemporal dynamics of dopamine release and their functional implications for alternately enhancing control, or the actions that control might otherwise override (see Outstanding Questions).
Outstanding Questions Box.
Why is cognitive control effort-costly?
How and what does the brain monitor to compute opportunity costs? Are opportunity costs signaled by striatal dopamine release?
What are the functional implications of dopamine release for cognitive control in specific subregions including the ventral, dorsomedial, and dorsolateral striatum?
Does dopamine signaling mediate action policy learning for cognitive control? For example, do phasic dopamine prediction error signals encode cognitive effort costs, and what variables are tracked?
How do prefrontal-basal ganglia circuits orchestrate midbrain dopamine dynamics to sustain sequences of operations in protracted working memory tasks? Might dopamine dynamics in particular circuits be prioritized as a function of key meta-parameters computed in cortex, such as environmental controllability and state value?
Under what conditions do dopamine drugs promote control versus impulsivity and distractibility?
Highlights.
Striatal dopamine can promote cognitive control by increasing sensitivity to the benefits and decreasing sensitivity to the costs of cognitive effort.
The opportunity costs of time may also be signaled by striatal dopamine, biasing disengagement from control-demanding tasks in rich environments.
A hierarchical, cortico-striatal architecture for action selection implies spatial heterogeneity in the kinds of cost-benefit tradeoffs mediated by dopamine signaling in distinct striatal sub-regions.
Acknowledgements:
This research was supported by NIH Grant F32MH115600-01A1 to A.W., NIMH Grants R01MH080066 and P50MH119467 to M.J.F., and a VICI grant from the NWO (Grant No. 453-14-005) and an Ammodo KNAW Award (2017) to R.C.
References:
- 1.Abi-Dargham A et al. (2000) Increased baseline occupancy of D2 receptors by dopamine in schizophrenia. Proc Nat Acad Sci 97, 8104–8109
- 2.Frank MJ et al. (2004) By Carrot or by Stick: Cognitive Reinforcement Learning in Parkinsonism. Science 306, 1940–1943
- 3.Volkow ND et al. (2010) Motivation deficit in ADHD is associated with dysfunction of the dopamine reward pathway. Mol Psych 16, 1147–1154
- 4.Sawaguchi T and Goldman-Rakic P (1991) D1 dopamine receptors in prefrontal cortex: involvement in working memory. Science 251, 947–950
- 5.Durstewitz D and Seamans JK (2008) The Dual-State Theory of Prefrontal Cortex Dopamine Function with Relevance to Catechol-O-Methyltransferase Genotypes and Schizophrenia. Biol Psychiat 64, 739–749
- 6.Arnsten AFT (2011) Catecholamine Influences on Dorsolateral Prefrontal Cortical Networks. Biol Psychiat 69, e89–e99
- 7.Ott T and Nieder A (2019) Dopamine and Cognitive Control in Prefrontal Cortex. Trends in Cog Sci 23, 213–234
- 8.Frank MJ and O’Reilly RC (2006) A mechanistic account of striatal dopamine function in human cognition: Psychopharmacological studies with cabergoline and haloperidol. Behav Neurosci 120, 497–517
- 9.Moustafa AA et al. (2008) A dopaminergic basis for working memory, learning and attentional shifting in Parkinsonism. Neuropsychologia 46, 3144–3156
- 10.Clatworthy PL et al. (2009) Dopamine Release in Dissociable Striatal Subregions Predicts the Different Effects of Oral Methylphenidate on Reversal Learning and Spatial Working Memory. J Neurosci 29, 4690–4696
- 11.Cools R (2011) Dopaminergic control of the striatum for high-level cognition. Curr Opin Neurobiol 21, 402–407
- 12.Frank MJ (2011) Computational models of motivated action selection in corticostriatal circuits. Curr Opin Neurobiol 21, 381–386
- 13.Schouwenburg MR van et al. (2013) Anatomical connection strength predicts dopaminergic drug effects on fronto-striatal function. Psychopharmacol 227, 521–531
- 14.Frank MJ and Fossella JA (2010) Neurogenetics and Pharmacology of Learning, Motivation, and Cognition. Neuropsychopharmacol 36, 133–152
- 15.Kurniawan IT et al. (2011) Dopamine and Effort-Based Decision Making. Front Neurosci 5, 81.
- 16.Botvinick M and Braver T (2015) Motivation and Cognitive Control: From Behavior to Neural Mechanism. Ann Rev Psych 66, 83–113
- 17.Cools R (2016) The costs and benefits of brain dopamine for cognitive control. Wiley Interdiscip Rev Cognitive Sci 7, 317–329
- 18.Westbrook A and Braver TS (2016) Dopamine Does Double Duty in Motivating Cognitive Effort. Neuron 89, 695–710
- 19.Manohar SG et al. (2015) Reward Pays the Cost of Noise Reduction in Motor and Cognitive Control. Curr Biol 25, 1707–1716
- 20.Salamone JD et al. (2016) The pharmacology of effort-related choice behavior: Dopamine, depression, and individual differences. Behav Proc 127, 3–17
- 21.Hamid AA et al. (2016) Mesolimbic dopamine signals the value of work. Nat Neurosci 19, 117–126
- 22.Mohebi A et al. (2019) Dissociable dopamine dynamics for learning and motivation. Nature 570, 65–70
- 23.Zénon A et al. (2016) Dopamine Manipulation Affects Response Vigor Independently of Opportunity Cost. J Neurosci 36, 9516–9525
- 24.McGuire JT and Botvinick MM (2010) Prefrontal cortex, cognitive control, and the registration of decision costs. Proc Nat Acad Sci 107, 7922–7926
- 25.Kool W et al. (2010) Decision making and the avoidance of cognitive demand. J Exp Psych: Gen 139, 665–682
- 26.Westbrook A et al. (2013) What Is the Subjective Cost of Cognitive Effort? Load, Trait, and Aging Effects Revealed by Economic Preference. PLoS ONE 8, e68210–8
- 27.Massar SAA et al. (2015) Separate and overlapping brain areas encode subjective value during delay and effort discounting. NeuroImage 120, 104–113
- 28.Shenhav A et al. (2013) The Expected Value of Control: An Integrative Theory of Anterior Cingulate Cortex Function. Neuron 79, 217–240
- 29.Vassena E et al. (2014) Overlapping Neural Systems Represent Cognitive Effort and Reward Anticipation. PLoS ONE 9, e91008–9
- 30.Apps MAJ et al. (2015) The role of cognitive effort in subjective reward devaluation and risky decision-making. Sci Rep 5, 16880.
- 31.Chong TT-J et al. (2017) Neurocomputational mechanisms underlying subjective valuation of effort costs. PLoS Biol 15, e1002598.
- 32.Sayalı C and Badre D (2019) Neural systems of cognitive demand avoidance. Neuropsychologia 123, 41–54
- 33.Sidarus N et al. (2019) Cost-benefit trade-offs in decision-making and learning. PLoS Comput Biol 15, e1007326.
- 34.Miller EK and Cohen JD (2001) An Integrative Theory of Prefrontal Cortex Function. Ann Rev Neurosci 24, 167–202
- 35.Braver TS et al. (1999) Cognition and control in schizophrenia: a computational model of dopamine and prefrontal function. Biol Psychiat 46, 312–328
- 36.Collins AGE and Frank MJ (2013) Cognitive control over learning: Creating, clustering, and generalizing task-set structure. Psych Rev 120, 190–229
- 37.Aarts E et al. (2011) Striatal Dopamine and the Interface between Motivation and Cognition. Front Psych 2, 163.
- 38.Badre D and Nee DE (2018) Frontal Cortex and the Hierarchical Control of Behavior. Trends in Cog Sci 22, 170–188
- 39.Cools R and D’Esposito M (2011) Inverted-U-Shaped Dopamine Actions on Human Working Memory and Cognitive Control. Biol Psychiat 69, e113–e125
- 40.van der Schaaf ME et al. (2012) Establishing the Dopamine Dependency of Human Striatal Signals During Reward and Punishment Reversal Learning. Cereb Cortex 24, 633–642
- 41.Cools R et al. (2001) Enhanced or Impaired Cognitive Function in Parkinson’s Disease as a Function of Dopaminergic Medication and Task Demands. Cereb Cortex 11, 1136–1143
- 42.Fallon SJ et al. (2017) The Neurocognitive Cost of Enhancing Cognition with Methylphenidate: Improved Distractor Resistance but Impaired Updating. J Cog Neurosci 29, 652–663
- 43.Samanez-Larkin GR et al. (2013) A Thalamocorticostriatal Dopamine Network for Psychostimulant-Enhanced Human Cognitive Flexibility. Biol Psychiat 74, 99–105
- 44.Botvinick MM et al. (2001) Conflict Monitoring and Cognitive Control. Psychol Rev 108, 624–652
- 45.Holroyd CB and Coles MGH (2002) The neural basis of human error processing: Reinforcement learning, dopamine, and the error-related negativity. Psychol Rev 109, 679–709
- 46.Inzlicht M et al. (2018) The Effort Paradox: Effort Is Both Costly and Valued. Trends in Cog Sci 22, 337–349
- 47.Frömer R et al. (2020) Expectations of reward and efficacy guide cognitive control allocation. Nat Comm 12, 1–11.
- 48.Shenhav A et al. (2017) Toward a Rational and Mechanistic Account of Mental Effort. Ann Rev Neurosci 40, 99–124
- 49.Kurzban R et al. (2013) An opportunity cost model of subjective effort and task performance. Behav Brain Sci 36, 661–679
- 50.Holroyd CB (2016) The Waste Disposal Problem of Effortful Control. In Motivation and Cognitive Control pp. 235–260, Psychology Press
- 51.Zénon A et al. (2018) An information-theoretic perspective on the costs of cognition. Neuropsychologia 123, 5–18
- 52.Boureau Y-L et al. (2015) Deciding How to Decide: Self-Control and Meta-Decision Making. Trends Cog Sci 19, 700–710
- 53.Musslick S et al. (2018) Constraints associated with cognitive control and the stability-flexibility dilemma, in Proceedings of the 40th Annual Meeting of the Cognitive Science Society, pp. 1–6
- 54.Sandra DA and Otto AR (2017) Cognitive capacity limitations and Need for Cognition differentially predict reward-induced cognitive effort expenditure. Cognition 172, 101–106
- 55.Otto AR and Daw ND (2019) The opportunity cost of time modulates cognitive effort. Neuropsychologia 123, 92–105
- 56.Papadopetraki D et al. (2019) Quantifying the cost of cognitive stability and flexibility. bioRxiv 4, 743120
- 57.Schouppe N et al. (2014) Context-specific control and context selection in conflict tasks. Acta Psychologica 146, 63–66
- 58.Froböse MI et al. (2018) Catecholaminergic modulation of the avoidance of cognitive control. J Exp Psychol: Gen 147, 1763–1781
- 59.Froböse MI et al. (2020) Catecholaminergic modulation of the cost of cognitive control in healthy older adults. PLoS ONE 15, e0229294–26
- 60.Collins AGE et al. (2017) Interactions Among Working Memory, Reinforcement Learning, and Effort in Value-Based Choice: A New Paradigm and Selective Deficits in Schizophrenia. Biol Psychiat 82, 431–439
- 61.Botvinick MM et al. (2009) Effort discounting in human nucleus accumbens. Cog Affect Behav Neurosci 9, 16–27
- 62.Dreisbach G and Fischer R (2012) Conflicts as aversive signals. Brain and Cognition 78, 94–98
- 63.Cavanagh JF et al. (2014) Conflict acts as an implicit cost in reinforcement learning. Nat Comm 5, 5394.
- 64.Chiew KS and Braver TS (2016) Reward favors the prepared: Incentive and task-informative cues interact to enhance attentional control. J Exp Psychol: Human Percep Perf 42, 52–66
- 65.Bloemendaal M et al. (2018) Neuro-Cognitive Effects of Acute Tyrosine Administration on Reactive and Proactive Response Inhibition in Healthy Older Adults. eNeuro 5, ENEURO.0035-17.2018
- 66.Bahlmann J et al. (2015) Influence of Motivation on Control Hierarchy in the Human Frontal Cortex. J Neurosci 35, 3207–3217
- 67.Dreisbach G et al. (2005) Dopamine and Cognitive Control: The Influence of Spontaneous Eyeblink Rate and Dopamine Gene Polymorphisms on Perseveration and Distractibility. Behav Neurosci 119, 483–490
- 68.Aarts E et al. (2015) Reward modulation of cognitive function in adult attention-deficit/hyperactivity disorder. Behav Pharmacol 26, 227–240
- 69.Musslick S et al. (2018) Estimating the costs of cognitive control from task performance: theoretical validation and potential pitfalls, in Proceedings of the 40th Annual Meeting of the Cognitive Science Society, pp. 1–7
- 70.Westbrook A et al. (2020) Dopamine promotes cognitive effort by biasing the benefits versus costs of cognitive work. Science 367, 1362–1366
- 71.Collins AGE and Frank MJ (2014) Opponent actor learning (OpAL): Modeling interactive effects of striatal dopamine on reinforcement learning and choice incentive. Psychol Rev 121, 337–366
- 72.Bang D et al. (2020) Sub-second Dopamine and Serotonin Signaling in Human Striatum during Perceptual Decision-Making. Neuron 108, 999–1010.e6
- 73.McGuigan S et al. (2019) Dopamine restores cognitive motivation in Parkinson’s disease. Brain 142, 719–732
- 74.Jahn CI et al. (2018) Dual contributions of noradrenaline to behavioural flexibility and motivation. Psychopharmacol 235, 2687–2702
- 75.Borderies N et al. (2020) Pharmacological evidence for the implication of noradrenaline in effort. PLoS Biol 18, e3000793–26
- 76.Cocker PJ et al. (2012) Sensitivity to Cognitive Effort Mediates Psychostimulant Effects on a Novel Rodent Cost/Benefit Decision-Making Task. Neuropsychopharmacol 37, 1825–1837
- 77.Hosking JG et al. (2014) Dopamine Antagonism Decreases Willingness to Expend Physical, But Not Cognitive, Effort: A Comparison of Two Rodent Cost/Benefit Decision-Making Tasks. Neuropsychopharmacol 40, 1005–1015
- 78.Michely J et al. (2020) The role of dopamine in dynamic effort-reward integration. Neuropsychopharmacol 45, 1448–1453
- 79.Le Bouc R et al. (2016) Computational Dissection of Dopamine Motor and Motivational Functions in Humans. J Neurosci 36, 6623–6633
- 80.Tai L-H et al. (2012) Transient stimulation of distinct subpopulations of striatal neurons mimics changes in action value. Nat Neurosci 15, 1281–1289
- 81.Schelp SA et al. (2017) A transient dopamine signal encodes subjective value and causally influences demand in an economic context. Proc Nat Acad Sci 114, E11303–E11312
- 82.Shen W et al. (2008) Dichotomous Dopaminergic Control of Striatal Synaptic Plasticity. Science 321, 848–851
- 83.Cox J and Witten IB (2019) Striatal circuits for reward learning and decision-making. Nat Rev Neurosci 20, 482–494
- 84.Montague P et al. (1996) A framework for mesencephalic dopamine systems based on predictive Hebbian learning. J Neurosci 16, 1936–1947
- 85.Samejima K et al. (2005) Representation of Action-Specific Reward Values in the Striatum. Science 310, 1337–1340
- 86.Zalocusky KA et al. (2016) Nucleus accumbens D2R cells signal prior outcomes and control risky decision-making. Nature 531, 642–646
- 87.Kravitz AV et al. (2012) Distinct roles for direct and indirect pathway striatal neurons in reinforcement. Nat Neurosci 15, 816–818
- 88.Walton ME and Bouret S (2019) What Is the Relationship between Dopamine and Effort? Trends in Neurosci 42, 79–91
- 89.Skvortsova V et al. (2017) A Selective Role for Dopamine in Learning to Maximize Reward But Not to Minimize Effort: Evidence from Patients with Parkinson’s Disease. J Neurosci 37, 6087–6097
- 90.Varazzani C et al. (2015) Noradrenaline and Dopamine Neurons in the Reward/Effort Trade-Off: A Direct Electrophysiological Comparison in Behaving Monkeys. J Neurosci 35, 7866–7877
- 91.Nagase AM et al. (2018) Neural Mechanisms for Adaptive Learned Avoidance of Mental Effort. J Neurosci 38, 2631–2651
- 92.Niv Y et al. (2007) Tonic dopamine: opportunity costs and the control of response vigor. Psychopharmacol 191, 507–520
- 93.Heron CL et al. (2020) Dopamine modulates dynamic decision-making during foraging. J Neurosci 40, 5273–5282
- 94.Constantino SM et al. (2016) A Neural Mechanism for the Opportunity Cost of Time. bioRxiv 16, 173443
- 95.Coddington LT and Dudman JT (2019) Learning from Action: Reconsidering Movement Signaling in Midbrain Dopamine Neuron Activity. Neuron 104, 63–77
- 96.Hofmans L et al. (2020) Methylphenidate boosts choices of mental labor over leisure depending on striatal dopamine synthesis capacity. Neuropsychopharmacol 45, 2170–2179
- 97.Saunders BT et al. (2018) Dopamine neurons create Pavlovian conditioned stimuli with circuit-defined motivational properties. Nat Neurosci 21, 1072–1083
- 98.Parker NF et al. (2016) Reward and choice encoding in terminals of midbrain dopamine neurons depends on striatal target. Nat Neurosci 19, 845–854
- 99.Hamid AA et al. (2019) Dopamine waves as a mechanism for spatiotemporal credit assignment. doi: 10.1016/j.cell.2021.03.046
- 100.Collins AL and Saunders BT (2020) Heterogeneity in striatal dopamine circuits: Form and function in dynamic reward seeking. J Neurosci Res 98, 1046–1069
- 101.Berke JD (2018) What does dopamine mean? Nat Neurosci 21, 787–793
- 102.O’Doherty J et al. (2004) Dissociable Roles of Ventral and Dorsal Striatum in Instrumental Conditioning. Science 304, 452–454
- 103.Badre D and Frank MJ (2012) Mechanisms of Hierarchical Reinforcement Learning in Cortico-Striatal Circuits 2: Evidence from fMRI. Cereb Cortex 22, 527–536
- 104.Broadway JM et al. (2018) Dopamine D2 agonist affects visuospatial working memory distractor interference depending on individual differences in baseline working memory span. Cog, Affect & Behav Neurosci 18, 509–520
- 105.Aarts E et al. (2014) Dopamine and the Cognitive Downside of a Promised Bonus. Psychol Sci 25, 1003–1009
- 106.Timmer MHM et al. (2018) Enhanced motivation of cognitive control in Parkinson’s disease. Eur J Neurosci 48, 2374–2384
- 107.Oudiette D et al. (2019) A Pavlovian account for paradoxical effects of motivation on controlling response vigour. Sci Rep 9, 7607.
- 108.Westbrook A and Frank M (2018) Dopamine and proximity in motivation and cognitive control. Curr Opin Behav Sci 22, 28–34
- 109.Suzuki S et al. (2020) Distinct regions of the striatum underlying effort, movement initiation and effort discounting. Nat Hum Behav 5, 378–388
- 110.Verguts T et al. (2015) Adaptive effort investment in cognitive and physical tasks: a neurocomputational model. Front in Behav Neurosci 9, 266–17
- 111.Franklin NT and Frank MJ (2015) A cholinergic feedback circuit to regulate striatal population uncertainty and optimize reinforcement learning. eLife 4, e12029.
- 112.Collins AL et al. (2019) Nucleus Accumbens Cholinergic Interneurons Oppose Cue-Motivated Behavior. Biol Psychiat 86, 388–396
- 113.Stalnaker TA et al. (2016) Cholinergic Interneurons Use Orbitofrontal Input to Track Beliefs about Current State. J Neurosci 36, 6242–6257
- 114.Soutschek A et al. (2018) Brain Stimulation Over the Frontopolar Cortex Enhances Motivation to Exert Effort for Reward. Biol Psychiat 84, 38–45
- 115.Strasser A et al. (2020) Glutamine-to-glutamate ratio in the nucleus accumbens predicts effort-based motivated performance in humans. Neuropsychopharmacol 45, 2048–2057
- 116.Yttri EA and Dudman JT (2016) Opponent and bidirectional control of movement velocity in the basal ganglia. Nature 533, 402–406
- 117.Manohar SG et al. (2017) Distinct Motivational Effects of Contingent and Noncontingent Rewards. Psychol Sci 28, 1016–1026
- 118.Grogan JP et al. (2020) Dopamine promotes instrumental motivation, but reduces reward-related vigour. eLife 9, e58321.
- 119.Maia TV and Frank MJ (2017) An Integrative Perspective on the Role of Dopamine in Schizophrenia. Biol Psychiat 81, 52–66
- 120.Jurado-Parras M-T et al. (2020) The Dorsal Striatum Energizes Motor Routines. Curr Biol 30, 4362–4372.e6
- 121.Sharpe MJ et al. (2017) Dopamine transients are sufficient and necessary for acquisition of model-based associations. Nat Neurosci 20, 735–742
- 122.Kool W et al. (2018) Planning Complexity Registers as a Cost in Metacontrol. J Cog Neurosci 30, 1391–1404
- 123.Deserno L et al. (2015) Ventral striatal dopamine reflects behavioral and neural signatures of model-based control during sequential decision making. Proc Nat Acad Sci 112, 1595–1600
- 124.Kroemer NB et al. (2019) L-DOPA reduces model-free control of behavior by attenuating the transfer of value to action. NeuroImage 186, 113–125
- 125.Holroyd CB and McClure SM (2015) Hierarchical control over effortful behavior by rodent medial frontal cortex: A computational model. Psychol Rev 122, 54–83
- 126.Babayan BM et al. (2018) Belief state representation in the dopamine system. Nat Comm 9, 1891.
- 127.Guru A et al. (2020) Ramping activity in midbrain dopamine neurons signifies the use of a cognitive map. bioRxiv 542, 2020.05.21.108886
- 128.Gershman SJ (2014) Dopamine Ramps Are a Consequence of Reward Prediction Errors. Neur Comp 26, 467–471
- 129.Botvinick MM et al. (2009) Hierarchically organized behavior and its neural foundations: A reinforcement learning perspective. Cognition 113, 262–280
- 130.Nicola SM (2010) The Flexible Approach Hypothesis: Unification of Effort and Cue-Responding Hypotheses for the Role of Nucleus Accumbens Dopamine in the Activation of Reward-Seeking Behavior. J Neurosci 30, 16585–16600