Abstract
Motivated behavior exhibits properties that change with experience and partially dissociate among a number of brain structures. Here, we review evidence from rodent experiments demonstrating that multiple brain systems acquire information in parallel and either cooperate or compete for behavioral control. We propose a conceptual model of systems interaction wherein a ventral emotional memory network involving ventral striatum (VS), amygdala, ventral hippocampus, and ventromedial prefrontal cortex triages behavioral responding to stimuli according to their associated affective outcomes. This system engages autonomic and postural responding (avoiding, ignoring, approaching) in accordance with associated stimulus valence (negative, neutral, positive), but does not engage particular operant responses. Rather, this emotional system suppresses or invigorates actions that are selected through competition between goal-directed control involving dorsomedial striatum (DMS) and habitual control involving dorsolateral striatum (DLS). The hippocampus provides contextual specificity to the emotional system, and provides an information-rich input to the goal-directed system for navigation and discriminations involving ambiguous contexts, complex sensory configurations, or temporal ordering. The rapid acquisition and high capacity for episodic associations in the emotional system may unburden the more complex goal-directed system and reduce interference in the habit system from processing contingencies of neutral stimuli. Interactions among these systems likely involve inhibitory mechanisms and neuromodulation in the striatum to form a dominant response strategy. Innate traits, training methods, and task demands contribute to the nature of these interactions, which can include incidental learning in non-dominant systems. Addition of these features to reinforcement learning models of decision-making may better align theoretical predictions with behavioral and neural correlates in animals.
Keywords: amygdala, dopamine, emotion, hippocampus, inhibition, Pavlovian-instrumental transfer, reinforcement learning, striatum
Introduction
Natural environments pose numerous challenges to animals seeking to survive and reproduce. Advantage is gained by adapting behavior so as to exploit new opportunities and avoid hazards. The study of these adaptations has enjoyed a rich and active history. Pioneering animal learning psychologists of the mid-twentieth century were divided among those who viewed behavior as the learning of stimulus–response habit associations driven by reinforcement (Thorndike, 1911; Hull, 1943) and those who postulated that animals used internal representations of environmental contingencies in order to select actions achieving desirable goals (Tolman, 1932). Habitual responses can be generated quickly and accurately with simple learning schemes, but are slow to change in the face of changing environmental contingencies between antecedents (e.g., stimuli, events, actions) and outcomes. Conversely, goal-oriented responses can adapt quickly, but involve more complex learning and control schemes explicitly encoding goal values and contingency-dependent strategies. It is now generally accepted that multiple forms of learning, including both habit and goal-oriented systems, are distributed among multiple brain structures and interact so as to control actions in rodents and primates (McClelland et al., 1995; Balleine and Dickinson, 1998; Wise and Murray, 2000; Cardinal et al., 2002a; White and McDonald, 2002; Doya, 2008).
Here, we review key literature regarding the behavioral significance of processing in and among rodent frontal cortex, striatum, amygdala, hippocampus, hypothalamus, and brainstem modulatory systems. These structures form segregated circuits mediating (i) habits involving dorsolateral striatum (DLS) and (ii) goal-directed control involving dorsomedial striatum (DMS), and they also form (iii) an emotional memory system involving the ventral striatum (VS) and its limbic inputs that exerts an important influence on behavior (Figure 1). This emotional system engages postures, attention, and autonomic responses rather than selecting specific actions, but these are nonetheless important for engaging and invigorating operant responses as well as influencing behavioral flexibility. We propose that this emotional system serves to contextually gate responses to stimuli based on associated valence, and furthermore that gating out neutral stimuli and suppressing unrewarding responses are important features of behavioral control. When a stimulus passes the triage threshold of this system, operant responding is then determined by a competition between a habit system and a goal-directed system that is sensitive to specific outcomes and complex task demands. The ventral emotional triage system has a high capacity, forms memories rapidly, and forgets associations slowly, such that stimuli associated with rewards engage attention and responding, which may allow the slower-learning goal and habit systems to solve tasks efficiently. In an attempt to synthesize a coherent framework cutting across the large and complex rodent literature on these brain systems, we first review key evidence for the functional significance of individual structures before reviewing evidence of their interactions.
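To make the proposed triage scheme concrete, the sketch below caricatures it in a few lines of Python. This is only an illustrative sketch of the conceptual model, not an implementation from the reviewed literature; the valence threshold, the value tables, and the fixed arbitration weight are assumptions introduced here for clarity.

```python
# Illustrative sketch (assumptions, not from the reviewed literature) of the proposed
# triage: an emotional system gates stimuli by valence before habit and goal-directed
# systems compete to select the operant response.

# Hypothetical learned associations; all numbers are arbitrary placeholders.
emotional_valence = {"tone_A": 0.8, "tone_B": 0.0, "light_C": -0.6}   # S-O affective values
habit_values      = {"press_lever": 0.7, "pull_chain": 0.2}           # cached S-R strengths
goal_values       = {"press_lever": 0.4, "pull_chain": 0.9}           # model-based R-O estimates


def triage_and_respond(stimulus, valence_threshold=0.1, habit_weight=0.5):
    """Gate the stimulus by valence, then arbitrate between habit and goal systems."""
    valence = emotional_valence.get(stimulus, 0.0)

    # Emotional system: postural/autonomic triage, no specific operant response.
    if valence < -valence_threshold:
        return ("avoid", None, abs(valence))      # withdrawal, freezing, etc.
    if valence < valence_threshold:
        return ("ignore", None, 0.0)              # neutral stimuli are gated out

    # Positive valence: approach, and let the dorsal striatal systems pick the action.
    blended = {
        action: habit_weight * habit_values[action]
        + (1.0 - habit_weight) * goal_values[action]
        for action in habit_values
    }
    action = max(blended, key=blended.get)
    vigor = valence * blended[action]             # Pavlovian value invigorates responding
    return ("approach", action, vigor)


for s in ["tone_A", "tone_B", "light_C"]:
    print(s, triage_and_respond(s))
```

The key design point captured here is that the emotional system never chooses among operant responses; it only determines whether responding is engaged at all and how vigorously, while the habit and goal-directed values compete to select the specific action.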
Amygdala: emotional learning and memory system
Research on the functional significance of the amygdala has consistently implicated this brain structure in emotional learning and memory (Kluver and Bucy, 1939; Weiskrantz, 1956; Goddard, 1964; Bagshaw and Benzies, 1968). Several studies in humans have demonstrated a reliable correlation between amygdala activation and emotionally valenced stimuli (Breiter et al., 1996; Reiman et al., 1997), which can even occur without subjects' awareness (Whalen et al., 1998). Experiments with animals have likewise revealed an important role for the amygdala in rapidly forming associations between environmental stimuli and positive or negative affective events in a process called classical (or Pavlovian) conditioning (Pavlov, 1927). The representations formed by these associations are essential for many behaviors, as they can be used to initiate general approach or avoidance behaviors as well as modulate overt responses via interactions with instrumental memory systems (Davis et al., 1982; Cador et al., 1989; Everitt et al., 1989; Hiroi and White, 1991b).
The critical involvement of the amygdala in emotional learning and memory processes is supported by the unique and extensive reciprocal connectivity of this structure to an interesting array of cortical and subcortical targets (White and McDonald, 2002; Ledoux, 2007). Briefly, the amygdala receives extensive sensory input from the thalamus or cortex for all of the major sensory modalities. This feature of amygdala connectivity suggests that it has online access to multimodal information about the external environment. The amygdala also has extensive reciprocal connections with portions of the hypothalamus, brainstem, hippocampus, and VS (Petrovich et al., 2001). These brain areas control many autonomic functions like heart rate, respiration, hormone release, and neurotransmitter release that occur during both negative and positive experiences (Kapp et al., 1990; Davis, 1992b; White and McDonald, 2002; Everitt et al., 2003). This feature of amygdala anatomy provides the system with information about negative and positive affective states, as well as a pathway to influence physiological responses accordingly. Mechanisms of synaptic plasticity in the amygdala allow for rapid association among convergent sensory and affective information for later use (Maren and Quirk, 2004). Accordingly, neurons in the amygdala respond to sensory stimuli from various modalities that are paired with positive or negative states, but these responses rapidly habituate if the stimuli are not paired with biologically significant events (Ben-Ari and Le Gal La Salle, 1974; Schoenbaum et al., 1998).
The amygdala is anatomically divided into multiple nuclei with differing connectivity and functional specialization. The basolateral nucleus is defined by extensive reciprocal connectivity with the thalamus, cortex, and VS (Veening, 1978; Simon et al., 1979). Numerous research groups have provided evidence implicating the basolateral nucleus in the acquisition, storage, and retrieval of stimulus-reward associations (Schwartzbaum, 1960; Spiegler and Mishkin, 1981; Cador et al., 1989; Everitt et al., 1991; Kentridge et al., 1991; McDonald and White, 1993). An example of this kind of learning is a conditioned cue preference task developed for the 8-arm radial maze. Training on this task consists of pairing a highly palatable food (sweetened cereal) with a lit arm and the absence of food with a darkened arm. During testing in the absence of food, normal rats spend more time in the previously paired arm than the unpaired arm. Importantly, rats are never trained (instrumentally) to make any voluntary movements to obtain the food reward. Rather, they simply consume the food in the presence of a specific cue. This is a straightforward demonstration of classical conditioning between sensory events and affective outcomes: rats learn that the light signals the presence of food, and this association elicits a general approach response that maintains the subjects' contact with that stimulus. Importantly, rats with damage to the lateral or basolateral nucleus of the amygdala are impaired on the acquisition of this task (McDonald and White, 1993).
These stimulus-outcome (S-O) associations can be used as conditioned reinforcers for learning instrumental responses in the absence of primary rewards (Everitt et al., 1989). In these paradigms, initial training consists of pairing a reward (e.g., access to a sexual partner) with a punctate stimulus. Following this S-O learning, these rats can then learn to respond on a lever to activate the stimulus even if the primary reinforcement is not given. Rats given neurotoxic damage to the basolateral amygdala after S-O learning do not learn a new instrumental response using the conditioned reinforcer, indicating that this structure is needed for representing valence associated with conditioned stimuli to enable additional learning. These S-O associations can also be acquired during cued instrumental training, although they do not seem to be necessary for accurate performance of the task. For example, acquisition of tasks requiring rats to generate turning responses into lit maze arms for food rewards is sensitive to lesions of DLS rather than amygdala (McDonald and White, 1993). However, the amygdala does incidentally acquire information about the signaling value of the cue in these conditions (light = reward), which can be revealed with a preference transfer test independent of the reinforced instrumental response (McDonald et al., 2004).
The lateral, basolateral, and central amygdala have also been implicated in cued aversive classical conditioning (Bagshaw and Benzies, 1968; Kapp et al., 1979; Davis, 1986; Ledoux et al., 1990; Kim et al., 1993; Antoniadis and McDonald, 2000). An example of this conditioned fear learning comes from Kapp and colleagues (Kapp et al., 1979), who utilized a classical conditioning paradigm in rabbits in which one stimulus (tone) predicted an aversive eye-shock and another stimulus (light) did not. This is another task in which animals were never trained to make voluntary movements during conditioning. They experienced the aversive event in the presence of particular cues and rapidly (within a few repetitions) learned that the tone signaled the aversive eye-shock. This association led to an internal fear state that elicited general avoidance and involuntary effects such as lowering of heart rate (bradycardia). After sufficient training, the rabbits showed conditioned bradycardia to the tone and little or no change in heart rate when the light was presented. Lesions to the central (Kapp et al., 1979) or basolateral (Ledoux et al., 1990) nucleus of the amygdala impair this form of classical conditioning. Furthermore, neural activity in the central amygdala is elevated in the presence of the shock-paired stimulus but not the unpaired stimulus (Kapp et al., 1979), consistent with proposals that the amygdala signals the affective significance of stimuli associated with aversive as well as appetitive outcomes (Morrison and Salzman, 2010). The central nucleus of the amygdala is defined by extensive reciprocal connections with the brain stem and hypothalamus that allow this system to activate defensive responses following learning (Maren, 2001; Viviani et al., 2011). Note that this system is distinct from the brain structures primarily involved in generating operant responses, such as motor cortex and DLS (McDonald and White, 1993; Hikosaka, 1998). Although numerous studies have now shown that amygdala is sufficient for rapid development of fear responses evoked by discrete cues, rapid learning of fear induced by environmental context additionally involves dorsal hippocampus (Sutherland and McDonald, 1990; Kim and Fanselow, 1992). Such dissociation of cue and context learning is important, and will be discussed in more detail later.
The central nucleus of the amygdala has also been implicated in innate postural responses supporting appetitive learning. Holland and Gallagher (Gallagher et al., 1990) utilized a set of unconditioned responses by rats to novel visual (rearing) and auditory (startle) cues; these are sometimes referred to as orienting responses. These responses are maintained if cues are associated with reinforcement, but normally habituate in the absence of reinforcement (Holland, 1977). Rats with damage to the central, but not basolateral, nucleus of the amygdala show normal unconditioned orienting responses to these cues, but do not maintain these behaviors when the cues are associated with food availability (McDannald et al., 2005). Orienting responses are also altered following damage to the central nucleus of the amygdala in a Pavlovian autoshaping procedure in which rats approach food-related stimuli (Parkinson et al., 2000a). Thus, the central nucleus is involved in controlling postural responses that keep animals in contact with behaviorally relevant stimuli. Consistent with this role in guiding behavioral focus to relevant stimuli are demonstrations that amygdala is involved in attention processes after task acquisition (Gallagher et al., 1990) and signaling surprising events (Roesch et al., 2012). Some of these effects on attention and task engagement may involve amygdala innervation of hypothalamus (Petrovich et al., 2001), an integrative structure involved in arousal, autonomic control, sleep, reproduction, food intake, and other functions (Saper, 2003). For instance, hypothalamic neurons can release arousal-promoting neurotransmitters, including histamine and hypocretin, onto cortico-limbic targets such as medial prefrontal cortex, amygdala, VS, hippocampus, and brainstem monoaminergic neurons (Parmentier et al., 2002; Takahashi et al., 2006; Haas et al., 2008; Berridge et al., 2010). Many of these target structures in turn innervate hypothalamus (Risold and Swanson, 1996; Saper, 2003; Haas et al., 2008), thus forming an interconnected network that can promote arousal and consumption (Kelley et al., 2005), or fearful responses under stressful conditions (Petrovich et al., 2001; Herman et al., 2005). The data summarized so far indicate that the amygdala rapidly associates multimodal sensory and affective signals so as to trigger freezing or orienting responses, modulate visceral function, and influence overt operant responses during instrumental learning, largely mediated via different pathways. As discussed later, amygdala input to the VS is important for learning and modulating operant responses. Thus, the associations between stimuli and affective outcomes (S-O) formed in the amygdala impact a number of brain systems involved in both early and late phases of adaptive behavioral control.
Hippocampal formation: space, context, and disambiguation
The hippocampus and related structures are thought to rapidly acquire and store relational information about spatial, contextual, and multimodal sensory elements of episodic experiences (O'Keefe and Nadel, 1978; Sutherland et al., 1989; Maren et al., 1997; Tulving and Markowitsch, 1998). The representation of this information can be recalled at a later time to activate a variety of effectors including those involved in producing internal states, general approach, avoidance, freezing, and complex navigational behaviors in animals (White and McDonald, 2002). The anatomy of the hippocampus and its neural correlates of behavior are consistent with this view (O'Keefe and Dostrovsky, 1971; McNaughton et al., 1983; Amaral and Witter, 1989; Muller, 1996; Derdikman and Moser, 2010). The hippocampus receives extensive sensory input from all cortical sensory association areas via connections with the perirhinal and entorhinal cortices (Amaral and Witter, 1995; Lavenex and Amaral, 2000). This information appears to be processed in a distributed manner throughout the septal and temporal extent of the structure such that encoding is sparse and unique for a given input configuration (Muller and Kubie, 1987; O'Keefe and Speakman, 1987). The output of this representation may serve as an index of an episode to facilitate reactivation of activity related to the experience in other brain regions (Schwindel and McNaughton, 2011).
The classic demonstration of spatial context encoding in hippocampus comes from neurons showing “place fields” in which activation increases in specific locations in an environment. Intriguingly, place fields are unique to specific environments. The population of place cells is globally remapped to orthogonal representations (sets of active neurons with few common members) when animals are moved among different testing environments (Muller and Kubie, 1987; Leutgeb et al., 2005; Jezek et al., 2011). Within the same environment, firing rates of place cells in the place field are sensitive to factors such as head direction, idiothetic movement metrics, and cues in an environment (McNaughton et al., 1983; Muller and Kubie, 1987; O'Keefe and Speakman, 1987; Leutgeb et al., 2005). These place cells are thought to be a substrate through which a representation of the topographical relationships amongst cues in an environment is formed and stored (O'Keefe and Nadel, 1978).
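The idea of "orthogonal representations" can be illustrated by comparing population activity vectors across environments. The following sketch uses made-up binary activity patterns rather than real data; it simply shows that independently drawn sparse maps overlap only at chance levels, whereas a stable map re-expressed in the same environment overlaps completely.

```python
# Illustrative comparison of hypothetical place cell population vectors (not real data).
import numpy as np

rng = np.random.default_rng(0)
n_cells = 200
p_active = 0.2  # assumed fraction of cells with a place field in any one environment

# Which cells have a place field in each environment (binary population vectors).
env_A = (rng.random(n_cells) < p_active).astype(float)
env_B = (rng.random(n_cells) < p_active).astype(float)  # independent draw ~ global remapping
env_A_again = env_A.copy()                               # stable map on re-exposure to A

def overlap(x, y):
    """Cosine similarity between two population activity vectors."""
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

print("A vs. A (same environment):", round(overlap(env_A, env_A_again), 2))   # ~1.0
print("A vs. B (different environments):", round(overlap(env_A, env_B), 2))   # ~p_active, chance level
```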
Consistent with the abundant electrophysiological data implicating hippocampus in spatial information processing, rats with damage to the hippocampus are impaired on a variety of spatial learning and memory tasks. These include the standard hidden platform version of the water maze (Morris et al., 1982; Sutherland et al., 1982), 8-arm radial maze (Olton et al., 1978; Harley, 1979), and spatial discriminations (O'Keefe et al., 1975; Rasmussen et al., 1989). In addition to navigation, hippocampal dysfunction impairs rats' ability to learn about experiences in specific spatial contexts (Sutherland and McDonald, 1990; Selden et al., 1991; Kim and Fanselow, 1992; Antoniadis and McDonald, 2000). Beyond purely spatial contexts, hippocampal damage also impairs rats' ability to discriminate based on configurations of stimuli (Rudy and Sutherland, 1989; McDonald et al., 1997).
The pattern of lesion effects reported in this classic work makes it tempting to suggest that hippocampus is critical for spatial, contextual, and relational/configural associations. However, these tasks are not sufficient to fully capture the unique representational contributions of the mammalian hippocampus to behavior. An emerging body of work now suggests that many tasks can be altered in ways that make them highly sensitive or insensitive to hippocampal dysfunction. One task feature that necessitates hippocampal involvement is high cue ambiguity. For example, rats with hippocampal damage can solve binary spatial discriminations for distal but not proximal reward zones (McDonald and White, 1995a; Gilbert et al., 1998). In contrast, normal rats can solve both problems regardless of spatial ambiguity. We have shown similar effects by varying ambiguity in versions of configural association tasks (McDonald et al., 1997) and fear conditioning to context (Frankland et al., 1998; Antoniadis and McDonald, 2000). The unique hippocampal contribution to disambiguation is sometimes referred to as pattern separation (Sutherland et al., 1989; O'Reilly and Rudy, 2001), and is consistent with the high sensitivity of hippocampal neural activity to environmental factors. Interestingly, Fanselow (Fanselow, 2000) suggested that rats must form a gestalt of a test chamber by exploring it over the course of minutes before contextual fear can be acquired. This is similar to the amount of time needed for rats to form a stable hippocampal firing field in a novel chamber (Bostock et al., 1991). Recent work with genetically modified mice has demonstrated a causal link between hippocampal context representations and contextual fear behavior by showing that selective optogenetic stimulation of dorsal hippocampal neurons that encode a fear-associated place can induce freezing responses when animals are in a benign environment (Liu et al., 2012). Thus, hippocampal activity patterns are sufficient for mice to engage behaviors associated with the encoded place or episode.
In addition to ambiguity, a second task factor that appears to necessitate hippocampal involvement is temporal ordering among events. Hippocampal damage impairs the ability of rats to use previously learned sequential ordering of odor cues to make discriminations, but spares recognition of the same odors (Fortin et al., 2002). The hippocampus is also involved in tasks with delays between events. This has been shown in non-match-to-sample tasks in monkeys, wherein hippocampal damage impairs responding when delays are introduced between sample and match phases (Mishkin and Manning, 1978). A similar interaction between lesion and delay has been found for dorsal (but not ventral) hippocampal lesions in rats on a spatial delayed alternation task (Hock and Bunsey, 1998). Delays also recruit hippocampal involvement in non-spatial tasks such as trace conditioning paradigms, in which a temporal gap separates the conditioned stimulus from the unconditioned stimulus that drives the eyeblink response in rabbits (Kim et al., 1995) or the freezing response in rats (McEchron et al., 1998). Neural encoding in dorsal hippocampus contains information about temporal order. For instance, place cells in this region activate in sequence as rats pass through their respective place fields while navigating an environment during a task. These neurons briefly reactivate in the same sequence when the animal is resting (Skaggs and McNaughton, 1996). Intriguingly, sequences coding for potential future paths are briefly generated when rats approach a choice point in a spatial task (Wood et al., 2000; Shapiro et al., 2006; Johnson and Redish, 2007; Ferbinteanu et al., 2011), suggesting that the hippocampus may be sending out a predictive signal based on past episodes of spatial trajectories. Indeed, similar reactivations have also been detected in medial prefrontal cortex (Euston et al., 2007; Peyrache et al., 2009) and VS (Lansink et al., 2009; Van Der Meer and Redish, 2009), two structures that receive prominent hippocampal input and synchronize with hippocampus (Goto and O'Donnell, 2001; Jones and Wilson, 2005; Cenquizca and Swanson, 2007; Gruber et al., 2009a; Lansink et al., 2009; Benchenane et al., 2010; Hyman et al., 2010). In addition to representing sequential order through sequential firing patterns of different neurons, temporal information of events is also encoded by the timing of action potential firing of individual neurons in the hippocampus with respect to the phase of prevalent theta-frequency oscillations of field potentials in this structure (Buzsaki, 2005; Hasselmo and Eichenbaum, 2005). These features allow for compression of temporal information into single theta cycles, and indicate that the output of the hippocampus is rich in temporal (spike phase and ordering) as well as non-temporal (which neurons activate) information.
The output projections of the hippocampus vary along the septo-temporal axis (Cenquizca and Swanson, 2007), so it is unsurprising that some functions appear to vary along this axis as well (Fanselow and Dong, 2010; Bast, 2011). The dorsal (septal) hippocampus primarily projects to extrahippocampal structures in the temporal lobe such as subiculum and entorhinal cortex that in turn project to most neocortical areas (Lavenex and Amaral, 2000; Cenquizca and Swanson, 2007). A dominant dorsal hippocampal/neocortical projection is to both the anterior and posterior cingulate cortices. The projection to the posterior cingulate cortex is of particular interest because this brain area also receives strong projections from posterior parietal cortex, which has been implicated in online visual guidance of behavior (Sutherland et al., 1988). It is possible that the posterior cingulate cortex allows animals to use spatial memories to navigate via interactions between hippocampus and neocortical regions like the posterior parietal cortex. The intermediate and ventral (temporal) portions of the hippocampus project to these structures as well as to ventral medial prefrontal cortex, VS, and amygdala (Voorn et al., 2004). Although less is known about the neural signaling in these more ventral portions, they are important for translating hippocampal information into actions (Bast et al., 2009).
Both the dorsal and intermediate regions of the hippocampus are thought to be necessary for accurate spatial navigation in the water task (Moser et al., 1993; Ferbinteanu et al., 2003; Bast et al., 2009), while the ventral pole (i.e., the most ventral third of the hippocampus) may not be required. The bulk of current evidence suggests that the dorsal region is more efficient in encoding spatial information and is necessary for spatial navigation, but recent work suggests that the intermediate zone is critical for translating spatial information into action, particularly in paradigms requiring rapid learning (Bast et al., 2009). On the other hand, damage to the ventral hippocampus produces behavioral impairments in non-spatial tasks that, in some cases, resemble those produced by damage to its forebrain targets. One example of function that varies along the septo-temporal axis and impacts forebrain targets is fear conditioning in rats. Whereas the dorsal hippocampus is involved in fear conditioning to context, the ventral hippocampus appears to be involved in fear conditioning to both context and explicit cues such as tones (Maren, 1999; Bast et al., 2001; Zhang et al., 2001). Fear conditioning can be acquired without an intact hippocampus following repeated training (Wiltgen et al., 2006; Sparks et al., 2011) through a mechanism thought to involve the amygdala (Biedenkapp and Rudy, 2009). Thus, hippocampal output to other structures needed for fear conditioning, particularly the amygdala, appears to support rapid associative learning in at least some non-spatial domains. Another behavior exemplifying functional overlap between ventral hippocampus and target regions is prepulse inhibition of the startle reflex. This is a sensorimotor process in which an acoustic startle reflex is reduced when startling stimuli are preceded by a weak prepulse stimulus, and is impaired by manipulations of ventral hippocampus, VS, or basolateral amygdala among other limbic and brainstem structures (Wan et al., 1996; Wan and Swerdlow, 1996; Koch and Schnitzler, 1997; Wan and Swerdlow, 1997). Although the ventral hippocampus does not directly mediate prepulse inhibition, it is able to modulate this phenomenon (Koch and Schnitzler, 1997; Bast and Feldon, 2003).
Ventral hippocampus and its associated medial temporal lobe structures also functionally interact with VS in learning phenomena like latent inhibition and conditioned inhibition. These are similar types of learning wherein rats rapidly cease orienting toward, or responding to, stimuli that have never been associated with reinforcement (Lubow, 1989), and both appear to be context-specific (Honey and Hall, 1989; McDonald et al., 2001). Latent inhibition is a phenomenon whereby non-reinforced stimulus pre-exposure leads to retardation of the development of subsequent conditioned responses when the stimulus is later paired with reinforcement. Latent inhibition is not disrupted by selective lesions of hippocampus per se (Honey and Good, 1993; Reilly et al., 1993), but is disrupted by damage to its nearby cortical target, the entorhinal cortex, or VS (Weiner et al., 1996; Coutureau et al., 1999). However, neurotoxic lesions or temporary inactivation of ventral hippocampus do disrupt the usual context-specificity of latent inhibition (Honey and Good, 1993; Maren and Holt, 2000). In conditioned inhibition, rats are trained to discriminate between a reinforced cue and a non-reinforced cue, resulting in accrual of excitatory conditioning to the reinforced stimulus and conditioned inhibition to the non-reinforced stimulus. The contextual specificity of conditioned inhibition is disrupted by damage to ventral hippocampus (McDonald et al., 2001, 2006). Thus, hippocampus modulates the suppression of responding to irrelevant cues in a context-dependent manner.
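The associative logic of conditioned inhibition described above can be illustrated with a minimal Rescorla-Wagner simulation. This is a generic textbook model offered only as an illustration, not a model used in the cited studies; the learning rate, trial schedule, and treatment of the context as a third cue are assumptions. Because the context is reinforced on CS+ trials, the non-reinforced cue is driven below zero, which is the formal sense in which it becomes a conditioned inhibitor.

```python
# Minimal Rescorla-Wagner sketch of conditioned inhibition (illustration only, not a
# model from the cited studies): a reinforced cue A and a non-reinforced cue B are
# trained in the same context; B acquires negative associative strength.

alpha = 0.1   # assumed learning rate
V = {"context": 0.0, "A": 0.0, "B": 0.0}

def rw_trial(cues, reinforced):
    """One Rescorla-Wagner update: all cues present on the trial share the prediction error."""
    prediction = sum(V[c] for c in cues)
    error = (1.0 if reinforced else 0.0) - prediction
    for c in cues:
        V[c] += alpha * error

for _ in range(200):
    rw_trial(["context", "A"], reinforced=True)    # A+ trials
    rw_trial(["context", "B"], reinforced=False)   # B- trials in the same context

print({k: round(v, 2) for k, v in V.items()})
# With this schedule the values settle near A ~ 0.67, context ~ 0.33, B ~ -0.33:
# excitation accrues to A, and B becomes a conditioned inhibitor.
```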
The data reviewed in this section indicate that the hippocampus is involved in the rapid formation (Wiltgen et al., 2006; Bast et al., 2009) and recall of associations among places, contexts, and sensory configurations, including their temporal elements. This processing affects many types of behaviors including spatial navigation, operant responding for rewards, fearful responses, and various forms of response inhibition. These features support the conclusion by Wise and Murray (Wise and Murray, 2000) that the primate hippocampus is a critical part of a network with frontal cortex and the basal ganglia that is required for learning to generate arbitrary and flexible associations between antecedents and outcomes. This is likely true for rodents as well. Hirsh (Hirsh, 1974) noted that actions in rodents appeared to be a matter of habit in the absence of the hippocampus. This structure seems to be particularly critical when outcomes are delayed or context is ambiguous. We later discuss conditions in which this structure cooperates or competes with other brain structures for contextual control of habitual and adaptive behavior.
Striatum: a nexus among limbic structures and parallel circuits linking cortex and basal ganglia
Two influential anatomical reviews paved the way for modern conceptualizations of functional heterogeneity in the striatum. Alexander et al. (1986) proposed five parallel circuits in the monkey connecting different portions of the cortex, striatum, pallidum, substantia nigra, and thalamus in partially closed basal ganglia-thalamocortical loops. These parallel circuits included a motor circuit, multiple prefrontal circuits, and multiple limbic circuits that were centered on the dorsal striatum and VS. Groenewegen et al. (1990) identified similar circuits in the rat brain, and an important review by McGeorge and Faull (1989) pointed to clear anatomical distinctions between the DLS and DMS in the rat, with the former receiving extensive convergent projections from motor and sensory cortices and the latter defined by connectivity with prefrontal and limbic projections from hippocampus and amygdala. Subsequent anatomical data have indicated that these circuits are not independent from one another (Joel and Weiner, 1994). For instance, cortico-striatal projections can innervate large volumes of striatum that cross functional territories, providing a mechanism for cross-talk among circuits (Levesque and Parent, 1998; Zheng and Wilson, 2002; Hoover and Vertes, 2011). Furthermore, Haber et al. (2000) have suggested that the overarching organization of these circuits forms a spiral wherein the limbic circuit affects the cognitive and motor circuits. Although it is convenient to identify discrete regions of striatum, cytoarchitectural composition and afferent innervation vary according to a dorsolateral-to-ventromedial gradient within the striatum, rather than forming discrete boundaries between these regions (Voorn et al., 2004).
Here, we consider three functional territories in our analysis of rodent striatal function: (1) a dorsolateral “motor” sector involved in skilled movements and habits, (2) a dorsomedial “cognitive” sector involved in allocentric navigation and flexible responding for strategic acquisition of goals, and (3) a ventral “limbic” sector incorporating the core and shell of the nucleus accumbens in the VS that are involved in approach behaviors, arousal, extinction, and response vigor. We first briefly review key literature on the function of each sector before turning to their interaction with each other and other cortico-limbic structures.
Dorsolateral striatum (DLS): stimulus-response habits
The rodent DLS appears to primarily bring reinforcement-related operant movements under specific stimulus and temporal control as a result of repeated reinforced stimulus-response (S-R) experiences, which can eventually form habits (Devan et al., 2011). These movements are usually more complex than postural or orienting responses mediated by brainstem structures (Whishaw et al., 2004). Habit formation can be mediated by associative conditioning specific to the reinforced cue (McDonald et al., 2001). This is supported by neural recordings in rat DLS showing responses selective for task-related cues and sensory-motor processing (Gardiner and Kitai, 1992; White and Rebec, 1993).
Consistent with this view, rats with neurotoxic lesions of the DLS are impaired in various types of simple discrimination tasks using a variety of cues and reinforced responses (Packard et al., 1989; Reading et al., 1991; McDonald and White, 1993; McDonald and Hong, 2004). In such studies, instrumental responses (e.g., lever press or egocentric turns) must be generated in response to a stimulus to receive food reinforcement. One example is a task in which rats must push a lever if a light is present, or pull a chain if a tone is present. This task requires acquisition of S-R associations for optimal performance and cannot be solved by instrumental (response-outcome; R-O) or Pavlovian S-O associations. Rats with neurotoxic lesions of the DLS are impaired in the acquisition and retention of this task, even if the number of reinforcers is equated across groups or the motoric requirements are reduced (Featherstone and McDonald, 2004).
Further evidence that the DLS is involved in S-R learning comes from a set of studies using non-discriminative instrumental conditioning procedures (Yin et al., 2004). Briefly, rats with DLS lesions were trained to lever press for sucrose reward. Following training, the reward was devalued by injecting subjects with a substance that induced a conditioned taste aversion. Normal rats reduce their consumption of the sucrose following this devaluation, and rats with damage to the DLS also show this effect. In the final stage of testing, both groups are returned to the operant chambers and given an extinction test in which the lever is available but no sucrose is delivered. Normal rats respond on the lever despite the fact that they recently reduced sucrose consumption, while rats with DLS damage do not respond despite also showing the devaluation effect. Temporary inactivation of the DLS produces similar effects (Yin et al., 2006), suggesting that S-R representations encoded and stored in the DLS are insensitive to outcome devaluations in instrumental learning situations. Thus, when DLS is damaged, control of behavior is mediated by other brain regions, such as the DMS, that are sensitive to outcome devaluations.
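The logic of the devaluation test can be summarized in a short sketch contrasting a cached S-R value, which cannot change without new reinforced experience, with a goal-directed evaluation that consults the current value of the expected outcome. The representations and numbers below are illustrative assumptions, not a model of the cited experiments.

```python
# Illustrative sketch (assumptions, not the cited experiments): why a cached
# stimulus-response value is insensitive to outcome devaluation in an extinction
# test, while a goal-directed evaluation of the same response is not.

# Learned during training, when sucrose was worth 1.0:
cached_sr_value = 1.0                  # habit system: "lever -> respond"; outcome not represented
ro_contingency = {"press": "sucrose"}  # goal system: response -> outcome mapping
outcome_value = {"sucrose": 1.0}

# Conditioned taste aversion devalues the outcome before the extinction test.
outcome_value["sucrose"] = -0.5

def habit_drive():
    # No new reinforced experience occurs, so the cached S-R value is unchanged.
    return cached_sr_value

def goal_drive():
    # Goal-directed control re-evaluates the action via the current outcome value.
    return outcome_value[ro_contingency["press"]]

print("habit system drive in extinction test:", habit_drive())  # still 1.0 -> keeps pressing
print("goal system drive in extinction test:", goal_drive())    # -0.5 -> withholds pressing
```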
Dorsomedial striatum (DMS): space and flexible responses
Accumulating evidence implicates the DMS in a range of cognitive processes including behavioral flexibility, allocentric navigation, and instrumental learning (Devan et al., 2011). This view is consistent with the connectivity of the DMS. This brain region receives glutamatergic input from the entorhinal cortex, subiculum, hippocampus, amygdala, thalamus, piriform, and prefrontal cortices (McGeorge and Faull, 1989; Voorn et al., 2004). The anatomical links between the hippocampal formation and DMS are complex. First, the subiculum projects to the most medial portions of the striatum (Groenewegen et al., 1987). Second, hippocampal output layers of the entorhinal cortex also project to the DMS (Krayniak et al., 1981; Swanson and Kohler, 1986). Third, the posterior cingulate cortex receives indirect input from the dorsal hippocampus and sends a strong projection to the DMS (McGeorge and Faull, 1989), which might provide a unique hippocampal/posterior cingulate representation to influence complex navigational abilities using visual information (Sutherland et al., 1988). Finally, the ventral hippocampus projects to portions of the medial prefrontal cortex which then project to the DMS (McGeorge and Faull, 1989). Interestingly, damage to any of the indirect sources of hippocampal input to the DMS results in impairments in place learning in the water task (Schenk and Morris, 1985; Sutherland et al., 1988; Kolb et al., 1994; Ferbinteanu et al., 1999). It is thus not presently possible to determine if any region of hippocampus has preferential influence on DMS.
DMS output targets basal ganglia output nuclei and associated thalamic regions that can influence voluntary behavior (Gerfen, 1992). Furthermore, neural correlates in rodent DMS have been linked to various aspects of spatial navigational behaviors including neurons that show direction, location, and movement selectivity (Wiener, 1993; Kim et al., 2009; Mizumori et al., 2009). The location-specific cells are similar to those found in hippocampus except that the DMS “place cells” are of lower resolution. These data suggest that the DMS can utilize input from hippocampus and associated structures in some situations. Given the prominent representation of space in the hippocampus, it is thus not surprising that DMS damage impairs behaviors that require the flexible use of spatial navigation. One example comes from a variant of the water maze in which rats can swim to a submerged platform to escape the water (McDonald et al., 2008a). If the platform location is consistently moved every eight trials, normal rats will eventually learn to navigate toward the platform on the second trial after a location switch. DMS-lesioned rats are impaired in the task, but do show within-session learning. Conversely, rats with hippocampal damage show severe impairments with little within-session improvement (Ferbinteanu et al., 2003). This suggests that DMS damage impairs response flexibility after a switch rather than eliminating navigation abilities. Indeed, DMS lesions impair spatial reversal learning for rewards (Castane et al., 2010), which also explicitly requires flexibility in navigation. Thus, the DMS appears to be an important node for translating navigational information, likely involving intermediate hippocampus (Bast et al., 2009), into actions that are rapidly adaptable across consecutive trials.
The role of DMS in response flexibility also extends to discriminations based on non-spatial features of multimodal cues. For instance, rats with DMS damage are able to acquire discrimination based on one dimension (e.g., light or tone) of a compound stimulus, but are impaired when they have to switch dimensions for proper discrimination (Ragozzino et al., 2002). Furthermore, neurons in rodent DMS encode stimuli and actions in tasks requiring little spatial navigation (Ito and Doya, 2009; Kimchi and Laubach, 2009), further supporting a role beyond allocentric navigation.
DMS is also involved in some types of instrumental (R-O) learning (Adams and Dickinson, 1981). These experiments typically require the subject to make a response (lever press) to obtain a desired outcome (food). This form of learning is susceptible to devaluation procedures during early phases of training but not later phases. This is thought to reflect a gradual transfer of behavioral control to S-R habit systems that are insensitive to devaluation. Various experiments provide evidence that the DMS mediates R-O associations, whereas the DLS does not (Yin et al., 2004, 2005; Yin and Knowlton, 2006). Such R-O associations would allow animals to utilize expected outcomes to select responses, which is a hallmark of a flexible, goal-oriented, control system (Adams and Dickinson, 1981; Daw et al., 2005).
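The gradual transfer of control from goal-directed to habitual responding is often formalized as an arbitration weight that shifts toward the habit system as its predictions become reliable with overtraining, in the spirit of the uncertainty-based arbitration proposed by Daw et al. (2005). The sketch below is only a cartoon of that idea; the uncertainty values and their decay rule are invented for illustration and are not taken from that model.

```python
# Cartoon of uncertainty-based arbitration between goal-directed and habit control,
# in the spirit of Daw et al. (2005). The uncertainty dynamics below are invented
# for illustration and are not a reimplementation of that model.

goal_uncertainty = 0.1     # model-based estimates assumed reliable from early training
habit_uncertainty = 1.0    # cached values assumed unreliable at the outset
decay = 0.98               # habit uncertainty shrinks with each reinforced repetition

for trial in range(1, 501):
    habit_uncertainty *= decay
    # Control is assigned to the system with the lower uncertainty (simplified rule).
    controller = "habit" if habit_uncertainty < goal_uncertainty else "goal-directed"
    if trial in (1, 50, 200, 500):
        print(f"trial {trial:3d}: habit uncertainty = {habit_uncertainty:.3f} "
              f"-> {controller} control")

# Early trials are goal-directed (devaluation-sensitive); with overtraining the habit
# system's uncertainty falls below that of the goal system and it takes over control.
```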
Ventral striatum and related structures: gateway from emotional memory to instrumental action
The VS has been characterized as a “limbic–motor interface,” in which information about reward, context, and motivational drive is integrated to guide motivated behavior (Mogenson et al., 1980). This region receives convergent glutamatergic input from the prefrontal cortex, hippocampus, and amygdala, as well as dopaminergic input from the ventral tegmental area (Brog et al., 1993; Lynd-Balta and Haber, 1994; Wright and Groenewegen, 1995). Outputs project to brain regions associated with generation of motor behaviors (Groenewegen and Russchen, 1984; Heimer et al., 1991), as well as midbrain dopamine and hypothalamic neurons (Heimer et al., 1991; Groenewegen et al., 1993) that can modulate arousal and autonomic function (Hilton, 1982; Sutcliffe and De Lecea, 2002). Neurons in VS core fire following and in anticipation of task-related events such as cues and reinforcements, and many of these responses also encode the value of anticipated outcomes such that firing rates are higher prior to preferred reinforcements (Carelli and Deadwyler, 1994; Nicola et al., 2004; Lansink et al., 2008; Ito and Doya, 2009; Kim et al., 2009; Kimchi and Laubach, 2009; Van Der Meer and Redish, 2009; Goldstein et al., 2012). Such prevalent reward-related modulation of activity could aid animals in making economic choices by computing the relative value of future actions or states. However, many of these same recording studies show little activity predictive of choice prior to action selection, suggesting that overt actions are selected elsewhere (Ito and Doya, 2009; Kim et al., 2009; Kimchi and Laubach, 2009; Goldstein et al., 2012). Furthermore, rats with VS damage are still sensitive to changes in the value of the instrumental outcome (Balleine and Killcross, 1994). This is consistent with the proposal that goal-directed control requires circuit processing involving DMS that can operate (e.g., compute value) independently from other striatal loops. Indeed, choice-related activity is more prevalent and adapts more quickly following changes in reward contingency in DMS as compared to VS (Ito and Doya, 2009; Kimchi and Laubach, 2009). If the VS is not necessary for outcome valuation and is not generating signals predictive of upcoming actions, what is the role of the prevalent value signal in the VS that precedes choices and rewards?
Like the amygdala, the VS core and its dopaminergic input appear to impart motivational effects that bring the animal in contact with task-related stimuli and invigorate operant responding (Cardinal et al., 2002a). Cues associated with rewards attract attention and elicit a conditioned response of locomotor approach in a variety of species (Brown and Jenkins, 1968; Sidman and Fletcher, 1968; Wilcove and Miller, 1974), which brings the animal in contact with the conditioned stimulus (autoshaping) or sources of reward (conditioned magazine approach) and thereby benefits learning in appetitive tasks. These phenomena are impaired in rats by VS core damage made either during or after task acquisition, suggesting that VS plays an ongoing role in behavior (Parkinson et al., 2000b; Cardinal et al., 2002b). The VS core is also involved in modulating the vigor of operant responding. VS core damage reduces rates of operant responding (Balleine and Killcross, 1994), and abolishes the normally invigorating effects of non-contingent Pavlovian conditioned stimuli (Corbit and Balleine, 2011), a phenomenon called Pavlovian-to-instrumental transfer [PIT; (Estes, 1943; Lovibond, 1983)]. One version of PIT involves instrumental training on two levers that result in different reinforcement outcomes. Next, Pavlovian conditioning is used to associate the same outcomes with novel conditioned stimuli (tone and light). During a transfer test, one of these conditioned stimuli is presented while the subjects are allowed to respond on the levers. The PIT effect is the change in response rate evoked by the conditioned stimulus, which can be specific to a particular response [one lever; (Colwill and Rescorla, 1988)] or can exert a non-specific modulation of motivation [both levers; (Dickinson and Dawson, 1987)]. PIT involves processing in a number of structures projecting to VS, including neuromodulator-releasing neurons, as described subsequently.
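The distinction between outcome-specific and general PIT described above can be summarized in a toy sketch: a Pavlovian cue's appetitive value either invigorates only the lever whose outcome matches the cue's associated outcome (specific transfer) or scales responding on all levers (general transfer). The response rates, cue values, and multiplicative rule are illustrative assumptions, not parameters from the cited studies.

```python
# Toy sketch of general vs. outcome-specific Pavlovian-to-instrumental transfer.
# All values and the multiplicative rule are illustrative assumptions.

baseline_rate = {"lever_1": 10.0, "lever_2": 10.0}       # presses/min after instrumental training
lever_outcome = {"lever_1": "pellet", "lever_2": "sucrose"}
cue_outcome = {"tone": "pellet", "light": "sucrose"}      # Pavlovian S-O associations
cue_value = {"tone": 0.5, "light": 0.5}                   # appetitive value of each cue

def transfer_test(cue, specific=True):
    """Response rates on each lever while a Pavlovian cue is presented."""
    rates = {}
    for lever, rate in baseline_rate.items():
        if specific:
            # Outcome-specific PIT: invigorate only the response sharing the cue's outcome.
            boost = cue_value[cue] if lever_outcome[lever] == cue_outcome[cue] else 0.0
        else:
            # General PIT: the cue's appetitive value invigorates all responding.
            boost = cue_value[cue]
        rates[lever] = rate * (1.0 + boost)
    return rates

print("specific PIT, tone:", transfer_test("tone", specific=True))   # only lever_1 invigorated
print("general PIT, tone:", transfer_test("tone", specific=False))   # both levers invigorated
```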
Inputs from dopamine-releasing neurons are an important component of VS function (Yun et al., 2004), and dopamine depletion in the VS results in impairments similar to those produced by VS damage. For instance, dopamine depletion in VS core impairs the acquisition and performance of autoshaping (Parkinson et al., 2002). The behavioral significance of dopamine in the VS core has been nicely demonstrated in a recent set of experiments utilizing a detailed analysis of behavior on multiple tasks following dopamine depletion (Nicola, 2010). Nicola concluded that the primary deficit was reduced initiation of operant responding due to a reduced capacity to generate appropriate approach behaviors toward task-related apparatuses, a process he termed “flexible approach.” However, once dopamine-depleted rats engaged in a chain of operant responding, their behavior was not different from that of control animals. Supporting this hypothesis, VS core neurons fire in response to reward-predictive cues that trigger flexible approach (Nicola et al., 2004; Day et al., 2006; Wan and Peoples, 2006). These firing responses depend on afferents from the basolateral amygdala and prefrontal cortex (Ambroggi et al., 2008; Ishikawa et al., 2008), suggesting that Pavlovian effects of the amygdala and other structures may be exerted through the VS. The flexible approach hypothesis is consistent with the hypotheses that dopamine in the VS core is involved in assigning behavioral salience to stimuli (Berridge and Robinson, 1998) and mediating effortful responding (Salamone et al., 2009). One example of such effects is the finding that increasing VS core dopamine levels by local amphetamine injection potentiates PIT for food rewards (Wyvell and Berridge, 2000). Conversely, blocking D1 type dopamine receptors in VS core reduces responding for intracranial self-stimulation (Cheer et al., 2007). These data suggest that the VS core is involved in the engagement of an activity and modulating response vigor rather than the overt selection of actions, and that this function depends on inputs from prefrontal cortex and amygdala, as well as dopamine neurons (Hauber and Sommer, 2009; Salamone et al., 2009).
Although less is known about the behavioral significance of the VS shell region, it appears to share some functional overlap with the VS core. For instance, damage to VS shell impairs the outcome-specific form of PIT in which response invigoration is specific to actions leading to the same outcome as the Pavlovian conditioned stimulus (Corbit and Balleine, 2011). Additionally, injection of amphetamine into the shell region increases lever pressing for conditioned reinforcers associated with food (Parkinson et al., 1999). These data suggest a function similar to that proposed for the VS core as part of a circuit for initiating responding or reinstating behavior in drug-taking or fear-conditioning paradigms (Peters et al., 2009). However, the VS shell appears to potentiate specific responses, whereas the VS core promotes approach and general response vigor (Parkinson et al., 1999; Corbit and Balleine, 2011). Contrasting these potentiating effects, the shell is also implicated in extinction learning (Peters et al., 2008), in which formerly positive or negative contingencies of stimuli are changed to a neutral contingency such that associated behaviors gradually extinguish. Extinction learning creates an inhibitory memory trace distinct from that created by conditioning (Rescorla, 2004), and is thus an active learning process rather than a “forgetting” of the initial associations. Extinction involves interactions among the VS shell, ventromedial prefrontal cortex, and central amygdala, which have been described as a pathway for actively suppressing actions, and also for engaging avoidance/freezing behavior (Peters et al., 2009). The VS and its cortico-limbic afferents are also important for some other forms of response suppression including latent inhibition (Gal et al., 1997). Thus, the VS and associated structures are involved in processing stimuli with negative and neutral valence, as well as positive valence. Under the triage framework advocated here, the promotion or suppression of responding involving VS described in this section can be explained by mechanisms distinct from the proposed “go” and “stop” functions of the dorsal striatum that are specific for particular actions (Chevalier and Deniau, 1990; Mink, 1996). Rather than generating a suppression signal that competes with a planned action (Aron and Poldrack, 2006), the VS and associated cortico-limbic structures may instead gate sensorimotor and/or neuromodulatory processes that support the generation of motor programs, thus preventing the neural substrate of actions from forming.
Many VS-dependent behaviors have a strong contextual component likely involving input from ventral hippocampus. Damage to VS or ventral hippocampus decreases contextual sensitivity of drug reinstatement, fearful responses, and latent and conditioned inhibition (Honey and Good, 1993; McDonald et al., 2006). Neural recordings further support the notion that hippocampal output shapes VS representations. VS activity shows phase precession relative to theta oscillations, which is thought to depend on hippocampal input, and VS activity has a spatial component wherein cells activate during approach to reward zones (Lansink et al., 2008, 2009; Van Der Meer and Redish, 2009, 2011). Thus, the VS appears to multiplex information about space, cues, and affective outcomes, consistent with its role in developing preferences for places associated with positive valence (Hiroi and White, 1991a; White et al., 1991). Such information is expected in a system that engages approach to places (e.g., feeders) based on affective associations of stimuli. Interestingly, activity of neurons in the DMS has been reported to be more sensitive to values of specific outcomes than that of neurons in the VS (Kimchi and Laubach, 2009), suggesting that encoding of affective value may somewhat dissociate from action-specific values. This would be useful in scenarios such as reward reversal paradigms in which the affective value of a particular cue-response pair becomes negative or neutral, while others become positive. A more slowly changing affective signal related to tasks rather than particular cues or responses would be useful for keeping animals engaged in tasks so as to solve new discrimination problems, despite temporary reductions in rewarded responding.
Interactions among limbic, cognitive, and motor circuits
The data reviewed thus far indicate that the VS, amygdala, ventral hippocampus, and ventromedial prefrontal cortex form a network involved in linking stimuli and context with affective value, and engaging appropriate autonomic and postural responses. On the other hand, the control systems involving medial or lateral dorsal striatum are involved in selecting specific actions. These structures can cooperate, compete, or interfere depending on training and task demands, revealing a dynamic interaction for the control of behavior.
Hippocampal-amygdala interactions in place preference and contextual fear
One form of competition among neural structures appears to involve blocking access to a common output node. Such an interaction appears to occur during acquisition of conditioned place preference (CPP). CPP is an appetitive classical conditioning paradigm that uses distal spatial cues as the conditioned stimuli. Acquisition of this task normally involves a synergistic interaction between the amygdala, hippocampus, and VS (McDonald and White, 1993, 1995b). However, rats with lesions of ventral hippocampus or fornix (output fibers from hippocampus and associated structures) show enhanced acquisition of CPP, suggesting that the ventral hippocampal circuit normally retards control of behavior by amygdala in this task (McDonald and White, 1993, 1995b; Ferbinteanu and McDonald, 2001). This could be mediated by competition between these structures for access to the VS, which is necessary for CPP (Hiroi and White, 1991b; White et al., 1991). Pre-training exposure to the maze could be sufficient for hippocampal input to retard subsequent amygdala control of VS activity (Ferbinteanu and McDonald, 2001). Such a blockade of amygdala input to VS by hippocampal input has been reported in experiments using electrical stimulation of these afferents (Mulder et al., 1998).
Another task revealing complex interactions among these structures is fear conditioning to context. It is acquired rapidly with an intact hippocampus, but can be acquired without it following repeated training (Wiltgen et al., 2006; Sparks et al., 2011). The non-hippocampal memory is thought to involve the amygdala (Biedenkapp and Rudy, 2009), which associates some salient feature of the context with the negative event after multiple repetitions. Conversely, the hippocampus is thought to rapidly create a complex and unique representation reflecting all of the features of the context, which can be used by amygdala to facilitate context-based learning (Antoniadis and McDonald, 2000; Fanselow and Poulos, 2005). However, fear memories learned when hippocampus is inactivated can be recalled in subsequent sessions when hippocampus is again inactivated, but are not properly recalled when hippocampus is left online (Sparks et al., 2011), suggesting another example of interference or competition produced by the hippocampal output. Thus, data from CPP and fear conditioning paradigms suggest that hippocampal output may both facilitate learning to context in amygdala while also suppressing amygdala influence on VS in some instances. On the other hand, hippocampal place cells acquire responses to auditory conditioned stimuli in fear conditioning paradigms (Moita et al., 2003), suggesting that amygdala representations may also influence hippocampal processing.
Amygdala modulation of operant responses: Pavlovian-instrumental transfer and emotional responses
Various experiments have shown examples of PIT in which an amygdala-based Pavlovian association supports acquisition and maintenance of an arbitrarily reinforced instrumental response (Holland and Gallagher, 2003; Zorawski and Killcross, 2003; Corbit and Balleine, 2005). The general and outcome-specific forms of PIT dissociate among multiple brain structures. The general form depends on the central nucleus of the amygdala (Holland and Gallagher, 2003), VS core (Hall et al., 2001), and dopamine input to VS (Wyvell and Berridge, 2000; Lex and Hauber, 2008). The outcome-specific form requires basolateral amygdala (Blundell et al., 2001; Corbit and Balleine, 2005), VS shell (Corbit et al., 2001), DLS (Corbit and Janak, 2007) and dopamine neurons (El-Amamy and Holland, 2007). This is consistent with the hypothesis that VS core circuits are involved in general invigoration, while DMS/DLS circuits are selective for actions. Interestingly, PIT is enhanced with training (Holland, 2004), suggesting that the S-R system is more susceptible to this phenomenon than the goal-oriented control systems, and that PIT therefore becomes more prominent as the S-R system takes over behavioral control.
Pavlovian associations can also attenuate instrumental responding and reflex amplitude. An example is conditioned emotional responding. In this paradigm, rats receive aversive outcomes paired with a cue (light) in one room, and learn operant responding for food in another room. Animals drastically reduce responding when the Pavlovian cue is presented during the operant task, likely owing to a fearful state that suppresses instrumental responding (Estes and Skinner, 1941; Leaf and Muller, 1965). Rats with amygdala damage show normal acquisition of the instrumental response, but do not reduce responding when the signal for an aversive event is presented (Davis, 1990; Weiner et al., 1995). These fearful Pavlovian associations can also modulate reflex amplitudes such as the acoustic startle reflex (Davis, 1992a). Collectively, these data indicate that Pavlovian (S-O) associations in the amygdala can invigorate or suppress instrumental responding and reflexes, possibly by evoking neural activity (Stuber et al., 2011) or dopamine release (Howland et al., 2002) in VS. The dependence of PIT on multiple interconnected brain structures presents another example in which systems interactions are needed to support behavioral phenomena.
Allocentric versus egocentric responding
Some learning paradigms can be acquired in parallel by multiple systems, particularly in tasks that can be solved by multiple strategies. For instance, rats trained to retrieve food in one location of a plus maze use an allocentric place strategy early in training and then switch to an egocentric stimulus-response (S-R) strategy later in training (Tolman et al., 1946). Subsequent studies showed that hippocampus was necessary for performance in early phases, while DLS was necessary for later phases in this task (Packard and McGaugh, 1996; Packard, 1999). This pattern of data suggests that DLS-dependent control gradually builds sufficient associative strength to take over from hippocampal-dependent control, yet either control strategy can solve the problem. It further suggests the possibility that the hippocampus reaches asymptotic associative strength levels faster than the DLS system, which is consistent with other studies of spatial navigation and contextual fear (Muller et al., 1987; Rudy et al., 2004; Stote and Fanselow, 2004; Wiltgen et al., 2006).
The DMS appears to be critical for hippocampal-based control to compete with the egocentric control mediated by DLS. An example of this was demonstrated (McDonald and White, 1994) using a modified version of the water maze task in which rats were trained over 12 sessions to navigate to a platform that occupied the same spatial position but was either visible or submerged (Figure 2A). Subjects were then given a choice between a submerged platform in the original spatial location and a visible platform in a novel location. Rats with damage to the hippocampus mainly chose the cued location (visible platform), while rats with damage to the DLS mainly chose the remembered place (submerged platform). Interestingly, the control subjects split on this competition test; about half chose the visible platform and the others chose the original spatial location (Figure 2B). Rats that chose the place response were better place learners during training, suggesting that trait differences in the relative strength of these two systems determined which gained control over behavior. Rats with DMS damage could acquire both cue-based and place-based solutions, but mainly chose the visible platform in the cue-place competition test, suggesting that the hippocampal representation could not compete for behavioral control during the test (McDonald and White, 1994; Devan et al., 1999). Subsequent cross-lesion studies have shown that interaction between DMS and hippocampus is needed for behavioral flexibility in navigation to cues and remembered places (Devan and White, 1999).
These data reveal competition between egocentric responding mediated by DLS and allocentric responding to remembered places mediated by hippocampus-DMS interactions. However, asymptotic performance on other navigation tasks that require both sensory-driven responses (tactile-turning responses) and within-session spatial information (which arms have previously been visited) requires both intact DLS and hippocampus (McDonald et al., 2006). Thus, the nature of the interaction (competitive versus cooperative) between these control systems depends on task demands.
Necessary and incidental associations acquired during visual discrimination learning: context-specific inhibition by ventral hippocampus
We have recently developed a task revealing a potentially novel subclass of associations that are acquired and stored by different neural systems, and that interact with one another in ways not previously described. These associations appear to be incidentally acquired during acquisition of a visual discrimination task, and affect response flexibility when task contingencies change. In this task, rats first learn an S-R association between a particular stimulus (e.g., light on) and response (turn) that is repeatedly reinforced in a particular context “A” (room). Responses to a CS- (e.g., light off) are not reinforced. When these stimulus contingencies are subsequently reversed, reversal learning is slower in context A, in which the original training occurred, than in a different context B (McDonald et al., 2001). Furthermore, a transfer test in which subjects reversed in context B, and were then re-tested in context A, revealed that an inhibitory association to the CS- acquired in the original context (A) still remained despite the reversal experience in context B. Consistent with other tasks, the DLS is necessary for acquisition of the S-R association (McDonald and Hong, 2004). Superimposed on this association are several apparent incidental associations. One of these associations is Pavlovian in nature and appears to be acquired and stored in the amygdala (McDonald and Hong, 2004). The amygdala representation is not context-specific, enhances the speed of reversal learning, and can be revealed during a conditioned-cue preference transfer test. In contrast to the previously described PIT effects involving amygdala, this novel amygdala-based S-O association might extinguish quickly and reduce general approach responses toward the illuminated arms, which could accelerate reversal learning when that representation is available. Another novel association is a context-specific inhibitory association acquired by the ventral hippocampus. This representation inhibits reversal learning in the same context as original learning, but does not seem to otherwise influence task acquisition or performance (McDonald et al., 2006, 2007). This is a unique demonstration because in most instances the hippocampal representation supports engagement, rather than suppression, of context-appropriate behavior (McDonald et al., 2007).
One emergent question is whether the striatum interacts with hippocampus in this context-specific inhibitory process. To partly assess this question, rats with neurotoxic damage to the DMS were trained on the previously-described visual discrimination task. DMS damage was expected to remove the context-specific inhibitory effect by disrupting functional interaction with hippocampus, analogous to the effects of DMS damage on spatial navigation. Surprisingly, damage to the DMS resulted in enhanced inhibition of reversal learning in the same context (McDonald et al., 2008b). We argued that this resulted from the elimination of a DMS-frontal cortex circuit that normally suppresses a parallel circuit linking VS-ventral hippocampus and ventromedial prefrontal cortex involved in extinction processes (McDonald et al., 2007).
Cortical interactions
Many of the behaviors thus far described also involve cortical processing. A thorough treatment is beyond the scope of this review. Briefly, the several regions comprising the rodent prefrontal cortex are implicated in numerous mnemonic, emotive, and cognitive functions supporting flexible responding to achieve goals (Dalley et al., 2004; Kesner and Churchwell, 2011). Among the many identified tasks involving subregions of prefrontal cortex, damage to the orbitofrontal region impairs outcome devaluation (Gallagher et al., 1999) and reversal learning (Schoenbaum et al., 2002), while damage to the medial prefrontal region impairs shifting discrimination among different stimulus dimensions (Joel et al., 1997; Ragozzino et al., 1999b; Birrell and Brown, 2000) as well as flexibility in place/response navigation (Ragozzino et al., 1999a). DMS receives input from these cortical regions (McGeorge and Faull, 1989), and DMS damage yields similar deficits in devaluation (Yin et al., 2005), reversal learning (Ragozzino, 2007), and flexibility in place/cue navigation (Devan et al., 1999; Ragozzino et al., 2002). Functional overlap between ventromedial prefrontal cortex and VS has been noted in acquisition and extinction of drug-taking and fearful responses (Peters et al., 2009). These data suggest that processing in cortico-striatal-pallidal/nigral-thalamic loops is sensitive to disruption at multiple points. Furthermore, the hippocampus, amygdala, and midbrain neuromodulatory neurons are reciprocally connected with prefrontal cortex as well as other nodes of the loop (Voorn et al., 2004). Such rich connectivity allows for complex interaction among these structures.
Mechanisms of competition
Inhibition within the striatum
One pervasive type of interaction that has emerged from the study of flexible responding is competition among control systems involving the striatum. Examples include selection of one operant response over others (Mink, 1996), allocentric versus egocentric navigation (Packard, 1999), or control over response extinction (McDonald et al., 2007). Indeed, anatomical and physiological data support the notion of inhibition among circuits flowing through striatum and other component nuclei of the basal ganglia. The majority of glutamatergic input to the striatum converges onto medium-sized spiny projection neurons (SPN), the primary neuron type in the striatum (DiFiglia et al., 1976; Chang et al., 1982). SPN receive a unique and rich set of inputs (Kincaid et al., 1998), and release the normally inhibitory transmitter GABA onto targets in other component nuclei of the basal ganglia as well as neighboring SPN (Somogyi et al., 1981; Bolam and Izzo, 1988; Tunstall et al., 2002). This latter feature has led to the development of many theories of the basal ganglia in which the striatum works as a competitive network wherein the most active SPN will suppress the activity of others through inhibitory collaterals (Groves, 1983; Wickens et al., 1995; Suri and Schultz, 1998; Gruber et al., 2006; O'Reilly and Frank, 2006). This configuration leads to a “winner-take-all” dynamic that generates sparse output, which is an appealing property for an action selection network because only one action is output at a time. However, this model has been called into question as electrophysiological studies have detected only weak functional connectivity between SPN, which is inconsistent with the strong collateral inhibition needed for winner-take-all dynamics (Czubayko and Plenz, 2002; Tunstall et al., 2002; Koos et al., 2004). These SPN collaterals may thus be more important for other dynamical aspects of processing such as spike timing (Plenz, 2003) or the formation of ensembles of SPN with coherent activity (Ponzi and Wickens, 2010).
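To make the winner-take-all notion concrete, the following minimal sketch simulates a small pool of leaky rate units coupled by uniform mutual inhibition; with sufficiently strong inhibition, only the unit receiving the largest input remains active. It is purely illustrative: the units stand in loosely for competing SPN pools, and the inhibition strength, time constant, and input values are arbitrary assumptions rather than measured striatal parameters.

```python
# Minimal winner-take-all sketch (illustrative only): leaky rate units with uniform
# mutual inhibition; parameters are arbitrary, not measured striatal values.
import numpy as np

def winner_take_all(inputs, w_inhib=1.5, tau=10.0, dt=0.1, steps=2000):
    """Integrate leaky rate units with mutual inhibition; return final firing rates."""
    x = np.zeros_like(inputs, dtype=float)           # unit activations
    for _ in range(steps):
        rates = np.maximum(x, 0.0)                   # rectified firing rates
        lateral = w_inhib * (rates.sum() - rates)    # inhibition from all other units
        x = x + dt * (-x + inputs - lateral) / tau   # leaky integration
    return np.maximum(x, 0.0)

# The unit with the strongest input ends up as the only active one.
print(winner_take_all(np.array([1.0, 0.9, 0.3])))
```

With the mutual inhibition weight set above 1, the coexistence state is unstable and the output is sparse; weakening the inhibition (as the electrophysiological data suggest) leaves several units partially active, which is the discrepancy noted above.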
Although the functional role of SPN collaterals remains unclear, GABA-mediated inhibition within the striatum is an important component that shapes the response of SPN to afferent input (Rebec and Curtis, 1988; Mallet et al., 2005; Gruber et al., 2009b). Electrical stimulation of cortical afferents evokes an initial excitatory component followed by an inhibitory component of firing in the DMS, and infusion of GABA antagonists decreases the inhibition (Mallet et al., 2005; Galinanes et al., 2011). Furthermore, the inhibitory component depends on the spatial location of the stimulation in prefrontal cortex (Figure 3), a property essential for competition between loops linking cortex and striatum. A likely mechanism involved in these phenomena is feed-forward inhibition mediated by the GABAergic striatal fast spiking (FS) interneurons. Although these neurons comprise only a small proportion of striatal cells (Kawaguchi, 1993), they evoke large somatic inhibitory currents that inhibit SPN firing (Koos and Tepper, 1999). FS interneurons receive input from cortex, hippocampus, amygdala, and thalamus (Pennartz and Kitai, 1991; Kita, 1993; Bennett and Bolam, 1994), indicating that multiple afferents may be able to drive inhibition. FS interneurons in dorsal striatum and VS are phasically active during tasks (Berke, 2008; Lansink et al., 2010), and suppressing their excitability in DLS causes dystonia-like effects (Gittis et al., 2011), suggesting that FS neurons are functionally important. FS and other GABAergic interneuron types targeting SPN are found throughout the striatum and may provide similar feed-forward inhibition across functional processing domains (Tepper, 2010). Electrophysiological recordings during spatial navigation for rewards have revealed that FS neurons exhibit coordinated activity in VS (Lansink et al., 2010), but exhibit uncoordinated activity in dorsal striatum (Berke, 2008). However, transient coordination among subsets of these neurons in dorsal striatum during operant task performance has been reported (Gage et al., 2010). Thus, the spatial scale of inhibition may vary across the striatum, and may depend on behavior. It remains to be determined whether such inhibition operates between nearby or distal striatal circuits. Other routes of inhibition within the basal ganglia are also likely to influence interactions among circuits. Extensive collaterals among GABAergic projection neurons appear to be a general characteristic found throughout the basal ganglia, including the output neurons in pallidum and substantia nigra (Millhouse, 1986; Parent et al., 2000), as well as GABAergic neurons of globus pallidus that project back to the striatum (Kita and Kitai, 1994; Bevan et al., 1998). Unlike the striatum, these pallidal collaterals have a robust inhibitory influence on neighboring neurons (Sadek et al., 2007; Sims et al., 2008). The convergent nature of both afferents to the striatum (McGeorge and Faull, 1989; Kincaid et al., 1998) and projections within the basal ganglia (Oorschot, 1996) places inhibitory mechanisms in these nuclei in a strategic position for mediating competition among parallel control systems.
Striatal output: dopamine modulation of direct and indirect pathways
Competition for control of voluntary actions can also involve inhibition by the output of the basal ganglia on afferent structures. Two primary pathways through the basal ganglia have been identified with opposing effects on target neurons (Smith et al., 1998). The architecture and electrophysiology of the basal ganglia have led to a model of striatal function in which SPN projecting through the “direct” pathway have a disinhibitory effect on targets and promote action output, while those involved in the “indirect” pathway increase inhibition on target neurons and suppress action (Chevalier and Deniau, 1990; Mink, 1996). Thus, any input that can bias processing toward one pathway or the other could influence behavioral responding. One such input is dopamine. SPN involved in the direct pathway predominantly express D1 type dopamine receptors, whereas indirect pathway SPN express D2 type receptors (Gerfen et al., 1990; Surmeier et al., 1996). The proposed functional roles of the direct/indirect pathways and the selective expression of dopamine receptors have recently been confirmed with optical stimulation of D1- or D2-expressing SPN in the DLS, which evokes or inhibits locomotion, respectively (Kravitz et al., 2010). The level of dopamine is able to bias the relative excitability of the direct and indirect pathways by virtue of the opposing effects of D1 and D2 receptors on SPN excitability; the direct pathway is more excitable in high dopamine concentrations via D1-mediated excitation, whereas the indirect pathway is more excitable in low dopamine conditions via reduced D2-mediated suppression of SPN excitability (Albin et al., 1989; Surmeier et al., 2010).
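As a toy illustration of how a single dopamine level could bias the balance between the pathways, the sketch below assigns the direct pathway a gain that grows with dopamine and the indirect pathway a gain that grows as dopamine falls. The linear form and gain values are assumptions chosen only to make the opposing D1/D2 excitability effects described above explicit; they are not fitted to physiological data.

```python
# Toy illustration (assumed linear gains) of dopamine biasing direct (D1) vs. indirect (D2)
# pathway excitability: high dopamine favors the action-promoting direct pathway,
# low dopamine favors the action-suppressing indirect pathway.
def pathway_drive(cortical_input, dopamine, d1_gain=0.8, d2_gain=0.8):
    """Return (direct, indirect) drive for a dopamine level scaled to [0, 1]."""
    direct = cortical_input * (1.0 + d1_gain * dopamine)            # D1 excitability rises with dopamine
    indirect = cortical_input * (1.0 + d2_gain * (1.0 - dopamine))  # D2 excitability rises as dopamine falls
    return direct, indirect

for da in (0.1, 0.5, 0.9):
    d, i = pathway_drive(1.0, da)
    print(f"dopamine={da:.1f}  direct={d:.2f}  indirect={i:.2f}  bias toward action={d - i:+.2f}")
```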
Diffuse projections from brainstem dopamine-releasing neurons to cortical and limbic circuits include dense innervation of the striatum (Beckstead et al., 1979; Joel and Weiner, 2000). Many dopamine neurons initially respond to primary rewards by generating brief phasic increases of firing over tonic baseline, but this phasic response transfers to the earliest conditioned stimulus that predicts the delivery of reward as training progresses (Schultz, 1998; Pan et al., 2005). Omissions of expected rewards cause a depression of the tonic firing (Schultz, 1998). Dopamine neurons have thus been proposed (Montague et al., 1996) to signal errors in predicted rewards analogous to the learning signal in reinforcement learning theory that is essential for learning values of actions or future states (Sutton and Barto, 1998). Consistent with this hypothesis, dopamine neurons encode differences between expected and received reward in primates (Bayer and Glimcher, 2005), and dopamine has been shown to modulate plasticity at cortico-striatal synapses (Reynolds et al., 2001; Pawlak and Kerr, 2008; Shen et al., 2008). The direct and indirect pathways appear to have inverse learning rules wherein the D1 receptors in the direct pathway mediate post-synaptic long-term potentiation (LTP) following increases in dopamine and long-term depression (LTD) following decreases, while the opposite is true for D2 receptors (Shen et al., 2008). This suggests that the action-promoting direct pathway is strengthened following larger-than-expected rewards, whereas the action-suppressing indirect pathway is strengthened following smaller-than-expected rewards. This process has been confirmed in mouse DMS using optogenetic techniques to activate either D1-expressing (direct pathway) SPN, which promoted subsequent behavioral responding for light self-administration, or D2-expressing (indirect pathway) SPN, which suppressed subsequent responding (Kravitz et al., 2012). In addition to biasing transmission between direct/indirect pathways and influencing plasticity, dopamine levels have also been proposed to bias the sensitivity of SPN to afferents from different sources such that cortical inputs dominate VS activity in conditions of low dopamine, while limbic inputs (amygdala and ventral hippocampus) dominate in conditions of high dopamine (Floresco et al., 2001; Goto and Grace, 2005). This is speculated to allow cortex to briefly exert control over behavior when outcomes are worse than expected so as to alter subsequent responding. This mechanism is consistent with data from multiple labs showing that manipulations of frontal cortex mostly impair the initial adaptation of rats following changes in reward contingencies such that responses become unexpectedly unrewarded (Birrell and Brown, 2000; Schoenbaum et al., 2002; Goto and Grace, 2005). However, this mechanism in VS does not preclude a role for hippocampus in flexible responding following disappointing outcomes, which could be mediated via cortex or directly in the striatum via diminished hippocampal input. Further data are needed to resolve this issue.
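The reward prediction error logic and the opposing D1/D2 plasticity rules described above can be summarized in a short sketch. Here a temporal-difference error, delta = r + gamma*V(next) - V(current), stands in for phasic dopamine and drives weight changes of opposite sign in hypothetical direct- and indirect-pathway weights; the learning rates, initial weights, and single-cue task are illustrative assumptions, not a model of any specific experiment.

```python
# Illustrative sketch: a temporal-difference (TD) error standing in for phasic dopamine,
# delta = r + gamma * V(next) - V(current), with opposite-signed weight changes in
# hypothetical direct (D1) and indirect (D2) pathway weights. All parameters are assumed.
gamma, alpha, eta = 0.9, 0.2, 0.1      # discount factor, value learning rate, plasticity rate
V_cue = 0.0                            # learned value of the reward-predicting cue
w_direct, w_indirect = 0.5, 0.5        # toy pathway weights

for trial in range(30):                # repeated cue -> reward pairings
    delta = 1.0 + gamma * 0.0 - V_cue  # reward of 1.0; no further state after the outcome
    V_cue += alpha * delta             # cue value grows toward the reward magnitude
    w_direct += eta * delta            # D1/direct: strengthened by dopamine bursts, weakened by dips
    w_indirect -= eta * delta          # D2/indirect: the opposite sign rule

print(round(V_cue, 2), round(w_direct, 2), round(w_indirect, 2))
```

Running this with reward omissions instead of deliveries flips the sign of delta, which would strengthen the indirect (response-suppressing) weights, mirroring the account of negative prediction errors given above.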
Dopamine neurons continue to respond to conditioned stimuli even in well-learned tasks (Schultz et al., 1993; Pan et al., 2005), and striatal dopamine levels encode relative value associated with stimuli (Gan et al., 2010). These phasic dopamine responses could thus be important for Pavlovian effects on behavior such as autoshaping and PIT, even in well-learned tasks. Consistent with this hypothesis, dopamine depletion in VS reduces engagement of operant responding (Nicola, 2010), whereas dopamine in both VS and DLS is involved in increased vigor via PIT (Hall et al., 2001; Corbit and Janak, 2007; Corbit and Balleine, 2011). The fact that PIT depends on both territories could be reflective of the spiral architecture in which VS neurons project to dopamine neurons in the substantia nigra (Groenewegen et al., 1993) that project to dorsal striatum (Beckstead et al., 1979; Joel and Weiner, 2000). Furthermore, the involvement of basolateral amygdala in Pavlovian effects such as PIT (Holland and Gallagher, 2003) and autoshaping (Parkinson et al., 2000a) may come from the ability of this input to activate SPN (O'Donnell and Grace, 1995) or to increase VS dopamine (Howland et al., 2002; Jones et al., 2010). Indeed, mice will operantly respond for optogenetic activation of amygdala input to VS through a mechanism that requires D1 receptors in the striatum, while optogenetic suppression of this input suppresses cue-evoked sucrose intake (Stuber et al., 2011). This latter finding is consistent with an ongoing role of dopamine in regulating behavior after acquisition. Interestingly, mice do not acquire analogous self-stimulation of cortical afferents to VS, suggesting that information from these different afferents has a different impact on the striatum (Stuber et al., 2011).
These data indicate that the striatum is a nexus for multiple afferents involved in behavioral control, and has mechanisms needed for both competition and coordination among afferents to suppress or engage responding. These mechanisms involve inhibition and neuromodulatory control by dopamine for learning and on-line control of responding.
Synthesis and computational perspectives
Dissociated striatal circuits for triaging responses and generating actions by model-based or model-free control systems
Control systems involving different territories of the striatum have some distinct properties that are advantageous in particular tasks. Repeated reinforcement of stimulus driven responses develops S-R associations with sufficient strength so as to control behavior in tasks where such responding provides an adequate solution strategy (Tolman et al., 1946; Devan et al., 2011). The behavioral studies reviewed here indicate that these S-R mediated actions depend on DLS processing, are egocentric, and are insensitive to devaluation procedures. With repeated exposure, these S-R mediated actions can become habits (Barnes et al., 2005). Computationally, S-R control can be learned using “cached” values of rewards acquired over many repetitions with simple algorithms (e.g., temporal difference) that do not build explicit models of environmental contingencies, and are thus considered to be “model-free” (Daw et al., 2005). Nonetheless, such algorithms can solve complex real-world problems, and can be executed quickly. They are, however, bound to particular stimuli and do not represent changes in the environment until outcomes have been experienced repeatedly so as to weaken the S-R associations. Such control is therefore slow to adapt to changing environmental contingencies and is not intrinsically sensitive to motivational state because the reinforcement is not inferred prior to action.
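A minimal sketch of such a cached-value ("model-free") learner is given below: action values are updated only by experienced prediction errors, so after the rewarded action reverses, the learner continues to favor the old response for many trials. The two-lever task, learning rate, and softmax temperature are arbitrary assumptions used only to illustrate the slow adaptation described above.

```python
# Minimal "model-free" (cached value) learner: values change only through experienced
# prediction errors, so behavior adapts slowly after the contingency reverses.
# The two-lever task and all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 0.1, 5.0                 # learning rate, softmax inverse temperature
Q = np.zeros(2)                        # cached values for two levers

def softmax(q):
    e = np.exp(beta * (q - q.max()))
    return e / e.sum()

for trial in range(400):
    rewarded = 0 if trial < 200 else 1          # reward contingency reverses at trial 200
    a = rng.choice(2, p=softmax(Q))
    r = 1.0 if a == rewarded else 0.0
    Q[a] += alpha * (r - Q[a])                  # only the chosen action's cached value is updated
    if trial in (199, 210, 399):
        print(trial, np.round(Q, 2))            # note the lag in Q after the reversal
```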
The DMS is involved in flexible responding using hippocampal-dependent allocentric navigation, switching among cue dimensions for discriminations, and anticipating outcomes (Ragozzino et al., 2002; Yin et al., 2005; McDonald et al., 2008a). The DMS can therefore be characterized as associating a combination of stimuli, spatial contexts, responses, and reward outcomes. Computationally, formation of such antecedent-outcome associations necessitates representation of environmental contingencies, and is therefore considered to be a “model-based” learning system (Daw et al., 2005). Importantly, such systems allow mental testing of hypotheses about the outcomes of potential responses prior to action selection. Responding is thus sensitive to motivational state (e.g., devaluation) because the desirability of the reinforcement outcome can be inferred prior to the action. This type of control is therefore more flexible than the habit system.
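The contrast with model-based control can be illustrated with an equally small sketch: choices are made by combining a learned action-outcome model with the current value of each outcome, so devaluing an outcome changes behavior immediately, without further training. The action and outcome names are hypothetical and the lookup-table "model" is a deliberate simplification of what a real contingency representation would contain.

```python
# Sketch of "model-based" control: actions are evaluated by combining a learned
# action -> outcome model with the current value of each outcome, so devaluation
# changes choice immediately. Action and outcome names are hypothetical.
action_outcomes = {"press_left": "pellet", "press_right": "sucrose"}   # learned contingency model
outcome_value = {"pellet": 1.0, "sucrose": 0.9}                        # current motivational values

def choose(model, values):
    # Mentally test each action by looking up its predicted outcome and that outcome's value.
    return max(model, key=lambda action: values[model[action]])

print(choose(action_outcomes, outcome_value))   # press_left while pellets are highly valued
outcome_value["pellet"] = 0.1                   # devalue pellets (e.g., specific satiety)
print(choose(action_outcomes, outcome_value))   # immediately switches to press_right
```

This immediate sensitivity to devaluation, absent in the cached-value learner sketched previously, is the behavioral signature typically used to distinguish goal-directed from habitual control.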
Like the DMS, the VS and associated structures also form Pavlovian associations between stimuli, contexts, and affective outcomes. Unlike the DMS, however, evidence reviewed here indicates that its effect on motoric output is related to (i) approach behaviors needed to engage in operant responding and (ii) the vigor of responding (e.g., PIT), rather than selecting specific operant responses. We propose that the emotional memory network formed by VS, amygdala, ventral hippocampus, prefrontal cortex, and dopamine input can be conceptualized as triaging responses to stimuli based on their associated affective value in a context. When stimuli are associated with aversive outcomes, this system engages freezing and autonomic responses (e.g., bradycardia) via brainstem targets dissociated from habit or goal-directed systems. For stimuli with positive valence, we propose the following simple model (Figure 4). Early in learning, hippocampal-amygdala circuits rapidly acquire responses to a conditioned stimulus predictive of reward (CS+), and their output activates VS core and dopamine neurons so as to promote orientation and place preference, and invigorate non-specific activity so that the goal-oriented system can acquire contingencies and discover appropriate operant responses (Figure 4A). Later in learning, the CS+ activates amygdala, hippocampus, and neocortex, resulting in activation of VS-projecting dopamine neurons. Elevated dopamine in the VS biases SPN toward hippocampal input and promotes direct pathway output to orient the animal and to trigger activity of dopamine neurons projecting to dorsal striatum. Elevated dopamine in the dorsal striatum promotes activation of the direct pathway to invigorate actions mediated by a competition between S-R systems involving the DLS and goal-oriented systems in the DMS. If the subsequent reward outcome is better than expected, a second bout of elevated dopamine causes LTP in the direct pathway via D1 receptors and LTD in the indirect pathway via D2 receptors, thereby increasing the likelihood of repeating the response. If the outcome is worse than expected, then the direct pathway is depressed and the indirect pathway is potentiated so that the response is more likely to be withheld following future stimulus presentations. Furthermore, reduced dopamine following negative reward prediction errors engages cortical control over striatal function so as to implement new response strategies (Figure 4B). Consistent with this simple model, damage or inactivation of medial prefrontal or orbitofrontal cortex induces behavioral deficits that are most apparent following switches in task rules or outcomes (Ragozzino et al., 1999b; Birrell and Brown, 2000; Schoenbaum et al., 2002), and damage to DMS similarly impairs response flexibility (Devan et al., 1999; Ragozzino et al., 2002; Yin et al., 2005). Presentation of a Pavlovian CS during a well-learned task evokes larger amygdala activation, promoting approach by increasing VS core activity and increasing vigor by elevating dopamine levels (Figure 4C).
Aligning reinforcement learning with brain function: dopamine, inhibition, and context
Dopamine in the striatum is critical for learning and executing actions (Yun et al., 2004; Witten et al., 2011). The unique signaling properties of dopamine neurons and their striking similarity to reward prediction error signals have inspired proposals that the brain may use a form of reinforcement learning (RL) to make economic decisions. In RL, the reward prediction error signal is used to learn the value of action outcomes, and the decision-maker then biases selection of actions toward those with high value. Current models of cortico-striatal function developed under this elegant theory can replicate key features of flexible decisions in some economic choice tasks (Samejima et al., 2005; Daw et al., 2006; Glascher et al., 2010). However, the standard form does not perform well in other conditions (Niv et al., 2007; Zilli and Hasselmo, 2008; Collins and Frank, 2012). The addition of episodic or working-memory can improve RL model performance on tasks with delays or history dependence (Zilli and Hasselmo, 2008; Collins and Frank, 2012). Another candidate mechanism that may increase coherence of RL with biological function is the modulatory effects of emotion on vigor that may involve dopamine (Niv et al., 2007). The neuromodulatory effects of dopamine have been proposed to selectively enhance encoding of salient information (Berridge and Robinson, 1998; Gruber et al., 2004, 2006), increase response effort (Niv et al., 2007; Salamone et al., 2007), and bias the willingness to explore options (Parush et al., 2011; Humphries et al., 2012). Dopamine is involved in both the general and outcome-specific forms of PIT, which dissociate among striatal regions as described in a previous section. Furthermore, the representation of value may partially dissociate in striatum. Neurons in DMS can encode changing outcome values more quickly than in VS (Ito and Doya, 2009; Kimchi and Laubach, 2009), and animals are sensitive to reward value following VS damage (Balleine and Killcross, 1994; De Borchgrave et al., 2002). These data suggest parallel circuits for computing value in which the VS engages and invigorates non-specific actions based on general affective value (amount of food in this environment), whereas the DMS is involved in selecting actions based on specific expected outcomes of responses (left lever results in grape after a delay). As discussed by others (Pennartz et al., 2011), these features of the rodent striatum conflict with the current mapping of the standard actor-critic architecture of RL onto striatal circuits wherein the VS “critic” signals outcome value to dorsal striatal “actor” for action selection.
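For readers unfamiliar with the actor-critic architecture mentioned above, the following minimal sketch shows the standard mapping that the data reviewed here call into question: a "critic" (conventionally identified with VS) learns a state value and broadcasts a prediction error that also trains an "actor" (conventionally identified with dorsal striatum) holding action preferences. The toy two-action task, learning rates, and softmax rule are assumptions for illustration only, not a claim about how value is actually partitioned across striatal territories.

```python
# Minimal actor-critic sketch of the standard mapping questioned in the text:
# a "critic" (conventionally VS) learns state value and emits a prediction error that
# also trains an "actor" (conventionally dorsal striatum). Task and parameters are assumed.
import numpy as np

rng = np.random.default_rng(1)
alpha_v, alpha_p, beta = 0.1, 0.2, 3.0   # critic rate, actor rate, softmax inverse temperature
V = 0.0                                   # critic: value estimate for the single task state
prefs = np.zeros(2)                       # actor: preferences over two actions

def policy(p):
    e = np.exp(beta * (p - p.max()))
    return e / e.sum()

for trial in range(300):
    probs = policy(prefs)
    a = rng.choice(2, p=probs)
    r = 1.0 if a == 0 else 0.2                     # action 0 yields the larger reward
    delta = r - V                                  # prediction error: the shared teaching signal
    V += alpha_v * delta                           # critic update
    prefs[a] += alpha_p * delta * (1 - probs[a])   # simplified policy-gradient-style actor update

print(np.round(policy(prefs), 2))                  # the policy comes to favor the richer action
```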
Dopamine neurons generate a variety of responses to stimuli associated with appetitive as well as aversive outcomes in rodents and primates (Mantz et al., 1989; Schultz et al., 1993; Pan et al., 2005; Brischoux et al., 2009; Matsumoto and Hikosaka, 2009). Measurement of dopamine release following such stimuli has revealed site-specific effects; dopamine is elevated in VS shell following unconditioned rewarding stimuli (Bassareo et al., 2002; Aragona et al., 2009) and is elevated to a larger extent in frontal cortex than striatum following unconditioned aversive stimuli (Abercrombie et al., 1989; Bassareo et al., 2002). Furthermore, alteration of synaptic plasticity is selective to dopamine neurons projecting to frontal cortex and VS shell following aversive events, and selective to dopamine neurons projecting to VS core and shell following appetitive events (Lammel et al., 2011). However, conditioned stimuli associated with aversive or appetitive outcomes evoke dopamine release in VS core (Young, 2004; Gan et al., 2010). These data indicate functional heterogeneity of dopamine neurons projecting to different sites, and are consistent with the proposal (Bromberg-Martin et al., 2010) of distinct dopamine circuits mediating: (i) orientation and motivation, (ii) value learning, and (iii) detection of important sensory events. The first two functions map nicely onto the triaging function of the VS and strategic/flexible function of the DMS proposed here. The responses of dopamine neurons to aversive stimuli may fall under the latter category, and evidence of a predominant cortical effect suggests that this signaling may be useful for engaging cortex-driven alteration of responding to avoid negative affective states. Further investigation is needed to determine if elevated dopamine evoked by aversive stimuli could increase attention to relevant stimuli or affect decision-making without engaging approach.
Another biological component missing from the standard RL model is the learned suppression of responding through conditioned and latent inhibition. These could be mediated by multiple mechanisms in the striatum. For conditioned inhibition, worse-than-expected rewards potentiate the indirect pathway via D2 receptors so as to inhibit future responses. This is separate from LTD on the direct pathway. This has been shown with optogenetic stimulation of the indirect pathway (Kravitz et al., 2012), and is consistent with data from medicated Parkinson's patients who do not have a mechanism for brief reductions in dopamine and are impaired on learning from negative, but not positive, reward prediction errors (Frank et al., 2004). Another important mechanism for response suppression is the triaging function of the emotional system, which is important for guiding behavior to non-positive CS (Figure 4D). For neutral CS, this system closes the sensory-motor gate by means of latent or conditioned inhibition involving hippocampus and VS (McDonald et al., 2002, 2006). The emotional memory system may take advantage of the rapid learning capabilities, contextual sensitivity, and large memory capacity of hippocampal episodic memory to prevent responding to stimuli previously unrelated to reinforcement. We predict that this triaging would be particularly advantageous for focusing responding on productive actions in stimulus-rich environments that allow many possible unproductive actions. It could also unburden the goal-directed and habit systems from learning associations for low-valued outcomes that could interfere with encoding of high-value associations. On the other hand, such gating out of sensorimotor responding by incidental learning in hippocampus and amygdala can impair behavioral flexibility when previously neutral stimuli become predictive of rewards.
The intermediate and ventral portions of hippocampus and associated structures innervate both the DMS and VS, and are thought to provide context. Although context is often associated with space, it can also represent temporal ordering in spatial and non-spatial domains (Fortin et al., 2002; Howland et al., 2008), segment phase during spatial alternation (Wood et al., 2000), configuration of stimuli (Rudy and Sutherland, 1989), task rules (Satvat et al., 2011), and internal drive states (Kennedy and Shapiro, 2004). Thus, the ability of hippocampus to pre-play task-related information prior to choices, as has been identified in dorsal hippocampus (Johnson and Redish, 2007; Ferbinteanu et al., 2011), could impart a great deal of information to DMS and frontal cortex in support of flexible decisions. This input is expected to be particularly important for decision making in conditions of high spatial or contextual ambiguity, or when complex configurations of stimuli or their temporal ordering cue appropriate behavior. The ability of hippocampus to separate patterns by generating orthogonal representations (O'Reilly and Rudy, 2001) may additionally facilitate learning about salient features of complex input that may otherwise be generalized in amygdala or cortico-striatal circuits. Conversely, ventral hippocampus is involved in contextual gating of activity to unrewarding stimuli. For instance, reversal learning is slower in familiar contexts as compared to novel contexts in rats with intact hippocampus, but this slowing is eliminated by lesions of ventral hippocampus (McDonald et al., 2002, 2006). This lesion-induced enhancement of reversal learning reflects removal of a context-specific inhibition, possibly by suppressing flexible approach to neutral stimuli via the VS. These lesions also enhance CPP, again reflecting a context-specific inhibitory influence possibly mediated by blocking amygdala access to VS (McDonald and White, 1993, 1995b). These inhibitory functions are separate but not exclusive of the proposed excitatory hippocampal gating functions mediated by glutamatergic input to SPN conjunctive with other afferents (O'Donnell and Grace, 1995).
One potential mechanism for the inhibitory hippocampal gating function is competition for striatal control mediated by inhibitory mechanisms in the striatum. This inhibition may also play a role in the competition between DMS-mediated “model-based” control and DLS-mediated “model-free” control observed in spatial navigation tasks (McDonald and White, 1994; Devan et al., 1999; Thorn et al., 2010), impulse suppression (Winstanley et al., 2006), and the hypothesized suppression of extinction functions of the ventral networks by DMS during reversal learning paradigms (McDonald et al., 2008b). However, these systems do not always compete. Tasks requiring integration of the unique processing capabilities of multiple systems lead to synergistic interactions (White and McDonald, 2002).
Pathology of decision-making systems
The processing in networks among cortex, basal ganglia, hippocampus, amygdala, hypothalamus, and brainstem neuromodulatory systems forms a distributed control architecture in which multiple learning systems operate in parallel to control behavior. This distributed architecture has the advantage of containing few single points of total failure; much of the work discussed in this review shows that rats can learn to perform many tasks, albeit sub-optimally, following damage to any one striatal region or its cortico-limbic afferents. However, it does provide multiple points for partial failure in that damage to any one of many structures can lead to the same deficit, and dysfunction of one control system can alter processing in others. These features obfuscate the etiology of decision-making deficits accompanying many mental illnesses. For example, schizophrenia among other illnesses is associated with complex symptomatology including impairments in working-memory, response perseveration, and outcome valuations as well as sensorimotor gating (Braff and Geyer, 1990; Gold et al., 2008). How much of this cognitive dysfunction derives from dysfunction of the proposed limbic triaging system remains to be determined (Grace, 2000; Bast, 2011). Furthermore, the limbic system is involved in or affected by brain systems mediating stress (Herman et al., 2005), mood (Price and Drevets, 2012), and addictions (Everitt et al., 1999), suggesting that it is a conduit by which these affective states can influence decisions in healthy and ill brains.
Conclusion
We propose that a ventral emotional network including amygdala and ventral portions of hippocampus, striatum, and medial prefrontal cortex performs triaging of responding based on affective associations so as to avoid unpleasurable stimuli, ignore inconsequential stimuli, and approach pleasurable stimuli. The approach and accompanying gating of sensory signals promote the selection of operant actions by model-based or model-free systems involving dorsal striatum. The rapid learning rate and large capacity for S-O (amygdala) and sensory-sensory (hippocampus) associations may provide a solution for the problem of learning to thrive in large environments that present many unproductive stimuli and permit many inconsequential actions, which is computationally difficult to solve (Sutton and Barto, 1998). The triaging system may allow animals to identify, following little experience, hazards to avoid and opportunities to exploit for benefit. This alone may facilitate survival by providing simple outcome-predictive control over elementary behaviors such as foraging. It may further aid more complex behaviors by unburdening model-based and model-free control from processing or remembering irrelevant information. In addition to gating responses, this ventral system also appears to invigorate actions as well as provide rich contextual information that aids the model-based control system in complex discriminations and learning. Potential downsides of this triaging system are poor decision-making under states of heightened emotion, and slow adaptation when affective associations of stimuli change qualitatively (e.g., neutral to positive). However, we suggest that slow adaptation of this system may aid learning in some appetitive tasks following contingency changes by keeping animals engaged in tasks while new response strategies can be learned, despite temporary reduction in rewarded responding.
Learning and memory systems involved in behavioral control appear to function in parallel, even when they are not part of the dominant response strategy. However, these non-dominant associations may become relevant when adapting to changing environmental contingencies. For reversal learning of a habit, suppression of both the dominant response and the latent/conditioned inhibition to previously irrelevant stimuli must accompany formation of new associations between stimuli, responses, and values. Dysfunction of any of these processes will retard reversals. Thus, a multiple-systems-level conceptualization may help drive new insights into decision-making processes in the brain, particularly regarding the role of emotional memories and the control of habitual responding, and may come to illuminate the processes underlying many examples of inflexible or irrational human behavior involving emotion (Ditto et al., 2006; Van Den Bos et al., 2008).
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Acknowledgments
We thank D. Euston for helpful discussion, and I. Skelin, S. Oberg, N. Hong, and the reviewers for their constructive comments on previous versions of the manuscript. Supported by funding from the Canadian Natural Sciences and Engineering Research Council (Aaron J. Gruber, Robert J. McDonald), Alberta Heritage Foundation for Medical Research (Aaron J. Gruber), and Alberta Innovates—Health Solutions (Aaron J. Gruber).
Glossary
Abbreviations
- CPP: conditioned place preference
- CS: conditioned stimulus
- DLS: dorsolateral striatum
- DMS: dorsomedial striatum
- PIT: Pavlovian-to-instrumental transfer
- RL: reinforcement learning
- R-O: response-outcome
- S-O: stimulus-outcome
- S-R: stimulus-response
- VS: ventral striatum (a.k.a. nucleus accumbens).
References
- Abercrombie E. D., Keefe K. A., Difrischia D. S., Zigmond M. J. (1989). Differential effect of stress on in vivo dopamine release in striatum, nucleus accumbens, and medial frontal cortex. J. Neurochem. 52, 1655–1658 [DOI] [PubMed] [Google Scholar]
- Adams C. D., Dickinson A. (1981). Instrumental responding following reinforcer devaluation. Q. J. Exp. Psychol. 33, 109–122 [Google Scholar]
- Albin R. L., Young A. B., Penney J. B. (1989). The functional anatomy of basal ganglia disorders. Trends Neurosci. 12, 366–375 [DOI] [PubMed] [Google Scholar]
- Alexander G. E., Delong M. R., Strick P. L. (1986). Parallel organization of functionally segregated circuits linking basal ganglia and cortex. Ann. Rev. Neurosci. 9, 357–381 [DOI] [PubMed] [Google Scholar]
- Amaral D., Witter M. (1995). Hippocampal formation, in The Rat Nervous System, 2nd Edn ed Paxinos G. (Sydney, NSW: Academic Press; ), 443–485 [Google Scholar]
- Amaral D. G., Witter M. P. (1989). The three-dimensional organization of the hippocampal formation: a review of anatomical data. Neuroscience 31, 571–591 10.1016/0306-4522(89)90424-7 [DOI] [PubMed] [Google Scholar]
- Ambroggi F., Ishikawa A., Fields H. L., Nicola S. M. (2008). Basolateral amygdala neurons facilitate reward-seeking behavior by exciting nucleus accumbens neurons. Neuron 59, 648–661 10.1016/j.neuron.2008.07.004 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Antoniadis E. A., McDonald R. J. (2000). Amygdala, hippocampus and discriminative fear conditioning to context. Behav. Brain Res. 108, 1–19 10.1016/S0166-4328(99)00121-7 [DOI] [PubMed] [Google Scholar]
- Aragona B. J., Day J. J., Roitman M. F., Cleaveland N. A., Wightman R. M., Carelli R. M. (2009). Regional specificity in the real-time development of phasic dopamine transmission patterns during acquisition of a cue-cocaine association in rats. Eur. J. Neurosci. 30, 1889–1899 10.1111/j.1460-9568.2009.07027.x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aron A. R., Poldrack R. A. (2006). Cortical and subcortical contributions to Stop signal response inhibition: role of the subthalamic nucleus. J. Neurosci. 26, 2424–2433 10.1523/JNEUROSCI.4682-05.2006 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bagshaw M. H., Benzies S. (1968). Multiple measures of the orienting reaction and their dissociation after amygdalectomy in monkeys. Exp. Neurol. 20, 175–187 10.1016/0014-4886(68)90090-3 [DOI] [PubMed] [Google Scholar]
- Balleine B., Killcross S. (1994). Effects of ibotenic acid lesions of the nucleus accumbens on instrumental action. Behav. Brain Res. 65, 181–193 [DOI] [PubMed] [Google Scholar]
- Balleine B. W., Dickinson A. (1998). Goal-directed instrumental action: contingency and incentive learning and their cortical substrates. Neuropharmacology 37, 407–419 [DOI] [PubMed] [Google Scholar]
- Barnes T. D., Kubota Y., Hu D., Jin D. Z., Graybiel A. M. (2005). Activity of striatal neurons reflects dynamic encoding and recoding of procedural memories. Nature 437, 1158–1161 10.1038/nature04053 [DOI] [PubMed] [Google Scholar]
- Bassareo V., De Luca M. A., Di Chiara G. (2002). Differential expression of motivational stimulus properties by dopamine in nucleus accumbens shell versus core and prefrontal cortex. J. Neurosci. 22, 4709–4719 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bast T. (2011). The hippocampal learning-behavior translation and the functional significance of hippocampal dysfunction in schizophrenia. Curr. Opin. Neurobiol. 21, 492–501 10.1016/j.conb.2011.01.003 [DOI] [PubMed] [Google Scholar]
- Bast T., Feldon J. (2003). Hippocampal modulation of sensorimotor processes. Prog. Neurobiol. 70, 319–345 10.1016/S0301-0082(03)00112-6 [DOI] [PubMed] [Google Scholar]
- Bast T., Wilson I. A., Witter M. P., Morris R. G. (2009). From rapid place learning to behavioral performance: a key role for the intermediate hippocampus. PLoS Biol. 7:e1000089 10.1371/journal.pbio.1000089 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bast T., Zhang W. N., Feldon J. (2001). The ventral hippocampus and fear conditioning in rats. Different anterograde amnesias of fear after tetrodotoxin inactivation and infusion of the GABA(A) agonist muscimol. Exp. Brain Res. 139, 39–52 10.1007/s002210100746 [DOI] [PubMed] [Google Scholar]
- Bayer H. M., Glimcher P. W. (2005). Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron 47, 129–141 10.1016/j.neuron.2005.05.020 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beckstead R. M., Domesick V. B., Nauta W. J. (1979). Efferent connections of the substantia nigra and ventral tegmental area in the rat. Brain Res. 175, 191–217 10.1016/0006-8993(79)91001-1 [DOI] [PubMed] [Google Scholar]
- Ben-Ari Y., Le Gal La Salle G. (1974). Lateral amygdala unit activity: II. Habituating and non-habituating neurons. Electroencephalogr. Clin. Neurophysiol. 37, 463–472 [DOI] [PubMed] [Google Scholar]
- Benchenane K., Peyrache A., Khamassi M., Tierney P. L., Gioanni Y., Battaglia F. P., Wiener S. I. (2010). Coherent theta oscillations and reorganization of spike timing in the hippocampal-prefrontal network upon learning. Neuron 66, 921–936 10.1016/j.neuron.2010.05.013 [DOI] [PubMed] [Google Scholar]
- Bennett B. D., Bolam J. P. (1994). Synaptic input and output of parvalbumin-immunoreactive neurons in the neostriatum of the rat. Neuroscience 62, 707–719 10.1016/0306-4522(94)90471-5 [DOI] [PubMed] [Google Scholar]
- Berke J. D. (2008). Uncoordinated firing rate changes of striatal fast-spiking interneurons during behavioral task performance. J. Neurosci. 28, 10075–10080 10.1523/JNEUROSCI.2192-08.2008 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Berridge C. W., Espana R. A., Vittoz N. M. (2010). Hypocretin/orexin in arousal and stress. Brain Res. 1314, 91–102 10.1016/j.brainres.2009.09.019 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Berridge K. C., Robinson T. E. (1998). What is the role of dopamine in reward: hedonic impact, reward learning, or incentive salience? Brain Res. Brain Res. Rev. 28, 309–369 10.1016/S0165-0173(98)00019-8 [DOI] [PubMed] [Google Scholar]
- Bevan M. D., Booth P. A., Eaton S. A., Bolam J. P. (1998). Selective innervation of neostriatal interneurons by a subclass of neuron in the globus pallidus of the rat. J. Neurosci. 18, 9438–9452 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Biedenkapp J. C., Rudy J. W. (2009). Hippocampal and extrahippocampal systems compete for control of contextual fear: role of ventral subiculum and amygdala. Learn. Mem. 16, 38–45 10.1101/lm.1099109 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Birrell J. M., Brown V. J. (2000). Medial frontal cortex mediates perceptual attentional set shifting in the rat. J. Neurosci. 20, 4320–4324 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Blundell P., Hall G., Killcross S. (2001). Lesions of the basolateral amygdala disrupt selective aspects of reinforcer representation in rats. J. Neurosci. 21, 9018–9026 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bolam J. P., Izzo P. N. (1988). The postsynaptic targets of substance P-immunoreactive terminals in the rat neostriatum with particular reference to identified spiny striatonigral neurons. Exp. Brain Res. 70, 361–377 [DOI] [PubMed] [Google Scholar]
- Bostock E., Muller R. U., Kubie J. L. (1991). Experience-dependent modifications of hippocampal place cell firing. Hippocampus 1, 193–205 10.1002/hipo.450010207 [DOI] [PubMed] [Google Scholar]
- Braff D. L., Geyer M. A. (1990). Sensorimotor gating and schizophrenia. Human and animal model studies. Arch. Gen. Psychiatry 47, 181–188 10.1001/archpsyc.1990.01810140081011 [DOI] [PubMed] [Google Scholar]
- Breiter H. C., Etcoff N. L., Whalen P. J., Kennedy W. A., Rauch S. L., Buckner R. L., Strauss M. M., Hyman S. E., Rosen B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17, 875–887 10.1016/S0896-6273(00)80219-6 [DOI] [PubMed] [Google Scholar]
- Brischoux F., Chakraborty S., Brierley D. I., Ungless M. A. (2009). Phasic excitation of dopamine neurons in ventral VTA by noxious stimuli. Proc. Natl. Acad. Sci. U.S.A. 106, 4894–4899 10.1073/pnas.0811507106 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Brog J. S., Salyapongse A., Deutch A. Y., Zahm D. S. (1993). The patterns of afferent innervation of the core and shell in the “accumbens” part of the rat ventral striatum: immunohistochemical detection of retrogradely transported fluoro-gold. J. Comp. Neurol. 338, 255–278 10.1002/cne.903380209 [DOI] [PubMed] [Google Scholar]
- Bromberg-Martin E. S., Matsumoto M., Hikosaka O. (2010). Dopamine in motivational control: rewarding, aversive, and alerting. Neuron 68, 815–834 10.1016/j.neuron.2010.11.022 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Brown P. L., Jenkins H. M. (1968). Auto-shaping of the pigeon's key-peck. J. Exp. Anal. Behav. 11, 1–8 10.1901/jeab.1968.11-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Buzsaki G. (2005). Theta rhythm of navigation: link between path integration and landmark navigation, episodic and semantic memory. Hippocampus 15, 827–840 10.1002/hipo.20113 [DOI] [PubMed] [Google Scholar]
- Cador M., Robbins T. W., Everitt B. J. (1989). Involvement of the amygdala in stimulus-reward associations: interaction with the ventral striatum. Neuroscience 30, 77–86 [DOI] [PubMed] [Google Scholar]
- Cardinal R. N., Parkinson J. A., Hall J., Everitt B. J. (2002a). Emotion and motivation: the role of the amygdala, ventral striatum, and prefrontal cortex. Neurosci. Biobehav. Rev. 26, 321–352 10.1016/S0149-7634(02)00007-6 [DOI] [PubMed] [Google Scholar]
- Cardinal R. N., Parkinson J. A., Lachenal G., Halkerston K. M., Rudarakanchana N., Hall J., Morrison C. H., Howes S. R., Robbins T. W., Everitt B. J. (2002b). Effects of selective excitotoxic lesions of the nucleus accumbens core, anterior cingulate cortex, and central nucleus of the amygdala on autoshaping performance in rats. Behav. Neurosci. 116, 553–567 [DOI] [PubMed] [Google Scholar]
- Carelli R. M., Deadwyler S. A. (1994). A comparison of nucleus accumbens neuronal firing patterns during cocaine self-administration and water reinforcement in rats. J. Neurosci. 14, 7735–7746 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Castane A., Theobald D. E., Robbins T. W. (2010). Selective lesions of the dorsomedial striatum impair serial spatial reversal learning in rats. Behav. Brain Res. 210, 74–83 10.1016/j.bbr.2010.02.017 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cenquizca L. A., Swanson L. W. (2007). Spatial organization of direct hippocampal field CA1 axonal projections to the rest of the cerebral cortex. Brain Res. Rev. 56, 1–26 10.1016/j.brainresrev.2007.05.002 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chang H. T., Wilson C. J., Kitai S. T. (1982). A Golgi study of rat neostriatal neurons: light microscopic analysis. J. Comp. Neurol. 208, 107–126 10.1002/cne.902080202 [DOI] [PubMed] [Google Scholar]
- Cheer J. F., Aragona B. J., Heien M. L., Seipel A. T., Carelli R. M., Wightman R. M. (2007). Coordinated accumbal dopamine release and neural activity drive goal-directed behavior. Neuron 54, 237–244 10.1016/j.neuron.2007.03.021 [DOI] [PubMed] [Google Scholar]
- Chevalier G., Deniau J. M. (1990). Disinhibition as a basic process in the expression of striatal functions. Trends Neurosci. 13, 277–280 [DOI] [PubMed] [Google Scholar]
- Collins A. G., Frank M. J. (2012). How much of reinforcement learning is working memory, not reinforcement learning? A behavioral, computational, and neurogenetic analysis. Eur. J. Neurosci. 35, 1024–1035 10.1111/j.1460-9568.2011.07980.x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Colwill R. M., Rescorla R. A. (1988). Associations between the discriminative stimulus and the reinforcer in instrumental learning. J. Exp. Psychol. Anim. Behav. Process. 14, 155–164 [Google Scholar]
- Corbit L. H., Balleine B. W. (2005). Double dissociation of basolateral and central amygdala lesions on the general and outcome-specific forms of pavlovian-instrumental transfer. J. Neurosci. 25, 962–970 10.1523/JNEUROSCI.4507-04.2005 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Corbit L. H., Balleine B. W. (2011). The general and outcome-specific forms of Pavlovian-instrumental transfer are differentially mediated by the nucleus accumbens core and shell. J. Neurosci. 31, 11786–11794. doi: 10.1523/JNEUROSCI.2711-11.2011
- Corbit L. H., Janak P. H. (2007). Inactivation of the lateral but not medial dorsal striatum eliminates the excitatory impact of Pavlovian stimuli on instrumental responding. J. Neurosci. 27, 13977–13981. doi: 10.1523/JNEUROSCI.4097-07.2007
- Corbit L. H., Muir J. L., Balleine B. W. (2001). The role of the nucleus accumbens in instrumental conditioning: evidence of a functional dissociation between accumbens core and shell. J. Neurosci. 21, 3251–3260
- Coutureau E., Galani R., Gosselin O., Majchrzak M., Di Scala G. (1999). Entorhinal but not hippocampal or subicular lesions disrupt latent inhibition in rats. Neurobiol. Learn. Mem. 72, 143–157. doi: 10.1006/nlme.1998.3895
- Czubayko U., Plenz D. (2002). Fast synaptic transmission between striatal spiny projection neurons. Proc. Natl. Acad. Sci. U.S.A. 99, 15764–15769. doi: 10.1073/pnas.242428599
- Dalley J. W., Cardinal R. N., Robbins T. W. (2004). Prefrontal executive and cognitive functions in rodents: neural and neurochemical substrates. Neurosci. Biobehav. Rev. 28, 771–784. doi: 10.1016/j.neubiorev.2004.09.006
- Davis M. (1986). Pharmacological and anatomical analysis of fear conditioning using the fear-potentiated startle paradigm. Behav. Neurosci. 100, 814–824
- Davis M. (1990). Animal models of anxiety based on classical conditioning: the conditioned emotional response (CER) and the fear-potentiated startle effect. Pharmacol. Ther. 47, 147–165. doi: 10.1016/0163-7258(90)90084-F
- Davis M. (1992a). The role of the amygdala in fear-potentiated startle: implications for animal models of anxiety. Trends Pharmacol. Sci. 13, 35–41
- Davis M. (1992b). The role of the amygdala in fear and anxiety. Annu. Rev. Neurosci. 15, 353–375. doi: 10.1146/annurev.ne.15.030192.002033
- Davis M., Gendelman D. S., Tischler M. D., Gendelman P. M. (1982). A primary acoustic startle circuit: lesion and stimulation studies. J. Neurosci. 2, 791–805
- Daw N. D., Niv Y., Dayan P. (2005). Uncertainty-based competition between prefrontal and dorsolateral striatal systems for behavioral control. Nat. Neurosci. 8, 1704–1711. doi: 10.1038/nn1560
- Daw N. D., O'Doherty J. P., Dayan P., Seymour B., Dolan R. J. (2006). Cortical substrates for exploratory decisions in humans. Nature 441, 876–879. doi: 10.1038/nature04766
- Day J. J., Wheeler R. A., Roitman M. F., Carelli R. M. (2006). Nucleus accumbens neurons encode Pavlovian approach behaviors: evidence from an autoshaping paradigm. Eur. J. Neurosci. 23, 1341–1351. doi: 10.1111/j.1460-9568.2006.04654.x
- De Borchgrave R., Rawlins J. N., Dickinson A., Balleine B. W. (2002). Effects of cytotoxic nucleus accumbens lesions on instrumental conditioning in rats. Exp. Brain Res. 144, 50–68. doi: 10.1007/s00221-002-1031-y
- Derdikman D., Moser E. I. (2010). A manifold of spatial maps in the brain. Trends Cogn. Sci. 14, 561–569. doi: 10.1016/j.tics.2010.09.004
- Devan B. D., Hong N. S., McDonald R. J. (2011). Parallel associative processing in the dorsal striatum: segregation of stimulus-response and cognitive control subregions. Neurobiol. Learn. Mem. 96, 95–120. doi: 10.1016/j.nlm.2011.06.002
- Devan B. D., McDonald R. J., White N. M. (1999). Effects of medial and lateral caudate-putamen lesions on place- and cue-guided behaviors in the water maze: relation to thigmotaxis. Behav. Brain Res. 100, 5–14. doi: 10.1016/S0166-4328(98)00107-7
- Devan B. D., White N. M. (1999). Parallel information processing in the dorsal striatum: relation to hippocampal function. J. Neurosci. 19, 2789–2798
- Dickinson A., Dawson G. R. (1987). Pavlovian processes in the motivated control of instrumental performance. Q. J. Exp. Psychol. B Comp. Physiol. Psychol. 39, 201–213
- DiFiglia M., Pasik P., Pasik T. (1976). A Golgi study of neuronal types in the neostriatum of monkeys. Brain Res. 114, 245–256. doi: 10.1016/0006-8993(76)90669-7
- Ditto P. H., Pizarro D. A., Epstein E. B., Jacobson J. A., Macdonald T. K. (2006). Visceral influences on risk-taking behavior. J. Behav. Decis. Making 19, 99–113
- Doya K. (2008). Modulators of decision making. Nat. Neurosci. 11, 410–416. doi: 10.1038/nn2077
- El-Amamy H., Holland P. C. (2007). Dissociable effects of disconnecting amygdala central nucleus from the ventral tegmental area or substantia nigra on learned orienting and incentive motivation. Eur. J. Neurosci. 25, 1557–1567. doi: 10.1111/j.1460-9568.2007.05402.x
- Estes W. K. (1943). Discriminative conditioning. I: a discriminative property of conditioned anticipation. J. Exp. Psychol. 32, 150–155
- Estes W. K., Skinner B. F. (1941). Some quantitative properties of anxiety. J. Exp. Psychol. 29, 390–400
- Euston D. R., Tatsuno M., McNaughton B. L. (2007). Fast-forward playback of recent memory sequences in prefrontal cortex during sleep. Science 318, 1147–1150. doi: 10.1126/science.1148979
- Everitt B. J., Cador M., Robbins T. W. (1989). Interactions between the amygdala and ventral striatum in stimulus-reward associations – studies using a 2nd-order schedule of sexual reinforcement. Neuroscience 30, 63–75
- Everitt B. J., Cardinal R. N., Parkinson J. A., Robbins T. W. (2003). Appetitive behavior: impact of amygdala-dependent mechanisms of emotional learning. Ann. N.Y. Acad. Sci. 985, 233–250. doi: 10.1111/j.1749-6632.2003.tb07085.x
- Everitt B. J., Morris K. A., O'Brien A., Robbins T. W. (1991). The basolateral amygdala-ventral striatal system and conditioned place preference: further evidence of limbic-striatal interactions underlying reward-related processes. Neuroscience 42, 1–18. doi: 10.1016/0306-4522(91)90145-E
- Everitt B. J., Parkinson J. A., Olmstead M. C., Arroyo M., Robledo P., Robbins T. W. (1999). Associative processes in addiction and reward – the role of amygdala-ventral striatal subsystems. Ann. N.Y. Acad. Sci. 877, 412–438. doi: 10.1111/j.1749-6632.1999.tb09280.x
- Fanselow M. S. (2000). Contextual fear, gestalt memories, and the hippocampus. Behav. Brain Res. 110, 73–81. doi: 10.1016/S0166-4328(99)00186-2
- Fanselow M. S., Dong H. W. (2010). Are the dorsal and ventral hippocampus functionally distinct structures? Neuron 65, 7–19. doi: 10.1016/j.neuron.2009.11.031
- Fanselow M. S., Poulos A. M. (2005). The neuroscience of mammalian associative learning. Annu. Rev. Psychol. 56, 207–234. doi: 10.1146/annurev.psych.56.091103.070213
- Featherstone R. E., McDonald R. J. (2004). Dorsal striatum and stimulus-response learning: lesions of the dorsolateral, but not dorsomedial, striatum impair acquisition of a simple discrimination task. Behav. Brain Res. 150, 15–23. doi: 10.1016/S0166-4328(03)00218-3
- Ferbinteanu J., Holsinger R. M., McDonald R. J. (1999). Lesions of the medial or lateral perforant path have different effects on hippocampal contributions to place learning and on fear conditioning to context. Behav. Brain Res. 101, 65–84. doi: 10.1016/S0166-4328(98)00144-2
- Ferbinteanu J., McDonald R. J. (2001). Dorsal/ventral hippocampus, fornix, and conditioned place preference. Hippocampus 11, 187–200. doi: 10.1002/hipo.1036
- Ferbinteanu J., Ray C., McDonald R. J. (2003). Both dorsal and ventral hippocampus contribute to spatial learning in Long-Evans rats. Neurosci. Lett. 345, 131–135. doi: 10.1016/S0304-3940(03)00473-7
- Ferbinteanu J., Shirvalkar P., Shapiro M. L. (2011). Memory modulates journey-dependent coding in the rat hippocampus. J. Neurosci. 31, 9135–9146. doi: 10.1523/JNEUROSCI.1241-11.2011
- Floresco S. B., Blaha C. D., Yang C. R., Phillips A. G. (2001). Modulation of hippocampal and amygdalar-evoked activity of nucleus accumbens neurons by dopamine: cellular mechanisms of input selection. J. Neurosci. 21, 2851–2860
- Fortin N. J., Agster K. L., Eichenbaum H. B. (2002). Critical role of the hippocampus in memory for sequences of events. Nat. Neurosci. 5, 458–462. doi: 10.1038/nn834
- Frank M. J., Seeberger L. C., O'Reilly R. C. (2004). By carrot or by stick: cognitive reinforcement learning in parkinsonism. Science 306, 1940–1943. doi: 10.1126/science.1102941
- Frankland P. W., Cestari V., Filipkowski R. K., McDonald R. J., Silva A. J. (1998). The dorsal hippocampus is essential for context discrimination but not for contextual conditioning. Behav. Neurosci. 112, 863–874
- Gage G. J., Stoetzner C. R., Wiltschko A. B., Berke J. D. (2010). Selective activation of striatal fast-spiking interneurons during choice execution. Neuron 67, 466–479. doi: 10.1016/j.neuron.2010.06.034
- Gal G., Joel D., Gusak O., Feldon J., Weiner I. (1997). The effects of electrolytic lesion to the shell subterritory of the nucleus accumbens on delayed non-matching-to-sample and four-arm baited eight-arm radial-maze tasks. Behav. Neurosci. 111, 92–103
- Galinanes G. L., Braz B. Y., Murer M. G. (2011). Origin and properties of striatal local field potential responses to cortical stimulation: temporal regulation by fast inhibitory connections. PLoS ONE 6:e28473. doi: 10.1371/journal.pone.0028473
- Gallagher M., Graham P. W., Holland P. C. (1990). The amygdala central nucleus and appetitive Pavlovian conditioning: lesions impair one class of conditioned behavior. J. Neurosci. 10, 1906–1911
- Gallagher M., McMahan R. W., Schoenbaum G. (1999). Orbitofrontal cortex and representation of incentive value in associative learning. J. Neurosci. 19, 6610–6614
- Gan J. O., Walton M. E., Phillips P. E. (2010). Dissociable cost and benefit encoding of future rewards by mesolimbic dopamine. Nat. Neurosci. 13, 25–27. doi: 10.1038/nn.2460
- Gardiner T. W., Kitai S. T. (1992). Single-unit activity in the globus pallidus and neostriatum of the rat during performance of a trained head movement. Exp. Brain Res. 88, 517–530
- Gerfen C. R. (1992). The neostriatal mosaic: multiple levels of compartmental organization in the basal ganglia. Annu. Rev. Neurosci. 15, 285–320. doi: 10.1146/annurev.ne.15.030192.001441
- Gerfen C. R., Engber T. M., Mahan L. C., Susel Z., Chase T. N., Monsma F. J., Sibley D. R. (1990). D1 and D2 dopamine receptor-regulated gene expression of striatonigral and striatopallidal neurons. Science 250, 1429–1432. doi: 10.1126/science.2147780
- Gilbert P. E., Kesner R. P., Decoteau W. E. (1998). Memory for spatial location: role of the hippocampus in mediating spatial pattern separation. J. Neurosci. 18, 804–810
- Gittis A. H., Leventhal D. K., Fensterheim B. A., Pettibone J. R., Berke J. D., Kreitzer A. C. (2011). Selective inhibition of striatal fast-spiking interneurons causes dyskinesias. J. Neurosci. 31, 15727–15731. doi: 10.1523/JNEUROSCI.3875-11.2011
- Glascher J., Daw N., Dayan P., O'Doherty J. P. (2010). States versus rewards: dissociable neural prediction error signals underlying model-based and model-free reinforcement learning. Neuron 66, 585–595. doi: 10.1016/j.neuron.2010.04.016
- Goddard G. V. (1964). Functions of the amygdala. Psychol. Bull. 62, 89–109
- Gold J. M., Waltz J. A., Prentice K. J., Morris S. E., Heerey E. A. (2008). Reward processing in schizophrenia: a deficit in the representation of value. Schizophr. Bull. 34, 835–847. doi: 10.1093/schbul/sbn068
- Goldstein B. L., Barnett B. R., Vasquez G., Tobia S. C., Kashtelyan V., Burton A. C., Bryden D. W., Roesch M. R. (2012). Ventral striatum encodes past and predicted value independent of motor contingencies. J. Neurosci. 32, 2027–2036. doi: 10.1523/JNEUROSCI.5349-11.2012
- Goto Y., Grace A. A. (2005). Dopaminergic modulation of limbic and cortical drive of nucleus accumbens in goal-directed behavior. Nat. Neurosci. 8, 805–812. doi: 10.1038/nn1471
- Goto Y., O'Donnell P. (2001). Synchronous activity in the hippocampus and nucleus accumbens in vivo. J. Neurosci. 21, RC131
- Grace A. A. (2000). Gating of information flow within the limbic system and the pathophysiology of schizophrenia. Brain Res. Brain Res. Rev. 31, 330–341. doi: 10.1016/S0165-0173(99)00049-1
- Groenewegen H. J., Berendse H. W., Haber S. N. (1993). Organization of the output of the ventral striatopallidal system in the rat: ventral pallidal efferents. Neuroscience 57, 113–142. doi: 10.1016/0306-4522(93)90115-V
- Groenewegen H. J., Berendse H. W., Wolters J. G., Lohman A. H. (1990). The anatomical relationship of the prefrontal cortex with the striatopallidal system, the thalamus and the amygdala: evidence for a parallel organization. Prog. Brain Res. 85, 95–116; discussion 116–118
- Groenewegen H. J., Russchen F. T. (1984). Organization of the efferent projections of the nucleus accumbens to pallidal, hypothalamic, and mesencephalic structures: a tracing and immunohistochemical study in the cat. J. Comp. Neurol. 223, 347–367. doi: 10.1002/cne.902230303
- Groenewegen H. J., Vermeulen-Van Der Zee E., Te Kortschot A., Witter M. P. (1987). Organization of the projections from the subiculum to the ventral striatum in the rat. A study using anterograde transport of Phaseolus vulgaris leucoagglutinin. Neuroscience 23, 103–120
- Groves P. M. (1983). A theory of the functional organization of the neostriatum and the neostriatal control of voluntary movement. Brain Res. 286, 109–132
- Gruber A. J., Dayan P., Gutkin B. S., Solla S. A. (2004). Dopamine modulation in a basal ganglio-cortical network implements saliency-based gating of working memory, in Advances in Neural Information Processing Systems 16, eds Thrun S., Saul L., Scholkopf B. (Cambridge, MA: MIT Press), 1271–1278
- Gruber A. J., Dayan P., Gutkin B. S., Solla S. A. (2006). Dopamine modulation in the basal ganglia locks the gate to working memory. J. Comput. Neurosci. 20, 153–166. doi: 10.1007/s10827-005-5705-x
- Gruber A. J., Hussain R. J., O'Donnell P. (2009a). The nucleus accumbens: a switchboard for goal-directed behaviors. PLoS ONE 4:e5062. doi: 10.1371/journal.pone.0005062
- Gruber A. J., Powell E. M., O'Donnell P. (2009b). Cortically activated interneurons shape spatial aspects of cortico-accumbens processing. J. Neurophysiol. 101, 1876–1882. doi: 10.1152/jn.91002.2008
- Haas H. L., Sergeeva O. A., Selbach O. (2008). Histamine in the nervous system. Physiol. Rev. 88, 1183–1241. doi: 10.1152/physrev.00043.2007
- Haber S. N., Fudge J. L., McFarland N. R. (2000). Striatonigrostriatal pathways in primates form an ascending spiral from the shell to the dorsolateral striatum. J. Neurosci. 20, 2369–2382
- Hall J., Parkinson J. A., Connor T. M., Dickinson A., Everitt B. J. (2001). Involvement of the central nucleus of the amygdala and nucleus accumbens core in mediating Pavlovian influences on instrumental behaviour. Eur. J. Neurosci. 13, 1984–1992. doi: 10.1046/j.0953-816x.2001.01577.x
- Harley C. W. (1979). Arm choices in a sunburst maze: effects of hippocampectomy in the rat. Physiol. Behav. 23, 283–290. doi: 10.1016/0031-9384(79)90369-X
- Hasselmo M. E., Eichenbaum H. (2005). Hippocampal mechanisms for the context-dependent retrieval of episodes. Neural Netw. 18, 1172–1190. doi: 10.1016/j.neunet.2005.08.007
- Hauber W., Sommer S. (2009). Prefrontostriatal circuitry regulates effort-related decision making. Cereb. Cortex 19, 2240–2247. doi: 10.1093/cercor/bhn241
- Heimer L., Zahm D. S., Churchill L., Kalivas P. W., Wohltmann C. (1991). Specificity in the projection patterns of accumbal core and shell in the rat. Neuroscience 41, 89–125. doi: 10.1016/0306-4522(91)90202-Y
- Herman J. P., Ostrander M. M., Mueller N. K., Figueiredo H. (2005). Limbic system mechanisms of stress regulation: hypothalamo-pituitary-adrenocortical axis. Prog. Neuropsychopharmacol. Biol. Psychiatry 29, 1201–1213. doi: 10.1016/j.pnpbp.2005.08.006
- Hikosaka O. (1998). Neural systems for control of voluntary action–a hypothesis. Adv. Biophys. 35, 81–102. doi: 10.1016/S0065-227X(98)80004-X
- Hilton S. M. (1982). The defence-arousal system and its relevance for circulatory and respiratory control. J. Exp. Biol. 100, 159–174
- Hiroi N., White N. M. (1991a). The amphetamine conditioned place preference: differential involvement of dopamine receptor subtypes and two dopaminergic terminal areas. Brain Res. 552, 141–152. doi: 10.1016/0006-8993(91)90672-I
- Hiroi N., White N. M. (1991b). The lateral nucleus of the amygdala mediates expression of the amphetamine-produced conditioned place preference. J. Neurosci. 11, 2107–2116
- Hirsh R. (1974). The hippocampus and contextual retrieval of information from memory: a theory. Behav. Biol. 12, 421–444
- Hock B. J., Jr., Bunsey M. D. (1998). Differential effects of dorsal and ventral hippocampal lesions. J. Neurosci. 18, 7027–7032
- Holland P. C. (1977). Conditioned stimulus as a determinant of the form of the Pavlovian conditioned response. J. Exp. Psychol. Anim. Behav. Process. 3, 77–104
- Holland P. C. (2004). Relations between Pavlovian-instrumental transfer and reinforcer devaluation. J. Exp. Psychol. Anim. Behav. Process. 30, 104–117. doi: 10.1037/0097-7403.30.2.104
- Holland P. C., Gallagher M. (2003). Double dissociation of the effects of lesions of basolateral and central amygdala on conditioned stimulus-potentiated feeding and Pavlovian-instrumental transfer. Eur. J. Neurosci. 17, 1680–1694. doi: 10.1046/j.1460-9568.2003.02585.x
- Honey R. C., Good M. (1993). Selective hippocampal lesions abolish the contextual specificity of latent inhibition and conditioning. Behav. Neurosci. 107, 23–33
- Honey R. C., Hall G. (1989). Attenuation of latent inhibition after compound pre-exposure: associative and perceptual explanations. Q. J. Exp. Psychol. B 41, 355–368
- Hoover W. B., Vertes R. P. (2011). Projections of the medial orbital and ventral orbital cortex in the rat. J. Comp. Neurol. 519, 3766–3801. doi: 10.1002/cne.22733
- Howland J. G., Harrison R. A., Hannesson D. K., Phillips A. G. (2008). Ventral hippocampal involvement in temporal order, but not recognition, memory for spatial information. Hippocampus 18, 251–257. doi: 10.1002/hipo.20396
- Howland J. G., Taepavarapruk P., Phillips A. G. (2002). Glutamate receptor-dependent modulation of dopamine efflux in the nucleus accumbens by basolateral, but not central, nucleus of the amygdala in rats. J. Neurosci. 22, 1137–1145
- Hull C. L. (1943). Principles of Behaviour. New York, NY: Appleton-Century-Crofts
- Humphries M. D., Khamassi M., Gurney K. (2012). Dopaminergic control of the exploration-exploitation trade-off via the basal ganglia. Front. Neurosci. 6:9. doi: 10.3389/fnins.2012.00009
- Hyman J. M., Zilli E. A., Paley A. M., Hasselmo M. E. (2010). Working memory performance correlates with prefrontal-hippocampal theta interactions but not with prefrontal neuron firing rates. Front. Integr. Neurosci. 4:2. doi: 10.3389/neuro.07.002.2010
- Ishikawa A., Ambroggi F., Nicola S. M., Fields H. L. (2008). Dorsomedial prefrontal cortex contribution to behavioral and nucleus accumbens neuronal responses to incentive cues. J. Neurosci. 28, 5088–5098. doi: 10.1523/JNEUROSCI.0253-08.2008
- Ito M., Doya K. (2009). Validation of decision-making models and analysis of decision variables in the rat basal ganglia. J. Neurosci. 29, 9861–9874. doi: 10.1523/JNEUROSCI.6157-08.2009
- Jezek K., Henriksen E. J., Treves A., Moser E. I., Moser M. B. (2011). Theta-paced flickering between place-cell maps in the hippocampus. Nature 478, 246–249. doi: 10.1038/nature10439
- Joel D., Weiner I. (1994). The organization of the basal ganglia-thalamocortical circuits: open interconnected rather than closed segregated. Neuroscience 63, 363–379. doi: 10.1016/0306-4522(94)90536-3
- Joel D., Weiner I. (2000). The connections of the dopaminergic system with the striatum in rats and primates: an analysis with respect to the functional and compartmental organization of the striatum. Neuroscience 96, 451–474. doi: 10.1016/S0306-4522(99)00575-8
- Joel D., Weiner I., Feldon J. (1997). Electrolytic lesions of the medial prefrontal cortex in rats disrupt performance on an analog of the Wisconsin Card Sorting Test, but do not disrupt latent inhibition: implications for animal models of schizophrenia. Behav. Brain Res. 85, 187–201. doi: 10.1016/S0166-4328(97)87583-3
- Johnson A., Redish A. D. (2007). Neural ensembles in CA3 transiently encode paths forward of the animal at a decision point. J. Neurosci. 27, 12176–12189. doi: 10.1523/JNEUROSCI.3761-07.2007
- Jones J. L., Day J. J., Aragona B. J., Wheeler R. A., Wightman R. M., Carelli R. M. (2010). Basolateral amygdala modulates terminal dopamine release in the nucleus accumbens and conditioned responding. Biol. Psychiatry 67, 737–744. doi: 10.1016/j.biopsych.2009.11.006
- Jones M. W., Wilson M. A. (2005). Theta rhythms coordinate hippocampal-prefrontal interactions in a spatial memory task. PLoS Biol. 3:e402. doi: 10.1371/journal.pbio.0030402
- Kapp B. S., Frysinger R. C., Gallagher M., Haselton J. R. (1979). Amygdala central nucleus lesions: effect on heart rate conditioning in the rabbit. Physiol. Behav. 23, 1109–1117
- Kapp B. S., Wilson A., Pascoe J. P., Supple W., Whalen P. J. (1990). Neuroanatomical system analysis of bradycardia in the rabbit, in Neurocomputation and Learning: Foundations of Adaptive Networks, eds Gabriel M., Moore J. (Cambridge, MA: MIT Press), 53–90
- Kawaguchi Y. (1993). Physiological, morphological, and histochemical characterization of three classes of interneurons in rat neostriatum. J. Neurosci. 13, 4908–4923
- Kelley A. E., Baldo B. A., Pratt W. E. (2005). A proposed hypothalamic-thalamic-striatal axis for the integration of energy balance, arousal, and food reward. J. Comp. Neurol. 493, 72–85. doi: 10.1002/cne.20769
- Kennedy P. J., Shapiro M. L. (2004). Retrieving memories via internal context requires the hippocampus. J. Neurosci. 24, 6979–6985. doi: 10.1523/JNEUROSCI.1388-04.2004
- Kentridge R. W., Shaw C., Aggleton J. P. (1991). Amygdaloid lesions and stimulus-reward associations in the rat. Behav. Brain Res. 42, 57–66
- Kesner R. P., Churchwell J. C. (2011). An analysis of rat prefrontal cortex in mediating executive function. Neurobiol. Learn. Mem. 96, 417–431. doi: 10.1016/j.nlm.2011.07.002
- Kim H., Sul J. H., Huh N., Lee D., Jung M. W. (2009). Role of striatum in updating values of chosen actions. J. Neurosci. 29, 14701–14712. doi: 10.1523/JNEUROSCI.2728-09.2009
- Kim J. J., Clark R. E., Thompson R. F. (1995). Hippocampectomy impairs the memory of recently, but not remotely, acquired trace eyeblink conditioned responses. Behav. Neurosci. 109, 195–203
- Kim J. J., Fanselow M. S. (1992). Modality-specific retrograde amnesia of fear. Science 256, 675–677. doi: 10.1126/science.1585183
- Kim J. J., Rison R. A., Fanselow M. S. (1993). Effects of amygdala, hippocampus, and periaqueductal gray lesions on short- and long-term contextual fear. Behav. Neurosci. 107, 1093–1098
- Kimchi E. Y., Laubach M. (2009). Dynamic encoding of action selection by the medial striatum. J. Neurosci. 29, 3148–3159. doi: 10.1523/JNEUROSCI.5206-08.2009
- Kincaid A. E., Zheng T., Wilson C. J. (1998). Connectivity and convergence of single corticostriatal axons. J. Neurosci. 18, 4722–4731
- Kita H. (1993). GABAergic circuits of the striatum. Prog. Brain Res. 99, 51–72
- Kita H., Kitai S. T. (1994). The morphology of globus pallidus projection neurons in the rat: an intracellular staining study. Brain Res. 636, 308–319
- Kluver H., Bucy P. C. (1939). Preliminary analysis of functions of the temporal lobes in monkeys. J. Neuropsychiatry Clin. Neurosci. 9, 606–620
- Koch M., Schnitzler H. U. (1997). The acoustic startle response in rats–circuits mediating evocation, inhibition and potentiation. Behav. Brain Res. 89, 35–49
- Kolb B., Buhrmann K., McDonald R., Sutherland R. J. (1994). Dissociation of the medial prefrontal, posterior parietal, and posterior temporal cortex for spatial navigation and recognition memory in the rat. Cereb. Cortex 4, 664–680. doi: 10.1093/cercor/4.6.664
- Koos T., Tepper J. M. (1999). Inhibitory control of neostriatal projection neurons by GABAergic interneurons. Nat. Neurosci. 2, 467–472. doi: 10.1038/8138
- Koos T., Tepper J. M., Wilson C. J. (2004). Comparison of IPSCs evoked by spiny and fast-spiking neurons in the neostriatum. J. Neurosci. 24, 7916–7922. doi: 10.1523/JNEUROSCI.2163-04.2004
- Kravitz A. V., Freeze B. S., Parker P. R., Kay K., Thwin M. T., Deisseroth K., Kreitzer A. C. (2010). Regulation of parkinsonian motor behaviours by optogenetic control of basal ganglia circuitry. Nature 466, 622–626. doi: 10.1038/nature09159
- Kravitz A. V., Tye L. D., Kreitzer A. C. (2012). Distinct roles for direct and indirect pathway striatal neurons in reinforcement. Nat. Neurosci. [Epub ahead of print]. doi: 10.1038/nn.3100
- Krayniak P. F., Meibach R. C., Siegel A. (1981). A projection from the entorhinal cortex to the nucleus accumbens in the rat. Brain Res. 209, 427–431. doi: 10.1016/0006-8993(81)90165-7
- Lammel S., Ion D. I., Roeper J., Malenka R. C. (2011). Projection-specific modulation of dopamine neuron synapses by aversive and rewarding stimuli. Neuron 70, 855–862. doi: 10.1016/j.neuron.2011.03.025
- Lansink C. S., Goltstein P. M., Lankelma J. V., Joosten R. N., McNaughton B. L., Pennartz C. M. (2008). Preferential reactivation of motivationally relevant information in the ventral striatum. J. Neurosci. 28, 6372–6382. doi: 10.1523/JNEUROSCI.1054-08.2008
- Lansink C. S., Goltstein P. M., Lankelma J. V., McNaughton B. L., Pennartz C. M. (2009). Hippocampus leads ventral striatum in replay of place-reward information. PLoS Biol. 7:e1000173. doi: 10.1371/journal.pbio.1000173
- Lansink C. S., Goltstein P. M., Lankelma J. V., Pennartz C. M. (2010). Fast-spiking interneurons of the rat ventral striatum: temporal coordination of activity with principal cells and responsiveness to reward. Eur. J. Neurosci. 32, 494–508. doi: 10.1111/j.1460-9568.2010.07293.x
- Lavenex P., Amaral D. G. (2000). Hippocampal-neocortical interaction: a hierarchy of associativity. Hippocampus 10, 420–430
- Leaf R. C., Muller S. A. (1965). Simple method for CER conditioning and measurement. Psychol. Rep. 17, 211–215
- LeDoux J. (2007). The amygdala. Curr. Biol. 17, R868–R874. doi: 10.1016/j.cub.2007.08.005
- LeDoux J. E., Cicchetti P., Xagoraris A., Romanski L. M. (1990). The lateral amygdaloid nucleus: sensory interface of the amygdala in fear conditioning. J. Neurosci. 10, 1062–1069
- Leutgeb S., Leutgeb J. K., Barnes C. A., Moser E. I., McNaughton B. L., Moser M. B. (2005). Independent codes for spatial and episodic memory in hippocampal neuronal ensembles. Science 309, 619–623
- Levesque M., Parent A. (1998). Axonal arborization of corticostriatal and corticothalamic fibers arising from prelimbic cortex in the rat. Cereb. Cortex 8, 602–613. doi: 10.1093/cercor/8.7.602
- Lex A., Hauber W. (2008). Dopamine D1 and D2 receptors in the nucleus accumbens core and shell mediate Pavlovian-instrumental transfer. Learn. Mem. 15, 483–491. doi: 10.1101/lm.978708
- Liu X., Ramirez S., Pang P. T., Puryear C. B., Govindarajan A., Deisseroth K., Tonegawa S. (2012). Optogenetic stimulation of a hippocampal engram activates fear memory recall. Nature 484, 381–385. doi: 10.1038/nature11028
- Lovibond P. F. (1983). Facilitation of instrumental behavior by a Pavlovian appetitive conditioned stimulus. J. Exp. Psychol. Anim. Behav. Process. 9, 225–247
- Lubow R. E. (1989). Latent Inhibition and Conditioned Attention Theory. Cambridge: Cambridge University Press
- Lynd-Balta E., Haber S. N. (1994). The organization of midbrain projections to the striatum in the primate: sensorimotor-related striatum versus ventral striatum. Neuroscience 59, 625–640
- Mallet N., Le Moine C., Charpier S., Gonon F. (2005). Feedforward inhibition of projection neurons by fast-spiking GABA interneurons in the rat striatum in vivo. J. Neurosci. 25, 3857–3869. doi: 10.1523/JNEUROSCI.5027-04.2005
- Mantz J., Thierry A. M., Glowinski J. (1989). Effect of noxious tail pinch on the discharge rate of mesocortical and mesolimbic dopamine neurons: selective activation of the mesocortical system. Brain Res. 476, 377–381. doi: 10.1016/0006-8993(89)91263-8
- Maren S. (1999). Neurotoxic or electrolytic lesions of the ventral subiculum produce deficits in the acquisition and expression of Pavlovian fear conditioning in rats. Behav. Neurosci. 113, 283–290
- Maren S. (2001). Neurobiology of Pavlovian fear conditioning. Annu. Rev. Neurosci. 24, 897–931. doi: 10.1146/annurev.neuro.24.1.897
- Maren S., Aharonov G., Fanselow M. S. (1997). Neurotoxic lesions of the dorsal hippocampus and Pavlovian fear conditioning in rats. Behav. Brain Res. 88, 261–274. doi: 10.1016/S0166-4328(97)00088-0
- Maren S., Holt W. (2000). The hippocampus and contextual memory retrieval in Pavlovian conditioning. Behav. Brain Res. 110, 97–108. doi: 10.1016/S0166-4328(99)00188-6
- Maren S., Quirk G. J. (2004). Neuronal signalling of fear memory. Nat. Rev. Neurosci. 5, 844–852. doi: 10.1038/nrn1535
- Matsumoto M., Hikosaka O. (2009). Two types of dopamine neuron distinctly convey positive and negative motivational signals. Nature 459, 837–841. doi: 10.1038/nature08028
- McClelland J. L., McNaughton B. L., O'Reilly R. C. (1995). Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and memory. Psychol. Rev. 102, 419–457. doi: 10.1037/0033-295X.102.3.419
- McDannald M. A., Saddoris M. P., Gallagher M., Holland P. C. (2005). Lesions of orbitofrontal cortex impair rats' differential outcome expectancy learning but not conditioned stimulus-potentiated feeding. J. Neurosci. 25, 4626–4632. doi: 10.1523/JNEUROSCI.5301-04.2005
- McDonald R. J., Foong N., Hong N. S. (2004). Incidental information acquired by the amygdala during acquisition of a stimulus-response habit task. Exp. Brain Res. 159, 72–83. doi: 10.1007/s00221-004-1934-x
- McDonald R. J., Hong N. S. (2004). A dissociation of dorso-lateral striatum and amygdala function on the same stimulus-response habit task. Neuroscience 124, 507–513. doi: 10.1016/j.neuroscience.2003.11.041
- McDonald R. J., Jones J., Richards B., Hong N. S. (2006). A double dissociation of dorsal and ventral hippocampal function on a learning and memory task mediated by the dorso-lateral striatum. Eur. J. Neurosci. 24, 1789–1801. doi: 10.1111/j.1460-9568.2006.05064.x
- McDonald R. J., King A. L., Foong N., Rizos Z., Hong N. S. (2008a). Neurotoxic lesions of the medial prefrontal cortex or medial striatum impair multiple-location place learning in the water task: evidence for neural structures with complementary roles in behavioural flexibility. Exp. Brain Res. 187, 419–427. doi: 10.1007/s00221-008-1314-z
- McDonald R. J., King A. L., Hong N. S. (2008b). Neurotoxic damage to the dorsomedial striatum exaggerates the behavioral influence of a context-specific inhibitory association mediated by the ventral hippocampus. Behav. Neurosci. 122, 27–35. doi: 10.1037/0735-7044.122.1.27
- McDonald R. J., King A. L., Hong N. S. (2001). Context-specific interference on reversal learning of a stimulus-response habit. Behav. Brain Res. 121, 149–165. doi: 10.1016/S0166-4328(01)00160-7
- McDonald R. J., King A. L., Wasiak T. D., Zelinski E. L., Hong N. S. (2007). A complex associative structure formed in the mammalian brain during acquisition of a simple visual discrimination task: dorsolateral striatum, amygdala, and hippocampus. Hippocampus 17, 759–774. doi: 10.1002/hipo.20333
- McDonald R. J., Ko C. H., Hong N. S. (2002). Attenuation of context-specific inhibition on reversal learning of a stimulus-response task in rats with neurotoxic hippocampal damage. Behav. Brain Res. 136, 113–126. doi: 10.1016/S0166-4328(02)00104-3
- McDonald R. J., Murphy R. A., Guarraci F. A., Gortler J. R., White N. M., Baker A. G. (1997). Systematic comparison of the effects of hippocampal and fornix-fimbria lesions on acquisition of three configural discriminations. Hippocampus 7, 371–388
- McDonald R. J., White N. M. (1993). A triple dissociation of memory systems: hippocampus, amygdala, and dorsal striatum. Behav. Neurosci. 107, 3–22
- McDonald R. J., White N. M. (1994). Parallel information processing in the water maze: evidence for independent memory systems involving dorsal striatum and hippocampus. Behav. Neural Biol. 61, 260–270
- McDonald R. J., White N. M. (1995a). Hippocampal and nonhippocampal contributions to place learning in rats. Behav. Neurosci. 109, 579–593
- McDonald R. J., White N. M. (1995b). Information acquired by the hippocampus interferes with acquisition of the amygdala-based conditioned-cue preference in the rat. Hippocampus 5, 189–197. doi: 10.1002/hipo.450050305
- McEchron M. D., Bouwmeester H., Tseng W., Weiss C., Disterhoft J. F. (1998). Hippocampectomy disrupts auditory trace fear conditioning and contextual fear conditioning in the rat. Hippocampus 8, 638–646
- McGeorge A. J., Faull R. L. (1989). The organization of the projection from the cerebral cortex to the striatum in the rat. Neuroscience 29, 503–537
- McNaughton B. L., Barnes C. A., O'Keefe J. (1983). The contributions of position, direction, and velocity to single unit activity in the hippocampus of freely-moving rats. Exp. Brain Res. 52, 41–49
- Millhouse O. E. (1986). Pallidal neurons in the rat. J. Comp. Neurol. 254, 209–227. doi: 10.1002/cne.902540206
- Mink J. W. (1996). The basal ganglia: focused selection and inhibition of competing motor programs. Prog. Neurobiol. 50, 381–425. doi: 10.1016/S0301-0082(96)00042-1
- Mishkin M., Manning F. J. (1978). Non-spatial memory after selective prefrontal lesions in monkeys. Brain Res. 143, 313–323. doi: 10.1016/0006-8993(78)90571-1
- Mizumori S. J., Puryear C. B., Martig A. K. (2009). Basal ganglia contributions to adaptive navigation. Behav. Brain Res. 199, 32–42. doi: 10.1016/j.bbr.2008.11.014
- Mogenson G. J., Jones D. L., Yim C. Y. (1980). From motivation to action: functional interface between the limbic system and the motor system. Prog. Neurobiol. 14, 69–97. doi: 10.1016/0301-0082(80)90018-0
- Moita M. A., Rosis S., Zhou Y., LeDoux J. E., Blair H. T. (2003). Hippocampal place cells acquire location-specific responses to the conditioned stimulus during auditory fear conditioning. Neuron 37, 485–497. doi: 10.1016/S0896-6273(03)00033-3
- Montague P. R., Dayan P., Sejnowski T. J. (1996). A framework for mesencephalic dopamine systems based on predictive Hebbian learning. J. Neurosci. 16, 1936–1947
- Morris R. G., Garrud P., Rawlins J. N., O'Keefe J. (1982). Place navigation impaired in rats with hippocampal lesions. Nature 297, 681–683
- Morrison S. E., Salzman C. D. (2010). Re-valuing the amygdala. Curr. Opin. Neurobiol. 20, 221–230. doi: 10.1016/j.conb.2010.02.007
- Moser E., Moser M. B., Andersen P. (1993). Spatial learning impairment parallels the magnitude of dorsal hippocampal lesions, but is hardly present following ventral lesions. J. Neurosci. 13, 3916–3925
- Mulder A. B., Hodenpijl M. G., Lopes Da Silva F. H. (1998). Electrophysiology of the hippocampal and amygdaloid projections to the nucleus accumbens of the rat: convergence, segregation, and interaction of inputs. J. Neurosci. 18, 5095–5102
- Muller R. (1996). A quarter of a century of place cells. Neuron 17, 813–822. doi: 10.1016/S0896-6273(00)80214-7
- Muller R. U., Kubie J. L. (1987). The effects of changes in the environment on the spatial firing of hippocampal complex-spike cells. J. Neurosci. 7, 1951–1968
- Muller R. U., Kubie J. L., Ranck J. B., Jr. (1987). Spatial firing patterns of hippocampal complex-spike cells in a fixed environment. J. Neurosci. 7, 1935–1950
- Nicola S. M. (2010). The flexible approach hypothesis: unification of effort and cue-responding hypotheses for the role of nucleus accumbens dopamine in the activation of reward-seeking behavior. J. Neurosci. 30, 16585–16600. doi: 10.1523/JNEUROSCI.3958-10.2010
- Nicola S. M., Yun I. A., Wakabayashi K. T., Fields H. L. (2004). Cue-evoked firing of nucleus accumbens neurons encodes motivational significance during a discriminative stimulus task. J. Neurophysiol. 91, 1840–1865. doi: 10.1152/jn.00657.2003
- Niv Y., Daw N. D., Joel D., Dayan P. (2007). Tonic dopamine: opportunity costs and the control of response vigor. Psychopharmacology (Berl.) 191, 507–520. doi: 10.1007/s00213-006-0502-4
- O'Donnell P., Grace A. A. (1995). Synaptic interactions among excitatory afferents to nucleus accumbens neurons: hippocampal gating of prefrontal cortical input. J. Neurosci. 15, 3622–3639
- O'Keefe J., Dostrovsky J. (1971). The hippocampus as a spatial map. Preliminary evidence from unit activity in the freely-moving rat. Brain Res. 34, 171–175. doi: 10.1016/0006-8993(71)90358-1
- O'Keefe J., Nadel L. (1978). The Hippocampus as a Cognitive Map. Oxford: Oxford University Press
- O'Keefe J., Nadel L., Keightley S., Kill D. (1975). Fornix lesions selectively abolish place learning in the rat. Exp. Neurol. 48, 152–166. doi: 10.1016/0014-4886(75)90230-7
- O'Keefe J., Speakman A. (1987). Single unit activity in the rat hippocampus during a spatial memory task. Exp. Brain Res. 68, 1–27
- Olton D. S., Walker J. A., Gage F. H. (1978). Hippocampal connections and spatial discrimination. Brain Res. 139, 295–308. doi: 10.1016/0006-8993(78)90930-7
- Oorschot D. E. (1996). Total number of neurons in the neostriatal, pallidal, subthalamic, and substantia nigral nuclei of the rat basal ganglia: a stereological study using the Cavalieri and optical disector methods. J. Comp. Neurol. 366, 580–599
- O'Reilly R. C., Frank M. J. (2006). Making working memory work: a computational model of learning in the prefrontal cortex and basal ganglia. Neural Comput. 18, 283–328. doi: 10.1162/089976606775093909
- O'Reilly R. C., Rudy J. W. (2001). Conjunctive representations in learning and memory: principles of cortical and hippocampal function. Psychol. Rev. 108, 311–345. doi: 10.1037/0033-295X.108.2.311
- Packard M. G. (1999). Glutamate infused posttraining into the hippocampus or caudate-putamen differentially strengthens place and response learning. Proc. Natl. Acad. Sci. U.S.A. 96, 12881–12886. doi: 10.1073/pnas.96.22.12881
- Packard M. G., Hirsh R., White N. M. (1989). Differential effects of fornix and caudate nucleus lesions on two radial maze tasks: evidence for multiple memory systems. J. Neurosci. 9, 1465–1472
- Packard M. G., McGaugh J. L. (1996). Inactivation of hippocampus or caudate nucleus with lidocaine differentially affects expression of place and response learning. Neurobiol. Learn. Mem. 65, 65–72. doi: 10.1006/nlme.1996.0007
- Pan W. X., Schmidt R., Wickens J. R., Hyland B. I. (2005). Dopamine cells respond to predicted events during classical conditioning: evidence for eligibility traces in the reward-learning network. J. Neurosci. 25, 6235–6242. doi: 10.1523/JNEUROSCI.1478-05.2005
- Parent A., Sato F., Wu Y., Gauthier J., Levesque M., Parent M. (2000). Organization of the basal ganglia: the importance of axonal collateralization. Trends Neurosci. 23, S20–S27
- Parkinson J. A., Dalley J. W., Cardinal R. N., Bamford A., Fehnert B., Lachenal G., Rudarakanchana N., Halkerston K. M., Robbins T. W., Everitt B. J. (2002). Nucleus accumbens dopamine depletion impairs both acquisition and performance of appetitive Pavlovian approach behaviour: implications for mesoaccumbens dopamine function. Behav. Brain Res. 137, 149–163. doi: 10.1016/S0166-4328(02)00291-7
- Parkinson J. A., Olmstead M. C., Burns L. H., Robbins T. W., Everitt B. J. (1999). Dissociation in effects of lesions of the nucleus accumbens core and shell on appetitive Pavlovian approach behavior and the potentiation of conditioned reinforcement and locomotor activity by D-amphetamine. J. Neurosci. 19, 2401–2411
- Parkinson J. A., Robbins T. W., Everitt B. J. (2000a). Dissociable roles of the central and basolateral amygdala in appetitive emotional learning. Eur. J. Neurosci. 12, 405–413. doi: 10.1046/j.1460-9568.2000.00960.x
- Parkinson J. A., Willoughby P. J., Robbins T. W., Everitt B. J. (2000b). Disconnection of the anterior cingulate cortex and nucleus accumbens core impairs Pavlovian approach behavior: further evidence for limbic cortical-ventral striatopallidal systems. Behav. Neurosci. 114, 42–63
- Parmentier R., Ohtsu H., Djebbara-Hannas Z., Valatx J. L., Watanabe T., Lin J. S. (2002). Anatomical, physiological, and pharmacological characteristics of histidine decarboxylase knock-out mice: evidence for the role of brain histamine in behavioral and sleep-wake control. J. Neurosci. 22, 7695–7711
- Parush N., Tishby N., Bergman H. (2011). Dopaminergic balance between reward maximization and policy complexity. Front. Syst. Neurosci. 5:22. doi: 10.3389/fnsys.2011.00022
- Pavlov I. (1927). Conditioned Reflexes. Oxford: Oxford University Press
- Pawlak V., Kerr J. N. (2008). Dopamine receptor activation is required for corticostriatal spike-timing-dependent plasticity. J. Neurosci. 28, 2435–2446. doi: 10.1523/JNEUROSCI.4402-07.2008
- Pennartz C. M., Ito R., Verschure P. F., Battaglia F. P., Robbins T. W. (2011). The hippocampal-striatal axis in learning, prediction and goal-directed behavior. Trends Neurosci. 34, 548–559. doi: 10.1016/j.tins.2011.08.001
- Pennartz C. M., Kitai S. T. (1991). Hippocampal inputs to identified neurons in an in vitro slice preparation of the rat nucleus accumbens: evidence for feed-forward inhibition. J. Neurosci. 11, 2838–2847
- Peters J., Kalivas P. W., Quirk G. J. (2009). Extinction circuits for fear and addiction overlap in prefrontal cortex. Learn. Mem. 16, 279–288. doi: 10.1101/lm.1041309
- Peters J., Lalumiere R. T., Kalivas P. W. (2008). Infralimbic prefrontal cortex is responsible for inhibiting cocaine seeking in extinguished rats. J. Neurosci. 28, 6046–6053. doi: 10.1523/JNEUROSCI.1045-08.2008
- Petrovich G. D., Canteras N. S., Swanson L. W. (2001). Combinatorial amygdalar inputs to hippocampal domains and hypothalamic behavior systems. Brain Res. Brain Res. Rev. 38, 247–289. doi: 10.1016/S0165-0173(01)00080-7
- Peyrache A., Khamassi M., Benchenane K., Wiener S. I., Battaglia F. P. (2009). Replay of rule-learning related neural patterns in the prefrontal cortex during sleep. Nat. Neurosci. 12, 919–926. doi: 10.1038/nn.2337
- Plenz D. (2003). When inhibition goes incognito: feedback interaction between spiny projection neurons in striatal function. Trends Neurosci. 26, 436–443. doi: 10.1016/S0166-2236(03)00196-6
- Ponzi A., Wickens J. (2010). Sequentially switching cell assemblies in random inhibitory networks of spiking neurons in the striatum. J. Neurosci. 30, 5894–5911. doi: 10.1523/JNEUROSCI.5540-09.2010
- Price J. L., Drevets W. C. (2012). Neural circuits underlying the pathophysiology of mood disorders. Trends Cogn. Sci. 16, 61–71. doi: 10.1016/j.tics.2011.12.011
- Ragozzino M. E. (2007). The contribution of the medial prefrontal cortex, orbitofrontal cortex, and dorsomedial striatum to behavioral flexibility. Ann. N.Y. Acad. Sci. 1121, 355–375. doi: 10.1196/annals.1401.013
- Ragozzino M. E., Detrick S., Kesner R. P. (1999a). Involvement of the prelimbic-infralimbic areas of the rodent prefrontal cortex in behavioral flexibility for place and response learning. J. Neurosci. 19, 4585–4594
- Ragozzino M. E., Wilcox C., Raso M., Kesner R. P. (1999b). Involvement of rodent prefrontal cortex subregions in strategy switching. Behav. Neurosci. 113, 32–41
- Ragozzino M. E., Ragozzino K. E., Mizumori S. J., Kesner R. P. (2002). Role of the dorsomedial striatum in behavioral flexibility for response and visual cue discrimination learning. Behav. Neurosci. 116, 105–115
- Rasmussen M., Barnes C. A., McNaughton B. L. (1989). A systematic test of cognitive mapping, working memory, and temporal discontiguity theories of hippocampal function. Psychobiology 17, 335–348
- Reading P. J., Dunnett S. B., Robbins T. W. (1991). Dissociable roles of the ventral, medial and lateral striatum on the acquisition and performance of a complex visual stimulus-response habit. Behav. Brain Res. 45, 147–161
- Rebec G. V., Curtis S. D. (1988). Reciprocal zones of excitation and inhibition in the neostriatum. Synapse 2, 633–635. doi: 10.1002/syn.890020609
- Reilly S., Harley C., Revusky S. (1993). Ibotenate lesions of the hippocampus enhance latent inhibition in conditioned taste aversion and increase resistance to extinction in conditioned taste preference. Behav. Neurosci. 107, 996–1004
- Reiman E. M., Lane R. D., Ahern G. L., Schwartz G. E., Davidson R. J., Friston K. J., Yun L. S., Chen K. (1997). Neuroanatomical correlates of externally and internally generated human emotion. Am. J. Psychiatry 154, 918–925
- Rescorla R. A. (2004). Spontaneous recovery varies inversely with the training-extinction interval. Learn. Behav. 32, 401–408
- Reynolds J. N., Hyland B. I., Wickens J. R. (2001). A cellular mechanism of reward-related learning. Nature 413, 67–70. doi: 10.1038/35092560
- Risold P. Y., Swanson L. W. (1996). Structural evidence for functional domains in the rat hippocampus. Science 272, 1484–1486. doi: 10.1126/science.272.5267.1484
- Roesch M. R., Esber G. R., Li J., Daw N. D., Schoenbaum G. (2012). Surprise! Neural correlates of Pearce-Hall and Rescorla-Wagner coexist within the brain. Eur. J. Neurosci. 35, 1190–1200. doi: 10.1111/j.1460-9568.2011.07986.x
- Rudy J. W., Huff N. C., Matus-Amat P. (2004). Understanding contextual fear conditioning: insights from a two-process model. Neurosci. Biobehav. Rev. 28, 675–685. doi: 10.1016/j.neubiorev.2004.09.004
- Rudy J. W., Sutherland R. J. (1989). The hippocampal formation is necessary for rats to learn and remember configural discriminations. Behav. Brain Res. 34, 97–109
- Sadek A. R., Magill P. J., Bolam J. P. (2007). A single-cell analysis of intrinsic connectivity in the rat globus pallidus. J. Neurosci. 27, 6352–6362. doi: 10.1523/JNEUROSCI.0953-07.2007
- Salamone J. D., Correa M., Farrar A., Mingote S. M. (2007). Effort-related functions of nucleus accumbens dopamine and associated forebrain circuits. Psychopharmacology (Berl.) 191, 461–482. doi: 10.1007/s00213-006-0668-9
- Salamone J. D., Correa M., Farrar A. M., Nunes E. J., Pardo M. (2009). Dopamine, behavioral economics, and effort. Front. Behav. Neurosci. 3:13. doi: 10.3389/neuro.08.013.2009
- Samejima K., Ueda Y., Doya K., Kimura M. (2005). Representation of action-specific reward values in the striatum. Science 310, 1337–1340. doi: 10.1126/science.1115270
- Saper C. B. (2003). The hypothalamus, in The Human Nervous System, ed Paxinos G. (San Diego, CA: Academic Press), 513–550
- Satvat E., Schmidt B., Argraves M., Marrone D. F., Markus E. J. (2011). Changes in task demands alter the pattern of zif268 expression in the dentate gyrus. J. Neurosci. 31, 7163–7167. doi: 10.1523/JNEUROSCI.0094-11.2011
- Schenk F., Morris R. G. (1985). Dissociation between components of spatial memory in rats after recovery from the effects of retrohippocampal lesions. Exp. Brain Res. 58, 11–28
- Schoenbaum G., Chiba A. A., Gallagher M. (1998). Orbitofrontal cortex and basolateral amygdala encode expected outcomes during learning. Nat. Neurosci. 1, 155–159. doi: 10.1038/407
- Schoenbaum G., Nugent S. L., Saddoris M. P., Setlow B. (2002). Orbitofrontal lesions in rats impair reversal but not acquisition of go, no-go odor discriminations. Neuroreport 13, 885–890
- Schultz W. (1998). Predictive reward signal of dopamine neurons. J. Neurophysiol. 80, 1–27
- Schultz W., Apicella P., Ljungberg T. (1993). Responses of monkey dopamine neurons to reward and conditioned stimuli during successive steps of learning a delayed response task. J. Neurosci. 13, 900–913
- Schwartzbaum J. S. (1960). Changes in reinforcing properties of stimuli following ablation of the amygdaloid complex in monkeys. J. Comp. Physiol. Psychol. 53, 388–395
- Schwindel C. D., McNaughton B. L. (2011). Hippocampal-cortical interactions and the dynamics of memory trace reactivation. Prog. Brain Res. 193, 163–177. doi: 10.1016/B978-0-444-53839-0.00011-9
- Selden N. R., Everitt B. J., Jarrard L. E., Robbins T. W. (1991). Complementary roles for the amygdala and hippocampus in aversive conditioning to explicit and contextual cues. Neuroscience 42, 335–350. doi: 10.1016/0306-4522(91)90379-3
- Shapiro M. L., Kennedy P. J., Ferbinteanu J. (2006). Representing episodes in the mammalian brain. Curr. Opin. Neurobiol. 16, 701–709. doi: 10.1016/j.conb.2006.08.017
- Shen W., Flajolet M., Greengard P., Surmeier D. J. (2008). Dichotomous dopaminergic control of striatal synaptic plasticity. Science 321, 848–851. doi: 10.1126/science.1160575
- Sidman M., Fletcher F. G. (1968). A demonstration of auto-shaping with monkeys. J. Exp. Anal. Behav. 11, 307–309. doi: 10.1901/jeab.1968.11-307
- Simon H., Le Moal M., Calas A. (1979). Efferents and afferents of the ventral tegmental-A10 region studied after local injection of [3H]leucine and horseradish peroxidase. Brain Res. 178, 17–40. doi: 10.1016/0006-8993(79)90085-4
- Sims R. E., Woodhall G. L., Wilson C. L., Stanford I. M. (2008). Functional characterization of GABAergic pallidopallidal and striatopallidal synapses in the rat globus pallidus in vitro. Eur. J. Neurosci. 28, 2401–2408. doi: 10.1111/j.1460-9568.2008.06546.x
- Skaggs W. E., McNaughton B. L. (1996). Replay of neuronal firing sequences in rat hippocampus during sleep following spatial experience. Science 271, 1870–1873. doi: 10.1126/science.271.5257.1870
- Smith Y., Bevan M. D., Shink E., Bolam J. P. (1998). Microcircuitry of the direct and indirect pathways of the basal ganglia. Neuroscience 86, 353–387. doi: 10.1016/S0306-4522(98)00004-9
- Somogyi P., Bolam J. P., Smith A. D. (1981). Monosynaptic cortical input and local axon collaterals of identified striatonigral neurons. A light and electron microscopic study using the Golgi-peroxidase transport-degeneration procedure. J. Comp. Neurol. 195, 567–584. doi: 10.1002/cne.901950403
- Sparks F. T., Lehmann H., Sutherland R. J. (2011). Between-systems memory interference during retrieval. Eur. J. Neurosci. 34, 780–786. doi: 10.1111/j.1460-9568.2011.07796.x
- Spiegler B. J., Mishkin M. (1981). Evidence for the sequential participation of inferior temporal cortex and amygdala in the acquisition of stimulus-reward associations. Behav. Brain Res. 3, 303–317
- Stote D. L., Fanselow M. S. (2004). NMDA receptor modulation of incidental learning in Pavlovian context conditioning. Behav. Neurosci. 118, 253–257. doi: 10.1037/0735-7044.118.1.253
- Stuber G. D., Sparta D. R., Stamatakis A. M., Van Leeuwen W. A., Hardjoprajitno J. E., Cho S., Tye K. M., Kempadoo K. A., Zhang F., Deisseroth K., Bonci A. (2011). Excitatory transmission from the amygdala to nucleus accumbens facilitates reward seeking. Nature 475, 377–380. doi: 10.1038/nature10194
- Suri R. E., Schultz W. (1998). Learning of sequential movements by neural network model with dopamine-like reinforcement signal. Exp. Brain Res. 121, 350–354. doi: 10.1007/s002210050467
- Surmeier D. J., Day M., Gertler T., Chan S., Shen W. (2010). D1 and D2 dopamine receptor modulation of glutamatergic signaling in striatal medium spiny neurons, in Handbook of Basal Ganglia Structure and Function, eds Steiner H., Tseng K. Y. (London: Academic Press), 113–129
- Surmeier D. J., Song W.-J., Yan Z. (1996). Coordinated expression of dopamine receptors in neostriatal medium spiny neurons. J. Neurosci. 16, 6579–6591
- Sutcliffe J. G., De Lecea L. (2002). The hypocretins: setting the arousal threshold. Nat. Rev. Neurosci. 3, 339–349. doi: 10.1038/nrn808
- Sutherland R. J., Kolb B., Whishaw I. Q. (1982). Spatial mapping: definitive disruption by hippocampal or medial frontal cortical damage in the rat. Neurosci. Lett. 31, 271–276
- Sutherland R. J., McDonald R. J. (1990). Hippocampus, amygdala, and memory deficits in rats. Behav. Brain Res. 37, 57–79. doi: 10.1016/0166-4328(90)90072-M
- Sutherland R. J., McDonald R. J., Hill C. R., Rudy J. W. (1989). Damage to the hippocampal formation in rats selectively impairs the ability to learn cue relationships. Behav. Neural Biol. 52, 331–356 [DOI] [PubMed] [Google Scholar]
- Sutherland R. J., Whishaw I. Q., Kolb B. (1988). Contributions of cingulate cortex to two forms of spatial learning and memory. J. Neurosci. 8, 1863–1872 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sutton R. S., Barto A. G. (1998). Reinforcement Learning: An Introduction. Cambridge, MA: MIT Press.
- Swanson L. W., Kohler C. (1986). Anatomical evidence for direct projections from the entorhinal area to the entire cortical mantle in the rat. J. Neurosci. 6, 3010–3023.
- Takahashi K., Lin J. S., Sakai K. (2006). Neuronal activity of histaminergic tuberomammillary neurons during wake-sleep states in the mouse. J. Neurosci. 26, 10292–10298. doi: 10.1523/JNEUROSCI.2341-06.2006
- Tepper J. M. (2010). GABAergic interneurons of the striatum, in Handbook of Basal Ganglia Structure and Function, eds Steiner H., Tseng K. Y. (London: Academic Press), 151–166.
- Thorn C. A., Atallah H., Howe M., Graybiel A. M. (2010). Differential dynamics of activity changes in dorsolateral and dorsomedial striatal loops during learning. Neuron 66, 781–795. doi: 10.1016/j.neuron.2010.04.036
- Thorndike E. L. (1911). Animal Intelligence. New York, NY: Macmillan Co.
- Tolman E. C. (1932). Purposive Behavior in Animals and Man. New York, NY: Appleton.
- Tolman E. C., Ritchie B. F., Kalish D. (1946). Studies in spatial learning; place learning versus response learning. J. Exp. Psychol. 36, 221–229.
- Tulving E., Markowitsch H. J. (1998). Episodic and declarative memory: role of the hippocampus. Hippocampus 8, 198–204.
- Tunstall M. J., Oorschot D. E., Kean A., Wickens J. R. (2002). Inhibitory interactions between spiny projection neurons in the rat striatum. J. Neurophysiol. 88, 1263–1269.
- Van Den Bos W., Li J., Lau T., Maskin E., Cohen J. D., Montague P. R., McClure S. M. (2008). The value of victory: social origins of the winner's curse in common value auctions. Judgm. Decis. Mak. 3, 483–492.
- Van Der Meer M. A., Redish A. D. (2009). Covert expectation-of-reward in rat ventral striatum at decision points. Front. Integr. Neurosci. 3:1. doi: 10.3389/neuro.07.001.2009
- Van Der Meer M. A., Redish A. D. (2011). Theta phase precession in rat ventral striatum links place and reward information. J. Neurosci. 31, 2843–2854. doi: 10.1523/JNEUROSCI.4869-10.2011
- Veening J. G. (1978). Subcortical afferents of the amygdaloid complex in the rat: an HRP study. Neurosci. Lett. 8, 197–202. doi: 10.1016/0304-3940(78)90121-0
- Viviani D., Charlet A., Van Den Burg E., Robinet C., Hurni N., Abatis M., Magara F., Stoop R. (2011). Oxytocin selectively gates fear responses through distinct outputs from the central amygdala. Science 333, 104–107. doi: 10.1126/science.1201043
- Voorn P., Vanderschuren L. J., Groenewegen H. J., Robbins T. W., Pennartz C. M. (2004). Putting a spin on the dorsal-ventral divide of the striatum. Trends Neurosci. 27, 468–474. doi: 10.1016/j.tins.2004.06.006
- Wan F. J., Caine S. B., Swerdlow N. R. (1996). The ventral subiculum modulation of prepulse inhibition is not mediated via dopamine D2 or nucleus accumbens non-NMDA glutamate receptor activity. Eur. J. Pharmacol. 314, 9–18.
- Wan F. J., Swerdlow N. R. (1996). Sensorimotor gating in rats is regulated by different dopamine-glutamate interactions in the nucleus accumbens core and shell subregions. Brain Res. 722, 168–176. doi: 10.1016/0006-8993(96)00209-0
- Wan F. J., Swerdlow N. R. (1997). The basolateral amygdala regulates sensorimotor gating of acoustic startle in the rat. Neuroscience 76, 715–724. doi: 10.1016/S0306-4522(96)00218-7
- Wan X., Peoples L. L. (2006). Firing patterns of accumbal neurons during a Pavlovian-conditioned approach task. J. Neurophysiol. 96, 652–660. doi: 10.1152/jn.00068.2006
- Weiner I., Gal G., Rawlins J. N., Feldon J. (1996). Differential involvement of the shell and core subterritories of the nucleus accumbens in latent inhibition and amphetamine-induced activity. Behav. Brain Res. 81, 123–133.
- Weiner I., Tarrasch R., Feldon J. (1995). Basolateral amygdala lesions do not disrupt latent inhibition. Behav. Brain Res. 72, 73–81.
- Weiskrantz L. (1956). Behavioral changes associated with ablation of the amygdaloid complex in monkeys. J. Comp. Physiol. Psychol. 49, 381–391.
- Whalen P. J., Rauch S. L., Etcoff N. L., McInerney S. C., Lee M. B., Jenike M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J. Neurosci. 18, 411–418.
- Whishaw I. Q., Piecharka D. M., Zeeb F., Stein D. G. (2004). Unilateral frontal lobe contusion and forelimb function: chronic quantitative and qualitative impairments in reflexive and skilled forelimb movements in rats. J. Neurotrauma 21, 1584–1600. doi: 10.1089/neu.2004.21.1584
- White I. M., Rebec G. V. (1993). Responses of rat striatal neurons during performance of a lever-release version of the conditioned avoidance response task. Brain Res. 616, 71–82. doi: 10.1016/0006-8993(93)90194-R
- White N. M., McDonald R. J. (2002). Multiple parallel memory systems in the brain of the rat. Neurobiol. Learn. Mem. 77, 125–184. doi: 10.1006/nlme.2001.4008
- White N. M., Packard M. G., Hiroi N. (1991). Place conditioning with dopamine D1 and D2 agonists injected peripherally or into nucleus accumbens. Psychopharmacology (Berl.) 103, 271–276.
- Wickens J. R., Kotter R., Alexander M. E. (1995). Effects of local connectivity on striatal function: simulation and analysis of a model. Synapse 20, 281–298. doi: 10.1002/syn.890200402
- Wiener S. I. (1993). Spatial and behavioral correlates of striatal neurons in rats performing a self-initiated navigation task. J. Neurosci. 13, 3802–3817.
- Wilcove W. G., Miller J. C. (1974). CS-UCS presentations and a lever: human autoshaping. J. Exp. Psychol. 103, 868–877.
- Wiltgen B. J., Sanders M. J., Anagnostaras S. G., Sage J. R., Fanselow M. S. (2006). Context fear learning in the absence of the hippocampus. J. Neurosci. 26, 5484–5491. doi: 10.1523/JNEUROSCI.2685-05.2006
- Winstanley C. A., Eagle D. M., Robbins T. W. (2006). Behavioral models of impulsivity in relation to ADHD: translation between clinical and preclinical studies. Clin. Psychol. Rev. 26, 379–395. doi: 10.1016/j.cpr.2006.01.001
- Wise S. P., Murray E. A. (2000). Arbitrary associations between antecedents and actions. Trends Neurosci. 23, 271–276. doi: 10.1016/S0166-2236(00)01570-8
- Witten I. B., Steinberg E. E., Lee S. Y., Davidson T. J., Zalocusky K. A., Brodsky M., Yizhar O., Cho S. L., Gong S., Ramakrishnan C., Stuber G. D., Tye K. M., Janak P. H., Deisseroth K. (2011). Recombinase-driver rat lines: tools, techniques, and optogenetic application to dopamine-mediated reinforcement. Neuron 72, 721–733. doi: 10.1016/j.neuron.2011.10.028
- Wood E. R., Dudchenko P. A., Robitsek R. J., Eichenbaum H. (2000). Hippocampal neurons encode information about different types of memory episodes occurring in the same location. Neuron 27, 623–633. doi: 10.1016/S0896-6273(00)00071-4
- Wright C. I., Groenewegen H. J. (1995). Patterns of convergence and segregation in the medial nucleus accumbens of the rat: relationships of prefrontal cortical, midline thalamic, and basal amygdaloid afferents. J. Comp. Neurol. 361, 383–403. doi: 10.1002/cne.903610304
- Wyvell C. L., Berridge K. C. (2000). Intra-accumbens amphetamine increases the conditioned incentive salience of sucrose reward: enhancement of reward “wanting” without enhanced “liking” or response reinforcement. J. Neurosci. 20, 8122–8130.
- Yin H. H., Knowlton B. J. (2006). The role of the basal ganglia in habit formation. Nat. Rev. Neurosci. 7, 464–476. doi: 10.1038/nrn1919
- Yin H. H., Knowlton B. J., Balleine B. W. (2004). Lesions of dorsolateral striatum preserve outcome expectancy but disrupt habit formation in instrumental learning. Eur. J. Neurosci. 19, 181–189. doi: 10.1111/j.1460-9568.2004.03095.x
- Yin H. H., Knowlton B. J., Balleine B. W. (2006). Inactivation of dorsolateral striatum enhances sensitivity to changes in the action-outcome contingency in instrumental conditioning. Behav. Brain Res. 166, 189–196. doi: 10.1016/j.bbr.2005.07.012
- Yin H. H., Ostlund S. B., Knowlton B. J., Balleine B. W. (2005). The role of the dorsomedial striatum in instrumental conditioning. Eur. J. Neurosci. 22, 513–523. doi: 10.1111/j.1460-9568.2005.04218.x
- Young A. M. (2004). Increased extracellular dopamine in nucleus accumbens in response to unconditioned and conditioned aversive stimuli: studies using 1 min microdialysis in rats. J. Neurosci. Methods 138, 57–63. doi: 10.1016/j.jneumeth.2004.03.003
- Yun I. A., Wakabayashi K. T., Fields H. L., Nicola S. M. (2004). The ventral tegmental area is required for the behavioral and nucleus accumbens neuronal firing responses to incentive cues. J. Neurosci. 24, 2923–2933. doi: 10.1523/JNEUROSCI.5282-03.2004
- Zhang W. N., Bast T., Feldon J. (2001). The ventral hippocampus and fear conditioning in rats: different anterograde amnesias of fear after infusion of N-methyl-D-aspartate or its noncompetitive antagonist MK-801 into the ventral hippocampus. Behav. Brain Res. 126, 159–174.
- Zheng T., Wilson C. J. (2002). Corticostriatal combinatorics: the implications of corticostriatal axonal arborizations. J. Neurophysiol. 87, 1007–1017.
- Zilli E. A., Hasselmo M. E. (2008). Modeling the role of working memory and episodic memory in behavioral tasks. Hippocampus 18, 193–209. doi: 10.1002/hipo.20382
- Zorawski M., Killcross S. (2003). Glucocorticoid receptor agonist enhances Pavlovian appetitive conditioning but disrupts outcome-specific associations. Behav. Neurosci. 117, 1453–1457. doi: 10.1037/0735-7044.117.6.1453