Published in final edited form as: Neurobiol Learn Mem. 2013 Aug 30;0:52–64. doi: 10.1016/j.nlm.2013.08.012

Behavioral and Neurobiological Mechanisms of Extinction in Pavlovian and Instrumental Learning

Travis P Todd 1, Drina Vurbic 1, Mark E Bouton 1
PMCID: PMC3946264  NIHMSID: NIHMS521823  PMID: 23999219

Abstract

This article reviews research on the behavioral and neural mechanisms of extinction as it is represented in both Pavlovian and instrumental learning. In Pavlovian extinction, repeated presentation of a signal without its reinforcer weakens behavior evoked by the signal; in instrumental extinction, repeated occurrence of a voluntary action without its reinforcer weakens the strength of the action. In either case, contemporary research at both the behavioral and neural levels of analysis has been guided by a set of extinction principles that were first generated by research conducted at the behavioral level. The review discusses these principles and illustrates how they have informed the study of both Pavlovian and instrumental extinction. It shows that behavioral and neurobiological research efforts have been tightly linked and that their results are readily integrated. Pavlovian and instrumental extinction are also controlled by compatible behavioral and neural processes. Since many behavioral effects observed in extinction can be multiply determined, we suggest that the current close connection between behavioral-level and neural-level analyses will need to continue.

Keywords: Extinction, Pavlovian learning, instrumental learning, d-cycloserine, prefrontal cortex


Behavioral research over the last few decades has made significant progress in uncovering the mechanisms that underlie extinction, the behavioral phenomenon in which learned behavior decreases in strength or frequency when the event that reinforced it is removed (e.g., Delamater & Westbrook, this issue). Extinction is important, in part because it is one of the most basic of all behavioral change effects, and in part because it is thought to be involved in many clinical treatments that are designed to get rid of unwanted learned behaviors, thoughts, and emotions (e.g., Craske, Kircanski, Zelikowsky, Mystkowski, Chowdhury, & Baker, 2008; Craske, Liao, Brown, & Vervliet, 2012). Since the 1970s, extinction has been extensively studied in Pavlovian conditioning, where responding to a conditioned stimulus (CS) that has been associated with an unconditioned stimulus (US) decreases when the CS is then presented repeatedly alone. The results of this literature, along with the larger behavioral research literature of which it is a part, support a set of principles of extinction that are summarized in Table 1. These principles have gone on to shape further research on extinction at both the behavioral and neurobiological levels of analysis. The purpose of the present article is to discuss these principles and explore how they have facilitated progress in behavioral and neurobiological research on Pavlovian extinction and a “newer” frontier concerning the extinction of instrumental learning, where the focus is on voluntary behaviors that are controlled by their consequences (reinforcers). We also suggest that continued success at elucidating the neurobiological mechanisms of Pavlovian and instrumental extinction will require continued research at the behavioral level.

Table 1.

Summary principles of extinction

  • Extinction is not the same as erasure

    • Responding can return or “relapse” through spontaneous recovery, renewal, reinstatement, rapid reacquisition, resurgence

  • The context plays a fundamental role in extinction

    • “context” can be provided by exteroceptive background cues as well as interoceptive cues such as drug state, hormonal state, mood state, deprivation state, recent events, expectation of events, and time

    • Extinction is at least partly a context-specific form of inhibitory learning

  • Performance declines in extinction because of (1.) generalization decrement and (2.) the correction of prediction error

  • Extinction is a retroactive interference paradigm that shares many features with other “interference paradigms” involving retroactive and/or proactive interference

Summary extinction principles

The first extinction principle supported by behavioral research is perhaps the most widely recognized today: Although behavior goes away in extinction, Extinction is not the same as erasure. Learning theorists have long recognized that there is a difference between behavior on the one hand and the organism’s knowledge on the other. Extinction is a good example of the so-called “learning-performance distinction:” Although performance is at a zero level at the end of extinction, the original learning is still retained in long-term memory and the brain. Pavlov’s early demonstrations (1927) of spontaneous recovery were the first to support this idea: If time is allowed to elapse after extinction, responding can return. The list of related recovery effects was expanded in the 1970s and 1980s (e.g., see Bouton, 1988, for an early review). In renewal, extinguished responding returns when the CS is removed from the extinction context and tested in another context (e.g., Bouton & Bolles, 1979a; Bouton & King, 1983; Bouton & Ricker, 1994; Laborda, Witnauer, & Miller, 2011). In reinstatement, behavior recovers if the unconditioned stimulus (US) is presented again after extinction (e.g., Rescorla & Heth, 1975). And in rapid reacquisition, responding can return to the CS quickly if CS-US pairings are resumed after extinction (e.g., Napier, Macrae, & Kehoe, 1992). Since the 1980s and early 1990s, these recovery phenomena have been seen as potential models of relapse after extinction (e.g., Bouton, 1988, 2002; Bouton & Swartzentruber, 1991; Laborda, McConnell, & Miller, 2011). Because they do not occur unless the CS was associated with the US in the original conditioning phase, they each indicate that at least part of the knowledge that was acquired during conditioning must survive extinction. Some comparisons of extinguished and nonextinguished CSs have even suggested that extinction can leave the strength of the original CS-US association more or less “fully preserved” (Rescorla, 1996; see also Delamater, 1996).

A second group of extinction principles follows directly from the evidence supporting the first. At the same time the various relapse phenomena indicate that extinction is not erasure, they also demonstrate that The context plays a fundamental role in extinction. Renewal, which indicates that extinction performance can be lost when the context is changed, is the most direct demonstration of this principle. However, all of the other relapse effects are also arguably context effects (e.g., Bouton, 1988, 1993, 2004). For example, reinstatement occurs at least in part because the reinstating US presentations condition the context, and this contextual conditioning is the trigger that causes responding to return to the CS. The supporting evidence includes the fact that reinstatement is typically not observed unless the US is presented in the context where the CS will be tested (e.g., Bouton, 1984; Bouton & Bolles, 1979b; Bouton & Peck, 1989), and the strength of reinstatement correlates with measures of contextual conditioning (Bouton, 1984; Bouton & King, 1983). The other relapse effects are also context effects, because “context” is provided by many different kinds of stimuli. In typical experiments on renewal, the contexts are provided by the conditioning chambers in which conditioning and/or extinction are conducted. However, interoceptive cues can play the role of context, as they do in state-dependent extinction, where extinction performance is shown to be specific to the context created by a drug administered during extinction (e.g., Bouton, Kenney, & Rosengard, 1990; Cunningham, 1979; Hart, Harris, & Westbrook, 2009; Lattal, 2007). Furthermore, recent USs and recent CS-US pairings are part of the context of conditioning and can cause extinguished responding to return—as they can in reinstatement and rapid reacquisition (e.g., Baker, Steinwald, & Bouton, 1991; Bouton, Woods, & Pineño, 2004). Spontaneous recovery can also be conceptualized as the renewal effect that occurs when the extinguished CS is tested in a new temporal context (e.g., Bouton, 1988; Brooks & Bouton, 1993). Temporal cues can clearly disambiguate the current meaning of the CS; rats can use the temporal context provided by the current inter-trial interval to signal whether or not the next CS will be reinforced (e.g., Bouton & Hendrix, 2011).

Perhaps a deeper point about the context’s role in extinction is that extinction depends at least partly on a context-specific form of inhibitory learning. This idea is most clearly consistent with the so-called ABC and AAB forms of the renewal effect. In these situations, extinguished responding returns when conditioning, extinction, and testing (respectively) occur in Contexts A, B, and C or in A, A, and B. The common feature of these forms of renewal is that the return of the response does not depend on return to the original context of conditioning (as in the so-called ABA renewal effect); mere removal from the extinction context is sufficient. This fact suggests that the response is actively inhibited in the context of extinction. In principle, such inhibition can take any of several forms. The simplest form is the one suggested by many models of associative learning, including the Rescorla-Wagner model (1972): Extinction with the CS in a neutral Context B, for example, could make the associative strength of the context become negative (see also Pearce & Hall, 1980; Wagner, 1981). The context would essentially predict “no US.” Although such direct inhibition could occur in extinction, evidence of it is rare and usually lacking (e.g., see Bouton, 1993, for one review). Behavioral research on Pavlovian extinction has instead supported the form of inhibition known as “negative occasion setting.” Here, the context of extinction serves as a hierarchical information cue that signals that the CS will not be paired with the US (e.g., Holland, 1992; Schmajuk, Lamoureux, & Holland, 1998). In a way, it sets the occasion for the CS’s meaning. We will suggest a third form of inhibition when we consider the context’s role in instrumental extinction: The context may directly inhibit the instrumental response (e.g., Rescorla, 1993, 1997). We should add that the effects of contexts on extinction are not exclusively inhibitory. They can have “excitatory” influences, too, as in reinstatement, when the recent association of the context and the US triggers responding to the CS, and in ABA renewal, where testing in the context of conditioning typically causes stronger renewal than testing in a neutral context (ABC renewal; e.g., Harris, Jones, Bailey, & Westbrook, 2000).
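
The first of the inhibitory mechanisms mentioned above, direct contextual inhibition in the Rescorla-Wagner model, can be stated concretely with the model’s standard error-correction equation (the notation here is generic rather than taken from any particular study):

\[
\Delta V_X = \alpha_X \beta (\lambda - \Sigma V)
\]

Here V_X is the associative strength of any cue X present on the trial (a CS or the context), α_X and β are learning-rate parameters, λ is the magnitude of the US that actually occurs (zero on an extinction trial), and ΣV is the summed strength of all cues present. When an excitatory CS is extinguished in a neutral Context B, ΣV is positive while λ = 0, so the error term is negative and the context itself is driven toward negative (inhibitory) strength, which is the sense in which it would come to predict “no US.”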

The next summary principle of extinction is that behavior weakens in extinction in part because of (1.) generalization decrement and (2.) new learning that is driven by prediction error. Generalization decrement occurs in extinction because the stimulus conditions change when extinction begins. For instance, simple omission of the US can remove some of the stimulus support for responding; when the US is still occasionally presented during extinction, but without a contingent relationship with the CS, the loss of responding is slowed (e.g., Frey & Butler, 1977; Rescorla & Cunningham, 1977; Spence, 1966). Although such a result is consistent with several behavioral mechanisms, one is that presenting the US in extinction makes the stimulus conditions of extinction more similar to those of conditioning. Generalization decrement was emphasized in a generation of extinction theories that were developed in the 1950s and 1960s (e.g., Amsel, 1967; Capaldi, 1967). It was the main mechanism that explained the partial reinforcement extinction effect (PREE), in which responding declines more slowly over extinction trials in subjects that have not been reinforced every time they made the response (e.g., Mackintosh, 1974). For example, according to sequential theory (e.g., Capaldi, 1967), the PREE occurs because the subject has been reinforced for responding in the presence of a memory of not being rewarded, which makes responding persist over more nonrewarded extinction trials. In frustration theory (e.g., Amsel, 1967), responding is prolonged because the animal has learned to respond in the presence of frustration cues that are present in extinction. Notice that, consistent with the first summary extinction principle, generalization decrement does not imply that extinction erases the original learning.

As noted above, however, phenomena like ABC and AAB renewal further suggest that extinction is also controlled in part by new (context-dependent) inhibitory learning. According to modern conceptions of the learning process, new learning is driven by “prediction error,” a behavioral mechanism that is embodied in the Rescorla-Wagner model (e.g., Rescorla & Wagner, 1972) and many of the models of conditioning that followed it (e.g., Pearce & Hall, 1980; Wagner, 1981). According to this view, the degree of associative change that occurs on any conditioning or extinction trial is governed by the difference between what is predicted by all the cues present on the trial and the US that actually occurs. The discrepancy between what is predicted and what occurs is the prediction error. To examine the idea, a number of experiments have explored the effects of compounding other CSs with a target CS during extinction. When the target is compounded with a separate predictor of the US, the US prediction is especially strong and responding to the target is especially decremented (e.g., Leung, Reeks, & Westbrook, 2012; Rescorla, 2000, 2006; Wagner, 1969). Conversely, when the target CS is compounded with a CS that actively predicts “no US” (that is, a conditioned inhibitor), prediction error is reduced and extinction trials produce less associative change (e.g., Rescorla, 2003; Soltysik et al., 1983). We should note that presenting a target CS in compound with other CSs during extinction can introduce complications. For example, compounding the target CS with two excitors may be less effective than compounding it with one excitor (McConnell, Miguez, & Miller, 2013), and there may be increased generalization decrement when a CS extinguished in compound with other stimuli is later tested alone (Urcelay, Lipatova, & Miller, 2009; Vervliet, Vansteenwegen, Hermans, & Eelen, 2007). Perhaps the most striking evidence for the role of prediction error in extinction, however, is the “overexpectation effect,” in which two CSs are separately associated with the US and then presented in a compound that is paired with the US. Even though the compound is reinforced on each trial, the compounded CSs summate to predict a greater US than the one that occurs, and therefore undergo some extinction (see Kremer, 1978; Lattal & Nakajima, 1998; Rescorla, 2007). This sort of result has had a profound effect on how we understand the conditions that drive new learning in extinction. Interestingly, just like other extinction phenomena, the decrement in conditioned responding caused by overexpectation is subject to spontaneous recovery as well as renewal (e.g., Rescorla, 2006, 2007).
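
The arithmetic behind these compounding and overexpectation effects can be illustrated with a small simulation of the error-correction rule just described. The sketch below is purely illustrative: the learning-rate value, starting associative strengths, cue labels, and trial counts are arbitrary choices rather than parameters from any of the experiments cited here.

```python
# A minimal Rescorla-Wagner sketch. On each trial, every cue that is present changes in
# proportion to the shared prediction error: the difference between the US that occurs
# (lam) and the summed prediction of all cues present. All values are illustrative.

ALPHA_BETA = 0.3  # combined learning-rate parameter (arbitrary)

def rw_trial(V, cues, lam):
    """Run one trial and update each presented cue by the shared prediction error."""
    error = lam - sum(V[c] for c in cues)
    for c in cues:
        V[c] += ALPHA_BETA * error
    return error

# Two separately conditioned CSs at asymptote (strength 1.0) plus a second excitor X.
V = {"A": 1.0, "B": 1.0, "X": 0.8}

# 1) Simple extinction of A alone: lam = 0, the error is negative, and V(A) declines.
for _ in range(6):
    rw_trial(V, ["A"], lam=0.0)

# 2) Extinguishing B in compound with excitor X produces a larger (more negative) error
#    on the trial than extinguishing B alone would (error = 0 - (1.0 + 0.8) = -1.8),
#    so the decrement to B on that trial is correspondingly larger.
rw_trial(V, ["B", "X"], lam=0.0)

# 3) Overexpectation: two asymptotic CSs reinforced in compound (lam = 1.0) still lose
#    strength, because their summed prediction (2.0) exceeds the US that actually occurs.
V2 = {"C": 1.0, "D": 1.0}
for _ in range(10):
    rw_trial(V2, ["C", "D"], lam=1.0)

print(round(V["A"], 2))   # ~0.12: extinguished toward zero over six trials
print(round(V["B"], 2))   # 0.46: one compound trial removed 0.54 (vs. 0.30 for B alone)
print(round(V2["C"], 2))  # ~0.5: the reinforced compound settles at half the asymptote
```

In the model itself these decrements amount to unlearning; as the behavioral evidence reviewed above makes clear, real extinction also involves context-dependent new inhibitory learning.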

The last principle of extinction in Table 1 is that the extinction paradigm (in which a CS is presented without the US after previous CS-US pairings) is just one of several interference paradigms that all follow similar principles (Bouton, 1993). In interference paradigms, the animal learns conflicting information about a CS in different phases of an experiment, and the two pieces of learning interfere with one another. In extinction, the focal interference is retroactive; learning in Phase 2 interferes with performance based on what was learned in Phase 1. In other paradigms, such as discrimination reversal learning and counterconditioning (in which a CS is paired with USs of different emotional valences in Phases 1 and 2), there is a mixture of both retroactive and proactive interference. And in still other paradigms, such as latent inhibition, the focus is primarily on proactive interference. Although theories of the various interference paradigms have often focused on interference at the level of storage and learning, there is evidence to suggest that interference often occurs at the level of performance output (see Bouton, 1993). Here again, there is a difference between knowledge and performance. As in extinction, physical context and temporal context (time) have a similar influence across these paradigms. Miller and Escobar (2002) have expanded this point of view, for example, to also include situations in which a given US is associated with different predictors in different phases.

Neurobiological mechanisms of Pavlovian extinction

Advances in our understanding of extinction from a behavioral point of view contributed directly to a new interest in the underlying neurobiological mechanisms that began to crystallize and mature in the 2000s (e.g., Delamater, 2004; Myers & Davis, 2002). In general, the neuroscience literature can be divided into several areas that emphasize one of the major principles just described. Perhaps the largest area is one investigating the neural circuits and synaptic changes that support new learning in extinction. Although there is evidence that some of the synaptic changes that occur during conditioning can be reversed by extinction (Kim et al., 2007; Lin, Yeh, Lu, & Gean, 2003; see also Delamater, 2012), many findings strongly support the idea that new learning occurs, and that like conditioning (as well as other forms of learning), extinction depends on NMDA receptor activity (for reviews, see Davis, 2011; Orsini & Maren, 2012; Quirk & Mueller, 2008). Many studies have now shown that blocking this receptor immediately before or after extinction training blocks fear extinction (e.g., Burgos-Robles, Vidal-Gonzalez, Santini, & Quirk, 2007; Falls, Miserendino, & Davis, 1992; Santini, Muller, & Quirk, 2001), indicating that the NMDA receptor is necessary for both encoding and consolidation of extinction. Indeed, NMDA receptors in the basolateral amygdala (BLA; Falls et al., 1992) are necessary during extinction training, whereas those in the infralimbic area of the medial prefrontal cortex (IL mPFC; Burgos-Robles et al., 2007) are needed immediately following training. In a complementary way, enhancing NMDA activity has been shown to facilitate fear extinction. Modulating NMDA activity has thus become an important therapeutic target for clinical scientists seeking to optimize extinction-based therapies (see below for a more detailed discussion).

Interactions between the BLA and IL mPFC are now thought to embody the inhibitory mechanism that controls fear responding after extinction. As the behavioral research makes clear, however, the context plays a major role in modulating the expression of extinction. A corresponding literature has focused on understanding the role of context, and has implicated the hippocampus, a region known to be critically involved in processing contextual stimuli and forming representations of the context. The work of Maren and colleagues (Corcoran & Maren, 2001, 2004; Hobin, Ji, & Maren, 2006; Orsini, Kim, Knapska, & Maren, 2011; see also Orsini & Maren, 2012) has emphasized hippocampal involvement in renewal (but see Campese & Delamater, 2013; Frohardt, Guarraci, & Bouton, 2000; Wilson, Brooks, & Bouton, 1995; Zelikowsky, Pham, & Fanselow, 2012). Using several different manipulations, they have demonstrated that hippocampal inactivation prior to renewal testing prevents extinguished fear from returning. Orsini et al. (2011) have extended those findings by showing that disrupting communication between the hippocampus and BLA by severing direct or indirect pathways (via the PFC) abolishes renewal. Although there is far less research examining hippocampal involvement in other relapse effects, data from our laboratory have shown that the hippocampus plays a role in reinstatement after fear extinction (Frohardt et al., 2000; Wilson et al., 1995), although its role after appetitive extinction is less certain (Fox & Holland, 1998).

Another area focuses on the role of prediction error and how it is encoded in the brain. Recent work in fear extinction implicates the actions of endogenous opioids in the detection of negative prediction errors. McNally and colleagues (McNally, Pigg, & Weidemann, 2004a; McNally & Westbrook, 2003) have demonstrated such action in several experiments in which they blocked opioid activity with naloxone. Rats injected with naloxone immediately prior to (but not after) extinction showed dose-dependent deficits in within-session extinction that remained on later tests. Even more convincingly, blocking opioid receptors prevented the loss of responding due to overexpectation when two CSs were reinforced in compound (McNally, Pigg, & Weidemann, 2004b). Prediction errors appear to be encoded by other brain systems as well. Studies of appetitive learning have implicated the activity of midbrain dopamine neurons. These neurons fire in response to surprising rewards and are suppressed when expected rewards are withheld (Hollerman & Schultz, 1998; Waelti, Dickinson, & Schultz, 2001). Thus, dopamine neurons can detect the occurrence, size, and direction of prediction errors. More recent studies have also demonstrated the role of dopamine prediction errors in fear extinction, suggesting a broader role for them than in learning about the absence of positive rewards (Holtzman-Assif, Laurent, & Westbrook, 2010).

As noted above, researchers have identified the NMDA receptor as a critical component of the new learning that occurs in extinction. That discovery has led to investigation of whether enhancing NMDA activity can facilitate or strengthen extinction learning. Most of this work has focused on the drug d-cycloserine (DCS), a partial agonist of the receptor. Prior to its use in fear extinction studies, DCS had been shown to facilitate other forms of learning in animals, including eyeblink conditioning (Thompson & Disterhoft, 1997) and maze learning (Quartermain, Mower, Rafferty, Herting, & Lanthorn, 1994; Pussinen et al., 1997). Walker, Ressler, Lu, and Davis (2002) were the first to show that DCS enhanced fear extinction. In their experiments, animals treated with DCS shortly before extinction showed less fear when tested the following day. Administration of DCS without extinction training had no effect on its own. Other laboratories have since demonstrated similar results with fear extinction (Bouton, Vurbic, & Woods, 2008; Langton & Richardson, 2010; Ledgerwood, Richardson, & Cranney, 2003; Woods & Bouton, 2006) and non-fear paradigms such as conditioned place preference or aversion (Botreau, Paolone, & Stewart, 2006; Myers & Carlezon, 2010; Paolone, Botreau, & Stewart, 2009). Importantly, several of these studies have shown that post-training administration is similarly effective in enhancing extinction on subsequent tests, suggesting that DCS facilitates consolidation of extinction memories. This conclusion is consistent with other findings that the effect of DCS is reduced as the delay between extinction and post-training DCS administration is increased (Ledgerwood et al., 2003).

The generality of these effects across different learning paradigms suggests that DCS may have translational potential when administered alongside extinction-based clinical treatments, and indeed some human studies have shown significant benefits for exposure therapy (e.g., Otto et al., 2010; Ressler et al., 2004; Smits et al., 2013; but see Litz et al., 2012). However, as the behavioral research on extinction makes clear, there are important issues to consider regarding the use of extinction-enhancing drugs in therapy. One is that there is no a priori reason to think that a drug that enhances extinction learning will change the nature of extinction learning qualitatively. Thus, one must ask whether DCS affects extinction’s fundamental context specificity. In our laboratory, DCS facilitates fear extinction but leaves animals vulnerable to renewal (Bouton et al., 2008; Woods & Bouton, 2006); even extinction learning facilitated by DCS is still context-specific. It is more likely that DCS enables extinction learning to progress more quickly or with fewer exposures to the CS; that is, DCS enhances extinction quantitatively rather than qualitatively. In clinical settings, this may translate to achieving treatment goals in fewer therapy sessions. However, a further caveat is that DCS is ineffective with minimal training: some extinction must be learned while the drug is in the system in order for DCS to facilitate it (Bouton et al., 2008; Smits et al., 2013; Weber, Hart, & Richardson, 2007). There is also evidence from animal studies that DCS may actually impair extinction if too little training is given. Lee, Milton, and Everitt (2006) found that DCS-treated animals given a single CS exposure can display more fear than controls on subsequent tests. Based on these studies, it is clear that DCS does not erase fear memories or protect against renewal, and may not decrease (or may even increase) fearful responding under some conditions. Several findings from human studies appear to be consistent with that possibility (Litz et al., 2012; Smits et al., 2013).

Another area of research has responded to the behavioral evidence that extinction does not cause erasure by seeking new ways to interfere with fearful memories more permanently. This research began by focusing on the way that fear memories are stored in the brain. Investigators have long known that presenting fear-conditioned animals with a retrieval cue can make the fear memory temporarily more sensitive to various disruptions (Misanin, Miller & Lewis, 1968; Nader, 2003; Nader, Schafe, & LeDoux, 2000). If performed within a short post-retrieval window, these manipulations result in long-term decreases in fear responding to the CS. Such findings have led Nader and colleagues to propose that fear memories are not necessarily permanent after being consolidated. Rather, the act of retrieving a memory makes it unstable again. Before being reconsolidated, a newly unstable memory can be weakened if the neurobiological processes required for memory storage (or in this case re-storage) are stopped. This idea is mainly supported by experiments in which weakly fear-conditioned rats are given drugs that stop protein synthesis from occurring soon after memory retrieval. When later tested, the rats fail to show any fear (Nader et al., 2000; see Kindt, Soeter, & Vervliet, 2009, and Soeter & Kindt, 2012, for similar findings in humans).

Similar effects have more recently been demonstrated with behavioral procedures that use extinction in place of drugs. Monfils, Cowansage, Klann, and LeDoux (2009) reported that rats given fear extinction shortly after memory retrieval did not show any signs of relapse, including spontaneous recovery, renewal, and reinstatement. On the other hand, rats that were given extinction after a longer post-retrieval delay (i.e., after reconsolidation was presumed to be complete) demonstrated the usual recovery effects. The findings have generated considerable interest because they suggest that under certain conditions extinction procedures may lead to erasure and unlearning. However, studies in this area are still few, and there are several unanswered questions about the behavioral and neurobiological mechanisms that are involved. One important issue, which has also been a longstanding question for the earlier studies using post-retrieval drugs, is whether this retrieval–extinction procedure actually changes (or erases) fear memories or instead makes them less accessible. The absence of fear responding on a test cannot distinguish between these possibilities (e.g., see Lattal & Stafford, 2008). Another question concerns the fact that memory retrieval is typically achieved with a brief nonreinforced presentation of the CS that would otherwise be considered an ordinary extinction trial. It is not clear why extinction trials should reactivate fear memories and allow subsequent extinction trials to cause erasure in the Monfils et al. (2009) paradigm, but produce context-dependent new extinction learning in most other extinction protocols, although it is becoming apparent that different molecular processes are involved (see Auber, Tedesco, Jones, Monfils, & Chiamulera, 2013). Also, although analogous findings have been replicated in other Pavlovian conditioning experiments (Flavell, Barber, & Lee, 2011; Schiller, Monfils, Raio, Johnson, LeDoux, & Phelps, 2010), they have not been obtained universally (Chan, Leung, Westbrook, & McNally, 2010; Flavell et al., 2011; Ishii, Matsuzawa, Matsuda, Tomizawa, Sutoh, & Shimizu, 2012; Kindt & Soeter, 2013; Soeter & Kindt, 2011).

Behavioral mechanisms of instrumental extinction

Other advances are being made in understanding the behavioral and neurobiological mechanisms of instrumental extinction. In instrumental (or operant) learning, organisms learn to perform certain behaviors (e.g., pressing a lever or pulling a chain) when a reinforcing event, such as a food pellet, is presented as a consequence of the behavior. In extinction, the behavior declines when the reinforcer is no longer presented. Instrumental extinction is as important to understand as Pavlovian extinction if we want a complete understanding of the phenomenon. Furthermore, it is worth studying in its own right because instrumental behavior is the animal laboratory’s model of voluntary action, choice, and decision making (e.g., Balleine & Ostlund, 2007). Understanding instrumental extinction may thus lead to more direct insight into the inhibition of voluntary behaviors, such as drug taking, overeating, and gambling.

Our laboratory has recently expanded from studying Pavlovian extinction to also studying the behavioral mechanisms that are involved in instrumental extinction. This effort has used our previous research on Pavlovian extinction, and the principles listed in Table 1, as its guide. In general, the findings indicate strong parallels between the principles that govern extinction of Pavlovian and instrumental conditioning. For example, the demonstration of relapse effects (renewal, resurgence, rapid reacquisition, and reinstatement) after instrumental extinction once again indicates that extinction does not erase original learning but instead results in new learning that is at least partly context-dependent.

Relapse after instrumental extinction

Perhaps the most basic of the relapse effects is the renewal effect. All three forms of renewal (ABA, AAB, and ABC) have now been demonstrated after the extinction of instrumental behavior (e.g., Bouton, Todd, Vurbic, & Winterbauer, 2011). For example, Bouton et al. (2011) first trained rats to lever press for food pellets in Context A on a variable-interval 30 s (VI 30 s) schedule (pellets were made available with a 1/30 probability every second). After several sessions of acquisition, half the rats were switched to Context B for extinction (where lever presses no longer resulted in food pellet delivery) and the other half received extinction in Context A. Extinction lasted for four sessions, at which point responding was quite low. In two final sessions, all rats were tested in their extinction context and in the other (renewal) context (order counterbalanced). For rats that underwent extinction in Context B, a return to Context A caused a significant increase in lever press responding (ABA renewal, see also Nakajima, Tanaka, Urushihara, & Imada, 2000). And for rats that received extinction in Context A, a move to Context B also caused responding to increase (AAB renewal; see also Todd, Winterbauer, & Bouton, 2012a). Renewal also occurred in a similar experiment in which acquisition, extinction, and testing occurred in separate contexts (ABC renewal; see also Todd, Winterbauer, & Bouton, 2012b).
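
The schedule arrangement described parenthetically above is simple to program. The following sketch is a hypothetical illustration of that arrangement (a pellet is set up with probability 1/30 each second and then delivered for the next lever press); the class name, response rate, and session length are invented for the example and are not the code or parameters used in these experiments.

```python
import random

# Hypothetical sketch of the interval schedule described above: each second, a pellet is
# "set up" with probability 1/30; once set up, it is held until the next lever press,
# which is then reinforced. On average, about one pellet becomes available every 30 s,
# largely independent of how quickly the animal responds.

class IntervalSchedule:
    def __init__(self, setup_prob=1.0 / 30.0):
        self.setup_prob = setup_prob   # per-second probability that a pellet is set up
        self.available = False

    def tick_second(self):
        """Advance session time by one second."""
        if not self.available and random.random() < self.setup_prob:
            self.available = True

    def lever_press(self):
        """Return True if this press is reinforced (a pellet had been set up)."""
        if self.available:
            self.available = False
            return True
        return False

# Brief demonstration: a 10-min session in which the rat presses about once every 2 s.
# Extinction would correspond to running the same procedure with setup_prob = 0.
schedule = IntervalSchedule()
pellets = 0
for second in range(600):
    schedule.tick_second()
    if random.random() < 0.5:                 # hypothetical response probability per second
        pellets += schedule.lever_press()
print("pellets earned in 10 min:", pellets)   # about 600/30 = 20 on average
```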

Like renewal, resurgence indicates that instrumental extinction does not erase the original learning (see Bouton, Winterbauer, & Todd, 2013, for a review). In resurgence, one operant behavior (e.g., pressing one lever) is first reinforced and then undergoes extinction in a second phase. While the first behavior is in extinction, a second behavior (e.g., pressing a second lever) is reinforced. In a third phase, when the second behavior is then extinguished, the first behavior returns, or “resurges.” For example, Winterbauer and Bouton (2010, Experiment 1) first trained rats to lever press (L1) for food pellets on a VI 30 s schedule. In the next phase, L1 was nonreinforced (extinguished) while a second lever (L2) was reinforced for some rats, but presented and nonreinforced for other rats. In the final phase, when both L1 and L2 were presented and nonreinforced, only rats that had L2 reinforced during the previous phase showed resurgence. That is, when L2 had been reinforced while L1 was being extinguished, responding resurged on L1 when L2 was subsequently extinguished.

Several behavioral mechanisms of resurgence have been suggested (e.g., Leitenberg, Rawson, & Bath, 1970; Leitenberg, Rawson, & Mulick, 1975; Podlesnik & Shahan, 2009; Shahan & Sweeney, 2011). However, we have emphasized that resurgence can be understood as another form of the renewal effect (e.g., Winterbauer & Bouton, 2010). According to this perspective, reinforcement of the second behavior is part of the “context” in which the first behavior is extinguished. Resurgence then occurs when the reinforcer context is removed (see Bouton, Rosengard, Achenbach, Peck, & Brooks, 1993). Thus, resurgence is conceptually similar to the ABC renewal effect (e.g., Bouton et al., 2011). The difference is that in resurgence the animal’s own behavior and/or the reinforcers it produces serve as the context, instead of the physical surroundings.

One prediction of this account of resurgence is that any manipulation that makes the final change in context less detectable should reduce the effect. One way this can be accomplished is by “fading” or “thinning” the rate of reinforcement of the second behavior during extinction of the first behavior. Thus, the first behavior undergoes extinction in the context of infrequent reinforcement, which makes the extinction context similar to the final test context, in which neither response produces reinforcement. To test this idea, Winterbauer and Bouton (2012) first trained rats to lever press (L1) on a VI 30 s schedule. Next, during extinction of L1, a second lever (L2) was reinforced with the same schedule throughout this phase (either fixed interval [FI] or random interval [RI] 20 s) for several sessions. For these two groups, when L2 was then extinguished, there was a strong resurgence effect. However, final test performance was much different for two other groups. For these groups, during extinction of L1, the schedule of reinforcement of L2 was gradually thinned from either an FI or RI 20 s schedule to a final schedule of FI or RI 120 s. When L2 was subsequently nonreinforced, the degree of resurgence in these two groups was much less pronounced. Similar results have recently been reported by Sweeney and Shahan (2013). Thus, thinning the reinforcement schedule of L2 results in less resurgence. This finding indicates that resurgence depends upon the rate of reinforcement used during the extinction phase, a result previously demonstrated by Leitenberg et al. (1975).

In the natural world, the “lapses” in extinguished instrumental behaviors like drug taking or binge eating that might be caused by renewal and resurgence usually lead to new pairings of the behavior and the reinforcer (e.g., Bouton, 2000). These new action-reinforcer pairings might then lead to reacquisition of the original behavior. Several experiments from our laboratory and others have shown that reacquisition is also modulated by the context. For example, rats that were not deprived of food were first trained to lever press for either sucrose or sweet/fatty pellets in Context A and then extinguished in Context B (Todd et al., 2012a, Experiment 1). After initial renewal testing, rats were then divided into two groups: One group received reacquisition in Context A (the acquisition context) and one group in Context B (the extinction context). During this phase, every fifth press now earned a pellet. Reacquisition of lever pressing was faster in Context A than in Context B. One way to think about this is that the extinction context slowed reacquisition of the original behavior. In a second experiment (Todd et al., 2012a, Experiment 2b), after acquisition and extinction in Context A, reacquisition was faster in Context B than in Context A. In this experiment, rats were quicker to reacquire lever pressing in a context in which no lever pressing had occurred before. This result emphasizes the inhibitory nature of the extinction context as well as the fact that removal from the extinction context is sufficient for recovery to occur. Other forms of context can also influence reacquisition. For example, Woods and Bouton (2007) introduced infrequent response-reinforcer pairings during extinction of lever pressing. This procedure slowed reacquisition relative to a group that received simple extinction. Woods and Bouton (2007) argued that adding the occasional response-reinforcer pairings during extinction increased generalization between extinction and reacquisition. Essentially, the occasional response-pellet pairings became part of the “context” associated with nonreinforcement.

Following extinction, non-contingent presentations of the reinforcer can cause reinstatement of instrumental responding (e.g., Baker et al., 1991). Behavioral research suggests that this form of “relapse” operates through at least two possible mechanisms (see Bouton & Swartzentruber, 1991). One possibility is that the reinforcer may serve as a discriminative stimulus, or type of context, that directly supports the instrumental response (e.g., Ostlund & Balleine, 1997). Thus, when the reinforcer is presented, it can set the occasion for responding again. In this way, reinstatement may be a form of ABA renewal; responding increases upon a return to the context of conditioning. A second possibility is that reinstatement occurs because presentation of the reinforcer results in conditioning of the context, which then facilitates responding (e.g., Pearce & Hall, 1979; Baker et al., 1991). As noted earlier, prior research indicates that this mechanism is especially important in Pavlovian reinstatement (e.g., Bouton, 1984; Bouton & Bolles, 1979b; Bouton & King, 1983). And consistent with its role in instrumental learning, reinstatement following instrumental extinction depends on the reinforcer being presented in the context of testing and is weakened by introducing sessions in which the animal is exposed to the context repeatedly between re-presentation of the reinforcer and testing (Baker et al., 1991). Like the other recovery effects, reinstatement highlights the fact that extinction is not erasure. And both the discriminative stimulus and contextual conditioning mechanisms implicate a role for context.

Behavioral mechanisms of the contextual control of instrumental extinction

The finding that removal from the extinction context (AAB, ABC) is sufficient for renewal to occur is especially strong evidence that the extinction context somehow inhibits instrumental responding. As in Pavlovian conditioning, there are several ways this inhibition might operate (see Bouton et al., 2011). As noted earlier, one possibility is that during extinction, the organism learns an inhibitory association between the context and the reinforcer. In this case, the context would acquire inhibitory properties akin to those of a conditioned inhibitor and suppress the representation of the reinforcer (e.g., Polack, Laborda, & Miller, 2011). A second possible mechanism is conceptually similar to the well-accepted model of Pavlovian extinction discussed earlier, in which the context, as an occasion setter, activates an inhibitory association between the CS and US (e.g., Bouton, 1997; Bouton & Ricker, 1994). In instrumental extinction, the context might analogously activate an inhibitory association between the response and the reinforcer. Finally, a third possible mechanism is an inhibitory association between the context and the response. According to this mechanism, the extinction context would directly suppress the instrumental response. Rescorla (1993, 1997) has suggested that this type of association is formed between discriminative stimuli and their responses during extinction, although his experiments did not distinguish this account from an occasion-setting account (Bouton, 2004; Rescorla, 1993, p. 335; Rescorla, 1997, p. 249).

Based on recent experiments conducted in our laboratory, Todd (2013) has suggested that instrumental extinction may be best characterized by the inhibitory context-response mechanism. In his experiments, renewal was observed when both the extinction and renewal contexts had equivalent reinforcement histories and associative properties (e.g., Bouton & Ricker, 1994; Campese & Delamater, 2013; Delamater, Campese, & Westbrook, 2009; Harris et al., 2000; Rescorla, 2008). For example, in one experiment (Todd, 2013, Experiment 1), rats were first trained to perform one response (R1, lever press or chain pull, counterbalanced) in one context (A), and the other response (R2) in a different context (B). The method ensured that during the acquisition phase, both contexts were equally associated with reinforcement. Next, R1 underwent extinction in Context B, and R2 underwent extinction in Context A. This symmetrical treatment ensured that both Contexts A and B were also equally associated with nonreinforcement. Finally, R1 and R2 were both tested in their extinction and conditioning contexts. There was a clear renewal effect for both responses: R1 was high in A but low in B, whereas R2 was high in B but low in A. Using analogous designs, Todd (2013) also demonstrated AAB and ABC renewal. Because the contexts were equally associated with conditioning and extinction, their direct associations with the reinforcer did not differ and could not produce the differential responding observed during the renewal test.

While the renewal effects observed by Todd (2013) cannot be explained by differential context-reinforcer associations, they can be explained by either the inhibitory context-response or the occasion-setting mechanisms discussed above. According to the inhibitory context-response hypothesis, either response would be released from its inhibition when tested in the other context. According to the occasion-setting hypothesis, either response would be released from hierarchical inhibitory control by its extinction context. One problem for the occasion-setting account, however, is that the effects of negative occasion setters tend to transfer and influence other suitable targets (e.g., Holland & Coldwell, 1993; Morell & Holland, 1993). In Todd’s experiments, such transfer should have reduced any renewal of R1 (or R2) when it was tested in a context that had been associated with extinction of the other response. However, because transfer of occasion setting is often incomplete, some renewal could have still been observed. To further test for the role of negative occasion setting, Todd (2013, Experiment 4) therefore went on to compare the strength of renewal in a group for which such transfer was possible and a group for which it was not. After rats were trained to perform R1 and R2 in Contexts A and B, one group received extinction of R1 in B and R2 in A. Renewal of R2 was then tested in Context B. If extinction results in the context becoming a negative occasion setter, then the fact that R1 had been extinguished in Context B should reduce the size of any renewal effect of R2 there. To test this, a second group received extinction of R2 in A, but extinction of R1 occurred in a separate context. This group was simply exposed to Context B to ensure it was equally familiar. For this group, Context B had not been trained as a negative occasion setter, and any renewal of R2 there would therefore be strong and undiminished. However, renewal testing revealed that R2 was equally renewed in Context B in the two groups. There was thus no evidence that negative occasion setting had been learned. In fact, the experiment also casts further doubt on the idea that the extinction context enters into a direct inhibitory association with the reinforcer. Such an association could have theoretically reduced renewal of R2 due to inhibition of the reinforcer. Overall, the results are most consistent with the idea that inhibition provided by the extinction context is of the form of a simple and direct inhibitory association between the context and a specific response. The animal simply learns not to make a specific response in a specific context. Other recent research has further supported this hypothesis (Todd, Vurbic, & Bouton, submitted).

Performance of the instrumental response itself appears to have a central role in its extinction. We have previously demonstrated in an ABA renewal design that exposure to the renewal context alone, in the absence of the opportunity to perform the response (i.e., the lever was not inserted into the context), does not weaken the strength of ABA renewal (Bouton et al., 2011, Experiment 4). One way to interpret this finding is that Pavlovian excitation conditioned to the context may not play an important role in ABA renewal (exposure to the context alone should result in extinction of this excitation). However, the finding also suggests that in instrumental extinction, extinction learning may depend on the organism actually performing the response. This notion has been elegantly demonstrated by Rescorla (1997) in experiments that manipulated the overall likelihood of two responses during extinction. Rescorla found that extinction was more complete for the response that had been made more frequently, suggesting that performance of the response itself contributes to extinction. In several experiments, Rescorla (2000, Experiment 2; 2006, Experiment 3) also reported that responding during extinction with two discriminative stimuli presented in compound was higher than responding during a third stimulus extinguished alone. However, extinction training with the compound resulted in greater loss of responding when the individual stimuli were later tested. The greater level of responding created by the compound thus allowed extinction to be “deepened.” Similar effects have been demonstrated more recently in rats responding for cocaine reward (Janak, Bowers, & Corbit, 2012; Kearns, Tunstall, & Weiss, 2012). As before, presenting discriminative stimuli in compound caused more responding during extinction than exposure to a discriminative stimulus alone. But when responding was later tested for spontaneous recovery, it was lower in a stimulus that had been extinguished in compound with a second one, relative to a third stimulus that had been extinguished on its own. We should note that high levels of responding in extinction might also reflect a higher expectation of reinforcement, and thus potentially implicate a role for prediction error in instrumental extinction. We have already seen that similar stimulus compounding effects in Pavlovian extinction have been interpreted in terms of prediction error (e.g., Rescorla, 2000, 2003, 2006; see above). Indeed, Rescorla (2006, Experiment 5) successfully separated the role of prediction error from response level in Pavlovian extinction. However, to our knowledge there has been no separation of these possibilities in an instrumental extinction experiment to date.

Similarities and differences between Pavlovian and instrumental extinction

The preceding discussion demonstrates that the rules that summarize Pavlovian extinction provide a useful framework for studying instrumental extinction. That research suggests that the principles established in Pavlovian extinction often do apply to instrumental extinction. However, the research has also uncovered differences. First, the nature of the inhibition learned in instrumental extinction may be different from the inhibition learned in Pavlovian extinction. As noted above, the instrumental learning data have suggested a role for a direct inhibitory context-response association. In contrast, the Pavlovian data are not consistent with this approach (see Harris et al., 2000), and instead implicate negative occasion-setting by the extinction context. In retrospect, it may not be surprising that the response is more the “focus” in instrumental learning; in Pavlovian conditioning, the animal is mainly learning to react or not to a CS. A second difference is that the strength of the operant response decreases when the context is changed after conditioning (e.g., Bouton, Todd, & León, in press), whereas the Pavlovian response typically seems unaffected (e.g., Rosas, Todd, & Bouton, 2013). Once again, the difference may be consistent with the intuition that the response is the focus of instrumental learning. We would emphasize, however, that operant extinction is still more context-specific than operant conditioning—as implied by ABC and AAB renewal, and perhaps the resurgence effect. Like Pavlovian extinction, instrumental extinction does reflect a context-specific inhibitory effect.

A third difference between instrumental and Pavlovian extinction is that the instrumental situation has more moving parts. For example, explicit conditioned reinforcers (stimuli associated with reinforcer delivery) are always potentially present. Biobehavioral accounts of addiction and instrumental learning have appropriately emphasized the role of both the discriminative stimulus and conditioned reinforcers in instrumental learning (e.g., Everitt & Robbins, 2005; Flagel, Akil, & Robinson, 2009; Milton & Everitt, 2012). But it is worth noting that the presence and use of conditioned reinforcers in experimental procedures in behavioral pharmacology studies is highly variable, and this can complicate interpretation. For example, extinction sometimes involves presentation of the conditioned reinforcer (Bossert, Liu, Lu, & Shaham, 2004; Crombag & Shaham, 2002) and sometimes not (Fuchs, Evans, Parker, & See, 2004; Schwendt, Reichel, & See, 2012). In relapse tests, the conditioned reinforcer is sometimes response-contingent (Chaudhri, Sahuque, & Janak, 2009; Fuchs et al., 2004) and sometimes it is presented noncontingently (Sutton et al., 2003). Unsystematic presentation of the conditioned reinforcer in extinction or “reinstatement” testing will inconsistently introduce Pavlovian processes in addition to instrumental processes. Unfortunately, the term “reinstatement” has also been imprecisely attached to a mixture of phenomena—noncontingent presentation of the conditioned reinforcer, contingent presentation of the conditioned reinforcer, noncontingent presentation of the reinforcer, or testing in the original conditioning context (which learning theorists would call ABA renewal). Given the complexity of instrumental learning methods and their underlying mechanisms, it would make sense to describe them as precisely as possible.

Finally, it is important to note that although generalization decrement is likely to play a role in both Pavlovian and instrumental extinction, its role in the extinction of free-operant responding is necessarily large. This is because, in the typical operant arrangement, the delivery of a reinforcer serves as a cue or discriminative stimulus that directly precedes and sets the occasion for the next lever-press response. When the reinforcer is removed in extinction, the response thus loses a direct source of stimulus support. Consistent with this idea, if reinforcers are presented in extinction but not contingent on behavior, the response is slower to extinguish (e.g., Baker, 1990; Rescorla & Skucy, 1969; Winterbauer & Bouton, 2011). Again, a similar effect occurs in Pavlovian extinction (e.g., Frey & Butler, 1977), and reinforcer presentations can have many effects (e.g., Baker, 1990). But one important function of the reinforcer in free operant methods is that it is a stimulus that directly leads to the next response (e.g., Reid, 1958).

Neurobiological mechanisms of instrumental extinction

Instrumental learning and extinction procedures have recently become important tools for investigating the neurobiology of drug-taking and relapse (e.g., Marchant, Li, & Shaham, 2013). The approach has been stimulated and enriched by behavioral work. For example, many studies have examined the contextual control of extinguished operant behavior reinforced by drugs of abuse (see Bouton, Winterbauer, & Vurbic, 2012 and Millan, Marchant, & McNally, 2011 for reviews). The area has the potential to elucidate neural and behavioral mechanisms that might contribute to drug taking behavior.

Instrumental extinction and relapse of drug reinforced behavior

Renewal of extinguished instrumental behavior has been repeatedly demonstrated when responding is first reinforced with drugs of abuse. Crombag and Shaham (2002) were among the first to demonstrate renewal with a drug self-administration paradigm. In their experiment, rats were first reinforced for lever pressing with intravenous administration of a mixture of cocaine and heroin in Context A. After many sessions of extinction in Context B, responding renewed when testing occurred back in Context A. This form of renewal (ABA) has since been demonstrated using a number of drug reinforcers, including alcohol (e.g., Chaudhri et al., 2009; Hamlin, Newby, & McNally, 2007; Zironi, Burattini, Aicardi, & Janak, 2006), heroin alone (e.g., Bossert et al., 2004), and cocaine alone (e.g., Hamlin, Clemens, & McNally, 2008). Interestingly, although the other forms of renewal (AAB and ABC) have now been clearly demonstrated with food-reinforced responses (Bouton et al., 2011; Todd, 2013), these forms have yet to be convincingly demonstrated in drug self-administration. Several experiments have failed to demonstrate AAB renewal (Bossert et al., 2004; Crombag & Shaham, 2002) or ABC renewal (Zironi et al., 2006) of responding for drugs. At this point, it is not clear whether the failure to obtain AAB and ABC renewal with drug reinforcers is due to methodological differences or a more fundamental difference between food-pellet and drug reinforcers.

Other behavioral forms of relapse have been studied in drug self-administration experiments. Resurgence has been demonstrated with responding for alcohol (Podlesnik, Jimenez-Gomez, & Shahan, 2006) and cocaine (Quick, Pyszczynski, Colston, & Shahan, 2011). In these experiments, the instrumental response was extinguished while an alternative response was reinforced with food. When this second response was then extinguished, the original drug-seeking response resurged even though the drug remained absent. Reacquisition has in turn been studied with behaviors that have been reinforced with alcohol (e.g., Perry & McNally, 2012; Willcocks & McNally, 2011, 2013). Like food-reinforced behavior, the reacquisition of alcohol-reinforced behavior is modulated by the context. For example, Willcocks and McNally (2011, Experiment 4) first trained rats to nose-poke for alcoholic beer in Context A. Extinction of responding was then conducted in Context B. During a reacquisition test, when nose-pokes were again reinforced with beer, the latency to the first response was shorter in Context A than in B. This result is similar to the findings of Todd et al. (2012a) described above, where reacquisition was more rapid when it occurred outside the context of extinction (cf. Willcocks & McNally, 2011, Experiment 3). Reinstatement of drug self-administration by noncontingent presentation of drug reinforcers has also been widely demonstrated. For example, extinguished lever pressing that was previously reinforced with heroin or cocaine is reinstated by noncontingent infusion of the drug (Banks, Sprague, Czoty, & Nader, 2008; Botly, Burton, Rizos, & Fletcher, 2008; de Wit & Stewart, 1981, 1983; Weerts, Kaminski, & Griffiths, 1998). Under some conditions, extinguished drug seeking can also be reinstated by stressors such as food deprivation and unsignaled footshock delivered in the test context (Shalev, Highfield, Yap, & Shaham, 2000).

Neurobiology of instrumental extinction and relapse

There is also a growing body of research in the drug self-administration area that has started to elucidate the neural mechanisms of instrumental extinction and relapse (for reviews see Bossert, Marchant, Calu, & Shaham, 2013; Crombag, Bossert, Koya, & Shaham, 2008; Marchant et al., 2013; Peters, Kalivas, & Quirk, 2009; Willcocks & McNally, 2013). The prefrontal cortex is again critically involved. In particular, two regions have been implicated in controlling instrumental extinction performance: the IL mPFC mentioned in our previous discussion of Pavlovian extinction, and the prelimbic area of the medial prefrontal cortex (PL mPFC). These regions are thought to have opposing roles in the extinction of conditioned fear (Laurent & Westbrook, 2009; Peters et al., 2009), and there is some evidence that they may have opposing roles in the extinction of instrumental behavior as well. Recently, Peters and De Vries (2013) have shown that the IL mPFC is critical for instrumental extinction to be learned: inactivating NMDA receptors prior to extinction of sucrose seeking disrupted later extinction performance. Moreover, Peters, LaLumiere, and Kalivas (2008) reported that inactivating the IL mPFC following the extinction of cocaine seeking caused an increase in responding, whereas activating this region just prior to relapse (reinstatement) testing reduced the overall strength of response recovery. These data suggest that the IL mPFC somehow inhibits behavior during extinction (see Peters et al., 2009). However, other studies suggest that these effects may depend on the type of reinforcer, or may instead reflect one of several different roles for this region in extinction and retrieval (Bossert et al., 2011, 2013). For instance, Bossert and colleagues (2011) reported that inactivating a subset of IL neurons reduced the renewal of heroin seeking, a reduction in responding that is difficult to reconcile with the region's putative role in mediating extinction performance. Those results are instead consistent with the effects of inactivating the PL mPFC, which is known to reduce relapse after extinction (e.g., McFarland & Kalivas, 2001; Willcocks & McNally, 2013). For example, Willcocks and McNally (2013) first trained rats to nose-poke for alcoholic beer in Context A. Responding was extinguished in Context B before being tested in both contexts. Inactivation of the PL mPFC reduced renewal (the recovery of responding seen in Context A) but had no effect on the expression of extinction (responding in Context B).

Many other brain structures have been implicated in the extinction and relapse of drug seeking. For example, the medial dorsal region of the tuberal hypothalamus (MDH) is thought to exert inhibitory control over extinguished reward seeking (for a review see Marchant, Millan, & McNally, 2012). Along with the IL mPFC and the MDH, the nucleus accumbens shell has been implicated in the inhibition of responding during extinction (for a review see Millan et al., 2011; see also Millan & McNally, 2011). In contrast, the ventral tegmental area (e.g., Bossert et al., 2004), the nucleus accumbens core and shell (e.g., Chaudhri et al., 2009; Fuchs, Ramirez, & Bell, 2008; Millan & McNally, 2012), the basolateral amygdala and lateral hypothalamus (e.g., Hamlin et al., 2008; Hamlin et al., 2007), and the dorsal hippocampus (Fuchs, Eaddy, Su, & Bell, 2007) have all been implicated in relapse (renewal) following extinction.

Effect of DCS on instrumental extinction

The many parallels between Pavlovian and instrumental learning have led investigators to ask whether treatments that enhance Pavlovian extinction can also enhance the extinction of instrumental responding. Much research has focused again on the effects of DCS. When delivered systemically, DCS has been shown to facilitate extinction in animal studies using several different reinforcers, such as food (Leslie & Norwood, 2013; Shaw et al., 2009), alcohol (Vengeliene, Kiefer, & Spanagel, 2008), and cocaine (Nic Dhonnchadha et al., 2010; Thanos, Bermeo, Wang, & Volkow, 2011a; Thanos et al., 2011b). More recently, a similar enhancement has been reported when DCS was infused directly into the IL mPFC (Peters & De Vries, 2013). These studies suggest that DCS may have similar actions across different extinction learning paradigms.

As before, however, one critical issue is whether DCS simply facilitates normal extinction learning or fundamentally changes something about it, such as its context specificity. As described above, studies of fear extinction have shown that DCS-treated animals remain vulnerable to renewal, indicating that extinction learning remains context specific (Bouton et al., 2008; Woods & Bouton, 2006). Moreover, given the general complexity of instrumental learning and the involvement of Pavlovian associations in supporting instrumental behavior, another issue is whether DCS targets extinction of instrumental associations, Pavlovian associations, or both. We investigated these questions in a series of experiments with rats lever pressing for food pellets (Vurbic, Gold, & Bouton, 2011). After acquisition in Context A, extinction with DCS was conducted in Context B. Rats were then tested in both contexts to determine whether any enhancement of extinction was specific to the context where extinction was learned (i.e., Context B). In contrast to the results cited above, there was no effect of DCS in either context. We noted that in all of the other studies in which DCS was delivered systemically, animals were presented with discrete reinforcer-paired stimuli (e.g., a buzzer sound with concurrent retraction of the lever; Leslie & Norwood, 2013; Shaw et al., 2009), which may have become conditioned reinforcers. These stimuli were also presented during the extinction and test phases. It is therefore possible that DCS facilitated Pavlovian extinction of the conditioned reinforcers rather than any instrumental association. Consistent with this idea, in an experiment by Thanos et al. (2011b), DCS facilitated extinction of lever pressing for cocaine only in mice presented with cocaine-paired cues during extinction; mice given extinction without cue exposure showed no such effect. In addition, Torregrossa, Sanchez, and Taylor (2010) obtained a DCS effect on extinguished lever pressing when the drug was delivered immediately after sessions in which the cocaine-paired conditioned reinforcer was extinguished by itself (i.e., without the instrumental response).

Although studies of systemic DCS administration thus point to the possibility that it primarily affects Pavlovian associations during instrumental extinction, results with localized brain infusions of DCS complicate the picture. Peters and De Vries (2013) recently showed that DCS delivered directly into the IL mPFC enhanced extinction of instrumental behavior without a conditioned reinforcer being present. Interestingly, Torregrossa et al. (2010) had previously found that direct delivery of DCS to the IL mPFC had no effect on extinction of the conditioned reinforcer without the response. Reconciling the various effects of DCS will be an important step in understanding how it might be used in behavioral therapies for drug addiction. At present, studies on the use of DCS during drug cue exposure in humans have had little success (Kamboj et al., 2011; Kamboj, Joye, Das, Gibson, Morgan, & Curran, 2012; Price et al., 2013; Yoon et al., 2013).

In addition to the growing instrumental literature on the effects of DCS, new work has extended the retrieval-extinction procedure that appears to prevent relapse of extinguished fear (Monfils et al., 2009; Schiller et al., 2010). Using a modified version of the Monfils et al. method, Xue et al. (2012) found that a brief 15-min extinction session followed 10 min later by a longer 180-min extinction session prevented spontaneous recovery, reinstatement, and renewal in rats self-administering cocaine or heroin. Control rats were not given the first session, but were instead given longer 195-min sessions of extinction to equate the total time. The results were partially replicated by Millan, Milligan-Saville, and McNally (2013), who used a similar procedure to prevent renewal in rats responding for alcohol. However, Millan et al. also found the same pattern (i.e., a lack of renewal) in a second experiment in which the order of the retrieval and extinction sessions was reversed. Such a result suggests that the effect may be due to mechanisms other than memory retrieval and disruption of reconsolidation. More research will be necessary to understand these findings.
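
A key feature of the Xue et al. (2012) procedure is that the retrieval-extinction and control conditions are matched for total extinction time and differ only in how that time is distributed. The minimal sketch below (ours, for illustration; the session labels and durations are taken from the description above) makes that comparison explicit.

```python
# Illustrative sketch of the daily schedules described above (durations in minutes).
RETRIEVAL_EXTINCTION_DAY = [
    ("retrieval (brief extinction)", 15),
    ("break", 10),
    ("extinction", 180),
]

CONTROL_DAY = [
    ("extinction", 195),  # single long session matched for total extinction time
]


def total_extinction_minutes(day):
    """Sum the time spent in extinction (the break does not count)."""
    return sum(minutes for label, minutes in day if "extinction" in label)


if __name__ == "__main__":
    # Both conditions provide 195 min of extinction; only its distribution differs.
    print("retrieval-extinction day:", total_extinction_minutes(RETRIEVAL_EXTINCTION_DAY), "min")
    print("control day:", total_extinction_minutes(CONTROL_DAY), "min")
```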

Conclusion

Our review has illustrated a tight coupling between behavior theory and research on the neural mechanisms of extinction. Indeed, it can be said that the study of the neurobiological mechanisms of extinction has stood on the shoulders of research that has investigated its behavioral underpinnings. However, we will close by noting that, in all likelihood, the linkage between these levels of analysis will need to be ongoing. As our review indicates, extinction and the many behavioral effects that relate to it can often, at least in principle, be multiply determined. For example, in instrumental extinction, the extinction context might directly inhibit the reinforcer or the response, or it might activate an inhibitory association between the response and the reinforcer (cf. Todd, 2013). Similarly, the ABA renewal of extinguished Pavlovian or instrumental responding can follow from several theoretical mechanisms, such as the removal of any of the various forms of inhibition present in Context B, or any of several excitatory influences of Context A (e.g., see Nelson, Sanjuan, Vadillo-Ruiz, Pérez, & León, 2011, for a recent discussion of renewal's complexity). Without careful supporting behavioral investigation, a neurobiological study can fail to determine the precise effect of a neural manipulation. Any structure involved in extinction (such as the IL mPFC) could in principle be involved in any of several possible inhibitory mechanisms, and its inactivation could therefore increase responding through any of them. Likewise, neural manipulations that influence renewal, such as inactivation of the PL mPFC, are equally open to multiple interpretations. An accurate and sophisticated understanding of the neural mechanisms behind extinction and the various lapse and relapse effects will require an equally sophisticated understanding of the behavioral mechanisms. Success at the neurobiological level of analysis may thus always need support from careful and continued work at the behavioral level of analysis.

Acknowledgments

Preparation of the manuscript was supported by National Institute on Drug Abuse Grant R01 DA033123 to MEB.


References

  1. Amsel A. Partial reinforcement effects on vigor and persistence. In: Spence KW, Spence JT, editors. The Psychology of Learning and Motivation. I. New York: Academic Press; 1967. pp. 1–65. [Google Scholar]
  2. Auber A, Tedesco V, Jones CE, Monfils MH, Chiamulera C. Post-retrieval extinction as reconsolidation interference: methodological issues or boundary conditions? Psychopharmacology. 2013;226:631–647. doi: 10.1007/s00213-013-3004-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Baker AG. Contextual conditioning during free-operant extinction: Unsignaled, signaled, and backward-signaled noncontingent food. Animal Learning & Behavior. 1990;18:59–70. [Google Scholar]
  4. Baker AG, Steinwald H, Bouton ME. Contextual conditioning and reinstatement of extinguished instrumental responding. The Quarterly Journal of Experimental Psychology. 1991;43B:199–218. [Google Scholar]
  5. Balleine BW, Ostlund SB. Still at the choice point: Action selection and initiation in instrumental conditioning. Annals of the New York Academy of Sciences. 2007;1104:147–171. doi: 10.1196/annals.1390.006. [DOI] [PubMed] [Google Scholar]
  6. Banks ML, Sprague JE, Czoty PW, Nader MA. Effects of ambient temperature on the relative reinforcing strength of MDMA using a choice procedure in monkeys. Psychopharmacology. 2008;196:63–70. doi: 10.1007/s00213-007-0932-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Bossert JM, Liu SY, Lu L, Shaham Y. A role of ventral tegmental area glutamate in contextual cue-induced relapse to heroin seeking. The Journal of Neuroscience. 2004;24:10726–10730. doi: 10.1523/JNEUROSCI.3207-04.2004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Bossert JM, Stern AL, Theberge FR, Cifani C, Koya E, Hope BT, Shaham Y. Ventral medial prefrontal cortex neuronal ensembles mediate context-induced relapse to heroin. Nature Neuroscience. 2011;14:420–422. doi: 10.1038/nn.2758. [DOI] [PMC free article] [PubMed] [Google Scholar]
  9. Bossert JM, Marchant NJ, Calu DJ, Shaham Y. The reinstatement model of drug relapse: recent neurobiological findings, emerging research topics, and translational research. Psychopharmacology. 2013 doi: 10.1007/s00213-013-3120-y. in press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Botly LC, Burton CL, Rizos Z, Fletcher PJ. Characterization of methylphenidate self-administration and reinstatement in the rat. Psychopharmacology. 2008;199:55–66. doi: 10.1007/s00213-008-1093-z. [DOI] [PubMed] [Google Scholar]
  11. Botreau F, Paolone G, Stewart J. D-Cycloserine facilitates extinction of a cocaine-induced conditioned place preference. Behavioural Brain Research. 2006;172:173–178. doi: 10.1016/j.bbr.2006.05.012. [DOI] [PubMed] [Google Scholar]
  12. Bouton ME. Differential control by context in the inflation and reinstatement paradigms. Journal of Experimental Psychology: Animal Behavior Processes. 1984;10:56–74. [Google Scholar]
  13. Bouton ME. Context and ambiguity in the extinction of emotional learning: Implications for exposure therapy. Behaviour Research and Therapy. 1988;26:137–149. doi: 10.1016/0005-7967(88)90113-1. [DOI] [PubMed] [Google Scholar]
  14. Bouton ME. Context, time, and memory retrieval in the interference paradigm of Pavlovian learning. Psychological Bulletin. 1993;114:80–99. doi: 10.1037/0033-2909.114.1.80. [DOI] [PubMed] [Google Scholar]
  15. Bouton ME. Signals for whether versus when an event will occur. In: Bouton ME, Fanselow MS, editors. Learning, motivation, and cognition: The functional behaviorism of Robert C. Bolles. Washington, D. C: American Psychological Association; 1997. pp. 385–409. [Google Scholar]
  16. Bouton ME. A learning theory perspective on lapse and relapse and the maintenance of behavior change. Health Psychology. 2000;19:57–63. doi: 10.1037/0278-6133.19.suppl1.57. [DOI] [PubMed] [Google Scholar]
  17. Bouton ME. Context, ambiguity, and unlearning: Sources of relapse after behavioral extinction. Biological Psychiatry. 2002;52:976–986. doi: 10.1016/s0006-3223(02)01546-9. [DOI] [PubMed] [Google Scholar]
  18. Bouton ME. Context and behavioral processes in extinction. Learning & Memory. 2004;11:485–494. doi: 10.1101/lm.78804. [DOI] [PubMed] [Google Scholar]
  19. Bouton ME, Bolles RC. Contextual control of the extinction of conditioned fear. Learning and Motivation. 1979a;10:445–466. [Google Scholar]
  20. Bouton ME, Bolles RC. Role of conditioned contextual stimuli in reinstatement of extinguished fear. Journal of Experimental Psychology: Animal Behavior Processes. 1979b;5:368–378. doi: 10.1037//0097-7403.5.4.368. [DOI] [PubMed] [Google Scholar]
  21. Bouton ME, Hendrix MC. Intertrial interval as a contextual stimulus: further analysis of a novel asymmetry in temporal discrimination learning. Journal of Experimental Psychology: Animal Behavior Processes. 2011;37:79–93. doi: 10.1037/a0021214. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Bouton ME, Kenney FA, Rosengard C. State-dependent fear extinction with two benzodiazepine tranquilizers. Behavioral Neuroscience. 1990;104:44–55. doi: 10.1037//0735-7044.104.1.44. [DOI] [PubMed] [Google Scholar]
  23. Bouton ME, King DA. Contextual control of the extinction of conditioned fear: Tests for the associative value of the context. Journal of Experimental Psychology: Animal Behavior Processes. 1983;9:248–265. [PubMed] [Google Scholar]
  24. Bouton ME, Peck CA. Context effects on conditioning, extinction, and reinstatement in an appetitive conditioning preparation. Animal Learning and Behavior. 1989;17:188–198. [Google Scholar]
  25. Bouton ME, Ricker ST. Renewal of extinguished responding in a second context. Animal Learning & Behavior. 1994;22:317–324. [Google Scholar]
  26. Bouton ME, Rosengard C, Achenbach GG, Peck CA, Brooks DC. Effects of contextual conditioning and unconditional stimulus presentation on performance in appetitive conditioning. The Quarterly Journal of Experimental Psychology. 1993;46B:63–95. [PubMed] [Google Scholar]
  27. Bouton ME, Swartzentruber D. Sources of relapse after extinction in Pavlovian and instrumental learning. Clinical Psychology Review. 1991;11:123–140. [Google Scholar]
  28. Bouton ME, Todd TP, León SP. Contextual control of a discriminated operant behavior. Journal of Experimental Psychology: Animal Behavior Processes. doi: 10.1037/xan0000002. in press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  29. Bouton ME, Todd TP, Vurbic D, Winterbauer NE. Renewal after the extinction of free operant behavior. Learning & Behavior. 2011;39:57–67. doi: 10.3758/s13420-011-0018-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Bouton ME, Winterbauer NE, Todd TP. Relapse processes after the extinction of instrumental behavior: Renewal, resurgence, and reacquisition. Behavioral Processes. 2013;90:130–141. doi: 10.1016/j.beproc.2012.03.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. Bouton ME, Winterbauer NE, Vurbic D. Context and extinction: Mechanisms of relapse in drug self-administration. In: Haselgrove M, Hogarth L, editors. Clinical Applications of Learning Theory. New York, NY: Psychology Press; 2012. pp. 103–133. [Google Scholar]
  32. Bouton ME, Vurbic D, Woods AM. d-Cycloserine facilitates context specific fear extinction learning. Neurobiology of Learning and Memory. 2008;90:504–510. doi: 10.1016/j.nlm.2008.07.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  33. Bouton ME, Woods AM, Pineño O. Occasional reinforced trials during extinction can slow the rate of rapid reacquisition. Learning and Motivation. 2004;35:371–390. doi: 10.1016/j.lmot.2006.07.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Brooks DC, Bouton ME. A retrieval cue for extinction attenuates spontaneous recovery. Journal of Experimental Psychology: Animal Behavior Processes. 1993;19:77–89. doi: 10.1037//0097-7403.19.1.77. [DOI] [PubMed] [Google Scholar]
  35. Burgos-Robles A, Vidal-Gonzalez I, Santini E, Quirk GJ. Consolidation of fear extinction requires NMDA receptor-dependent bursting in the ventromedial prefrontal cortex. Neuron. 2007;53:871–880. doi: 10.1016/j.neuron.2007.02.021. [DOI] [PubMed] [Google Scholar]
  36. Campese V, Delamater AR. ABA and ABC renewal of conditioned magazine approach are not impaired by dorsal hippocampus inactivation or lesions. Behavioural Brain Research. 2013;248:62–73. doi: 10.1016/j.bbr.2013.03.044. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Capaldi EJ. A sequential hypothesis of instrumental learning. Psychology of Learning and Motivation. 1967;1:67–156. [Google Scholar]
  38. Chan WY, Leung HT, Westbrook RF, McNally GP. Effects of recent exposure to a conditioned stimulus on extinction of Pavlovian fear conditioning. Learning & Memory. 2010;17:512–521. doi: 10.1101/lm.1912510. [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. Chaudhri N, Sahuque LL, Janak PH. Ethanol seeking triggered by environmental context is attenuated by blocking dopamine D1 receptors in the nucleus accumbens core and shell in rats. Psychopharmacology. 2009;207:303–314. doi: 10.1007/s00213-009-1657-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  40. Corcoran KA, Desmond TJ, Frey KA, Maren S. Hippocampal inactivation disrupts the acquisition and contextual encoding of fear extinction. The Journal of Neuroscience. 2005;25:8978–8987. doi: 10.1523/JNEUROSCI.2246-05.2005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Corcoran KA, Maren S. Hippocampal inactivation disrupts contextual retrieval of fear memory after extinction. The Journal of Neuroscience. 2001;21:1720–1726. doi: 10.1523/JNEUROSCI.21-05-01720.2001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Corcoran KA, Maren S. Factors regulating the effects of hippocampal inactivation on renewal of conditional fear after extinction. Learning & Memory. 2004;11:598–603. doi: 10.1101/lm.78704. [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. Craske MG, Kircanski K, Zelikowsky M, Mystkowski J, Chowdhury N, Baker A. Optimizing inhibitory learning during exposure therapy. Behaviour Research and Therapy. 2008;46:5–27. doi: 10.1016/j.brat.2007.10.003. [DOI] [PubMed] [Google Scholar]
  44. Craske MG, Liao B, Brown L, Vervliet B. Role of inhibition in exposure therapy. Journal of Experimental Psychopathology. 2012;3:322–345. [Google Scholar]
  45. Crombag HS, Bossert JM, Koya E, Shaham Y. Context-induced relapse to drug seeking: A review. Philosophical Transactions of the Royal Society B. 2008;363:3233–3243. doi: 10.1098/rstb.2008.0090. [DOI] [PMC free article] [PubMed] [Google Scholar]
  46. Crombag HS, Shaham Y. Renewal of drug seeking by contextual cues after prolonged extinction in rats. Behavioral Neuroscience. 2002;116:169–173. doi: 10.1037//0735-7044.116.1.169. [DOI] [PubMed] [Google Scholar]
  47. Cunningham CL. Alcohol as a cue for extinction: State dependency produced by conditioned inhibition. Animal Learning & Behavior. 1979;7:45–52. [Google Scholar]
  48. Davis M. NMDA receptors and fear extinction: Implications for cognitive behavioral therapy. Dialogues in Clinical Neuroscience. 2011;13:463–474. doi: 10.31887/DCNS.2011.13.4/mdavis. [DOI] [PMC free article] [PubMed] [Google Scholar]
  49. Delamater AR. Effects of several extinction treatments upon the integrity of Pavlovian stimulus-outcome associations. Animal Learning & Behavior. 1996;24:437–449. [Google Scholar]
  50. Delamater AR. Experimental extinction in Pavlovian conditioning: Behavioural and neuroscience perspectives. Quarterly Journal of Experimental Psychology B: Comparative and Physiological Psychology. 2004;57:97–132. doi: 10.1080/02724990344000097. [DOI] [PubMed] [Google Scholar]
  51. Delamater AR. Issues in the extinction of specific stimulus-outcome associations in Pavlovian conditioning. Behavioural Processes. 2012;90:9–19. doi: 10.1016/j.beproc.2012.03.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  52. Delamater AR, Campese V, Westbrook RF. Renewal and spontaneous recovery, but not latent inhibition, are mediated by gamma-aminobutyric acid in appetitive conditioning. Journal of Experimental Psychology: Animal Behavior Processes. 2009;35:224–237. doi: 10.1037/a0013293. [DOI] [PubMed] [Google Scholar]
  53. de Wit H, Stewart J. Reinstatement of cocaine-reinforced responding in the rat. Psychopharmacology. 1981;75:134–143. doi: 10.1007/BF00432175. [DOI] [PubMed] [Google Scholar]
  54. de Wit H, Stewart J. Drug reinstatement of heroin-reinforced responding in the rat. Psychopharmacology. 1983;79:29–31. doi: 10.1007/BF00433012. [DOI] [PubMed] [Google Scholar]
  55. Everitt BJ, Robbins TW. Neural systems of reinforcement for drug addiction: from actions to habits to compulsion. Nature Neuroscience. 2005;8:1481–1489. doi: 10.1038/nn1579. [DOI] [PubMed] [Google Scholar]
  56. Falls WA, Miserendino MJD, Davis M. Extinction of fear-potentiated startle: Blockade by infusion of an NMDA antagonist into the amygdala. The Journal of Neuroscience. 1992;12:854–863. doi: 10.1523/JNEUROSCI.12-03-00854.1992. [DOI] [PMC free article] [PubMed] [Google Scholar]
  57. Flagel SB, Akil H, Robinson TE. Individual differences in the attribution of incentive salience to reward-related cues: Implications for addiction. Neuropharmacology. 2009;56:139–148. doi: 10.1016/j.neuropharm.2008.06.027. [DOI] [PMC free article] [PubMed] [Google Scholar]
  58. Flavell CR, Barber DJ, Lee JLC. Behavioural memory reconsolidation of food and fear memories. Nature Communications. 2011;2:504. doi: 10.1038/ncomms1515. [DOI] [PMC free article] [PubMed] [Google Scholar]
  59. Fox GD, Holland PC. Neurotoxic hippocampal lesions fail to impair reinstatement of an appetitively conditioned response. Behavioral Neuroscience. 1998;112:255–260. [PubMed] [Google Scholar]
  60. Frey PW, Butler CS. Extinction after aversive conditioning: An associative or nonassociative process? Learning and Motivation. 1977;8:1–17. [Google Scholar]
  61. Frohardt RJ, Guarraci FA, Bouton ME. The effects of neurotoxic hippocampal lesions on two effects of context after fear extinction. Behavioral Neuroscience. 2000;114:227–240. doi: 10.1037//0735-7044.114.2.227. [DOI] [PubMed] [Google Scholar]
  62. Fuchs RA, Eaddy JL, Su ZI, Bell G. Interactions of the basolateral amygdala with the dorsal hippocampus and dorsomedial prefrontal cortex regulate drug-context-induced cocaine seeking in rats. European Journal of Neuroscience. 2007;26:487–498. doi: 10.1111/j.1460-9568.2007.05674.x. [DOI] [PubMed] [Google Scholar]
  63. Fuchs RA, Evans KA, Parker MP, See RE. Differential involvement of the core and shell subregions of the nucleus accumbens in conditioned cue-induced reinstatement of cocaine seeking in rats. Psychopharmacology. 2004;176:459–465. doi: 10.1007/s00213-004-1895-6. [DOI] [PubMed] [Google Scholar]
  64. Fuchs RA, Ramirez DR, Bell GH. Nucleus accumbens shell and core involvement in drug context-induced reinstatement of cocaine seeking in rats. Psychopharmacology. 2008;200:545–556. doi: 10.1007/s00213-008-1234-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  65. Hamlin AS, Clemens KJ, McNally GP. Renewal of extinguished cocaine-seeking. Neuroscience. 2008;151:659–670. doi: 10.1016/j.neuroscience.2007.11.018. [DOI] [PubMed] [Google Scholar]
  66. Hamlin AS, Newby J, McNally GP. The neural correlates and role of D1 dopamine receptors in renewal of extinguished alcohol-seeking. Neuroscience. 2007;146:525–536. doi: 10.1016/j.neuroscience.2007.01.063. [DOI] [PubMed] [Google Scholar]
  67. Harris JA, Jones ML, Bailey GK, Westbrook RF. Contextual control over conditioned responding in an extinction paradigm. Journal of Experimental Psychology: Animal Behavior Processes. 2000;26:174–185. doi: 10.1037//0097-7403.26.2.174. [DOI] [PubMed] [Google Scholar]
  68. Hart G, Harris JA, Westbrook RF. Systemic or intra-amygdala injection of a benzodiazepine (midazolam) impairs extinction but spares re-extinction of conditioned fear responses. Learning & Memory. 2009;16:53–61. doi: 10.1101/lm.1154409. [DOI] [PubMed] [Google Scholar]
  69. Hobin JA, Ji J, Maren S. Ventral hippocampal muscimol disrupts context-specific fear memory retrieval after extinction in rats. Hippocampus. 2006;16:174–182. doi: 10.1002/hipo.20144. [DOI] [PubMed] [Google Scholar]
  70. Holland PC. Occasion setting in Pavlovian conditioning. In: Medin DL, editor. The psychology of learning and motivation. Vol. 28. New York: Academic Press; 1992. pp. 69–125. [Google Scholar]
  71. Holland PC, Coldwell SE. Transfer of inhibitory stimulus control in operant feature-negative discrimination. Learning and Motivation. 1993;24:345–375. [Google Scholar]
  72. Hollerman JR, Schultz W. Dopamine neurons report an error in the temporal prediction of reward during learning. Nature Neuroscience. 1998;1:304–309. doi: 10.1038/1124. [DOI] [PubMed] [Google Scholar]
  73. Holtzman-Assif O, Laurent V, Westbrook RF. Blockade of dopamine activity in the nucleus accumbens impairs learning extinction of conditioned fear. Learning & Memory. 2010;17:71–75. doi: 10.1101/lm.1668310. [DOI] [PubMed] [Google Scholar]
  74. Ishii D, Matsuzawa D, Matsuda S, Tomizawa H, Sutoh C, Shimizu E. No erasure effect of retrieval-extinction trial on fear memory in the hippocampus-independent and dependent paradigms. Neuroscience Letters. 2012;523:76–81. doi: 10.1016/j.neulet.2012.06.048. [DOI] [PubMed] [Google Scholar]
  75. Janak PH, Bowers MS, Corbit LH. Compound stimulus presentation and the norepinephrine reuptake inhibitor atomoxetine enhance long-term extinction of cocaine-seeking behavior. Neuropsychopharmacology. 2012;37:975–985. doi: 10.1038/npp.2011.281. [DOI] [PMC free article] [PubMed] [Google Scholar]
  76. Kamboj SK, Joye A, Das RK, Gibson AJ, Morgan CJ, Curran HV. Cue exposure and response prevention with heavy smokers: A laboratory-based randomised placebo-controlled trial examining the effects of D-cycloserine on cue re-activity and attentional bias. Psychopharmacology. 2012;22:273–284. doi: 10.1007/s00213-011-2571-2. [DOI] [PubMed] [Google Scholar]
  77. Kamboj SK, Massey-Chase R, Rodney L, Das R, Almahdi B, Curran HV, Morgan CJA. Changes in cue reactivity and attentional bias following experimental cue exposure and response prevention: A laboratory study of the effects of D-cycloserine in heavy drinkers. Psychopharmacology. 2011;217:25–37. doi: 10.1007/s00213-011-2254-z. [DOI] [PubMed] [Google Scholar]
  78. Kearns DN, Tunstall BJ, Weiss SJ. Deepened extinction of cocaine cues. Drug and Alcohol Dependence. 2012;124:283–287. doi: 10.1016/j.drugalcdep.2012.01.024. [DOI] [PMC free article] [PubMed] [Google Scholar]
  79. Kim J, Lee S, Park K, Hong I, Song B, Son G, Park H, Kim WR, Park E, Choe HK, Kim H, Lee C, Sun W, Kim K, Shin KS, Choi S. Amygdala depotentiation and fear extinction. Proceedings of the National Academy of Sciences. 2007;104:20955–20960. doi: 10.1073/pnas.0710548105. [DOI] [PMC free article] [PubMed] [Google Scholar]
  80. Kindt M, Soeter M. Reconsolidation in a human fear conditioning study: A test of extinction as updating mechanism. Biological Psychology. 2013;92:43–50. doi: 10.1016/j.biopsycho.2011.09.016. [DOI] [PubMed] [Google Scholar]
  81. Kindt M, Soeter M, Vervliet B. Beyond extinction: erasing human fear responses and preventing the return of fear. Nature Neuroscience. 2009;12:256–258. doi: 10.1038/nn.2271. [DOI] [PubMed] [Google Scholar]
  82. Kremer EF. The Rescorla-Wagner model: Losses in associative strength in compound conditioned stimuli. Journal of Experimental Psychology: Animal Behavior Processes. 1978;4:22–36. doi: 10.1037//0097-7403.4.1.22. [DOI] [PubMed] [Google Scholar]
  83. Laborda MA, McConnell BL, Miller RR. Behavioral techniques to reduce relapse after exposure therapy. In: Schachtman TR, Reilly S, editors. Associative learning and conditioning theory: Human and non-human applications. New York, NY: Oxford University Press; 2011. pp. 79–103. [Google Scholar]
  84. Laborda MA, Witnauer JE, Miller RR. Contrasting AAC and ABC renewal: the role of context associations. Learning & Behavior. 2011;39:46–56. doi: 10.3758/s13420-010-0007-1. [DOI] [PubMed] [Google Scholar]
  85. Langton JM, Richardson R. The effect of D-cycloserine on immediate vs. delayed extinction of learned fear. Learning & Memory. 2010;17:547–551. doi: 10.1101/lm.1927310. [DOI] [PubMed] [Google Scholar]
  86. Lattal KM. Effects of ethanol on the encoding, consolidation, and expression of extinction following contextual fear conditioning. Behavioral Neuroscience. 2007;121:1280–1292. doi: 10.1037/0735-7044.121.6.1280. [DOI] [PMC free article] [PubMed] [Google Scholar]
  87. Lattal KM, Nakajima S. Overexpectation in appetitive Pavlovian and instrumental conditioning. Animal Learning & Behavior. 1998;26:351–360. [Google Scholar]
  88. Lattal KM, Stafford JM. What does it take to demonstrate memory erasure? Theoretical comment on Norrholm et al. (2008) Behavioral Neuroscience. 2008;122:1186–1190. doi: 10.1037/a0012993. [DOI] [PMC free article] [PubMed] [Google Scholar]
  89. Laurent V, Westbrook FR. Inactivation of the infralimbic but not the prelimbic cortex impairs consolidation and retrieval of fear extinction. Learning & Memory. 2009;16:520–529. doi: 10.1101/lm.1474609. [DOI] [PubMed] [Google Scholar]
  90. Ledgerwood L, Richardson R, Cranney J. Effects of D-cycloserine on extinction of conditioned freezing. Behavioral Neuroscience. 2003;117:341–349. doi: 10.1037/0735-7044.117.2.341. [DOI] [PubMed] [Google Scholar]
  91. Lee JC, Milton AL, Everitt BJ. Reconsolidation and extinction of conditioned fear: Inhibition and potentiation. The Journal of Neuroscience. 2006;26:10051–10056. doi: 10.1523/JNEUROSCI.2466-06.2006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  92. Leitenberg H, Rawson RA, Bath K. Reinforcement of competing behavior during extinction. Science. 1970;169:301–303. doi: 10.1126/science.169.3942.301. [DOI] [PubMed] [Google Scholar]
  93. Leitenberg H, Rawson RA, Mulick JA. Extinction and reinforcement of alternative behavior. Journal of Comparative and Physiological Psychology. 1975;88:640–652. [Google Scholar]
  94. Leslie JC, Norwood K. Facilitation of extinction and re-extinction of operant behavior in mice by chlordiazepoxide and D-cycloserine. Neurobiology of Learning and Memory. 2013;102:1–6. doi: 10.1016/j.nlm.2013.02.002. [DOI] [PubMed] [Google Scholar]
  95. Leung HT, Reeks LM, Westbrook RF. Two ways to deepen extinction and the difference between them. Journal of Experimental Psychology: Animal Behavior Processes. 2012;38:394–406. doi: 10.1037/a0030201. [DOI] [PubMed] [Google Scholar]
  96. Lin CH, Yeh SH, Lu HY, Gean PW. The similarities and diversities of signal pathways leading to consolidation of conditioning and consolidation of extinction of fear memory. Journal of Neuroscience. 2003;23:8310–8317. doi: 10.1523/JNEUROSCI.23-23-08310.2003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  97. Litz BT, Salters-Pedneault K, Steenkamp MM, Hermos JA, Bryant RA, Otto MW, Hofmann SG. A randomized placebo-controlled trial of d-cycloserine and exposure therapy for posttraumatic stress disorder. Journal of Psychiatric Research. 2012;46:1184–1190. doi: 10.1016/j.jpsychires.2012.05.006. [DOI] [PubMed] [Google Scholar]
  98. Mackintosh NJ. The psychology of animal learning. London: Academic Press; 1974. [Google Scholar]
  99. Marchant NJ, Li X, Shaham Y. Recent developments in animal models of drug relapse. Current Opinion in Neurobiology. 2013 doi: 10.1016/j.conb.2013.01.003. in press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  100. Marchant NJ, Millan EZ, McNally GP. The hypothalamus and the neurobiology of drug seeking. Cellular and Molecular Life Sciences. 2012;69:581–597. doi: 10.1007/s00018-011-0817-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  101. McConnell BL, Miguez G, Miller RR. Extinction with multiple excitors. Learning & Behavior. 2013;41:119–137. doi: 10.3758/s13420-012-0090-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  102. McFarland K, Kalivas PW. The circuitry mediating cocaine-induced reinstatement of drug-seeking behavior. The Journal of Neuroscience. 2001;21:8655–8663. doi: 10.1523/JNEUROSCI.21-21-08655.2001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  103. McNally GP, Pigg M, Weidemann G. Opioid receptors in the midbrain periaqueductal gray regulate extinction of pavlovian fear conditioning. Journal of Neuroscience. 2004a;24:6912–6919. doi: 10.1523/JNEUROSCI.1828-04.2004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  104. McNally GP, Pigg M, Weidemann G. Blocking, unblocking, and overexpectation of fear: a role for opioid receptors in the regulation of Pavlovian association formation. Behavioral Neuroscience. 2004b;118:111–120. doi: 10.1037/0735-7044.118.1.111. [DOI] [PubMed] [Google Scholar]
  105. McNally GP, Westbrook RF. Opioid receptors regulate the extinction of Pavlovian fear conditioning. Behavioral Neuroscience. 2003;117:1292–1301. doi: 10.1037/0735-7044.117.6.1292. [DOI] [PubMed] [Google Scholar]
  106. Millan EZ, Marchant NJ, McNally GP. Extinction of drug seeking. Behavioural Brain Research. 2011;217:454–462. doi: 10.1016/j.bbr.2010.10.037. [DOI] [PubMed] [Google Scholar]
  107. Millan EZ, McNally GP. Accumbens shell AMPA receptors mediate expression of extinguished reward seeking through interactions with basolateral amygdala. Learning & Memory. 2011;18:414–421. doi: 10.1101/lm.2144411. [DOI] [PubMed] [Google Scholar]
  108. Millan EZ, McNally GP. Cocaine- and amphetamine-regulated transcript in the nucleus accumbens shell attenuates context-induced reinstatement of alcohol seeking. Behavioral Neuroscience. 2012;126:690–698. doi: 10.1037/a0029953. [DOI] [PubMed] [Google Scholar]
  109. Millan EZ, Milligan-Saville J, McNally GP. Memory retrieval, extinction, and reinstatement of alcohol seeking. Neurobiology of Learning and Memory. 2013;101:26–32. doi: 10.1016/j.nlm.2012.12.010. [DOI] [PubMed] [Google Scholar]
  110. Miller RR, Escobar M. Associative interference between cues and outcomes presented together and presented apart: An integration. Behavioural Processes. 2002;57:163–185. doi: 10.1016/s0376-6357(02)00012-8. [DOI] [PubMed] [Google Scholar]
  111. Milton AL, Everitt BJ. The persistence of maladaptive memory: Addiction, drug memories, and anti-relapse treatments. Neuroscience & Biobehavioral Reviews. 2012;36:119–139. doi: 10.1016/j.neubiorev.2012.01.002. [DOI] [PubMed] [Google Scholar]
  112. Misanin JR, Miller RR, Lewis DJ. Retrograde amnesia produced by electroconvulsive shock after reactivation of a consolidated memory trace. Science. 1968;160:554–555. doi: 10.1126/science.160.3827.554. [DOI] [PubMed] [Google Scholar]
  113. Monfils MH, Cowansage KK, Klann E, LeDoux JE. Extinction-reconsolidation boundaries: Key to persistent attenuation of fear memories. Science. 2009;324:951–955. doi: 10.1126/science.1167975. [DOI] [PMC free article] [PubMed] [Google Scholar]
  114. Morell JR, Holland PC. Summation and transfer of negative occasion setting. Animal Learning & Behavior. 1993;21:145–153. [Google Scholar]
  115. Myers KM, Carlezon WA. D-cycloserine facilitates extinction of naloxone-induced conditioned place aversion in morphine-dependent rats. Biological Psychiatry. 2010;67:85–87. doi: 10.1016/j.biopsych.2009.08.015. [DOI] [PMC free article] [PubMed] [Google Scholar]
  116. Myers KM, Davis M. Behavioral and neural analysis of extinction. Neuron. 2002;36:567–584. doi: 10.1016/s0896-6273(02)01064-4. [DOI] [PubMed] [Google Scholar]
  117. Nader K. Memory traces unbound. Trends in Neurosciences. 2003;26:65–72. doi: 10.1016/S0166-2236(02)00042-5. [DOI] [PubMed] [Google Scholar]
  118. Nader K, Hardt O. A single standard for memory: the case for reconsolidation. Nature Reviews Neuroscience. 2009;10:224–234. doi: 10.1038/nrn2590. [DOI] [PubMed] [Google Scholar]
  119. Nader K, Schafe GE, LeDoux JE. Fear memories require protein synthesis in the amygdala for reconsolidation after retrieval. Nature. 2000;406:722–726. doi: 10.1038/35021052. [DOI] [PubMed] [Google Scholar]
  120. Nakajima S, Tanaka S, Urushihara K, Imada H. Renewal of extinguished lever-press responses upon return to the training context. Learning & Motivation. 2000;31:416–431. [Google Scholar]
  121. Napier RM, Mcrae M, Kehoe EJ. Rapid reacquisition in conditioning of the rabbit’s nictitating membrane response. Journal of Experimental Psychology: Animal Behavior Processes. 1992;18:182–192. doi: 10.1037//0097-7403.18.2.182. [DOI] [PubMed] [Google Scholar]
  122. Nelson JB, Sanjuan MC, Vadillo-Ruiz S, Pérez J, León SP. Experimental renewal in human participants. Journal of Experimental Psychology: Animal Behavior Processes. 2011;37:58–70. doi: 10.1037/a0020519. [DOI] [PubMed] [Google Scholar]
  123. Nic Dhonnchadha BA, Szalay JJ, Achat-Mendes C, Platt DM, Otto MW, Spealman RD, Kantak KM. D-cycloserine deters reacquisition of cocaine self-administration by augmenting extinction learning. Neuropsychopharmacology. 2010;35:357–367. doi: 10.1038/npp.2009.139. [DOI] [PMC free article] [PubMed] [Google Scholar]
  124. Orsini CA, Maren S. Neural and cellular mechanisms of fear and extinction memory formation. Neuroscience and Behavioral Reviews. 2012;36:1773–1802. doi: 10.1016/j.neubiorev.2011.12.014. [DOI] [PMC free article] [PubMed] [Google Scholar]
  125. Orsini CA, Kim JH, Knapska E, Maren S. Hippocampal and prefrontal projections to the basal amygdala mediate contextual regulation of fear after extinction. The Journal of Neuroscience. 2011;31:17269–17277. doi: 10.1523/JNEUROSCI.4095-11.2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  126. Ostlund SB, Balleine BW. Selective reinstatement of instrumental performance depends on the discriminative stimulus properties of the mediating outcome. Learning & Behavior. 2007;35:43–52. doi: 10.3758/bf03196073. [DOI] [PubMed] [Google Scholar]
  127. Otto MW, Tolin DF, Simon NM, Pearlson GD, Basden S, Meunier SA, Hofmann SG, Eisenmenger K, Krystal JH, Pollack MH. Efficacy of d-cycloserine for enhancing response to cognitive-behavior therapy for panic disorder. Biological Psychiatry. 2010;67:365–370. doi: 10.1016/j.biopsych.2009.07.036. [DOI] [PubMed] [Google Scholar]
  128. Paolone G, Botreau F, Stewart J. The facilitative effects of D-cylcoserine on extinction of a cocaine-induced conditioned place preference can be long lasting and resistant to reinstatement. Psychopharmacology. 2009;202:403–409. doi: 10.1007/s00213-008-1280-y. [DOI] [PubMed] [Google Scholar]
  129. Pavlov IP. In: Conditioned reflexes. Anrep GV, translator. London: Oxford University Press; 1927. [Google Scholar]
  130. Pearce JM, Hall G. The influence of context-reinforcer associations on instrumental performance. Animal Learning & Behavior. 1979;7:504–508. [Google Scholar]
  131. Pearce JM, Hall G. A model for Pavlovian learning: Variations in the effectiveness of conditioned but not of unconditioned stimuli. Psychological Review. 1980;87:532–552. [PubMed] [Google Scholar]
  132. Perry CJ, McNally GP. Naloxone prevents the rapid reacquisition but not acquisition of alcohol seeking. Behavioral Neuroscience. 2012;126:599–604. doi: 10.1037/a0029079. [DOI] [PubMed] [Google Scholar]
  133. Peters J, De Vries TJ. D-cycloserine administered directly to infralimbic medial prefrontal cortex enhances extinction memory in sucrose-seeking animals. Neuroscience. 2013;230:24–30. doi: 10.1016/j.neuroscience.2012.11.004. [DOI] [PubMed] [Google Scholar]
  134. Peters J, Kalivas PW, Quirk GJ. Extinction circuits for fear and addiction overlap in the prefrontal cortex. Learning & Memory. 2009;16:279–288. doi: 10.1101/lm.1041309. [DOI] [PMC free article] [PubMed] [Google Scholar]
  135. Peters J, LaLumiere RT, Kalivas PW. Infralimbic prefrontal cortex is responsible for inhibiting cocaine seeking in extinguished rats. The Journal of Neuroscience. 2008;28:6046–6053. doi: 10.1523/JNEUROSCI.1045-08.2008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  136. Podlesnik CA, Jimenez-Gomez C, Shahan TA. Resurgence of alcohol seeking produced by discontinuing non-drug reinforcement as an animal model of drug relapse. Behavioural Pharmacology. 2006;17:369–374. doi: 10.1097/01.fbp.0000224385.09486.ba. [DOI] [PubMed] [Google Scholar]
  137. Podlesnik CA, Shahan TA. Behavioral momentum and relapse of extinguished operant responding. Learning & Behavior. 2009;37:357–364. doi: 10.3758/LB.37.4.357. [DOI] [PMC free article] [PubMed] [Google Scholar]
  138. Polack CW, Laborda MA, Miller RR. Extinction context as a conditioned inhibitor. Learning & Behavior. 2011;40:24–33. doi: 10.3758/s13420-011-0039-1. [DOI] [PubMed] [Google Scholar]
  139. Price KL, Baker NL, McRae-Clark AL, Saladin ME, DeSantis SM, Santa Ana EJ, Brady KT. A randomized, placebo-controlled laboratory study of the effects of D-cycloserine on craving in cocaine-dependent individuals. Psychopharmacology. 2013;226:739–746. doi: 10.1007/s00213-011-2592-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  140. Pussinen R, Niememinen S, Koivisto E, Haapalinna A, Riekkinen S, Sirvio J. Enhancement of intermediate-term memory by an α1 agonist or a partial agonist at the glycine site of the NMDA receptor. Neurobiology of Learning and Memory. 1997;67:69–74. doi: 10.1006/nlme.1996.3738. [DOI] [PubMed] [Google Scholar]
  141. Quartermain D, Mower J, Rafferty MF, Herting RL, Lanthorn TH. Acute but not chronic activation of the NMDA-coupled glycine receptor with D-cycloserine facilitates learning and retention. European Journal of Pharmacology. 1994;257:7–12. doi: 10.1016/0014-2999(94)90687-4. [DOI] [PubMed] [Google Scholar]
  142. Quick SL, Pyszczynski AD, Colston KA, Shahan TA. Loss of alternative non-drug reinforcement induces relapse of cocaine-seeking in rats: Role of dopamine D1 receptors. Neuropsychopharmacology. 2011;36:1015–1020. doi: 10.1038/npp.2010.239. [DOI] [PMC free article] [PubMed] [Google Scholar]
  143. Quirk GJ, Mueller D. Neural mechanisms of extinction learning and retrieval. Neuropsychopharmacology Reviews. 2008;33:56–72. doi: 10.1038/sj.npp.1301555. [DOI] [PMC free article] [PubMed] [Google Scholar]
  144. Reid RL. The role of the reinforcer as a stimulus. British Journal of Psychology. 1958;49:202–209. doi: 10.1111/j.2044-8295.1958.tb00658.x. [DOI] [PubMed] [Google Scholar]
  145. Rescorla RA. Inhibitory associations between S and R in extinction. Animal Learning & Behavior. 1993;21:327–336. [Google Scholar]
  146. Rescorla RA. Preservation of Pavlovian associations through extinction. Quarterly Journal of Experimental Psychology. 1996;49B:245–258. [Google Scholar]
  147. Rescorla RA. Response inhibition in extinction. The Quarterly Journal of Experimental Psychology. 1997;50B:238–252. [Google Scholar]
  148. Rescorla RA. Extinction can be enhanced by a concurrent excitor. Journal of Experimental Psychology: Animal Behavior Processes. 2000;26:251–261. doi: 10.1037//0097-7403.26.3.251. [DOI] [PubMed] [Google Scholar]
  149. Rescorla RA. Protection from extinction. Learning & Behavior. 2003;31:124–132. doi: 10.3758/bf03195975. [DOI] [PubMed] [Google Scholar]
  150. Rescorla RA. Deepened extinction from compound stimulus presentation. Journal of Experimental Psychology: Animal Behavior Processes. 2006;32:135–144. doi: 10.1037/0097-7403.32.2.135. [DOI] [PubMed] [Google Scholar]
  151. Rescorla RA. Spontaneous recovery from overexpectation. Learning & Behavior. 2006;34:13–20. doi: 10.3758/bf03192867. [DOI] [PubMed] [Google Scholar]
  152. Rescorla RA. Renewal after overexpectation. Learning & Behavior. 2007;35:19–26. doi: 10.3758/bf03196070. [DOI] [PubMed] [Google Scholar]
  153. Rescorla RA. Within-subject renewal in sign tracking. Quarterly Journal of Experimental Psychology. 2008;61:1792–1802. doi: 10.1080/17470210701790099. [DOI] [PubMed] [Google Scholar]
  154. Rescorla RA, Cunningham CL. The erasure of conditioned fear. Animal Learning & Behavior. 1977;5:386–394. [Google Scholar]
  155. Rescorla RA, Heth CD. Reinstatement of fear to an extinguished conditioned stimulus. Journal of Experimental Psychology: Animal Behavior Processes. 1975;104:88–96. [PubMed] [Google Scholar]
  156. Rescorla RA, Skucy JC. Effect of response-independent reinforcers during extinction. Journal of Comparative and Physiological Psychology. 1969;67:381–389. [Google Scholar]
  157. Rescorla RA, Wagner AR. A theory of Pavlovian conditioning: Variations in the effectiveness of reinforcement and nonreinforcement. In: Black AH, Prokasy WF, editors. Classical conditioning II: current research and theory. New York: Appleton-Century-Crofts; 1972. pp. 64–99. [Google Scholar]
  158. Ressler KJ, Rothbaum BO, Tannenbaum L, Anderson P, Graap K, Zimand E, Hodges E, Davis M. Cognitive enhancers as adjuncts to psychotherapy: Use of D-cycloserine in phobic individuals to facilitate extinction of fear. Archives of General Psychiatry. 2004;61:1136–1144. doi: 10.1001/archpsyc.61.11.1136. [DOI] [PubMed] [Google Scholar]
  159. Rosas JM, Todd TP, Bouton ME. Context change and associative learning. Wiley Interdisciplinary Reviews: Cognitive Science. 2013;4:237–244. doi: 10.1002/wcs.1225. [DOI] [PMC free article] [PubMed] [Google Scholar]
  160. Santini E, Muller RU, Quirk GJ. Consolidation of extinction learning involves transfer from NMDA-independent to NMDA-dependent memory. The Journal of Neuroscience. 2001;21:9009–9017. doi: 10.1523/JNEUROSCI.21-22-09009.2001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  161. Schiller D, Monfils MH, Raio CM, Johnson D, LeDoux JE, Phelps EA. Blocking the return of fear in humans using reconsolidation update mechanisms. Nature. 2010;463:49–53. doi: 10.1038/nature08637. [DOI] [PMC free article] [PubMed] [Google Scholar]
  162. Schmajuk NA, Lamoureux JA, Holland PC. Occasion setting: A neural network approach. Psychological Review. 1998;105:3–32. doi: 10.1037/0033-295x.105.1.3. [DOI] [PubMed] [Google Scholar]
  163. Schwendt M, Reichel CM, See RE. Extinction-dependent alterations in corticostriatal mGluR2/3 and mGluR7 receptors following chronic methamphetamine self-administration in rats. PLoS ONE. 2012;7:e34299. doi: 10.1371/journal.pone.0034299. [DOI] [PMC free article] [PubMed] [Google Scholar]
  164. Shahan TA, Sweeney MM. A model of resurgence based on behavioral momentum theory. Journal of Experimental Analysis of Behavior. 2011;95:91–108. doi: 10.1901/jeab.2011.95-91. [DOI] [PMC free article] [PubMed] [Google Scholar]
  165. Shalev U, Highfield D, Yap J, Shaham Y. Stress and relapse to drug seeking in rats: Studies on the generality of the effect. Psychopharmacology. 2000;150:337–346. doi: 10.1007/s002130000441. [DOI] [PubMed] [Google Scholar]
  166. Shaw D, Norwood K, Sharp K, Quigley L, McGovern SF, Leslie JC. Facilitation of extinction of operant behaviour in mice by D-cycloserine. Psychopharmacology. 2009;202:397–402. doi: 10.1007/s00213-008-1312-7. [DOI] [PubMed] [Google Scholar]
  167. Smits JAJ, Rosenfield D, Otto MW, Powers MB, Hofmann SG, Telch MJ, Pollack MH, Tart CD. D-cycloserine enhancement of fear extinction is specific to successful exposure sessions: Evidence from the treatment of height phobia. Biological Psychiatry. 2013 doi: 10.1016/j.biopsych.2012.12.009. in press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  168. Soeter M, Kindt M. Disrupting reconsolidation: Pharmacological and behavioral manipulations. Learning & Memory. 2011;18:357–366. doi: 10.1101/lm.2148511. [DOI] [PubMed] [Google Scholar]
  169. Soeter M, Kindt M. Stimulation of the noradrenergic system during memory formation impairs extinction learning but not the disruption of reconsolidation. Neuropsychopharmacology. 2012;37:1204–1215. doi: 10.1038/npp.2011.307. [DOI] [PMC free article] [PubMed] [Google Scholar]
  170. Soltysik SS, Wolfe GE, Nicholas T, Wilson J, Garcia-Sanchez JL. Blocking of inhibitory conditioning within a serial conditioned stimulus-conditioned inhibitor compound: Maintenance of acquired behavior without an unconditioned stimulus. Learning and Motivation. 1983;14:1–29. [Google Scholar]
  171. Spence KW. Extinction of the human eyelid CR as a function of presence or absence of the UCS during extinction. Journal of Experimental Psychology. 1966;71:642–648. doi: 10.1037/h0023108. [DOI] [PubMed] [Google Scholar]
  172. Sutton MA, Schmidt EF, Choi KH, Schad CA, Whisler K, Simmons D, Karanian DA, Monteggia LM, Neve RL, Self DW. Extinction-induced upregulation in AMPA receptors reduces cocaine-seeking behaviour. Nature. 2003;421:70–75. doi: 10.1038/nature01249. [DOI] [PubMed] [Google Scholar]
  173. Sweeney MM, Shahan TA. Effects of high, low, and thinning rates of alternative reinforcement on response elimination and resurgence. Journal of the Experimental Analysis of Behavior. 2013;100:102–116. doi: 10.1002/jeab.26. [DOI] [PubMed] [Google Scholar]
  174. Thanos PK, Bermeo C, Wang GJ, Volkow ND. D-cycloserine facilitates extinction of cocaine self-administration in rats. Synapse. 2011a;65:938–944. doi: 10.1002/syn.20922. [DOI] [PMC free article] [PubMed] [Google Scholar]
  175. Thanos PK, Subrize M, Lui W, Puca Z, Ananth M, Michaelides M, Wang GJ, Volkow ND. D-cycloserine facilitates extinction of cocaine self-administration in C57 mice. Synapse. 2011b;65:1099–1105. doi: 10.1002/syn.20944. [DOI] [PMC free article] [PubMed] [Google Scholar]
  176. Thompson LT, Disterhoft JF. Age- and dose-dependent facilitation of associative eyeblink conditioning by D-cycloserine in rabbits. Behavioral Neuroscience. 1997;111:1303–1312. doi: 10.1037//0735-7044.111.6.1303. [DOI] [PubMed] [Google Scholar]
  177. Todd TP. Mechanisms of renewal after the extinction of instrumental behavior. Journal of Experimental Psychology: Animal Behavior Processes. 2013;39:193–207. doi: 10.1037/a0032236. [DOI] [PMC free article] [PubMed] [Google Scholar]
  178. Todd TP, Vurbic D, Bouton ME. Renewal after the extinction of discriminated operant behavior: Role of context-specific response inhibition. (submitted) [Google Scholar]
  179. Todd TP, Winterbauer NE, Bouton ME. Contextual control of appetite: Renewal of inhibited food-seeking behavior in sated rats after extinction. Appetite. 2012a;58:484–489. doi: 10.1016/j.appet.2011.12.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  180. Todd TP, Winterbauer NE, Bouton ME. Effects of the amount of acquisition and contextual generalization on the renewal of instrumental behavior after extinction. Learning & Behavior. 2012b;40:145–157. doi: 10.3758/s13420-011-0051-5. [DOI] [PubMed] [Google Scholar]
  181. Torregrossa MM, Sanchez H, Taylor JR. D-cycloserine reduces the context specificity of Pavlovian extinction of cocaine cues through actions in the nucleus accumbens. The Journal of Neuroscience. 2010;30:10526–10533. doi: 10.1523/JNEUROSCI.2523-10.2010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  182. Urcelay GP, Lipatova O, Miller RR. Constraints on enhanced extinction resulting from extinction treatment in the presence of an added excitor. Learning and Motivation. 2009;40:343–363. doi: 10.1016/j.lmot.2009.04.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  183. Vengeliene V, Kiefer F, Spanagel R. D-cycloserine facilitates extinction of conditioned alcohol-seeking behavior in rats. Alcohol & Alcoholism. 2008;43:626–629. doi: 10.1093/alcalc/agn067. [DOI] [PubMed] [Google Scholar]
  184. Vervliet B, Vansteenwegen D, Hermans D, Eelen P. Concurrent excitors limit the extinction of conditioned fear in humans. Behaviour Research and Therapy. 2007;45:375–383. doi: 10.1016/j.brat.2006.01.009. [DOI] [PubMed] [Google Scholar]
  185. Vurbic D, Gold B, Bouton ME. Effects of d-cycloserine on the extinction of appetitive operant learning. Behavioral Neuroscience. 2011;125:551–559. doi: 10.1037/a0024403. [DOI] [PMC free article] [PubMed] [Google Scholar]
  186. Waelti P, Dickinson A, Schultz W. Dopamine responses comply with basic assumptions of formal learning theory. Nature. 2001;412:43–48. doi: 10.1038/35083500. [DOI] [PubMed] [Google Scholar]
  187. Wagner AR. Stimulus selection and a “modified continuity theory”. In: Bower GH, Spence JT, editors. The psychology of learning and motivation. Vol. 3. New York: Academic Press; 1969. [Google Scholar]
  188. Wagner AR. SOP: A model of automatic memory processing in animal behavior. In: Spear NE, Miller RR, editors. Information processing in animals: Memory mechanisms. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc; 1981. pp. 5–47. [Google Scholar]
  189. Walker DL, Ressler KJ, Lu KT, Davis M. Facilitation of conditioned fear extinction by systemic administration or intra-amygdala infusion of D-cycloserine as assessed with fear-potentiated startle in rats. The Journal of Neuroscience. 2002;22:2343–2351. doi: 10.1523/JNEUROSCI.22-06-02343.2002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  190. Weber M, Hart J, Richardson R. Effects of D-cycloserine on extinction of learned fear to an olfactory cue. Neurobiology of Learning and Memory. 2007;87:476–482. doi: 10.1016/j.nlm.2006.12.010. [DOI] [PubMed] [Google Scholar]
  191. Weerts EM, Kaminski BJ, Griffiths RR. Stable low-rate midazolam self-injection with concurrent physical dependence under conditions of long-term continuous availability in baboons. Psychopharmacology. 1998;135:70–81. doi: 10.1007/s002130050487. [DOI] [PubMed] [Google Scholar]
  192. Willcocks AL, McNally GP. The role of context in re-acquisition of extinguished alcoholic beer-seeking. Behavioral Neuroscience. 2011;125:541–550. doi: 10.1037/a0024100. [DOI] [PubMed] [Google Scholar]
  193. Willcocks AL, McNally GP. The role of medial prefrontal cortex in extinction and reinstatement of alcohol seeking in rats. European Journal of Neuroscience. 2013;37:259–268. doi: 10.1111/ejn.12031. [DOI] [PubMed] [Google Scholar]
  194. Wilson A, Brooks DC, Bouton ME. The role of the rat hippocampal system in several effects of context in extinction. Behavioral Neuroscience. 1995;109:828–836. doi: 10.1037//0735-7044.109.5.828. [DOI] [PubMed] [Google Scholar]
  195. Winterbauer NE, Bouton ME. Mechanisms of resurgence of an extinguished operant response. Journal of Experimental Psychology: Animal Behavior Processes. 2010;36:343–353. doi: 10.1037/a0017365. [DOI] [PMC free article] [PubMed] [Google Scholar]
  196. Winterbauer NE, Bouton ME. Mechanisms of resurgence II: Response-contingent reinforcers can reinstate a second extinguished behavior. Learning and Motivation. 2011;42:154–164. doi: 10.1016/j.lmot.2011.01.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  197. Winterbauer NE, Bouton ME. Effects of thinning the rate at which the alternative behavior is reinforced on resurgence of an operant response. Journal of Experimental Psychology: Animal Behavior Processes. 2012;38:279–291. doi: 10.1037/a0028853. [DOI] [PMC free article] [PubMed] [Google Scholar]
  198. Woods AM, Bouton ME. D-cycloserine facilitates extinction but does not eliminate renewal of the conditioned emotional response. Behavioral Neuroscience. 2006;120:1159–1162. doi: 10.1037/0735-7044.120.5.1159. [DOI] [PubMed] [Google Scholar]
  199. Woods AM, Bouton ME. Occasional reinforced responses during extinction can slow the rate of reacquisition of an operant response. Learning and Motivation. 2007;38:56–74. doi: 10.1016/j.lmot.2006.07.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  200. Xue Y, Luo Y, Wu P, Shi H, Xue L, Chen C, Zhu W, Ding Z, Bao Y, Shi J, Epstein DH, Shaham Y, Lu L. A memory retrieval-extinction procedure to prevent drug craving and relapse. Science. 2012;336:241–245. doi: 10.1126/science.1215070. [DOI] [PMC free article] [PubMed] [Google Scholar]
  201. Yoon JH, Newton TF, Haile CN, Bordnick PS, Fintzy RE, Culbertson C, Mahoney JJ, Hawkins RY, LaBounty KR, Ross EL, Aziziyeh AI, De La Garza R. Effects of D-cycloserine on cue-induced craving and cigarette smoking among concurrent cocaine- and nicotine-dependent volunteers. Addictive behaviors. 2013;38:1518–1526. doi: 10.1016/j.addbeh.2012.03.022. [DOI] [PMC free article] [PubMed] [Google Scholar]
  202. Zelikowsky M, Pham DL, Fanselow MS. Temporal factors control hippocampal contributions to fear renewal after extinction. Hippocampus. 2012;22:1096–1106. doi: 10.1002/hipo.20954. [DOI] [PMC free article] [PubMed] [Google Scholar]
  203. Zironi I, Burattini C, Aicardi G, Janak PH. Context is a trigger for relapse to alcohol. Behavioural Brain Research. 2006;167:150–155. doi: 10.1016/j.bbr.2005.09.007. [DOI] [PubMed] [Google Scholar]
