Author manuscript; available in PMC: 2008 May 5.
Published in final edited form as: Cogn Emot. 2007;21(6):1212–1237. doi: 10.1080/02699930701438020

On the interdependence of cognition and emotion

Justin Storbeck 1, Gerald L Clore 1
PMCID: PMC2366118  NIHMSID: NIHMS40350  PMID: 18458789

Abstract

Affect and cognition have long been treated as independent entities, but in the current review we suggest that affect and cognition are in fact highly interdependent. We open the article by discussing three classic arguments for the independence of affect: (i) the affective independence hypothesis, that emotion is processed independently from cognition; (ii) the affective primacy hypothesis, that evaluative processing precedes semantic processing; and (iii) the affective automaticity hypothesis, that affectively potent stimuli commandeer attention and that evaluation is automatic. We argue that affect is not independent of cognition, that affect is not prior to cognition, and that affect is not automatically elicited. The second half of the paper discusses several instances of how affect influences cognition. We review experiments showing affective involvement in perception, semantic activation, and attitude activation. We conclude that one function of affect is to regulate cognitive processing.


Different views of the relationship between cognition and emotion can be seen in the comments of two prominent psychologists upon receipt of distinguished scientist awards.

Robert Zajonc (1980, p. 151) in a paper titled “Preferences Need No Inferences”, proclaimed: “Affect and cognition … constitute independent sources of effects in information processing”. A year later, upon receipt of the same award, Gordon Bower (1981, p. 147), in a paper on “Mood and Memory”, stated, “I am a cognitive psychologist, and … the emotional effects we have found so far seem understandable to me in terms of ideas that are standard fare in cognitive psychology”. These two pivotal papers both argued for the importance of studying emotion, but they proposed different meta-theories. Zajonc argued that affect and cognition are processed independently and that affect has temporal priority over even basic cognitive processes. In contrast, Bower argued that cognitive processing could be used to understand emotional phenomena. We agree with Bower’s conception, and we extend it to suggest that cognitive processes are necessary for the processing, elicitation, and experience of emotions.

The concepts of “cognition” and “emotion” are, after all, simply abstractions for two aspects of one brain in the service of action. Zajonc believed that emotion is independent from cognition. Our own view is that the study of emotion and cognition should be integrated, because the phenomena themselves are integrated (Dewey, 1894; Parrott & Sabini, 1989). We argue against the notion that discrete emotions have separate and distinct areas in the brain (Duncan & Barrett, 2007 this issue). Rather, emotions emerge from a combination of affective and cognitive processes (see Moors, this issue).1 Moreover, in agreement with Bower (1981), we suggest that emotion can be studied using cognitive paradigms. Both laboratory findings and everyday observation suggest that cognitive and affective processes are unified and interrelated, and that trying to dissect them into separate faculties would neglect the richness of mental life (Roediger, Gallo, & Geraci, 2002, p. 319). We suggest, like others, that the interconnections found within the brain provide no obvious basis for divorcing emotion from cognition (Erickson & Schulkin, 2003; Halgren, 1992; Lane & Nadel, 2000; Phelps, 2004).

This article has two parts. The first part is a critical review of what recent neuroscience and social psychological research tells us about three popular ideas about cognition and emotion.2 These include the affective independence hypothesis, that emotion is processed independently of cognition via a subcortical “low route”; the affective primacy hypothesis, that affective and evaluative processing takes precedence over semantic processing, as evident in the mere exposure effect and affective priming; and the affective automaticity hypothesis, that affectively potent stimuli commandeer attention and that affective processes are especially likely to be automatic. In the second part, we suggest that rather than being processed independently, affect modifies and regulates cognitive processing, as illustrated in a review of some recent research from our own lab.

I. ASSESSING THREE HYPOTHESES ABOUT AFFECT AND COGNITION

A. The affective independence hypothesis: The low route

Zajonc (2000) lists ten ways in which affect and cognition differ and suggests that they arise from separate systems. Papers in this tradition sometimes cite LeDoux’s (1996) proposal that the amygdala can elicit emotion before information reaches the cortex. In this section, we suggest (a) that the low route does not play a role in processing the complex stimuli typically used in social and emotional research (e.g., faces, ideographs, objects), and (b) that the amygdala, and emotion in general, does not function independently of perceptual and cognitive processes.

The “low route” (LeDoux, 1996; LeDoux, Romanski, & Xagoraris, 1989), studied primarily in rats, is a pathway that allows stimulus processing without cortical involvement. When light hits the retina, the signal is relayed to the amygdala through the thalamus without first reaching the visual cortex. This pathway is adequate to support fear conditioning between illumination changes and fear-invoking events (e.g., shock). Based on studies of the auditory system, which has relay pathways similar to those of the visual system, information can reach the amygdala within 20 ms (Quirk, Armony, & LeDoux, 1997; Quirk, Repa, & LeDoux, 1995).

LeDoux and colleagues used rat models to examine the low route to emotion, and the question remains whether a similar pathway exists in humans. First, comparative anatomical studies (Linke, De Lima, Schwegler, & Pape, 1999) and behavioural studies (Shi & Davis, 2001) using rat-based models suggest that this pathway may be functionally relevant only when cortical areas have been lesioned or damaged. In humans, converging evidence suggests that the low route may not be functionally important for emotion processing (Halgren, 1992; Kudo, Glendenning, Frost, & Masterson, 1986; Rolls, 1999; Shi & Davis, 2001; Storbeck, Robinson, & McCourt, 2006). However, the existence of the low route in humans is still debated (see LeDoux, 2001).3

Can the low route discriminate emotional vs. non-emotional stimuli without cortical involvement? That is, can the low route sufficiently discriminate a snake from a bunny without cortical involvement? We say No. The low route has limited capacity for stimulus discrimination. Fear conditioning studies that find support for the low route typically require only detection of the presence or absence of a stimulus (Duvel, Smith, Talk, & Gabriel, 2001; LeDoux et al., 1989; Shi & Davis, 2001). When the task requires discriminating one stimulus from another (e.g., CS+ = high freq. tone, CS− = low freq. tone), then cortical analysis appears to be necessary (Butler, Diamond, & Neff, 1957; Duvel et al., 2001; Komura et al., 2001; McCabe, McEchron, Green, & Schneiderman, 1993; Nicholson & Freeman, 2000; Thompson, 1962).

One way to test the affective independence hypothesis is to temporarily inactivate the visual cortex and determine whether it is necessary for the amygdala to assess the affective significance of stimuli. Fukuda, Ono, and Nakamura (1987) did just that in awake, behaving monkeys. They observed that when the visual cortex (representing early cognitive processing) was temporarily inactivated, monkeys failed to learn and failed to demonstrate appropriate affective associations to visual cues of edible vs. inedible objects. The amygdala itself was still functional, however, because the monkeys could determine the affective significance of the same stimuli based on taste. This study suggests that the visual cortex is necessary to determine, on the basis of visual properties and prior experience, whether a peanut or a baseball is edible.

Another way to determine whether the amygdala is independent of initial cognitive processing is to record single-cell activity within the visual cortex and the amygdala. If the amygdala can process visual stimuli independently of cortical input, then amygdala neurons should remain responsive to visually presented objects with affective significance even when the visual cortex is temporarily inactivated. Nishijo, Ono, Tamura, and Nakamura (1993) recorded such vision-relevant neurons in the amygdala of monkeys. They discovered that these neurons failed to respond to affectively significant visual stimuli when the visual cortex was temporarily inactivated. They suggested that amygdala activity is not related directly to sensory inputs, but rather relies on view-invariant representations of objects from the visual cortex. That is, the goal of the visual cortex is to create unique neural signatures for unique objects regardless of their orientation, background lighting, and so on. Therefore, the same affective significance can be retrieved regardless of the visual state in which the object appears. Halgren offers conclusions similar to those of Nishijo et al. by suggesting that when “the amygdala performs emotional evaluation, it does so within the cognitive system. This could explain why it has been so difficult to dissociate emotional from cognitive processing in humans” (Halgren, 1992, p. 212).

Faces have always received special attention in the study of emotion due to their possible evolutionary connection to survival (Davey, 1995; Öhman, 1997). Adolphs (2002) suggested that processing and recognising a fear face requires a network of various structures and that the low route alone is incapable of such processing. He proposed that the visual cortex first grossly identifies the face, and in particular determines whether the face contains an expression or not. Support for this model comes from the fact that there are two areas in the monkey visual cortex dedicated to face processing, areas STS and IT. Area STS (superior temporal sulcus) is involved in encoding facial expressions, while area IT (inferotemporal cortex) is involved in encoding facial identity (see Allison, Puce, & McCarthy, 2000; Kanwisher, McDermott, & Chun, 1997; Narumoto, Okada, Sadato, Fukui, & Yonekura, 2001; Rotshtein, Malach, Hadar, Graif, & Hendler, 2001, for related literature on humans). Both areas, STS and IT, have strong reciprocal connections to the amygdala, suggesting the amygdala receives highly processed facial information pertaining to both facial identity and facial expression (Baylis, Rolls, & Leonard, 1987; Fukuda et al., 1987; Nishijo, Ono, & Nishino, 1988b; Rolls, 1992). For example, Rotshtein et al. (2001) found that the lateral occipital cortex in humans, which processes facial expressions, is concerned with the configuration of each expression, rather than with its affective value. That is, the visual cortex does not code for affective significance (Rolls, 1999; Rotshtein et al., 2001), but rather codes for facial configurations, and these configurations are sent to the amygdala for affective processing. Thus, the visual cortex is needed for the amygdala to correctly identify and respond to emotional stimuli.

For emotional stimuli used by psychologists (e.g., snakes, emotional faces), the processing capacities of the low route would appear to be inadequate (see Rolls, 1999; Smith, Cacioppo, Larsen, & Chartrand, 2003, for similar concerns). Nevertheless, the low route is still cited to help explain particular affective phenomena (e.g., Bargh, 1997; Berkowitz & Harmon-Jones, 2004; Zajonc, 2000). We suggest that the amygdala’s reliance on cortical input to make an evaluation requires reconsideration of whether emotion can be dissociated from cognitive processing even at initial levels of processing. We should also note that the same processes occur regardless of whether stimuli are presented subliminally or supraliminally (Rolls & Tovee, 1994; Rolls, Tovee, Purcell, Stewart, & Azzopardi, 1994; Storbeck et al., 2006).

Conclusions

Davidson (2003) claimed that one of the seven deadly sins of cognitive neuroscience is to assume that affect is independent from cognition. We and several others agree that emotion should not be divorced from cognition (Adolphs & Damasio, 2001; Barnard, Duke, Byrne, & Davidson, 2007 this issue; Davidson, 2003; Duncan & Barrett, 2007 this issue; Eder & Klauer, this issue; Erickson & Schulkin, 2003; Lane & Nadel, 2000; Lavender & Hommel, this issue; Lazarus, 1995; Parrott & Sabini, 1989; Phelps, 2004; Storbeck et al., 2006). For instance, based on anatomical connections alone (Ghashghaei & Barbas, 2002), areas necessary for cognition and emotion are highly interconnected, and these connections are bidirectional, suggesting integrated processing of emotion and cognition. Halgren (1992) suggests that emotion and cognition are so interconnected that it is not practical to try to disentangle the temporal and causal relations of emotion and cognition.

B. The affective primacy hypothesis

In addition to the idea that affect and cognition are independent sources of influence is the allied idea of affective primacy (Zajonc, 1980, 2000). The mere exposure phenomenon (Zajonc, 1968) was an important source of evidence for the hypothesis, because mere exposure involves an affective reaction that is not dependent on conscious categorisation or identification of the liked stimulus.

One problem with the grand conclusion drawn from mere exposure research, that affective processing occurs without cognitive processing, is that it rests on an equation of consciousness with cognition (Lazarus, 1995). However, single-cell recording shows that the visual cortex can readily identify stimuli presented below subjective thresholds. Therefore, lack of conscious awareness has no bearing on whether the visual cortex categorises and identifies a stimulus. In fact, studies have demonstrated that the only difference between a stimulus presented for 30 ms as opposed to 1000 ms is the strength at which a neural population fires (Rolls, 1999; Rolls & Tovee, 1994; Rolls et al., 1994). The response pattern itself remains the same, and is thought to reflect a unique neural signature for a given stimulus type. That is, the neural population identifies X whether X is presented for 30 ms or 1000 ms; the firing is simply stronger, and the system therefore more confident, for 1000 ms presentations than for 30 ms presentations.

Another problem is that the mere exposure effect gets weaker the longer stimuli are presented, as respondents have time to process the identity of the stimulus and to realise that they have seen it before. The mere exposure effect is, therefore, an error based on a misattribution of fluency to liking rather than to familiarity (Winkielman, Schwarz, Fazendeiro, & Reber, 2003). Of course, most real-world emotional reactions are not errors of this kind, so the effect’s appeal as a model of how affect is related to cognition is limited.

More importantly, Zajonc (2000) assumed that the mere exposure effect does not rely on cognitive or cortical involvement, suggesting the effect may rely on the low route to emotion. The low route to emotion would allow for affective processing without cortical or cognitive input. However, available evidence suggests that the effect does not rely on processing by the amygdala. For example, Greve and Bauer (1990) report the case of a patient, GY, who had an accident that severed the connection between the visual cortex and the amygdala. They found that GY showed the mere exposure effect even though visual information was not getting to the amygdala. Elliott and Dolan (1998) also report that the mere exposure effect relies more on frontal cortical networks. They failed to observe any relevant amygdala activation. These data suggest that the mere exposure effect is not a phenomenon that requires the amygdala, let alone one that could be based in the low route to the amygdala.

Affective priming as evidence for affective primacy

Another phenomenon that has been interpreted as demonstrating a special status for affect is affective priming. Research suggests that people routinely evaluate objects in their environment (Murphy & Zajonc, 1993; Niedenthal, 1990). For example, in sequential priming studies, briefly presented prime words are followed by target words that participants categorise (e.g., as positive or negative) as quickly as possible (Bargh, Chaiken, Govender, & Pratto, 1992; Bargh, Chaiken, Raymond, & Hymes, 1996; Fazio, Sanbonmatsu, Powell, & Kardes, 1986). Priming is then seen when, for example, positive primes facilitate responses to positive targets and interfere with responses to negative targets. Such affective priming suggests that people may automatically evaluate stimuli without an intention to do so.
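As a concrete illustration of the logic of such studies, the sketch below shows, in Python, how an evaluative priming effect is typically indexed: the response-time advantage for targets preceded by primes of the same valence. The stimuli, timings, and data are invented for illustration; this is not the materials or analysis code of the studies cited above.

```python
# Minimal sketch of how evaluative (affective) priming is typically indexed.
# Stimuli and response times are hypothetical; real studies use counterbalanced
# word sets and millisecond-accurate presentation software.

from statistics import mean

# Each trial: prime valence, target valence, and the response time (ms)
# for categorising the target as positive or negative.
trials = [
    {"prime_val": "pos", "target_val": "pos", "rt": 540},
    {"prime_val": "pos", "target_val": "neg", "rt": 605},
    {"prime_val": "neg", "target_val": "neg", "rt": 552},
    {"prime_val": "neg", "target_val": "pos", "rt": 610},
    # ...many more trials per participant in a real experiment
]

congruent = [t["rt"] for t in trials if t["prime_val"] == t["target_val"]]
incongruent = [t["rt"] for t in trials if t["prime_val"] != t["target_val"]]

# A positive score means faster responses when prime and target share valence.
priming_effect_ms = mean(incongruent) - mean(congruent)
print(f"Evaluative priming effect: {priming_effect_ms:.0f} ms")
```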

In an effort to rule out the possibility that the response facilitation is really due to some semantic dimension, investigators typically choose primes and targets that have no association other than being similar or dissimilar in evaluative meaning. However, Storbeck and Robinson (2004) point out that this practice of limiting the relationship between primes and targets to evaluation may force respondents into evaluative priming. If so, it would lose its value as evidence that affect is independent of cognition or has primacy over semantic meaning.

To test this hypothesis, Storbeck and Robinson (2004) used “a comparative priming method” in which words were selected to vary not only evaluatively but also categorically. For example, primes or targets might be positively or negatively valenced animals (e.g., puppies, snakes) or positively or negatively valenced texture words (e.g., smooth, rough). Thus, prime-target pairs were related evaluatively (good vs. bad), but also descriptively (animals vs. textures).
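The logic of this design can be sketched as follows. The code uses hypothetical placeholder words and simply enumerates how prime-target pairs cross descriptive and evaluative relations, which is what allows semantic and affective priming to be estimated separately; it is an illustration of the design logic, not the actual stimulus set or analysis.

```python
# Sketch of the "comparative priming" logic: prime-target pairs vary in whether
# they match descriptively (category) and/or evaluatively (valence), so the two
# sources of facilitation can be estimated separately. Words are hypothetical
# placeholders, not the Storbeck and Robinson (2004) stimuli.

from itertools import product
from statistics import mean

words = {
    ("animal", "pos"): "puppy",
    ("animal", "neg"): "snake",
    ("texture", "pos"): "smooth",
    ("texture", "neg"): "rough",
}

# Enumerate all prime-target pairings and code how each pair is related.
pairs = []
for (p_cat, p_val), (t_cat, t_val) in product(words, repeat=2):
    pairs.append({
        "prime": words[(p_cat, p_val)],
        "target": words[(t_cat, t_val)],
        "category_match": p_cat == t_cat,   # descriptive (semantic) relation
        "valence_match": p_val == t_val,    # evaluative (affective) relation
    })

n_cat = sum(p["category_match"] for p in pairs)
n_val = sum(p["valence_match"] for p in pairs)
print(f"{len(pairs)} pairings: {n_cat} share category, {n_val} share valence")

def priming_score(rts_by_pair, match_key):
    """Mean RT for mismatching minus matching pairs on one dimension."""
    matched = [rt for pair, rt in rts_by_pair if pair[match_key]]
    mismatched = [rt for pair, rt in rts_by_pair if not pair[match_key]]
    return mean(mismatched) - mean(matched)

# With observed response times attached to each pair, one would compare
#   priming_score(data, "category_match")  ->  semantic (descriptive) priming
#   priming_score(data, "valence_match")   ->  affective (evaluative) priming
```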

Two tasks were used, an evaluation task and a lexical decision task, and both revealed semantic but not affective priming. In addition, the same result was found when they used pictures instead of words. To verify that the practice of artificially limiting the relationship between primes and targets to evaluation had promoted affective priming in prior studies, Storbeck and Robinson (2004) then repeated their experiment but removed any systematic descriptive relationship between primes and targets, such that all words were now animal exemplars. As expected, the usual affective priming results reappeared when participants were given only evaluation as a possible basis for relating primes and targets.

Another comparative priming study was performed by Klauer and Musch (2002). They used primes and targets that could be categorised based on affect or another non-affective dimension, and manipulated only the task demand (i.e., to evaluate or categorise). They concluded that affective priming is not based on a special evaluation system. Rather, affective priming relies on the same mechanisms responsible for semantic priming.

These experiments suggest that affective priming is not obligatory. The evaluative meaning on which affective priming is based is represented within a larger semantic network in which it is not the dominant mode of semantic categorisation. Evaluation is doubtlessly a very basic level of analysis, but evaluative meaning is not processed apart from other dimensions of semantic meaning, nor does it invoke a special automatic evaluator.

The Storbeck and Robinson experiments used supraliminal exposures of primes, but Erdley and D’Agostino (1988) used subliminal exposures. In a very different paradigm, they too found that when both affective and descriptive features were present, priming occurred along semantic rather than evaluative lines, suggesting that categorisation may often have priority over evaluation.

One might hypothesise that affect is elicited automatically at the onset of a stimulus and degrades from that point. If so, the use of a relatively long Stimulus Onset Asynchrony (SOA) (300 ms) by Storbeck and Robinson (2004) might conceivably have prevented detection of affective priming. To assess this possibility, studies might again use the comparative-priming approach and shorten either stimulus durations or response times, either of which might allow early components of the priming process to be visible. A study by Klinger, Burton, and Pitts (2000) satisfied these two requirements, and concluded that when primes are presented subliminally and response-window procedures are used, finding semantic or affective priming depends mainly on task requirements and response competition.

The Klinger et al. study is unique in the use of the response-window procedure. Since spreading activation builds up over hundreds of milliseconds (Perea & Rosa, 2002), such procedures tend to reduce any effects due to spreading activation within the semantic network, making it likely that any priming effects obtained are due to response compatibility, rather than spreading activation. It is interesting to note, though, that both semantic and affective priming were sensitive to similar task constraints, suggesting that both result from similar mechanisms. Other studies using similar comparative methods and response-window procedures also found both affective and categorical priming (Klauer & Musch, 2002; Klinger et al., 2000). However, crucially, these studies failed to equate semantic and affective features, and in each, affect was the most salient feature. Together these studies suggest that with response-window procedures, regardless of prime duration, priming is driven by response compatibility.

Since the use of a response window shortens the time available for effects due to spreading activation, what happens when spreading activation is allowed to build up over time, by presenting primes subliminally without a response window? Kemp-Wheeler and Hill (1992) performed such a study with a lexical decision task, and found both affective and semantic priming. But they also found that affective priming occurred mostly when people could detect the prime. Such detection did not facilitate semantic priming. They argued that affective priming is a subform of semantic priming and occurs when more time is given to revealing the affective significance of primes and targets.

Moving away from the priming procedure, Storbeck, Robinson, Ram, Meier, and Clore (2004) examined evaluations and categorisations of single target words using a response-window procedure. The response window varied from 100 ms to 2000 ms, and the dependent measure was accuracy. The experiment included nine participants over five days with over 500 trials per day. This allowed us to produce predictive models for the rise of semantic and affective accuracy. The results revealed that in shorter response windows, participants were more accurate in detecting semantic information than affective information. Other studies have found that semantic distinctions can occur as early as 80 ms, while evaluative distinctions start around 100 ms (Van Rullen & Thorpe, 2001).
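The modelling idea, accuracy rising with the length of the response window at different rates for semantic and affective judgements, can be sketched as follows. The accuracy values and the logistic form below are hypothetical illustrations chosen for simplicity, not the data or the model reported by Storbeck et al. (2004).

```python
# Sketch of modelling how categorisation accuracy rises with response-window
# duration. The accuracy values are invented; only the analysis logic is shown.

import numpy as np
from scipy.optimize import curve_fit

windows = np.array([100, 200, 300, 500, 1000, 2000])            # window (ms)
acc_semantic = np.array([0.55, 0.68, 0.80, 0.90, 0.95, 0.97])   # hypothetical
acc_affective = np.array([0.52, 0.60, 0.72, 0.85, 0.93, 0.96])  # hypothetical

def rise(t, t50, slope):
    """Logistic rise from chance (0.5) to ceiling (1.0); t50 = time to midpoint."""
    return 0.5 + 0.5 / (1.0 + np.exp(-(t - t50) / slope))

(sem_t50, _), _ = curve_fit(rise, windows, acc_semantic, p0=[200, 100])
(aff_t50, _), _ = curve_fit(rise, windows, acc_affective, p0=[200, 100])

# A smaller t50 means accurate responding becomes possible earlier.
print(f"Semantic accuracy midpoint:  {sem_t50:.0f} ms")
print(f"Affective accuracy midpoint: {aff_t50:.0f} ms")
```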

EEG measures can also be used to discriminate semantic and affective aspects of processing without involving motor output processes. Cacioppo, Crites, and Gardner (1996) and Ito and Cacioppo (2001) found that ERPs always tracked semantic relations, even when semantic analysis was not the focus of the task. ERPs also tracked affective features, but only when the task had an explicitly evaluative focus or when the evaluative components were quite potent. More critically, evidence suggests that the same discriminative processing based on semantic features performed by the visual cortex occurs whether stimuli are presented subliminally or supraliminally, regardless of conscious experience (Dehaene et al., 2001; Rolls & Tovee, 1994; Rolls et al., 1994; Stenberg, Lindgren, & Johansson, 2000). ERP and single-cell recordings both demonstrate that semantic information appears to be represented regardless of the task at hand and whether or not there is conscious perception of the stimuli. That is, semantic information always gets activated, regardless of the explicit task, whereas affective information is processed mainly when evaluation is an explicit part of the task or a highly salient aspect of the stimulus.

To be clear, in this view, the system needs an identification stage before an evaluation stage, and identification occurs in later stages of processing in the visual cortex. Even in classical conditioning, some kind of identification is required by the cortex (e.g., visual cortex) to discriminate a conditioned stimulus from all other stimuli. By “identification stage” we mean simply that a view-invariant neural signature of an object is activated in the visual cortex. Only then can the object activate affective and other associations.

Conclusions

These studies suggest that both semantic4 and affective features are represented in a single semantic network, and that semantic information (which is not to say lexical information, see footnote 4) has a necessary priority. That is, we suggest that affective priming is a special case of semantic priming and can be obtained when affect is part of the task demand, the salient feature of the stimuli, or the focus of attention (Storbeck & Robinson, 2004). Under the right set of circumstances, affective relations can be made more accessible than semantic relations (e.g., Bargh et al., 1992, 1996; Klinger et al., 2000; Storbeck & Robinson, 2004). For example, Storbeck and Robinson (2004) found that when they crossed descriptive and evaluative features of stimuli in an evaluative priming task, semantic but not affective priming was observed. But when the relations between prime and target stimuli were limited to their evaluative features, affective priming was observed. Thus, under the right set of conditions, affective priming can readily be observed, but such evaluative priming is in no way obligatory. The fact that evaluative priming can be found when evaluative meaning is made salient therefore provides little support for ideas about affective primacy or about the separate nature of affective and cognitive processing.

C. The affective automaticity hypothesis

Although the automatic-controlled distinction arose in cognitive psychology (Shiffrin & Schneider, 1977), a special association is often assumed between affect and automaticity. Perhaps the idea that thoughts can be more easily controlled than feelings has made affect seem to have a life of its own. One can decide to think about one particular topic rather than another, but one cannot decide to feel one way or another, except by guiding thoughts. Is automaticity a key distinction that makes affect and emotion separate from cognition?

Cognitive psychologists have recently become critical of the term “automaticity”. Recent reviews have concluded that the initial demonstrations of what was purported to be automaticity may actually have required attention after all (see Lavie & De Fockert, 2003; Logan, 2002; Pashler, Johnston, & Ruthruff, 2001; Stolz & Besner, 1999). For example, Pashler et al. (2001, p. 648) stated that, “A variety of proposals for ‘wired-in’ attention capture by particular stimulus attributes have been effectively challenged; attention, it turns out, is subject to a far greater degree of top-down control than was suspected 10 years ago”. Generally, the relevant data have come from studies of cognition rather than affect. In this section, we suggest that the same conclusion applies in the case of affective stimuli.

Harris, Pashler, and Coburn (2004) examined whether affective words could be processed automatically. Their data indicated that affective words can slow responses down on a primary task, suggesting that affect may capture attention. However, when the primary task was made difficult, thus reducing attentional resources, affective words failed to slow responses, suggesting that affect did not capture attention. These results suggest that under high-load conditions, when attention is occupied, affective words should not be expected to “grab” attention in a bottom-up manner. Instead, affect appears to be processed by top-down networks. Similar results have been found when emotional faces were used in a modified Posner cueing paradigm (Fox, Russo, Bowles, & Dutton, 2001) and when threat-related words and faces were used in a variation of the Stroop task (White, 1996). Moreover, examining the affective pronunciation priming task, De Houwer and Randell (2002) observed affective priming only when attention was focused on the primes. When attention was not focused on the primes, affective priming was not observed in the pronunciation paradigm.

These studies all presented evidence to suggest that affective stimuli require attention and that they do not grab attention in a bottom-up manner. However, Lundqvist and Öhman (2005) have argued that evolutionarily relevant threat stimuli (e.g., snakes, spiders, faces) should be especially likely to be processed pre-attentively (see Davey, 1995, for a relevant criticism of the evolutionary preparedness account).

Relevant data are limited, but the data available would suggest that even faces require attention in order to be processed. As discussed above, Fox et al. (2001) found that angry, happy, and neutral faces failed to capture attention when the effects of attention capture versus disengagement were disentangled. Narumoto et al. (2001) found that when faces were presented, area STS, which processes facial expressions, was significantly activated only when the task required facial expression discrimination, but not when identity or gender discriminations were required for the emotional faces (see also Critchley et al., 2000). Pessoa, Kastner, and Ungerleider (2002) performed a study similar to the Harris et al. study, but they used pictures with facial expressions and collected neuroimaging data. They observed that under low-load conditions, amygdala activation was observed to task-irrelevant fear faces. But under high-load conditions, when processing resources were limited, the amygdala failed to show significant activation to task-irrelevant fear faces, suggesting that attention was driven by top-down influences. These findings suggest that even the amygdala needs attentional resources in order to process fear faces and that fear faces can fail to capture attention.

Amygdala evaluation requires attention

It has been suggested that emotional stimuli are processed automatically, namely, without attention (LeDoux, 1996; Öhman, Esteves, & Soares, 1995; Vuilleumier, Armony, Driver, & Dolan, 2003), and that the amygdala plays a key role in automatic stimulus evaluation (Morris, Öhman, & Dolan, 1998; Whalen et al., 1998). This process is often cited as the basis of affective primacy (e.g., Bargh & Chartrand, 1999; Zajonc, 2000). However, cortical input appears to be more important in amygdala processing than has sometimes been emphasised (as discussed earlier), and the data reviewed below suggest that the amygdala requires attention to process threatening and novel stimuli.

Several studies have tested the hypothesis that exposure to affective words should elicit amygdala activation, reflecting the automatic evaluation process (Beauregard et al., 1997; Canli, Desmond, Zhao, Glover, & Gabrieli, 1998). No evidence was found of the hypothesised amygdala activation unless attention was explicitly drawn to the affective content of words by asking participants to evaluate them. Such results suggest that the amygdala does not continuously evaluate all incoming stimuli.

These studies involved lexical stimuli, but the same turns out to be true for the evaluation of pictures.5 Keightley et al. (2003) found no amygdala activation when participants simply viewed affective pictures. When participants were explicitly asked to evaluate affective stimuli, amygdala activation was found only for negative information (Keightley et al., 2003; Lane, Chua, & Dolan, 1999). For fearful faces, however, even passive viewing produced amygdala activation (Critchley et al., 2000; Morris et al., 1998; Vuilleumier et al., 2003; Whalen et al., 1998). With other face stimuli, in contrast, there was no amygdala activation even when participants explicitly evaluated them (Critchley et al., 2000; Keightley et al., 2003). Happy and angry faces also showed no amygdala activation under either passive viewing or active evaluation (Blair, Morris, Frith, Perrett, & Dolan, 1999; Morris et al., 1998; Surguladze et al., 2003).

Conclusions

These results suggest that valence is not automatically processed by the amygdala, but the amygdala may be sensitive to arousing stimuli such as fearful faces. Other research groups have also suggested that the amygdala is important for encoding the arousal, but not the valence, dimension of stimuli (Adolphs & Damasio, 2001; Adolphs, Russell, & Tranel, 1999; Cahill et al., 1996; Lane et al., 1999; McGaugh, 2004; Morris et al., 1998; Surguladze et al., 2003). Moreover, the evidence suggests that when affect is salient and processing demands are relatively low, emotional information may engage attention. But when processing demands are high and affective stimuli are not the focus of attention, affect will not “capture” attention. Such findings limit the conditions for automaticity; as cognitive psychology has already discovered, processing relies on attention, even for affective stimuli.

II. THE AFFECT-COGNITION CONNECTION

Throughout history, people’s optimism or pessimism about the human condition has often turned on their beliefs about the possibility of rational thought unsullied by emotion. Gradually, however, cognition and emotion are coming to be viewed as complementary rather than antagonistic processes. Our current research is informed by an affect-as-information approach (Schwarz & Clore, 1983; Clore & Storbeck, in press), which assumes that affective reactions provide useful feedback, both explicit and implicit, from emotional appraisal processes. Evidence in support of such a view comes from observations that the inability to use affective feedback as a result of brain damage has profoundly negative consequences for judgement and decision making (Damasio, 1994). Conversely, expertise at using affective information seems to be associated with effective personal and social functioning (Mayer, Salovey, & Caruso, 2004).

A. Emotion modulates cognition6

In Part I, we argued against the idea that cognition and emotion involve distinct brain areas or that they operate independently. The strongest claim for independence relied on the “low route” to emotion (LeDoux, 1996), a direct pathway from the sensory thalamus to the amygdala. However, by all available evidence, the low route does not appear to be a candidate for explaining any instance of human emotion. If it operates at all in humans, it appears incapable of even basic affective discriminations without cognitive input. Rather, the evidence from neuroscience suggests that evaluations by the amygdala are dependent upon input from the visual cortex. We suggested that affect probably does not proceed independently of cognition, nor precede cognition in time.

How, then, do we see the relationship between emotion and cognition? At the most general level, emotion modulates and mediates basic cognitive processes. The brain, of course, accomplishes numerous tasks all at once, including automatic processes (Barnard et al., 2007 this issue; Robinson, 1998). As the sensory cortex identifies stimuli in the environment, the visual cortex processes them in a view-invariant manner, allowing it to determine attributes of an object, including its affective significance, regardless of the position the object happens to be in. Once the visual cortex creates a view-invariant code for an object, it projects that information to other areas in the brain.

One of the primary pathways of the visual cortex is to the amygdala, and the role of the amygdala is in part to determine the urgency of the stimulus, which eventuates in the marking of apparently important experiences hormonally and in terms of experienced arousal. The amygdala retrieves the affective value of the stimulus or determines that it is novel and guides subsequent cognitive processing. The amygdala has extensive back projections to all areas of the visual cortex, which we believe modulate visual perception, attention, and memory for affectively significant stimuli. Note that the amygdala is probably not the only area involved in emotional processing that can modulate cognition. The visual cortex also has extensive projections to areas such as the orbitofrontal cortex, prefrontal cortex, and cingulate cortex, all of which can guide cognitive processing based on affective value.

In this section, we illustrate how we believe affect regulates cognition by briefly reviewing several recent studies from our lab. The studies discussed focus on two problems—the role of affect in perception and the affective regulation of styles of information processing. We note that in performance situations, emotional cues regulate cognitive processing, serving to adjust the mix of cognition and perception. Of special interest are several recent experiments that ask about affective consequences for implicit processes of learning, memory, priming, and attitude.

B. The affective regulation of perception

The “New Look” in perception, a movement in the 1950s (Bruner, 1957), maintained that rather than being a passive registration of reality, perception reflected internal expectations and motivations as part of an adaptive process. That movement quickly ran its course without having much impact, but, today, research again suggests that perception of the physical world is influenced by emotion and other internal factors. For example, Proffitt and colleagues (e.g., Bhalla & Proffitt, 1999; Proffitt, Stefanucci, Banton, & Epstein, 2003; Witt, Proffitt, & Epstein, 2004) have found that hills appear steeper and distances farther to people with reduced physical resources, either from wearing a heavy backpack, being physically tired, or being elderly. Recent research shows that emotion can have similar effects. In one study (Riener, Stefanucci, Proffitt, & Clore, 2003), participants listened to happy or sad music as they stood at the bottom of a hill. The results showed that sadness can make mountains out of molehills. Sad mood led to overestimation of the incline on verbal and visual measures, but not on a haptic measure. That is, sad individuals judged the hill to be steeper than happy individuals did, but both groups provided similar haptic responses.

Affective feelings thus appear to inform explicit, but not implicit, measures of perception. That is, when asked to estimate the incline verbally in degrees (i.e., verbal measure) and when indicating the incline analogically with a sort of protractor (i.e., visual measure), individuals feeling sad estimated the hill to be significantly steeper than individuals who were feeling happy or who had not heard any music. Such perceptual measures are thought to reflect conscious visual perception that relies on processing in the ventral visual stream, or “what” system, concerned with visual identification (Milner & Goodale, 1995). A reasonable argument can be made for why this system might be sensitive to resources for coping with inclines and distances (Proffitt, 2006). The third, haptic measure involved tilting a palm board (without looking at it) to match the incline of the hill. This haptic measure of incline is generally found to be quite accurate and to be immune from the influence of resource depletion such as physical exhaustion. It was also unaffected by sad mood. The measure is thought to reflect unconscious visual perception and relies on processing in the dorsal visual stream, or “how” system, engaged in the visual control of motor behaviour. Whereas it might be adaptive for one’s perception of a hill to reflect one’s resources, because decisions about whether to take action might hinge on such information, overestimations of that kind could be disastrous for the regulation of one’s actual foot placement.

In extensions of this work, Stefanucci, Proffitt, and Clore (2005) also examined the effect of fear on hill estimates. They had individuals stand at the top of the hill; to manipulate fear, some individuals stood on a skateboard, whereas others stood on a stable platform. They found that individuals on the skateboard provided steeper hill estimates on both the verbal and visual measures than individuals standing on the stable platform. As expected, the haptic measure was again unaffected by the manipulation of emotion.

C. Affective regulation of processing

At the beginning of the cognitive revolution, Jerome Bruner (1957) famously concluded that people are active processors who typically “go beyond the information given”. A number of experiments have been conducted in our lab over the past five years in which emotions and moods were added to classic experiments in cognitive psychology. One way to summarise our results is to say that happy affect appears to promote this “going beyond” through its influence on “relational processing”. In contrast, negative affect leads to more item-specific processing. Such results lead us to conclude that Bruner’s dictum, and all that it implies, may not be applicable when emotional cues of sadness are present. The experiments from our lab suggest (perhaps ironically) that the cognitive revolution had a hidden emotional trigger.

Many of the classic phenomena on which cognitive psychology was founded turn out to depend on affect. For instance, we observed that individuals in happy moods, but not those in sad moods, demonstrate schema effects on constructive memory of the kind introduced by Bartlett in 1932 (Gasper & Clore, 2002). Other classic phenomena also turn out to be more pronounced in happy moods than in sad moods. These include semantic priming (Storbeck & Clore, 2006), script processing (Bless et al., 1996), schema-guided memory (Gasper & Clore, 2002), stereotype use (Isbell, 2004), heuristic reasoning (Gasper, 2000), the global superiority effect (Gasper & Clore, 2002), and false memory generation (Storbeck & Clore, 2005). These results do not arise from general performance deficits caused by sad mood. On the contrary, general reaction times, overall memory accuracy, and basic performance levels often show no mood-based differences. Moreover, since the classic paradigms often rely on particular errors to show the mediating role of knowledge structures, individuals in sad moods may perform better in certain ways than those in happy moods.

These observations are compatible with findings demonstrating that positive moods are associated with processing that is generative (e.g., Erez & Isen, 2002), constructive (e.g., Fiedler, 2001), and broad (e.g., Fredrickson & Branigan, 2005). Our own account of these effects emphasises the informational properties of affect. For example, during task performance, positive affect may be experienced as efficacy and negative affect as difficulty. Feeling that one is effective confers value on one’s own generative thoughts and goals, resulting in reliance on them to process incoming information (relational processing). On the other hand, experiences of difficulty and lack of efficacy reduce the apparent value of one’s own cognitions and goals, leading to a focus on more specific, literal aspects of stimuli.

D. Affective regulation of implicit processes

Priming

In other research, Storbeck and Clore (2006) tested whether this relational processing of associations carries over to semantic knowledge. They observed that happy individuals were more likely to relate primes and targets together, demonstrating both categorical and evaluative priming, depending on the nature of the task. However, sad individuals failed to demonstrate priming on the same tasks, suggesting that they were impaired in relating the descriptive meaning of primes to targets. Again, the results suggest that negative affective cues act as though they undermine confidence in using accessible cognitions. In the implicit learning situation, negative affect prevented expression of what had been learned, and in the priming situation, it allowed sad participants to respond to target stimuli independently of the descriptive meaning of the primes.

False memory effects

To investigate further the hypothesis that negative affect impairs the formation and use of implicit associations, Storbeck and Clore (2005) induced positive and negative moods before a false-memory task. The task produces false memories by presenting word lists composed of words that are highly associated with a non-presented word, referred to as the critical lure. False memories are engendered because, as individuals relate the words in the list to one another, the critical lure comes to mind and is then likely to be falsely recalled. We observed that, in fact, negative moods led to a decrease in activation and subsequent recall of critical lures compared to the positive mood group and the control group. In addition, no differences were observed among the three groups in the recall of presented items. Ironically, the observed effect demonstrated that negative affect can improve memory performance by inhibiting the use of lexical associations during learning. Such findings suggest that affect from mood can influence the expression of implicit associations (Storbeck & Clore, 2005, 2006).
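For readers unfamiliar with this paradigm, the scoring logic can be sketched as follows. The word lists and recall protocol below are hypothetical stand-ins in the style of Deese/Roediger-McDermott lists, not the materials of Storbeck and Clore (2005).

```python
# Illustrative sketch of false-memory scoring: each study list consists of
# associates of a non-presented "critical lure", and false memory is scored as
# recall of that lure. Lists and recall data are hypothetical examples.

drm_lists = {
    "sleep":  ["bed", "rest", "awake", "tired", "dream", "snooze", "nap"],
    "needle": ["thread", "pin", "sewing", "sharp", "point", "thimble", "prick"],
}

# Hypothetical free-recall output from one participant.
recalled = ["bed", "dream", "sleep", "thread", "pin", "sharp"]

def score_recall(lists, recalled_words):
    recalled_set = set(recalled_words)
    veridical = sum(1 for items in lists.values() for w in items if w in recalled_set)
    false_lures = sum(1 for lure in lists if lure in recalled_set)
    return veridical, false_lures

veridical, lures = score_recall(drm_lists, recalled)
print(f"Correctly recalled list items: {veridical}")
print(f"Falsely recalled critical lures: {lures}")
```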

Affective involvement in implicit attitudes

The previous experiments show that affective states modulate the use of implicit associations in cognitive performance situations. Extensive prior research has already shown that affective states can influence evaluative judgements or attitudes expressed in self-report measures. But the intense interest in implicit attitude measurement raises the question of whether or not affect influences attitudes when assessed on implicit measures such as the Implicit Association Test or IAT (Greenwald, McGhee, & Schwartz, 1998).
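For orientation, the IAT summarises implicit associations as a latency difference between response-mapping blocks. The sketch below shows a simplified version of that scoring logic with invented response times; actual IAT scoring (e.g., the D algorithm described by Greenwald and colleagues) includes error penalties and trial-exclusion rules that are omitted here.

```python
# Simplified sketch of summarising an IAT effect as a D-like score: the latency
# difference between "incompatible" and "compatible" pairing blocks, scaled by
# the pooled standard deviation. Data are invented for illustration.

from statistics import mean, stdev

compatible_rts = [620, 655, 640, 700, 610, 665]      # ms, hypothetical block
incompatible_rts = [780, 820, 760, 845, 790, 805]    # ms, hypothetical block

pooled_sd = stdev(compatible_rts + incompatible_rts)
d_score = (mean(incompatible_rts) - mean(compatible_rts)) / pooled_sd
print(f"IAT D score: {d_score:.2f}")   # larger values = stronger implicit association
```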

Several experiments (Huntsinger, Sinclair, Dunn, & Clore, 2006) tested the hypothesis that positive affect would serve as a “go” sign and negative affect as a “stop” sign for acting on goals that were either chronically or temporarily activated. The goal in one experiment concerned taking an egalitarian stance regarding sexist attitudes, and in the other experiment the goal was either to adopt or not to adopt the racial attitudes held by an experimenter. Implicit measures of attitude were employed (a lexical decision task and an IAT). An elaborate set of effects neatly confirmed the predictions, showing that in each case, happy moods prompted participants to act on their chronic or temporarily activated goals, whereas sad moods interfered with goal expression. Importantly, the goals had been activated implicitly as a subtle part of the social situation, and the attitudes were measured implicitly using two different measures—see also DeSteno, Dasgupta, Bartlett, and Cajdric (2004) for a demonstration of the effects of anger on implicitly measured prejudice toward an outgroup. Thus, affective states appear to regulate not only the expression of implicit learning and implicit lexical associations, but also the expression of implicitly measured attitudes.

In summary, our main goal in this section was to demonstrate that affect and cognition should be thought of as fundamentally interactive. In this view, affect is a potential moderator of all kinds of cognitive operations, from perception and attention to implicit learning and implicit associations (see also Duncan & Barrett, this issue). We have argued against conceptualising emotion as a separate force in opposition to cognition, in favour of viewing cognition and emotion as inherently integrated. We included examples of recent research from our own lab showing affective moderation of basic cognitive processes.

Acknowledgments

This research was supported by National Institute of Mental Health Grant MH 50074 to GLC.

Footnotes

1

That is, we, along with others (e.g., Barrett, 2006), suggest that there is not a brain centre dedicated to specific emotions such as fear, happiness, etc. But, there are specific areas critically involved in emotion processing. For instance, the amygdala is critically involved in the emotion of fear, but is not specifically dedicated to fear.

2

The conceptions of emotion we raise, affective independence and affective primacy, come mainly from Zajonc (1980, 2000). The affective automaticity hypothesis derives from arguments made by Bargh and colleagues (Bargh, 1997; Ferguson & Bargh, 2003).

3

In particular, the strongest evidence for such a route comes from individuals with affective blindsight. These individuals have damage to area V1 of the visual cortex and, as a result, have no conscious perception of the visual world. However, they still demonstrate affective reactions to fear-inducing visual stimuli. Even so, this remains a debated issue in the literature. First, the pathways involved are unclear. That is, although information may not be consciously visible to blindsight individuals, areas of the visual cortex (area V4 and extrastriate regions) still receive visual information from subcortical structures such as the pulvinar and superior colliculus. Therefore, although area V1 is damaged, areas of the visual cortex still receive the same visual information. Storbeck, Robinson, and McCourt (2006) examine this issue more extensively.

4

We will use the term “semantic” to describe the meaning analysis that we propose precedes affective analysis. What we have in mind specifically are at least three achievements: (1) the integration of multiple features of the object into a single “object” code; (2) the identification of this object; and (3) the categorisation of the object (e.g., as animate or not). The term semantic, then, refers somewhat more directly to the achievements of area IT (especially invariance, identification, and categorisation) that seem to occur in order for a person to retrieve affective associations.

5

A host of fMRI studies have demonstrated activation of the amygdala to masked fear faces and other emotional stimuli. Such studies are interesting because individuals have no conscious perception of the image. However, the amygdala shows enhanced activation only to arousing images (e.g., fear faces), not to non-arousing images (e.g., houses). Although such evidence suggests that amygdala activation can occur without perceptual awareness, we suggest that the visual system still codes the image and sends its input forward to the amygdala in the same manner as if the stimulus had been presented supraliminally. Moreover, imaging studies rely on comparisons of relative activity, so it is difficult to gauge how much processing differs between masked and non-masked fear faces. In addition, there is ample evidence that the visual cortex processes masked and non-masked images in a similar manner. Evidence from single-cell recording suggests that the visual system can still determine whether a face or a house was presented regardless of whether each image was masked and presented subliminally. Therefore, studies demonstrating that the amygdala activates for a subliminal, but not a supraliminal, picture do not mean that the visual cortex did not send the same information. There is no reason to believe that the categorisation processes performed by area IT are conscious. Indeed, on the basis of ERP data, we might conclude that unconscious categorisation routinely precedes conscious categorisation. Furthermore, unconscious categorisation by the visual system may occur extremely quickly after stimulus exposure, in as little as 48 ms for “global templates” (Sugase, Yamane, Ueno, & Kawano, 1999) and 70–80 ms for classes of stimuli (Van Rullen & Thorpe, 2001). Interestingly, Van Rullen and Thorpe (2001) also found that the initial (70–80 ms) categorisation-related ERP component was not highly correlated with a participant’s response to the task at hand, whereas an ERP component that occurred at 190 ms post-stimulus onset was. Thus, categorisation appears to occur quite rapidly and seems to occur independently of later, possibly more conscious, categorisation processes. Relatedly, people can classify objects on the basis of category membership even with no awareness of the distinct categories guiding their responses (e.g., Reed, Squire, Patalano, Smith, & Jonides, 1999). In summary, we conclude that categorisation occurs within later stages of the visual cortex, specifically area IT. Moreover, other data suggest that these same visual areas are not sensitive to the affective significance of objects (Iwai et al., 1990; Nishijo, Ono, & Nishino, 1988a; Rolls, 1999; Rolls, Judge, & Sanghera, 1977). Thus, within area IT and other later stages of the visual cortex there is considerable evidence for categorisation prior to affect retrieval. Recall that studies have found distinct category-related ERPs within 70–80 ms post-stimulus onset (e.g., Van Rullen & Thorpe, 2001). Object identification also appears to occur rapidly, perhaps within 100 ms of stimulus onset (Lehky, 2000; Rolls & Tovee, 1994). These findings suggest that categorisation tends to occur prior to identification. Nevertheless, studies that present masked stimuli have demonstrated that even stimuli presented as briefly as 20–60 ms with pre- and postmasks are still sufficiently processed by area IT to support object identification (Dehaene et al., 2001; Rolls, 1999; Vogels & Orban, 1996).
In the latter connection, Rolls, Tovee, Purcell, Stewart, and Azzopardi (1994) argued that such subliminal presentations reduce the amplitude of neural responses to stimuli, but do not change fundamental neural identification processes (see also Kovacs, Vogels, & Orban, 1995, for similar results). Thus, the primary difference between subliminal and optimal viewing conditions pertains to the amplitude of the neuronal responses within area IT, but sufficient processing still occurs to produce an invariant neural code (i.e., identification). From this perspective, demonstrations of “unconscious” cognition or affect are not particularly special from a neurological point of view.

6

The section title implies that cognition does not modulate emotion. We would suggest, like others have, that in fact cognition does modulate emotion (e.g., Ochsner & Gross, 2005), but such a discussion is beyond the scope of this article.


REFERENCES

  1. Adolphs R. Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews. 2002;1:21–62. doi: 10.1177/1534582302001001003.
  2. Adolphs R, Damasio A. The interaction of affect and cognition: A neurobiological perspective. In: Forgas JP, editor. Handbook of affect and social cognition. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.; 2001. pp. 27–49.
  3. Adolphs R, Russell J, Tranel D. A role for the human amygdala in recognizing emotional arousal from unpleasant stimuli. Psychological Science. 1999;10:167–171.
  4. Allison T, Puce A, McCarthy G. Social perception from visual cues: Role of the STS region. Trends in Cognitive Sciences. 2000;4:267–278. doi: 10.1016/s1364-6613(00)01501-1.
  5. Bargh J. The automaticity of everyday life. In: Wyer RS Jr, editor. The automaticity of everyday life: Advances in social cognition. Vol. 10. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.; 1997. pp. 1–61.
  6. Bargh J, Chaiken S, Govender R, Pratto F. The generality of the automatic attitude activation effect. Journal of Personality and Social Psychology. 1992;62:893–912. doi: 10.1037//0022-3514.62.6.893.
  7. Bargh J, Chaiken S, Raymond P, Hymes C. The automatic evaluation effect: Unconditional automatic attitude activation with a pronunciation task. Journal of Experimental Social Psychology. 1996;32:104–128.
  8. Bargh J, Chartrand T. The unbearable automaticity of being. American Psychologist. 1999;54:462–479.
  9. Barnard PJ, Duke DJ, Byrne RW, Davidson I. Differentiation in cognitive and emotional meanings: An evolutionary analysis. Cognition and Emotion. 2007;21:1155–1183.
  10. Barrett LF. Emotions as natural kinds? Perspectives on Psychological Science. 2006;1:28–58. doi: 10.1111/j.1745-6916.2006.00003.x.
  11. Bartlett FC. Remembering: A study in experimental and social psychology. Cambridge: Cambridge University Press; 1932.
  12. Baylis G, Rolls ET, Leonard C. Functional subdivisions of the temporal lobe neocortex. Journal of Neuroscience. 1987;7:330–342. doi: 10.1523/JNEUROSCI.07-02-00330.1987.
  13. Beauregard M, Chertkow H, Bub D, Murtha S, Dixon R, Evans A. The neural substrate for concrete, abstract, and emotional word lexica: A positron emission tomography study. Journal of Cognitive Neuroscience. 1997;9:441–461. doi: 10.1162/jocn.1997.9.4.441.
  14. Berkowitz L, Harmon-Jones E. More thoughts about anger determinants. Emotion. 2004;4:107–130. doi: 10.1037/1528-3542.4.2.107.
  15. Bhalla M, Proffitt D. Visual-motor recalibration in geographical slant perception. Journal of Experimental Psychology: Human Perception and Performance. 1999;25:1076–1096. doi: 10.1037//0096-1523.25.4.1076.
  16. Blair R, Morris J, Frith C, Perrett D, Dolan R. Dissociable neural responses to facial expressions of sadness and anger. Brain. 1999;122:883–893. doi: 10.1093/brain/122.5.883.
  17. Bless H, Clore GL, Schwarz N, Golisano V, Rabe C, Wolk M. Mood and the use of scripts: Does a happy mood really lead to mindlessness? Journal of Personality & Social Psychology. 1996;71:665–679. doi: 10.1037//0022-3514.71.4.665.
  18. Bower GH. Mood and memory. American Psychologist. 1981;36:129–148. doi: 10.1037//0003-066x.36.2.129.
  19. Bruner JS. Going beyond the information given. In: Bruner JS, et al., editors. Contemporary approaches to cognition. Cambridge, MA: Harvard University Press; 1957. pp. 41–69.
  20. Butler R, Diamond I, Neff W. Role of auditory cortex in discrimination of changes in frequency. Journal of Neurophysiology. 1957;20:108–120. doi: 10.1152/jn.1957.20.1.108.
  21. Cacioppo J, Crites S, Gardner W. Attitudes to the right: Evaluative processing is associated with lateralized late positive event-related brain potentials. Personality & Social Psychology Bulletin. 1996;22:1205–1219.
  22. Cahill L, Haier R, Fallon J, Alkire M, Tang C, Keator D, et al. Amygdala activity at encoding correlated with long-term, free recall of emotional information. Proceedings of the National Academy of Sciences. 1996;93:8016–8021. doi: 10.1073/pnas.93.15.8016.
  23. Canli T, Desmond J, Zhao Z, Glover G, Gabrieli J. Hemispheric asymmetry for emotional stimuli detected with fMRI. Neuroreport. 1998;9:3233–3239. doi: 10.1097/00001756-199810050-00019. [DOI] [PubMed] [Google Scholar]
  24. Chaiken S, Trope Y. Dual-process theories in social psychology. New York: Guilford Press; 1999. [Google Scholar]
  25. Clore GL, Storbeck J. Affect as information about liking, efficacy, and importance. In: Forgas JP, editor. Hearts and minds: Affective influences on social cognition and behaviour. New York: Psychology Press; (in press) [Google Scholar]
  26. Critchley H, Daly E, Phillips M, Brammer M, Bullmore E, Williams S, et al. Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping. 2000;9:93–105. doi: 10.1002/(SICI)1097-0193(200002)9:2<93::AID-HBM4>3.0.CO;2-Z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Damasio A. Descartes’ error: Emotions, reason, and the human brain. New York: Avon Books; 1994. [Google Scholar]
  28. Davey GCL. Preparedness and phobias: Specific evolved associations or a generalized expectancy bias? Behavioral and Brain Sciences. 1995;18:289–325. [Google Scholar]
  29. Davidson RJ. Seven sins in the study of emotion: Correctives from affective neuroscience. Brain and Cognition. 2003;52:129–132. doi: 10.1016/s0278-2626(03)00015-0. [DOI] [PubMed] [Google Scholar]
  30. De Houwer J, Randell T. Attention to primes modulates affective priming of pronunciation responses. Experimental Psychology. 2002;49:163–170. doi: 10.1026//1618-3169.49.3.163. [DOI] [PubMed] [Google Scholar]
  31. Dehaene S, Naccache L, Cohen L, Bihan D, Mangin J, Poline J, et al. Cerebral mechanisms of word masking and unconscious repetition priming. Nature Neuroscience. 2001;4:752–758. doi: 10.1038/89551. [DOI] [PubMed] [Google Scholar]
  32. DeSteno D, Dasgupta N, Bartlett MY, Cajdric A. Prejudice from thin air: The effect of emotion on automatic intergroup attitudes. Psychological Science. 2004;15:319–324. doi: 10.1111/j.0956-7976.2004.00676.x. [DOI] [PubMed] [Google Scholar]
  33. Dewey J. The ego as cause. Philosophical Review. 1894;3:337–341. [Google Scholar]
  34. Duncan S, Barrett LF. Affect is a form of cognition: A neurobiological analysis. Cognition and Emotion. 2007;21:1184–1211. doi: 10.1080/02699930701437931. [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Duvel A, Smith D, Talk A, Gabriel M. Medial geniculate, amygdalar and cingulate cortical training-induced neuronal activity during discriminative avoidance learning in rabbits with auditory cortical lesions. Journal of Neuroscience. 2001;21:3271–3281. doi: 10.1523/JNEUROSCI.21-09-03271.2001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  36. Elliott R, Dolan RJ. Neural response during preference and memory judgments for subliminally presented stimuli: A functional imaging study. Journal of Neuroscience. 1998;18:4697–4704. doi: 10.1523/JNEUROSCI.18-12-04697.1998. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Erdley C, D’Agostino P. Cognitive and affective components of automatic priming effects. Journal of Personality & Social Psychology. 1988;54:741–747. [Google Scholar]
  38. Erez A, Isen A. The influence of positive affect on the components of expectancy motivation. Journal of Applied Psychology. 2002;87:1055–1067. doi: 10.1037/0021-9010.87.6.1055. [DOI] [PubMed] [Google Scholar]
  39. Erickson K, Schulkin J. Facial expressions of emotion: A cognitive neuroscience perspective. Brain and Cognition. 2003;52:52–60. doi: 10.1016/s0278-2626(03)00008-3. [DOI] [PubMed] [Google Scholar]
  40. Fazio R, Sanbonmatsu D, Powell M, Kardes F. On the automatic activation of attitudes. Journal of Personality and Social Psychology. 1986;50:229–238. doi: 10.1037//0022-3514.50.2.229. [DOI] [PubMed] [Google Scholar]
  41. Ferguson M, Bargh J. The constructive nature of automatic evaluation. In: Musch J, Klauer K, editors. The psychology of evaluation: Affective processes in cognition and emotion. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.; 2003. pp. 169–188. [Google Scholar]
  42. Fiedler K. Affective states trigger processes of assimilation and accommodation. In: Martin L, Clore G, editors. Theories of mood and cognition: A user’s guidebook. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.; 2001. pp. 86–98. [Google Scholar]
  43. Fox E, Russo R, Bowles R, Dutton K. Do threatening stimuli draw or hold visual attention in subclinical anxiety? Journal of Experimental Psychology: General. 2001;130:681–700. [PMC free article] [PubMed] [Google Scholar]
  44. Fredrickson B, Branigan C. Positive emotions broaden the scope of attention and thought-action repertoires. Cognition and Emotion. 2005;19:313–332. doi: 10.1080/02699930441000238. [DOI] [PMC free article] [PubMed] [Google Scholar]
  45. Fukuda M, Ono T, Nakamura K. Functional relations among inferior temporal cortex, amygdala, and lateral hypothalamus in monkey operant feeding behavior. Journal of Neurophysiology. 1987;57:1060–1077. doi: 10.1152/jn.1987.57.4.1060. [DOI] [PubMed] [Google Scholar]
  46. Gasper K. How thought and differences in emotional attention influence the role of affect in processing and judgment: When attempts to be reasonable fail. Dissertation Abstracts International: Section B: the Sciences & Engineering. 2000;60:5834. [Google Scholar]
  47. Gasper K, Clore GL. Attending to the big picture: Mood and global versus local processing of visual information. Psychological Science. 2002;13:34–40. doi: 10.1111/1467-9280.00406. [DOI] [PubMed] [Google Scholar]
  48. Ghashghaei HT, Barbas H. Pathways for emotion: Interactions of prefrontal and anterior temporal pathways in the amygdala of the rhesus monkey. Neuroscience. 2002;115:1261–1279. doi: 10.1016/s0306-4522(02)00446-3. [DOI] [PubMed] [Google Scholar]
  49. Greenwald AG, McGhee DE, Schwartz JLK. Measuring individual differences in implicit cognition: The implicit association test. Journal of Personality and Social Psychology. 1998;74:1464–1480. doi: 10.1037//0022-3514.74.6.1464. [DOI] [PubMed] [Google Scholar]
  50. Greve K, Bauer R. Implicit learning of new faces in prosopagnosia: An application of the mere-exposure paradigm. Neuropsychologia. 1990;28:1035–1041. doi: 10.1016/0028-3932(90)90138-e. [DOI] [PubMed] [Google Scholar]
  51. Halgren E. Emotional neurophysiology of the amygdala within the context of human cognition. In: Aggleton J, editor. The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction. New York: Wiley-Liss; 1992. pp. 191–228. [Google Scholar]
  52. Harris C, Pashler H, Coburn N. Moray revisited: High-priority affective stimuli and visual search. The Quarterly Journal of Experimental Psychology. 2004;57A:1–31. doi: 10.1080/02724980343000107. [DOI] [PubMed] [Google Scholar]
  53. Huntsinger JR, Sinclair S, Dunn E, Clore GL. If it feels good, just do it: Mood shapes conscious and unconscious goal pursuit. USA: University of Virginia; 2006. Unpublished Manuscript. [Google Scholar]
  54. Isbell L. Not all happy people are lazy or stupid: Evidence of systematic processing in happy moods. Journal of Experimental Social Psychology. 2004;40:341–349. [Google Scholar]
  55. Ito T, Cacioppo J. Electrophysiological evidence of implicit and explicit categorization processes. Journal of Experimental Social Psychology. 2001;36:660–676. [Google Scholar]
  56. Iwai E, Yukie M, Watanabe J, Hikosaka K, Suyama H, Ishikawa S. A role of amygdala in visual perception and cognition in Macaque monkeys (macaca fuscata and macaca mulatta) Tohoku Journal of Experimental Medicine. 1990;161:95–120. doi: 10.1620/tjem.161.supplement_95. [DOI] [PubMed] [Google Scholar]
  57. Kanwisher N, McDermott J, Chun M. The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience. 1997;17:4302–4311. doi: 10.1523/JNEUROSCI.17-11-04302.1997. [DOI] [PMC free article] [PubMed] [Google Scholar]
  58. Keightley M, Winocur G, Graham S, Mayberg H, Hevenor S, Grady C. An fMRI study investigating cognitive modulation of brain regions associated with emotional processing of visual stimuli. Neuropsychologia. 2003;41:585–596. doi: 10.1016/s0028-3932(02)00199-9. [DOI] [PubMed] [Google Scholar]
  59. Kemp-Wheeler S, Hill A. Semantic and emotional priming below objective detection threshold. Cognition and Emotion. 1992;6:113–128. [Google Scholar]
  60. Klauer K, Musch J. Goal-dependent and goal-independent effects of irrelevant evaluations. Personality and Social Psychology Bulletin. 2002;28:802–814. [Google Scholar]
  61. Klinger M, Burton P, Pitts S. Mechanisms of unconscious priming: I. Response competition, not spreading activation. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2000;26:441–455. doi: 10.1037//0278-7393.26.2.441. [DOI] [PubMed] [Google Scholar]
  62. Komura Y, Tamura R, Uwano T, Nishijo H, Kaga K, Ono T. Retrospective and prospective coding for predicted reward in the sensory thalamus. Nature. 2001;412:546–549. doi: 10.1038/35087595. [DOI] [PubMed] [Google Scholar]
  63. Kovacs G, Vogels R, Orban G. Cortical correlate of pattern backward masking. Proceedings of the National Academy of Sciences of the USA. 1995;92:5587–5591. doi: 10.1073/pnas.92.12.5587. [DOI] [PMC free article] [PubMed] [Google Scholar]
  64. Kudo M, Glendenning K, Frost S, Masterson R. Origin of mammalian thalamocortical projections. I. Telencephalic projection of the medial geniculate body in the opossum (Didelphis virginiana) Journal of Comparative Neurology. 1986;245:176–197. doi: 10.1002/cne.902450205. [DOI] [PubMed] [Google Scholar]
  65. Lane RD, Nadel L, editors. Cognitive neuroscience of emotion: Series in affective science. New York: Oxford University Press; 2000. [Google Scholar]
  66. Lane R, Chua P, Dolan R. Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia. 1999;37:989–997. doi: 10.1016/s0028-3932(99)00017-2. [DOI] [PubMed] [Google Scholar]
  67. Lavie N, De Fockert J. Contrasting effects of sensory limits and capacity limits in visual selective attention. Perception & Psychophysics. 2003;65:202–212. doi: 10.3758/bf03194795. [DOI] [PubMed] [Google Scholar]
  68. Lazarus R. Vexing research problems inherent in cognitive-mediational theories of emotion and some solutions. Psychological Inquiry. 1995;6:183–196. [Google Scholar]
  69. LeDoux J. The emotional brain: The mysterious underpinnings of emotional life. New York: Simon & Schuster; 1996. [Google Scholar]
  70. LeDoux J. Synaptic self: How our brain becomes who we are. New York: Viking Publishing; 2001. [Google Scholar]
  71. LeDoux J, Romanski L, Xagoraris A. Indelibility of subcortical emotional memories. Journal of Cognitive Neuroscience. 1989;1:238–243. doi: 10.1162/jocn.1989.1.3.238. [DOI] [PubMed] [Google Scholar]
  72. Lehky S. Fine discrimination of faces can be performed rapidly. Journal of Cognitive Neuroscience. 2000;12:848–855. doi: 10.1162/089892900562453. [DOI] [PubMed] [Google Scholar]
  73. Linke R, De Lima A, Schwegler H, Pape H. Direct synaptic connections of axons from superior colliculus with identified thalamo-amygdaloid projection neurons in the rat: Possible substrates of a subcortical visual pathway to the amygdala. Journal of Comparative Neurology. 1999;403:158–170. [PubMed] [Google Scholar]
  74. Logan G. An instance theory of attention and memory. Psychological Review. 2002;109:376–400. doi: 10.1037/0033-295x.109.2.376. [DOI] [PubMed] [Google Scholar]
  75. Lundqvist D, Öhman A. Caught by the evil eye: Nonconscious information processing, emotion, and attention to facial stimuli. In: Barrett LF, Niedenthal P, Winkielman P, editors. Emotion and consciousness. New York: Guilford Press; 2005. pp. 97–122. [Google Scholar]
  76. Mayer J, Salovey P, Caruso D. Emotional intelligence: Theory, findings, and implications. Psychological Inquiry. 2004;15:197–215. [Google Scholar]
  77. McCabe P, McEchron M, Green E, Schneiderman N. Electrolytic and ibotenic acid lesions of the medial subnucleus of the medial geniculate prevent the acquisition of classically conditioned heart rate to a single acoustic stimulus in rabbits. Brain Research. 1993;619:291–298. doi: 10.1016/0006-8993(93)91623-z. [DOI] [PubMed] [Google Scholar]
  78. McGaugh JL. The amygdala modulates the consolidation of memories of emotionally arousing experiences. Annual Reviews in Neuroscience. 2004;27:1–28. doi: 10.1146/annurev.neuro.27.070203.144157. [DOI] [PubMed] [Google Scholar]
  79. Milner A, Goodale M. The visual brain in action. Oxford, UK: Oxford University Press; 1995. [Google Scholar]
  80. Morris J, Öhman A, Dolan R. Conscious and unconscious emotional learning in the human amygdala. Nature. 1998;393:467–470. doi: 10.1038/30976. [DOI] [PubMed] [Google Scholar]
  81. Murphy S, Zajonc R. Affect, cognition, and awareness: Affective priming with optimal and suboptimal stimulus exposure. Journal of Personality and Social Psychology. 1993;64:723–739. doi: 10.1037//0022-3514.64.5.723. [DOI] [PubMed] [Google Scholar]
  82. Narumoto J, Okada T, Sadato N, Fukui K, Yonekura Y. Attention to emotion modulates fMRI activity in human right superior temporal sulcus. Cognitive Brain Research. 2001;12:225–231. doi: 10.1016/s0926-6410(01)00053-2. [DOI] [PubMed] [Google Scholar]
  83. Nicholson D, Freeman J. Lesions of the perirhinal cortex impair sensory preconditioning in rats. Behavioural Brain Research. 2000;112:69–75. doi: 10.1016/s0166-4328(00)00168-6. [DOI] [PubMed] [Google Scholar]
  84. Niedenthal P. Implicit perception of affective information. Journal of Experimental Social Psychology. 1990;26:505–527. [Google Scholar]
  85. Nishijo H, Ono T, Nishino H. Single neuron response in amygdala of alert monkey during complex sensory stimulation with affective significance. Journal of Neuroscience. 1988a;8:3570–3583. doi: 10.1523/JNEUROSCI.08-10-03570.1988. [DOI] [PMC free article] [PubMed] [Google Scholar]
  86. Nishijo H, Ono T, Nishino H. Topographic distribution of modality-specific amygdalar neurons in alert monkey. Journal of Neuroscience. 1988b;8:3556–3569. doi: 10.1523/JNEUROSCI.08-10-03556.1988. [DOI] [PMC free article] [PubMed] [Google Scholar]
  87. Nishijo H, Ono T, Tamura R, Nakamura K. Amygdalar and hippocampal neuron responses related to recognition and memory in monkey. Progress in Brain Research. 1993;95:339–357. doi: 10.1016/s0079-6123(08)60380-5. [DOI] [PubMed] [Google Scholar]
  88. Ochsner K, Gross J. The cognitive control of emotion. Trends in Cognitive Sciences. 2005;9:242–249. doi: 10.1016/j.tics.2005.03.010. [DOI] [PubMed] [Google Scholar]
  89. Öhman A. As fast as the blink of an eye: Evolutionary preparedness for preattentive processing of threat. In: Lang PJ, et al., editors. Attention and orienting: Sensory and motivational processes. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.; 1997. pp. 165–184. [Google Scholar]
  90. Öhman A, Esteves F, Soares J. Preparedness and preattentive associative learning: Electrodermal conditioning to masked stimuli. Journal of Psychophysiology. 1995;9:99–108. [Google Scholar]
  91. Parrott G, Sabini J. On the “emotional” qualities of certain types of cognition: A reply to arguments for the independence of cognition and affect. Cognitive Therapy and Research. 1989;13:49–65. [Google Scholar]
  92. Pashler H, Johnston J, Ruthruff E. Attention and performance. Annual Review of Psychology. 2001;52:629–651. doi: 10.1146/annurev.psych.52.1.629. [DOI] [PubMed] [Google Scholar]
  93. Perea M, Rosa E. Does the proportion of associatively related pairs modulate the associative priming effect at very brief stimulus-onset asynchronies? Acta Psychologica. 2002;110:103–124. doi: 10.1016/s0001-6918(01)00074-9. [DOI] [PubMed] [Google Scholar]
  94. Pessoa L, Kastner S, Ungerleider L. Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research. 2002;15:31–45. doi: 10.1016/s0926-6410(02)00214-8. [DOI] [PubMed] [Google Scholar]
  95. Phelps E. The human amygdala and awareness: Interactions between emotion and cognition. In: Gazzaniga M, editor. The cognitive neurosciences. 3rd ed. Cambridge, MA: MIT Press; 2004. pp. 1005–1015. [Google Scholar]
  96. Proffitt DR. Embodied perception and the economy of action. Perspectives on Psychological Science. 2006;1:110–122. doi: 10.1111/j.1745-6916.2006.00008.x. [DOI] [PubMed] [Google Scholar]
  97. Proffitt D, Stefanucci J, Banton T, Epstein W. The role of effort in perceiving distance. Psychological Science. 2003;14:106–112. doi: 10.1111/1467-9280.t01-1-01427. [DOI] [PubMed] [Google Scholar]
  98. Quirk G, Armony J, LeDoux JE. Fear conditioning enhances different temporal components of tone-evoked spike trains in auditory cortex and lateral amygdala. Neuron. 1997;19:613–624. doi: 10.1016/s0896-6273(00)80375-x. [DOI] [PubMed] [Google Scholar]
  99. Quirk G, Repa J, LeDoux JE. Fear conditioning enhances short-latency auditory responses of lateral amygdala neurons: Parallel recordings in the freely behaving rat. Neuron. 1995;15:1029–1039. doi: 10.1016/0896-6273(95)90092-6. [DOI] [PubMed] [Google Scholar]
  100. Reed J, Squire L, Patalano A, Smith E, Jonides J. Learning about categories that are defined by object-like stimuli despite impaired declarative memory. Behavioral Neuroscience. 1999;113:411–419. doi: 10.1037//0735-7044.113.3.411. [DOI] [PubMed] [Google Scholar]
  101. Riener C, Stefanucci JK, Proffitt D, Clore GL. Mood and the perception of spatial layout; Poster presented at the 44th Annual Meeting of the Psychonomic Society; Vancouver, BC, Canada. 2003. [Google Scholar]
  102. Robinson MD. Running from William James’ bear: A review of preattentive mechanisms and their contributions to emotional experience. Cognition and Emotion. 1998;12:667–696. [Google Scholar]
  103. Roediger HL, Gallo D, Geraci L. Processing approaches to cognition: The impetus from the levels-of-processing framework. Memory. 2002;10:319–332. doi: 10.1080/09658210224000144. [DOI] [PubMed] [Google Scholar]
  104. Rolls ET. Neurophysiological mechanisms underlying face processing within and beyond the temporal cortical visual areas. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences. 1992;335:11–21. doi: 10.1098/rstb.1992.0002. [DOI] [PubMed] [Google Scholar]
  105. Rolls ET. The brain and emotion. Oxford, UK: Oxford University Press; 1999. [Google Scholar]
  106. Rolls ET, Judge S, Sanghera M. Activity of neurons in the inferotemporal cortex of the alert monkey. Brain Research. 1977;130:229–238. doi: 10.1016/0006-8993(77)90272-4. [DOI] [PubMed] [Google Scholar]
  107. Rolls ET, Tovee M. Processing speed in the cerebral cortex and the neurophysiology of visual masking. Proceedings of the Royal Society of London, Series B: Biological Sciences. 1994;257:9–15. doi: 10.1098/rspb.1994.0087. [DOI] [PubMed] [Google Scholar]
  108. Rolls ET, Tovee M, Purcell D, Stewart A, Azzopardi P. The responses of neurons in the temporal cortex of primates, and face identification and detection. Experimental Brain Research. 1994;101:473–484. doi: 10.1007/BF00227340. [DOI] [PubMed] [Google Scholar]
  109. Rotshtein P, Malach R, Hadar U, Graif M, Hendler T. Feeling or features: Different sensitivity to emotion in high-order visual cortex and amygdala. Neuron. 2001;32:747–757. doi: 10.1016/s0896-6273(01)00513-x. [DOI] [PubMed] [Google Scholar]
  110. Schwarz N, Clore GL. Mood, misattribution, and judgments of well-being: Informative and directive functions of affective states. Journal of Personality and Social Psychology. 1983;45:513–523. [Google Scholar]
  111. Shi C, Davis M. Visual pathways involved in fear conditioning measured with fear-potentiated startle: Behavioral and anatomic studies. Journal of Neuroscience. 2001;21:9844–9855. doi: 10.1523/JNEUROSCI.21-24-09844.2001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  112. Shiffrin R, Schneider W. Controlled and automatic human information processing: II. Perceptual learning, automatic attending and a general theory. Psychological Review. 1977;84:127–190. [Google Scholar]
  113. Smith N, Cacioppo J, Larsen J, Chartrand T. May I have your attention, please: Electrocortical responses to positive and negative stimuli. Neuropsychologia. 2003;41:171–183. doi: 10.1016/s0028-3932(02)00147-1. [DOI] [PubMed] [Google Scholar]
  114. Stefanucci JK, Proffitt DR, Clore G. Skating down a steeper slope: The effect of fear on geographical slant perception. Poster presented at the 5th Annual Meeting of the Society for Vision Sciences; Sarasota, FL, USA. 2005. May, [Google Scholar]
  115. Stenberg G, Lindgren M, Johansson M. Semantic processing without conscious identification: Evidence from event-related potentials. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2000;26:973–1004. doi: 10.1037//0278-7393.26.4.973. [DOI] [PubMed] [Google Scholar]
  116. Stolz J, Besner D. On the myth of automatic semantic activation in reading. Current Directions in Psychological Science. 1999;8:61–65. [Google Scholar]
  117. Storbeck J, Clore GL. With sadness come accuracy, with happiness, false memory: Mood and the false memory effect. Psychological Science. 2005;16:785–791. doi: 10.1111/j.1467-9280.2005.01615.x. [DOI] [PubMed] [Google Scholar]
  118. Storbeck J, Clore GL. Turning on and off affective and categorical priming with mood. 2006. Manuscript submitted for publication. [Google Scholar]
  119. Storbeck J, Robinson MD. Preferences and inferences in encoding visual objects: A systematic comparison of semantic and affective priming. Personality and Social Psychology Bulletin. 2004;30:81–93. doi: 10.1177/0146167203258855. [DOI] [PubMed] [Google Scholar]
  120. Storbeck J, Robinson MD, McCourt ME. Semantic processing precedes affect retrieval: The neurological case for cognitive primacy in visual processing. Review of General Psychology. 2006;10:41–55. [Google Scholar]
  121. Storbeck J, Robinson MD, Ram N, Meier B, Clore GL. University of Virginia; [Unpublished raw data] [Google Scholar]
  122. Sugase Y, Yamane S, Ueno S, Kawano K. Global and fine information coded by single neurons in the temporal visual cortex. Nature. 1999;400:869–873. doi: 10.1038/23703. [DOI] [PubMed] [Google Scholar]
  123. Surguladze S, Brammer M, Young A, Andrew C, Travis M, Williams S, et al. A preferential increase in the extrastriate response to signals of danger. NeuroImage. 2003;19:1317–1328. doi: 10.1016/s1053-8119(03)00085-5. [DOI] [PubMed] [Google Scholar]
  124. Thompson R. The role of the cerebral cortex in stimulus generalization. Journal of Comparative and Physiological Psychology. 1962;55:279–287. doi: 10.1037/h0047856. [DOI] [PubMed] [Google Scholar]
  125. Van Rullen R, Thorpe S. The time course of visual processing: From early perception to decision-making. Journal of Cognitive Neuroscience. 2001;13:454–461. doi: 10.1162/08989290152001880. [DOI] [PubMed] [Google Scholar]
  126. Vogels R, Orban G. Coding of stimulus invariances by inferior temporal neurons. Progress in Brain Research. 1996;112:195–211. doi: 10.1016/s0079-6123(08)63330-0. [DOI] [PubMed] [Google Scholar]
  127. Vuilleumier P, Armony J, Driver J, Dolan RJ. Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience. 2003;6:624–631. doi: 10.1038/nn1057. [DOI] [PubMed] [Google Scholar]
  128. Whalen P, Rauch S, Etcoff N, McInerney S, Lee M, Jenike M. Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience. 1998;18:411–418. doi: 10.1523/JNEUROSCI.18-01-00411.1998. [DOI] [PMC free article] [PubMed] [Google Scholar]
  129. White M. Automatic affective appraisal of words. Cognition and Emotion. 1996;10:199–211. [Google Scholar]
  130. Winkielman P, Schwarz N, Fazendeiro T, Reber R. The hedonic marking of processing fluency: Implications for evaluative judgment. In: Musch J, Klauer K, editors. The psychology of evaluation: Affective processes in cognition and emotion. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.; 2003. pp. 189–217. [Google Scholar]
  131. Witt JK, Proffitt DR, Epstein W. Perceiving distance: A role of effort and intent. Perception. 2004;33:577–590. doi: 10.1068/p5090. [DOI] [PubMed] [Google Scholar]
  132. Zajonc R. Attitudinal effects of mere exposure. Journal of Personality and Social Psychology. 1968;9:1–27. [Google Scholar]
  133. Zajonc R. Feeling and thinking: Preferences need no inferences. American Psychologist. 1980;35:151–175. [Google Scholar]
  134. Zajonc R. Feeling and thinking: Closing the debate over the independence of affect. In: Forgas JP, editor. Feeling and thinking: The role of affect in social cognition. Studies in emotion and social interaction. Vol. 2. New York: Cambridge University Press; 2000. pp. 31–58. [Google Scholar]
