Abstract
Moral judgment has typically been characterized as a conflict between emotion and reason. In recent years, a central concern has been determining which process is the chief contributor to moral behavior. While classic moral theorists claimed that moral evaluations stem from consciously controlled cognitive processes, recent research indicates that affective processes may be driving moral behavior. Here, we propose a new way of thinking about emotion within the context of moral judgment, one in which affect is generated and transformed by both automatic and controlled processes, and moral evaluations are shifted accordingly. We begin with a review of how existing theories in psychology and neuroscience address the interaction between emotion and cognition, and how these theories may inform the study of moral judgment. We then describe how brain regions involved in both affective processing and moral judgment overlap and may make distinct contributions to the moral evaluation process. Finally, we discuss how this way of thinking about emotion can be reconciled with current theories in moral psychology before mapping out future directions in the study of moral behavior.
Keywords: moral judgment, emotion regulation
“So far, about morals, I know only that what is moral is what you feel good after and what is immoral is what you feel bad after.”
-Ernest Hemingway, Death in the Afternoon
In “Death in the Afternoon”, Hemingway takes on bullfighting, a morally ambiguous practice that elicits strong opinions and emotions. For some, the act is the cruel torture of an innocent animal; for others—including Hemingway himself—bullfighting is an art form, a dance between two strong opponents and a proud symbol of Spanish culture. This type of moral dichotomy extends far outside the bullfighting ring, and is perhaps even as close as one’s kitchen table—the same juicy steak may look like a delicious meal to one person and the brutal murder of an innocent animal to another. From abortion to cigarette smoking, moral opinion can vary wildly, and is infused with strong emotion and strong conviction at every turn. Certain acts and behaviors become imbued with moral meaning depending on context, culture, the emotions that the act evokes, and how an individual is able to reason about its causes and consequences. This process depends upon the interaction of cognition and emotion: how we think about something is coupled with how we feel about it. Thus, the study of moral decision-making may be particularly informed by adopting a perspective that emphasizes interaction over separation (Helion & Pizarro, 2015).
Traditional models in psychology approach this interaction of feeling and thinking, or cognition and emotion, much the way the matador approaches the bull: emotion is strong and willful, and must be tamed, weakened, and sometimes silenced by steadier, smarter, and more controlled reason (Frank, 1988). Below, we will instead suggest that the relationship between emotion and cognition is not a fight so much as a smooth dance between equal and inseparable partners—when one shifts the other moves accordingly, and though one may take the lead, it takes two to tango. While social psychology and cognitive neuroscience have different models of how emotion and cognition interact, most agree that the two can be thought of as complementary—indeed, the majority of current neuroscience models focus on how controlled “cognitive” processes can shape and guide affect, whereas many social psychological models focus on how affective inputs shape and modify cognition (Ochsner, Silvers, & Buhle, 2012; Ochsner & Feldman-Barrett, 2001; Keltner & Lerner, 2010). Because the moral domain couples particularly strong top-down cognitions (e.g. goals, motivations, and ideals) with powerful bottom-up emotional processes, we believe it may be an ideal area in which to gain a better understanding of the cognition-emotion interaction.
The goal of this paper is to present a new framework in which to think about affective and cognitive processes in moral judgment, one that views the role of emotion in moral behavior as both automatic and controlled, and that takes into account perspectives from extant research in social psychology and social cognitive and affective neuroscience. We will first review how existing theories in psychology and neuroscience address the interaction between emotion and cognition, and how these theories may inform the interpretation of prior research on moral judgment. In the second section, we suggest that, taken together, these ideas form a model of moral judgment in which the relationship between emotion and cognition is bidirectional—emotional processes motivate different types of cognitions, and cognitions rein in different emotions. In the third section, we describe how different brain regions may contribute to different aspects of controlled and automatic emotion processes, and how these processes may inform moral behavior and moral judgment. Finally, we will map out how future research in moral judgment and moral behavior can begin to incorporate this way of thinking about the emotion-cognition interaction, and what new insights may be gained by doing so. We suggest that by studying moral judgment from the multi-level perspective used in the study of emotion regulation (Ochsner, Silvers, & Buhle, 2012), taking into account behavioral phenomena (i.e. assessments of wrongness, blame, and emotional responses) alongside the neural regions that contribute to moral judgment, we may be able to gain new insights into the nature of the cognitive and affective processes that underlie moral decision-making.
A brief history of moral psychology
For decades, psychologists viewed judgment and decision-making as solely the product of the cognition side of a see-saw equation, where rational thought ruled and there was a direct link between knowledge and behavior, such that one knows both what one does and why one is doing it (Loewenstein & Lerner, 2003). This strong emphasis on reason was present in moral research as well, with early theorists claiming that moral judgment is the product of consciously applying learned rules in order to resolve moral dilemmas (Piaget, 1932; Kohlberg, 1963). They believed that as children aged and their mental abilities developed, they were able to engage in role-taking and mentalizing, which enabled moral maturation (Greene & Haidt, 2002).
In experiments, these cognition-focused models of moral judgment tended to rely on dilemmas that pit two moral principles against one another, such as the famous trolley dilemma, wherein individuals must decide whether it is appropriate to kill one person by pushing them into the path of an oncoming trolley in order to save five others (Thomson, 1976; Greene, Sommerville, Nystrom, Darley, & Cohen, 2001). These dilemmas are often presented from a first-person perspective and ask participants to evaluate them in terms of higher-level concepts that reflect rule application (evaluating moral permissibility requires contrasting an action with a moral rule). Reason’s reign as the primary contributor to moral judgment came to an abrupt end following Haidt’s influential (2001) paper, which claimed that moral judgments, rather than being the result of a complicated calculus between moral rules and moral outcomes, are made quickly and effortlessly and are the products of affective intuitions. If moral reasoning occurs at all, it is usually as a post-hoc explanation used to persuade others. This line of argument was again consistent with see-saw models of cognition-emotion interactions (cf. Ochsner, 2014), but this time the balance of power was shifted away from cognition and back to emotion.
This shift toward emotional primacy has been accompanied by an explosion of moral psychological research within the field of social psychology, with research exploring the philosophical (Knobe, 2003), neural (Greene et al., 2001), and affective (Schnall, Haidt, Clore, & Jordan, 2008) bases of moral judgment. Many of these affect-focused models claim that being able to generate a narrative for one’s decisions or preferences does not require direct access to or a complete understanding of their causes (Nisbett & Wilson, 1977; Haidt, 2001). Instead, moral judgments are made much like aesthetic judgments – quickly, effortlessly, and driven by affective intuitions (Greene & Haidt, 2002). This type of moral research tends to rely on scenarios that elicit moral reactions, or emotional responses to the behavior of others (Haidt, 2001; Schnall, Haidt, Clore, & Jordan, 2008), where these behaviors are often novel or unusual, such as incest or bestiality (from Haidt, 2001).
While others have claimed that the difference between different moral judgments lies in the engagement of emotion (Greene et al., 2001), we are making a different argument: emotion is engaged in both cases, but what varies is the extent to which the emotion is controlled. When we view these two types of moral judgments (those seemingly based on cognition vs. those seemingly based on emotion) within a framework that views emotion as resulting from the operation of both automatic and controlled processes, a new picture begins to emerge. Consider, for example, that individuals often use emotion as an informational source when constructing preferences and making evaluations, particularly when the impact of feelings increases the perception of their relevance, and when other informational inputs are scarce (Simon, 1967; Schwarz, 2012). In situations where no one is morally harmed, but individuals are asked to make an evaluation of moral wrongness (e.g. the classic Haidt scenario wherein two siblings make love and claim that it brings them closer together), individuals may be more likely to use their affective response (i.e. disgust) as an informational input. These types of dilemmas may trigger automatic emotional responses, which individuals then seamlessly use as a proxy for their moral evaluations. In contrast, when individuals are presented with dilemmas that elicit conflict between two competing goals (e.g. the trolley problem), they may engage in more controlled emotional processing, leading to a conclusion that looks more like the product of reason than of emotion (Monin, Pizarro, & Beer, 2007).
In line with this argument, more recent models of moral behavior have claimed that current dual-process accounts of morality are unable to capture what is actually occurring when individuals make moral decisions (Cushman, 2013; Crockett, 2013; Huebner, 2015). Some of these newer approaches rely heavily on models of reinforcement learning, and focus on the distinction between a model-based learning process, wherein individuals represent the outcomes associated with different courses of action and select the ones that maximize outcomes across multiple decisions, and a model-free learning process, which caches the value associated with an action itself and guides decisions toward actions with the highest cached value (Daw, Niv, & Dayan, 2005). For example, Crockett (2013) posited that deontological judgments are the result of a model-free learning system, wherein individuals make decisions based on what has been reinforced in the past. In contrast, utilitarian judgments are the product of a model-based learning system, which maximizes outcomes via a computationally dense decision tree (“If I do Y, then X will occur; if X occurs, then Z will happen…”). In addition to the model-based and model-free systems, this account includes a Pavlovian system, which “prunes” the model-based decision tree when an aversive outcome is encountered (e.g. pushing a man to his death). Though these new models represent a promising step forward in laying out the cognitive mechanics that underlie moral decision-making—and in particular treat emotion and cognition as unified rather than falsely distinct—we think that they remain limited when it comes to characterizing the degree to which emotion arises from, and can be altered by, both reflexive and controlled cognitive processes. Further, they fail to adequately characterize the role of emotion within moral judgment (though, for an argument that attempting even to define the role of emotion within moral judgment is likely a fruitless task, see Huebner, Dwyer, & Hauser, 2009).
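The contrast between these three systems can be made concrete with a toy sketch. The code below is our own illustration, not the published model: the decision tree, the numerical values, and the simplification that a pruned branch is dropped from the choice set entirely are all assumptions made for exposition.

```python
# Toy contrast of model-free vs. model-based valuation, with Pavlovian
# "pruning" in the spirit of Crockett (2013) and Daw, Niv, & Dayan (2005).
# All values and structures here are illustrative assumptions.

AVERSION_THRESHOLD = -50  # outcomes at or below this trigger Pavlovian pruning

# Model-free: actions carry cached values learned from past reinforcement;
# "pushing people" has been punished before, so its cached value is low.
cached_values = {"push": -80, "do_nothing": -10}

def model_free_choice(values):
    # Simply pick the action with the highest cached value.
    return max(values, key=values.get)

# Model-based: each action maps to (immediate outcome value, {label: subtree}).
tree = {
    "push":       (-100, {"five_saved": (300, {})}),
    "do_nothing": (0,    {"five_die":  (-250, {})}),
}

def evaluate(node):
    # Recursively sum the best path through the decision tree.
    value, children = node
    return value + max((evaluate(c) for c in children.values()), default=0)

def model_based_choice(tree, prune=False):
    options = {}
    for action, (value, children) in tree.items():
        if prune and value <= AVERSION_THRESHOLD:
            continue  # Pavlovian system cuts the aversive branch before search
        options[action] = evaluate((value, children))
    return max(options, key=options.get)

# A full model-based search finds that pushing maximizes outcomes
# (-100 + 300 = 200), yielding the utilitarian choice; with pruning, the
# aversive "push" branch is never searched, mimicking a deontological refusal.
```

Under these assumptions, the model-free system and the pruned model-based system both reject pushing, while only the full model-based search endorses it, which is the qualitative pattern the account is meant to capture.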
Automatic and Controlled Emotions
Emotions arise from the identification of a goal-relevant stimulus or situation and the activation of associated behavioral and physiological changes that prepare an individual for action, both of which depend on contextual, individual, and cultural factors (Gross & Thompson, 2007; Ochsner & Gross, 2014). Though a great deal of research has historically operated from the perspective that emotions are only reflexive or automatic, and that cognitive control is involved in emotion only insofar as it is used to stop or block emotions from happening, current research suggests that emotions result as much from controlled processes as from automatic ones, and that control processes play roles in all kinds of affective processes (Ochsner & Feldman-Barrett, 2001; Ochsner, 2014).
On the automatic side, emotional processes can be rapid and reflexive, and surely underlie a great deal of human behavior. Automatic emotional processes play an important role in a number of social and cognitive behaviors, including threat and reward detection, person-perception, the formation and expression of attitudes and evaluations, and moral judgment (Lazarus, 1991; Todorov & Uleman 2003; Bargh, Chaiken, Govender, & Pratto, 1992; Haidt, 2001). Affective responses can be quickly and non-consciously tied to representations of individuals and their actions and can act as cues to pay attention to certain features of an evaluative target (Todorov & Uleman, 2004; Knutson & Cooper, 2005).
On the controlled side, higher cognitive processes can also play key roles in emotion. In general, cognitive control refers to processes involved in the effortful, goal-driven guidance of all manner of behaviors, ranging from attention or memory retrieval, to language, perception and actions of all sorts (Ochsner & Gross, 2005; LaBar & Cabeza, 2006). In the context of emotion, controlled processes influence attention to and elaboration of the meaning of a stimulus (Ochsner et al., 2009), as well as the monitoring and reporting of emotion (Satpute, Shu, Weber, Roy, & Ochsner, 2013), all of which can change how the emotional value of a stimulus is construed, how we categorize and perceive our affective states, and how they will inform our future behavior (Moors, Ellsworth, Scherer, & Frijda, 2013).
Together, controlled and automatic processes both create and change affect, and the meaning derived from a situation and subsequent emotional responses are the product of both of these processes working in tandem. These automatic and controlled processes are bidirectional, and can interact in a number of ways to produce distinct affective experiences and evaluations. For example, while we often think of emotional stimuli as the sole producer of affective states (e.g. facing a charging bull), cognitive processes can also give rise to equally powerful affective experiences in the absence of environmental emotional stimuli (e.g. imagining facing a charging bull) (LeDoux, 2002). Taken together, this suggests that emotions can be both immediately felt and consciously constructed, and are the result of automatic and controlled processes that both play a role in generating and transforming an individual’s affective state (Clore & Huntsinger, 2007).
Controlled cognitive processes shape and change affect
It seems likely that one of the contributing factors to forming and acting upon moral judgments is the ability to up- and down-regulate emotion depending on the context of the moral situation and the cognitive and motivational resources that are available to the individual at the time of evaluation (Ochsner et al., 2004; Ray et al., 2005). To revisit the trolley paradigm used by Greene and colleagues (2001), asking participants to imagine themselves killing another person by pushing them onto the trolley tracks may lead to cognitive and behavioral processes consistent with the cognitive up-regulation of emotion (e.g. picturing oneself in the situation, imagining the person one is hurting). This up-regulation may lead an individual to claim that the act of pushing the person is wrong. In contrast, if the same person were given the trolley dilemma, but instead asked to imagine pushing a button in a distant control booth in order to alter the trolley’s path such that it will hit a person, this may engender processes consistent with the cognitive down-regulation of emotion. This down-regulation may cause individuals to endorse a utilitarian viewpoint, and view killing one person to save five as less morally blameworthy. Thus, considering emotions as evolving out of both automatic and controlled processes opens the possibility of reinterpreting old moral judgments. If we grant that emotion generation can be the product of controlled processes, a utilitarian decision could be seen as indicative of the presence of emotion rather than its absence. After all, saving five by sacrificing one involves making an affective prediction that saving five will, in fact, feel better than not doing so. Or, to reconsider the classic Heinz dilemma, choosing to steal in order to save one’s wife (the utilitarian decision) is a choice inarguably informed by emotion.
Aligning emotion with moral beliefs
To extend this further, it seems possible to reframe moral behavior as the controlled regulation of emotion so that one acts or makes evaluations that are in line with one’s pre-existing ideals and goals. For example, in one study, participants were split into high-utilitarian and low-utilitarian groups based on the frequency of utilitarian responses, and made moral judgments with and without a concurrent cognitive load (Greene, 2009). While both groups showed an effect of load, among high-utilitarian participants, utilitarian judgments were faster than non-utilitarian judgments in the absence of load. While this is consistent with a dual-process view of moral judgment, it could also be consistent with different regulation and appraisal styles, such that some individuals may be more sensitive to different features of the same situation. In support of this assertion, recent research suggests that when presented with trolley-type dilemmas, individuals are sensitive to variations in probability (how likely people are to be saved) and magnitude (how many people will be saved) when calculating an expected value of a moral judgment (Shenhav & Greene, 2010). While on the surface this looks like a wholly “reasoned” process, it is also consistent with a regulatory account. Different construals of probability and magnitude, and individual differences in sensitivity to these factors (e.g. a 30% chance of saving five may be weighed very differently depending on an individual’s risk sensitivity), may be linked to the intensity of a response. Making a utilitarian judgment may reflect using higher cognitive processes to accumulate meanings that layer on top of, and shape the meaning of, responses cued up by more automatic processes that generate an initial affective response. This may take more time, but may not necessarily reflect a qualitatively different combination of processes, compared to those that lead to deontological judgments (Cunningham & Zelazo, 2007). Thus, the likelihood of making a utilitarian or deontological judgment may be rooted in the regulation (and not the overriding) of affect.
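The probability-and-magnitude computation described above can be sketched in a few lines. This is an illustrative toy, not the analysis from Shenhav & Greene (2010): the `risk_sensitivity` exponent is our own assumed device for showing how the same objective dilemma can be construed differently across individuals.

```python
# Illustrative expected-value computation over a moral dilemma: value is a
# function of probability and magnitude of lives saved. The risk_sensitivity
# parameter is an assumption of ours, not a published model term.

def expected_moral_value(p_saved, n_saved, risk_sensitivity=1.0):
    # risk_sensitivity > 1 underweights uncertain rescues,
    # < 1 overweights them; 1.0 is the risk-neutral expected value.
    return (p_saved ** risk_sensitivity) * n_saved

# The same 30% chance of saving five people, construed by two individuals:
neutral  = expected_moral_value(0.3, 5)                        # 0.3 * 5 = 1.5
cautious = expected_moral_value(0.3, 5, risk_sensitivity=2.0)  # ~0.09 * 5
```

The point of the sketch is that two agents facing identical probabilities and magnitudes can arrive at very different subjective values, which on this account would feed into affective intensity rather than bypass it.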
Individuals may up- or down-regulate their automatic emotional responses to moral stimuli in a way that encourages goal-consistent behavior. For example, individuals may down-regulate their disgust when evaluating dilemmas in which disgusting acts occurred but no one was harmed, or they may up-regulate anger when engaging in punishment or assigning blame. To observe this effect in the wild, one need go no further than the modern political arena. Someone who is politically liberal may be as disgusted by the thought of two men kissing as someone who is politically conservative, but may choose to down-regulate their response so that it is more in line with their political views (Feinberg et al., 2014). They can do this in multiple ways, including reframing the situation as one about equality and fairness, construing the act as one of love and affection, or manipulating personal relevance by thinking about homosexual individuals whom they know. This affective transformation would rely on controlled emotional processes that shape the initial automatically elicited emotion (disgust) into a very different emotion (tolerance or acceptance). This process requires motivation, recognition (conscious or non-conscious) that one is experiencing an emotion that is in conflict with one’s goals and ideals, and a reconstruction of the situation and one’s emotions in order to come to a moral resolution. Comparatively, political conservatives may be less motivated to do so, and may instead up-regulate their disgust response so that their moral judgment is in line with their overarching goals. In contrast, the opposite regulatory pattern may occur (such that liberals up-regulate emotion and conservatives down-regulate emotion) when considering issues like the death penalty or gun control.
Affect Changes and Shapes Controlled Processes
Whether it is the dilemma posed by a rapidly approaching trolley or the moral wrongness of consensual sibling incest, it is clear that the types of moral scenarios presented to participants tend to be novel, atypical, and complex—three factors that have been shown to increase the use of emotion as a source of information (Simon, 1967; Forgas, 2002; Schwarz, 2012). Individuals may differ both in the initial intensity of their emotional response and in the extent to which they think their emotions are a valuable input to the formation of their moral judgments. For example, for some individuals who are particularly disgusted by homosexuality, the strength of the initial affective response may bias downstream reasoning processes and impair their ability to reappraise the situation, resulting in a stronger affective bias—they may end up justifying their reaction without attempting to or being able to down-regulate it. In contrast, an individual who makes moral evaluations based solely on considerations of harm may decide that even though they may be initially disgusted, the act is not morally wrong. And finally, a third individual may initially think that it was disgusting and wrong, but may later change his or her mind after further consideration. Thus, three different individuals might experience disgust about the same situation, but the degree to which this emotion will impact their judgments might vary substantially. This type of moral evaluation process is in line with our proposed model, and helps to explain how different individuals may come to dramatically different conclusions even if they began with the same automatic affective state.
Aligning moral beliefs with emotion
Inter-individual differences in moral sensitivity may also be rooted in emotional experiences. Individual differences in disgust sensitivity have been linked to increased moral severity (Schnall, Haidt, Clore, & Jordan, 2008; Pond et al., 2012; Pizarro, Inbar, & Helion, 2011), political conservatism (Inbar, Pizarro, & Bloom, 2009), and negative attitudes towards homosexuality (Inbar, Pizarro, Knobe, & Bloom, 2009). In addition, individual differences at the controlled level of emotion processing have also been linked to affective influences on moral judgment. In one study, Van Dillen and colleagues (2012) demonstrated that disgust’s influence on moral judgment was modulated by individual differences in attentional control, or how able individuals are to disengage their attention from specific stimuli in their environments. Taken together, this suggests that the interaction between affective and cognitive processes varies across time, across individuals, and is strongly influenced by preexisting conditions.
This parallels research findings within the stress and coping literature—the same traumatic experience can engender different cognitive and affective consequences across different individuals in both the short- and long-term (Bonanno, Galea, Bucciarelli, & Vlahov, 2007). Part of this may be due to differences in situation selection and the types of coping strategies used to help an individual reframe an emotionally evocative event (Mehl & Pennebaker, 2003). That said, the majority of studies that have focused on emotion within moral behavior have done so looking at emotion as an environmental constant or as a trait-like factor, rather than as a variable that changes throughout the formation of a moral evaluation. Viewing emotion this way may give new insight into the moral decision-making process. For example, an individual who responds with anger when hearing about two men having sex or getting married may act very differently than an individual who feels disgust when presented with the same situation. These qualitatively different emotions may promote two distinct appraisals (“that’s wrong!” vs. “that’s gross!”) and divergent action tendencies (approach vs. avoid).
Neural systems involved in shaping and changing affect and in moral judgment
We know a great deal about the factors that influence whether individuals make seemingly cognition- or emotion- driven moral judgments (Greene & Haidt, 2002). However, determining whether these judgments are reflective of an integrative process whereby both top-down control regions and bottom-up automatic processes combine to generate and control emotion requires examining moral judgment as it occurs on different levels (i.e. behavioral and brain). Prior neuroscience research has demonstrated that making moral judgments within separate moral domains (i.e. harm, dishonesty, disgust) largely relies on different, non-overlapping brain regions (Parkinson et al., 2011), echoing assertions that moral judgment is complex, and involves many underlying processes (Young & Dungan, 2012). By taking a multi-level approach, we may be able to gain a better understanding of the processes that underlie both generating and regulating the emotions involved in moral judgment, and consequently, develop a more nuanced—and perhaps more accurate—view of emotion’s role in human morality. Below, we demonstrate that there is a striking overlap between the regions involved in both automatic and controlled emotional processes and those implicated in moral judgment (Table 1).
Table 1.
Neural Systems Involved in Shaping and Changing Affect
| Region | Role in Shaping and Changing Affect | Role in Moral Judgment |
|---|---|---|
| Amygdala | Detecting and responding to affective stimuli (Davis & Whalen, 2001; Sander, Grafman, & Zalla, 2003) | Correlated with implicit moral attitude strength (Luo et al., 2006) |
| | Processing goal relevance (Cunningham & Brosch, 2012) | Implicated in evaluation of one’s moral transgressions (Berthoz et al., 2006) |
| | Affective learning (Nader, Schafe, & LeDoux, 2000) | Involved in deciding punishment for immoral behavior (Treadway et al., 2014) |
| Insula | Integration of body states (Craig, 2009; Zaki, Davis, & Ochsner, 2012) | Involved in evaluations of fairness and cooperative behavior (Sanfey et al., 2003) |
| | Identifying emotional expressions that convey information about body states (e.g. disgust) (Calder et al., 2000) | Expression and experience of disgust (Chapman & Anderson, 2012) |
| | Empathic responding (Lamm, Batson, & Decety, 2007) | |
| | Evaluating risk and avoiding harm (Paulus et al., 2003) | |
| | Disgust experience and recognition (Calder, Lawrence, & Young, 2000) | |
| Anterior cingulate cortex | Self-monitoring (van Veen et al., 2001) | Consequentialist decision-making (Greene et al., 2004) |
| | Emotion regulation (Ochsner, Silvers, & Buhle, 2012) | |
| Dorsomedial prefrontal cortex | Drawing inferences about the mental states and traits of individuals (Amodio, 2014) | Involved in making moral judgments across multiple domains (e.g. disgusting, harmful, and dishonest actions) (Parkinson et al., 2011) |
| | Reflecting on and describing feelings related to and intentions behind actions (Spunt, Satpute, & Lieberman, 2011; Young & Saxe, 2008) | |
| Ventromedial prefrontal cortex | Integrating multiple streams of information from the amygdala, ventral striatum, and dorsal and lateral prefrontal regions (Cunningham, Johnsen, & Waggoner, 2011) | Anticipating and regulating emotional responses when making personally relevant moral judgments (Cushman, 2014; Greene et al., 2001; 2004) |
| | Providing an index of a stimulus’s present value (Ochsner & Gross, 2014) | Linking intentions with moral behavior (Young et al., 2010) |
| Dorsolateral prefrontal cortex | Controlling the focus of selective attention (Wendelken, Bunge, & Carter, 2008) | Endorsement of utilitarian judgments (Shenhav & Greene, 2010) |
| | Retrieving semantic and episodic information from memory (Browning et al., 2010) | Judgments involving harm (Parkinson et al., 2011) |
| | Selecting context-appropriate and inhibiting context-inappropriate responses (Ridderinkhof, Ullsperger, Crone, & Nieuwenhuis, 2004) | |
| Ventrolateral prefrontal cortex | Resolving cognitive dissonance (Jarcho, Berkman, & Lieberman, 2011) | Acceptance of unfair offers (Rilling & Sanfey, 2009) |
| | Regulating amygdala activity (Ochsner, Silvers, & Buhle, 2012) | |
| Orbitofrontal cortex | Integration of affective and motivational information (Kringelbach, 2005) | Endorsement of utilitarian tradeoffs (Shenhav & Greene, 2010) |
| | Implicit self-monitoring of behavior (Beer, John, Scabini, & Knight, 2006) | |
A key idea is that the regions enumerated above work together as individuals gain explicit awareness of their emotional states—the amygdala and anterior insula respond to the intensity of the affective stimulus and direct attention to the elicitors of the emotion and the body states they elicit, respectively; the dmPFC supports making attributions about the nature of those feelings; and the vlPFC helps select appropriate labels for describing the emotional response verbally (Satpute et al., 2013; Lieberman et al., 2007). This suggests that a constellation of brain regions contributes to the generation of emotions that can be both automatic and controlled, that are shaped by specific situations, and that differ across individuals based on preexisting evaluations, goals and beliefs, and sensitivity to specific stimuli.
Within research on moral judgment, the concerted action of, and communication between, these regions has been shown to influence moral decision-making and moral behavior. One example comes from a study by Decety and colleagues (2012), who examined the development of moral sensitivity, and found that the interaction and connectivity of many of the regions enumerated above change across development, reflecting both neural and socioemotional maturation. In this study, participants across a wide age range (7–40) viewed a series of video clips that portrayed intentional and accidental physical harm. The researchers found that observing unintentional harm was associated with increased activation in the ACC and anterior insula, along with other regions involved in experiencing pain. In contrast, observing intentional harm was associated with increased activation in the mPFC, the posterior superior temporal sulcus (STS) and the OFC. The researchers claimed that these differences in activation patterns are due to the integration of mental states (intention) when interpreting affective information. Perhaps most intriguingly, the researchers found two significant age-related changes: 1) a posterior-to-anterior progression of activation in the insula across age, accompanied by greater signal change in prefrontal control regions, and 2) a medial-to-lateral activation shift in the OFC when observing intentional harms, possibly reflecting a shift from relatively automatic somatosensory responses to observing pain in childhood to more controlled emotional responses in adolescence and adulthood. These results suggest that while affective inputs make a large contribution to both understanding and observing moral situations, affect is both reinterpreted and regulated based on high-level inferences and developmental maturation.
In addition, previous research has linked activation in the OFC and amygdala to the formation and expression of implicit moral attitudes (Luo et al., 2006). Some have even suggested that the callous and unemotional tendencies seen in psychopathy may be due to dysfunction in the vmPFC and amygdala, resulting in a reinforcement-learning problem (Blair, 2007). In this population, harmful and immoral acts might not be associated with aversive reinforcement (i.e., the distress of the victim) and thus are not tagged as negative behaviors. Instead, immoral and harmful behavior may be seen as instrumental in achieving goals. This may result in a failure to exhibit normal reinforcement learning: psychopaths may have affective reactions, but may not update their moral representations accordingly. Moral reasoning and learning moral rules thus appear to require the coordination of multiple cognitive processes and brain areas, including those involved in both automatic and controlled emotional processing.
Future Directions
In the sections above, we suggest that the affective and controlled cognitive processes involved in moral judgment are complementary, and that their study can be informed by social cognitive neuroscience, which offers models of the various ways in which controlled cognitive processes interact with affective processes. The use of neuroimaging methods can give moral researchers a better understanding of the overlapping processes involved in moral judgment and emotion regulation. We believe that neuroscience is particularly well suited to determine the role of emotion generation and regulation in moral behavior. The majority of moral research within cognitive neuroscience has focused either on passive emotional experiences (Moll et al., 2002), on artificial moral dilemmas (Greene et al., 2004), or on the distinction between moral and nonmoral evaluations (Berthoz et al., 2006). While these studies have been useful in laying a foundation on which to build, there remains a paucity of naturalistic studies that take into account how factors such as motivated reasoning, emotion generation, controlled cognitive appraisals, and regulatory strategies may influence complex moral decision-making (Teper, Tullett, Page-Gould, & Inzlicht, 2015).
Regulating Moral Emotion
Future research should examine how dorsolateral and posterior prefrontal regions are involved in directing attention to specific features of a moral situation, and the role that these regions may play in modulating amygdala and insula responses to affectively charged moral stimuli. The dmPFC, which is involved in attributing mental states, has been shown to guide decisions about intentionality and attributions within moral contexts (Young & Saxe, 2009). As this region is also involved in the amplification of emotional responses (Ochsner, Silvers, & Buhle, 2012), it may play an important role in increasing affective responses to moral events.
When regulating emotion via cognitive reappraisal, there are different tactics that one might use. Two of the most common are reinterpretation (reevaluating the affective stimulus in a way that makes it more or less unsettling) and psychological distancing (altering one's psychological or physical distance from the affective stimulus) (Ochsner & Gross, 2014). Both tactics are effective, but they involve different brain regions: reinterpretation relies more on ventrolateral prefrontal regions, whereas distancing is linked to increased activation in parietal regions associated with spatial representation (Ochsner, Silvers, & Buhle, 2012). While studies of emotion regulation have focused primarily on reinterpretation rather than distancing, research on moral judgment has done the opposite, focusing largely on psychological distancing rather than reinterpretation (Greene et al., 2001; 2004; Cushman, Young, & Hauser, 2006; Harenski & Hamann, 2006). Examining the use of these two regulatory tactics within the context of moral judgment may yield surprising results. For example, psychological distancing may dampen an emotional experience (as seen in variants of the trolley dilemma), whereas reinterpretation may lead to different appraisals and qualitatively different emotional experiences. If individuals are motivated to reinterpret situations so as to bring them more in line with their goals, we might expect to see more activation in the vlPFC, a region involved in the selection of goal-appropriate responses and the retrieval of semantic information that may generate a new reappraisal and/or alter the first one (Ochsner, Silvers, & Buhle, 2012).
Focusing on everyday moral decision-making
Individuals make moral decisions all of the time; recent research has shown that acting morally and immorally are frequent parts of everyday life (Hofmann, Wisneski, Brandt, & Skitka, 2014). However, these types of everyday, first-person moral judgments remain relatively understudied. Prior research has shown that moral behaviors occur on at least three social dimensions: 1) first-person, such as making personal moral decisions, evaluating one’s own moral judgments, and experiencing emotions as a consequence of one’s own behaviors (e.g., guilt, regret, sadness, pain); 2) third-person, such as evaluating the “rightness” and “wrongness” of others’ moral actions (Haidt, 2001) and experiencing emotions as a consequence of others’ behaviors (e.g., condemnation/contempt, anger, pain, disgust) (Gray & Wegner, 2009); and 3) group-level, such as adhering to social norms and exhibiting disgust/contempt for out-group members and pride/loyalty for in-group members (Graham, Haidt, & Nosek, 2009; Fiske, Cuddy, Glick, & Xu, 2002; Cikara, Farnsworth, Harris, & Fiske, 2010).
While social psychologists have done an excellent job of looking at the latter two dimensions of moral behavior (third-person and group-level), we believe that neuroimaging is uniquely suited to addressing first-person moral decision-making. Though individuals may come to the same moral conclusions when evaluating their own and others’ moral behaviors (e.g. stealing is wrong regardless of who does it), the process by which they get there may be markedly different—for example, one might expect to see more neural and behavioral indications of conflict when considering one’s own moral transgressions than when considering those of others. The differences in the early and late onset vmPFC patients mentioned above (Young et al., 2010) suggest that this may be the case, and that this region may play an important role in the weighting of self-interest in moral decision-making. Future work should further examine these types of moral questions.
How do moral intentions become moral behavior?
Looking at more naturalistic first-person moral dilemmas will allow researchers to focus on how preexisting goals and motivations influence moral behavior. These factors may play an important role in clarifying the interplay between affective and cognitive processes in moral judgment. The majority of previously studied first-person dilemmas involve variations on the trolley dilemma or other hypothetical moral scenarios that do not adequately reflect the types of moral decisions that individuals make frequently (such as cheating on tests, taxes, or romantic partners). Gaining better insight into the affective and cognitive processes that underlie real-world moral decision-making would be useful for understanding how individuals make moral predictions about their own behavior, and why those predictions sometimes fail; for example, most individuals do not intend to cheat on their taxes, but many end up doing so.
The hot-cold empathy gap (Loewenstein, 2000) suggests that human beings are unable to accurately imagine motivational and emotional states that they are not currently experiencing, and that they often fail to take into account visceral influences (e.g. hunger, arousal) on future decisions. Failure to take into account the strong pull of visceral states also leads individuals in “cold” states to stigmatize impulsive behavior (Nordgren, van der Pligt, & van Harreveld, 2007). This may partially account for discrepancies between individuals’ moral intentions and their immoral behaviors. By examining the affective processes underlying everyday moral decision-making, we may be able to gain a better understanding of this gap, and possibly how to close it.
Examining moral behavior across the lifespan
While a great deal of moral research has focused on how children learn moral rules and make moral decisions, comparatively little has examined how moral behavior changes throughout the lifespan. Viewing the same behavior through the lens of a parent versus that of a teenager could shift the moral meaning of the same situation (e.g., violating curfew). Indeed, changes in social roles such as parenthood have been shown to affect how people construe what is moral and immoral. In one study, Eibach, Libby, and Ehrlinger (2009) found that parents who were primed with their parental role prior to making moral judgments rated harmless but offensive acts (e.g., someone surgically adding horns to their head) as more immoral than did nonparents and unprimed parents. They argue that one’s social roles and concerns can shape moral decision-making. The roles that individuals assume will change a great deal over the course of their lives, and may change how they view certain moral actions. These developmental differences may also occur at the automatic level of emotion processing, as emotionality may change as a function of age (Levenson, Carstensen, Friesen, & Ekman, 1991). As brain regions develop and change over time, their contributions to moral behavior may shift accordingly (Decety, Michalska, & Kinzler, 2011).
Conclusions
The field of moral psychology has done an excellent job of bringing emotion into the study of moral behavior; what remains is to specify the parameters of the role that it plays. The use of neuroscience methods, coupled with a perspective that views emotion’s role in moral judgment as both automatic and controlled, may help us gain a new view of moral behavior, one that can better address how emotions inform moral behavior throughout the evaluation process. Opinions and emotions about moral issues can be strong, but they can also be flexible, and can be shaped and altered by an individual’s goals and motivations. In many ways, Hemingway was right: when thinking about moral behavior, we tend to make distinctions based on what we feel good and bad after. However, studying the “after” only gets us halfway there. Only by understanding what is felt both “before” and “during” will we truly gain a full picture of moral judgment.
Acknowledgments
Completion of the manuscript was supported by grants AG043463 from NIA and HD069178 from NICHD awarded to K. Ochsner, and grant F32HD081960 awarded to C. Helion.
Contributor Information
Chelsea Helion, Columbia University
Kevin N. Ochsner, Columbia University
References
- Amodio DM. The neuroscience of prejudice and stereotyping. Nature Reviews Neuroscience. 2014;15:670–682. doi: 10.1038/nrn3800. [DOI] [PubMed] [Google Scholar]
- Bargh JA, Chaiken S, Govender R, Pratto F. The generality of the automatic attitude activation effect. Journal of personality and social psychology. 1992;62(6):893. doi: 10.1037//0022-3514.62.6.893. [DOI] [PubMed] [Google Scholar]
- Baumeister RF, Exline J. Virtue, personality, and social relations: Self- Control as the moral muscle. Journal of personality. 1999;67(6):1165–1194. doi: 10.1111/1467-6494.00086. [DOI] [PubMed] [Google Scholar]
- Beer JS, John OP, Scabini D, Knight RT. Orbitofrontal cortex and social behavior: Integrating self-monitoring and emotion-cognition interactions. Journal of cognitive neuroscience. 2006;18(6):871–879. doi: 10.1162/jocn.2006.18.6.871. [DOI] [PubMed] [Google Scholar]
- Berthoz S, Grezes J, Armony JL, Passingham RE, Dolan RJ. Affective response to one’s own moral violations. Neuroimage. 2006;31(2):945–950. doi: 10.1016/j.neuroimage.2005.12.039. [DOI] [PubMed] [Google Scholar]
- Blair RJR. The amygdala and ventromedial prefrontal cortex in morality and psychopathy. Trends in cognitive sciences. 2007;11(9):387–392. doi: 10.1016/j.tics.2007.07.003. [DOI] [PubMed] [Google Scholar]
- Bonanno GA, Galea S, Bucciarelli A, Vlahov D. Psychological resilience after disaster: New York City in the aftermath of the September 11th terrorist attack. Psychological Science. 2007;17(3):181–186. doi: 10.1111/j.1467-9280.2006.01682.x. [DOI] [PubMed] [Google Scholar]
- Browning M, Holmes EA, Murphy SE, Goodwin GM, Harmer CJ. Lateral prefrontal cortex mediates the cognitive modification of attentional bias. Biological psychiatry. 2010;67(10):919–925. doi: 10.1016/j.biopsych.2009.10.031. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Buhle JT, Silvers JA, Wager TD, Lopez R, Onyemekwu C, Kober H, Weber H, Ochsner KN. Cognitive reappraisal of emotion: A meta-analysis of human neuroimaging studies. Cerebral Cortex. 2014;24:2981–2990. doi: 10.1093/cercor/bht154. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Calder AJ, Keane J, Manes F, Antoun N, Young AW. Impaired recognition and experience of disgust following brain injury. Nature neuroscience. 2000;3(11):1077–1078. doi: 10.1038/80586. [DOI] [PubMed] [Google Scholar]
- Calder AJ, Lawrence AD, Young AW. Neuropsychology of fear and loathing. Nature Reviews Neuroscience. 2001;2(5):352–363. doi: 10.1038/35072584. [DOI] [PubMed] [Google Scholar]
- Cauda F, D’Agata F, Sacco K, Duca S, Geminiani G, Vercelli A. Functional connectivity of the insula in the resting brain. Neuroimage. 2011;55(1):8–23. doi: 10.1016/j.neuroimage.2010.11.049. [DOI] [PubMed] [Google Scholar]
- Chapman HA, Anderson AK. Understanding disgust. Annals of the New York Academy of Sciences. 2012;1251(1):62–76. doi: 10.1111/j.1749-6632.2011.06369.x. [DOI] [PubMed] [Google Scholar]
- Chapman HA, Anderson AK. Things rank and gross in nature: A review and synthesis of moral disgust. Psychological bulletin. 2013;139(2):300. doi: 10.1037/a0030964. [DOI] [PubMed] [Google Scholar]
- Ciaramelli E, Muccioli M, Ladavas E, di Pellegrino G. Selective deficit in personal moral judgment following damage to ventromedial prefrontal cortex. Social cognitive and affective neuroscience. 2007;2(2):84–92. doi: 10.1093/scan/nsm001. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cikara M, Farnsworth RA, Harris LT, Fiske ST. On the wrong side of the trolley track: Neural correlates of relative social valuation. Social cognitive and affective neuroscience. 2010;5(4):404–413. doi: 10.1093/scan/nsq011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Clore GL, Huntsinger JR. How emotions inform judgment and regulate thought. Trends in cognitive sciences. 2007;11(9):393–399. doi: 10.1016/j.tics.2007.08.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Craig AD. How do you feel–now? The anterior insula and human awareness. Nature Reviews Neuroscience. 2009;10(1):59–70. doi: 10.1038/nrn2555. [DOI] [PubMed] [Google Scholar]
- Crockett MJ. Models of morality. Trends in cognitive sciences. 2013;17(8):363–366. doi: 10.1016/j.tics.2013.06.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cunningham WA, Brosch T. Motivational salience amygdala tuning from traits, needs, values, and goals. Current Directions in Psychological Science. 2012;21(1):54–59. [Google Scholar]
- Cunningham WA, Zelazo PD. Attitudes and evaluations: A social cognitive neuroscience perspective. Trends in cognitive sciences. 2007;11(3):97–104. doi: 10.1016/j.tics.2006.12.005. [DOI] [PubMed] [Google Scholar]
- Cunningham WA, Johnsen IR, Waggoner AS. Orbitofrontal cortex provides cross-modal valuation of self-generated stimuli. Social Cognitive and Affective Neuroscience. 2011;6(3):286–293. doi: 10.1093/scan/nsq038. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cushman F. Action, outcome, and value a dual-system framework for morality. Personality and social psychology review. 2013;17(3):273–292. doi: 10.1177/1088868313495594. [DOI] [PubMed] [Google Scholar]
- Cushman F. The neural basis of morality: not just where, but when. Brain. 2014;137(4):974–975. doi: 10.1093/brain/awu049. [DOI] [PubMed] [Google Scholar]
- Cushman F, Young L, Hauser M. The role of conscious reasoning and intuition in moral judgment testing three principles of harm. Psychological science. 2006;17(12):1082–1089. doi: 10.1111/j.1467-9280.2006.01834.x. [DOI] [PubMed] [Google Scholar]
- Daw ND, Niv Y, Dayan P. Uncertainty-based competition between prefrontal and dorsolateral striatal systems for behavioral control. Nature Neuroscience. 2005;8(12):1704–1711. doi: 10.1038/nn1560. [DOI] [PubMed] [Google Scholar]
- Decety J, Michalska KJ, Kinzler KD. The developmental neuroscience of moral sensitivity. Emotion Review. 2011;3(3):305–307. [Google Scholar]
- Decety J, Michalska KJ, Kinzler KD. The contribution of emotion and cognition to moral sensitivity: a neurodevelopmental study. Cerebral Cortex. 2012;22(1):209–220. doi: 10.1093/cercor/bhr111. [DOI] [PubMed] [Google Scholar]
- Eibach RP, Libby LK, Ehrlinger J. Priming family values: How being a parent affects moral evaluations of harmless but offensive acts. Journal of Experimental Social Psychology. 2009;45(5):1160–1163. [Google Scholar]
- Feinberg M, Antonenko O, Willer R, Horberg EJ, John OP. Gut check: Reappraisal of disgust helps explain liberal–conservative differences on issues of purity. Emotion. 2014;14(3):513. doi: 10.1037/a0033727. [DOI] [PubMed] [Google Scholar]
- Fiske ST, Cuddy AJ, Glick P, Xu J. A model of (often mixed) stereotype content: competence and warmth respectively follow from perceived status and competition. Journal of personality and social psychology. 2002;82(6):878. [PubMed] [Google Scholar]
- Forgas JP. Toward understanding the role of affect in social thinking and behavior. Psychological Inquiry. 2002;13(1):90–102. [Google Scholar]
- Frank RH. Passions within reason: the strategic role of the emotions. WW Norton & Company; New York, NY: 1988. [Google Scholar]
- Freeman JB, Stolier RM. The medial prefrontal cortex in constructing personality models. Trends in cognitive sciences. 2014;18(11):571–572. doi: 10.1016/j.tics.2014.09.009. [DOI] [PubMed] [Google Scholar]
- Garfinkel SN, Critchley HD. Interoception, emotion and brain: new insights link internal physiology to social behaviour. Commentary on: “Anterior insular cortex mediates bodily sensibility and social anxiety” by Terasawa et al. (2012) Social cognitive and affective neuroscience. 2013;8(3):231–234. doi: 10.1093/scan/nss140. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Graham J, Haidt J, Nosek BA. Liberals and conservatives rely on different sets of moral foundations. Journal of personality and social psychology. 2009;96(5):1029. doi: 10.1037/a0015141. [DOI] [PubMed] [Google Scholar]
- Gray K, Wegner DM. Moral typecasting: divergent perceptions of moral agents and moral patients. Journal of personality and social psychology. 2009;96(3):505. doi: 10.1037/a0013748. [DOI] [PubMed] [Google Scholar]
- Greene JD. Dual-process morality and the personal/impersonal distinction: A reply to McGuire, Langdon, Coltheart, and Mackenzie. Journal of Experimental Social Psychology. 2009;45(3):581–584. [Google Scholar]
- Greene JD. Emotion and morality: A tasting menu. Emotion Review. 2011;3:1–3. [Google Scholar]
- Greene J, Haidt J. How (and where) does moral judgment work? Trends in cognitive sciences. 2002;6(12):517–523. doi: 10.1016/s1364-6613(02)02011-9. [DOI] [PubMed] [Google Scholar]
- Greene JD, Nystrom LE, Engell AD, Darley JM, Cohen JD. The neural bases of cognitive conflict and control in moral judgment. Neuron. 2004;44(2):389–400. doi: 10.1016/j.neuron.2004.09.027. [DOI] [PubMed] [Google Scholar]
- Greene JD, Sommerville RB, Nystrom LE, Darley JM, Cohen JD. An fMRI investigation of emotional engagement in moral judgment. Science. 2001;293(5537):2105–2108. doi: 10.1126/science.1062872. [DOI] [PubMed] [Google Scholar]
- Gross JJ, Thompson RA. Emotion regulation: Conceptual foundations. Handbook of emotion regulation. 2007;3:24. [Google Scholar]
- Haidt J. The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review. 2001;108:814–834. doi: 10.1037/0033-295x.108.4.814. [DOI] [PubMed] [Google Scholar]
- Harenski CL, Hamann S. Neural correlates of regulating negative emotions related to moral violations. Neuroimage. 2006;30(1):313–324. doi: 10.1016/j.neuroimage.2005.09.034. [DOI] [PubMed] [Google Scholar]
- Helion C, Pizarro DA. Beyond dual-processes: the interplay of reason and emotion in moral judgment. In: Levy N, Clausen J, editors. Springer Handbook for Neuroethics. Dordecht: Springer Netherlands; 2015. [Google Scholar]
- Hemingway E. Death in the Afternoon. New York: Charles Scribner’s Sons; 1932. [Google Scholar]
- Hofmann W, Wisneski DC, Brandt MJ, Skitka LJ. Morality in everyday life. Science. 2014;345(6202):1340–1343. doi: 10.1126/science.1251560. [DOI] [PubMed] [Google Scholar]
- Huebner B. Do emotions play a constitutive role in moral cognition? Topoi. 2015;34(2):427–440. [Google Scholar]
- Huebner B, Dwyer S, Hauser M. The role of emotion in moral psychology. Trends in cognitive sciences. 2009;13(1):1–6. doi: 10.1016/j.tics.2008.09.006. [DOI] [PubMed] [Google Scholar]
- Inbar Y, Pizarro DA, Bloom P. Conservatives are more easily disgusted. Cognition & Emotion. 2009;23:714–725. [Google Scholar]
- Inbar Y, Pizarro DA, Knobe J, Bloom P. Disgust sensitivity predicts intuitive disapproval of gays. Emotion. 2009;9:435–439. doi: 10.1037/a0015960. [DOI] [PubMed] [Google Scholar]
- Jarcho JM, Berkman ET, Lieberman MD. The neural basis of rationalization: Cognitive dissonance reduction during decision-making. Social Cognitive and Affective Neuroscience. 2011;6(4):460–467. doi: 10.1093/scan/nsq054. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Keltner D, Lerner JS. Emotion. In: Gilbert DT, Fiske ST, Lindsay G, editors. The handbook of social psychology. 5th. New York: McGraw Hill; 2010. pp. 312–347. [Google Scholar]
- Kiehl KA, Smith AM, Hare RD, Mendrek A, Forster BB, Brink J, Liddle PF. Limbic abnormalities in affective processing by criminal psychopaths as revealed by functional magnetic resonance imaging. Biological psychiatry. 2001;50(9):677–684. doi: 10.1016/s0006-3223(01)01222-7. [DOI] [PubMed] [Google Scholar]
- Knobe J. Intentional action and side effects in ordinary language. Analysis. 2003;63(279):190–194. [Google Scholar]
- Knutson B, Cooper JC. Functional magnetic resonance imaging of reward prediction. Current opinion in neurology. 2005;18(4):411–417. doi: 10.1097/01.wco.0000173463.24758.f6. [DOI] [PubMed] [Google Scholar]
- Kohlberg L. Moral development and identification. In: Stevenson H, editor. Child psychology: 62nd yearbook of the National Society for the Study of Education. Chicago: University of Chicago Press; 1963. [Google Scholar]
- LaBar KS, Cabeza R. Cognitive neuroscience of emotional memory. Nature Reviews Neuroscience. 2006;7(1):54–64. doi: 10.1038/nrn1825. [DOI] [PubMed] [Google Scholar]
- Lamm C, Batson CD, Decety J. The neural substrate of human empathy: effects of perspective-taking and cognitive appraisal. Journal of Cognitive Neuroscience. 2007;19(1):42–58. doi: 10.1162/jocn.2007.19.1.42. [DOI] [PubMed] [Google Scholar]
- Lazarus RS. Emotion and Adaptation. New York: Oxford University Press; 1991. [Google Scholar]
- LeDoux JE. Synaptic self: How our brains become who we are. Penguin; New York, NY: 2003. [Google Scholar]
- Levenson RW, Carstensen LL, Friesen WV, Ekman P. Emotion, physiology, and expression in old age. Psychology and aging. 1991;6(1):28. doi: 10.1037//0882-7974.6.1.28. [DOI] [PubMed] [Google Scholar]
- Lieberman MD, Eisenberger NI, Crockett MJ, Tom SM, Pfeifer JH, Way BM. Putting feelings into words affect labeling disrupts amygdala activity in response to affective stimuli. Psychological Science. 2007;18(5):421–428. doi: 10.1111/j.1467-9280.2007.01916.x. [DOI] [PubMed] [Google Scholar]
- Lieberman MD, Inagaki TK, Tabibnia G, Crockett MJ. Subjective responses to emotional stimuli during labeling, reappraisal, and distraction. Emotion. 2011;11(3):468. doi: 10.1037/a0023503. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Loewenstein G. Emotions in economic theory and economic behavior. American economic review. 2000:426–432. [Google Scholar]
- Loewenstein G, Lerner JS. The role of affect in decision making. In: Davidson RJ, Scherer KR, Goldsmith HH, editors. Handbook of Affective Sciences. New York: Oxford University Press; 2003. pp. 619–642. [Google Scholar]
- Luo Q, Nakic M, Wheatley T, Richell R, Martin A, Blair RJR. The neural basis of implicit moral attitude—an IAT study using event-related fMRI. Neuroimage. 2006;30(4):1449–1457. doi: 10.1016/j.neuroimage.2005.11.005. [DOI] [PubMed] [Google Scholar]
- Mehl MR, Pennebaker JW. The social dynamics of cultural upheaval: Social interactions surrounding September 11th, 2001. Psychological Science. 2003;14(6):579–585. doi: 10.1046/j.0956-7976.2003.psci_1468.x. [DOI] [PubMed] [Google Scholar]
- Moll J, de Oliveira-Souza R, Eslinger PJ, Bramati IE, Mourao-Miranda J, Andreiuolo PA, et al. The neural correlates of moral sensitivity: A functional magnetic resonance imaging investigation of basic and moral emotions. Journal of Neuroscience. 2002;22:2730–2736. doi: 10.1523/JNEUROSCI.22-07-02730.2002. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Monin B, Pizarro DA, Beer JS. Deciding versus reacting: Conceptions of moral judgment and the reason-affect debate. Review of General Psychology. 2007;11(2):99. [Google Scholar]
- Moors A, Ellsworth PC, Scherer KR, Frijda NH. Appraisal theories of emotion: State of the art and future development. Emotion Review. 2013;5(2):119–124. [Google Scholar]
- Nisbett RE, Wilson TD. Telling more than we can know: Verbal reports on mental processes. Psychological Review. 1977;84(3):231. [Google Scholar]
- Nordgren LF, van der Pligt J, van Harreveld F. Evaluating Eve: visceral states influence the evaluation of impulsive behavior. Journal of Personality and Social Psychology. 2007;93(1):75. doi: 10.1037/0022-3514.93.1.75. [DOI] [PubMed] [Google Scholar]
- Ochsner KN. What is the role of control in emotional life? In: Gazzaniga M, editor. Social Neuroscience and Emotion. MIT Press; Cambridge, MA: 2014. pp. 719–730. [Google Scholar]
- Ochsner KN, Feldman-Barrett L. A multiprocess perspective on the neuroscience of emotion. In: Mayne TJ, Bonnano G, editors. Emotion: Current Issues and Future Directions. Guilford Press; New York: 2001. pp. 38–81. [Google Scholar]
- Ochsner KN, Gross JJ. The cognitive control of emotion. Trends in cognitive sciences. 2005;9(5):242–249. doi: 10.1016/j.tics.2005.03.010. [DOI] [PubMed] [Google Scholar]
- Ochsner KN, Gross JJ. Cognitive emotion regulation insights from social cognitive and affective neuroscience. Current Directions in Psychological Science. 2008;17(2):153–158. doi: 10.1111/j.1467-8721.2008.00566.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ochsner KN, Gross JJ. The neural bases of emotion and emotion regulation: A valuation perspective. In: Gross JJ, Thompson R, editors. the Handbook of Emotion Regulation. Second. New York: Guilford Press; 2014. pp. 23–42. [Google Scholar]
- Ochsner KN, Ray RD, Cooper JC, Robertson ER, Chopra S, Gabrieli JD, Gross JJ. For better or for worse: neural systems supporting the cognitive down-and up-regulation of negative emotion. Neuroimage. 2004;23(2):483–499. doi: 10.1016/j.neuroimage.2004.06.030. [DOI] [PubMed] [Google Scholar]
- Ochsner KN, Ray RR, Hughes B, McRae K, Cooper JC, Weber J, Gabrieli JDE, Gross JJ. Bottom-up and top-down processes in emotion generation common and distinct neural mechanisms. Psychological Science. 2009;20(11):1322–1331. doi: 10.1111/j.1467-9280.2009.02459.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ochsner KN, Silvers JA, Buhle JT. Functional imaging studies of emotion regulation: a synthetic review and evolving model of the cognitive control of emotion. Annals of the New York Academy of Sciences. 2012;1251(1):E1–E24. doi: 10.1111/j.1749-6632.2012.06751.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Parkinson C, Sinnott-Armstrong W, Koralus P, Mendelovici A, McGeer V, Wheatley T. Is morality unified? Evidence that distinct neural systems underlie moral judgments of harm, dishonesty, and disgust. Journal of Cognitive Neuroscience. 2011;23(10):3162–3180. doi: 10.1162/jocn_a_00017. [DOI] [PubMed] [Google Scholar]
- Paulus MP, Rogalsky C, Simmons A, Feinstein JS, Stein MB. Increased activation in the right insula during decision making is related to harm avoidance and neuroticism. Neuroimage. 2003;19:1439–1448. doi: 10.1016/s1053-8119(03)00251-9. [DOI] [PubMed] [Google Scholar]
- Piaget J. The moral judgment of the child. New York: Harcourt, Brace Jovanovich; 1932. [Google Scholar]
- Pizarro D, Inbar Y, Helion C. On disgust and moral judgment. Emotion Review. 2011;3(3):267–268. [Google Scholar]
- Pond RS, Jr, DeWall CN, Lambert NM, Deckman T, Bonser IM, Fincham FD. Repulsed by violence: Disgust sensitivity buffers trait, behavioral, and daily aggression. Journal of personality and social psychology. 2012;102(1):175. doi: 10.1037/a0024296. [DOI] [PubMed] [Google Scholar]
- Ray RD, Ochsner KN, Cooper JC, Robertson ER, Gabrieli JD, Gross JJ. Individual differences in trait rumination and the neural systems supporting cognitive reappraisal. Cognitive, Affective, & Behavioral Neuroscience. 2005;5(2):156–168. doi: 10.3758/cabn.5.2.156. [DOI] [PubMed] [Google Scholar]
- Ridderinkhof KR, Ullsperger M, Crone EA, Nieuwenhuis S. The role of the medial frontal cortex in cognitive control. Science. 2004;306(5695):443–447. doi: 10.1126/science.1100301. [DOI] [PubMed] [Google Scholar]
- Rilling JK, Sanfey AG. The neuroscience of social decision-making. Annual review of psychology. 2011;62:23–48. doi: 10.1146/annurev.psych.121208.131647. [DOI] [PubMed] [Google Scholar]
- Rozin P, Lowery L, Imada S, Haidt J. The CAD triad hypothesis: a mapping between three moral emotions (contempt, anger, disgust) and three moral codes (community, autonomy, divinity) Journal of personality and social psychology. 1999;76(4):574. doi: 10.1037//0022-3514.76.4.574. [DOI] [PubMed] [Google Scholar]
- Sander D, Grafman J, Zalla T. The human amygdala: an evolved system for relevance detection. Reviews in the Neurosciences. 2003;14(4):303–316. doi: 10.1515/revneuro.2003.14.4.303. [DOI] [PubMed] [Google Scholar]
- Sanfey AG, Rilling JK, Aronson JA, Nystrom LE, Cohen JD. The neural basis of economic decision-making in the ultimatum game. Science. 2003;300:1755–1758. doi: 10.1126/science.1082976. [DOI] [PubMed] [Google Scholar]
- Satpute AB, Shu J, Weber J, Roy M, Ochsner K. The Functional Neural Architecture of Self-Reports of Affective Experience. Biological Psychiatry. 2013;73(7):631–638. doi: 10.1016/j.biopsych.2012.10.001. [DOI] [PubMed] [Google Scholar]
- Schnall S, Haidt J, Clore GL, Jordan AH. Disgust as embodied moral judgment. Personality and Social Psychology Bulletin. 2008 doi: 10.1177/0146167208317771. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schwarz N. Feelings-as-information theory. In: Van Lange PAM, Kruglanski A, Higgins ET, editors. Handbook of theories of social psychology. Thousand Oaks, CA: Sage; 2012. pp. 289–308. [Google Scholar]
- Shenhav A, Greene JD. Moral judgments recruit domain-general valuation mechanisms to integrate representations of probability and magnitude. Neuron. 2010;7:667–677. doi: 10.1016/j.neuron.2010.07.020. [DOI] [PubMed] [Google Scholar]
- Simon HA. Motivational and emotional controls of cognition. Psychological Review. 1967;74(1):29–39. doi: 10.1037/h0024127.
- Spunt RP, Satpute AB, Lieberman MD. Identifying the what, why, and how of an observed action: An fMRI study of mentalizing and mechanizing during action observation. Journal of Cognitive Neuroscience. 2011;23(1):63–74. doi: 10.1162/jocn.2010.21446.
- Taber-Thomas BC, Asp EW, Koenigs M, Sutterer M, Anderson SW, Tranel D. Arrested development: Early prefrontal lesions impair the maturation of moral judgement. Brain. 2014;137(4):1254–1261. doi: 10.1093/brain/awt377.
- Teper R, Tullett AM, Page-Gould E, Inzlicht M. Errors in moral forecasting: Perceptions of affect shape the gap between moral behaviors and moral forecasts. Personality and Social Psychology Bulletin. 2015;41(7):887–900. doi: 10.1177/0146167215583848.
- Thomson JJ. Killing, letting die, and the trolley problem. The Monist. 1976;59(2):204–217. doi: 10.5840/monist197659224.
- Todorov A, Uleman JS. The efficiency of binding spontaneous trait inferences to actors’ faces. Journal of Experimental Social Psychology. 2003;39:549–562.
- Todorov A, Uleman JS. The person reference process in spontaneous trait inferences. Journal of Personality and Social Psychology. 2004;87(4):482–493. doi: 10.1037/0022-3514.87.4.482.
- Treadway MT, Buckholtz JW, Martin JW, Jan K, Asplund CL, Ginther MR, Jones OD, Marois R. Corticolimbic gating of emotion-driven punishment. Nature Neuroscience. 2014;17(9):1270–1275. doi: 10.1038/nn.3781.
- Van Dillen LF, van der Wal RC, van den Bos K. On the role of attention and emotion in morality: Attentional control modulates unrelated disgust in moral judgments. Personality and Social Psychology Bulletin. 2012;38(9):1222–1231. doi: 10.1177/0146167212448485.
- Van Veen V, Krug MK, Schooler JW, Carter CS. Neural activity predicts attitude change in cognitive dissonance. Nature Neuroscience. 2009;12(11):1469–1474. doi: 10.1038/nn.2413.
- Wendelken C, Bunge SA, Carter CS. Maintaining structured information: An investigation into functions of parietal and lateral prefrontal cortices. Neuropsychologia. 2008;46(2):665–678. doi: 10.1016/j.neuropsychologia.2007.09.015.
- Yang Y, Raine A, Narr KL, Colletti P, Toga AW. Localization of deformations within the amygdala in individuals with psychopathy. Archives of General Psychiatry. 2009;66(9):986–994. doi: 10.1001/archgenpsychiatry.2009.110.
- Young L, Bechara A, Tranel D, Damasio H, Hauser M, Damasio A. Damage to ventromedial prefrontal cortex impairs judgment of harmful intent. Neuron. 2010;65(6):845–851. doi: 10.1016/j.neuron.2010.03.003.
- Young L, Dungan J. Where in the brain is morality? Everywhere and maybe nowhere. Social Neuroscience. 2012;7(1):1–10. doi: 10.1080/17470919.2011.569146.
- Young L, Saxe R. The neural basis of belief encoding and integration in moral judgment. NeuroImage. 2008;40(4):1912–1920. doi: 10.1016/j.neuroimage.2008.01.057.
- Young L, Saxe R. An fMRI investigation of spontaneous mental state inference for moral judgment. Journal of Cognitive Neuroscience. 2009;21(7):1396–1405. doi: 10.1162/jocn.2009.21137.
- Zaki J, Ochsner K. Reintegrating the study of accuracy into social cognition research. Psychological Inquiry. 2011;22(3):159–182.
- Zaki J, Ochsner KN. The neuroscience of empathy: Progress, pitfalls and promise. Nature Neuroscience. 2012;15(5):675–680. doi: 10.1038/nn.3085.
- Zaki J, Davis JI, Ochsner KN. Overlapping activity in anterior insula during interoception and emotional experience. NeuroImage. 2012;62(1):493–499. doi: 10.1016/j.neuroimage.2012.05.012.
