Science and Engineering Ethics. 2010 Oct 9;18(1):103–115. doi: 10.1007/s11948-010-9236-0

Emotional Engineers: Toward Morally Responsible Design

Sabine Roeser
PMCID: PMC3275750  PMID: 20936371

Abstract

Engineers are normally seen as the archetype of people who make decisions in a rational and quantitative way. However, technological design is not value neutral. The way a technology is designed determines its possibilities, which can, for better or for worse, have consequences for human wellbeing. This has led various scholars to claim that engineers should explicitly take ethical considerations into account. They are at the cradle of new technological developments and can thereby influence the possible risks and benefits more directly than anybody else. I have argued elsewhere that emotions are an indispensable source of insight into the ethical aspects of risk. In this paper I will argue that this means that engineers should also include emotional reflection in their work. This requires a new understanding of the competencies of engineers: they should not be unemotional calculators; quite the opposite, they should work to cultivate their moral emotions and sensitivity, in order to be engaged in morally responsible engineering.

Keywords: Engineering, Design, Responsibility, Risk, Emotion

Introduction

The title of this paper (‘emotional engineers’) might strike the reader as an oxymoron: Engineers are normally seen as the archetype of rational and quantitatively oriented people. However, in this paper I will argue that engineers should use their emotions in order to develop morally responsible technologies. This requires a new understanding of the competencies of engineers: they should not be unemotional calculators; quite the opposite, they should work to cultivate their moral emotions and sensitivity, in order to be optimally engaged in morally responsible engineering.

Values in Technology

A common view amongst engineers is that technology is value-neutral and engineering a predominantly mathematical, quantitative discipline. However, various scholars from different backgrounds have argued that technological design is not value neutral. The way a technology is designed determines its possibilities, which can, for better or for worse, have consequences for human well-being.

Scholars in the field of science and technology studies (STS, Winner 1980) and in continental philosophy of technology (Verbeek 2005; Ihde 1990) have shown how the way technological products are designed determines our behavior.

In the field of legal decision making, Thaler and Sunstein (2008) have argued that our choices are largely determined by the design of the products and infrastructures we use. This is similar to the phenomenon of framing in the communication of statistical information (cf. Tversky and Kahneman 1974). For example, depending on where healthy and unhealthy snacks are placed in a cafeteria, people will be more prone to buy the one rather than the other. Thaler and Sunstein argue that there is no neutral design; no matter how a product is designed, the design will to a significant degree influence our behavior. Hence, it is better to intentionally design products and infrastructures in such a way that they lead us to responsible behavior than to make arbitrary or unconscious design choices that might lead to suboptimal or even irresponsible behavior.

Recently, scholars in analytical philosophy of technology have also developed an account of how to include moral values1 in the development of technologies in order to arrive at morally better designs. This account is called ‘value sensitive design’ (Van den Hoven 2007). Design teams should, in an iterative process, include moral values and stakeholder values in the technologies they develop (Friedman 2004; Zwart et al. 2006).

It is curious that scholars from such diverse backgrounds as decision theory, STS and continental and analytical philosophy of technology have independently come to very similar conclusions, often without even mentioning the parallel developments in other disciplines and discourses. Hopefully, this will change in the future through more interdisciplinary research, so that scholars from different disciplines can draw on each other’s insights and developments. In any case, the fact that these different disciplines arrive at similar insights from very different perspectives makes their point even more urgent, namely that technological design embodies values that shape our behavior. However, rather than leaving these values to be included by happenstance, we should intentionally include values that improve our behavior.

If moral decision making were left to managers or policy makers, it would take place only after a product has already been developed. Noëmi Manders-Huits and Michael Zimmer (2009) have argued that it would be useful to have a so-called ‘values-advocate’ in a design team to make sure that values get due attention; this could be a moral philosopher or a social scientist. However, I believe that it might not be feasible to have somebody with such a background on each and every design team. Engineers themselves should also be trained to be aware of moral values and to explicitly take them into consideration in the design process (cf. Van der Burg and Van Gorp 2005). Rather than delegating moral reflection to ‘moral experts’, engineers should cultivate their own moral expertise. They have a key moral responsibility in the design process of risky technologies, as they have the technical expertise and are at the cradle of new developments. Engineers can reduce the risks of a technological product by developing a different design.

Risk and value sensitive design can be seen as two sides of the same coin. With value sensitive design we try to diminish the potentially negative effects of risky technologies. Engineers can influence the possible risks and benefits more directly than anybody else. However, technological risks and benefits are not merely a technical matter but also involve ethical aspects. This requires a capacity to be aware of moral saliences. I have argued elsewhere that emotions are an indispensable source of insight into ethical aspects of risk. In the remainder of this paper, I will argue that this means that engineers should also include emotional reflection in their work.

Risk, Values and Emotions

Engineers, policy makers and other risk experts generally define risk as the product of probabilities and unwanted consequences. Examples of unwanted consequences are the number of deaths or injuries, or the degree of pollution. In risk analysis and risk management, these experts use cost-benefit analysis to weigh the possible advantages of a technology against its possible disadvantages.

Many social scientists and philosophers who work in the field of risk argue that cost-benefit analysis and the definition of risk as a product of probabilities and unwanted consequences are not sufficient to determine whether a risk is acceptable or not. They also emphasize the importance of further considerations such as whether a risk is taken voluntarily, the distribution of risks and benefits in a population, and the available alternatives to a technology. Furthermore, a high probability of a small effect might be more acceptable than a small probability of a large effect, even though the product of probability and effect might be approximately equal. Defenders of such an approach argue that all risk judgments involve evaluative aspects. Even the standard definition of risk involves an evaluative judgment as to what counts as an unwanted consequence (Fischhoff, Lichtenstein et al. 1981; Jasanoff 1993; Shrader-Frechette 1991; Krimsky and Golding 1992; Slovic 2000; Jaeger, Renn et al. 2001). Hence, risk is not only a quantitative notion; rather, it also involves ethical considerations which conventional methods for risk assessment insufficiently take into account. These ethical considerations do play a role in the risk perceptions of laypeople (Slovic 2000). Hence, laypeople have a richer understanding of risk than experts, one that is needed for a complete moral evaluation of risks (Slovic 2000; Roeser 2007).
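To make this point concrete, the standard quantitative definition can be written as a simple formula. The numbers below are hypothetical and serve only as an illustration; they are not drawn from the cited studies.

\[ R = p \times C \]

where \(p\) is the probability of the unwanted consequence and \(C\) its magnitude (for instance, the number of fatalities). Two very different hazards can then receive the same risk score:

\[ R_{1} = 0.1 \times 10 = 1 \qquad\qquad R_{2} = 10^{-4} \times 10\,000 = 1 \]

Both amount to one expected fatality, yet considerations such as voluntariness, the distribution of risks and benefits, and catastrophic potential may make the frequent, small-scale hazard far more acceptable than the rare, catastrophic one. This is precisely the kind of evaluative difference that the quantitative definition by itself leaves out.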

Empirical research by Paul Slovic and others shows that emotions are a major determinant in the risk perceptions of laypeople (Alhakami and Slovic 1994; Slovic 1999; Finucane et al. 2000; Slovic 2002). Slovic says that emotion and reason can interact and that we should take the emotions of the public seriously since they convey meaning; still, he sees analytic methods as the final arbiter in estimating risks (Slovic et al. 2004). Other scholars even go so far as to say that emotions should be excluded from decision making about risk (Sunstein 2005) or that they should at most be accepted as an unfortunate fact of life (Loewenstein, Weber et al. 2001, p. 281) or used instrumentally, in order to create acceptance for a technology (De Hollander and Hanemaaijer 2003). This interpretation of risk-emotions threatens to undermine the earlier rehabilitation of the risk perceptions of laypeople.

The theoretical framework that most scholars who work on risk and emotion endorse is Dual Process Theory (e.g. Slovic 2002). According to Dual Process Theory, there are two distinct systems with which we apprehend reality. System 1 is unconscious, fast, intuitive and emotional while system 2 is conscious, slow, analytical and rational (Epstein 1994; Sloman 1996; Sloman 2002; Stanovich and West 2002). Rational beliefs are supposed to be an afterthought to our immediate emotional responses (cf. Zajonc 1984; Haidt 2001).

The danger of this approach is that emotions can be discarded as irrational, subjective states. However, there are emotions that by their very nature transcend the two systems postulated by Dual Process Theory (Roeser 2009). These are emotions that involve a high degree of reflectivity and narrativity, such as emotional responses to fictional characters or to people and events that are far away (for a more nuanced view on the relation between reason and emotions in philosophy, cf. e.g., de Sousa 1987; Greenspan 1988; Solomon 1993; Stocker and Hegeman 1996; Goldie 2000; Nussbaum 2001; Roberts 2003).

Many emotions are spontaneous responses to what is nearby, but sympathetic emotions, for example, can lead us to extend our ‘circle of concern’, as Nussbaum (2001) phrases it. If we think about the suffering that other people might undergo as the victims of a disaster, we usually feel touched and shocked. This realization involves moral emotions. These are emotions that are reflective, justifiable and based on reasons. Hence, such moral emotions fit neatly into neither system 1 nor system 2. We need moral emotions in order to be aware of moral aspects of risky technologies (Roeser 2006b). For example, by caring about certain things we are able to perceive evaluative aspects of the world that we would otherwise not be aware of (Little 1995; Blum 1994). Purely rational reflection would not be able to provide us with the imaginative power that we need to envisage future scenarios, to take part in other people’s perspectives and to evaluate their destinies.

Hence, the fact that the risk perceptions of laypeople involve emotions does not make them suspect; on the contrary. We need moral emotions in order to have well-grounded insights into whether a technological risk is morally acceptable or not. For example, enthusiasm for a technology can point to benefits to our well-being, whereas fear and worry can indicate that a technology is a threat to our well-being; sympathy and empathy can give us insights into fair distributions of risks and benefits, and indignation can indicate violations of autonomy by technological risks that are imposed on us against our will (Roeser 2006b). Of course these emotions are not infallible: they can bias us towards what is close by. However, all our cognitive capacities are fallible, and we cannot do without them. We need emotions for well-grounded moral evaluations of risk. Emotions can themselves be a source of critical reflection about our risk-emotions (Roeser 2010). Such an approach can provide for a richer account of the importance of emotions in ethical reflection about risk than Dual Process Theory (Roeser 2009). Rather than being biases that threaten objectivity and rationality in thinking about acceptable risks, emotions contribute to a correct understanding of the moral acceptability of a hazard.

Risk-Emotions of Engineers

Engineers are often considered the archetype of people who perform their work in a rational and quantitative way. They exemplify the idea, which is also endorsed by many Dual Process Theorists, that computational intelligence is superior to other human capacities for processing information, such as intuition and emotion.

However, there are scholars who challenge this computational ideal of intelligence (Dreyfus 1992). Some authors emphasize the importance of narrative intelligence (Mateas and Sengers 2003). Other authors emphasize emotional intelligence (Goleman 1995). These are forms of intelligence that go beyond deductive reasoning and analytical, logical thinking and that play an essential role in our practical rationality. People who lack these capacities have difficulties making practical and moral judgments (Damasio 1994).

Several authors emphasize that emotions are needed for moral conduct by business managers (Simon 1987; Mumby and Putnam 1992; Gaudine and Thorne 2001; Klein 2002; Lurie 2004). We can extend this idea to other professionals, and more specifically, for the purpose of this paper, to engineers. We need engineers who have a sufficiently developed emotional sensitivity as this will give them access to morally important aspects of the technologies they design.2

It might be objected that we should leave the moral decision making about risky technologies to policy makers. However, as I have argued earlier, that would be a missed opportunity. It might mean that we try to constrain a technology when it is already too late. A more fruitful way is to let engineers explicitly and intentionally include moral reflection in the design process of risky technologies. But given my claim that emotions are a necessary source of moral reflection about risky technologies, this means that the emotions of engineers should play a role in the design of risky technologies.3

All this means that when educating and recruiting engineers, the emphasis shouldn’t be solely on ‘analytical’ or ‘hard’ skills, as has traditionally been the case, but also on ‘emotional’ or ‘soft’ skills. Currently, many technical universities include compulsory ethics courses in their curricula (cf. Zandvoort et al. (2000), also cf. the ABET criteria that require ethics courses in engineering curricula in the United States). This is an important step in the right direction. However, the emphasis in such courses is still mainly on argumentative and reasoning skills. In addition, engineering education should also include the development of sympathetic and emotional skills. This could be done through role-playing games, by which the imaginative and emotional capacities of engineering students can be trained in a safe setting. An additional trajectory would be to include literature courses and other parts of a liberal arts education in the curricula of engineering programs (cf. Nussbaum (1997) who argues for this in a broad way, not specifically concerning engineering education).4

So far I have sketched why we need to emphasize the emotional capacities of engineers, and how this could be achieved. In the next section I will discuss how we can implement emotional reflection in the engineering design process.

Including Emotions in the Design Process

As I argued in the previous sections, we need engineers who take their emotional responses seriously, as emotions are helpful in assessing the moral values involved in technologies. This will enable engineers to play an important role in reflecting on morally responsible technological design. The importance of the emotions of engineers has so far not been mentioned by the scholars who emphasize values in design and whom I discussed in the section on values in technology. Similar to scholars who work on risk, many scholars who work on values in design see reason as the predestined faculty of critical, moral deliberation, and they see emotions as a threat to rational decision making. At most they acknowledge that engineers should take into account the wants and desires of customers. However, wants and desires are not necessarily emotions, and they are not necessarily grounded in moral considerations. My alternative account of emotions in risk perception also applies to the design of risky technologies. Emotions should play a key role in risk perception and in value sensitive design. Emotions sensitize us to the complex ethical considerations that are involved in the awareness of the risks of technologies as much as in deliberating about how to diminish these risks in designing technologies. Emotions and scientific methods should be in a good balance when engineers think about risks. Where science can inform them about magnitudes, emotions inform them about moral saliences. Both kinds of information are indispensable if engineers want to make well-grounded judgments about acceptable risks.

Experts often accuse the public of being overly frightened of new technologies because they lack the relevant knowledge and are thereby basing their reactions on supposedly irrational feelings. Interestingly, nanotechnology gives rise to greater worries amongst experts than amongst the public (Scheufele, Corley et al. 2007). Of course, this is partially due to the fact that most laypeople have never heard of nanotechnology. However, given the newness of nanotechnology, we can assume that the experts are more knowledgeable than the public about nanotechnology and its concomitant risks. Apparently, the experts’ fears can be attributed to a rational understanding of the risks involved in nanotechnology. Indeed, fear can point to a source of danger to our well-being (Green 1992; Roberts 2003; Roeser 2009).

Engineers should use these worries in the design of their research and technologies, e.g., by building barriers to prevent certain hazards from occurring or by applying a precautionary approach, meaning that technologies whose consequences are hard to predict should first be investigated in a safe setting. If experts are worried about the safety of the products they develop, this should be taken seriously as a warning sign that calls for a precautionary approach. Experts should communicate their emotional-ethical concerns about technological risks and benefits to the public in addition to supplying quantitative information.

Fear about unpredictable consequences concerns situations in which even the experts do not know exactly what the implications of a technology might be. However, even if the consequences of a technology are fairly well known, there can be remaining emotional-ethical concerns that should be taken seriously. For example, emotions such as sympathy help to reveal ethical considerations such as justice and autonomy in decisions about acceptable risk (Roeser 2006b). By focusing merely on, for example, annual fatalities, as conventional approaches to risk analysis do, we might overlook other morally relevant considerations which can be revealed through emotions. Emotions about risks can be based on reasonable concerns, for example regarding justice, fairness and autonomy. These concerns should be taken seriously by engineers when they reflect on the risky aspects of the technologies they design.

In the design process there should be a discussion phase in which the emotional and ethical concerns of the engineers and of stakeholders are made explicit, thereby facilitating ethical reflection about possible risks and how to avoid or diminish them. Several methods have been developed to enable reflection about technology, for example scenarios that describe situations in which the use of a technology gives rise to moral considerations (cf. e.g. Boenink et al. 2010). These methods involve narratives that directly engage the imaginative and empathetic capacities of people. To the extent that this is not already the case, these methods could be further developed to explicitly encourage emotional engagement and emotional reflection.

An objection might be that the emotions of different people are too divergent to play such an important role. To this I would like to reply that of course the emotional responses of people can differ, but disagreement is nearly always a part of collective decision making, whether or not emotions are included. We should accept the possibly diverging emotions of people and discuss the concerns that lie behind them. Considering diverging emotions and views enables more balanced judgments. Our emotions are not infallible. Just like other sources of knowledge, emotions can be mistaken. We should critically assess our emotions, but in doing so we should take into account other emotions, both our own and those of other people. Emotions can be a source of ethical reflection (Lacewing 2005). For example, an emotion such as sympathy can correct egoistic emotions (Roeser 2010).

Emotion and Responsibility in Design

Let me end my discussion by elaborating on the role emotions can play in thinking about the moral responsibility of engineers. Several authors emphasize the importance of emotions such as shame, guilt, resentment and blame for the understanding or ascription of responsibility (cf. Wallace 1994; Schoeman 1987; Eisenberg 2000). These emotions work retrospectively and negatively, by condemning failed responsibility (McGraw 1987). They can be connected with backward-looking responsibility. On the other hand, sympathy, empathy and compassion can make us aware of our responsibility in a forward-looking sense (for the distinction between these two kinds of responsibility, cf. Nihlén Fahlquist 2008). They make us aware of actions we can perform in order to help improve the situations of others. This is confirmed by empirical research by Paul Slovic, who shows that we get ‘numbed by numbers’ and statistics concerning disasters. Emotions let us see what matters; they help to motivate us. In concrete situations where emotions are aroused, people are capable of being directly involved, and indifference becomes less likely (Slovic 2010).

Backward-looking responsibility and its concomitant emotions are important, as they let people critically reflect on what they have done in the past and how they could have done things better. Ultimately, this should lead to enhanced emotional sensitivity concerning forward-looking responsibility. Forward-looking responsibility and the emotions that are involved with it are especially important in the context of the moral responsibility of engineers in the design of technology, as design is concerned with things that are yet to come.

There is a temptation to try to codify the responsibility of professionals in clear rules that provide infallible guidelines. However, as various moral philosophers have argued, practical reality is so complex, and every situation so unique, that moral insights cannot be codified and subsumed under simple rules. Rather than applying clear-cut rules, we need context-sensitive insights (Prichard 1912; Ewing 1929; Broad 1951 [1930]; Ross 1967 [1930]; Dancy 2004). Context-sensitive insights require moral emotions (Damasio 1994; Roeser 2006a). It can be argued that in the case of the design of risky technologies, context-sensitivity is even more important, as risky technologies can lead to new and unpredictable situations that escape codifiable rules.

This connects well with recent developments in thinking about responsibility as a virtue (cf. Williams 2008). Virtue ethicists emphasize that virtuous moral agents need their capacity of moral insight (practical wisdom or ‘phronesis’) to make context-sensitive moral judgments in complex, real-life situations. A virtuous person is somebody whose character is developed in such a way that she steers a wise middle ground between extreme responses. According to some virtue ethicists, this requires that the virtuous person has well developed emotions (Roberts 2003; Döring and Feger 2010; Roberts 2010). A virtue-responsible person is aware of the different normative claims that rest on her and makes the right decision. She is responsive to her responsibilities and ready to act accordingly.

Jessica Nihlén Fahlquist (2010) has argued that this can mean that a professional sees that she has to transcend the formal responsibility she has been assigned by her job description or her official role in the organization she works for. According to Nihlén Fahlquist, such an approach can avoid the so-called ‘problem of many hands’. This problem means that in complex projects that involve the contributions of many different professionals, things can go wrong and serious accidents can happen although nobody acted in a clearly reckless way. Rather, some people made small mistakes that would by themselves have been insignificant. However, due to an unfortunate coincidence, these mistakes result in a major accident because several barriers have failed, as every individual relied on the expectation that the others would do a good job. A famous example is the accident with the Herald of Free Enterprise, where numerous insignificant mistakes led to the capsizing of a ferry and the death of nearly two hundred people. The same pattern can be seen in other major accidents as well. Nihlén Fahlquist argues that if professionals just follow a minimal conception of responsibility, this might easily lead to gaps in responsibility distributions. This is because real-life situations are much more complex than can possibly be foreseen. However, if people see their responsibility less formally, and rather act from virtue, they will extend their responsibility beyond their formally assigned role. Nihlén Fahlquist connects this with insights from the ethics of care. The ethics of care stresses the importance of caring for the needs of concrete persons, rather than merely obeying abstract rules. People who act from an attitude of responsibility as the virtue of care will check whether things work as they are supposed to work, even if this goes beyond their own task. They will take extra actions if they realize that nobody feels responsible for a situation that has not been foreseen and has not been formalized in a distribution of tasks. This will likely result in an environment where accidents cannot happen as easily, because more people double-check whether things are going well rather than just doing what they have been told to do. It will also entice people to come up with creative solutions to new situations.

This connects well with what I have said before about the moral emotions of engineers. Moral emotions make engineers sensitive to moral issues arising from the technologies they develop. Emotions let us get involved with situations. They help us transcend a detached, abstract attitude that could lead to indifference to morally problematic aspects of technologies. This is especially true in the design of risky technologies, where there might be consequences of which we do not know whether or when they will manifest themselves, or that are unforeseen or difficult to quantify. A formalistic approach to responsibility can easily lead to negligence or the idea that ‘others are responsible’.

This is also nicely illustrated by a case study (described in Van der Burg and Van Gorp 2005). The design team of a new trailer was aware that the trailer could be designed in a safer way, but since the client had not asked for that, they did not explore that alternative. However, the client, not being a technical expert, was not even aware of the fact that there was a safer alternative. Hence, the engineers should have taken a proactive attitude here, bringing this option up with the client. Van der Burg and Van Gorp use a virtue-ethical approach to argue that engineers should use their imaginative capacities, for example by empathizing with possible victims of a suboptimally safe trailer, in order to arrive at such a more active appreciation of their moral responsibility in designing risky technologies.

All this shows how engineers can take on stronger responsibilities if they cherish the imaginative, emotional capacities that are also emphasized by various virtue ethicists. By not only explicitly allowing but even requiring emotional considerations in the engineering arena, engineers will feel involved, responsible and prone to take action. This will lead to morally better designs and to more humane technologies.

Conclusion

In this paper I have argued that in order to have engineers who are morally sensitive to ethical aspects of their work, we need engineers who have well-developed emotional capacities. Engineers who are trained in using their empathy and sympathy can imagine themselves in different roles, for example in the role of victims of risky technologies. This enables them to realize that they should go beyond their formally defined role, and to be motivated accordingly. This means that:

  1. we need to include emotional-ethical reflection and deliberation in the design process of risky technologies; and

  2. we have to revise our curricula for engineering education, by including courses that enhance the emotional and imaginative capacities of future engineers.

This will enable engineers to live up to the moral responsibilities that are inherent to their work.5

Open Access

This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

Footnotes

1

In this paper, I focus on moral values, but the same argument can be made for other values, e.g. aesthetic values.

2

The ‘rationalistic’ bias in current engineering culture is also reflected in the fact that engineering is considered to be a ‘male’ profession, with a low percentage of female engineers and engineering students, as the concepts ‘rational’ and ‘male’, and ‘emotional’ and ‘female’, are traditionally linked (cf. Faulkner 2000; Robinson and McIlwee 2005). Turning engineering into a ‘softer’ discipline might also have an effect on gender roles, possibly making engineering a more attractive discipline for women.

3

Thanks to an anonymous referee for pressing me on this point.

4

I wish to thank an anonymous reviewer for pointing out that this is already the case in engineering curricula in the United States. However, in for example the Netherlands and Germany, this is not yet the case.

5

I would like to thank two anonymous reviewers for their very helpful comments on an earlier version of this paper. Work for this paper has been done during a NIAS-Fellowship.

References

  1. Alhakami, A. S. & P. Slovic (1994). A psychological study of the inverse relationship between perceived risk and perceived benefit. Risk Analysis, 14(6), 1085–1096. [DOI] [PubMed]
  2. Blum LA. Moral perception and particularity. Cambridge England; New York, NY, USA: Cambridge University Press; 1994. [Google Scholar]
  3. Boenink, M., Swierstra, T., & Stemerding, D. (2010). Anticipating the interaction between technology and morality: A scenario study of experimenting with humans in bionanotechnology. Studies in Ethics, Law, and Technology, 4(2), 4. Available at: http://www.bepress.com/selt/vol4/iss2/art4.
  4. Broad, C. D. (1951 [1930]). Five types of ethical theory. London: Routledge and Kegan Paul.
  5. Damasio AR. Descartes’ error : Emotion, reason and the human brain. New York: G.P. Putnam; 1994. [Google Scholar]
  6. Dancy J. Ethics without principles. Oxford New York: Clarendon Press/Oxford University Press; 2004. [Google Scholar]
  7. De Hollander G, Hanemaaijer A. Nuchter omgaan met risico’s. Milieu- en Natuurplanbureau (MNP), RIVM. Bilthoven: RIVM; 2003. [Google Scholar]
  8. de Sousa R. The rationality of emotion. Cambridge, Mass. etc.: MIT Press; 1987. [Google Scholar]
  9. Döring, S. and F. Feger (2010). Risk assessment as virtue. In S. Roeser (Ed.), Emotions and risky technologies (pp. 91–105). Dordrecht: Springer.
  10. Dreyfus H. What computers still can’t do. A critique of artificial reason. Cambridge MA: MIT-Press; 1992. [Google Scholar]
  11. Eisenberg N. Emotion, regulation, and moral development. Annual Review of Psychology. 2000;51:665–697. doi: 10.1146/annurev.psych.51.1.665. [DOI] [PubMed] [Google Scholar]
  12. Epstein S. Integration of the cognitive and the psychodynamic unconscious. American Psychologist. 1994;49(8):709–724. doi: 10.1037/0003-066X.49.8.709. [DOI] [PubMed] [Google Scholar]
  13. Ewing AC. The morality of punishment, with some suggestions for a general theory of ethics. London: Kegan Paul, Trench, Trubner & Co.; 1929. [Google Scholar]
  14. Faulkner W. Dualisms, hierarchies and gender in engineering. Social Studies of Science. 2000;30(5):759–792. doi: 10.1177/030631200030005005. [DOI] [Google Scholar]
  15. Finucane M, Alhakami A, et al. The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making. 2000;13:1–17. doi: 10.1002/(SICI)1099-0771(200001/03)13:1<1::AID-BDM333>3.0.CO;2-S. [DOI] [Google Scholar]
  16. Fischhoff B, Lichtenstein S, et al. Acceptable risk. Cambridge; New York: Cambridge University Press; 1981. [Google Scholar]
  17. Friedman B. Value sensitive design. encyclopedia of human-computer interaction. Great Barrington, MA: Berkshire Publishing Group; 2004. pp. 769–774. [Google Scholar]
  18. Gaudine A, Thorne L. Emotion and ethical decision-making in organizations. Journal of Business Ethics. 2001;31:175–187. doi: 10.1023/A:1010711413444. [DOI] [Google Scholar]
  19. Goldie P. The emotions : A philosophical exploration. Oxford; New York: Clarendon Press; 2000. [Google Scholar]
  20. Goleman D. Emotional intelligence: Why it can matter more than IQ. New York: Bantam; 1995. [Google Scholar]
  21. Green OH. The emotions : A philosophical theory. Dordrecht; Boston: Kluwer; 1992. [Google Scholar]
  22. Greenspan PS. Emotions & reasons: An inquiry into emotional justification. New York: Routledge; 1988. [Google Scholar]
  23. Haidt J. The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review. 2001;108(4):814–833. doi: 10.1037/0033-295X.108.4.814. [DOI] [PubMed] [Google Scholar]
  24. Ihde D. Technology and the lifeworld: From garden to earth. Bloomington and Indianapolis: Indiana University Press; 1990. [Google Scholar]
  25. Jaeger CC, Renn O, et al. Risk, uncertainty, and rational action. London etc.: Earthscan; 2001. [Google Scholar]
  26. Jasanoff S. Bridging the two cultures of risk analysis. Risk Analysis. 1993;13:123–129. doi: 10.1111/j.1539-6924.1993.tb01057.x. [DOI] [Google Scholar]
  27. Klein S. The head, the heart, and business virtues. Journal of Business Ethics. 2002;39:347–359. doi: 10.1023/A:1019762707218. [DOI] [Google Scholar]
  28. Krimsky S, Golding D. Social theories of risk. Westport, Conn: Praeger Publishers; 1992. [Google Scholar]
  29. Lacewing M. Emotional self-awareness and ethical deliberation. Ratio. 2005;18:65–81. doi: 10.1111/j.1467-9329.2005.00271.x. [DOI] [Google Scholar]
  30. Little MO. Seeing and caring: The role of affect in feminist moral epistemology. Hypatia: A Journal of Feminist Philosophy. 1995;10(3):117–137. [Google Scholar]
  31. Loewenstein GF, Weber EU, et al. Risk as feelings. Psychological Bulletin. 2001;127:267–286. doi: 10.1037/0033-2909.127.2.267. [DOI] [PubMed] [Google Scholar]
  32. Lurie Y. Humanizing business through emotions: On the role of emotions in ethics. Journal of Business Ethics. 2004;49:1–11. doi: 10.1023/B:BUSI.0000013851.16825.51. [DOI] [Google Scholar]
  33. Manders-Huits, N. & Zimmer, M. (2009). Values and pragmatic action: The challenges of introducing ethical intelligence in technical design communities. International Review of Information Ethics, 10.
  34. Mateas M, Sengers P, editors. Narrative intelligence. Amsterdam, Philadelphia: John Benjamins; 2003. [Google Scholar]
  35. McGraw KM. Guilt following transgression: An attribution of responsibility approach. Journal of Personality and Social Psychology. 1987;53(2):247–256. doi: 10.1037/0022-3514.53.2.247. [DOI] [PubMed] [Google Scholar]
  36. Mumby DK, Putnam LL. The politics of emotion: A feminist reading of bounded rationality. Academy of Management Review. 1992;17(3):465–486. [Google Scholar]
  37. Nihlén Fahlquist, J. (2008). Moral responsibility for environmental problems—individual or institutional? Journal of Agricultural and Environmental Ethics, 22, 109–124.
  38. Nihlén Fahlquist, J. (2010). The problem of many hands and responsibility as the virtue of care. Managing in critical times—philosophical responses to organisational turbulence proceedings (forthcoming).
  39. Nussbaum MC. Cultivating humanity: A classical defense of reform in liberal education. Cambridge, MA: Harvard University Press; 1997. [Google Scholar]
  40. Nussbaum MC. Upheavals of thought : The intelligence of emotions. Cambridge etc.: Cambridge University Press; 2001. [Google Scholar]
  41. Prichard HA. Does moral philosophy rest on a mistake? Mind. 1912;21:21–37. doi: 10.1093/mind/XXI.81.21. [DOI] [Google Scholar]
  42. Roberts RC. Emotions : An essay in aid of moral psychology. Cambridge, UK; New York: Cambridge University Press; 2003. [Google Scholar]
  43. Roberts, R. C. (2010). Emotions and judgments about risks. In S. Roeser (Ed.), Emotions and risky technologies (pp. 107–126 ). Dordrecht: Springer.
  44. Robinson JG, McIlwee JS. Men, women, and the culture of engineering. Sociological Quarterly. 2005;32(3):403–421. doi: 10.1111/j.1533-8525.1991.tb00166.x. [DOI] [Google Scholar]
  45. Roeser S. A particularist epistemology: “Affectual intuitionism”. Acta Analytica. 2006;21:33–44. doi: 10.1007/s12136-006-1013-y. [DOI] [Google Scholar]
  46. Roeser S. The role of emotions in judging the moral acceptability of risks. Safety Science. 2006;44(8):689–700. doi: 10.1016/j.ssci.2006.02.001. [DOI] [Google Scholar]
  47. Roeser S. Ethical intuitions about risks. Safety Science Monitor. 2007;11:1–30. [Google Scholar]
  48. Roeser S. The relation between cognition and affect in moral judgments about risk. In: Asveld L, Roeser S, editors. The ethics of technological risks. London: Earthscan; 2009. pp. 182–201. [Google Scholar]
  49. Roeser, S. (2010). Emotional reflection about risks. In S. Roeser (Ed.), Emotions and risky technologies (pp. 231–244 ). Dordrecht: Springer.
  50. Ross, W. D. (1967 [1930]). The right and the good. Oxford: The Clarendon Press.
  51. Scheufele DA, Corley EA, et al. Scientists worry about some risks more than the public. Nature Nanotechnology. 2007;2:732–734. doi: 10.1038/nnano.2007.392. [DOI] [PubMed] [Google Scholar]
  52. Schoeman FD, editor. Responsibility, character, and the emotions: new essays in moral psychology. Cambridge: Cambridge University Press; 1987. [Google Scholar]
  53. Shrader-Frechette KS. Risk and rationality : Philosophical foundations for populist reforms. Berkeley, CA etc.: University of California Press; 1991. [Google Scholar]
  54. Simon, H. (1987). Making management decisions: The role of intuition and emotion. The Academy of Management Executive.
  55. Sloman SA. The empirical case for two systems of reasoning. Psychological Bulletin. 1996;119(1):3–21. doi: 10.1037/0033-2909.119.1.3. [DOI] [Google Scholar]
  56. Sloman SA. Two systems of reasoning. In: Gilovich T, Griffin DW, Kahneman D, editors. Heuristics and biases: The psychology of intuitive judgment. Cambridge: Cambridge University; 2002. pp. 379–396. [Google Scholar]
  57. Slovic P. Trust, emotion, sex, politics, and science: Surveying the risk-assessment battlefield. Risk Analysis. 1999;19:689–701. doi: 10.1023/a:1007041821623. [DOI] [PubMed] [Google Scholar]
  58. Slovic P. The perception of risk. London; Sterling, VA: Earthscan Publications; 2000. [Google Scholar]
  59. Slovic P. The Affect heuristic. In: Gilovich T, Griffin D, Kahnemann D, editors. Intuitive judgment: Heuristics and biases. Cambridge: Cambridge University Press; 2002. pp. 397–420. [Google Scholar]
  60. Slovic P. “If I look at the mass I will never act”: Psychic numbing and genocide. In: Roeser S, editor. Emotions and risky technologies. Dordrecht: Springer; 2010. pp. 37–59. [Google Scholar]
  61. Slovic P, Finucane ML, et al. Risk as analysis and risk as feelings : Some thoughts about affect, reason, risk, and rationality. Risk Analysis. 2004;24(2):311–322. doi: 10.1111/j.0272-4332.2004.00433.x. [DOI] [PubMed] [Google Scholar]
  62. Solomon RC. The passions : Emotions and the meaning of life. Indianapolis: Hackett Publishing Company; 1993. [Google Scholar]
  63. Stanovich KE, West RF. Individual differences in reasoning: Implications for the rationality debate? In: Gilovich T, Griffin DW, Kahneman D, editors. Heuristics and biases : The psychology of intuitive judgment. Cambridge: Cambridge University Press; 2002. pp. 421–440. [Google Scholar]
  64. Stocker M, Hegeman E. Valuing emotions. Cambridge etc.: Cambridge University Press; 1996. [Google Scholar]
  65. Sunstein CR. Laws of fear. Cambridge: Cambridge University Press; 2005. [Google Scholar]
  66. Thaler RH, Sunstein CR. Nudge: Improving decisions about health, wealth and happiness. London etc.: Penguin Books; 2008. [Google Scholar]
  67. Tversky A, Kahneman D. Judgment under uncertainty: Heuristics and biases. Science. 1974;185:1124–1131. doi: 10.1126/science.185.4157.1124. [DOI] [PubMed] [Google Scholar]
  68. Van den Hoven J. ICT and value sensitive design. In: Goujon P, Lavelle S, Duquenoy P, Kimppa K, Laurent V, editors. The information society: Innovation, legitimacy, ethics and democracy. Boston: Springer; 2007. [Google Scholar]
  69. Van der Burg S, Van Gorp A. Understanding moral responsibility in the design of trailers. Science and Engineering Ethics. 2005;11:235–256. doi: 10.1007/s11948-005-0044-x. [DOI] [PubMed] [Google Scholar]
  70. Verbeek P-P. What things do: Philosophical reflections on technology, agency, and design. Penn State: Pennsylvania State University Press; 2005. [Google Scholar]
  71. Wallace RJ. Responsibility and the moral sentiments. Cambridge, MA: Harvard University Press; 1994. [Google Scholar]
  72. Williams G. Responsibility as a virtue. Ethical Theory and Moral Practice. 2008;11(4):455–470. doi: 10.1007/s10677-008-9109-7. [DOI] [Google Scholar]
  73. Winner L. Do artifacts have politics? Daedalus. 1980;109(1):121–136. [Google Scholar]
  74. Zajonc, R. B. (1984). On primacy of affect. In K. R. Scherer & P. Ekman (Eds.), Approaches to emotion (pp. 259–270). Hillsdale, NJ: Lawrence Erlbaum Associates.
  75. Zandvoort H, Van de Poel IR, Brumsen M. Ethics in the engineering curricula: Topics, trends and challenges for the future. European Journal of Engineering Education. 2000;25(4):291–302. doi: 10.1080/03043790050200331. [DOI] [Google Scholar]
  76. Zwart SD, Van de Poel IR, et al. A network approach for distinguishing ethical issues in research and development. Science and Engineering Ethics. 2006;12:663–684. doi: 10.1007/s11948-006-0063-2. [DOI] [PubMed] [Google Scholar]
