. 2023 Dec 17;39(1):5–17. doi: 10.1111/bioe.13255

Advancing ethics support in military organizations by designing and evaluating a value‐based reflection tool

Eva van Baarle 1,2,, Steven van Baarle 3
PMCID: PMC11657313  PMID: 38105609

Abstract

Military employees face all sorts of moral dilemmas in their work. The way they resolve these dilemmas—how they decide to act based on their moral deliberations—can have a substantial impact both on society and on their personal lives. Hence, it makes sense to support military employees in dealing with these dilemmas. Military organizations already support their personnel by adopting compliance‐based approaches that focus, for instance, on enforcing moral rules. At the same time, however, they struggle to develop value‐based approaches that could foster moral learning by improving employees' understanding of personal values, others' values, and their responsibility for others. Consequently, military employees are not adequately supported in their ethical decision‐making when confronted with complex situations. To address this issue, drawing on a design research approach, we develop and evaluate the use of a value‐based reflection tool to support military employees with their moral decision‐making. The design and evaluation of the value‐based reflection tool were informed by five semistructured interviews, notes on 45 joint reflection meetings with trainers, and evaluation notes of 755 participants. Our findings suggest the value‐based reflection tool is a promising way to foster actors' moral competence in organizational settings by triggering the social mechanisms of reflection, empathy, and psychological safety. This study is the first to illustrate that value‐based ethics support can complement compliance‐based ethics support in a military organization. Furthermore, it demonstrates design research's potential to develop actionable knowledge for ethics support practices in organizations.

Keywords: design research, ethics support, military organization, organizational development, organizational learning, value‐based reflection tool

1. INTRODUCTION

Military personnel of all ranks and in all branches are morally and legally responsible for their actions both in warfare and during peacetime.1 As these employees face all sorts of moral dilemmas in their daily practice, military ethics scholars aim to help them carry out their tasks "as honorably and correctly as possible."2 Hence the importance of support practices such as ethics training. These support practices include empowering military personnel to say "no" when something is legally and/or ethically inappropriate.3

The literature distinguishes different types of ethics‐support practices. A distinction that is gaining ground is that between compliance‐based and value‐based ethics support.4 Compliance‐based strategies focus on organizational control by developing and enforcing moral rules, codes of conduct, and legal frameworks, and/or by appointing "compliance officers." They also develop sets of moral rules and interdictions with which employees must comply.5 Compliance principles are often based on highly formalized templates of appropriate behavior, which may clash in practice with actors' heterogeneous expectations, beliefs, motives, and behaviors. This has led to criticism that compliance‐focused models constitute a closed system approach6 or a decoupling of policy and practice whereby organizations only symbolically adopt policies without implementing them substantively (i.e., the policies do not result in the desired behavior).7

In contrast, value‐based strategies are relational and responsive. By adopting practices from, for instance, the organizational development domain, the training and facilitation repertoire, or participatory research, value‐based strategies focus on organizational learning.8 These strategies attempt to stimulate reflection on and understanding of personal values and those of others, as well as responsibility for others with regard to context, empathy, and action. They are also referred to as attempts to foster moral competence, comprising six interrelated elements: (1) becoming aware of one's personal values and those of others; (2) the ability to recognize the moral dimensions of a situation and identify which values are at stake or at risk of violation; (3) the ability to adequately judge a moral issue or dilemma; (4) the ability to communicate this judgment; (5) the willingness and ability to act in accordance with this judgment in a morally responsible manner; and (6) the willingness and ability to be accountable to yourself and to others.9

Compliance‐based and value‐based ethics support approaches are not necessarily at odds and can be complementary. Value‐based ethics support can foster moral awareness and motivate employees to comply with organizational values and norms.10 However, many organizations tend to overemphasize their enforcement of compliance. Whereas these approaches sometimes include moral judgment and decision‐making models, the vast majority draw on deontological frameworks, focusing on moral rules, codes of conduct and/or legal frameworks.11

In military organizations, ethics education and training is predominantly conceptual or compliance‐based.12 As such, it focuses on establishing and enforcing key norms, legal principles, and obligations.13 Interestingly, several military academies opt for virtue ethics as a theoretical basis for military ethics education, lectures, and discussions. Yet, they struggle to create a practice‐oriented approach to virtue ethics, such as value‐based training or developing virtues whereby military personnel can draw on their own experiences and dilemmas.14 This focus on compliance is problematic because it fails to adequately support military employees' ethical decision‐making in complex situations.15 Furthermore, inadequate support may increase mental health problems,16 or “moral injuries” including feelings of guilt and shame, anger, and betrayal.17

In this study, we therefore aim to complement the predominant compliance strategy with a value‐based ethics support practice. Our research question is, how does a value‐based ethics practice foster military personnel's moral learning and ethical decision‐making in complex situations? To answer this question, we draw on a design research approach18 to develop and evaluate the use of a value‐based reflection (VBR) tool for military personnel.

Our findings illustrate the VBR tool's potential for fostering moral competence by triggering the social mechanisms of reflection, empathy, and psychological safety. These findings contribute to the literature by showing how to complement compliance‐based practices with a value‐based ethics support practice designed for and tested in a military organization. They furthermore demonstrate design research's potential for developing value‐based ethics support practices in organizations.

2. METHOD

2.1. Research approach

This study aims to contribute to the relevant literature and, at the same time, improve ethics support practices within military organizations. This implies that we devise a “course of action aimed at changing existing conditions into desired ones.”19 Simon has proposed design research (DR) for such action. DR is problem‐centered and has a solution‐oriented nature,20 thus aiming to identify real‐world problems, develop solutions, and subsequently test them.21

The DR approach involves four steps, as visualized in Figure 1. These steps allow us to learn from the intervention while addressing the problematic situation.22 The four‐step DR methodology comprises the following: (i) problem formulation, (ii) selection of evidence, (iii) design, intervention, and evaluation, and (iv) reflection and learning.23 Step one, problem formulation, means becoming aware of the practical problems when supporting military personnel in dealing with moral dilemmas. The second step, selection of evidence, involves selecting the best available evidence in (academic) discourses relating to the problem formulation in step one. Step three, design, intervention, and evaluation, applies the problem formulation and theoretical assumptions from the previous step. These assumptions provide a platform for designing the VBR tool prototype, which is further developed in subsequent design cycles. Step four, reflection and learning, moves conceptually beyond designing a VBR tool for a particular domain. Thus, in this stage, learnings are made transferable to other domains and to a wider range of problems regarding ethics support practices. In other words, the lessons learnt in a specific domain become the input for intervention development in other domains.

Figure 1. The four steps in the design research approach. Source: Adapted from Sein, M., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action design research. MIS Quarterly, 35(1), 37–56.

2.2. Data collection

The study draws on three sources of qualitative data, covering a 3‐year period (see Table 1). We collected data during the Netherlands Military Middle Management Career Course (NMMMCC) and the Netherlands International Military Cooperation Course (NIMCC).

Table 1. Data sources and their use in the design approach.

Source: Semistructured interviews

Content:
  • Four interviews lasting 2 h with trainers involved in designing and evaluating the VBR tool.
  • One interview lasting 2 h with a participant of the Netherlands Military Middle Management Career Course who presented his dilemma during this course.

Use in design approach: The semistructured interviews informed the problem formulation (step 1); the design, intervention, and evaluation (step 3); and learning (step 4) of the design approach.

Source: Formal and informal evaluation

Content:
  • 720 notes from the Netherlands Military Middle Management Career Course, 45 courses for military officers (Major level): Army, Navy, Air Force and Military Police Officers, including Military Medical Personnel (2–3 per course).
  • 35 notes from the Netherlands International Military Cooperation Course—from Major to Colonel level, one participant per country, to foster open learning. Participants were from Bahrain, Brazil, Chile, Egypt, Ethiopia, Georgia, India, Japan, Jordan, Kosovo, Kuwait, Lebanon, Malaysia, Montenegro, Nigeria, North Macedonia, Peru, Suriname, Senegal, and Tanzania.

Use in design approach: These notes informed the design, intervention, and evaluation (step 3); and learning (step 4) of the design approach.

Content:
  • Notes of joint reflection meetings after each course. In these meetings the participants' evaluations were also part of the conversation. These meetings were not recorded, yet notes were taken during and right after the meeting (45 notes in total).

Use in design approach: These notes informed the problem formulation (step 1); the design, intervention, and evaluation (step 3); and learning (step 4) of the design approach.

First, four semistructured interviews served to capture the narratives of the four trainers involved in the courses. One in‐depth interview was conducted with the participant whose case plays a central role in the Findings section. Second, we collected 720 evaluation notes from participants of the NMMMCC, including Military Medical Personnel, and 35 evaluation notes from participants of the NIMCC. Third, we collected evaluative notes from trainers. All courses were facilitated by two trainers, who evaluated each course in a joint reflection meeting afterwards (45 meetings in total). Discussing the implications of the participants' evaluative notes was part of the conversation. The meetings were not recorded, yet notes were taken.

2.3. Data analysis

To interpret our data, we used inductive thematic analysis.24 We analyzed the interviews and different notes by systematically identifying, organizing, and offering insight into patterns of meaning (themes) across our dataset. The transcripts and notes were coded separately by both authors. No pre‐existing coding framework was used. A collaborative discussion followed the open coding, resulting in a tentative overview of themes (e.g., organizational problem, supporting factors, inhibiting factors, accountability, relevance to practice, actions, and roles). Table 1 summarizes the data sources and their use in the design approach.

2.4. Research ethics

In accordance with the ethical principles for medical research, as stated in the Declaration of Helsinki, all participants gave their oral informed consent after being informed about the purpose of the study and the way data would be collected, analyzed, and saved. The case contributor gave us permission to present his case here and has read this article prior to publication. Beforehand, we emphasized the voluntary nature of participation. Medical Research Ethics Committee approval was deemed unnecessary since this study does not fall under the Dutch Medical Research involving the Human Subjects Act and related regulations (www.ccmo.nl). The data set used and analyzed during the current study is available from the corresponding author upon reasonable request.

3. FINDINGS

We present our findings in line with the four‐step design research approach explained above.

3.1. Step 1: Problem formulation

The research question guiding our design process is: how does a value‐based ethics support tool foster military personnel's moral learning and ethical decision‐making in complex situations? This particular question was triggered by our experience as ethics trainers. More specifically, as trainers in a military setting, we regularly ask ourselves what the key factors are for strengthening participants' moral competence. We believe that strengthening moral competence involves both autonomous and critical thinking. Yet, asking critical questions, recognizing your own and others' values, reflecting on moral issues, and resisting group pressure often prove particularly hard in this setting.25 Autonomous and critical thinking will likely be hindered by contextual conditions such as a "can do" mentality,26 a soldier's perceived identity,27 uniformity, hierarchy, and/or masculinity.28 We argue that these particular conditions should inform the intervention design.

The first key design requirement for the VBR tool is that it motivates military personnel to improve their ethical decision‐making in concrete situations. Rules and regulations are important; however, in our teaching, we recognize that employees engage far more with their own or colleagues' concrete experiences. This, therefore, implies that the VBR tool uses concrete experiences of moral dilemmas as the starting point for military personnel's moral learning.

The second key design requirement concerns the tool's applicability and usability. Its design should support military personnel from diverse (educational) backgrounds to improve their ethical decision‐making in a practical way, including when deployed abroad. That is to say, using the tool should not require extensive training or an external moderator. In our experience, even when faced with a group of newly inducted personnel, so far, it has always been possible to use real‐life examples. Although not always operational dilemmas, they may include moral dilemmas regarding group dynamics or fear of speaking up in order to belong to the group. Having said that, discussing personal dilemmas could expose vulnerabilities in the person sharing the dilemma, who might become distressed and even retraumatized by the experience. A capable and prepared moderator can help respond to such a situation. Additionally, moderators need to be aware of and pay attention to these vulnerabilities by systematically debriefing the person sharing the case and guiding them to actors who can provide professional support when needed.

3.2. Step 2: Selection of evidence

The first design requirement, fostering engagement by using concrete moral dilemmas, resonates well with forms of (clinical) ethics support based on hermeneutical and care‐ethical philosophies. Hermeneutics entails a dialogical view of moral learning, centered on the idea that those involved in dialogue have different views about what counts as right or wrong. From a hermeneutical perspective, openness to each other's views can lead to a shared understanding of the situation and common ground about the right course of action.29 Care ethics, like hermeneutics, emphasizes that human beings are immersed in a shared world. From a care ethical perspective, moral insights are contextual and open to improvement by interacting with those people whose care relationships we share.30 Here, moral dilemmas are viewed as inherently related to the context and concrete experiences of the people involved. Abstract principles or rules are considered guidelines in that they should be made explicit, explored, and (re)interpreted in concrete practices. Moral judgments imply the vulnerability and dependency of those involved. Moral judgment and moral learning are achieved by means of dialogue: questioning and reflecting on each other's normative convictions, values, and presuppositions.31

Within this field, scholars have developed clinical ethics support tools such as moral case deliberation,32 ethics rounds, or reflection groups,33 which are well established in civilian (European) healthcare settings.34 These tools offer a structured and dialogical approach to support professionals in reflecting and exchanging views on a moral dilemma they have experienced. Research indicates that these forms of clinical ethics support can foster joint reflection on moral issues among professionals to facilitate openness, understanding, and transparency and to nurture moral learning.35 These tools may provide important inspiration for the VBR tool; however, they do not meet the second design criterion, as they require facilitation by external ethicists.36

A second source of inspiration is empirical research on ethics education in the military. In addition to reflection and empathy,37 this line of research also recognizes psychological safety38 as a mechanism to improve ethical decision‐making, moral learning, and moral competence in classroom settings. These two sources inspired the design of the VBR tool prototype.

3.3. Step 3: Design, test, and evaluate the VBR tool

The VBR tool prototype was tested, evaluated, and further developed together with military personnel. Their involvement was deliberate in order to increase ownership, utilize their expertise, and motivate them to use the tool in existing work structures.39 Table 2 details how the specific actions within the VBR tool trigger social mechanisms that, in turn, are apparently important for fostering the desired outcome: moral competence. The table lists the underlying outcomes derived from empirical research on ethics education in the military. Actions (A) refer to the practices that employees have at their disposal to try and foster moral competence or some of its elements. Social mechanisms (M) involve the less visible processes that employees apply to produce an outcome (O), “a set of interacting parts—an assembly of elements producing an effect not inherent in any one of them.”40

Table 2. Overview of VBR tool actions and social mechanisms that apparently foster moral competence, and the underlying outcomes.

Actions with the VBR tool:
  • Share one's moral dilemma and considerations with others
  • Listen
  • Ask questions
  • Consider your initial opinion, moral intuition, impression, and emotion
  • Explore emotions, intuitions, and others' related values
  • Analyze underlying values without fixating on solutions (postpone judgments)
  • Take into account various perspectives, values, and responsibilities towards others in a specific context
  • Decide what values or responsibility to prioritize
  • Consider what we can do to limit damage

Social mechanisms triggered by those actions:
  • Empathy—defined as the ability to connect with others, to understand what is at stake for them
  • Psychological safety—creating a climate where people are comfortable expressing themselves
  • Reflection—understanding the moral convictions and values embedded in a specific context

Outcomes achieved: foster moral competence by achieving the following underlying outcomes:
  • Awareness of personal values and those of others
  • The ability to recognize the moral dimension of a situation and to identify and communicate which values are at stake or at risk of violation
  • The ability to make a judgment on a moral issue or dilemma
  • The willingness and ability to act in accordance with this judgment in a morally responsible manner
  • The willingness and ability to be accountable to yourself and to others

Source: Adapted from Van Baarle, E. et al., op. cit. note 8; Hamington, M. (2019). Integrating care ethics and design thinking. Journal of Business Ethics, 155(1), 91–103; Edmondson, A. (1999). Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44(2), 350–383; Van Baarle, E., van de Braak, I., Verweij, D., Widdershoven, G., & Molewijk, B. (2019). The safety paradox in ethics training: A case study on safety dynamics within a military ethics train‐the‐trainer course. Medicine, Health Care and Philosophy, 22(1), 107–117; Van Dijke, J., van Nistelrooij, I., Bos, P., & Duyndam, J. (2019). Care ethics: An ethics of empathy?; Widdershoven, G., & Molewijk, B. (2016). Philosophical foundations of clinical ethics: A hermeneutic perspective. In Clinical ethics consultation (pp. 37–52). Routledge.

In the next step, we illustrate the VBR tool as an intervention in a concrete case contributed by a participant in the Military Middle Management Career Course. We also discussed this case during one of the dilemma training sessions.

3.4. A specific case to illustrate the VBR tool as an intervention

We test the VBR tool in a group setting, distinguishing three different roles: the case contributor, a moderator, and other participants contributing to the conversation. The moderator (a participant or trainer) facilitates the dialogue, stressing the importance of the following: adopting a listening attitude, active or empathic listening; asking (critical) questions; suspending one's judgment; and examining underlying values without immediately focusing on solutions. The moderator also keeps track of the conversation by clarifying the main arguments for all participants. To structure the dialogue, the moderator refers to the three columns shown in Table 3. Asking questions in a group setting elicits substantially more alternatives compared to conducting the exercise individually. This overview of the parties involved helps participants to subsequently identify the main arguments for each option from different perspectives.

Table 3. The moderator keeps track of the dialogue in three columns.

Parties involved: Me; Father; Daughter; Family; Fellow passengers; Crew; ISAF; Colleagues; Role 2 colleagues; Local population

Possible course of action A: I allow them on the flight.
Arguments in favor of A:
  • Help the father (D; V)
  • Respect for family, understanding of the importance of being allowed on the flight (V)
  • Hearts and minds of local population/intelligence position (C)
  • Duty of care/lessen suffering (D; C)
  • Sense of responsibility (V)
  • Not shifting the dilemma to somebody else (taxi driver) (C)

Possible course of action B: I do not allow them on the flight.
Arguments in favor of B:
  • It is not allowed, regulations/hygiene (D)
  • I find it shocking, intense and degrading (C)
  • What am I inflicting on the crew and fellow passengers (C)
  • Protecting the crew and fellow passengers (V)
  • Consequences for other passengers, who may not be able to join the flight (C)
  • We don't have another plane. If we need to clean this one, we may not be able to fly for a week (C)
  • Setting a precedent/extending actual task (C)

Damage limitation measures

If you choose option A, how can the damage to B be limited?
  • Transport as safely as possible/wrap the body up the best way possible, achieving as close to regulation standard as possible.
  • Explore whether it is an option to have only the father and daughter on the flight, with no other passengers.
  • Talk about the shocking nature of the request, share your experience; the smell alone can already have a severe impact.
  • Inform the other passengers and give them the option to come on the flight or not.
  • Discuss with the crew.

If you choose option B, how can the damage to A be limited?
  • Discuss with the crew, share arguments with the crew, and why the decision was made not to allow the family on the flight.
  • Find alternative ways to help the father: for example, arrange a taxi for him, and in the taxi wrap up the body as best as possible.
  • Discuss afterwards with the Role 2 colleagues, inform them of the decision‐making process, possibly also using this tool.

Abbreviations: C, argument based on consequentialism; D, argument based on deontology; V, argument based on virtue‐ethics.

Moral competence begins with recognizing the moral dimension of a situation. To be able to do so requires identifying which values are at stake. The participants are asked to think of a personal moral dilemma that affected them. We define a moral dilemma as a situation where you have to choose between two actions embodying different values. Our starting point is a specific moral dilemma: one of the course participants was personally involved in a situation where the values at stake risked being threatened or violated. Even if a choice has already been made, by you or for you, there can still be a moral dilemma. The case for discussion can be in the past (preferably the recent past, so the experience is still fresh) or ongoing. In a group setting, every participant briefly shares a personal dilemma they have experienced. They do not initially reveal their final decision. This enables participants to postpone their judgment and engage in a more open and investigative discussion of the case. The moderator verifies whether the participant sharing the case feels comfortable discussing it with the entire group. This participant shared the following dilemma:

During my deployment to Afghanistan I was based at Kandahar Air Field, and my responsibility was arranging transport for people and equipment from Kabul to Tarin Kowt or Deh Rawod. We had limited resources, but at least we did have a ‘Dash’, a small aircraft, and a helicopter. One night, I received a call from the Role 2 field hospital that there was an Afghan man who needed to travel back to Tarin Kowt with the Dash and that he wanted to take his young daughter with him, but the little girl had died at our camp hospital. Well, the question was whether they could go on the flight. And that is where my dilemma began. It was awful for the father that his daughter had died and that he was now stuck there with her. As far as I know, in their culture, a person has to be buried within 24 hours in their native region … and that was Tarin Kowt. But at the same time, I was thinking: how is he going to fly then? Because a coffin won't fit in the Dash. The response was, ‘That isn't a problem, he can carry her'. The flight would take around 50 minutes. For me, this made the situation even sadder: to lose your daughter and then to also have to hold her in your arms like that for so long. I just can't get my head around it; what that must do to you as a parent and as a person.

And so, I felt I was stuck in a dilemma: do I allow the father and his deceased daughter on the flight or don't I?

The participants formulate the two courses of action: I allow them on the flight, or I do not allow them on the flight (see Table 3). The moderator asks whether there was a moral dilemma in this situation. If so, could the participants together specify the crux of the matter: what were the conflicting values?

Safety, but also the emotional effect of being confronted with this situation, and safety with respect to laws and regulations were all relevant factors. But also that I really wanted to help this man. They wouldn't ask for no reason.

The values mentioned here already provide some direction and can be further investigated and analyzed during the group discussion. The conflicting values in this case will not be the same for all participants. Perhaps not everyone will recognize that there are conflicting values here, or they may not consider this case to be a moral dilemma. And they do not have to. The aim is to learn to recognize personal values and others' values that are at stake in a specific situation.

The participants are invited to join in the case contributor's thought process, to listen, and to pose questions to clarify the situation and context, allowing everyone to picture themselves in that situation. Participants ask:

Why is it important to act, or not to act, a particular way? What had been tried up to that point? What were the results? Did you discuss the situation with other people?

The case contributor replies:

Thinking back again now to my time there, it was obviously a busy time. People are constantly ringing you with requests. There were too many people wanting a seat on a flight and too little capacity […]. The crew would be on the flight, as would other military personnel, but there was no way for me to know ahead of time who they would be. Transporting a body is subject to strict regulations. It was, therefore, not possible for the father to simply hold her in his lap. It just wasn't allowed. Not allowing them on the flight would mean jeopardizing the Afghan family's faith and religious values. That was my assumption. This made it even more difficult for me because I am not a member of that religion, which made it hard for me to put myself in their shoes. I felt I also had to take into account the values held by the aircraft crew and any possible other passengers and protect them. Furthermore, I also had to take into account regulations and aspects such as hygiene. I discussed it with the crew, but I can't remember what they said exactly. I think I might have remembered if they had not agreed with my decision.

The moderator keeps track of the type of questions posed. As participants apparently tend to ask factual questions, moderators need to be aware of this and intervene. Accordingly, the moderator remarks:

It seems it is harder to ask questions about personal considerations, feelings, emotions … while it is exactly these types of questions that may help to get to the heart of the dilemma.

In response to this intervention, one participant asks the case contributor about their feelings:

I was sleeping in a container alley over there, had had a busy, hectic day with canceled flights, having to deliver that news. Late that evening, I received a call and was told the story. And all I could think was: but how? He can just hold her … The idea alone! I suppose at the time, I felt safe in the knowledge that it was simply not allowed. It was such a tough request.

The account of the group discussion clearly demonstrates the added value of taking the time to ask follow‐up questions and improve participants' understanding of the entire context and case contributor's thought process. It also shows that at this point in the discussion, the group appears to be finding it difficult to pinpoint exactly which values collided in the situation. At this stage, this question remained unanswered.

3.5. Reflection using consequentialist, deontological, and virtue‐ethics arguments

Another important task the moderator performs is to help participants reflect on the arguments provided from different perspectives. In this case: what are the main arguments for allowing the father and his daughter on the flight, and what are the arguments against? The participants discuss what they consider the main arguments if they were to picture themselves in that situation. Everyone contributes, and the arguments are noted in the table. For each argument presented, the moderator asks the participants to indicate what type it is: a consequentialist (C), deontological (D), or virtue‐ethics argument (V).41

Sometimes, an argument can be clearly rooted in consequentialism for one participant, while another perceives it more as a duty. The addition of this information demonstrates that, when passing judgment on a moral dilemma, colleagues often do so from one particular perspective, for instance, consequentialism. Other colleagues may base their judgment and argument on an entirely different perspective, for example, where duty is the main factor. When conducting ethics training, at times, it seems as if people are not even speaking the same language. The key is to take various perspectives, values, and responsibilities into account in order to eventually arrive at a considered judgment. Table 3 lists the arguments rooted in virtue ethics, deontology, and consequentialism.

At this point in the conversation, it is paramount that the moderator ensures a critical stance and questioning. For instance, the question “what am I inflicting on the crew and fellow passengers?” triggered the case contributor to share important arguments:

Case contributor: The question had been shocking enough, let alone that I would have had to put her on the plane.

Moderator: What do you mean that the question had been shocking?

Case contributor: Well, because I cannot imagine somebody sitting with their deceased child in their lap, that has an emotional impact. The culture is so different; perhaps the father did not experience it that way. It is heartbreaking to lose a child in the first place and to then also have to transport her that way. And at the same time, you might be helping him if that is what he wants.

Moderator: And you mentioned the emotional effect of being confronted with this situation. What do you mean by that?

Case contributor: The impact of the smell… I don't think I mentioned it, but I have experienced that before. Two colleagues died during a previous deployment, and we were tasked with arranging their repatriation. What I can remember very clearly is the smell. That is something you will never forget. That has such a massive impact… And I don't think this would even cross your mind if you hadn't experienced it yourself. It really is impactful… being confronted so closely with death. I think that I perhaps unknowingly allowed that experience to influence my approach to this dilemma: it had a severe impact on me. Perhaps I also wanted to protect my colleagues from those consequences.

The group is silent; the emotion is tangible. At first, many of the participants had found themselves in a problem‐solving mode: “Of course, they'll fly” and “we'll find a way.” For many participants, this was the moment to connect with the case contributor and understand the moral dimension of the situation from his perspective. The case contributor's moral dilemma had been:

Do I help the father, or do I protect my colleagues from a traumatizing experience (by forcing them to sit next to, see, and smell a deceased little girl and her traumatized father)?

3.6. Which value takes precedence?

In this step, the moderator asks participants what they think they would have decided under the same circumstances. Each participant shares their judgment, thereby indicating which value had the greatest impact on their decision. They are also asked to consider what can be done to limit the damage stated in the other column. Table 3 details how the moderator keeps track of the argumentation during the dialogue.

Ultimately, the case contributor also shares what he decided and why. He explains that at the time, he decided not to allow the family on the flight:

Mainly because I did not want to subject other colleagues to the situation, I wanted to protect them. We did arrange a taxi for the family. So I decided not to allow them on the flight.

Because it is an actual, personal case, his colleagues' response can have an impact. As the case contributor states:

It is quite difficult to find out that your fellow participants would have made a different decision; it's quite easy to then think, oh no, I made the wrong decision. Hearing other people's insights is good, but it can also be painful. The model is good; it allows you to see that you did think it through thoroughly and that there is no such thing as the ultimate right decision.

Several participants indicated that having this type of overview and a better understanding of the context meant that they would perhaps make a different decision than they initially thought. This underlines that, however complex the situation, it is possible to make a judgment that is appropriate at that moment and to choose a course of action that is effective in the situation.

3.7. Evaluating the VBR tool

Both inside and outside the Defense organization, participants perceive the VBR tool as easy to use and supportive, describing it as a

…worthwhile guide because of its simplicity and structure. The three columns are clear to understand at a glance, easy to explain and remember.

Participants indicated that they had found it helpful to be able to visualize a dilemma and look at it from different perspectives. As the case contributor notes:

I was stuck in that one‐track train of thought: she's not going on the flight. As this was the case, I could have involved somebody from outside, perhaps a colleague from a different platoon. This colleague could have helped me deal with the situation, by using this tool to ask a number of questions and to create space for other perspectives. That is what it is all about: you need to have someone who can also shed light on things from a different angle.

Participants argue that the VBR tool can also be used to give an account of decisions they have made under difficult circumstances. A participant with ample experience in being deployed explains:

…it acts as an overview on paper of the analysis and judgment process in the context of the time and place in which the decision was made.

A trainer further elaborates on this argument:

Allowing time and space for joint reflection can contribute to broadening the decision‐maker(s)' perspective, promote balanced judgment, elucidate considered decisions, strengthen support for the decision, among others, increase the disadvantaged party's degree of acceptance, and prepare the decision‐maker in terms of communicating the decision and being held accountable.

After using the model, the participants indicated that they thought it could also be used in contexts other than the Netherlands Military Middle Management Career Course to enable structured discussions on moral dilemmas, as one participant explains:

A useful tool to gain insight into moral dilemmas, I will use this tool in my own practice.

The model has also been used as part of ethics training during the Netherlands International Military Cooperation Course at the major‐colonel level, which includes participants from many countries. They indicate that using the model allowed them to learn from each other, particularly by including different values and perspectives from so many diverse backgrounds. Also, they feel that moral competence can be trained by using this method:

Moral competence, recognizing and handling moral dilemmas can be trained. This method is easy to use, and I will definitely be able to use this method myself in the future.

Both the case contributors and trainers acknowledge that it is easy to underestimate the moderator's role. Because it is a personal example, personal values are involved. The case presented above shows that when facing a personal moral dilemma, your deepest convictions and values are at stake. This is a vulnerable area; key values also move us emotionally, touching our fundamental morality. As the case contributor explains:

Hearing other people's insights is good, but it can also be painful.

3.8. Step 4: Reflection and learning

Trainers who have just started using the model emphasize that the three‐column diagram in the VBR tool is not simply a short fill‐in sheet or a goal in itself but is aimed at fostering moral competence. It is intended to provide structure, allowing a dialogue on moral dilemmas to take place that not only includes different perspectives but also invites the dialogue's participants to challenge their thinking and suspend their own judgment. A trainer reflects:

The aim of using the model is not to come to the correct moral judgment or the one true morally correct course of action, as we believe that values and standards have no universal or objective significance. Different insights are exactly what is needed to generate opportunities for further examination. Not with the aim of finding out who is right but to learn from the different vantage points on both the factual situation and the reflections on this situation. The VBR tool can support this process and thus be of value for approaching future moral dilemmas and also provide guidance when reflecting on the past.

The moderator and participants are jointly responsible for ensuring that a dialogue can actually take place. Yet moderators should be able to address the importance of dialogue, examine underlying values, and possibly intervene on the spot. Intervening with groups in productive ways requires considerable observational and intervention skills. Some participants argue that a trained moderator could be helpful in small groups. One participant suggests:

Sometimes, it may be necessary to have a trained moderator to ask critical questions and monitor the group process.

Trainers argue:

Perhaps we need to invest more in practicing this model during various training sessions so that participants can use it in their own practice.

And:

The importance of repetition should not be underestimated; this model can also be useful at military academies.

While the trainers do not feel the VBR tool requires a professional trainer or moderator, it does appear to require sensitivity to group dynamics—particularly because of the role emotions play. Paying attention to, acknowledging, and addressing emotions, especially in a military context, can be challenging but appears to be a key element in value‐based ethics support (e.g.,42). Moderators also highlight that emotions play an important part in identifying and weighing up values. A trainer explains:

Because emotions can be directly linked to our fundamental moral values, it is helpful to acknowledge and explore emotional aspects. The following questions can be helpful: what is of value to you in this situation? What is your attitude towards this as a person? And how do your feelings compare to how the other people involved experience the situation?

Before introducing the model, we explicitly address the importance of dialogical values in order to foster reflection and moral learning and to examine different perspectives as well as the vulnerability and dependency of those involved. Participants could also practice the dialogical guidelines presented in Table 4.

Table 4.

Dialogical guidelines.

Dialogical guidelines for using the VBR tool
  • Do not try to convince each other, instead allow room for various perspectives
  • Be inquisitive towards each other, posing questions that will make each other think
  • Do not convince anyone of your own rightness: postpone your judgment
  • Make an effort to understand others and to view things from their perspective
  • Speak for yourself, refrain from condemning others, instead share the effect something is having on you

Source: Adapted from Kessels, J., Boers, E., & Mostert, P. (2004). Free space and room to reflect. Philosophy in organizations. Boom Uitgeverij.

Trainers who use the model in their own practice indicate that it can also assist in acting on military leader competencies such as communication, intervision, and being people‐centered. These competencies feature in official leadership doctrine.

This model also supports leaders in developing leadership competencies in practice within their teams or units.

Based on these reflections, implementing the VBR tool requires investing in participants' listening and communication skills, providing them with adequate training throughout their professional education, and trying to foster an organizational setting that values and acknowledges these skills within the military. This implies extensive military ethics education at all levels to foster reflective consulting processes that enable moral learning. This ethics support tool is currently used at different levels in national and international courses within the Dutch Armed Forces. We aim to further develop the tool by collaborating with various departments and stakeholders within and outside the organization.

4. DISCUSSION AND CONCLUSION

The purpose of this study is to complement the predominant compliance‐based ethics support in military organizations with a value‐based approach tailored to this specific context. Drawing on value‐based ethics support insights from the healthcare domain, we developed the VBR tool and then tested it in a variety of settings, nationally and internationally. Our findings show that a value‐based ethics practice that is easy to use and incorporates military personnel's “lived experience” of moral dilemmas in its design fosters the development of moral competence. These findings have two implications for the literature.

First, our findings illustrate how value‐based approaches can complement compliance‐based ethics support in military organizations. The armed forces primarily draw on compliance‐based and conceptual approaches in their ethics training in order to “produce military personnel who carry out their duties effectively and within the bounds of the law.”43 This approach fails to support military personnel in developing virtues as it does not draw on their own experiences and dilemmas. The moral dilemma presented in this study shows that moral decision‐making processes involve individual reasoning. Equally, it unpacks the complex contextual and relational elements of the moral dilemma confronting the case contributor. That is to say, not only were his values at stake, but he also considered others' values, and his responsibility toward them. The VBR tool acknowledges the relational nature of moral competence and supports both moral decision‐making and responsibility for others.44 This includes reflecting on principles and legal norms, not in an abstract sense but applied in actual practice.45 Practical knowledge, therefore, does not dispute theory. Rules and norms can be made explicit and developed by processes of reflection. Consequently, participants acquire more general knowledge of the situation and reflect on the underlying values of existing norms. Unlike compliance‐based strategies, adopting this hermeneutic and care‐ethical perspective implies that values are rooted in concrete situations. This perspective explicitly includes responsibilities toward others and simultaneously provides insights and guidance about what generally matters in military practice and armed conflicts.

Second, our findings demonstrate design research's potential to develop actionable knowledge for ethics support practices in organizations. By developing, intervening, and evaluating a value‐based ethics support tool, this study addresses military organizations' struggle to move beyond compliance strategies and nurture a practice‐oriented approach to military ethics.46 This VBR tool serves as a guide for mutual dialogue on moral issues, inviting participants to share their thoughts, concerns, and values aloud and in a structured manner. The multiple values discussed are not a reflection of division but an alternative way to assess a situation. The VBR tool can thus be used in ethics education, in the workplace, or to support conversations about moral issues in other settings.

Additionally, value‐based ethics support and design research have much in common, including their relational and responsive dimensions, for instance, the importance of empathy and inquiry.47 Involving military personnel in the development and implementation of the tool allowed us to tailor it to the specific military context. This, in turn, fostered participants' ownership of the tool, also empowering them to deal with moral dilemmas in their work environment.48 Our study is thus the first attempt to pull together and synthesize the research findings on value‐based ethics support into an intervention design and subsequently put it to the test in a real‐life organization.

4.1. Limitations and future research

Our findings may generalize to teams in similar (i.e., hierarchy‐driven) organizations such as fire departments, police forces, or even hospitals. Future research could investigate to what extent other organizational settings might benefit from this VBR tool. One limitation is that the study was conducted in the Netherlands, a country with a relatively low power distance between people.49 The VBR tool, therefore, needs to be studied and evaluated in countries with a high power distance.

Another limitation arises from having to make decisions about the trade‐off between testing the VBR tool over a longer period (with more extensive data collection) and a shorter testing period in order to make it available to others for experimentation. We opted for the shorter period; however, more in‐depth studies would be highly valuable for further validating the combined social mechanisms that the intervention triggers, as well as the outcomes.

We argue that our design research approach can be highly beneficial when devising and evaluating ethics support tools.50 Future intervention studies could move beyond the “what works?” question that dominates organizational intervention research and randomized, controlled trials. Our study is a first step in this direction, and future work could focus even more on answering the question “what works for whom in which circumstances?” This may present a more suitable framework to better understand how the intervention's content and process mechanisms achieve the desired outcomes (i.e., improved ethics support).

4.2. Concluding remarks

VBR tools can help military employees foster moral competence and ethical decision‐making. Participants indicate that the model helped them to challenge their own thought processes and taught them how to view a moral dilemma from multiple perspectives (e.g., military and medical combined). They also highlight the importance of having this conversation together with colleagues and truly engaging in a dialogue where participants make an effort to understand the situation. The VBR tool can enable dialogue aimed at understanding past moral dilemmas or addressing current ones. While our findings illustrate that participants are positive about the VBR tool, further research is needed in order to confirm that using the tool fosters ethical decision‐making in complex situations and reduces mental stress.

ACKNOWLEDGMENTS

We would like to thank trainers Laura Veerman, Ewold de Bruijne, and Kevin van Loon, as well as participants of the Netherlands Military Middle Management Career Course, for co‐developing and testing the VBR tool.

Biographies

Eva van Baarle (PhD, VU University Amsterdam) is an assistant professor of military ethics and philosophy at the Netherlands Defense Academy and a research associate at the VU University Medical Center Amsterdam. Her research focuses on fostering reflective practice and moral competence by means of ethics education and on developing interventions that contribute to psychological safety and just culture in organizations.

Steven van Baarle (PhD, Eindhoven University of Technology) is an assistant professor of organization studies at the VU University Amsterdam, department of Management and Organization. His research interests include understanding the dynamics of stability and change in organizations.

van Baarle, E. , & van Baarle, S. (2025). Advancing ethics support in military organizations by designing and evaluating a value‐based reflection tool. Bioethics, 39, 5–17. 10.1111/bioe.13255

[Correction added on 21 December 2023, after first online publication: In affiliation 2, “VU University Amsterdam” has been changed to “VU Amsterdam” in this version.]

Footnotes

1

In many countries, ethics education for military healthcare providers and policymakers is tailored to these specific groups. This study takes a different approach by training these employees in settings that resemble the actual settings they work in, for instance, when deployed. Military healthcare providers and policymakers are embedded in the military organization. They wear a military uniform and receive their mid‐career training alongside personnel from all branches of the military. When deployed abroad, military physicians are part of a military unit, fulfilling multiple roles and responsibilities within their unit. These diverse roles entail a network of various relationships and responsibilities that inherently affect physicians' personal perspectives, decisions, and advice on ethical matters. We therefore argue that in the armed forces, it makes sense to discuss the ethical and human rights challenges facing healthcare providers and policymakers beforehand, when such challenges can be discussed as part of military ethics education.

2

Cook, M. L., & Syse, H. (2010). What should we mean by ‘Military Ethics’? Journal of Military Ethics, 9(2), 119–122.

3

Robinson, P., De Lee, N., & Carrick, D. (Eds.). (2008). Ethics education in the military. Ashgate Publishing Ltd; Coleman, S. (2012). Military ethics: An introduction with case studies. Oxford University Press.

4

Paine, L. (1994). Managing for organizational integrity. Harvard Business Review, 72(2), 106–117; De Colle, S., & Werhane, P. (2008). Moral motivation across ethical theories: What can we learn for designing corporate ethics programs? Journal of Business Ethics, 81(4), 751–764; Weaver, G., & Trevino, L. (2001). The role of human resources in ethics/compliance management: A fairness perspective. Human Resource Management Review. 11(1–2), 113–134.

5

Whetham, D. (2018). Challenges to the professional military ethics education landscape. In D. Carrick, J. Connelly, & D. Whetham (Eds.), Making the military moral (pp. 142–159). Routledge; Messelken, D. (2019). The ‘peace role’ of healthcare during war: Understanding the importance of medical impartiality. BMJ Military Health, 165(4), 232–235.

6

Aguilera, R., Filatotchev, I., Gospel, H., & Jackson, G. (2008). An organizational approach to comparative corporate governance: Costs, contingencies, and complementarities. Organization Science, 19(3), 475–492.

7

Schembera, S., Haack, P., & Scherer, A. (2022). From compliance to progress: A sensemaking perspective on the governance of corruption. Organization Science, 34(3), 1184–1215.

8

Van Baarle, E., Bosch, J., Widdershoven, G., Verweij, D., & Molewijk, B. (2015). Moral dilemmas in a military context. A case study of a train‐the‐trainer course on military ethics. Journal of Moral Education, 44(4), 457–478; De Bock, M., & Olsthoorn, P. (2016). Leadership development of junior army leaders: A Dutch perspective. Journal of Military and Strategic Studies, 16(4), 154–170.

9

Van Baarle, E. et al., op. cit. note 8.

10

Weaver, G., Trevino, L., & Cochran, P. (1999). Corporate ethics practice in the mid‐1990s: An empirical study. Journal of Business Ethics, 18, 283–294; Van Baarle, E., Hartman, L., Verweij, D., Molewijk, B., & Widdershoven, G. (2017). What sticks? The evaluation of a train‐the‐trainer course in military ethics and its perceived outcomes. Journal of Military Ethics, 16(1–2), 56–77.

11

Crane, A., & Matten, D. (2004). Questioning the domain of the business ethics curriculum. Journal of Business Ethics, 54(4), 357–369.

12

Olsthoorn, P. (2016). Ethics education for operations other than war: The Dutch approach. In D. Carrick (Ed.), Ethics education for irregular warfare (pp. 145–158). Routledge.

13

Cook, M. (2019). Reflections on the relationship between law and ethics. Adelaide Law Review, 40, 485–503; Toebes, B. (2013). Healthcare on the battlefield: In search of a legal and ethical framework. Journal of International Humanitarian Legal Studies, 4(2), 197–219.

14

Olsthoorn, P. (2016). Ethics education for operations other than war.

15

Van Baarle, E., Hartman, L., Verweij, D., Molewijk, B., & Widdershoven, G. (2017). What sticks?; Hooft, F. (2022). White coats, green jackets: Physicians and nurses in the Dutch armed forces, professional identity & agency, 1990–2010. PhD dissertation, Utrecht University.

16

Ritov, G., & Barnetz, Z. (2014). The interrelationships between moral attitudes, posttraumatic stress disorder symptoms and mixed lateral preference in Israeli reserve combat troops. The International Journal of Social Psychiatry, 60(6), 606–612; Currier, J., Holland, J., Drescher, K., & Foy, D. (2015). Initial psychometric evaluation of the moral injury questionnaire—Military version. Clinical Psychology & Psychotherapy, 22(1), 54–63; Jordan, A., Eisen, E., Bolton, E., Nash, W., & Litz, B. (2017). Distinguishing war‐related PTSD resulting from perpetration‐ and betrayal‐based morally injurious events. Psychological Trauma: Theory, Research, Practice, and Policy, 9(6), 627–63; Wisco, B., Marx, B., May, C., Martini, B., Krystal, J., Southwick, S., & Pietrzak, R. (2017). Moral injury in U.S. combat veterans: Results from the national health and resilience in veterans study. Depression and Anxiety, 34(4), 340–347.

17

Litz, B., Stein, N., Delaney, E., Lebowitz, L., Nash, W., Silva, C., & Maguen, S. (2009). Moral injury and moral repair in war veterans: A preliminary model and intervention strategy. Clinical Psychology Review, 29(8), 695–706; Frankfurt, S., & Frazier, P. (2016). A review of research on moral injury in combat veterans. Military Psychology, 28(5), 318–330.

18

Mullarkey, M., & Hevner, A. (2019). An elaborated action design research process model. European Journal of Information Systems, 28(1), 6–20; Sein, M., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action design research. MIS Quarterly, 35(1), 37–56.

19

Simon, H. (1996). Sciences of the artificial. MIT Press.

20

Nicolai, A., & Seidl, D. (2010). That's relevant! Different forms of practical relevance in management science. Organization Studies, 31(9–10), 1257–1285.

21

Rosemann, M., & Vessey, I. (2008). Toward improving the relevance of information systems research to practice: the role of applicability checks. MIS Quarterly, 32(1), 1–22; Van Aken, J. E. (2004). Management research based on the paradigm of the design sciences: The quest for field‐tested and grounded technological rules. Journal of Management Studies, 41, 219–246.

22

Hevner, A., March, S., Park, J., & Ram, S. (2004). Design science in information systems research. MIS Quarterly, 28(1), 75–105; Sein, M., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action design research. MIS Quarterly, 35(1), 37–56.

23

Sein, M., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action design research. MIS Quarterly, 35, 37–56.

24

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.

25

Van Baarle, E. et al., op. cit. note 8.

26

Soeters, J. (2018). Organizational cultures in the military. In G. Caforio & M. Nuciari (Eds.), Handbook of the sociology of the military (pp. 251–272). Springer.

27

Thornborrow, T., & Brown, A. (2009). Being regimented: Aspiration, discipline and identity work in the British parachute regiment. Organization Studies, 30(4), 355–376.

28

Van Baarle, E. et al., op. cit. note 8.

29

Gadamer, H. (1960). Wahrheit und methode. Grundzüge einer philosophischen hermeneutik. J.C.B. Mohr.

30

Tronto, J. (1993). Moral boundaries. A political argument for an ethic of care. Routledge; Tronto, J. (2013). Caring democracy. New York University Press.

31

Widdershoven, G., & Abma, T. (2007). Hermeneutic ethics between practice and theory. In R. Ashcroft., A. Dawson, H. Draper, & J. McMillan (Eds.), Principles of health care ethics (pp. 215–222.). Wiley.

32

Widdershoven, G., & Molewijk, B. (2010). Philosophical foundations of clinical ethics. A hermeneutic perspective. In J. Schildman (Ed.), Clinical ethics consultation. Theories and methods, implementation, evaluation (pp. 37–52). Ashgate; Hartman, L., Inguaggiato, G., Widdershoven, G., Wensing‐Kruger, A., & Molewijk, B. (2020). Theory and practice of integrative clinical ethics support: A joint experience within gender affirmative care. BMC Medical Ethics, 21(1), 1–13; Inguaggiato, G., Metselaar, S., Molewijk, B., & Widdershoven, G. (2019). How moral case deliberation supports good clinical decision‐making. AMA Journal of Ethics, 21(10), 913–919.

33

Metselaar, S., Van Schaik, M., Widdershoven, G., & Pasman, R. (2022). CURA: A clinical ethics support instrument for caregivers in palliative care. Nursing Ethics, 29(7–8), 1562–1577. doi:10.1177/09697330221074014.

34

Porz, R., Landeweer, E., & Widdershoven, G. (2011). Theory and practice of clinical ethics support services: Narrative and hermeneutical perspectives. Bioethics, 25(7), 354–360.

35

Weidema, F., Molewijk, A., Kamsteeg, F., & Widdershoven, G. (2013). Aims and harvest of moral case deliberation. Nursing Ethics, 20(6), 1–15; Haan, M., Van Gurp, J., Naber, S., & Groenewoud, A. (2018). Impact of moral case deliberation in healthcare settings: A literature review. BMC Medical Ethics, 19(1), 1–15.

36

Stolper, M., Molewijk, B., & Widdershoven, G. (2015). Learning by doing. Training health care professionals to become facilitator of moral case deliberation. HEC Forum, 27(1), 47–59.

37

Van Nistelrooij, I., Schaafsma, P., & Tronto, J. (2014). Ricoeur and the ethics of care. Medicine, Health Care and Philosophy, 17(4), 485–491; Widdershoven, G., & Molewijk, B. (2010). Philosophical foundations of clinical ethics. A hermeneutic perspective. In J. Schildman (Ed.), Clinical ethics consultation. Theories and methods, implementation, evaluation (pp. 37–52). Ashgate; Van Baarle, E. et al., op. cit. note 8; Van Dijke, J., van Nistelrooij, I., Bos, P., & Duyndam, J. (2019). Care ethics: An ethics of empathy? Nursing Ethics, 26(5), 1282–1291.

38

Edmondson, A. (1999). Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44(2), 350–383; Van Baarle, E., van de Braak, I., Verweij, D., Widdershoven, G., & Molewijk, B. (2019). The safety paradox in ethics training: A case study on safety dynamics within a military ethics train‐the‐trainer course. Medicine, Health Care and Philosophy, 22(1), 107–117.

39

Nielsen, K., Stage, M., Abildgaard, J., & Brauer, C. (2013). Participatory intervention from an organizational perspective: Employees as active agents in creating a healthy work environment. In Salutogenic organizations and change (pp. 327–350). Springer.

40

Hernes, G. (1998). Real virtuality. In P. Hedström & R. Swedberg (Eds), Social mechanisms: An analytical approach to social theory (pp. 74–101). Cambridge University Press.

41

Van Baarle, E. (2022). Fostering reflective practice and moral competence: Ethics education in the military. In Ethics and military practice (pp. 15–23). Brill Nijhoff.

42

Molewijk, B., Kleinlugtenbelt, D., & Widdershoven, G. (2011). The role of emotions in moral case deliberation: Theory, practice, and methodology. Bioethics, 25(7), 383–393.

43

Wolfendale, J. (2017). What is the point of teaching ethics in the military? In P. Robinson (Ed.), Ethics education in the military (pp. 175–188). Routledge, p. 161; Cook, M. (2019). Reflections on the relationship between law and ethics. Adelaide Law Review, 40, 485–503.

44

Edwards, S. (2009). Three versions of an ethics of care. Nursing Philosophy, 10(4), 231–240.

45

Benaroyo, L., & Widdershoven, G. (2004). Competence in mental health care: A hermeneutic perspective. Health Care Analysis, 12(4), 295–306; Van Nistelrooij, I., Schaafsma, P., & Tronto, J. (2014). Ricoeur and the ethics of care. Medicine, Health Care, and Philosophy, 17(4), 485–491.

46

Olsthoorn, P. (2016). Ethics education for operations other than war.

47

Hamington, M. (2019). Integrating care ethics and design thinking. Journal of Business Ethics, 155(1), 91–103.

48

Abildgaard, J., Nielsen, K., Wåhlin‐Jacobsen, C., Maltesen, T., Christensen, K., & Holtermann, A. (2020). ‘Same, but different’: A mixed‐methods realist evaluation of a cluster‐randomized controlled participatory organizational intervention. Human Relations, 73(10), 1339–1365.

49

Hofstede, G. (2001). Culture's consequences: Comparing values, behaviors, institutions and organizations across nations. Sage.

50

Nielsen, K., & Miraglia, M. (2017). What works for whom in which circumstances? On the need to move beyond the ‘what works?’ question in organizational intervention research. Human Relations, 70(1), 40–62.

