Frontiers in Psychology
2019 Jun 28;10:1478. doi: 10.3389/fpsyg.2019.01478

Laborious but Elaborate: The Benefits of Really Studying Team Dynamics

Michaela Kolbe 1,*, Margarete Boos 2
PMCID: PMC6611000  PMID: 31316435

Abstract

In this manuscript we discuss the consequences of methodological choices when studying team processes “in the wild.” We chose teams in healthcare as the application because teamwork can not only save lives but the processes constituting effective teamwork in healthcare are prototypical for teamwork in general: they range from decision-making (e.g., in multidisciplinary decision-making boards in cancer care) to leadership and coordination (e.g., in fast-paced, acute-care settings in trauma, surgery, and anesthesia) to reflection and learning (e.g., in post-event clinical debriefings). We draw upon recently emphasized critique that much empirical team research has focused on describing team states rather than investigating how team processes dynamically unfold over time and how these dynamics predict team outcomes. This focus on statics instead of dynamics limits the gain of applicable knowledge on team functioning in organizations. We first describe three examples from healthcare that reflect the importance, scope, and challenges of teamwork: multidisciplinary decision-making boards; fast-paced, acute care settings; and post-event clinical team debriefings. Second, we put center stage the methodological approaches with which teamwork in these representative examples has mostly been studied (i.e., mainly surveys, database reviews, and rating tools) and highlight how the resulting findings provide only limited insights into the actual team processes and their quality, leaving little room for identifying and targeting success factors. Third, we discuss how methodological approaches that take dynamics into account (i.e., event- and time-based behavior observation and micro-level coding, social sensor-based measurement) would contribute to the science of teams by providing actionable knowledge about the interaction processes of successful teamwork.

Keywords: team process, team dynamics, interaction analysis, methods, measurement

Introduction

Modern organizations rely on teams (Edmondson, 2012; Salas et al., 2013b; Mathieu et al., 2014). For decades, team researchers have been studying how teams create and maintain high performance, how they learn, and how they satisfy their members’ needs. A remarkable finding of this research is that high team performance is predicted not so much by the ability of individual team members as by the way they cooperate with one another: the team process (West, 2004; Woolley et al., 2010, 2015). Team process is defined as “members’ interdependent acts that convert inputs to outcomes through cognitive, verbal, and behavioral activities directed toward organizing taskwork to achieve collective goals” (Marks et al., 2001, p. 357). This definition implies that team processes are inherently dynamic, emerging over time and changing their pattern. It stands in contrast to the way teams have mostly been studied: much empirical team research has been static rather than dynamic, assessing team states rather than exploring how team processes dynamically develop over time and how these dynamics are related to team outcomes such as performance, satisfaction, and learning (Roe, 2008; Cronin et al., 2011; Humphrey and Aime, 2014; Mathieu et al., 2014; Kozlowski, 2015). As such, much team research has relied on self-reported and cross-sectional data with small samples and short analysis periods rather than on more meaningful, time-based behavioral data. While the number of theories and concepts factoring in time and temporal dynamics in team research is rising (McGrath and Tschan, 2004; Ballard et al., 2008; Lehmann-Willenbrock, 2017), the number of published empirical studies actually integrating dynamics is small considering how long and how urgently this research has been called for (Stachowski et al., 2009; Tschan et al., 2009, 2015; Grote et al., 2010; Lehmann-Willenbrock et al., 2011, 2013; Zijlstra et al., 2012; Boos et al., 2014; Kolbe et al., 2014; Lei et al., 2016).
This may be due both to the “unease of the psychologist in face of interaction” (Graumann, 1979) and to methodological challenges. However, recent team research has revealed that team members’ interaction patterns, rather than the frequencies of their individual actions, are what discriminates higher- from lower-performing teams (Kim et al., 2012; Zijlstra et al., 2012; Kolbe et al., 2014; Lei et al., 2016). These distinguishing dynamics cannot be uncovered with static research but require process-related methods such as sequential analysis, time series analysis, or process modeling. It is critical to understand how team processes emerge and change and what teams need to do to achieve the best outcomes. This is especially important in light of the evidence showing that poor teamwork in high-risk, high-complexity fields such as healthcare can have disastrous consequences, e.g., the loss of a patient’s life (Cooper et al., 1984; Flin and Mitchell, 2009; Reynard et al., 2009; Fernandez Castelao et al., 2011; Salas and Frush, 2013; Salas et al., 2013b).
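To make the contrast between static and process-related methods concrete, consider a minimal lag-sequential sketch in Python. The behavior codes and the coded sequence below are hypothetical, chosen for illustration only: the point is that transition probabilities between coded acts, not their raw frequencies, surface the interaction pattern.

```python
from collections import Counter, defaultdict

# Hypothetical stream of micro-coded team behaviors (illustrative only).
codes = ["instruct", "monitor", "instruct", "speak_up", "instruct",
         "monitor", "speak_up", "instruct", "monitor", "monitor"]

# Lag-1 transition counts: how often code A is immediately followed by code B.
transitions = defaultdict(Counter)
for a, b in zip(codes, codes[1:]):
    transitions[a][b] += 1

# Conditional probabilities P(next = B | current = A).
probs = {a: {b: n / sum(cnt.values()) for b, n in cnt.items()}
         for a, cnt in transitions.items()}

# Frequencies alone would hide that "speak_up" in this toy sequence is always
# followed by "instruct" -- the kind of pattern sequential analysis uncovers.
print(probs["speak_up"]["instruct"])  # 1.0 for this toy sequence
```

In applied studies, such transition matrices are typically tested against chance models before any pattern is interpreted.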

In this manuscript, we use teams in healthcare as the application context for illustrating the consequences of methodological choices in studying teams. We deliberately chose healthcare for three reasons. First, teamwork can save lives (Rosen et al., 2018a). There is ample evidence that poor teamwork is involved in medical error (Gawande et al., 2003; Greenberg et al., 2007), and improving teamwork is a major initiative in patient safety and healthcare (Pronovost, 2013; Salas and Frush, 2013; Vincent and Amalberti, 2016). Second, the processes constituting effective teamwork in healthcare are prototypical for teamwork in general: they range from decision-making (e.g., in multidisciplinary decision-making boards in cancer care) to leadership and coordination (e.g., in fast-paced, acute-care settings in trauma, surgery, and anesthesia) to reflection and learning (e.g., in post-event clinical debriefings). Many of the research gaps and much of the knowledge gained from studying teams in healthcare are applicable to teams in other industries (Salas et al., 2013b). Third, given the broad occurrence and critical importance of teams in healthcare and beyond, knowledge must be gained on what contributes to effective teamwork. Team science not only has a lot to offer with respect to theory and methodology; it also has an obligation to contribute to improving teamwork by providing theoretical and methodological knowledge and supporting teams in healthcare.

The goal of this manuscript is to illustrate the consequences of methodological choices when attempting to study and measure team processes “in the wild” such as in healthcare (Rosen et al., 2012; Salas, 2016). In particular, we aim to show that using methods relying on summative, cross-sectional data collection (e.g., rating teamwork aspects after a medical team performance episode) will result in limited insights into the actual dynamic team process. Instead, gaining critical comprehension of the dynamics that characterize effective teamwork requires methods that are more laborious (e.g., real-time behavior coding during the medical team performance episode) but provide a more elaborate understanding of what happened while working together. We argue that static team research is a methodological choice that diminishes rather than enhances potential contributions to the science of teams. While we greatly appreciate the value of teamwork surveys such as the Team Diagnostic Survey (Wageman et al., 2005) and the Aston Team Performance Inventory (West et al., 2006), particularly for assessing team members’ subjective perspective on team process functioning for the purposes of training and reflection, we argue that studying team dynamics by means of dynamic teamwork measures is a better methodological fit (Edmondson and McManus, 2007) and more promising for teamwork interventions.

For this purpose, we first describe three examples representing typical teamwork in healthcare and briefly refer both to the conceptual foundations underlying these examples and to current research needs, in order to highlight their representativeness for teamwork in general. Second, we put center stage the methodological approaches with which teamwork in these representative examples has mostly been studied and highlight the respective consequences. Third, we illustrate other potential methodological approaches which are, for the time being, more laborious but provide benefits for applied team science.

Three Representative Examples for Teamwork

As prototypical examples for teamwork we chose three team settings from healthcare: (1) multidisciplinary decision-making boards, (2) fast-paced, acute care settings, and (3) post-event clinical team debriefings. The examples convey the criticality both of teamwork for a range of tasks in an important professional sector and of team process as a mediator between the inputs and outcomes of teamwork. All three examples represent contemporary forms of more or less ad hoc team constellations (Tannenbaum et al., 2012). Embedded in organizational structures, they highlight the dynamic and emergent features of teams and the resulting requirements for methods appropriate to grasp these features.

Example 1: Multidisciplinary Decision-Making Boards

Multidisciplinary decision-making boards are a prototype of diverse teams in complex organizations in which the successful exchange of expertise should result in synergy. The most common example is the multidisciplinary tumor board in cancer care (Homayounfar et al., 2015), where experts of multiple disciplines discuss individual patient cases. More recently, Heart Teams have been formed, consisting of experts from the disciplines involved in managing complex, severe heart disease (e.g., cardiologists, cardiac surgeons, imaging specialists, anesthesiologists and, if required, general practitioners, geriatricians, and intensive care specialists) and charged with finding optimal treatments (Seiffert et al., 2013; Antonides et al., 2017; Falk et al., 2017). Multidisciplinary decision-making boards are implemented as a countermeasure to the increasing complexity of treatment options. Their objective is to provide patients with the most effective treatment in light of the severity of the disease, patients’ requests, resources, and the current state of medical research. Multidisciplinary tumor boards have already become an international standard of cancer care (Pox et al., 2013), and Heart Teams are recommended by the European Society of Cardiology and the European Association for Cardio-Thoracic Surgery (Falk et al., 2017).

The criticality of team process in multidisciplinary decision-making boards is illustrated in a meeting situation in Table 1. It shows that a lack of evidence-based communication rules, professional facilitation, and participative leadership behavior that take into account task complexity, conflicting goals, hierarchical structure, and time pressure can jeopardize the effective functioning, synergy, and development of multidisciplinary decision-making boards, and thus their ultimate mission to enhance patient care (Kolbe et al., 2019). As a consequence, team science must provide insights into effective teamwork processes as well as respective countermeasures.

TABLE 1.

Example of a problematic teamwork situation in multidisciplinary decision-making boards.

Situation: During a tumor board meeting, the chief of surgery arrives late; the discussion of a patient initially referred to her department has already started, with a preliminary vote for inclusion in a new clinical trial instead of surgery. Asserting her dominant position, she states that the patient will have to undergo surgery. None of the other board participants repeats the previously discussed arguments favoring the clinical trial, and the protocol documents a vote for surgery as a concordant decision.

Potential teamwork process problems, and the teamwork process insights required to address them:
- Counterproductive meeting behaviors and lack of meeting rules (Allen et al., 2015): identification of the actions required to set up and facilitate multidisciplinary tumor board meetings.
- Risk that leaders dominate the discussion (Larson et al., 1998): understanding of facilitation techniques that allow for balanced exploitation of information from all board members and of optimal decision rules.
- Lack of psychological safety and lack of sharing of information, opinions, and concerns by all board members (Mesmer-Magnus and DeChurch, 2009; Edmondson and Lei, 2014): understanding how to establish and maintain psychological safety during interdisciplinary tumor board meetings.

Example 2: Fast-Paced, Acute Care Settings

Fast-paced, acute care settings such as medical emergencies are prototypical for so-called action teams, i.e., teams that are confronted with highly dynamic, complex, and consequential tasks (Tschan et al., 2006, 2011a). They require teamwork at its best (Driskell et al., 2018; Maynard et al., 2018). For example, resuscitating a patient requires prompt and well-coordinated actions such as diagnosing the cardiac arrest, oxygenating the brain and reestablishing spontaneous circulation (Tschan et al., 2011b). Other fast-paced, acute care settings require more sense-making processes, for example when the diagnosis is not yet clear. Team members must adaptively engage in immediate problem awareness and diagnosis, information-processing, problem-solving, and coordination of actions (Hunziker et al., 2011; Tschan et al., 2011a, 2011b, 2014). They must do this under time pressure and high workload—and in many instances off the cuff as ad hoc action teams (Kolbe et al., 2013a). Both the European Resuscitation Council (ERC) and the American Heart Association (AHA) recommend integrating teamwork trainings into advanced life support education (Bhanji et al., 2010; Soar et al., 2010). This is, in part, realized by simulation-based team training (Kolbe et al., 2013b; Salas et al., 2013a; Weaver et al., 2014).

The criticality of team process in fast-paced, acute care settings is illustrated in a sample situation in Table 2. This example highlights teamwork problems that are particularly challenging when teams face complex tasks, unpredictable circumstances, time pressure, high risk, and/or rapid workload changes, as is the case in action teams.

TABLE 2.

Example of a problematic teamwork situation in fast-paced, acute care settings.

Situation: At 2 a.m. a patient is brought into the trauma center, appearing to have multiple traumatic injuries. The nurses prepare the patient as quickly as possible and the anesthesia sub-team begins inducing anesthesia. The trauma doors open; the attending trauma surgeon comes in, starts yelling, and forcefully expresses her disapproval that the patient lies uncovered, bare, and fully exposed in the cold room, adding that she does not know how many more times she will have to complain about it before the nurses eventually get it. The nurses look at each other, roll their eyes, and continue their work. So does the anesthesiologist.

Potential teamwork process problems, and the teamwork process insights required to address them:
- High frequency of uncivil behavior and its detrimental, contagiously spreading effects on team performance outcomes (Porath and Erez, 2009; Riskin et al., 2015; Foulk et al., 2016; Bar-David, 2018; Klingberg et al., 2018): insights into the unfolding of incivility in fast-paced, acute care settings and into potential triggers of civility.
- Low frequency of voice behavior and related missed opportunities for improvement (Morrison and Milliken, 2000; Kobayashi et al., 2006; Detert and Burris, 2007; Tangirala and Ramanujam, 2012; Schwappach and Gehring, 2014; Raemer et al., 2016): understanding of the social dynamics enabling voice behavior in fast-paced, acute care settings.
- Difficulty functioning as a highly interdependent team because of low civility (Salas, 2016): identification of team adaptation mechanisms for maintaining and regaining functionality despite low civility.

Example 3: Clinical Team Debriefings

Designed to promote learning from reflected experience, debriefings are guided conversations that facilitate the understanding of the relationships among events, actions, thought and feeling processes, and team performance outcomes (Ellis and Davidi, 2005; Rudolph et al., 2007). With respect to the team setting, debriefings have some characteristics in common with multidisciplinary decision-making boards (example 1): they rely on psychological safety to provide a conversational climate that allows for information-sharing and sense-making. They are also formed ad hoc, consist of interprofessional and, in many cases, multidisciplinary members across the authority gradient, and exist within complex, hierarchical organizations. What distinguishes them from multidisciplinary decision-making boards is their task: whereas the boards’ task is to make decisions regarding future diagnosis and treatment, the task of debriefings is to learn from previous, collective experience. Learning outcomes may vary among team members, and decisions are not necessarily required. Also called after-action reviews, after-event reviews, and post-event reviews, debriefings aim to provide the structure for shifting from automatic/habitual to more conscious/deliberate action and information processing (Ellis and Davidi, 2005; DeRue et al., 2012). Debriefings allow for reflection and self-explanation, data verification and feedback, understanding the relationship between teamwork and taskwork, uncovering and closing knowledge gaps and disparities in shared cognition, structured information sharing, goal setting and action planning, as well as changes in attitudes, motivation, and self- and collective efficacy (Ellis and Davidi, 2005; Rudolph et al., 2007, 2008; DeRue et al., 2012; Eddy et al., 2013; Tannenbaum and Cerasoli, 2013; Tannenbaum and Goldhaber-Fiebert, 2013; Tannenbaum et al., 2013; Kolbe et al., 2015; Eppich et al., 2016; Sawyer et al., 2016b; Allen et al., 2018).
In healthcare, debriefings are particularly suited for ad hoc teams. While they have become a core ingredient of simulation-based team training (Cheng et al., 2014; Eppich et al., 2015; Sawyer et al., 2016a), their use in daily clinical practice is still limited (Tannenbaum and Goldhaber-Fiebert, 2013) given their vast potential (Mullan et al., 2014; Kessler et al., 2015; Eppich et al., 2016).

The criticality of team process in clinical team debriefings is illustrated in a sample situation in Table 3. This example sheds light on the question of how team members, and teams as a whole, can make use of reflexivity about their team- and taskwork. This includes identifying process-related markers that indicate turning points in the team process, setting the course for more or less effective team output.

TABLE 3.

Example of a problematic teamwork situation in clinical debriefings.

Situation: After the management of an unexpected cardiac arrest during surgery, most team members come together for a debriefing. While the participating attending physicians engage in a heated discussion about who was right and who caused the cardiac arrest, the residents and nurses remain rather quiet. After a few minutes, the most senior attending physician shares his thoughts on why everybody did what they did and concludes the debriefing, advising the team at large that the mistake simply must not happen again.

Potential teamwork process problems, and the teamwork process insights required to address them:
- Team members may experience fear, anxiety, and embarrassment when making and discussing potential mistakes and engage in face-saving actions such as withdrawal, reluctance to ask for help and to disclose errors, and obscuring critique (Schein, 1993; Edmondson, 1999; Rudolph et al., 2013): identification of team adaptation mechanisms for creating and maintaining psychologically safe learning moments in clinical debriefings.
- Lack of debriefing rules (Allen et al., 2015; Kolbe et al., 2015), psychological safety, and voice (Rudolph et al., 2014): understanding of the required debriefing rules.
- Risk of shallow or short-sighted argumentation, single- rather than double-loop learning, low levels of reflection, and limited effectiveness of feedback (Argyris, 2002; Homayounfar et al., 2015; Kihlgren et al., 2015; Hughes et al., 2016; Boos and Sommer, 2018): identification of characteristic modes of argumentation in debriefings depending on status, context, and authority gradient, and of potential turning points and structural instabilities in communication.

The three examples were chosen to illustrate generic features of team tasks and team processes. Team tasks call for heterogeneous expertise to be shared and for problem-solving and decision-making procedures that fit task requirements. The tasks require teams to handle interdependent subtasks effectively. And teams learn best when they reflect on their team- and taskwork. The vehicle for the accomplishment of all of these task requirements is the team process. The identification of functional team behaviors, of critical points and phases in the team process, and of patterns of how team behavior evolves and adapts to task requirements, as well as the facilitation of appropriate team process patterns, can help to improve teamwork.

Previous Methodological Approaches and Their Consequences

Having outlined tasks, prototypical process patterns, and respective research needs in the three team examples, we now put center stage the methodological approaches with which teamwork in these representative examples has mostly been studied and highlight their consequences. In order to be as specific, illustrative, and substantial as possible, we will, in a subsequent step, start from the examples to conceptualize and describe methods that promise deeper and more differentiated insights into teamwork and thus provide a basis for more effective practical interventions. We show important implications, for team performance outcomes, of focusing on team dynamics and of using suitable methods to capture dynamic processes.

Previous Methodological Approaches of Studying Teamwork in Multidisciplinary Decision-Making Boards

Studies investigating the effectiveness of multidisciplinary decision-making boards have mainly relied on surveys or database reviews. Database reviews comprise the systematic review of certain documents, for example hospitals’ patient documentation systems. Surveys comprise questionnaires on specific aspects of self-reported teamwork quality and processes, typically completed by team members in a cross-sectional way. Rating scales such as behaviorally anchored rating scales include examples of desired and undesired behavior and a scale for assessing the quality of these behaviors, mostly completed by non-team members (e.g., observers) in a cross-sectional way. Studies using these methods have mostly focused on input and output factors such as (a) whether a multidisciplinary decision-making board is present or not (Keating et al., 2013), (b) whether tumor boards are attended or not (Kehl et al., 2015), (c) the content that is being discussed (Snyder et al., 2017), (d) whether conducting a tumor board leads to a change in the management plan or not (Tafe et al., 2015; Brauer et al., 2017; Thenappan et al., 2017), (e) the feasibility with respect to use of technology or overall duration (Marshall et al., 2014), (f) the degree to which the tumor board is valued by participants (Snyder et al., 2017), and (g) the documentation during the board meeting (Farrugia et al., 2015). These studies provide valuable information on the context and some organizational conditions of tumor boards’ effectiveness, which should not be underestimated (Salas, 2016). However, they are limited in their potential to reveal insights into the actual process and quality of information-sharing and decision-making. This is problematic because it is particularly the quality rather than the quantity of communication that is important for performance (Marlow et al., 2018).
That is, whereas some effectiveness factors such as optimal team composition, infrastructure, and data base logistics are already well-investigated, there are fewer data on advantageous interaction and communication processes before and during multidisciplinary decision-making board meetings. This is challenging because, as illustrated in the meeting example above, it is particularly the dynamic process that—in interaction with task complexity, time pressure, conflicting goals, and hierarchical structure—endangers the quality of the decision outcome.

Some studies have explicitly addressed the decision-making in tumor boards. They have relied on self-reports (Lamb et al., 2011) and rating tools such as the Multidisciplinary Team Metric for Observation of Decision-Making (MDT-MODe, Lamb et al., 2013; Shah et al., 2014). Although not addressing the decision-making process as such, these studies have provided valuable knowledge on (a) the ability to reach decisions (e.g., 82.2 to 92.7%, Lamb et al., 2013), (b) the attendance rate and duration of case reviews (e.g., 3 min per case, Shah et al., 2014), (c) estimates of the (poor) quality of presented information (e.g., 29.6 to 38.3%, Lamb et al., 2013), (d) estimates of the (poor) quality of teamwork (e.g., 37.8 to 43.0%, Lamb et al., 2013), (e) the comparative quality of team members’ contributions (e.g., highest from surgeons, Shah et al., 2014), and (f) the barriers to reaching decisions (e.g., inadequate information, Lamb et al., 2013). Although the authors of these studies conclude that rating and self-report tools allow for reliably assessing the quality of teamwork and decision-making (e.g., Lamb et al., 2011), we argue that the methodology of these studies does not allow for insights into the actual, dynamic process of information-sharing and decision-making or the quality of the communication process: it remains unanswered (a) how contributions are shared among board members of different levels of hierarchy, (b) who actually contributes which information, and when, (c) how other board members react, (d) how individual contributions do or do not influence the decision recommendation, and (e) how dissent about evaluations and recommendations emerges and dissolves.
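Answering questions (a) through (e) requires data at the level of time-stamped, speaker-attributed, coded utterances rather than summary ratings. A minimal sketch of such a record structure, in Python, with hypothetical field names and behavior codes chosen purely for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CodedUtterance:
    """One micro-coded utterance from a (hypothetical) tumor board transcript."""
    t: float                      # seconds since meeting start
    speaker: str                  # role, e.g., "surgeon", "oncologist"
    code: str                     # behavior code, e.g., "share_info", "dissent"
    target: Optional[str] = None  # addressee, if identifiable

# Toy coded transcript (illustrative only).
log = [
    CodedUtterance(12.4, "oncologist", "share_info"),
    CodedUtterance(30.1, "surgeon", "propose"),
    CodedUtterance(41.7, "radiologist", "dissent", target="surgeon"),
]

# With such a log, "who contributes which information, and when" becomes a
# simple query rather than a retrospective global rating.
surgeon_turns = [u for u in log if u.speaker == "surgeon"]
print(len(surgeon_turns), surgeon_turns[0].t)
```

The design choice is that each row captures one act in context (time, actor, code, addressee), which is exactly the granularity that sequential and time-series analyses consume.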
We have argued that neglecting these critical characteristics of the decision-making process is to some degree comparable to a patient undergoing surgery while his or her condition is judged using a rating scale from 1 (bad) to 5 (good) instead of collecting and interpreting data using continuous, machine-based monitoring of heartbeat, breathing, blood pressure, body temperature, and other body functions (Kolbe and Boos, 2018).

Previous Methodological Approaches of Studying Teamwork in Fast-Paced, Acute Care Settings

A number of studies have been conducted to assess how healthcare teams manage fast-paced, acute care settings. They relied on various methods, ranging from surveys (Valentine et al., 2015) through rating tools (e.g., Undre et al., 2009; Couto et al., 2015) to event- and time-based observation tools (e.g., Riethmüller et al., 2012; Schmutz et al., 2015; Su et al., 2017). Teamwork observation measures have been developed for capturing teamwork in complex medical situations (e.g., Fletcher et al., 2004; Yule et al., 2006; Manser et al., 2008; Kolbe et al., 2009, 2013a; Tschan et al., 2011b; Kemper et al., 2013; Robertson et al., 2014; Seelandt et al., 2014). Overall, these observation tools fall into two main categories: behavioral marker systems (e.g., Fletcher et al., 2004; Yule et al., 2006; Undre et al., 2009; Kemper et al., 2013; Jones et al., 2014; Robertson et al., 2014) and coding schemes (e.g., Manser et al., 2008; Kolbe et al., 2009, 2013a; Tschan et al., 2011b; Seelandt et al., 2014). Both types of tools have a number of advantages and disadvantages (Kolbe and Boos, 2018). For example, Undre and colleagues applied a behavioral marker system at three designated times during 50 surgical procedures. They found that teamwork behavior could actually be compared between members of different operating room subteams and that surgeons’ teamwork scores deteriorated toward the end of procedures (Undre et al., 2007). Whereas these results provide valuable knowledge of teamwork estimates and perceived quality, they do not provide insights into the actual operating room team interaction process. This has been possible with studies using behavior coding. For example, Tschan and colleagues continuously coded communication during 167 surgical procedures and found that case-irrelevant communication during the closing phase of the procedure in particular was associated with higher rates of surgical site infections (Tschan et al., 2015).
Similarly, Riethmüller and colleagues applied a category system for team coordination in anesthesia (Kolbe et al., 2009) to code the coordination activities of simulated anesthesia task episodes and, in addition, assessed awareness of situational triggers and the subsequent handling of complications in post-simulation interviews based on stimulated video recall of the critical phases around the complication. They showed that the occurrence of a complication, e.g., an anaphylaxis or a malignant hyperthermia, during a simulated routine anesthesia requires a shift from implicit to explicit coordination behavior (Riethmüller et al., 2012). Also, Weiss and colleagues tested the effects of inclusive leader language on voice in multi-professional healthcare teams in simulated medical emergencies. Specifically, they coded implicit (i.e., first-person plural pronouns) and explicit (i.e., invitations and appreciations) inclusive leader language and found that leaders’ implicit utterances were more strongly related to residents’ (in-group) voice behavior, whereas explicit invitations were more strongly related to nurses’ (out-group) voice behavior (Weiss et al., 2017a).

As these studies indicate, behavior coding as a stand-alone method for capturing teamwork, although requiring much time and many resources, not only provides very specific insights into the relationship between team dynamics and outcomes but also offers actionable knowledge for more targeted team training interventions.
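Coding schemes of the kind cited above stand or fall with interrater reliability, which such studies typically report as Cohen's kappa. A minimal computation for two coders on a shared event stream, with toy data (real studies would use dedicated statistical packages):

```python
# Two hypothetical coders' codes for the same ten observed events.
coder_a = ["explicit", "implicit", "explicit", "explicit", "implicit",
           "implicit", "explicit", "implicit", "explicit", "explicit"]
coder_b = ["explicit", "implicit", "explicit", "implicit", "implicit",
           "implicit", "explicit", "implicit", "explicit", "explicit"]

n = len(coder_a)
# Observed agreement: proportion of events both coders coded identically.
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Chance agreement: product of marginal proportions, summed over categories.
categories = set(coder_a) | set(coder_b)
expected = sum((coder_a.count(c) / n) * (coder_b.count(c) / n)
               for c in categories)

# Cohen's kappa corrects observed agreement for agreement expected by chance.
kappa = (observed - expected) / (1 - expected)
print(round(kappa, 2))  # 0.8
```

For the toy data above, observed agreement is 0.9 against a chance expectation of 0.5, giving kappa = 0.8; conventions vary, but values in this range are usually reported as substantial agreement.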

Previous Methodological Approaches of Studying Teamwork in Clinical Debriefing

The empirical investigation of debriefing and reflexivity in teams is relatively new. Although their overall team context bears similarities to multidisciplinary decision-making boards, research on debriefings has differed significantly from research on decision-making boards. In disciplines such as psychology and organizational behavior, this research involves experiments (e.g., Gurtner et al., 2007; Ellis et al., 2009, 2010; DeRue et al., 2012; Eddy et al., 2013; Konradt et al., 2015; Otte et al., 2018) and field studies (Vashdi et al., 2013; Weiss et al., 2017b) in which the impact of reflexivity interventions on defined outcomes is tested and different debriefing approaches are compared (e.g., unstructured vs. structured). In disciplines such as healthcare and medical education, there is far more conceptual than empirical work on debriefings. The conceptual work has focused on how to conduct debriefings (Rudolph et al., 2007, 2008, 2013, 2014; Cheng et al., 2014; Eppich et al., 2015, 2016; Kessler et al., 2015; Sawyer et al., 2016a; Cheng et al., 2017; Kolbe and Rudolph, 2018; Endacott et al., 2019). The empirical work has focused on communication in debriefings, albeit rather unsystematically and rarely applying rigorous team science methodology (e.g., Husebø et al., 2013; Kihlgren et al., 2015). Previous research on teamwork in debriefings has thus yielded valuable knowledge on debriefing effectiveness and on the macro-level debriefing process on the one hand, but very limited actionable knowledge on optimal debriefing interaction processes and on facilitation for high-quality reflection on the other.

There are measures available for assessing team reflection and debriefing: (a) REMINT—a reflection measure for individuals and teams (Otte et al., 2017), (b) Debriefing Assessment for Simulation in Healthcare (DASH, The Center for Medical Simulation, 2010; Brett-Fleegler et al., 2012), (c) Objective Structured Assessment of Debriefing (OSAD, Arora et al., 2012), and (d) DECODE for assessing debriefers’ and learners’ communication in debriefings (Seelandt et al., 2018). While REMINT is a self-report measure and not applicable for assessing team dynamics, DASH and OSAD are behavioral marker systems. A recent study pointed to the challenges of measuring team debriefing quality via behavioral markers: Hull and colleagues compared OSAD-based evaluations by expert debriefing evaluators, debriefers, and learners (i.e., team members). They found significant differences between these groups: (a) debriefers perceived the quality of their debriefings more favorably than expert debriefing evaluators did, and (b) agreement between learners’ and expert evaluators’ perceptions, as well as with debriefers’ perceptions, was weak (Hull et al., 2017). That is, whereas research applying behavioral marker tools can reveal knowledge on differences in perceptions of debriefer/debriefing quality, it provides only limited insights into optimal debriefing interaction processes and how to facilitate high quality reflection in debriefings. This is problematic because, similarly to multidisciplinary decision-making boards (example 1), it is the quality rather than the quantity of communication that is important for performance (Marlow et al., 2018); so far, not much is known about how to achieve high quality team interaction during clinical debriefings.

In sum, the review of existing methods used in the three exemplary team research areas shows that approaches exist for assessing team processes as the critical mechanism mediating the effects of input factors on team performance outcomes. Particularly advanced is the research on teamwork in fast-paced, acute care settings, with progressive development and application of methods apt for capturing the dynamics of teamwork. Still, overall there is too much focus on aggregate measures, rating tools, and self-report data instead of fine-grained process analysis (Table 4). In what follows, we illustrate potential additional methodological approaches which are, for the time being, more laborious, and we highlight their benefits for applied team science. We show the benefits of team interaction process analysis for shedding light on the dynamics of teamwork during decision-making in multidisciplinary boards, in fast-paced, acute care settings, and during shared reflection.

TABLE 4.

Previous and laborious methodological approaches and their consequences.

Example | Team conceptual foundation | General research question | Previous methodological approaches and their consequences | Laborious methodological approaches and their consequences
1 Multi-disciplinary decision-making boards Collective information sharing and decision-making in ad hoc, diverse teams (Stasser and Titus, 1987; Larson et al., 2002; Mesmer-Magnus and DeChurch, 2009; Schulz-Hardt and Mojzisch, 2012). How may input characteristics that are typical of multidisciplinary decision-making boards (e.g., high salience of status and hierarchy, conflicting goals, time pressure) be associated with ineffective decision-making dynamics and suboptimal results, and which risks result?

What are effective countermeasures for managing these risks given the special characteristics of these boards?
Approaches:
Database reviews
Surveys/self-reports
Rating scales

Consequences:
+ Knowledge about input-output relations
+ Knowledge about context factors and selected organizational conditions
+ Knowledge on selected effectiveness criteria such as team composition, infrastructure, database logistics
− Very limited insights into risks of how typical characteristics of multidisciplinary decision-making boards are associated with ineffective information-sharing and decision-making dynamics, management of dissent and suboptimal results
− Very limited actionable knowledge for designing effective countermeasures for managing decision-making risks due to special characteristics of these boards.
Approach:
Event- or time-based coding of each member’s verbal and non-verbal contributions and analysis of board interaction patterns with respect to characteristic input factors and decision outcomes

Consequences:
− Exhaustive behavior coding and data analysis require significantly more time and resources than using surveys/rating scales
+ Detailed insights into risks of how typical characteristics of multidisciplinary decision-making boards are associated with ineffective information-sharing and decision-making dynamics, management of dissent and suboptimal results
+ Actionable knowledge for designing effective countermeasures for managing decision-making risks due to special characteristics of these boards
2 Teamwork in fast-paced, acute care settings Leadership, coordination, and communication in ad hoc teams (Gaba et al., 2001; Künzle et al., 2010; Boos et al., 2011; Tschan et al., 2014; Fernandez Castelao et al., 2015; Su et al., 2017). How does incivility unfold during fast-paced, acute care settings and what are potential team adaptation triggers of civility?

What are team adaptation mechanisms for maintaining and regaining functionality despite low civility?

What are the enabling social dynamics of voice behavior during fast-paced, acute care settings? How can voice behavior emerge and be effective?
Approaches:
Surveys/self-reports
Rating scales/behavior-marker systems

Consequences:
+ Knowledge of perceived teamwork estimates and perceived teamwork quality
+ Differences in the perceptions of teamwork among team members or subteams highlighted
− Very limited actionable knowledge on actual team interaction such as unfolding of incivility and how it relates to performance outcomes

Alternative approach:
Time- and event-based behavior coding

Consequences of alternative approach:
− Coding requires significantly more time than using surveys/rating scales
+ In-depth, actionable knowledge on the process of team interaction and adaptation
Approach:
Time- and event-based behavior coding combined with social-sensor-based measurement (e.g., physiological data, pose)

Consequences:
− Coding still requires significantly more time than surveys/rating scales
+/− Sensor-based measurement is more feasible and unobtrusive but strategies for data analysis are still being developed
+ Comprehensive, in-depth, actionable knowledge on the dynamic process of actual visible and invisible team interactions related to phenomena such as (in-)civility and voice during fast-paced, acute care settings
3 Post-event, clinical team debriefing Individual and team learning in ad hoc teams (Gurtner et al., 2007; Edmondson, 2012; Tannenbaum et al., 2012; Vashdi et al., 2013; Konradt et al., 2015; Schmutz and Eppich, 2017).
Reflective practice (i.e., the exploration of one’s mental routines, taken-for-granted assumptions, and their behavioral consequences) and shared reflection (i.e., collectively looking back on past experience) (Schön, 1983; Argyris, 2002; Edmondson, 2012; Konradt et al., 2016; Koeslag-Kreunen et al., 2018; Otte et al., 2018).
What are team adaptation mechanisms for creating and maintaining psychologically safe learning moments for clinical team debriefings?

What are the team interaction processes that constitute high quality reflection? How do structural instabilities in communication (due to status, context, authority gradient) unfold and what are potential turning points in shared reflection? What are the resulting required process rules for conducting clinical debriefings?
Approach:
Experiments and field studies testing the impact of debriefing or its structure on team outcomes

Consequences:
+ Knowledge on debriefing effectiveness and on macro-level debriefing process
− Very limited knowledge on optimal debriefing interaction processes
− Very limited actionable knowledge on mechanisms for establishing psychological safety in debriefings
− Very limited knowledge on how to facilitate high quality reflection

Alternative approach:
Self-reports
Rating scales/Behavioral marker systems

Consequences of alternative approach:
+/− Knowledge on differences in perceptions of debriefer/debriefing quality
− Very limited knowledge on optimal debriefing interaction processes
− Very limited actionable knowledge on mechanisms for establishing psychological safety in debriefings
− Very limited knowledge on how to facilitate high quality reflection
Approach:
Time- and event-based behavior and communication content coding combined with social-sensor-based measurement (e.g., eye-tracking, pose)

Consequences:
− Behavior and communication coding still requires significantly more time than using surveys/rating scales
+/− Sensor-based measurement is more feasible and unobtrusive but strategies for data analysis are still being developed
+ Charting of the information flow by coding utterances, e.g., mention, repeat, value an information
+ Actionable knowledge on optimal debriefing interaction processes and on mechanisms for establishing psychological safety in debriefings
+ Actionable knowledge on how to facilitate high quality reflection which can be translated into interventions and process rules for facilitating clinical debriefings

Laborious Methodological Approaches and Their Benefits

We have labeled the methods described below as laborious because they involve, for the time being, more time and resources than most of the approaches mentioned above. In order to be as specific, illustrative, and substantial as possible, we use the three examples of multidisciplinary decision-making boards, fast-paced, acute care settings, and clinical debriefings to conceptualize and describe methods that promise deeper and more differentiated insights into teamwork and thus provide a basis for more effective practical interventions.

Laborious Methodological Approaches of Studying Teamwork in Multidisciplinary Decision-Making Boards

In order to complement existing research on multidisciplinary decision-making boards’ effectiveness, we recommend collecting data by means of event-based or time-based sampling of critical interaction behavior and analyzing these data with coding systems designed to help uncover team decision processes that are critical but invisible to the unaided eye (Table 1). These methods allow for in-depth analysis of what actually happens in multidisciplinary decision-making boards. This is important for identifying success factors. For example, using the Advanced Interaction Analysis for Teams (act4teams) coding scheme (Kauffeld et al., 2018) for analyzing multidisciplinary decision-making board team member behaviors could provide useful insights into (a) the optimal sequence of voicing information versus expressing decision preferences too early in the meeting (Mojzisch and Schulz-Hardt, 2010), (b) the impact of board leaders’ statements compared to lower status members’ contributions on the discussion and outcome (Lehmann-Willenbrock et al., 2015), (c) the emergence and impact of counterproductive meeting behaviors such as arriving late, complaining, and engaging in irrelevant discussions (Allen et al., 2015), and (d) the role of solution-focused meeting behavior such as suggesting a new idea or endorsing a solution (Lehmann-Willenbrock et al., 2017).
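To illustrate what such event-based analysis can look like computationally, the following minimal sketch derives lag-1 transition probabilities (how likely one coded behavior is to follow another) from an event-coded behavior stream. The code labels and the short event sequence are invented for illustration; they are not act4teams categories.

```python
from collections import Counter, defaultdict

def transition_probabilities(coded_events):
    """Lag-1 sequential analysis: P(next code | current code) estimated
    from a chronologically ordered list of behavior codes, such as those
    produced by event-based coding of a board meeting."""
    pair_counts = Counter(zip(coded_events, coded_events[1:]))
    totals = defaultdict(int)
    for (current, _), n in pair_counts.items():
        totals[current] += n
    return {(a, b): n / totals[a] for (a, b), n in pair_counts.items()}

# Hypothetical coded meeting excerpt (illustrative codes, not act4teams labels)
events = ["share_info", "preference", "share_info",
          "share_info", "preference", "share_info"]
probs = transition_probabilities(events)
# e.g., probs[("preference", "share_info")] == 1.0 in this toy excerpt
```

Patterns such as preference statements reliably cutting off further information sharing would show up as high transition probabilities out of the corresponding codes.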

Likewise, applying aspects of the Hidden Profile coding scheme (Thürmer et al., 2018), MICRO-CO (Kolbe et al., 2011), or ARGUMENT (Boos and Sommer, 2018) would allow for (a) tracing information processing during the meeting, (b) revealing insights into what expert information is actually (not) processed and (not) integrated into decisions and how, and (c) disassembling the argumentation process into its elements, e.g., identifying grounds that are used to support specific claims for action. In a similar vein, continuously coding actual participation rather than attendance in the meeting would allow for insights into the balance of speaker switches, which has been found to be a predictor of good team performance (Woolley et al., 2010; Lehmann-Willenbrock et al., 2017). These insights into the complex, multi-layered decision-making process will be relevant not only for improving multidisciplinary decision-making boards in healthcare but for multi-team and board decision-making in general.
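The participation measures mentioned above can be sketched from a coded turn sequence as follows. Both the toy turn sequence and the use of normalized Shannon entropy as a balance index are illustrative assumptions on our part, not the specific metric used by Woolley et al. (2010).

```python
import math
from collections import Counter

def participation_balance(turns):
    """Balance of speaking turns across members as normalized Shannon
    entropy: 1.0 = perfectly even turn-taking, near 0 = one member
    dominates. `turns` is the chronological sequence of speaker IDs."""
    counts = Counter(turns)
    n = len(turns)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    max_entropy = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return entropy / max_entropy

def speaker_switches(turns):
    """Number of changes of speaker between adjacent turns."""
    return sum(1 for a, b in zip(turns, turns[1:]) if a != b)

# Hypothetical turn sequence from continuous participation coding
turns = ["A", "B", "A", "C", "B", "A"]
switches = speaker_switches(turns)   # 5 switches in this toy sequence
balance = participation_balance(turns)
```

Comparing such indices across boards with good versus poor decision outcomes would be one concrete way to test the speaker-switch hypothesis in this setting.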

Laborious Methodological Approaches of Studying Teamwork in Fast-Paced, Acute Care Settings

To complement existing research and to provide context-sensitive tools for fast-paced, acute care settings, we need methods that capture the very process of teamwork in as detailed, sensitive, and unobtrusive a manner as possible. We need actionable knowledge on which behavioral sequences and interaction patterns are effective and which are prone to failure (Lei et al., 2016; Su et al., 2017). As previous research has shown, most of these insights can only be gained with behavior coding and, as a new approach to measuring team dynamics, social sensor technology (Rosen et al., 2015, 2018b; Kolbe and Boos, 2018). Behavior coding as a stand-alone method for capturing teamwork requires much time and many resources. At the same time, it not only provides very specific insights into the relationship between team dynamics and outcomes that would otherwise remain hidden but also offers actionable knowledge for more targeted team training interventions. In an attempt to collect behavioral team data more efficiently, social sensors have recently been introduced (Dietz et al., 2014; Kozlowski, 2015; Rosen et al., 2015, 2018b; Schmid Mast et al., 2015; Chaffin et al., 2017; Kozlowski and Chao, 2018). They use sensor technology included, for example, in smartphones or new types of wearable devices (e.g., smartwatches and bracelets) to measure behavioral cues and process these data to extract behavioral markers of relevant social constructs (Pentland, 2008). On the individual level, potential markers include participants’ body activity, speech consistency, cardiovascular features, or electrodermal activity. On the team level, markers include face-to-face interaction, centrality of certain team members allowing for a social network analysis, interpersonal distance, and behavioral mimicry.
As such, social sensors have the potential to provide high-frequency, automated, low-cost, and unobtrusive measurement of behavioral team data (Kozlowski, 2015; Rosen et al., 2015; Chaffin et al., 2017).
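As a sketch of how one such team-level marker might be derived, the following computes a simple degree centrality from sensor-detected face-to-face contacts. The member roles, the contact list, and the assumption that badges register dyadic face-to-face contact are all hypothetical.

```python
from collections import defaultdict

def degree_centrality(contacts, team):
    """Degree centrality per member from sensor-detected face-to-face
    contacts: number of distinct interaction partners divided by the
    maximum possible (team size - 1). `contacts` is a list of dyads."""
    partners = defaultdict(set)
    for a, b in contacts:
        partners[a].add(b)
        partners[b].add(a)
    return {m: len(partners[m]) / (len(team) - 1) for m in team}

# Hypothetical resuscitation team and badge-detected contacts
team = ["nurse", "resident", "attending", "anesthetist"]
contacts = [("nurse", "resident"), ("nurse", "attending"),
            ("resident", "attending")]
centrality = degree_centrality(contacts, team)  # anesthetist: 0.0
```

A member with centrality near zero during an emergency, as the hypothetical anesthetist here, could flag a coordination breakdown worth examining in the coded video.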

The ability to continuously monitor team members might allow for an in-depth analysis of team dynamics, especially during the management of fast-paced, acute care tasks where other forms of data access are limited and potentially intrusive. Respective research in healthcare has revealed promising results. For example, Petrosoniak and colleagues applied an overlay tracing tool to track selected healthcare team members’ movement during 12 high-fidelity in situ simulation trauma sessions. They found differences in workflow, movement and space used between team members which provide a deeper understanding of teamwork during managing a medical emergency (Petrosoniak et al., 2018). In another study, Vankipuram and colleagues used radio identification tags and observations to record motion and location of clinical teams and were able to model behavior in critical care environments. That is, the detected behavior could be replayed in virtual reality and provides options for further analysis and training (Vankipuram et al., 2011). More recently, Rosen and colleagues used wearable as well as environmental sensors to capture nurses’ work process data in a surgical intensive care unit and found that the respective measures were able to predict perceived mental and physical exertion and, thus, contribute to the measurement of workload (Rosen et al., 2018c).
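A minimal sketch of how movement and space use might be quantified from such position traces follows; the coordinates and the bounding-box proxy for "space used" are illustrative assumptions, not the analyses used in the cited studies.

```python
import math

def path_length(positions):
    """Total distance traveled from ordered (x, y) position samples,
    e.g., from radio-tag or overlay-tracing data (meters assumed)."""
    return sum(math.dist(p, q) for p, q in zip(positions, positions[1:]))

def space_used(positions):
    """Crude proxy for space used: area of the axis-aligned bounding
    box that encloses all of a team member's position samples."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

# Hypothetical trace of one team member during a simulated trauma case
trace = [(0.0, 0.0), (3.0, 4.0), (3.0, 0.0)]
distance = path_length(trace)  # 9.0 m in this toy trace
area = space_used(trace)       # 12.0 m^2 bounding box
```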

With respect to future research, social sensors might be able to capture the very process of teamwork. Especially in fast-paced, acute care settings they can complement traditional measurement methods to provide a more comprehensive analysis of team dynamics and actionable knowledge of which behavioral sequences and interaction patterns are effective (Kannampallil et al., 2011). As social sensors are able to provide information about the development and adaptation of team members’ emotional states, their relative proximity, and their activity level, they could, for example, reveal insights into (a) the development of stress levels among team members while (not) speaking up (e.g., changes in heart frequency or electrodermal activity, Setz et al., 2010) and potential countermeasures, (b) the potential of mimicry by team members for revealing civility while speaking up (Chartrand and Bargh, 1999; Meyer et al., 2016), (c) the proximity and centrality of team members as enablers or barriers for speaking up (Jackson and Hogg, 2010), (d) the development of adaptive coordination, especially switching from implicitness to explicitness, as a trainable skill set (Riethmüller et al., 2012).
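Point (a) could, for instance, be operationalized by aligning a physiological time series with coded speaking-up events and comparing short pre- and post-event windows. The window size, the per-second sampling, and the toy data below are illustrative assumptions, not an established analysis pipeline.

```python
def mean(xs):
    return sum(xs) / len(xs)

def change_around_events(signal, event_indices, window=3):
    """Average post-minus-pre difference in a physiological signal
    (e.g., electrodermal activity) around coded events. `signal` is a
    per-second series aligned with the behavior coding; `event_indices`
    marks the seconds at which a team member spoke up (assumptions)."""
    diffs = []
    for i in event_indices:
        pre = signal[max(0, i - window):i]
        post = signal[i + 1:i + 1 + window]
        if pre and post:
            diffs.append(mean(post) - mean(pre))
    return mean(diffs) if diffs else 0.0

# Hypothetical EDA series; a speaking-up event was coded at second 4
eda = [0.2, 0.2, 0.2, 0.2, 0.5, 0.6, 0.6, 0.6]
delta = change_around_events(eda, [4])  # 0.4: EDA rises after the event
```

A systematic positive delta across speaking-up events would be one concrete indicator of the stress dynamics hypothesized in (a).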

Again, results of this kind would provide actionable knowledge on the dynamics of leadership and voice which can be used in team trainings. Facing medical emergencies, teams must act immediately, fast, and in a highly efficient manner, as emergencies often imply a life-or-death struggle. Methods are required that can grasp the criticality of situational triggers in the flow of a routine process, the team’s sensitivity and situational awareness of these triggers, and the accurate fit of well-coordinated behavior for efficient task management.

Laborious Methodological Approaches of Studying Teamwork in Clinical Debriefings

To complement existing research on team debriefing processes and effectiveness, we recommend collecting data by means of event-based or time-based sampling of interaction behavior and analyzing these data with coding systems designed to help uncover conversational team learning processes (Table 3). For example, using DECODE—the coding scheme for assessing debriefers’ and learners’ communication in debriefings (Seelandt et al., 2018)—or the act4teams coding scheme (Kauffeld et al., 2018) for analyzing debriefing communication behavior could provide useful insights into the debriefings’ ideal macro structure (e.g., reaction phase, analysis phase, summary phase, Rudolph et al., 2007) as well as micro structure (e.g., what kind of facilitator communication behaviors trigger group members’ reflection statements, Husebø et al., 2013), in particular with respect to feedback and inquiry (Rudolph et al., 2007; Hughes et al., 2016; Kolbe et al., 2016). It could also reveal the potential association between team members’ status, professional discipline, and actual profession and their contributions to the debriefing discussion (Lehmann-Willenbrock et al., 2015); the emergence and impact of counterproductive debriefing behaviors such as arriving late, complaining, lecturing, and engaging in irrelevant discussions (Allen et al., 2015, 2018; Kolbe et al., 2015); the optimal balance of understanding and exploring vs. engaging in finding solutions (Kolbe et al., 2015); characteristic modes of argumentation in debriefings depending on status, context, and authority gradient; potential turning points and the use of structural instabilities in communication; and the role of leadership in debriefing discussions (Koeslag-Kreunen et al., 2018).
As proposed above for research on multidisciplinary decision-making boards, capturing actual participation rather than attendance in the debriefing would allow for insights into the balance of speaker switches, which has been found to be a predictor of good team performance (Woolley et al., 2010; Lehmann-Willenbrock et al., 2017).

With respect to future research, behavior coding of team debriefings might be complemented with other data collection technology. For example, using eye-tracking technology (Hess et al., 2018) might reveal insights into the role of eye contact in establishing and maintaining psychological safety in debriefings.
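One possible operationalization, assuming the eye tracker exports per-sample area-of-interest labels (an assumption about preprocessing; raw gaze coordinates would first need mapping onto people), is the proportion of samples with mutual gaze between debriefer and a given learner:

```python
def mutual_gaze_proportion(gaze_a, gaze_b, id_a, id_b):
    """Share of synchronized samples in which person A looks at person B
    while B looks at A. `gaze_a[t]` / `gaze_b[t]` name each person's
    gaze target (area-of-interest label) at sample t."""
    mutual = sum(1 for a, b in zip(gaze_a, gaze_b) if a == id_b and b == id_a)
    return mutual / min(len(gaze_a), len(gaze_b))

# Hypothetical 4-sample excerpt from a debriefing recording
debriefer_gaze = ["learner1", "learner1", "slides", "learner1"]
learner_gaze = ["debriefer", "slides", "debriefer", "debriefer"]
p = mutual_gaze_proportion(debriefer_gaze, learner_gaze,
                           "debriefer", "learner1")  # 0.5
```

Relating such mutual-gaze proportions to coded reflection statements or psychological safety ratings would be one way to test the eye-contact hypothesis.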

Conclusion

We have contrasted methodological approaches for studying team dynamics and their consequences. Given the increasing use of teams in modern organizations, there is a need to develop and apply scientifically-rooted concepts and methods to grasp team process dynamics as a means to gain a deeper understanding of successful teamwork.

Coding interaction and communication processes in teams based on generic or tailor-made category systems provides benefits for the science of teams. First, a process- and behavior-oriented approach enables us to operationalize theoretical constructs and everyday phenomena such as decision-making, coordination, and reflexivity in a clear-cut manner. Second, focusing on the processual enactment of team phenomena allows for a much richer picture of how they emerge, develop, and interact, how effective patterns evolve, and for identifying breaking points for potential intervention (Wageman et al., 2009). Third, studying team dynamics via behavior observation allows for taking the so-called functional perspective of group research seriously: opening the black box of team process as a mediator between input and output factors (Roe, 2011, 2014). For now, team behavior coding is still laborious. New developments in machine learning are likely to significantly reduce the involved workload in the future (Bonito and Keyton, 2018).

Implications of this research will be meaningful for team training and for the design of prevention and intervention concepts to improve teamwork. Structural changes of input factors such as team composition, resources, reward systems, and norms can improve teamwork to some degree. But in the end, determining what makes these changes effective or not requires a look into how they are enacted during the team process. In this manuscript, we have tried to elaborate research questions in the realm of healthcare teams which cannot be answered sufficiently without taking the process of team communication and interaction into consideration. We are convinced that—as in other disciplines—innovation and progress in team research heavily depend on methodological and technological innovation. This is what Gigerenzer (1991) called the “tools-to-theories heuristic”: it is not so much theories and data that drive scientists to new ideas and the solution of existing problems, but instruments, techniques, and methodical skills (Gigerenzer, 1994). With increasing methodological innovation, we now have methods and technology available that allow for much deeper and finer-grained team research and for exploring groundbreaking new questions.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We thank Elisabeth Brauner for supporting us with her valuable insights, guidance, and enthusiasm for group interaction analysis. We also thank Bastian Grande and Kia Homayounfar for sharing their medical expertise.

References

1. Allen J. A., Reiter-Palmon R., Crowe J., Scott C. (2018). Debriefs: teams learning from doing in context. Am. Psychol. 73 504–516. 10.1037/amp0000246
2. Allen J. A., Yoerger M. A., Lehmann-Willenbrock N., Jones J. (2015). Would you please stop that!?: the relationship between counterproductive meeting behaviors, employee voice, and trust. J. Manag. Dev. 34 1272–1287. 10.1108/JMD-02-2015-0032
3. Antonides C. F. J., Mack M. J., Kappetein A. P. (2017). Approaches to the role of the Heart Team in therapeutic decision making for heart valve disease. Struct. Heart 1 249–255. 10.1080/24748706.2017.1380377
4. Argyris C. (2002). Double-loop learning, teaching, and research. Acad. Manag. Learn. Educ. 1 206–218. 10.5465/AMLE.2002.8509400
5. Arora S., Ahmed M., Paige J., Nestel D., Runnacles J., Hull L., et al. (2012). Objective structured assessment of debriefing (OSAD): bringing science to the art of debriefing in surgery. Ann. Surg. 256 982–988. 10.1097/SLA.0b013e3182610c91
6. Ballard D. I., Tschan F., Waller M. J. (2008). All in the timing: considering time at multiple stages of group research. Small Group Res. 39 328–351. 10.1177/1046496408317036
7. Bar-David S. (2018). What’s in an eye roll? It is time we explore the role of workplace incivility in healthcare. Isr. J. Health Policy Res. 7:15. 10.1186/s13584-018-0209-0
8. Bhanji F., Mancini M. E., Sinz E., Rodgers D. L., McNeil M. A., Hoadley T. A., et al. (2010). Part 16: education, implementation, and teams: 2010 American heart association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Circulation 122(18 Suppl. 3), S920–S933. 10.1161/circulationaha.110.971135
9. Bonito J. A., Keyton J. (2018). “Introduction to machine learning,” in The Cambridge Handbook of Group Interaction Analysis, eds Brauner E., Boos M., Kolbe M. (Cambridge: Cambridge University Press), 387–404.
10. Boos M., Kolbe M., Strack M. (2011). “An inclusive model of group coordination,” in Coordination in Human and Primate Groups, eds Boos M., Kolbe M., Kappeler P., Ellwart T. (Heidelberg: Springer), 11–36.
11. Boos M., Pritz J., Lange S., Belz M. (2014). Leadership in moving human groups. PLoS Comput. Biol. 10:e1003541. 10.1371/journal.pcbi.1003541
12. Boos M., Sommer C. (2018). “ARGUMENT,” in The Cambridge Handbook of Group Interaction Analysis, eds Brauner E., Boos M., Kolbe M. (Cambridge: Cambridge University Press), 460–466.
13. Brauer D. G., Strand M. S., Sanford D. E., Kushnir V. M., Lim K.-H., Mullady D. K., et al. (2017). Utility of a multidisciplinary tumor board in the management of pancreatic and upper gastrointestinal diseases: an observational study. HPB 19 133–139. 10.1016/j.hpb.2016.11.002
14. Brett-Fleegler M., Rudolph J. W., Eppich W. J., Monuteaux M., Fleegler E., Cheng A., et al. (2012). Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul. Healthc. 7 288–294. 10.1097/SIH.0b013e3182620228
15. Chaffin D., Heidl R., Hollenbeck J. R., Howe M., Yu A., Voorhees C., et al. (2017). The promise and perils of wearable sensors in organizational research. Organ. Res. Methods 20 3–31. 10.1177/1094428115617004
16. Chartrand T. L., Bargh J. A. (1999). The chameleon effect: the perception-behavior link and social interaction. J. Pers. Soc. Psychol. 76 893–910. 10.1037//0022-3514.76.6.893
17. Cheng A., Eppich W., Grant V., Sherbino J., Zendejas B., Cook D. A. (2014). Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med. Educ. 48 657–666. 10.1111/medu.12432
18. Cheng A., Grant V., Huffman J., Burgess G., Szyld D., Robinson T., et al. (2017). Coaching the debriefer: peer coaching to improve debriefing quality in simulation programs. Simul. Healthc. 12 319–325. 10.1097/sih.0000000000000232
19. Cooper J. B., Newbower R. S., Kitz R. J. (1984). An analysis of major errors and equipment failures in anesthesia management: considerations for prevention and detection. Anesthesiology 60 34–42. 10.1097/00000542-198401000-00008
20. Couto T. B., Kerrey B. T., Taylor R. G., FitzGerald M., Geis G. L. (2015). Teamwork skills in actual, in situ, and in-center pediatric emergencies: performance levels across settings and perceptions of comparative educational impact. Simul. Healthc. 10 76–84. 10.1097/sih.0000000000000081
21. Cronin M. A., Weingart L. R., Todorova G. (2011). Dynamics in groups: are we there yet? Acad. Manag. Ann. 5 571–612. 10.1080/19416520.2011.590297
22. DeRue D. S., Nahrgang J. D., Hollenbeck J. R., Workman K. (2012). A quasi-experimental study of after-event reviews and leadership development. J. Appl. Psychol. 97 997–1015. 10.1037/a0028244
23. Detert J. R., Burris E. R. (2007). Leadership behavior and employee voice: is the door really open? Acad. Manag. J. 50 869–884. 10.5465/amj.2007.26279183
24. Dietz A. S., Pronovost P. J., Benson K. N., Mendez-Tellez P. A., Dwyer C., Wyskiel R., et al. (2014). A systematic review of behavioural marker systems in healthcare: what do we know about their attributes, validity and application? BMJ Qual. Saf. 23 1031–1039. 10.1136/bmjqs-2013-002457
25. Driskell T., Salas E., Driskell J. E. (2018). Teams in extreme environments: alterations in team development and teamwork. Hum. Resour. Manag. Rev. 28 434–449. 10.1016/j.hrmr.2017.01.002
26. Eddy E. R., Tannenbaum S. I., Mathieu J. E. (2013). Helping teams to help themselves: comparing two team-led debriefing methods. Pers. Psychol. 66 975–1008. 10.1111/peps.12041
27. Edmondson A. (1999). Psychological safety and learning behavior in work teams. Adm. Sci. Q. 44 350–383. 10.2307/2666999
28. Edmondson A. C. (2012). Teaming: How Organizations Learn, Innovate, and Compete in the Knowledge Economy. San Francisco, CA: Jossey-Bass.
29. Edmondson A. C., Lei Z. (2014). Psychological safety: the history, renaissance, and future of an interpersonal construct. Annu. Rev. Organ. Psychol. Organ. Behav. 1 23–43. 10.1146/annurev-orgpsych-031413-091305
30. Edmondson A. C., McManus S. E. (2007). Methodological fit in management field research. Acad. Manag. J. 32 1155–1179. 10.5465/amr.2007.26586086
31. Ellis S., Davidi I. (2005). After-event reviews: drawing lessons from successful and failed experience. J. Appl. Psychol. 90 857–871. 10.1037/0021-9010.90.5.842
32. Ellis S., Ganzach Y., Castle E., Sekely G. (2010). The effect of filmed versus personal after-event reviews on task performance: the mediating and moderating role of self-efficacy. J. Appl. Psychol. 95 122–131. 10.1037/a0017867
33. Ellis S., Mendel R., Aloni-Zohar M. (2009). The effect of accuracy of performance evaluation on learning from experience: the moderating role of after-event reviews. J. Appl. Soc. Psychol. 39 541–563. 10.1111/j.1559-1816.2009.00450.x
34. Endacott R., Gale T., O’Connor A., Dix S. (2019). Frameworks and quality measures used for debriefing in team-based simulation: a systematic review. BMJ Simul. Technol. Enhanc. Learn. 5 61–72. 10.1136/bmjstel-2017-000297
35. Eppich W., Mullan P. C., Brett-Fleegler M., Cheng A. (2016). “Let’s talk about it”: translating lessons from healthcare simulation to clinical event debriefings and clinical coaching conversations. Clin. Pediatr. Emerg. Med. 17 200–211. 10.1016/j.cpem.2016.07.001
36. Eppich W. J., Hunt E. A., Duval-Arnould J. M., Siddall V. J., Cheng A. (2015). Structuring feedback and debriefing to achieve mastery learning goals. Acad. Med. 90 1501–1508. 10.1097/acm.0000000000000934
37. Falk V., Baumgartner H., Bax J. J., De Bonis M., Hamm C., Holm P. J., et al. (2017). 2017 ESC/EACTS Guidelines for the management of valvular heart disease. Eur. J. Cardiothorac. Surg. 52 616–664. 10.1093/ejcts/ezx324
38. Farrugia D. J., Fischer T. D., Delitto D., Spiguel L. R. P., Shaw C. M. (2015). Improved breast cancer care quality metrics after implementation of a standardized tumor board documentation template. J. Oncol. Pract. 11 421–423. 10.1200/jop.2015.003988
39. Fernandez Castelao E., Boos M., Ringer C., Eich C., Russo S. G. (2015). Effect of CRM team leader training on team performance and leadership behavior in simulated cardiac arrest scenarios: a prospective, randomized, controlled study. BMC Med. Educ. 15:116. 10.1186/s12909-015-0389-z
40. Fernandez Castelao E., Russo S. G., Cremer S., Strack M., Kaminski L., Eich C., et al. (2011). Positive impact of crisis resource management training on no-flow time and team member verbalisations during simulated cardiopulmonary resuscitation: a randomized controlled trial. Resuscitation 82 1338–1343. 10.1016/j.resuscitation.2011.05.009
41. Fletcher G., Flin R., McGeorge P., Glavin R., Maran N., Patey R. (2004). Rating non-technical skills: developing a behavioral marker system for use in anaesthesia. Cogn. Technol. Work 6 165–171. 10.1007/s10111-004-0158-y
42. Flin R., Mitchell L. (eds). (2009). Safer Surgery: Analysing Behaviour in the Operating Theatre. Farnham: Ashgate.
43. Foulk T., Woolum A., Erez A. (2016). Catching rudeness is like catching a cold: the contagion effects of low-intensity negative behaviors. J. Appl. Psychol. 101 50–67. 10.1037/apl0000037
44. Gaba D. M., Howard S. K., Fish K. J., Smith B. E., Sowb Y. A. (2001). Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simul. Gaming 32 175–193. 10.1177/104687810103200206
  44. Gaba D. M., Howard S. K., Fish K. J., Smith B. E., Sowb Y. A. (2001). Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simul. Gaming 32 175–193. 10.1177/104687810103200206 [DOI] [Google Scholar]
  45. Gawande A. A., Studdert D. M., Orav E. J., Brennan T. A., Zinner M. J. (2003). Risk factors for retained instruments and sponges after surgery. N. Engl. J. Med. 348 229–235. 10.1056/nejmsa021721 [DOI] [PubMed] [Google Scholar]
  46. Gigerenzer G. (1991). From tools to theories: a heuristic of discovery in cognitive psychology. Psychol. Rev. 98 254–267. 10.1037/0033-295X.98.2.254 [DOI] [Google Scholar]
  47. Gigerenzer G. (1994). “Where do new ideas come from,” in Dimensions of Creativity, ed. Boden M. A. (Cambridge, MA: MIT Press; ), 53–74. [Google Scholar]
  48. Graumann C. F. (1979). Die Scheu des Psychologen vor der Interaktion. Ein Schisma und seine Geschichte [The psychologist’s unease in the face of interaction. A schism and its history]. Z. Sozialpsychol. 10 284–304.
  49. Greenberg C. C., Regenbogen S. E., Studdert D. M., Lipsitz S. R., Rogers S. O., Zinner M. J., et al. (2007). Patterns of communication breakdowns resulting in injury to surgical patients. J. Am. Coll. Surg. 204 533–540. 10.1016/j.jamcollsurg.2007.01.010
  50. Grote G., Kolbe M., Zala-Mezö E., Bienefeld-Seall N., Künzle B. (2010). Adaptive coordination and heedfulness make better cockpit crews. Ergonomics 53 211–228. 10.1080/00140130903248819
  51. Gurtner A., Tschan F., Semmer N.-K., Naegele C. (2007). Getting groups to develop good strategies: effects of reflexivity interventions on team process, team performance, and shared mental models. Organ. Behav. Hum. Decis. Process. 102 127–142. 10.1016/j.obhdp.2006.05.002
  52. Hess S., Lohmeyer Q., Meboldt M. (2018). Mobile eye tracking in engineering design education. Des. Technol. Educ. Int. J. 23 86–98.
  53. Homayounfar K., Mey D., Boos M., Gaedcke J., Ghadimi M. (2015). Communication in the tumor board. Forum 30 214–217. 10.1007/s12312-015-1301-9
  54. Hughes A. M., Gregory M. E., Joseph D. L., Sonesh S. C., Marlow S. L., Lacerenza C. N., et al. (2016). Saving lives: a meta-analysis of team training in healthcare. J. Appl. Psychol. 101 1266–1304. 10.1037/apl0000120
  55. Hull L., Russ S., Ahmed M., Sevdalis N., Birnbach D. J. (2017). Quality of interdisciplinary postsimulation debriefing: 360° evaluation. BMJ Simul. Technol. Enhanc. Learn. 3 9–16. 10.1136/bmjstel-2016-000125
  56. Humphrey S. E., Aime F. (2014). Team microdynamics: toward an organizing approach to teamwork. Acad. Manag. Ann. 8 443–503. 10.1080/19416520.2014.904140
  57. Hunziker S., Johansson A. C., Tschan F., Semmer N. K., Rock L., Howell M. D., et al. (2011). Teamwork and leadership in cardiopulmonary resuscitation. J. Am. Coll. Cardiol. 57 2381–2388. 10.1016/j.jacc.2011.03.017
  58. Husebø S. E., Dieckmann P., Rystedt H., Søreide E., Friberg F. (2013). The relationship between facilitators’ questions and the level of reflection in postsimulation debriefing. Simul. Healthc. 8 135–142. 10.1097/SIH.0b013e31827cbb5c
  59. Jackson R. L., Hogg M. A. (2010). “Immediacy,” in Encyclopedia of Identity, eds Jackson R. L., Hogg M. A. (Thousand Oaks, CA: Sage), 382–384.
  60. Jones K., Rosen M., Duval-Arnould J., Hunt E. (2014). Development of a cardiopulmonary resuscitation non-technical skills scoring tool (CPR-NTS). Crit. Care Med. 42:A1427. 10.1097/01.ccm.0000457773.66160.b8
  61. Kannampallil T., Li Z., Zhang M., Cohen T., Robinson D. J., Franklin A., et al. (2011). Making sense: sensor-based investigation of clinician activities in complex critical care environments. J. Biomed. Inform. 44 441–454. 10.1016/j.jbi.2011.02.007
  62. Kauffeld S., Lehmann-Willenbrock N., Meinecke A. L. (2018). “The advanced interaction analysis for teams (act4teams) coding scheme,” in The Cambridge Handbook of Group Interaction Analysis, eds Brauner E., Boos M., Kolbe M. (Cambridge: Cambridge University Press), 422–431. 10.1017/9781316286302.022
  63. Keating N. L., Landrum M. B., Lamont E. B., Bozeman S. R., Shulman L. N., McNeil B. J. (2013). Tumor boards and the quality of cancer care. J. Natl. Cancer Inst. 105 113–121. 10.1093/jnci/djs502
  64. Kehl K. L., Landrum M. B., Kahn K. L., Gray S. W., Chen A. B., Keating N. L. (2015). Tumor board participation among physicians caring for patients with lung or colorectal cancer. J. Oncol. Pract. 11 e267–e278. 10.1200/jop.2015.003673
  65. Kemper P. F., van Noord I., de Bruijne M., Knol D. L., Wagner C., van Dyck C. (2013). Development and reliability of the explicit professional oral communication observation tool to quantify the use of non-technical skills in healthcare. BMJ Qual. Saf. 22 586–595. 10.1136/bmjqs-2012-001451
  66. Kessler D. O., Cheng A., Mullan P. C. (2015). Debriefing in the emergency department after clinical events: a practical guide. Ann. Emerg. Med. 65 690–698. 10.1016/j.annemergmed.2014.10.019
  67. Kihlgren P., Spanager L., Dieckmann P. (2015). Investigating novice doctors’ reflections in debriefings after simulation scenarios. Med. Teach. 37 437–443. 10.3109/0142159X.2014.956054
  68. Kim T., McFee E., Olguin Olguin D., Waber B., Pentland A. (2012). Sociometric badges: using sensor technology to capture new forms of collaboration. J. Organ. Behav. 33 412–427. 10.1002/job.1776
  69. Klingberg K., Gadelhak K., Jegerlehner S. N., Brown A. D., Exadaktylos A. K., Srivastava D. S. (2018). Bad manners in the Emergency Department: incivility among doctors. PLoS One 13:e0194933. 10.1371/journal.pone.0194933
  70. Kobayashi H., Pian-Smith M., Sato M., Sawa R., Takeshita T., Raemer D. (2006). A cross-cultural survey of residents’ perceived barriers in questioning/challenging authority. Qual. Saf. Health Care 15 277–283. 10.1136/qshc.2005.017368
  71. Koeslag-Kreunen M., Van den Bossche P., Hoven M., Van der Klink M., Gijselaers W. (2018). When leadership powers team learning: a meta-analysis. Small Group Res. 49 475–513. 10.1177/1046496418764824
  72. Kolbe M., Boos M. (2018). “Observing group interaction,” in The Cambridge Handbook of Group Interaction Analysis, eds Brauner E., Boos M., Kolbe M. (Cambridge: Cambridge University Press), 68–85. 10.1017/9781316286302.005
  73. Kolbe M., Burtscher M., Manser T. (2013a). Co-ACT–A framework for observing coordination behavior in acute care teams. BMJ Qual. Saf. 22 596–605. 10.1136/bmjqs-2012-001319
  74. Kolbe M., Weiss M., Grote G., Knauth A., Dambach M., Spahn D. R., et al. (2013b). TeamGAINS: a tool for structured debriefings for simulation-based team trainings. BMJ Qual. Saf. 22 541–553. 10.1136/bmjqs-2012-000917
  75. Kolbe M., Grande B., Marty A., Manka R., Taramasso M., Nietlispach F., et al. (2019). Making heart team discussions work. Struct. Heart 3 100–103. 10.1080/24748706.2019.1572254
  76. Kolbe M., Grande B., Spahn D. R. (2015). Briefing and debriefing during simulation-based training and beyond: content, structure, attitude and setting. Best Pract. Res. Clin. Anaesthesiol. 29 87–96. 10.1016/j.bpa.2015.01.002
  77. Kolbe M., Grote G., Waller M. J., Wacker J., Grande B., Burtscher M., et al. (2014). Monitoring and talking to the room: autochthonous coordination patterns in team interaction and performance. J. Appl. Psychol. 99 1254–1267. 10.1037/a0037877
  78. Kolbe M., Künzle B., Zala-Mezö E., Wacker J., Grote G. (2009). “Measuring coordination behaviour in anaesthesia teams during induction of general anaesthetics,” in Safer Surgery. Analysing Behaviour in the Operating Theatre, eds Flin R., Mitchell L. (Aldershot: Ashgate), 203–221. 10.1201/9781315607436-13
  79. Kolbe M., Marty A., Seelandt J., Grande B. (2016). How to debrief teamwork interactions: using circular questions to explore and change team interaction patterns. Adv. Simul. 1:29. 10.1186/s41077-016-0029-7
  80. Kolbe M., Rudolph J. W. (2018). What’s the headline on your mind right now? How reflection guides simulation-based faculty development in a master class. BMJ Simul. Technol. Enhanc. Learn. 4 126–132. 10.1136/bmjstel-2017-000247
  81. Kolbe M., Strack M., Stein A., Boos M. (2011). “Effective coordination in human group decision making: MICRO-CO. A micro-analytical taxonomy for analysing explicit coordination mechanisms in decision-making groups,” in Coordination in Human and Primate Groups, eds Boos M., Kolbe M., Kappeler P., Ellwart T. (Heidelberg: Springer), 199–219. 10.1007/978-3-642-15355-6_11
  82. Konradt U., Otte K.-P., Schippers M. C., Steenfatt C. (2016). Reflexivity in teams: a review and new perspectives. J. Psychol. 150 153–174. 10.1080/00223980.2015.1050977
  83. Konradt U., Schippers M. C., Garbers Y., Steenfatt C. (2015). Effects of guided reflexivity and team feedback on team performance improvement: the role of team regulatory processes and cognitive emergent states. Eur. J. Work Organ. Psychol. 24 777–795. 10.1080/1359432X.2015.1005608
  84. Kozlowski S. W. (2015). Advancing research on team process dynamics: theoretical, methodological, and measurement considerations. Organ. Psychol. Rev. 5 270–299. 10.1177/2041386614533586
  85. Kozlowski S. W. J., Chao G. T. (2018). Unpacking team process dynamics and emergent phenomena: challenges, conceptual advances, and innovative methods. Am. Psychol. 73 576–592. 10.1037/amp0000245
  86. Künzle B., Zala-Mezö E., Wacker J., Kolbe M., Grote G. (2010). Leadership in anaesthesia teams: the most effective leadership is shared. Qual. Saf. Health Care 19:e46. 10.1136/qshc.2008.030262
  87. Lamb B. W., Green J. S. A., Benn J., Brown K. F., Vincent C. A., Sevdalis N. (2013). Improving decision making in multidisciplinary tumor boards: prospective longitudinal evaluation of a multicomponent intervention for 1,421 patients. J. Am. Coll. Surg. 217 412–420. 10.1016/j.jamcollsurg.2013.04.035
  88. Lamb B. W., Sevdalis N., Mostafid H., Vincent C., Green J. S. A. (2011). Quality improvement in multidisciplinary cancer teams: an investigation of teamwork and clinical decision-making and cross-validation of assessments. Ann. Surg. Oncol. 18 3535–3543. 10.1245/s10434-011-1773-5
  89. Larson J. R. Jr., Sargis E. G., Elstein A. S., Schwartz A. (2002). Holding shared versus unshared information: its impact on perceived member influence in decision-making groups. Basic Appl. Soc. Psychol. 24 145–155. 10.1207/S15324834BASP2402_6
  90. Larson J. R. Jr., Foster-Fishman P. G., Franz T. M. (1998). Leadership style and the discussion of shared and unshared information in decision-making groups. Pers. Soc. Psychol. Bull. 24 482–495. 10.1177/0146167298245004
  91. Lehmann-Willenbrock N. (2017). Team learning. Small Group Res. 48 123–130. 10.1177/1046496416689308
  92. Lehmann-Willenbrock N., Allen J. A., Kauffeld S. (2013). A sequential analysis of procedural meeting communication: how teams facilitate their meetings. J. Appl. Commun. Res. 41 365–388. 10.1080/00909882.2013.844847
  93. Lehmann-Willenbrock N., Chiu M. M., Lei Z., Kauffeld S. (2017). Understanding positivity within dynamic team interactions. Group Organ. Manag. 42 39–78. 10.1177/1059601116628720
  94. Lehmann-Willenbrock N., Meinecke A. L., Rowold J., Kauffeld S. (2015). How transformational leadership works during team interactions: a behavioral process analysis. Leadersh. Q. 26 1017–1033. 10.1016/j.leaqua.2015.07.003
  95. Lehmann-Willenbrock N., Meyers R. A., Kauffeld S., Neininger A., Henschel A. (2011). Verbal interaction sequences and group mood: exploring the role of team planning communication. Small Group Res. 42 639–668. 10.1177/1046496411398397
  96. Lei Z., Waller M. J., Hagen J., Kaplan S. (2016). Team adaptiveness in dynamic contexts: contextualizing the roles of interaction patterns and in-process planning. Group Organ. Manag. 41 491–525. 10.1177/1059601115615246
  97. Manser T., Howard S. K., Gaba D. M. (2008). Adaptive coordination in cardiac anaesthesia: a study of situational changes in coordination patterns using a new observation system. Ergonomics 51 1153–1178. 10.1080/00140130801961919
  98. Marks M. A., Mathieu J. E., Zaccaro S. J. (2001). A temporally based framework and taxonomy of team processes. Acad. Manag. Rev. 26 356–376. 10.5465/AMR.2001.4845785
  99. Marlow S. L., Lacerenza C. N., Paoletti J., Burke C. S., Salas E. (2018). Does team communication represent a one-size-fits-all approach?: a meta-analysis of team communication and performance. Organ. Behav. Hum. Decis. Process. 144 145–170. 10.1016/j.obhdp.2017.08.001
  100. Marshall C. L., Petersen N. J., Naik A. D., Velde N. V., Artinyan A., Albo D., et al. (2014). Implementation of a regional virtual tumor board: a prospective study evaluating feasibility and provider acceptance. Telemed. J. E Health 20 705–711. 10.1089/tmj.2013.0320
  101. Mathieu J. E., Tannenbaum S. I., Donsbach J. S., Alliger G. M. (2014). A review and integration of team composition models moving toward a dynamic and temporal framework. J. Manag. 40 130–160. 10.1177/0149206313503014
  102. Maynard M. T., Kennedy D. M., Resick C. J. (2018). Teamwork in extreme environments: lessons, challenges, and opportunities. J. Organ. Behav. 39 695–700. 10.1002/job.2302
  103. McGrath J. E., Tschan F. (2004). Temporal Matters in Social Psychology: Examining the Role of Time in the Lives of Groups and Individuals. Washington, DC: APA.
  104. Mesmer-Magnus J. R., DeChurch L. A. (2009). Information sharing and team performance: a meta-analysis. J. Appl. Psychol. 94 535–546. 10.1037/a0013773
  105. Meyer B., Burtscher M. J., Jonas K., Feese S., Arnrich B., Tröster G., et al. (2016). What good leaders actually do: micro-level leadership behaviour, leader evaluations, and team decision quality. Eur. J. Work Organ. Psychol. 25 773–789. 10.1080/1359432X.2016.1189903
  106. Mojzisch A., Schulz-Hardt S. (2010). Knowing others’ preferences degrades the quality of group decisions. J. Pers. Soc. Psychol. 98 794–808. 10.1037/a0017627
  107. Morrison E. W., Milliken F. J. (2000). Organizational silence: a barrier to change and development in a pluralistic world. Acad. Manag. Rev. 25 706–725. 10.5465/amr.2000.3707697
  108. Mullan P. C., Kessler D. O., Cheng A. (2014). Educational opportunities with postevent debriefing. JAMA 312 2333–2334. 10.1001/jama.2014.15741
  109. Otte K.-P., Konradt U., Garbers Y., Schippers M. C. (2017). Development and validation of the REMINT: a reflection measure for individuals and teams. Eur. J. Work Organ. Psychol. 26 299–313. 10.1080/1359432X.2016.1261826
  110. Otte K.-P., Konradt U., Oldeweme M. (2018). Effective team reflection: the role of quality and quantity. Small Group Res. 49 739–766. 10.1177/1046496418804898
  111. Pentland A. (2008). Honest Signals: How They Shape Our World. Cambridge, MA: MIT Press.
  112. Petrosoniak A., Almeida R., Pozzobon L. D., Hicks C., Fan M., White K., et al. (2018). Tracking workflow during high-stakes resuscitation: the application of a novel clinician movement tracing tool during in situ trauma simulation. BMJ Simul. Technol. Enhanc. Learn. 5 78–84. 10.1136/bmjstel-2017-000300
  113. Porath C. L., Erez A. (2009). Overlooked but not untouched: how rudeness reduces onlookers’ performance on routine and creative tasks. Organ. Behav. Hum. Decis. Process. 109 29–44. 10.1016/j.obhdp.2009.01.003
  114. Pox C., Aretz S., Bischoff S. C., Graeven U., Hass M., Heußner P., et al. (2013). S3-Leitlinie Kolorektales Karzinom Version 1.0 – Juni 2013, AWMF-Registernummer: 021/007OL [S3-guideline colorectal cancer version 1.0]. Z. Gastroenterol. 51 753–854. 10.1055/s-0033-1350264
  115. Pronovost P. (2013). “Teamwork matters,” in Developing and Enhancing Teamwork in Organizations: Evidence-Based Best Practices and Guidelines, eds Salas E., Tannenbaum S. I., Cohen D., Latham G. (San Francisco, CA: Jossey-Bass), 11–12.
  116. Raemer D. B., Kolbe M., Minehart R. D., Rudolph J. W., Pian-Smith M. C. M. (2016). Improving anesthesiologists’ ability to speak up in the operating room: a randomized controlled experiment of a simulation-based intervention and a qualitative analysis of hurdles and enablers. Acad. Med. 91 530–539. 10.1097/acm.0000000000001033
  117. Reynard J., Reynolds J., Stevenson P. (2009). Practical Patient Safety. Oxford: Oxford University Press.
  118. Riethmüller M., Fernandez Castelao E., Eberhardt D., Timmermann A., Boos M. (2012). Adaptive coordination development in student anaesthesia teams: a longitudinal study. Ergonomics 55 55–68. 10.1080/00140139.2011.636455
  119. Riskin A., Erez A., Foulk T. A., Kugelman A., Gover A., Shoris I., et al. (2015). The impact of rudeness on medical team performance: a randomized trial. Pediatrics 136 487–495. 10.1542/peds.2015-1385
  120. Robertson E. R., Hadi M., Morgan L. J., Pickering S. P., Collins G., New S., et al. (2014). Oxford NOTECHS II: a modified theatre team non-technical skills scoring system. PLoS One 9:e90320. 10.1371/journal.pone.0090320
  121. Roe R. A. (2008). Time in applied psychology: the study of “what happens” rather than “what is”. Eur. Psychol. 13 37–52. 10.1027/1016-9040.13.1.37
  122. Roe R. A. (2011). “What is wrong with mediators and moderators?,” in Paper Presented at the 15th European Congress of Work & Organizational Psychology, Maastricht.
  123. Roe R. A. (2014). Test validity from a temporal perspective: incorporating time in validation research. Eur. J. Work Organ. Psychol. 23 754–768. 10.1080/1359432X.2013.804177
  124. Rosen M. A., DiazGranados D., Dietz A. S., Benishek L. E., Thompson D., Pronovost P. J., et al. (2018a). Teamwork in healthcare: key discoveries enabling safer, high-quality care. Am. Psychol. 73 433–450. 10.1037/amp0000298
  125. Rosen M. A., Dietz A. S., Kazi S. (2018b). “Beyond coding interaction,” in The Cambridge Handbook of Group Interaction Analysis, eds Brauner E., Boos M., Kolbe M. (Cambridge: Cambridge University Press), 142–162. 10.1017/9781316286302.009
  126. Rosen M. A., Dietz A. S., Lee N., Wang I. J., Markowitz J., Wyskiel R. M., et al. (2018c). Sensor-based measurement of critical care nursing workload: unobtrusive measures of nursing activity complement traditional task and patient level indicators of workload to predict perceived exertion. PLoS One 13:e0204819. 10.1371/journal.pone.0204819
  127. Rosen M. A., Dietz A. S., Yang T., Priebe C. E., Pronovost P. J. (2015). An integrative framework for sensor-based measurement of teamwork in healthcare. J. Am. Med. Inform. Assoc. 22 11–18. 10.1136/amiajnl-2013-002606
  128. Rosen M. A., Wildman J. L., Salas E., Rayne S. (2012). “Measuring team dynamics in the wild,” in Research Methods for Studying Groups and Teams: A Guide to Approaches, Tools, and Technologies, eds Hollingshead A. B., Poole M. S. (New York, NY: Routledge), 386–417.
  129. Rudolph J. W., Foldy E. G., Robinson T., Kendall S., Taylor S. S., Simon R. (2013). Helping without harming. The instructor’s feedback dilemma in debriefing–A case study. Simul. Healthc. 8 304–316. 10.1097/SIH.0b013e318294854e
  130. Rudolph J. W., Raemer D. B., Simon R. (2014). Establishing a safe container for learning in simulation: the role of the presimulation briefing. Simul. Healthc. 9 339–349. 10.1097/SIH.0000000000000047
  131. Rudolph J. W., Simon F. B., Raemer D. B., Eppich W. J. (2008). Debriefing as formative assessment: closing performance gaps in medical education. Acad. Emerg. Med. 15 1010–1016. 10.1111/j.1553-2712.2008.00248.x
  132. Rudolph J. W., Simon R., Rivard P., Dufresne R. L., Raemer D. B. (2007). Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol. Clin. 25 361–376. 10.1016/j.anclin.2007.03.007
  133. Salas E. (2016). Team science in cancer care: questions, an observation, and a caution. J. Oncol. Pract. 12 972–974. 10.1200/jop.2016.018226
  134. Salas E., Frush K. (eds). (2013). Improving Patient Safety through Teamwork and Team Training. New York, NY: Oxford University Press.
  135. Salas E., Paige J. T., Rosen M. A. (2013a). Creating new realities in healthcare: the status of simulation-based training as a patient safety improvement strategy. BMJ Qual. Saf. 22 449–452. 10.1136/bmjqs-2013-002112
  136. Salas E., Tannenbaum S., Cohen J., Latham G. (eds). (2013b). Developing and Enhancing Teamwork in Organizations: Evidence-Based Best Practices and Guidelines. San Francisco, CA: Jossey-Bass.
  137. Sawyer T., Eppich W., Brett-Fleegler M., Grant V., Cheng A. (2016a). More than one way to debrief: a critical review of healthcare simulation debriefing methods. Simul. Healthc. 11 209–217. 10.1097/sih.0000000000000148
  138. Sawyer T., Loren D., Halamek L. P. (2016b). Post-event debriefings during neonatal care: why are we not doing them, and how can we start? J. Perinatol. 36 415–419. 10.1038/jp.2016.42
  139. Schein E. H. (1993). How can organizations learn faster? The challenge of entering the green room. Sloan Manag. Rev. 34:85.
  140. Schmid Mast M., Gatica-Perez D., Frauendorfer D., Nguyen L., Choudhury T. (2015). Social sensing for psychology: automated interpersonal behavior assessment. Curr. Dir. Psychol. Sci. 24 154–160. 10.1177/0963721414560811
  141. Schmutz J., Hoffman F., Heimberg E., Manser T. (2015). Effective coordination in medical emergency teams: the moderating role of task type. Eur. J. Work Organ. Psychol. 24 761–776. 10.1080/1359432X.2015.1018184
  142. Schmutz J. B., Eppich W. J. (2017). Promoting learning and patient care through shared reflection: a conceptual framework for team reflexivity in health care. Acad. Med. 92 1555–1563. 10.1097/acm.0000000000001688
  143. Schön D. A. (1983). The Reflective Practitioner: How Professionals Think in Action. New York, NY: Basic Books.
  144. Schulz-Hardt S., Mojzisch A. (2012). How to achieve synergy in group decision making: lessons to be learned from the hidden profile paradigm. Eur. Rev. Soc. Psychol. 23 305–343. 10.1080/10463283.2012.744440
  145. Schwappach D. L. B., Gehring K. (2014). Silence that can be dangerous: a vignette study to assess healthcare professionals’ likelihood of speaking up about safety concerns. PLoS One 9:e104720. 10.1371/journal.pone.0104720
  146. Seelandt J. C., Grande B., Kriech S., Kolbe M. (2018). DE-CODE: a coding scheme for assessing debriefing interactions. BMJ Simul. Technol. Enhanc. Learn. 4 51–58. 10.1136/bmjstel-2017-000233
  147. Seelandt J. C., Tschan F., Keller S., Beldi G., Jenni N., Kurmann A., et al. (2014). Assessing distractors and teamwork during surgery: developing an event-based method for direct observation. BMJ Qual. Saf. 23 918–929. 10.1136/bmjqs-2014-002860
  148. Seiffert M., Conradi L., Baldus S., Schirmer J., Blankenberg S., Reichenspurner H., et al. (2013). Severe intraprocedural complications after transcatheter aortic valve implantation: calling for a heart team approach. Eur. J. Cardiothorac. Surg. 44 478–484. 10.1093/ejcts/ezt032
  149. Setz C., Arnrich B., Schumm J., La Marca R., Tröster G., Ehlert U. (2010). Discriminating stress from cognitive load using a wearable EDA device. IEEE Trans. Inf. Technol. Biomed. 14 410–417. 10.1109/TITB.2009.2036164
  150. Shah S., Arora S., Atkin G., Glynne-Jones R., Mathur P., Darzi A., et al. (2014). Decision-making in colorectal cancer tumor board meetings: results of a prospective observational assessment. Surg. Endosc. 28 2783–2788. 10.1007/s00464-014-3545-3
  151. Snyder J., Schultz L., Walbert T. (2017). The role of tumor board conferences in neuro-oncology: a nationwide provider survey. J. Neurooncol. 133 1–7. 10.1007/s11060-017-2416-x
  152. Soar J., Mancini M. E., Bhanji F., Billi J. E., Dennett J., Finn J., et al. (2010). Part 12: education, implementation, and teams. Resuscitation 81 e288–e332. 10.1016/j.resuscitation.2010.08.030
  153. Stachowski A. A., Kaplan S. A., Waller M. J. (2009). The benefits of flexible team interaction during crisis. J. Appl. Psychol. 94 1536–1543. 10.1037/a0016903
  154. Stasser G., Titus W. (1987). Effects of information load and percentage of shared information on the dissemination of unshared information during group discussion. J. Pers. Soc. Psychol. 53 81–93. 10.1037/0022-3514.53.1.81
  155. Su L., Kaplan S., Burd R., Winslow C., Hargrove A., Waller M. (2017). Trauma resuscitation: can team behaviours in the prearrival period predict resuscitation performance? BMJ Simul. Technol. Enhanc. Learn. 3 106–110. 10.1136/bmjstel-2016-000143
  156. Tafe L. J., Gorlov I. P., de Abreu F. B., Lefferts J. A., Liu X., Pettus J. R., et al. (2015). Implementation of a molecular tumor board: the impact on treatment decisions for 35 patients evaluated at Dartmouth-Hitchcock Medical Center. Oncologist 20 1011–1018. 10.1634/theoncologist.2015-0097
  157. Tangirala S., Ramanujam R. (2012). Ask and you shall hear (but not always): examining the relationship between manager consulting and employee voice. Pers. Psychol. 65 251–282. 10.1111/j.1744-6570.2012.01248.x
  158. Tannenbaum S. I., Beard R. L., Cerasoli C. P. (2013). “Conducting team debriefings that work: lessons from research and practice,” in Developing and Enhancing Teamwork in Organizations: Evidence-Based Best Practices and Guidelines, eds Salas E., Tannenbaum S. I., Cohen J., Latham G. (San Francisco, CA: Jossey-Bass), 488–519.
  159. Tannenbaum S. I., Cerasoli C. P. (2013). Do team and individual debriefs enhance performance? A meta-analysis. Hum. Factors 55 231–245. 10.1177/0018720812448394
  160. Tannenbaum S. I., Goldhaber-Fiebert S. (2013). “Medical team debriefs: simple, powerful, underutilized,” in Improving Patient Safety through Teamwork and Team Training, eds Salas E., Frush K. (New York, NY: Oxford University Press), 249–256.
  161. Tannenbaum S. I., Mathieu J. E., Salas E., Cohen D. (2012). Teams are changing: are research and practice evolving fast enough? Ind. Organ. Psychol. 5 2–24. 10.1111/j.1754-9434.2011.01396.x
  162. The Center for Medical Simulation (2010). Debriefing Assessment for Simulation in Healthcare (DASH)©. Rater’s Handbook. Available at: https://harvardmedsim.org/wp-content/uploads/2017/01/DASH.handbook.2010.Final.Rev.2.pdf (accessed September 23, 2013).
  163. Thenappan A., Halaweish I., Mody R. J., Smith E. A., Geiger J. D., Ehrlich P. F., et al. (2017). Review at a multidisciplinary tumor board impacts critical management decisions of pediatric patients with cancer. Pediatr. Blood Cancer 64 254–258. 10.1002/pbc.26201
  164. Thürmer J. L., Wieber F., Schultze T., Schulz-Hardt S. (2018). “Hidden profile discussion coding,” in The Cambridge Handbook of Group Interaction Analysis, eds Brauner E., Boos M., Kolbe M. (Cambridge: Cambridge University Press), 565–574. 10.1017/9781316286302.038
  165. Tschan F., Seelandt J. C., Keller S., Semmer N. K., Kurmann A., Candinas D., et al. (2015). Impact of case-relevant and case-irrelevant communication within the surgical team on surgical-site infection. Br. J. Surg. 102 1718–1725. 10.1002/bjs.9927
  166. Tschan F., Semmer N. K., Gautschi D., Hunziker P., Spychiger M., Marsch S. U. (2006). Leading to recovery: group performance and coordinative activities in medical emergency driven groups. Hum. Perform. 19 277–304. 10.1207/s15327043hup1903_5
  167. Tschan F., Semmer N. K., Gurtner A., Bizzari L., Spychiger M., Breuer M., et al. (2009). Explicit reasoning, confirmation bias, and illusory transactive memory. A simulation study of group medical decision making. Small Group Res. 40 271–300. 10.1177/1046496409332928
  168. Tschan F., Semmer N. K., Hunziker P. R., Marsch S. C. U. (2011a). “Decisive action vs. joint deliberation: different medical tasks imply different coordination requirements,” in Advances in Human Factors and Ergonomics in Healthcare, ed. Duffy V. G. (Boca Raton, FL: Taylor & Francis), 191–200.
  169. Tschan F., Semmer N. K., Vetterli M., Gurtner A., Hunziker S., Marsch S. U. (2011b). “Developing observational categories for group process research based on task and coordination-requirement analysis: examples from research on medical emergency-driven teams,” in Coordination in Human and Primate Groups, eds Boos M., Kolbe M., Kappeler P., Ellwart T. (Heidelberg: Springer), 93–118.
  170. Tschan F., Semmer N. K., Hunziker S., Kolbe M., Jenni N., Marsch S. U. (2014). Leadership in different resuscitation situations. Trends Anaesth. Crit. Care 4 32–36. 10.1016/j.tacc.2013.12.001
  171. Undre S., Sevdalis N., Healey A. N., Darzi A., Vincent C. A. (2007). Observational teamwork assessment for surgery (OTAS): refinement and application in urological surgery. World J. Surg. 31 1373–1381. 10.1007/s00268-007-9053-z
  172. Undre S., Sevdalis N., Vincent C. (2009). “Observing and assessing surgical teams: the observational teamwork assessment for surgery (OTAS),” in Safer Surgery. Analysing Behaviour in the Operating Theatre, eds Flin R., Mitchell L. (Aldershot: Ashgate; ), 83–101. 10.1201/9781315607436-6 [DOI] [Google Scholar]
  173. Valentine M. A., Nembhard I. M., Edmondson A. C. (2015). Measuring teamwork in health care settings: a review of survey instruments. Med. Care 53 e16–e30. [DOI] [PubMed] [Google Scholar]
  174. Vankipuram M., Kahol K., Cohen T., Patel V. L. (2011). Toward automated workflow analysis and visualization in clinical environments. J. Biomed. Inform. 44 432–440. 10.1016/j.jbi.2010.05.015 [DOI] [PubMed] [Google Scholar]
  175. Vashdi D. R., Bamberger P. A., Erez M. (2013). Can surgical teams ever learn? The role of coordination, complexity, and transitivity in action team learning. Acad. Manag. J. 56 945–971. 10.5465/amj.2010.0501 [DOI] [Google Scholar]
  176. Vincent C., Amalberti R. (2016). Safer Healthcare. Strategies for the Real World. Heidelberg: Springer. [PubMed] [Google Scholar]
  177. Wageman R., Fisher C. M., Hackman J. R. (2009). Leading teams when the time is right. Finding the best moments to act. Organ. Dyn. 38 192–203. 10.1016/j.orgdyn.2009.04.004 [DOI] [Google Scholar]
  178. Wageman R., Hackman J. R., Lehman E. (2005). Team Diagnostic Survey: development of an instrument. J. Appl. Behav. Sci. 41 373–398. 10.1177/0021886305281984 [DOI] [Google Scholar]
  179. Weaver S. J., Sydney M. D., Rosen M. A. (2014). Team-training in healthcare: a narrative synthesis of the literature. BMJ Qual. Saf. 23 359–372. 10.1136/bmjqs-2013-001848 [DOI] [PMC free article] [PubMed] [Google Scholar]
  180. Weiss M., Kolbe M., Grote G., Spahn D. R., Grande B. (2017a). We can do it! Inclusive leader language promotes voice behavior in multi-professional teams. Leadersh. Q. 29 389–402. 10.1016/j.leaqua.2017.09.002 [DOI] [Google Scholar]
  181. Weiss M., Kolbe M., Grote G., Spahn D. R., Grande B. (2017b). Why didn’t you say something? Using after-event reviews to affect voice behavior and hierarchy beliefs in multi-professional action teams. Eur. J. Work Organ. Psychol. 26 66–80. 10.1080/1359432X.2016.1208652 [DOI] [Google Scholar]
  182. West M. A. (2004). Effective Teamwork. Practical Lessons from Organizational Research, 2nd Edn Oxford: BPS Blackwell. [Google Scholar]
  183. West M. A., Markiewicz L., Dawson J. F. (2006). Aston Team Performance Inventory: Management Set. London: ASE. [Google Scholar]
  184. Woolley A. W., Aggarwal I., Malone T. W. (2015). Collective intelligence and group performance. Curr. Dir. Psychol. Sci. 24 420–424. 10.1177/0963721415599543 [DOI] [Google Scholar]
  185. Woolley A. W., Chabris C. F., Pentland A., Hashmi N., Malone T. W. (2010). Evidence for a collective intelligence factor in the performance of human groups. Science 330 686–688. 10.1126/science.1193147 [DOI] [PubMed] [Google Scholar]
  186. Yule S., Flin R., Paterson-Brown S., Maran N., Rowley D. (2006). Development of a rating system for surgeons’ non-technical skills. Med. Educ. 40 1098–1104. 10.1111/j.1365-2929.2006.02610.x [DOI] [PubMed] [Google Scholar]
  187. Zijlstra F. R. H., Waller M. J., Phillips S. I. (2012). Setting the tone: early interaction patterns in swift-starting teams as a predictor of effectiveness. Eur. J. Work Organ. Psychol. 21 749–777. 10.1080/1359432X.2012.690399 [DOI] [Google Scholar]
