Med Educ. 2019 Jun 25;53(10):978–988. doi: 10.1111/medu.13896

Peer‐supported faculty development and workplace teaching: an integrative review

Narelle Campbell 1,2, Helen Wozniak 2,3, Robyn L Philip 1, Raechel A Damarell 4
PMCID: PMC6771963  PMID: 31238387

Abstract

Context

The use of peer support as a faculty development technique to improve clinical teaching is uncommon in medical education, despite the benefits of situating learning in the workplace. The authors therefore conducted a broad search seeking theoretical and empirical literature describing peer support strategies for clinical teachers in health care workplaces. This included descriptive and non‐experimental studies that are often excluded from reviews. The review aimed to identify and assess existing initiatives and to synthesise key challenges and benefits.

Methods

An integrative literature review was undertaken (2004–2017), based on searches of eight international electronic databases and targeted manual searches. Key concepts, elements and models were mapped using an iterative, constant comparative method. An evaluative framework, drawing on previous research, informed conclusions regarding the quality of evidence.

Results

From a pool of 5735 papers, 34 met the inclusion criteria. The majority referred to studies conducted in the USA (59%) and in the medical profession (71%). Analysis revealed a trend towards using a collaborative model (56%), voluntary participation (59%), and direct workplace observation by a peer clinician (68%). Design features of the peer support strategy were commonly reported (65%), with over half providing outcome measures (56%). Few papers reported on process evaluation (15%) or evidence of programme sustainability (15%). Despite logistical and time‐associated challenges, benefits accrued to individuals and the workplace, and included improved teaching practices. Embedding the peer support strategy into routine organisational practice proved effective.

Conclusions

The results indicated that a workplace‐based peer support model is an acceptable and effective faculty development strategy for health care clinical teachers. We emphasise conceptualising workplace‐based peer support via a sociocultural model that acknowledges the significance of educational design, of peers as collaborators, and of workplace context and culture. Future research should focus on clarification studies informed by contemporary models of faculty development, in which factors impacting the health care workplace are considered.

Short abstract

By reviewing the literature, Campbell et al. highlight how collaborative relationships and workplace connections with peers offer an under‐utilised resource for effectively improving teaching expertise.

Introduction

Practising clinicians provide a significant amount of workplace‐based teaching and supervision for students and early career health professionals. The importance of the clinical teacher role in providing the learning nexus between the patient,1 application of knowledge,2 role identity development3 and learner cannot be overstated. Many clinicians teach, with little or no training, because it is an expected part of their work portfolio. Classroom‐based, workshop‐style faculty development, attended by individuals, may provide theoretical understanding, but cannot ensure transfer of skills and knowledge into the workplace and workplace culture.4 Accessible and effective faculty development to expand clinician teaching expertise is critical5 as a means of moving beyond the apprenticeship model6, 7 so often found in workplace‐based medical education.

One option for improving clinical teaching effectiveness in the clinical workplace is to develop strategies that foster connections and relationships between trusted colleagues. By adopting the role of peer mentor,5, 8, 9 a colleague can assist a peer to develop his or her skills of critical reflection on teaching practice. As models of faculty development evolve, the importance of relationships between facilitators, participants and professional development programmes has become of greater concern,8 as have the impacts of contextual and cultural factors that influence acceptance and uptake of faculty initiatives.4, 5 There is now a focus on harnessing the added value of social and professional networks, and the communities of practice found in the workplace.4, 5, 8, 10

Universities already implement peer support strategies in classroom settings, including in medical programmes.11, 12, 13 Simultaneously, research has increased our theoretical understanding of the peer observation process and identified key components and models of peer support.13, 14 Gosling's14 pivotal work identified three models of peer support that are, respectively, evaluative, developmental and collaborative. Evaluative and developmental models commonly engage experts to make judgements about teaching performance. Collaborative models, however, aim to promote self‐reflection and growth through non‐judgemental feedback amongst peers. Through their reliance on peers as equals, and negotiated practices, collaborative models of faculty development emphasise relationships, ‘reflective practice based on dialogue’14 and the ‘social enterprise’ of learning.8

Although a greater focus on access to peer coaching and mentoring opportunities to increase workplace‐based teaching effectiveness is supported by two Best Evidence Medical Education (BEME) guides,5, 15 the use of peers as a support strategy to improve clinical teaching is still infrequent in medical workplace contexts.5, 16

Aims of the study

This paper presents an integrative review17 investigating the theoretical and empirical literature on the use of peer support strategies as educational interventions for clinical teachers in health care workplaces. The term ‘clinical teaching’ is applied here to include all activities undertaken by a health professional relating to the development of a learner in the workplace. The overarching research question was: In the health professions, is workplace‐based peer support for clinical teachers an acceptable and effective faculty development strategy?

The inquiry specifically aimed to:

  1. describe and analyse existing faculty development initiatives that incorporate peer support strategies for clinical teachers in the health care workplace;

  2. assess the quality of the faculty development initiatives, and

  3. identify key challenges and benefits to implementing peer support strategies in order to identify knowledge gaps for future research.

Methods

Our inquiry was underpinned by a pragmatist epistemology18, 19 in which knowledge and the research process are assumed to be situated within social, historical and political contexts, and pluralistic methods are used to uncover knowledge about problems. Across the selected studies we employed an integrative approach17 to extract and code data from the primary studies. We then implemented an iterative comparative analysis with ongoing refinement and validation to manage the varied data sources from the diverse methodologies employed by authors. Cook et al.20 support this approach as it widens the scope of studies included in the review beyond those with purely experimental designs.

Search strategy

We used two overarching methods to identify the manuscripts: electronic database searches designed and conducted by a research librarian (author RAD), and further targeted manual searches undertaken by the remaining authors. The electronic search strategy was informed by scoping exercises in the OVID MEDLINE database, and then accurately translated for the following databases: EMBASE (OVID); PsycINFO (OVID); CINAHL (Cumulative Index to Nursing and Allied Health Literature) (EBSCOhost); Web of Science, and Informit (Health and Education subsets only). Searches comprised combinations of database‐specific subject headings, where available, and a wide range of text words to optimise search sensitivity. The search covered the years 2004–2017. Figure S1 shows a PRISMA (preferred reporting items for systematic reviews and meta‐analyses) flow chart and key search terms. Full search strategies are available on request.

The database search was supplemented with a table of contents search of six leading medical education journals (Academic Medicine, Medical Education, Medical Teacher, BMC Medical Education, The Clinical Teacher and Focus on Health Professional Education). As an additional process to ensure search strategy rigour, we included a forward citation check and manual review of the reference lists of included papers.

Inclusion criteria

The inclusion criteria for the review required:

  1. participants to be health professionals (papers involving any other profession, including higher education, were excluded), and

  2. peers to be workplace colleagues or peers, with or without educational expertise (not students).

Peer support strategies were required to:

  1. have been developed and conducted using an explicit process;

  2. have been situated in the clinical workplace (not classroom or simulation laboratory);

  3. have been aimed at improving clinical teaching or supervision activities (not clinical practice), and

  4. have included any but not necessarily all of the following elements: briefing; observation; debriefing or discussion, and reflection.

Finally, papers were required to have been published in peer‐reviewed publications.

Study records

All citations identified by electronic database searches and manual content page searching were downloaded into EndNote X8 (Clarivate Analytics, Philadelphia, PA, USA) and duplicates removed. Each of three authors (NC, HW and RLP) screened one‐third of all citations, applying the inclusion criteria to the title, abstract, keywords and publication type. A random sample of 5% of the excluded papers was reviewed by all authors, a technique recommended by Hammick et al.21 This resulted in 100% agreement. We sourced the full texts of papers for which relevance could not be determined by citation alone.

Data analysis and synthesis

Three authors independently extracted information from the included papers using a standardised data table. The key information comprised study design, context, sample size, analysis of the peer support strategy, evaluation conducted and overall conclusions. We also noted any qualitative analysis outcomes or author comments on the process. Tables S1 and S2 show the results of this process.

Quality appraisal of included papers

In tandem with a detailed descriptive analysis and synthesis, quality was appraised using two methods. Firstly, applying Cook and Ellaway's22 evaluation framework, we examined the type of evaluation data reported and aligned the data to typical stages in the educational design cycle. We regard the ‘cycle’ as iterative, encompassing needs analysis, design and development of peer support strategies, evaluation of implementation processes and outcomes, and evidence of sustainability and/or wider dissemination. This framework provided a means for synthesising quantitative and qualitative data, and contextual and educational processes,8, 23, 24, 25 while still incorporating the more traditional outcome measures emphasised by Kirkpatrick and Kirkpatrick (reaction, learning, behaviour and results).26 Using this first method, participants’ responses to peer support strategies were compared and contrasted, along with changes to teaching behaviour and workplace culture, and impact on student learning. A second quality appraisal was conducted using the BEME scale (1 = no clear conclusions, 5 = results are unequivocal)21 to rate the strength of the findings in the subgroup of papers classified as research reports.

Results

The electronic search strategies identified 5735 citations. The removal of duplicates left 4198 citations for screening. Full‐text papers were obtained for 74 citations and an additional two papers were found via the journal contents search. Of these 76 papers, only 27 fully met the inclusion criteria. A further seven papers were identified by checking the reference lists of these included papers and by forward citation searching. Thirty‐four unique papers were therefore included in the review. The search process is shown in PRISMA flow chart format27 in Figure S1.

General characteristics of included papers

Table 1 details the characteristics of the included papers.13, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60 Most of the studies had been conducted within the medical profession, in the USA, and reported on the implementation of a peer support strategy. Notably, there were no interprofessional studies. The majority of the studies were research reports using surveys, focus groups or observation to collect data from small samples. Although most of the papers described their implementation of a peer support strategy, or explained a potential strategy (‘tell how’ papers), four aimed to assess the acceptability of a peer support strategy within the workplace (‘test the waters’ papers).

Table 1.

Key characteristics, design features and evaluation results reported in the included papers (n = 34). Values are numbers (%) of studies.a

Geographic location
USA28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47 20 (59%)
UK13, 48, 49, 50, 51, 52, 53, 54, 55 9 (26%)
Canada56, 57 2 (6%)
Australia58, 59 2 (6%)
Australia and UK60 1 (3%)
Profession
Medicine13, 28, 29, 31, 34, 36, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 50, 51, 52, 53, 54, 55, 60 24 (71%)
Nursing30, 37, 56, 58 4 (12%)
Dentistry49, 57 2 (6%)
Pharmacy32, 33 2 (6%)
Counselling35 1 (3%)
Physiotherapy59 1 (3%)
Aim of paper
Implement peer support strategy29, 32, 33, 34, 35, 36, 38, 39, 40, 42, 44, 45, 46, 49, 51, 53, 54, 55, 56, 57, 58, 59 22 (65%)
Tell how (to implement strategy)13, 28, 31, 37, 41, 47, 50, 52 8 (24%)
Test the waters (assess acceptability of strategy)30, 43, 48, 60 4 (12%)
Type of study
Research report29, 32, 34, 36, 38, 39, 42, 43, 44, 45, 46, 48, 49, 51, 53, 54, 55, 56, 57, 59, 60 21 (62%)
Showcase30, 33, 35, 37, 40, 50, 58 7 (21%)
How to guide13, 28, 31, 41, 47, 52 6 (18%)
Design features of the peer support strategy
Peer support model
Collaborative13, 28, 32, 34, 35, 38, 39, 40, 41, 45, 49, 51, 52, 55, 56, 57, 58, 59, 60 19 (56%)
Developmental29, 30, 31, 36, 42, 43, 44, 46, 47, 48, 50, 53 12 (35%)
Evaluative33, 37, 54 3 (9%)
Strategy type
Workplace observation13, 28, 29, 30, 31, 32, 33, 34, 36, 37, 39, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 52, 53, 54, 58, 60 26 (76%)
Community of practice35, 40, 51, 57 4 (12%)
Reflective practice38, 55, 56, 59 4 (12%)
Nature of participation (if defined)
Voluntary28, 31, 32, 34, 35, 36, 38, 39, 40, 43, 48, 49, 51, 53, 54, 55, 56, 57, 58, 59 20 (59%)
Mandated29, 30, 33, 37, 44, 46 6 (18%)
Status of peer (if defined)
Clinician peer13, 28, 30, 31, 32, 34, 35, 39, 40, 41, 42, 45, 46, 48, 49, 51, 52, 54, 55, 57, 58 21 (62%)
Clinician educator33, 36, 37, 38, 43, 44, 53, 56, 59 9 (26%)
Observation guide or tool (if described/used)
Self‐developed29, 31, 32, 34, 36, 41, 42, 44, 45, 46, 47, 49, 50, 53, 54 15 (44%)
Informed by professional competencies, or validated tool28, 30, 33, 37, 39, 43 6 (18%)
Process includedb
Training13, 29, 30, 31, 33, 34, 35, 37, 38, 39, 41, 42, 43, 44, 46, 48, 49, 50, 51, 53, 56, 57, 59 23 (68%)
Pre‐strategy briefing to clarify process13, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 43, 47, 48, 49, 50, 51, 52, 53, 56, 57, 58 24 (71%)
Explicit mutual agreement on goals of peer support13, 30, 34, 35, 36, 38, 40, 41, 43, 47, 48, 49, 51, 52, 53, 56, 57 17 (50%)
Debrief/feedback immediately post‐strategy13, 29, 31, 32, 34, 35, 36, 39, 40, 41, 42, 43, 45, 47, 49, 51, 52, 53, 54, 55 20 (59%)
Explicit reflection on learning by participant13, 28, 30, 31, 33, 34, 35, 36, 38, 39, 40, 41, 43, 44, 45, 47, 48, 49, 51, 52, 53, 54, 55, 56, 57, 58, 59 27 (79%)
Evaluation data reportedb, c
Design and development of peer support strategy 22 (65%)
Needs analysis30, 40, 43, 48, 50, 57, 59, 60 8 (24%)
Development process13, 30, 31, 32, 33, 35, 37, 39, 40, 41, 43, 44, 46, 47, 50, 53, 56, 57, 59 19 (56%)
Pilot30, 44, 53, 54 4 (12%)
Evaluation of process during implementation28, 35, 39, 44, 56 5 (15%)
Evaluation of outcomes 19 (56%)
Participant reaction29, 36, 39, 45, 46, 49, 51, 53, 54, 55, 57, 59 or intention to change36, 39, 45, 49, 53 12 (35%)
Changed teaching behaviour34, 36, 39, 42, 45, 46, 53, 55, 56, 57 or impact on student learning32, 38, 44, 53, 59 14 (41%)
Evaluation of sustainability and dissemination34, 44, 49, 55, 57 5 (15%)
a Rounding effect means percentages may not add to 100.

b Percentages are not additive in the following sections as each paper may have reported on more than one aspect.

c Based on Cook and Ellaway's evaluation framework.22

Design features of the peer support strategy

Using our previous work mapping peer support strategies in the workplace,61 and drawing on O'Sullivan and Irby's model of faculty development,8 we analysed the design features of the peer support strategies reported in the papers. As Table 1 shows, over half of the papers favoured Gosling's collaborative model of peer support. The reviewed literature emphasised the trend for voluntary participation in a peer support strategy, and direct workplace observation by a clinician peer. Only one paper reported observation by videoconferencing to enable participation across dispersed sites.29 Although 21 studies used observation guides, the majority of which were self‐developed, only six papers detailed how this was achieved. Observations were often preceded by training in the process, and a briefing that encouraged participants to generate their own goals for the observation. The time taken to complete a peer observation varied widely (mean time: 159 minutes; range: 8 minutes to 1 day), and generally included immediate feedback from the peer observer and reflection on learning outcomes.

Evaluation of the design and implementation

The majority of papers (n = 22, 65%) reported on the design and development of the peer support strategy (Table 1). Eight papers described a structured workplace needs analysis to determine the viability and acceptability of the peer support strategy, but only three of these described its subsequent implementation.30, 57, 59 Although 19 papers described the development of their respective programmes, only four conducted a pilot run.30, 44, 53, 54

Very few papers (n = 5, 15%) evaluated the study's implementation process. However, examples that were documented included personal reflections during implementation,28, 35, 56 modifications to enhance feedback processes following peer observation,44 and the identification of unanticipated outcomes such as direct benefits to the observer.28, 56

Evaluation of outcomes: benefits and challenges for participants and the workplace

Over half of the included papers (n = 19, 56%) provided outcome measures that could be attributed to Kirkpatrick's four levels of training evaluation. These included 12 papers that utilised surveys or interviews to evaluate participant reactions or attitudes towards the peer support strategy, and five papers that captured participants’ intentions to change their clinical teaching practice (Table 1).

In terms of outcomes for participants, initial anxiety or misgivings13, 41, 46, 49, 52, 53, 56 were ameliorated by discovering that the peer support strategy was a reciprocal13, 41, 45, 53 and valuable28, 30, 34, 36, 43, 44, 51, 60 process. Intentional, focused feedback and opportunities for reflection were reported to contribute positively to a productive culture of teaching in the workplace.13, 33, 40, 44, 55 In turn, this reduced participants’ sense of isolation as teachers,13, 49, 57 aided more widespread recognition of the benefits of teaching, and increased overall commitment to teaching.36, 46, 58 Participants reported the process confirmed the quality of existing teaching practice30, 38, 49, 56 and provided direction to improve teaching practice.32, 34, 35, 39, 42, 44, 45, 53, 57, 58, 60 Growth in confidence in giving feedback was specifically noted.31, 39, 46

Additional impacts for participants included changes to micro‐teaching skills and behaviours. Evidence was provided via self‐reports39, 46, 53, 55, 57 and direct observations of change.34, 36, 42, 45, 56 Examples of change included enhanced questioning skills,34, 36, 42 better organisation of teaching,34, 36, 45 strategies to engage learners at multiple levels,34, 36 and heightened awareness of how to assist learners to self‐evaluate.56 Observers found unexpected learning benefits in watching their peers teach.13, 41, 45, 53 The impact of peer support strategies on students’ experiences of clinical teaching was less commonly reported,32, 38, 44, 53, 56, 59 and all papers shied away from suggesting any links between improved student learning outcomes and implementation of improved clinical teaching practices.

In addition to benefits and challenges for participants, significant contextual workplace challenges emerged from the reported outcome evaluations. Common barriers included a lack of time for participation,13, 28, 32, 33, 43, 48, 52, 55, 56, 57 and general logistical issues.36, 46, 55 Three papers reportedly addressed these barriers by embedding peer support strategies into organisational routines.34, 40, 46 In one example participants prepared written reflections prior to peer‐facilitated group discussions, which fostered the development of a shared knowledge base, the cultivation of relationships and the sustaining of a community of practice.40 Additional workplace benefits reportedly included identification of further faculty development needs44, 49 and anecdotal changes to workplace culture, such as an enhanced sense of community and improved communication between clinical teachers.55, 57

Few papers provided long‐term data describing the programme's sustainability. One described multiple iterations34 and one documented dissemination to other health workplaces.44 Overall, the length of implementation varied widely from 2 weeks42 to 5 years34 (Tables S1 and S2).

Further quality appraisal of the evidence

Overall, evaluation data were reported in all except two of the papers,52, 58 and notably none reported evaluation that spanned all stages of the educational design cycle. For the subset classified as research reports, and evaluated using the BEME strength of evidence rating (Table 1, n = 21), there was a wide variation in the degree to which results supported conclusions (Table S2). Two of the four papers32, 42, 45, 54 with the lowest BEME ratings (1 or 2) were short reports that lacked the detail necessary to assess the conclusions reported.42, 45 One of these reported a randomised controlled trial research design,42 but was limited by low participant numbers in the study group (n = 6). Generally, in studies in which quantitative data were collected and analysed, sample sizes were low32, 39, 45, 59 and hence any statistical findings should be read with caution.

Despite these limitations, 11 of the 21 (52%) papers were deemed to have reported conclusions based on the results collected (BEME rating 4: results clear and very likely to be true).36, 38, 39, 46, 48, 49, 51, 56, 57, 59, 60

Discussion

Informed by contemporary models of faculty development,5, 8 this integrative review enabled us to draw conclusions about how peer support strategies influence, and are influenced by, the social systems and cultures operating within health care workplaces. The review is significant because it addresses a need, identified by prominent medical education researchers,24, 62, 63 to compare, contrast and synthesise educational studies that report on disparate methodologies, contexts and challenges.64 Expanding our evaluative assessment of papers beyond commonly used outcomes‐oriented methods (i.e. Kirkpatrick's four training levels) allowed us to include rich contextual information about participants and their workplace contexts and cultures in the analysis. Had we included only papers referring to outcome measures, over half of the papers would not have been included in the review (n = 18, 53%), thereby limiting our understanding of the impact of workplace culture and participant engagement. Similarly, if only research reports (n = 21) had been included, valuable data describing the design of peer support strategies would have been omitted.

Adapting O'Sullivan and Irby's8 and Steinert et al.'s5 models, and incorporating evidence from our review, we conceptualised workplace‐based peer support from a systems perspective (Fig. 1). We concluded that formulating an acceptable and effective faculty development strategy requires that three integrated factors be accounted for: the context and culture in which the programme takes place; the characteristics of the participants, who we propose are ‘contributors’, and the educational design of the programme. This sociocultural conceptualisation frames the following discussion of key findings.

Figure 1. A sociocultural model of elements impacting health care workplace peer‐supported faculty development for teaching

We contend that to create and manage an effective peer support strategy, acceptable to clinicians teaching in the workplace, typical educational design processes must be attended to. It is important to select design and development features appropriate to the unique workplace context, to engage in evaluation throughout (particularly process evaluation during and after implementation), to capture outcomes for participants and changes in workplace behaviour, culture and professional networks, and to generate plans that enable sustainability and dissemination beyond the participant group. These design factors impact engagement, which, in turn, is impacted by the characteristics of the participant contributors, and workplace context and culture. All the elements are interdependent and impact on one another.

Addressing contextual and cultural challenges through design

Health care workplaces have complex cultures, and efforts to change these require time and patience.65 Therefore, unpacking the professional culture into which peer support strategies are to be embedded may help identify barriers to engagement, and guide design choices. The inclusion of design elements such as needs assessments, pilots and preparatory briefing sessions may assist in this process of teasing out contextual, cultural and leadership issues. On this point, we identified a number of studies (referred to as ‘test the waters’ papers; Table 1), in which a needs analysis or environmental scan was conducted. Strategies reported included an initial survey to identify participant preconceptions,60 the adoption of grassroots involvement in the design,30 and orientation briefings to ensure understanding of purpose and allay misconceptions of surveillance.43, 48 Although some papers reported the development of psychometrically valid and reliable observation tools,30, 32, 33, 37, 39, 43 we suggest that workplaces could instead invest scarce time in building a culture of supportive peer relationships, which will become the foundation for successful implementation strategies.

From the reviewed papers, therefore, two design approaches emerged as particularly successful in addressing educational design issues. These were initiatives that employed a collaborative model of peer support, and/or the selection of peers based on trusted and respected relationships, rather than reliance on outside expertise. By choosing a collaborative model for peer support, the emphasis becomes one of encouraging democratic relationships between equals, facilitation of mutual learning,14 and being a ‘contributor’ to workplace faculty development. By comparison, the evaluative and developmental models, respectively, focus on identifying underperformance and demonstrating competency.11 For a collaborative model to succeed, however, participants must take responsibility for their own learning and not assume that only education experts provide feedback. Trusted peer contributors become invaluable and valid feedback providers, and reciprocity between contributors promotes personal accountability for professional development. Additionally, clinician peers are more readily available in the workplace compared with external ‘education experts’. Availability then becomes an enabler of participation.

Benefits for contributors and the workplace

Our review casts new light on the role of O'Sullivan and Irby's ‘facilitators’,8 who, particularly in the preferred collaborative model, are equal contributors, rather than experts. This compares with the more passive participant role in faculty development strategies that rely on ‘outside’ education expert facilitators. A key outcome for contributors to the peer observation process was immersion in the process itself. By gathering qualitative evidence about their teaching practice, individuals created a personalised resource for ongoing professional development and reflection.28, 34, 46, 49 Personal reflection on self‐generated goals38, 40, 45, 51, 53, 55 is likely to encourage ownership of learning outcomes, which may not occur in contexts in which compliance instruments created by others are used, such as ‘tick the box’ observational tools.

Of course, attempts to change organisational culture bring risks and challenges. It is not easy to develop a culture that supports the unique conditions of the targeted workplace. However, embedding a peer‐focused programme as part of regular departmental practice can militate against opposition from teams or individuals.43 Finn et al.,34 for example, embedded their strategy as a regular workplace routine and reported the longest duration of all studies. They also noted the concomitant rapid development of novice clinical teachers. Such positive outcomes reflect a workplace culture that prioritises and normalises peer‐focused faculty development. We conclude, therefore, that by conceptualising peer support processes within a sociocultural model, the relationships between cultural values44 and shared teaching strategies become apparent,36 the positive collective effect on the organisation becomes visible,30, 57 the status of workplace teaching is elevated,49 and collaboration and relationship building improve.40, 43, 49, 55

With regard to sustainability, there was evidence that this building of relationships and networks within the workplace could lead to the development of self‐sustaining programmes,57 consistent with the features of a community of practice.10, 40, 55 It was also noted that participation encouraged individuals to seek out further teaching improvement opportunities.36 Hence embedding a peer support strategy within a wider programme of faculty development may inspire engagement by late adopters, following the example set by their colleagues.

Future research

As a result of this integrative review, and in response to our final research question, we identified a number of gaps in the literature and areas for future investigation. In particular, although researchers highlighted the need to illuminate factors impacting the implementation of educational interventions,21, 24, 25, 66 few systematically collected or rigorously analysed process evaluation data. We agree with other researchers5, 20, 23 that clarification studies asking the ‘how’ and ‘why’ questions should be a key research focus. These investigations might explore and evaluate alternative design characteristics to address issues such as access (e.g. the efficacy of synchronous and asynchronous digital communications) and peer availability. Peer support faculty development programmes would benefit from better understanding of the roles of social, cultural, professional and interprofessional networks in promoting workplace clinical teaching excellence,67 and of the mechanisms required to sustain programmes. Additionally, gathering evidence of changed teaching practice and improved student learning outcomes associated with programmes would be valuable.

Future studies could be framed around conceptual models of faculty development, including our sociocultural model, and those offered by O'Sullivan and Irby,8 and Steinert et al.5 Additionally, education design research, or design‐based research, shows promise as a worthwhile research paradigm for medical education research,68, 69, 70 and faculty development research,71 as it blends design, research and practice.72

Limitations

A factor limiting this review was the difficulty of managing the critical appraisal of a heterogeneous set of papers. Methodological design flaws in the papers compounded the difficulty. These included the frequent omission of any reporting of the study's underlying education philosophy, the use of ambiguous research questions, opportunistic data collection, and incomplete reporting of data analysis processes. To minimise the impact of these limitations, we applied consistent and structured iterative comparative techniques during the final selection of articles, the extraction and coding of data, and the analysis and synthesis of results. One other limitation was the restriction of our search strategy to English‐language papers published during 2004–2017, which narrowed the reach of the review. This was, however, compensated for by careful consideration of other faculty development review papers reporting earlier studies.4, 5, 15

Conclusion

This integrative review provides an evidence‐based perspective on strategies used to promote peer‐supported faculty development of teaching practice in health care‐related workplace settings. Positioning the evidence within a sociocultural model, our findings confirm the acceptability and effectiveness of faculty development strategies that focus on peer support as a means for improving teaching and learning practice in the workplace. Those initiatives that adopted collaborative and consultative approaches, and utilised trusted peer contributors, rather than outside experts, appeared to be most acceptable to participants and most effective in practice.

Using our sociocultural model, we suggest that strategy development and implementation must acknowledge and account for: (i) educational design elements; (ii) workplace relationships and the concerns of contributors, and (iii) workplace context and culture. By carefully designing workplace‐based peer support programmes around an iterative educational design process, and encouraging voluntary participation with trusted colleagues, reciprocal benefits for collaborators can be realised. Importantly, by fostering the beneficial outcome of mutual reflection on clinical teaching practice, positive relationships, communities and connections can be built amongst clinical teachers in the workplace.

Contributors

NC, HW and RLP contributed to the conception and design of the work, the analysis and interpretation of data, and the drafting and critical revision of the paper. RAD contributed to the design of the work, and the drafting and critical revision of the paper. All authors approved the final manuscript for publication and have agreed to be accountable for all aspects of the work, including the investigation and resolution of questions related to its accuracy or integrity.

Funding

None.

Conflicts of interest

None.

Ethical approval

Not applicable.

Supporting information

Figure S1. PRISMA flowchart including key search terms and search strategy.

Table S1. Descriptive data extracted from the included papers.

Table S2. Further details of the results reported in Table 1 for each element of the evaluation‐based framework.

Acknowledgements

The authors wish to acknowledge the support and expertise of Yvonne Steinert, Director of the Centre for Medical Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada, as well as the reviewers who provided helpful critiques of this paper.

References

  1. Billett S. Learning through health care work: premises, contributions and practices. Med Educ 2016;50(1):124–31.
  2. Irby DM. Excellence in clinical teaching: knowledge transformation and development required. Med Educ 2014;48(8):776–84.
  3. Pugh D, Hatala R. Being a good supervisor: it's all about the relationship. Med Educ 2016;50(4):395–7.
  4. Leslie K, Baker L, Egan‐Lee E, Esdaile M, Reeves S. Advancing faculty development in medical education: a systematic review. Acad Med 2013;88(7):1038–45.
  5. Steinert Y, Mann K, Anderson B, Dolmans D, Spencer J, Gelula M, Prideaux D. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach 2016;38(8):769–86.
  6. Rassie K. The apprenticeship model of clinical medical education: time for structural change. N Z Med J 2017;130(1461):66–72.
  7. Wilkinson TJ. Rethinking apprenticeship. N Z Med J 2017;130(1461):7–8.
  8. O'Sullivan PS, Irby DM. Reframing research on faculty development. Acad Med 2011;86(4):421–8.
  9. Newman LR, Roberts DH, Frankl SE. Twelve tips for providing feedback to peers about their teaching. Med Teach 2018; doi: 10.1080/0142159x.2018.1521953 [Epub ahead of print].
  10. de Carvalho‐Filho MA, Tio RA, Steinert Y. Twelve tips for implementing a community of practice for faculty development. Med Teach 2019; doi: 10.1080/0142159x.2018.1552782 [Epub ahead of print].
  11. Irby DM. Peer review of teaching in medicine. Acad Med 1983;58(6):457–61.
  12. McLeod P, Steinert Y, Capek R, Chalk C, Brawer J, Ruhe V, Barnett B. Peer review: an effective approach to cultivating lecturing virtuosity. Med Teach 2013;35(4):e1046–51.
  13. Thampy H, Kersey N. Peer observation of clinical teaching: a guide. MedEdPublish 2015;5.
  14. Gosling D. Collaborative peer‐supported review of teaching. In: Sachs J, Parsell M, eds. Peer Review of Learning and Teaching in Higher Education: International Perspectives. Dordrecht: Springer; 2014:13–31.
  15. Steinert Y, Mann K, Centeno A, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach 2006;28(6):497–526.
  16. Hydes C, Ajjawi R. Selecting, training and assessing new general practice community teachers in UK medical schools. Educ Prim Care 2015;26(5):297–304.
  17. Whittemore R, Knafl K. The integrative review: updated methodology. J Adv Nurs 2005;52(5):546–53.
  18. Cherryholmes CH. Notes on pragmatism and scientific realism. Educ Res 1992;21(6):13–7.
  19. Creswell JW. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 3rd edn. Thousand Oaks, CA: Sage Publications; 2009.
  20. Cook DA, Bordage G, Schmidt HG. Description, justification and clarification: a framework for classifying the purposes of research in medical education. Med Educ 2008;42(2):128–33.
  21. Hammick M, Dornan T, Steinert Y. Conducting a best evidence systematic review. Part 1: from idea to data coding. BEME Guide No. 13. Med Teach 2010;32(1):3–15.
  22. Cook DA, Ellaway RH. Evaluating technology‐enhanced learning: a comprehensive framework. Med Teach 2015;37(10):961–70.
  23. Haji F, Morin M‐P, Parker K. Rethinking programme evaluation in health professions education: beyond ‘did it work?’. Med Educ 2013;47(4):342–51.
  24. Sharma R, Gordon M, Dharamsi S, Gibbs T. Systematic reviews in medical education: a practical approach: AMEE Guide 94. Med Teach 2015;37(2):108–24.
  25. Yardley S, Dornan T. Kirkpatrick's levels and education ‘evidence’. Med Educ 2012;46(1):97–106.
  26. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels, 3rd edn. San Francisco, CA: Berrett‐Koehler; 2006.
  27. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta‐analyses: the PRISMA statement. PLoS Med 2009;6(7):e1000097.
  28. Beckman TJ. Lessons learned from a peer review of bedside teaching. Acad Med 2004;79(4):343–6.
  29. Biery N, Bond W, Smith AB, Leclair M, Foster E. Using telemedicine technology to assess physician outpatient teaching. Fam Med 2015;47(10):807–10.
  30. Blauvelt MJ, Erickson CL, Davenport NC, Spath ML. Say yes to peer review: a collaborative approach to faculty development. Nurse Educ 2012;37(3):126–30.
  31. Chandler D, Snydman L, Rencic J. Implementing direct observation of resident teaching during work rounds at your institution. Acad Int Med Insight 2009;7(4):14–5.
  32. Cox CD, Peeters MJ, Stanford BL, Seifert CF. Pilot of peer assessment within experiential teaching and learning. Curr Pharm Teach Learn 2013;5(4):311–20.
  33. Elmore L, Blair M, Edgerton L. Preceptor development strategies used in a mixed academic‐community teaching hospital. Curr Pharm Teach Learn 2014;6(1):167–73.
  34. Finn K, Chiappa V, Puig A, Hunt DP. How to become a better clinical teacher: a collaborative peer observation process. Med Teach 2011;33(2):151–5.
  35. Granello DH, Kindsvatter A, Granello PF, Underfer‐Babalis J, Moorhead HJH. Multiple perspectives in supervision: using a peer consultation model to enhance supervisor development. Couns Educ Supervision 2008;48(1):32–47.
  36. Gusic M, Hageman H, Zenni E. Peer review: a tool to enhance clinical teaching. Clin Teach 2013;10(5):287–90.
  37. Lundeen JD, Warr RJ, Cortes CG, Wallis F, Coleman JJ. The development of a clinical peer review tool. Nurs Educ Perspect 2018;39(1):43–5.
  38. Mai CL, Baker K. Peer‐assisted analysis of resident feedback improves clinical teaching: a case report. A A Case Rep 2017;9(1):24–7.
  39. Mookherjee S, Monash B, Wentworth KL, Sharpe BA. Faculty development for hospitalists: structured peer observation of teaching. J Hosp Med 2014;9(4):244–50.
  40. Murray SB, Levy M, Lord J, McLaren K. Peer‐facilitated reflection: a tool for continuing professional development for faculty. Acad Psychiatry 2013;37(2):125–8.
  41. Newman L, Roberts D, Schwartzstein R. Peer Observation of Teaching Handbook. MedEdPORTAL Publications; 2012;8.
  42. Parrott S, Dobbie A, Chumley H. Peer coaching shows promise for residents as teachers. Fam Med 2006;38(4):234–5.
  43. Peyre SE, Frankl SE, Thorndike M, Breen EM. Observation of clinical teaching: interest in a faculty development program for surgeons. J Surg Educ 2011;68(5):372–6.
  44. Regan‐Smith M, Hirschmann K, Iobst W. Direct observation of faculty with feedback: an effective means of improving patient‐centered and learner‐centered teaching skills. Teach Learn Med 2007;19(3):278–86.
  45. Rendon P, Rao D, Pierce JR. Hospitalist peer observation of teaching leads to changes and adoption of new teaching behaviors. J Gen Intern Med 2015;30(Suppl):S497–8.
  46. Snydman L, Chandler D, Rencic J, Sung YC. Peer observation and feedback of resident teaching. Clin Teach 2013;10(1):9–14.
  47. Zenni E, Hageman H, Hafler J, Gusic M. Peer Feedback Tool for Clinical Teaching. MedEdPORTAL Publications; 2011;7(8560).
  48. Adshead L, White PT, Stephenson A. Introducing peer observation of teaching to GP teachers: a questionnaire study. Med Teach 2006;28(2):e68–73.
  49. Cairns AM, Bissell V, Bovill C. Evaluation of a pilot peer observation of teaching scheme for chair‐side tutors at Glasgow University Dental School. Br Dent J 2013;214(11):573–6.
  50. Fry H, Morris C. Peer observation of clinical teaching. Med Educ 2004;38(5):560–1.
  51. Main P, Curtis A, Pitts J, Irish B. A ‘mutually agreed statement of learning’ in general practice trainer appraisal: the place of peer appraisal by experienced course members. Educ Prim Care 2009;20(2):104–10.
  52. Metcalfe MJ, Farrant M, Farrant J. Peer review practicalities in clinical medicine. Adv Med Educ Pract 2010;1:49–52.
  53. Pattison AT, Sherwood M, Lumsden CJ, Gale A, Markides M. Foundation observation of teaching project – a developmental model of peer observation of teaching. Med Teach 2012;34(2):e136–42.
  54. Spicer J, Torry R. Can peer group review by trainer group make a robust and effective contribution to the reapproval process across London Deanery? Educ Prim Care 2011;22(4):263–5.
  55. Sneddon A, MacVicar R. Annual trainer peer‐review: impact on educational practice and sense of community. Educ Prim Care 2016;27(2):114–20.
  56. Mahara MS, Jones JA. Participatory inquiry with a colleague: an innovative faculty development process. J Nurs Educ 2005;44(3):124–30.
  57. Tax CL, Doucette H, Neish NR, Maillet JP. A model for cultivating dental hygiene faculty development within a community of practice. J Dent Educ 2012;76(3):311–21.
  58. Barnard A, Harvey T, Theobald K, Tippett V, Rider T. Supporting clinical facilitators through peer review of teaching. Aust Nurs Midwifery J 2016;24(4):34–5.
  59. Thomson K, Nguyen M, Leithhead I. Peer mentoring for clinical educators: a case study in physiotherapy. Focus Health Prof Educ 2016;17(3):30–44.
  60. Caygill R, Peardon M, Waite C, McIntyre I, Bradley D, Wright J. Attitudes towards peer review of teaching in medical education. Focus Health Prof Educ 2017;18(2):47–60.
  61. Campbell N, Wozniak H. Work‐based peer review of clinical supervision practice: A guide to creating a culture of quality supervision. Darwin, NT: Greater Northern Australia Regional Training Network; 2014.
  62. Cook DA. Narrowing the focus and broadening horizons: complementary roles for systematic and nonsystematic reviews. Adv Health Sci Educ Theory Pract 2008;13(4):391–5.
  63. Eva KW. On the limits of systematicity. Med Educ 2008;42(9):852–3.
  64. Gordon M. Are we talking the same paradigm? Considering methodological choices in health education systematic review. Med Teach 2016;38(7):746–50.
  65. Dornan T. Workplace learning. Perspect Med Educ 2012;1(1):15–23.
  66. Steinert Y, Naismith L, Mann K. Faculty development initiatives designed to promote leadership in medical education. A BEME systematic review: BEME Guide No. 19. Med Teach 2012;34(6):483–503.
  67. Barrett J, Scott K. Pedagogical and professional compromises by medical teachers in hospitals. Clin Teach 2014;11(5):340–4.
  68. Blake A, Doherty I. An instructional design course for clinical educators: first iteration design research reflections. J Learn Design 2007;2(2):83–104.
  69. Dolmans DHJM, Tigelaar D. Building bridges between theory and practice in medical education using a design‐based research approach: AMEE Guide No. 60. Med Teach 2012;34(1):1–10.
  70. Wozniak H. Conjecture mapping to optimize the educational design research process. Australas J Educ Technol 2015;31(5):597–612.
  71. O'Sullivan PS, Irby DM. Promoting scholarship in faculty development: relevant research paradigms and methodologies. In: Steinert Y, ed. Faculty Development in the Health Professions: A Focus on Research and Practice. Dordrecht: Springer Science Business Media; 2014:375–98.
  72. McKenney SE, Reeves TC. Conducting Educational Design Research. Oxford: Routledge; 2019.
