Abstract
Introduction
Feedback is held to be one of the defining components of an effective and efficacious electronic portfolio (e-portfolio), yet its validity in this role has yet to be demonstrated. While the literature has shown individual beneficial features of e-portfolios and of feedback per se, evidence that feedback mediated through technology directly results in improved educational practice is scarce. Explanations of how feedback via e-portfolio improves educational practice are particularly vague.
Methods and analysis
The aim of this research is to unpack how and why feedback via e-portfolio is likely to flourish or wither. Given the complexity of the intervention, we will apply a theory-driven approach to evidence synthesis called realist synthesis. Informed by a realist philosophy of science, it seems the most appropriate method because it explores observed outcomes (O) in terms of the causal relationships between relevant contexts (C) and generative mechanisms (M). An initial programme theory will be developed through literature scoping and then tested against purposively gathered evidence (through database and journal searches), which will simultaneously be evaluated for rigour and relevance (whether the methods used are trustworthy and whether the data contribute to theory building). We strive to (1) uncover 'context-sensitive' mechanisms that make feedback via e-portfolio (in)effective and (2) define the circumstances in which this is most likely to occur.
Ethics and dissemination
The synthesis report will be written according to the RAMESES guidelines and its findings will be published in peer-reviewed articles and presented at relevant conferences. The aim is to inform: (1) policy and decision makers for future course design; (2) medical educators/clinical supervisors and learners for improved educational use. No formal ethical approval is required.
PROSPERO registration number
120863.
Keywords: feedback, e-portfolio, realist synthesis, effectiveness, healthcare education, implementation, systematic review
Strengths and limitations of this study
With realist synthesis we account for the breadth and depth of analyses appropriate for complex educational interventions.
No prior realist synthesis has been undertaken on the topic of how feedback via electronic portfolios works effectively.
In developing our initial programme theory we include stakeholder group inputs.
Content experts are not included in the development of the initial programme theory.
Only studies published in English will be included.
Background
Introduction
Despite variations in content and format, portfolios are essentially a means through which healthcare learners can report on work done, feedback received, progress made and their plans for improving competence.1 Portfolios in postgraduate healthcare education can be employed for a range of end-purposes, including reflective practice and assessment (summative and formative), and act as an essential connection between workplace learning at the organisational and individual levels.2 As such, the content of a portfolio may vary according to the requirements of an organisation and the design of the training programme. For example, the content of medical trainees' electronic portfolios (e-portfolios) may include quantitative assessments (eg, the Mini-Clinical Evaluation Exercise, Direct Observation of Procedural Skills, Case-based Discussion and 360-degree Evaluation), reflective writing (eg, a medical ethics and legislation report, healthcare quality report and personal development report) and an evidence-based medicine report. In the context of such a portfolio, clinical teachers are required to provide appropriate feedback to trainees on the assessments and reports contained within.3 Finally, portfolios can be either physical documents or managed online (the latter known as an e-portfolio).
Interest in e-portfolio use in healthcare education has been on the rise, probably because both portfolios in general and electronic versions in particular have been shown to be beneficial to the user. In all their complexity of design, content and interface,1 4 what makes them stand out from other educational tools is their ability to encourage reflective practice and self-directed learning,5 6 which caters well to an educational discourse that emphasises competence-oriented, individualised learning. By fostering feelings of ownership and personal development,7 they encourage learners to become more self-aware of their learning process and more responsible for the creation, maintenance and presentation of their own portfolios.8
Contextual use of electronic portfolios in healthcare education
The e-portfolio in healthcare education is distinguished by its flexibility of access, storage and content.1 2 9 10 When explaining its usage, scholars tend to emphasise its contextualisation. For instance, the nature of implementation, design and content11–14 and individual perceptions of ease of use and usefulness15 are all important facets affecting e-portfolio use and its potential to fundamentally transform the learning process.
Rather than dwelling on the notion of the e-portfolio as merely a combination of portfolio and technology,16 in this paper we argue that organisational, cultural and individual factors present a significant entry point for theorising e-portfolio use. More importantly, we do so by focusing specifically on feedback provided via e-portfolio. We aim to understand (1) in what circumstances feedback via e-portfolio works most effectively and (2) whether this relates to the fortunes and mishaps of e-portfolio use.
Effectiveness of feedback via e-portfolio
Feedback plays an influential role in educational achievement17 and, when employed in healthcare settings, it is indispensable for successful learning, clinical teaching and improved clinical performance.18 19 Surprisingly, in healthcare education little is known about how feedback can be used to maximise its impact on learning, behaviour and improved practice, and even less about technology-enhanced feedback.
One reason for this might be that the majority of research papers on feedback published between 1980 and 2015 used the lowest of Kirkpatrick's levels of evaluation (assessing reactions to feedback), and only 7% of the 650 included articles concerned computer-based feedback.20 Literature interpreting feedback as a one-way, educator-driven process, with a focus on best delivery practices only, might be another reason. Indeed, educational studies have shown time and again that the high variability in the effectiveness of feedback is too complex to be explained by delivery processes alone.17 21 The many facets of learners' feedback-seeking behaviours3 20 22 23 as well as the gaps between mentors' and learners' perceptions of the quantity, quality and efficacy of feedback have to be reconsidered if we are to fully understand feedback practice. Indeed, feedback via e-portfolio can occur in various forms, including: as asynchronous written feedback in which the educator leaves comments for the learner to find and read, as synchronous technology-enhanced feedback, as synchronous face-to-face feedback, as mandatory or voluntary, and as open access or not.
The aim of this research is to develop a model to facilitate feedback via e-portfolio and thus enhance the responsiveness to, and use of, feedback; that is, we need to understand the contextual workings of giving and receiving feedback in a technology-enhanced environment. In addition, we have to consider not only the provision of information but also the influence of the manner in which feedback is provided, the recipient's decision to receive feedback and the range of responses which might subsequently arise.
Methods
Aim
Focusing on higher educational settings internationally, we aim to understand why and how feedback via e-portfolio might produce different outcomes. For this purpose, we plan to use the Kirkpatrick hierarchy model as modified by Tochel et al 2 and distinguish outcomes that describe the impact of the intervention in terms of (a schematic encoding of these levels is sketched after the list):
Participants’ reactions (eg, their views on learning experiences, attitudes towards e-portfolio use and usefulness, aspects on the nature and efficiency of feedback).
Changes in participants’ attitudes and learning (eg, changes in perceiving e-portfolio or feedback as useful, acquisition of new concepts, improvement of skills).
Changes in participants’ behaviours (motivational changes for further learning, active engagement with agency, e-portfolio content, application of new knowledge).
Changes in organisational practices and any improvements in the health and well-being of patients occurring because of the intervention.
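To make the hierarchy concrete for later data extraction, the sketch below (in Python) shows one way these four levels could be encoded. The level names and the O1–O4 shorthand are our own labels for illustration; they are not part of the published model.

    from enum import IntEnum

    class OutcomeLevel(IntEnum):
        """Outcome levels adapted from the Kirkpatrick hierarchy as modified by Tochel et al."""
        REACTION = 1                  # O1: views on learning experiences, e-portfolio use, feedback
        ATTITUDES_AND_LEARNING = 2    # O2: changed attitudes, new concepts, improved skills
        BEHAVIOUR = 3                 # O3: motivational change, active engagement, applied knowledge
        ORGANISATION_AND_PATIENT = 4  # O4: changed organisational practice, patient well-being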
Research questions (RQ)
(RQ1) What outcomes are identified resulting from feedback via e-portfolio, and at what level do they occur?
(RQ2) What mechanisms are identified that relate to: (1) positive outcomes of feedback via e-portfolio, (2) negative outcomes of feedback via e-portfolio?
(RQ3) What are the contexts within which the mechanisms trigger these outcomes, and for whom?
Realist synthesis
To address our RQs within a rapidly developing methodological field of data synthesis,24 we have chosen a theory-driven approach called realist synthesis. Underpinned by a realist philosophy of science, the method's hallmark is its generative understanding of causality. It holds that the outcomes (O) of events are generated through underlying mechanisms (M), which may or may not fire in a given context (C).25 Mechanisms are not 'visible' (they are rooted in individual tendencies) and are 'context specific' (changeable according to the opportunities provided by specific contexts). Realist synthesis thus looks for interactions between the resources provided by the intervention and the reasoning and/or responses of the participants.26 Rather than assessing variables associated with a particular outcome, the method's strength lies in its ability to (1) explore the generative mechanisms that underlie the main causes of (un)intended outcomes and (2) highlight the circumstances in which these mechanisms are triggered.
Realist synthesis starts with a programme theory and ends, if it has been successful, with a 'revised, more nuanced and more powerful programme theory'.27 (Re)building a programme theory means drawing on theoretical descriptions of C-M-O relationships (middle-range theories) that are close enough to the data to allow empirical testing. In our case, by synthesising the data we will compare how feedback via e-portfolio was intended to work with empirical data on how it actually works in different situations, all expressed as C-M-O relationships. In this manner we might explain some of the contingencies that influence the prospect of feedback via e-portfolio generating its intended outcomes.
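Purely as an illustrative aid (it is not part of the realist method itself), a single C-M-O configuration can be thought of as a simple record. The field names and the example below are our own shorthand, drawn loosely from the initial programme theory described later.

    from dataclasses import dataclass, field

    @dataclass
    class CMOConfiguration:
        """One hypothesised context-mechanism-outcome configuration."""
        context: str    # C: circumstances of the intervention, eg 'e-portfolio use is voluntary and formative'
        mechanism: str  # M: participants' reasoning in response to the resources offered
        outcome: str    # O: the (un)intended result, tagged with its outcome level
        sources: list = field(default_factory=list)  # papers that support or refute this configuration

    example = CMOConfiguration(
        context="mentor comments are positive; e-portfolio use is voluntary and formative",
        mechanism="learner wants to engage and seeks new ways to continue the work",
        outcome="effortful engagement with mentor and e-portfolio (O3)",
    )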
Study design
The study follows the iterative steps suggested by Pawson et al,25 as well as two realist synthesis protocols: one by Wong et al 28 and the other by Pearson et al 29 (see online supplementary appendix 1 for a diagram of the project). We plan to report the actual realist synthesis according to the RAMESES publication standards30 and to use a modified flow diagram.28 31
Step 1: clarify the scope, locate existing theories and develop programme theory
The objective of the first step will be to conduct an exploratory (informal) search for various 'working theories',25 helping us to build an initial programme theory. In realist terms (underlining the relationships between contexts, mechanisms and outcomes27 30 32), we will explore ideas around how feedback via e-portfolio is intended to work and why things sometimes go astray. When getting a feel for the literature (its quality and quantity, as well as its boundary scope),25 we will be mindful not to foreclose potentially important perspectives.29 Therefore, we will conduct a broad electronic database scan for evidence with no quality assessment in mind.33 While the body of references will be narrowed down in Step 2, the documents in this stage will only need to contain information on e-portfolio-related instruments (eg, e-logbooks, personal digital assistants, personal development plans) and feedback/assessment/evaluation. To further test the developing theory, we will also conduct face-to-face interviews with e-portfolio users (clinical teachers and postgraduate trainees) as well as engage in discussion with the research team, who are familiar with the e-portfolio and feedback literature.
Initial programme theory
We have started work on this stage and have a number of potential theories that might help explain the mechanisms underlying the effectiveness of feedback via e-portfolio (see online supplementary appendix 2 Initial programme theory).
Theories of technology adaptation explain how perceptions of the e-portfolio correlate with behavioural changes in e-portfolio usage,34–36 for example, the possibility of motivational mechanisms (such as self-efficacy, subjective norms, level of e-learning enjoyment, experience and computer anxiety) and their impact on perceptions of the e-portfolio (O1, O2) and the intention to use it (O3). These theories can shed light on whether the specific technology adopted might in any way affect the effectiveness and efficacy of the feedback provided.
Other potentially valuable sources for our programme theory development are theories on feedback responsiveness and feedback-seeking behaviours.37–39 Assuming that responses to feedback arise solely from one's sense of self-worth (mediated by mechanisms such as fear of criticism, longing for appraisal and expectation of recognition), individuals are more likely to engage effortfully with technology/agency (O3) when they perceive feedback as congruent with their selfhood (regardless of the intervention context). On the other hand, individuals might be able to self-regulate their motivation in relation to a specific context. As regulatory focus theory explains,39 40 it is the 'promotion' or 'prevention' focus of the context that will dictate the nature of engagement with technology/agency. In realist terms, high engagement and behavioural changes (+/−O3) might occur only when positive aspects of the intervention are delivered in promotion-aroused conditions (C), those regulated by wishes and desires, or when negative aspects of the intervention are delivered in prevention-aroused conditions (C), those regulated by obligation and necessity. For example, in a 'promotion focus' implementation context (such as where the e-portfolio is voluntary, part of formative assessment, and the mentor's comments on the learner's tasks are positive), the learner will likely want to engage (M) with the mentor in an effortful manner (O3), or perhaps vigorously seek (M) new creative ways to continue the work (O3). By contrast, in a 'prevention focus' implementation context (such as where the e-portfolio is mandatory, part of summative evaluation, and the mentor gives negative comments), the learner will perhaps become extra hard-working (M) or hypervigilant just to avoid (M) punishment and rectify (M) the situation. In this situation, a negative aspect of the intervention (C) might lead to positive learning and behavioural changes (O3). On the other hand, if the mentor praises the learner's assignments/performance (C), it is more likely that the learner will feel no additional effort is needed (M: relaxation, indifference, disengagement), leading to no behavioural change and low engagement with self, the mentor or the e-portfolio (O).
Finally, educational alliance theory states that behavioural responses to feedback depend on the learner's evaluation of the mentor's credibility within the supervisor–trainee relationship.41 42 This might be another source for potential theory development. For example, learners who trust the credibility of the mentor (clinical competency, content credibility, personal characteristics) and of the relationship (meaningfulness and authenticity) will be more likely to contemplate feedback in an effortful manner, which will also probably lead to behavioural changes (O3).
The initial theories uncovered during our searches will be reconsidered against the empirical data. As such, it is possible that only a small number will be prioritised for synthesis, based on their greater resonance with that data.
Step 2: search for evidence
Using a more formal search for published literature in four bibliographic databases (Web of Science, Scopus, Medline+Journals@Ovid, Wiley Online Library), we will look for sufficient evidence to refine, confirm or refute our initial programme theory (see online supplementary appendix 3 Example search strategy for Medline+Journals@Ovid). Specifically, we will look for: (1) empirical (peer-reviewed full articles) and non-empirical literature (eg, reviews, opinion pieces, editorials, commentaries, conference abstracts, process evaluations, programme manuals), as long as they comply with our rigour and relevance criteria;30 32 (2) studies of all types of research design; (3) articles published in English; (4) articles published between 2008 and 2017; (5) studies with participants (in learner and educator roles) in healthcare and higher educational settings in Taiwan and abroad (see online supplementary appendix 4 Definitions of concepts and supplementary appendix 5 Inclusion/exclusion criteria for formal search).
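To give a flavour of the Boolean logic behind such a search, a minimal sketch follows. The registered strategy for Medline+Journals@Ovid is in appendix 3; the terms below are illustrative examples only, not the actual strategy.

    # Illustrative only: combining a portfolio concept block with a feedback
    # concept block, as a formal database strategy typically does.
    portfolio_block = ["e-portfolio*", "electronic portfolio*", "web-based portfolio*", "online portfolio*"]
    feedback_block = ["feedback", "assessment", "evaluation"]
    query = "({}) AND ({})".format(" OR ".join(portfolio_block), " OR ".join(feedback_block))
    print(query)
    # (e-portfolio* OR electronic portfolio* OR web-based portfolio* OR online portfolio*)
    # AND (feedback OR assessment OR evaluation)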
Compared with a more traditional systematic review, realist synthesis adopts an iterative approach to searching for multiple types of evidence, because there is no finite set of relevant papers that can be strategically defined and found in advance.25 In order to explore the literature more deeply for theoretical elements which might help to explain new findings, or to re-examine certain aspects of the developing theory,32 we expect to undertake additional inquiries such as: (1) hand-searching relevant journals (related to e-learning, e-portfolios or feedback in an educational setting, such as British Journal of Educational Technology, Australasian Journal of Educational Technology, Electronic Journal of e-Learning, International Journal of ePortfolio); (2) citation tracking (pearling); (3) skimming through grey literature platforms (https://www.jisc.ac.uk/) and (4) following up evidence encountered by chance. Additional searches will be purposeful, focusing on relevant sources for developing the programme theory. For all searches, we will make adjustments to our preliminary criteria (eg, including papers that lack sufficient data or fall outside the timeframe).
Step 3: study selection procedure and appraisal
After importing references into EndNote 9, we will undertake study selection in two phases. First, we will screen on title and abstract, excluding all references that do not specifically mention web/online portfolios and the feedback, assessment or evaluation portrayed within them. Second, we will examine the full-text documents and further exclude based on the following questions: Does this paper (or a section of it) involve feedback via e-portfolio that (1) is described as an ongoing (direct or indirect) interaction between receiver and giver using the e-portfolio as an educational tool and (2) takes place in a higher (healthcare) educational setting? Using the preliminary set of inclusion/exclusion rationales, the lead researcher (LVM) will check a randomly selected sample of 20% of the identified documents. The remaining documents will be screened by two reviewers. Any discrepancies will be discussed until agreement is reached.
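For illustration only, the phase 1 screen amounts to a simple conjunctive filter; the keyword lists below are hypothetical stand-ins for the registered criteria in appendix 5.

    # Hypothetical sketch of the phase 1 title/abstract screen.
    PORTFOLIO_TERMS = ("e-portfolio", "electronic portfolio", "web-based portfolio", "online portfolio")
    FEEDBACK_TERMS = ("feedback", "assessment", "evaluation")

    def passes_phase1(title: str, abstract: str) -> bool:
        """Keep a reference only if it mentions an online portfolio AND feedback/assessment/evaluation."""
        text = (title + " " + abstract).lower()
        return any(t in text for t in PORTFOLIO_TERMS) and any(t in text for t in FEEDBACK_TERMS)

The phase 2 judgements (the two full-text questions) require reviewer deliberation and are not reducible to such a filter.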
Aligned with the RAMESES standards and proposed quality judgements,30 32 we will appraise the quality of included sections of text for: (1) relevance, whether they address or contribute to the theories we are exploring, and (2) rigour, whether the methods used to generate the data are credible and trustworthy. Quality judgements will be made at 'the level of arguments and theory' rather than merely at 'the level of data', allowing us to consider evidence of seemingly lesser quality yet potentially relevant to programme theory development.33 However, to give an indication of the 'coherence, plausibility and appropriateness'30 of our selection, we will (1) apply elemental methodological questions43 for rigour and (2) use a hybrid tool29 44 45 to distinguish conceptually thick (rich) material from conceptually thin (weaker) material according to its ability to contribute explanations to the developing programme theory. This tool has been shown to be useful in theory-driven synthesis precisely because it allows a focus on richer sources of programme theory without discarding the weaker ones46 (see online supplementary appendix 6 Test for assessing relevance and rigour).
Step 4: data extraction and organisation
For the included full-text papers, we will develop a data extraction sheet to provide an accessible overview of our findings (see online supplementary appendix 7 Data extraction table) and import the papers into ATLAS.ti V.8 for further coding of themes. While coding, we will consider the raw data and textual descriptive findings as well as the authors' interpretations written in the results or discussion sections. For non-research papers, we will consider various forms of textual description. All relevant sections (relating to contexts, mechanisms and their relationships to outcomes) will be coded deductively (conceptual themes/codes created from the initial programme theory developed prior to data extraction) and inductively (conceptual themes/codes recognised during the process). Should a paper contribute to only one specific element of the C-M-O, we will not discard it, as we will be able to make inferences from other sources.
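A minimal sketch of what one extraction record might look like; the field names are our own assumptions and mirror the extraction sheet only loosely.

    from dataclasses import dataclass, field

    @dataclass
    class ExtractionRecord:
        """One coded section of text from an included document."""
        source_id: str    # citation key of the paper
        section: str      # eg 'results', 'discussion', 'author interpretation'
        cmo_element: str  # 'context', 'mechanism' or 'outcome'; a paper may contribute only one
        deductive_codes: list = field(default_factory=list)  # themes from the initial programme theory
        inductive_codes: list = field(default_factory=list)  # themes recognised during coding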
Step 5: data synthesis
To refine and further explain the developing programme theory through the data synthesis process, we will simultaneously analyse the evidence for potential C-M-Os and organise them into themes and semi-predictable patterns.
To identify potential C-M-Os, we will think 'backwards from the outcome'47 and try to identify the causal mechanisms alongside the contexts with which they are associated. We will be careful not to presume there is only one outcome within a chain of events.
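Schematically, and assuming the illustrative CMOConfiguration records sketched earlier, 'thinking backwards' amounts to indexing the evidence by outcome and inspecting the context-mechanism pairings behind each one.

    from collections import defaultdict

    def group_by_outcome(configurations):
        """Collect the context-mechanism pairings associated with each observed outcome."""
        patterns = defaultdict(list)
        for cmo in configurations:  # CMOConfiguration objects as sketched above
            patterns[cmo.outcome].append((cmo.context, cmo.mechanism))
        return patterns  # one outcome may be reached via several C-M pairings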
When thematically organising the data, we will take a similar approach to that described by many other researchers:27 29 44 45
Juxtapose sources of evidence, for instance, when data about the effects of feedback via e-portfolio in one paper allow an insight into its effective patterns in another paper.
Reconcile sources and identify differences, such as understanding why different results might occur in apparently similar situations.
Adjudicate sources of evidence and make judgements between studies based on their methodological strengths and weaknesses.
Consolidate sources of evidence by creating a multifaceted explanation of the intervention; that is, whenever we find different outcomes in particular contexts, we will try to explain how and why this might occur.
Situate sources of evidence, for example, when a particular mechanism is triggered in context A while another mechanism might only occur in context B.
During this stage, the programme theory will be redeveloped and refined. As we delve into our included studies and beyond, we will be mindful of unexpected patterns, which might point us to new middle-range theories and thereby further explain the dynamics that make the e-portfolio an effective means for the feedback process. Since we expect to find limited data specific to our enquiry, we recognise that some of our theoretical assumptions might be weakly supported. Nevertheless, throughout our work we will be fully transparent about the levels of evidence available to support or refute our hypotheses, giving the reader the space to decide how much of it is relevant.
Patient and public involvement statement
This realist synthesis of feedback via e-portfolio will be conducted without patient and public involvement. Our rationale is that, to the best of our knowledge, patients are not typically involved in this aspect of clinical education. As such, patients will not be invited to contribute to the study design or the interpretation of the results, or to help with writing or editing of the document. Nor will we include them in developing the dissemination strategy.
Ethics and dissemination
No formal ethical approval is required for this synthesis. We aim to publish our findings in at least one peer-reviewed journal and to present them to relevant bodies, including broader educational institutions. At present, we have a fairly vague understanding of the complex dynamics between e-portfolio and feedback; even less clear are the contingencies closely linked to them. By applying a method with the analytical strength to provide insight into this complexity,27 we hope to pinpoint, in a contextual manner, the most valued educational features of effective feedback via e-portfolio. With a forward-looking perspective, we aim not only to inform the educational community but also to give practical guidance and recommendations to policymakers on how to recreate the context, or even provide enhanced resources, in the future.
Acknowledgments
We wish to thank Prof Jan Illing and Prof Amelia Kehoe from Newcastle University for insightful guidance when preparing the manuscript.
Footnotes
Contributors: LVM and R-HF conceived the idea for the study and, in discussion with MB, designed the study and developed the protocol. MB drafted the protocol manuscript with input from LVM and R-HF. MB prepared the search strategy for Medline+Journals@Ovid and the other supplementary data. All authors have read and approved the final manuscript.
Funding: This work was supported by the Ministry of Science and Technology, Taiwan no. NMRPD1F1331.
Competing interests: None declared.
Provenance and peer review: Not commissioned; externally peer reviewed.
Patient consent for publication: Not required.
References
1. Driessen EW, Muijtjens AM, van Tartwijk J, et al. Web- or paper-based portfolios: is there a difference? Med Educ 2007;41:1067–73. doi:10.1111/j.1365-2923.2007.02859.x
2. Tochel C, Haig A, Hesketh A, et al. The effectiveness of portfolios for post-graduate assessment and education: BEME Guide No 12. Med Teach 2009;31:299–318. doi:10.1080/01421590902883056
3. Fu RH, Cho YH, Quattri F, et al. 'I did not check if the teacher gave feedback': a qualitative analysis of Taiwanese postgraduate year 1 trainees' talk around e-portfolio feedback-seeking behaviours. BMJ Open 2019;9:e024425. doi:10.1136/bmjopen-2018-024425
4. Driessen E, van Tartwijk J, van der Vleuten C, et al. Portfolios in medical education: why do they meet with mixed success? A systematic review. Med Educ 2007;41:1224–33. doi:10.1111/j.1365-2923.2007.02944.x
5. Rees C. 'Portfolio' definitions: do we need a wider debate? Med Educ 2005;39:1142. doi:10.1111/j.1365-2929.2005.02326.x
6. Rees C. The use (and abuse) of the term "portfolio". Med Educ 2005;39:436–7. doi:10.1111/j.1365-2929.2005.02119.x
7. Klampfer A, Koehler T. E-portfolios @ teacher training: an evaluation of technical and motivational factors. Proceedings of the IADIS International Conference e-Learning, 2013.
8. Lewis KO, Baker RC. The development of an electronic educational portfolio: an outline for medical education professionals. Teach Learn Med 2007;19:139–47. doi:10.1080/10401330701332219
9. Buckley S, Coleman J, Davison I, et al. The educational effects of portfolios on undergraduate student learning: a Best Evidence Medical Education (BEME) systematic review. BEME Guide No. 11. Med Teach 2009;31:282–98. doi:10.1080/01421590902889897
10. Chang C-C. A study on the evaluation and effectiveness analysis of web-based learning portfolio (WBLP). Br J Educ Technol 2001;32:435–58.
11. Duque G. Web-based evaluation of medical clerkships: a new approach to immediacy and efficacy of feedback and assessment. Med Teach 2003;25:510–4. doi:10.1080/01421590310001605697
12. Duque G, Finkelstein A, Roberts A, et al. Learning while evaluating: the use of an electronic evaluation portfolio in a geriatric medicine clerkship. BMC Med Educ 2006;6:1–7. doi:10.1186/1472-6920-6-4
13. Kjaer NK, Maagaard R, Wied S. Using an online portfolio in postgraduate training. Med Teach 2006;28:708–12. doi:10.1080/01421590601047672
14. Strivens J, Baume D, Owen C, et al. The role of e-portfolios in formative and summative assessment practices. Centre for Recording Achievement, 2008.
15. Dornan T, Carroll C, Parboosingh J. An electronic learning portfolio for reflective continuing professional development. Med Educ 2002;36:767–9. doi:10.1046/j.1365-2923.2002.01278.x
16. Woodward H, Nanlohy P. Digital portfolios: fact or fashion? Assess Eval High Educ 2004;29:227–38. doi:10.1080/0260293042000188492
17. Hattie J, Timperley H. The power of feedback. Rev Educ Res 2007;77:81–112. doi:10.3102/003465430298487
18. van de Ridder JM, Stokking KM, McGaghie WC, et al. What is feedback in clinical education? Med Educ 2008;42:189–97. doi:10.1111/j.1365-2923.2007.02973.x
19. Veloski J, Boex JR, Grasberger MJ, et al. Systematic review of the literature on assessment, feedback and physicians' clinical performance: BEME Guide No. 7. Med Teach 2006;28:117–28. doi:10.1080/01421590600622665
20. Bing-You R, Hayes V, Varaklis K, et al. Feedback for learners in medical education: what is known? A scoping review. Acad Med 2017;92:1346–54.
21. Kluger AN, DeNisi A. The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull 1996;119:254–84. doi:10.1037/0033-2909.119.2.254
22. Archer JC. State of the science in health professional education: effective feedback. Med Educ 2010;44:101–8. doi:10.1111/j.1365-2923.2009.03546.x
23. Watling CJ, Lingard L. Toward meaningful evaluation of medical trainees: the influence of participants' perceptions of the process. Adv Health Sci Educ Theory Pract 2012;17:183–94. doi:10.1007/s10459-010-9223-x
24. Munn Z, Stern C, Aromataris E, et al. What kind of systematic review should I conduct? A proposed typology and guidance for systematic reviewers in the medical and health sciences. BMC Med Res Methodol 2018;18:1–9. doi:10.1186/s12874-017-0468-4
25. Pawson R, Greenhalgh T, Harvey G, et al. Realist review – a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy 2005;10(Suppl 1):21–34. doi:10.1258/1355819054308530
26. Wong G, Greenhalgh T, Westhorp G, et al. Realist methods in medical education research: what are they and what can they contribute? Med Educ 2012;46:89–96. doi:10.1111/j.1365-2923.2011.04045.x
27. Wong G, Westhorp G, Pawson R, et al. Realist synthesis: RAMESES training materials, 2013:1–54.
28. Wong G, Brennan N, Mattick K, et al. Interventions to improve antimicrobial prescribing of doctors in training: the IMPACT (IMProving Antimicrobial presCribing of doctors in Training) realist review. BMJ Open 2015;5:e009059. doi:10.1136/bmjopen-2015-009059
29. Pearson M, Chilton R, Woods HB, et al. Implementing health promotion in schools: protocol for a realist systematic review of research and experience in the United Kingdom (UK). Syst Rev 2012;1:1–7. doi:10.1186/2046-4053-1-48
30. Wong G, Greenhalgh T, Westhorp G, et al. RAMESES publication standards: realist syntheses. BMC Med 2013;11:1–14.
31. Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med 2009;6:e1000100. doi:10.1371/journal.pmed.1000100
32. Wong G, Greenhalgh T, Westhorp G, et al. Development of methodological guidance, publication standards and training materials for realist and meta-narrative reviews: the RAMESES (Realist And Meta-narrative Evidence Syntheses – Evolving Standards) project. Health Services and Delivery Research 2014;2:1–252. doi:10.3310/hsdr02300
33. Wong G. Data gathering in realist review. In: Emmel N, Greenhalgh J, Manzano A, et al, eds. Doing Realist Research. Los Angeles, London, New Delhi, Singapore, Washington DC, Melbourne: SAGE, 2018.
34. Abdullah F, Ward R. Developing a General Extended Technology Acceptance Model for E-Learning (GETAMEL) by analysing commonly used external factors. Comput Human Behav 2016;56:238–56. doi:10.1016/j.chb.2015.11.036
35. Abdullah F, Ward R, Ahmed E. Investigating the influence of the most commonly used external variables of TAM on students' Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) of e-portfolios. Comput Human Behav 2016;63:75–90. doi:10.1016/j.chb.2016.05.014
36. Carless D. Trust, distrust and their impact on assessment reform. Assess Eval High Educ 2009;34:79–89. doi:10.1080/02602930801895786
37. Boehler ML, Rogers DA, Schwind CJ, et al. An investigation of medical student reactions to feedback: a randomised controlled trial. Med Educ 2006;40:746–9. doi:10.1111/j.1365-2929.2006.02503.x
38. Eva KW, Armson H, Holmboe E, et al. Factors influencing responsiveness to feedback: on the interplay between fear, confidence, and reasoning processes. Adv Health Sci Educ Theory Pract 2012;17:15–26. doi:10.1007/s10459-011-9290-7
39. Kluger AN, Van Dijk D. Feedback, the various tasks of the doctor, and the feedforward alternative. Med Educ 2010;44:1166–74. doi:10.1111/j.1365-2923.2010.03849.x
40. Watling C, Driessen E, van der Vleuten CP, et al. Understanding responses to feedback: the potential and limitations of regulatory focus theory. Med Educ 2012;46:593–603. doi:10.1111/j.1365-2923.2012.04209.x
41. Telio S, Ajjawi R, Regehr G. The "educational alliance" as a framework for reconceptualizing feedback in medical education. Acad Med 2015;90:609–14. doi:10.1097/ACM.0000000000000560
42. Telio S, Regehr G, Ajjawi R. Feedback and the educational alliance: examining credibility judgements and their consequences. Med Educ 2016;50:933–42. doi:10.1111/medu.13063
43. Ohly H, Crossland N, Dykes F, et al. A realist review to explore how low-income pregnant women use food vouchers from the UK's Healthy Start programme. BMJ Open 2017;7:e013731. doi:10.1136/bmjopen-2016-013731
44. Pearson M, Chilton R, Wyatt K, et al. Implementing health promotion programmes in schools: a realist systematic review of research and experience in the United Kingdom. Implement Sci 2015;10:1–20. doi:10.1186/s13012-015-0338-6
45. Brennan N, Bryce M, Pearson M, et al. Towards an understanding of how appraisal of doctors produces its effects: a realist review. Med Educ 2017;51:1002–13. doi:10.1111/medu.13348
46. Brennan N, Bryce M, Pearson M, et al. Understanding how appraisal of doctors produces its effects: a realist review protocol. BMJ Open 2014;4:e005466. doi:10.1136/bmjopen-2014-005466
47. Wong G. Special invited editorial: getting started with realist research. Int J Qual Methods 2015:1–2.