GMS Zeitschrift für Medizinische Ausbildung. 2010 Apr 22;27(2):Doc34. doi: 10.3205/zma000671

Research in medical education: practical impact on medical training and future challenges

Diana H J M Dolmans 1,*, Cees P M van der Vleuten 2,3,*
PMCID: PMC3140348  PMID: 21818203

Abstract

Medical education research has changed over the years from merely descriptive studies towards justification or curriculum comparison studies and, nowadays, towards a slow introduction of more clarification studies. In clarification studies quantitative and qualitative methods are used to explain why or how educational interventions work or do not work. This shift is described in this paper. In addition, it is explained how research into workplace learning and assessment has impacted developments in educational practice. Finally, it is argued that the participation of teachers within the medical domain in conducting and disseminating research should be cherished, because they play a crucial role in ensuring that medical education research is applied in educational practice.

Introduction

Worldwide, medical education research has grown enormously over the last twenty years. There have been huge increases in the number of scientific journals and the number of issues published per journal, in the number of participants at national and international conferences on medical education, and in the number of people pursuing careers as medical education researchers [1]. But, apart from this growth, which developments have we seen in medical education research? Has medical education research had a positive impact on medical training? What future challenges will medical education research have to meet in order to further enhance evidence-based innovation in our medical training programmes? These questions are addressed below.

Changes in medical education research

Within the field of medical education research, there has been a shift in the type of studies conducted: from merely descriptive studies, which describe the kinds of innovations implemented in practice, towards justification studies. Justification studies often focus on comparisons of curricula, e.g. does a traditional curriculum result in different outcomes compared to an innovative curriculum [2]? Slowly, more clarification studies are being reported, investigating how different variables influence each other and paying attention not only to outcomes but also to the underlying processes that could explain why and how an intervention does or does not work.

There has been much debate in the literature about justification or curriculum comparison studies. To (bio)medically trained researchers, controlled experimentation is the hallmark of good research. But controlling the circumstances of an educational intervention is very hard and often impossible, and trying to do so may actually lead to a rather reductionist and trivial exercise [3]. We do not argue that controlled experiments should never be done; whether they are appropriate depends on the research question. We are currently involved in testing the hypothesis that elaboration in a group leads to better knowledge retention [4]. The randomized experimental and control groups are completely standardized (through the use of video) except for an elaboration intervention, and the experiment is conducted in a laboratory situation. Naturally, the price paid is ecological validity and generalizability to authentic contexts.

Currently, more clarification studies are being reported in the literature. This shift is highly valuable. Education is a complex domain in which many different variables interact with each other, such as the student, the teacher, the learning materials and the assessment. Because of this complexity, it is not easy to conduct research in this area [5], [6]. The complex interactions between different variables make it difficult to compare curricula and to detect the real cause of better outcomes [7], [8]. Clarification studies try to unravel the processes underlying the observed effects and address the question ‘Why or how did it work?’ [2]. These studies are highly valuable because they clarify what works under which circumstances. Both quantitative and qualitative methods can be used to conduct clarification studies. Nowadays, an increasing number of qualitative studies are being published relative to quantitative studies. Many qualitative studies focus on answering the questions why (explanation) and how, leading to a deeper understanding of differing perspectives [9]. Not only qualitative studies but also design-based studies are on the increase. In design-based studies, an educational design is developed on the basis of current theoretical insights and evaluated by multiple methods, with the dual goal of refining theory and improving practice [10]. Design-based studies are often conducted in real-life settings in which multiple aspects and interactions are evaluated and in which researchers and practitioners interact closely with each other [10]. The ecological validity of this research avenue is high, but the proof will be of quite a different nature than we are used to in the conventional RCT approach. An example of this type of research from our own experience is the development of teaching portfolios to stimulate the professional development of teachers [11], [12]. Constructivist theories of learning emphasize that learners actively construct their own knowledge by interpreting events and information based on what they already know. From this perspective, the professional development of teachers can be encouraged by stimulating them to reflect critically on their teaching practice, e.g. by means of a teaching portfolio: an authentic assessment tool that combines different instruments to measure different competencies and in which feedback plays a crucial role. Modern theories of assessment, teachers’ professional development and teaching portfolios were used to develop a teaching portfolio prototype [13].

In summary, the field of medical education research has not only grown rapidly, it has also changed over the past years. More and more studies are reported that deepen our understanding of how and why education works. Mixed methods and mixed research avenues that complement each other are needed, inspired by theoretical notions that illuminate in some way how and why things work in educational practice.

Impact of research on educational practice

The ultimate question is whether and how medical education research changes educational practice. Before answering this question, it is important to keep in mind that the relationship between research and practice is not always straightforward. Research often leads to contradictory findings, findings that merely confirm the obvious, or findings that are highly context-specific, and this can make it difficult to apply research findings to medical training programmes. Despite these difficulties, medical education research has definitely contributed to improvements in training programmes over the years. Workplace learning and assessment are described below as two examples illustrating the relationship between medical education research and educational practice.

Workplace learning

Workplace learning is considered by medical experts to be the optimal way of learning a profession, and it has long played a dominant role in medical curricula. In many traditional curricula, students start with theoretical courses during the first years of the training programme and later move on to clinical training in different hospital disciplines, during which they apply what they have learnt in their theoretical training under the guidance of experts. The workplace is potentially a very rich learning environment, offering students many possibilities to interact with patients and medical experts and to participate in clinical practice [14], [15]. Although workplace learning offers many opportunities for student learning, research has demonstrated that students also experience difficulties [16], [17]; in particular, students experienced difficulties when they had to apply in practice what they had learnt during their theoretical courses. In order to diminish the gap between theory and practice and to create a more gradual transition from school-based learning to workplace learning, workplace learning is nowadays introduced earlier in many medical curricula.

Research has also demonstrated that there are considerable variations between students in the skills they perform and the patients they encounter during workplace learning [18]. Learning in the workplace takes place rather haphazardly, depending on the patients or problems presenting in daily practice. Another major problem, reported in several studies, is that students often receive only limited supervision and feedback [19], [20], [21]. This is a serious problem, since it is known from the literature that direct supervision in the workplace is key to effective student learning [22], and the quality of supervision has been demonstrated to have a direct impact on students’ clinical competencies [23]. Insights into these shortcomings of workplace learning have led to the development of several interventions to optimize student learning, such as in-training assessment to provide learners with more feedback, the structuring of workplace learning experiences, and the deepening of the reflective component of learning based on (rich) information from others. In addition, there has been increased awareness of the importance of training clinical staff members and providing them with new knowledge and skills for effective teaching and learning in the workplace [24]. Compared to a few years ago, much more time is now devoted to faculty development activities during which faculty learn about effective workplace learning and the tools they can use to optimize it.

The attention given in the literature to the problems of workplace learning has also led to the development and implementation of instruments for evaluating the quality of the clinical learning environment [25] and the performance of clinical supervisors [26]. More recently, concerns about the quality of student learning in the workplace have led to the implementation of longitudinal attachments in undergraduate medical training programmes to increase student continuity with patients and supervisors [27]. And it is not only undergraduate medical training programmes that have changed over the years; postgraduate medical education has also seen rapid changes since 2000 [28].

In sum, research within the domain of workplace learning has contributed to various initiatives aimed at optimizing workplace learning. Clearly, medical education research can lead to changes in educational practice. But it is also important to keep in mind that it is not easy to implement research findings in daily practice. For example, although high-quality supervision is known to be the key factor in the success of workplace learning, it remains difficult to persuade clinical staff to spend more time supervising students, because of the competing values and responsibilities of patient care, research and education [29]. Improving education requires not only the introduction of new tools in educational practice, but also a cultural change and the commitment and involvement of all participants in the workplace, and this requires long-term effort.

Assessment

The area of student assessment is definitely one that is led by research. We will present a few instances and refer the reader to other literature for the broader developments [30], [31], [32]. In the sixties it was found that performance on one assessment exercise (item, station, oral, patient encounter, etc.) was hardly predictive of performance on another exercise. This phenomenon has been termed the ‘content specificity’ problem of clinical competence and was later found to occur with virtually all assessment methods, regardless of what was being measured. It resonated with findings in cognitive psychology and stimulated a great deal of cognitive expertise research (which in itself had quite some impact on educational practice). The impact on educational practice was that short, single-shot assessments (e.g. the long case) were abandoned and that efficient sampling strategies across content were introduced into every method of assessment. It was also found that contextualizing assessment by presenting authentic tasks did not require extensive, complex and resource-intensive simulations, but could be achieved with short scenarios or vignettes, and that the stimulus format, the task presented to the assessee, was more important than the response format (open, closed, oral, performance-based, etc.). This has had a tremendous impact on assessment strategies all over the world. For example, licensing examinations across the world have completely changed their practice of written assessment: written test items have been turned into small but authentic simulations of professional tasks, requiring higher cognitive abilities and the application of knowledge. Later, this was followed by performance-based assessment strategies using the same approach: efficient, frequent and authentic sampling across a number of clinical encounters using multiple assessors. There are probably very few medical schools around the world that do not use the Objective Structured Clinical Examination (OSCE) in one way or another. It is a very clear example of how educational practice is influenced by research. In the meantime, research has considerably professionalized the OSCE approach (scoring, standard setting, role playing, equating, etcetera); a whole ‘OSCE-ology’ has emerged from it.
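The practical force of sampling across content can be made explicit with the classical Spearman-Brown formula, a standard psychometric result that we add here purely for illustration (it is not taken from the studies cited above). If a single exercise has reliability $r_1$, the reliability of a score averaged over $n$ independently sampled exercises is

$$r_n = \frac{n\,r_1}{1 + (n-1)\,r_1}.$$

With an illustrative single-exercise reliability of $r_1 = 0.2$, of the order suggested by the content-specificity findings, sampling ten exercises already yields $r_{10} = (10 \times 0.2)/(1 + 9 \times 0.2) \approx 0.71$.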

A more recent and interesting insight is that objectification is not a necessary goal of assessment and is sometimes not even a desirable one. Subjective measures can be reliable and objective ones unreliable, all depending on how the sampling is performed. The key is sampling across the elements that influence the measurement, not standardizing, structuring or objectifying the measurement itself. This is a tremendous insight with dazzling practical implications. The OSCE was invented as a reaction to subjective clinical examinations; it was therefore called ‘objective’ and ‘structured’. However, reliability and validity depend far more on how sampling is done across content, patients and examiners than on how structured or objective the measurement itself is. This insight is the basis for moving back to the unstandardized, ‘noisy’ but authentic clinical context and for conducting appropriate sampling there. All work-based assessment as it is currently developing is based on these premises [33].
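To make this sampling argument concrete, the short simulation below (our illustration; the variance components are assumed values, not empirical estimates) models ‘subjective’ encounter scores as true student ability plus case and examiner noise, and shows how the reliability of the averaged score grows with the number of independently sampled encounters, in line with the Spearman-Brown formula given earlier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, purely illustrative variance components:
VAR_ABILITY = 1.0  # true differences between students (the signal)
VAR_NOISE = 3.5    # case-to-case plus examiner-to-examiner variation (the noise)

def simulated_reliability(n_encounters, n_students=5000):
    """Estimate reliability as the correlation between two independently
    sampled 'test forms', each averaging n_encounters subjective scores
    for the same students (fresh cases and examiners in every encounter)."""
    ability = rng.normal(0.0, np.sqrt(VAR_ABILITY), size=n_students)

    def form_score():
        noise = rng.normal(0.0, np.sqrt(VAR_NOISE),
                           size=(n_students, n_encounters))
        return ability + noise.mean(axis=1)

    return np.corrcoef(form_score(), form_score())[0, 1]

r1 = VAR_ABILITY / (VAR_ABILITY + VAR_NOISE)  # single-encounter reliability
for n in (1, 4, 10, 20):
    predicted = n * r1 / (1 + (n - 1) * r1)  # Spearman-Brown prediction
    print(f"{n:2d} encounters: simulated r = {simulated_reliability(n):.2f}, "
          f"predicted r = {predicted:.2f}")
```

Under these assumptions a single unstandardized encounter is indeed unreliable (r ≈ 0.22), but averaging over ten independently sampled encounters pushes reliability above 0.7 without any standardization of the measurement itself.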

In all, assessment provides an excellent example of how research is able to impact educational practice.

Future directions for medical education research

The professionals involved in medical education research are growing not only in number but also in the diversity of their scientific backgrounds. At the same time, medical education research is accused of lacking scientific rigour or of being of insufficient quality [34]. According to some leaders in the field, progress in medical education research has been too slow. They argue that many of the studies reported in the journals have been done before, lack a theoretical background or fail to test theories [34], [35]. Furthermore, there is a lack of understanding of social science research and qualitative methodologies, probably due to the dominance of the biomedical model [34], [35]. These factors hinder the growth of the body of knowledge in the field of medical education research.

Of course, the quality of research should be raised by conducting studies that test theories [35] and by conducting more rigorous qualitative and mixed-methods studies. Theories also need to be used: they give researchers different ‘lenses’ through which to look at complicated problems and social issues, broaden our understanding of situations and can be applied in practice [36]. And of course, medical education research should lead to the creation of new knowledge for academics [37] and contribute to our understanding of the problems encountered in education [1].

But there is one fundamental aspect of research in medical education that is quite unique and that holds promise for research impacting educational practice: the participation of medical teachers, the practitioners of medical education, in conducting and disseminating the research. In general education, there is much discussion about the gap that separates educational research from educational practice [38]. General education research is accused of being too theory-oriented and of failing to address the problems of educational practice; on top of that, its users, the teachers, are disengaged from participating in the research. We daresay that this does not apply to medical education and that in fact the opposite is true. It is characteristic, and unique, that teachers within the medical domain participate in conducting and disseminating the research. No other domain has so many international journals dedicated to education (we counted 15 but lost track), some of which are specifically dedicated to the translation of research into educational practice (i.e. Medical Teacher, The Clinical Teacher). The international meetings on medical education have grown to huge numbers of attendees. In part, the explanation for this lies in the mix of what is offered at these meetings (workshops, symposia, hands-on experiences, practical experiences and research). This thriving community of education specialists and representatives from the domain itself is, we believe, the agent of the impact of research on educational practice. The community is slowly but clearly professionalizing in terms of educational research standards and the use of theory. It is crucial, however, that we professionalize at the right pace. We need to strike a careful balance between research that has practical relevance and research that is of high scientific quality and clarifies what works well under which conditions and why. We should never risk becoming disengaged from the medical teacher or anyone else with a direct responsibility in educational practice [39]. We believe in this participative community in medical education and are determined to continue to cherish it. The impact will follow almost automatically.

The authors

  1. Diana H.J.M. Dolmans, PhD, is an educational scientist and associate professor in the Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.

  2. Cees P.M. van der Vleuten, PhD, is Professor of Education, Chair of the Department of Educational Development and Research, and Scientific Director of the School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.

References

