Med Teach. 2013 Jun 28;35(11):e1598–e1607. doi: 10.3109/0142159X.2013.803061

The educational impact of assessment: A comparison of DOPS and MCQs

Kate A Cobb 1,3,*, George Brown 1,4, Debbie A D C Jaarsma 2,5, Richard A Hammond 1,6
PMCID: PMC3809925  PMID: 23808609

Abstract

Aim

To evaluate the impact of two different assessment formats on the approaches to learning of final year veterinary students. The relationship between approach to learning and examination performance was also investigated.

Method

An 18-item version of the Study Process Questionnaire (SPQ) was sent to 87 final year students. Each student responded to the questionnaire with regards to DOPS (Direct Observation of Procedural Skills) and a Multiple Choice Examination (MCQ). Semi-structured interviews were conducted with 16 of the respondents to gain a deeper insight into the students’ perception of assessment.

Results

Students adopted a deeper approach to learning for DOPS and a more surface approach for MCQs. There was a positive correlation between an achieving approach to learning and examination performance. Analysis of the qualitative data revealed that deep, surface and achieving approaches were reported by the students, and seven major influences on their approaches to learning were identified: motivation, purpose, consequence, acceptability, feedback, time pressure and individual differences between students.

Conclusions

The format of DOPS has a positive influence on approaches to learning. There is a conflict for students between preparing for final examinations and preparing for clinical practice.

Introduction

The educational impact of assessment has long been an important topic in medical and other forms of higher education (Crooks & Mahalski 1985; Ramsden 1992; McManus et al. 1998; McLachlan 2006; Cilliers et al. 2010; Al-Kadri et al. 2012). Assessment has been shown both to have a positive effect on student learning and to foster less desirable learning strategies (Scouller 1998; Leung et al. 2008; Donnon & Hecker 2010). Many of these studies used the Study Process Questionnaire (SPQ) (Biggs & Australian Council for Educational Research 1987b) to measure educational impact. This inventory provides a measure of approach, motivation and strategy (Table 1). It has been used to compare two forms of written assessment, Multiple Choice Questions and assignments (Scouller 1998), to compare learning styles and examination performance (McManus et al. 1998), and as a basis for discussing various approaches to learning and assessment (Brown & Gibbs 2003; Gibbs & Simpson 2004). So far it has not been used to compare the educational impacts of workplace-based assessment (WPBA) and Multiple Choice Questions (MCQs).

Table 1. Summary of the differences in motivation and learning strategy for the deep, surface and achieving approaches to learning (Biggs & Australian Council for Educational Research 1987a)

Approach | Motivation | Strategy
Deep | Interest in the subject, resulting in intrinsic motivation to learn. | Develop understanding of the subject, linking ideas and concepts.
Surface | Fear of failure; motivated to complete the minimum amount of work to pass. | Reproduction for high-stakes assessment; involves rote learning.
Achieving | To achieve the highest grades possible; motivated by competition with peers. | Organisation of work for academic success; may involve elements of both deep and surface strategies.

WPBA may be conducted by a variety of methods, from portfolios to direct observation. At the University of Nottingham School of Veterinary Medicine and Science (SVMS), Directly Observed Procedural Skills (DOPS) (Norcini & Burch 2007) are used as a form of WPBA to examine the performance of practical and clinical skills of each final year student. These DOPS help students to identify their areas of weakness and to improve performance, and thereby encourage a deep approach to learning (Nicol & Macfarlane-Dick 2006; Swanwick 2010). In this context the DOPS provide an opportunity for assessment for learning (AFL) (Black & Wiliam 1999; Schuwirth & van der Vleuten 2011; Schuwirth et al. 2011).

Practice points
  • DOPS encourages a deep approach to learning in a clinical context.

  • MCQs encourage a surface approach to learning in final examinations.

  • Time pressure increases student stress levels and encourages a surface approach to learning, particularly close to assessment points and as opportunities for DOPS become scarcer.

  • Students used achieving strategies when they perceived them as essential for success in both MCQs and DOPS.

  • Assessment of and for learning should be used to encourage students to develop a deep approach to learning which is more closely aligned with the requirements of clinical practice.

In contrast, end-of-course final examinations based on MCQs are used for assessment of learning (AOL). No feedback is provided to students and the purpose of the assessment is purely summative. There is evidence that MCQs foster surface approaches to learning (Scouller 1998). However, if carefully designed and appropriate standards are set, they are useful for assessing core knowledge (Brown et al. 1997).

As indicated, the scores on the SPQ were used as a measure of the likely educational impact of the DOPS and the MCQs. The SPQ provides measures of deep, surface and achieving approaches to learning (Biggs & Australian Council for Educational Research 1987a, 1987b). Each approach is strongly associated with a form of motivation and a learning strategy (Table 1). Students who adopt a deep approach are intrinsically motivated by their subject and apply learning strategies which enable them to increase their understanding. The surface approach is adopted by students whose motivation is to complete the course without failure; it is based on memorising the knowledge and concepts on which the students are most likely to be examined. An achieving approach is adopted by students with a competitive nature; they utilise a highly organised strategy to achieve the highest marks. They may include elements of the deep and surface approaches in order to achieve, but lack the intrinsic motivation, and sometimes the understanding, demonstrated by deep learners.

These approaches to learning are not fixed entities. Different learning environments (Biggs & Australian Council for Educational Research 1987a; Kember et al. 1997; McManus et al. 1998), assessment formats (Newble & Jaeger 1983; Tang 1994; Scouller 1998; Leung et al. 2008; Cilliers et al. 2011) and year of study (Biggs & Australian Council for Educational Research 1987a; Donnon & Hecker 2010) can influence the approach adopted by students.

For example, Donnon and Hecker (2010) describe a shift from a deep to a surface approach to learning in Health Science students during their final year of study. Cilliers et al. (2010, 2011) report that postgraduate medical students stated that they valued assessment for clinical practice highly. They described the tensions they experienced between studying to pass examinations and studying to become a good clinician. As they progressed towards student internship, concern with patient care became a more prominent factor in their learning, whereas earlier in the course they were prepared to sacrifice their vocationally motivated learning in order to reduce stress levels and pass examinations.

This study focussed on the effects of DOPS and an assessment based on MCQs upon approaches to learning of final year students. A mixed-method approach was used; impact was measured quantitatively using the SPQ and its well-tested theoretical model (Biggs & Australian Council for Educational Research 1987a; Biggs & Australian Council for Educational Research 1987b). The relationship between approach to learning and examination performance was also investigated by correlating approach scores with MCQ score. To understand these relationships in greater depth, the views reported by students of their approaches to learning for the two different assessment formats were explored during interviews and compared with the quantitative analyses. It was hypothesised that the final year students in this sample would adopt different learning approaches for the two different assessment formats.

Methods

Students and context

The participants of the study were final year students of the School of Veterinary Medicine and Science (SVMS) at the University of Nottingham. Students follow a five-year curriculum and during their fifth year complete 26 weeks of clinical practice in two-week rotation blocks. The SVMS has adopted a dispersive model whereby students complete rotations based in private veterinary hospitals; each rotation covers one of three species areas: farm animal, equine or small animal. During the year, students are required to pass 10 DOPS assessments across the range of species and from different skill areas. There are a total of 48 potential skills that may be assessed as a DOPS, based on the practical competencies required of a newly qualified veterinary surgeon. These skills are categorised into 10 skill areas, for example diagnostic imaging, anaesthesia and physical examination. Each student is randomly assigned one DOPS from each skill area, to be completed in a particular species during their final year. No marks are allocated to the DOPS; the clinician makes a decision on the student's performance and assigns a category of ‘excellent’, ‘competent’ or ‘needs further development’. Students are given verbal and written feedback on each assessment and must be passed as ‘competent’ in all 10 DOPS to be eligible to sit the MCQ examination at the end of the final year. As indicated, this final examination is composed of MCQs of different formats, including ‘single best answer’ questions with 4 or 5 distractors and extended matching questions. All questions are linked to final year learning outcomes, are based on clinical case scenarios and assess clinical knowledge and its applications. These MCQs are delivered and marked online within a week-long examination period. Students who pass this examination graduate as veterinary surgeons and gain MRCVS status (Membership of the Royal College of Veterinary Surgeons). Examples of both DOPS and MCQs are provided in the Appendices.
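
The allocation scheme described above is, in effect, a stratified random draw: one task per skill area, each paired with a species. A minimal sketch of such an allocation is given below; the skill areas, task pools and species pairing are illustrative placeholders, not the school's actual DOPS catalogue or procedure.

```python
import random

# Illustrative task pools: the real scheme has 48 skills across 10 skill
# areas; only three areas with hypothetical tasks are sketched here.
SKILL_AREAS = {
    "diagnostic imaging": ["thoracic radiograph", "abdominal ultrasound"],
    "anaesthesia": ["induction of anaesthesia", "anaesthetic monitoring"],
    "physical examination": ["lameness examination", "cardiac auscultation"],
}
SPECIES = ["farm animal", "equine", "small animal"]

def assign_dops(student_id: str, rng: random.Random) -> list[dict]:
    """Assign one randomly chosen DOPS task (and a species) per skill area."""
    return [
        {"student": student_id,
         "skill_area": area,
         "task": rng.choice(tasks),
         "species": rng.choice(SPECIES)}
        for area, tasks in SKILL_AREAS.items()
    ]

rng = random.Random(2013)  # seeded so the draw is reproducible
for allocation in assign_dops("student_001", rng):
    print(allocation)
```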

Data collection

A mixed methods approach was used. An online survey was sent to all 87 students in the final year of the veterinary medicine course. The survey used the shortened, 18-item version of the SPQ (Biggs & Australian Council for Educational Research 1987b), previously validated in a study of over 1300 medical students across five different universities within the UK (Fox et al. 2001). The questionnaire was piloted to check meaning in the context of a course in veterinary medicine. Consistent with a comparable study (Scouller 1998), participants answered each item twice: once considering the MCQ examination and once considering DOPS. The survey also contained questions regarding career preferences and some optional free-text response boxes for student comment. The responses were collected using SurveyMonkey™ (http://www.surveymonkey.com).
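
Scoring the shortened SPQ reduces to summing each respondent's Likert ratings over the items belonging to each scale. The sketch below assumes the common layout of six 5-point items per approach scale, which is consistent with the score ranges reported later; the item-to-scale mapping is a hypothetical placeholder, not the published scoring key of Fox et al. (2001).

```python
# Hypothetical item-to-scale mapping for an 18-item SPQ; the real key
# assigns specific questionnaire items to each approach scale.
SCALE_ITEMS = {
    "surface":   [0, 3, 6, 9, 12, 15],
    "deep":      [1, 4, 7, 10, 13, 16],
    "achieving": [2, 5, 8, 11, 14, 17],
}

def score_spq(responses: list[int]) -> dict[str, int]:
    """Sum the six 1-5 Likert responses that belong to each scale."""
    assert len(responses) == 18, "expected one rating per item"
    return {scale: sum(responses[i] for i in items)
            for scale, items in SCALE_ITEMS.items()}

# Each student answers twice: once for the MCQ examination, once for DOPS.
print(score_spq([4, 5, 3, 2, 4, 3, 3, 5, 3, 2, 4, 3, 3, 4, 3, 2, 5, 3]))
```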

Students were also asked whether they would be willing to participate in an interview; 34 students agreed, of whom 19 were invited to interview and 16 attended. Students were selected to reflect the gender balance and rotation groups within the year. Face-to-face, semi-structured interviews were conducted to explore in depth the students’ perceptions of the two different formats of assessment and their impact on approaches to learning. Each interview was recorded with a digital voice recorder and transcribed verbatim. The interviews typically lasted around 40 minutes. All the qualitative data and SPQ scores were collected before the results of the final examination were published.

This study was approved by the SVMS ethical review panel and conducted in accordance with the guidance outlined in the ‘Revised Ethical Guidelines for Educational Research (2011)’ by the British Educational Research Association (BERA 2011).

Data analysis

The results of the SPQ were analysed using SPSS version 17. The Wilcoxon signed ranks test was used to compare approach to learning scores between DOPS and the MCQ examination. A Spearman's rho correlation coefficient was calculated to determine the relationship between approach to learning score and academic performance in the MCQ examination. Internal reliability coefficients (Cronbach's alpha) were calculated for the 18-item SPQ.
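
For readers who prefer open tooling, the same three analyses can be reproduced outside SPSS with SciPy. The sketch below runs the tests on synthetic data drawn to resemble the study's scale; the numbers are invented, and only the procedures mirror those reported.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 64  # number of named respondents in the study

# Synthetic paired approach scores and examination marks (invented data).
deep_dops = rng.normal(19.4, 4.1, n)
deep_mcq = rng.normal(18.0, 3.7, n)
achieving = rng.normal(15.3, 4.1, n)
mcq_mark = 3 * achieving + rng.normal(0, 15, n)  # weak positive link

# Wilcoxon signed ranks test comparing paired DOPS and MCQ scores.
wilcoxon = stats.wilcoxon(deep_dops, deep_mcq)
print(f"Wilcoxon: W = {wilcoxon.statistic:.1f}, p = {wilcoxon.pvalue:.3f}")

# Spearman's rho between achieving approach and examination performance.
rho, p = stats.spearmanr(achieving, mcq_mark)
print(f"Spearman: rho = {rho:.2f}, p = {p:.3f}")

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Alpha on random responses hovers near zero; coherent scales score higher.
print(f"alpha = {cronbach_alpha(rng.integers(1, 6, size=(n, 18))):.2f}")
```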

The qualitative data were analysed using thematic analysis. A deductive approach was used to identify pre-determined themes based on student approaches to learning, as described by Biggs (Biggs & Australian Council for Educational Research 1987a). Inductive analysis was also used to identify initial codes. Collaborative coding of three transcripts by a second researcher resulted in an iterative review process until the code structure was agreed and then applied to the remaining data set (Saldaña 2009). The pre-determined and initial codes were then grouped into over-arching themes (Braun & Clarke 2006).

Results

Seventy of the 87 students completed the 18-item SPQ with respect to both MCQ and DOPS, a response rate of 80.5%. Of those, six students chose to remain anonymous and were therefore not included in the comparison of study approach and examination performance.

Study approach scores for DOPS and MCQ examinations

There were significant differences between deep and surface approaches to learning for the DOPS and the MCQ formats but the achieving approach did not differ significantly between the two formats (Table 2). There was, however, a significant positive relationship between the achieving approach and performance in the MCQ examination (ρ = 0.31, p < 0.05). No other significant correlations were identified between approach to learning and examination performance. The internal reliability (Cronbach's alpha) for all 18 items of the shortened SPQ was 0.64 for MCQ and 0.69 for DOPS.

Table 2. Mean study approach scores for DOPS compared with MCQ; standard deviations in parentheses

Approach | DOPS | MCQ | Wilcoxon signed ranks test (z) | p value
Surface approach | 14.5 (2.8) | 16.2 (3.1) | −5.048 | <0.001
Deep approach | 19.4 (4.1) | 18.0 (3.7) | −3.299 | 0.001
Achieving approach | 15.1 (4.3) | 15.3 (4.1) | −0.830 | 0.406

Qualitative analysis of student interview data

Two related overarching themes were identified from the data analysis: the effects of MCQ and DOPS on approaches to learning, and other factors that influenced learning behaviour. Quotations from students are referred to as Q1, Q2, etc. Students are referred to by gender and number, for example M1, F2.

Theme 1: The effects of MCQ and DOPS on approaches to study

Within this theme, deep, surface and achieving learning strategies were identified from the discussions with students. They are presented here in association with the two assessment formats. DOPS was thought to encourage a deep learning strategy.

In Q1 the student describes her approach to the DOPS assessment:

Q1: It's not so much sitting down with a book and learning it from scratch, but I think for most people it's trying to relate everything you see when you see practice on rotation, to what you know and build on it and go and look up what you are not sure about. [F2]

DOPS encouraged the search for deeper understanding (Q2):

Q2: I had equine anaesthesia DOPS, because I felt the need to go and read everything there was about equine anaesthesia to make sure I was going to get that right. [F7]

The provision of a list of tasks for the DOPS fostered an increased breadth of study. Rather than learning only for the assessment task, some students described using the list as a set of objectives to achieve before graduation (Q3):

Q3: but at the same time, because they’re very defined tasks, you have to make sure you include the whole group of DOPS and not just the one you’re assessed on. [F2]

The MCQ format was more commonly associated with surface strategies. Students were aware that this is not best practice for constructive learning that would aid their development as practitioners. However, they described adopting surface strategies in preparation for the MCQ examination, as this was deemed necessary to be successful (Q4 and Q5):

Q4: I hope I retain the key things, but a lot of the little detail, no I’ll forget very quickly because of the way you have to revise for the exam, you’ve got two weeks right before them to try and cram it all in so that you can click the right box. [F3]

Q5: I thought I’m not sure if this is benefiting me cos I’m shoving so much into my brain each day that actually I think it's just pushing stuff out, whereas if it had been maybe a bit more spread out or a bit less, I would have actually benefited from it more. [F4]

For some students this created a conflict between the way in which they wanted to study and the strategies they felt they needed to adopt to be successful; this was sometimes perceived as unfair, as described by the student in Q6:

Q6: I think at the moment it (MCQ examination) sort of biases towards people who can absorb facts, absorb facts, absorb facts, and then spew it out for a week of assessment, rather than sort of testing the more rounded sort of characteristics of an individual and a sort of deeper understanding of the material. [M4]

In contrast with the MCQ, DOPS was rarely associated with a surface approach (Q1), although a surface strategy was occasionally used to prepare for the DOPS assessments when there was a lack of opportunity to complete the assessment tasks in the time available in clinics. In Q7 the student describes how DOPS drove her to adopt a surface approach due to lack of time and opportunity. The quotation also reveals that DOPS can be perceived by students as a ‘high stakes assessment’.

Q7: I would just go and cram for it and just try and get any exposure to that skill until I did the DOPS. It was sheer panic. I can’t describe how scared we were that we weren’t going to get them done. That's the only thing we thought about. On a Monday morning when you started rotations, am I going to get a DOPS, am I going to get to do it. That really drove us. But then having said that, the last two or three rotations, cos we’d finished, we got them all done by end of March, we actually really relaxed and we had more time to sit and learn about the cases we’d seen and chat about the cases. [F11]

Success in examinations is, obviously, important to students, and in preparation for both the MCQ and DOPS they described changing their learning approach to maximise success. However, the ways in which these behaviours manifested differed between the formats. For the MCQ examination, students reported an increase in surface strategies (Q5) and a decrease in deep learning strategies (Q8):

Q8: I always start my revision as I should mean to go on, which is sort of going through things in-depth and trying to understand them. Inevitably I run out of time and have to resort to flicking through lectures and skim-reading things. Often I have found that that is a terrible policy I know, and it won’t serve me in the long-term, it's got me a lot of extra marks because I’ll recognise a picture from a lecture on an exam and it’ll just be in my extremely short-term memory. And I know that that can work for me here. Obviously I’m going to do that before an exam cos I know it might get me the marks, but I don’t feel happy that that's the way I’m learning. [F10]

The influence of DOPS on the student approach to learning was sometimes closer to the achieving approach than to a deep approach (Q9).

Q9: I’ll be honest, I did tactically pick certain DOPS so they could only fall on certain rotations. And I tactically picked the easier DOPS out of different skill areas, cos that's just sensible. You don’t do a bitch spay if you can do an FNA (fine needle aspirate) do you. [M5]

The students in this study were required to pass the MCQ examination in each of three different species areas. Their preference for working within one particular field often led to an achieving strategy (Q10):

Q10: cos I’m interested in small animals, whereas with equine and farm, sometimes I might not be quite aware of all the important things. So I’d ask other people about it, just sort of discuss with each other what the important diseases were, and really focus on those and make sure I have a good understanding of those, and then everything else comes as sort of a bonus. [F4]

For some students, DOPS can provide too much of a focus and appears to limit experiential learning. Students occasionally described the staff as concentrating on assessing DOPS at the expense of clinical teaching. In Q11, the student describes her experience of being assessed on collecting and analysing a urine sample:

Q11: So you tore yourself away from something interesting to go get a urine sample cos you wanted to practise cos you really wanted to pass. Yet there's going to be a hundred chances to get a urine sample, but they might be doing something really interesting over there. It was a bit of a hard dilemma cos you felt like it shouldn’t be the focus, but yet at the back of your mind you think I’ve got to pass this so I need to practise it. [F9]

Theme 2: Reported influences on approaches to learning

Analyses of the transcripts revealed underlying influences on the participants’ approaches to learning. A theoretical model of the relationship between the perceived format of the assessment and its influences on approaches to learning emerged from the data analysis (Figure 1).

Figure 1. A model of influences on approaches to learning.

Motivation

Almost all the students in this study demonstrated a deep motivation to learn, often expressed in terms of wanting to become a ‘good vet’, doing the best they can for their clients and ensuring the welfare of animals in their care. Some also described the competitive element associated with the final examination (Q12):

Q12: it gives you feedback about your own performance and your own understanding, knowledge and whereabouts you are, especially whereabouts you are in the year. I think that's quite important cos we’re quite a competitive year. [F11]

Achieving motivation extended beyond wanting to become a ‘good vet’. For some students, motivation to learn came from personal gain and the satisfaction of high attainment (Q13):

Q13: I don’t think I’ve ever failed an exam and I’ve always wanted just to get the best out of what I do. I think I’ve always done well. To then not do well is just sort of self-failure [laughs]. Just a personal thing. [F7]

Some described deep, intrinsic motivation to learn: wanting to learn for their own satisfaction and a ‘love of learning’ (Q14):

Q14: learning is something I fully enjoy and I learn extra languages in my spare time just because I find it fulfilling. So it's of course a big part of the profession and, you know, you need to keep that going, but just for myself, I don’t want to feel like I’m at a standstill somehow. I just like to challenge myself and keep moving forward. So I suppose it's important as a vet, but for my point of view it's probably even more important as a person. [F2]

However, some students reported that ‘fear of failure’, a surface motivator, had a strong influence (Q7 and Q15):

Q15: We were saying earlier that the things we’ve been learning in fifth year are good for going into practice. Well I was learning them so I’ve got them in practice, but I was also learning them because I was scared of failing, especially when we come up to finals. My revision leave was purely and simply so I did not fail cos I was so scared of failing and not graduating with my mates. [F11]

Purpose of assessments

There was an ongoing conflict for participants between learning for the assessment and learning to be a competent practitioner. Some students realised that becoming a competent clinician was more important to them than their examination results. However, they had to pass the examinations and this hurdle still had a large impact on their learning strategies. The participants perceived MCQs to be testing knowledge (Q4, 5 and 6) and the DOPS to be testing skills required for competent practice (Q16).

Q16: I think they (DOPS) are generally a good way of assessment. I think it does make you think about what you need to know and certainly you sort of get used to saying whether you’re competent or not and then that kind of transfers to other skills and you sort of think well can I do this, could I do it on day one. [M1]

Consequence of the assessment

It has already been shown that the participants considered that the high-stakes MCQ assessment encouraged a surface approach (Q5, 6 and 8), whereas the lower-stakes DOPS prompted a deeper, more reflective approach (Q2 and 3). The DOPS also had other consequences which impacted on approaches to learning: case responsibility and face-to-face interaction with an assessor (Q17):

Q17: If you know a vet's going to quiz you, you’ll spend much more time looking stuff up. If you know they’re not going to ask questions, inherently human nature's not to look so much stuff up, and it probably shouldn’t be the way, but invariably it is. [M1]

Acceptability

Students reported that DOPS was an acceptable method of assessing practical skills (Q18):

Q18: I’m quite okay at practical skills, but OSCEs, you just get so stressed and your hands are shaking, I don’t think it's a very realistic way of kind of assessing practical skills really. I think that DOPS do that a lot better because it's in a real setting, you know, probably the best way of doing it. An MCQ I don’t think particularly represents what we’re going to do when we’re out there in practice, because you don’t have an option of four things to choose from. [F3]

However, there were some criticisms of the variation in difficulty of tasks (Q9), and tutors (Q19):

Q19: I think there are certainly people who you want to be examining your DOPS and there are people who you have a heart sink. When you see them come in, in the morning, you think oh god, I hope my DOPS is not today. [M3]

Feedback

Participants appreciated the regular opportunity for face-to-face feedback in WPBA. It helped them to improve and boosted their confidence (Q20 and Q21):

Q20: and he discussed with me where I needed to improve and so made me feel a lot better about it cos you can see why you failed and work out how to improve, and it just all seems quite achievable then. [F4]

Q21: I got quite positive feedback and it gave me a real boost actually cos as I said, PDSA was my first rotation, and it really boosted my confidence going into the next one thinking yeah maybe I can do this. So that was really good. [F3]

Time pressure

For some students, time pressure was essential to motivate efficient learning strategies, but as Q4, 5, 6, 8 and 22 demonstrate, time pressure affected many students by increasing stress levels and driving them towards a more superficial approach to learning for MCQs.

Q22: Yet in two weeks there's just no time. It was awful to think you hadn’t even covered everything. You were going into exams and you hadn’t read some of your lectures. [F9]

In contrast to the MCQ examination, time pressure was less of an issue for most students with DOPS. In Q23 the student describes how DOPS allowed her to develop a deeper, more reflective approach:

Q23: I think with the DOPS, if you fail one, you have the time to, you know, pass 2 more in the group and get the group done, you know, and think about it build on it and reflect on okay, why did I fail. And that's, in my opinion, very good, because it gives you time to use that experience and build on it. [F2]

Individual differences

Both DOPS and MCQ assessment formats impacted on student learning. However, these effects were not uniform across the participants. Q24–27 demonstrate the differential effect of DOPS.

Q24: the first eight months of the year I was so obsessed with DOPS, that's the only thing I could think about. [F11]

Q25: if you had a DOPS that needed doing, you might get to the last day of rotation and you would hunt down a case that you could do that on and you might potentially miss out on what you’d normally do. [M2]

Q26: because you have one DOPS per rotation, if that, and you know, it's quite a limited amount of work that I would ever have done for a DOPS. No. I don’t think it took anything away from my time or from my experience generally. [F10]

Q27: I think the DOPS were more of a I’ve just got to get it done kind of thing. Yeah I think they were just things that you just had to tick a box. [F6]

Discussion

This study provides evidence in line with that of earlier studies on MCQs, and reports findings on the effects of DOPS on approaches to learning which have hitherto been neglected. It has highlighted the differential impact of MCQs and DOPS on approaches to learning: summative MCQs appear to induce surface approaches, whereas DOPS induce deeper learning strategies. This evidence is in line with Tang (1994) and Cilliers et al. (2011), and to some extent with Scouller (1998) and Ringsted et al. (2004). Different forms of WPBA, including DOPS, provide students with regular encounters with clinicians, which results in a more consistent learning effort compared with preparation for an examination at the end of the course (Cilliers et al. 2012).

But assessment formats are not the only factors which influence approaches to learning. It is important to emphasise that DOPS and MCQs focus on different areas of competence: the MCQs assess the students’ ability to apply their knowledge and interpret case information; in contrast, the DOPS are designed to assess practical skills. The learning outcomes assessed may be as important to the approach adopted as the assessment format itself. Evidence from the qualitative findings of this study indicates that the effects may be due as much to the stakes involved as to the format of the assessment. High-stakes assessment, such as final examinations, can be a powerful driver for learning, but the impact is not necessarily a positive one for all students (Cilliers et al. 2010; Al-Kadri et al. 2012). Low-stakes assessments are likely to lead to deeper approaches to learning (Nicol & Macfarlane-Dick 2006; Al-Kadri et al. 2012). Students will employ surface-learning strategies when under time pressure or stressed, at the expense of the deeper, more meaningful learning which they know will benefit their future careers; assessment therefore has the potential to inhibit learning for clinical practice. A few, however, will use deep learning strategies even when these are not conducive to achievement. For this sample of students, who were nearing graduation, the achieving approach was relatively constant regardless of the assessment format (Cilliers et al. 2011). The evidence from the qualitative analyses is borne out by the quantitative analyses, which yielded a positive correlation between examination performance and achieving approaches, in this study as in earlier studies (McManus et al. 1999; Brown & Gibbs 2003; Donnon & Hecker 2010). These findings add to the growing evidence that WPBA, and in particular DOPS, have a positive educational impact on student approaches to learning (Ringsted et al. 2004; Norcini & Burch 2007; Prescott-Clements et al. 2008).

But, inevitably, there are limitations to small-sample studies. This study does not disentangle the effects of other influences from those of the formats of DOPS and MCQs. For example, student perception of examination content, the implementation of assessment (including the timing of examinations within the course) and personal factors for individual students all influence the students’ approach to learning. The shortened version of the SPQ used to measure impact did not quite meet the customarily recommended Cronbach's alpha of 0.7; this could reflect the short scale and the sample size (Tavakol & Dennick 2011). The sample was too small for confirmatory factor analysis. In addition, the study was conducted at one higher education institution and on one cohort of students; other measures of impact on students across a variety of institutions might have yielded different results.

Despite these limitations, the mixed methods approach used in this study has demonstrated that different forms of assessment have differential impacts, and has revealed many subtle influences on students’ approaches to learning. Not least amongst these are the students’ perceptions of the modes and formats of assessment.

Conclusion

The educational impact of different formats of assessment is a complex research and practical problem. Many factors influence approaches to learning and the effects observed are not uniform across all students. In particular, the formats of DOPS and MCQs appear to have differential effects upon approaches to learning. Salient in the students’ experience is the conflict between preparing for their clinical profession and preparing to pass final examinations. The resolution of this conflict is a challenge for assessors as well as for students. Further work is needed on the roles of assessment for learning and assessment of learning, taking account of the differential effects of different forms of assessment.

Acknowledgements

The authors would like to thank the students who participated in this study at the University of Nottingham.

Declaration of interest: The authors report no declarations of interest. The authors alone are responsible for the content and writing of this article.

Glossary of terms

Directly Observed Procedural Skill (DOPS): DOPS is a form of workplace-based assessment (WPBA) in which the assessor observes the trainee performing a practical procedure on a patient from start to finish.

Reference: Norcini JJ, McKinley DW. 2007. Assessment methods in medical education. Teach Teach Educ 23:239–250.

Wilkinson JR, Crossley JGM, Wragg A, Mills P, Cowan G, Wade W. 2008. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ 42:364–373.

Educational impact: Educational impact refers to the effect of an intervention, for example assessment, on the learning process.

Reference: van der Vleuten CP, Schuwirth LW. 2005. Assessing professional competence: From methods to programmes. Med Educ 39:309–317.

Appendix 1: An example DOPS

[Image: an example DOPS assessment form]

Appendix 2: An example MCQ

You are presented with a weak, collapsed 10-year-old Bouvier des Flandres dog with a history of having had dilated cardiomyopathy diagnosed several months previously. On examination, the dog is tachycardic but the rhythm is regular. On presentation you record the following ECG.

[Image: ECG trace recorded at presentation]

  1. Which of the following rhythm diagnoses best describes the rhythm shown in the ECG?

    • Atrial tachycardia

    • Atrial fibrillation

    • Junctional tachycardia

    • Ventricular tachycardia

    • Ventricular fibrillation

  2. Which of the following antidysrhythmic drugs used in the management of tachydysrhythmias would be most appropriate for the emergency management of this case?

    • Lignocaine

    • Sotalol

    • Digoxin

    • Diltiazem

    • Mexiletine

References

  1. Al-Kadri HM, Al-Moamary MS, Roberts C, van der Vleuten CPM. Exploring assessment factors contributing to students' study strategies: Literature review. Med Teach. 2012;34:S42–S50. doi: 10.3109/0142159X.2012.656756.
  2. BERA. 2011. Ethical guidelines for educational research. British Educational Research Association. Available from www.bera.ac.uk.
  3. Biggs JB, Australian Council for Educational Research. Student approaches to learning and studying. Melbourne: Australian Council for Educational Research; 1987a.
  4. Biggs JB, Australian Council for Educational Research. Study process questionnaire manual. Melbourne: Australian Council for Educational Research; 1987b.
  5. Black P, Wiliam D. 1999. Assessment for learning: Beyond the black box. Assessment Reform Group Report, pp 1–12.
  6. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3:77–101.
  7. Brown G, Bull J, Pendlebury M. Assessing student learning in higher education. London: Routledge; 1997.
  8. Cilliers FJ, Schuwirth LW, Adendorff HJ, Herman N, van der Vleuten CP. The mechanism of impact of summative assessment on medical students' learning. Adv Health Sci Educ Theory Pract. 2010;15(5):695–715. doi: 10.1007/s10459-010-9232-9.
  9. Cilliers FJ, Schuwirth LW, Herman N, Adendorff HJ, van der Vleuten CP. A model of the pre-assessment learning effects of summative assessment in medical education. Adv Health Sci Educ Theory Pract. 2011;17(1):39–53. doi: 10.1007/s10459-011-9292-5.
  10. Cilliers FJ, Schuwirth LW, van der Vleuten CPM. A model of the pre-assessment learning effects of assessment is operational in an undergraduate clinical context. BMC Med Educ. 2012;12:9. doi: 10.1186/1472-6920-12-9.
  11. Crooks T, Mahalski P. Relationships among assessment practices, study methods, and grades obtained. Res Dev Higher Educ. 1985;8:234–240.
  12. Donnon T, Hecker K. A model of approaches to learning and academic achievement of students from an inquiry based Bachelor of Health Sciences program. Can J Higher Educ. 2010;38:1–19.
  13. Brown E, Gibbs G. 2003. Evaluation tools for investigating the impact of assessment regimes on student learning. Bioscience Education E-journal 2. Available from http://www.bioscience.heacademy.ac.uk/journal/vol2/beej-2-5.aspx.
  14. Fox RA, McManus IC, Winder BC. The shortened Study Process Questionnaire: An investigation of its structure and longitudinal stability using confirmatory factor analysis. Br J Educ Psychol. 2001;71:511–530. doi: 10.1348/000709901158659.
  15. Gibbs G, Simpson C. Conditions under which assessment supports students' learning. Learn Teach Higher Educ. 2004;1(1):3–31.
  16. Kember D, Charlesworth M, Davies H, McKay J, Stott V. Evaluating the effectiveness of educational innovations: Using the study process questionnaire to show that meaningful learning occurs. Stud Educ Eval. 1997;23:141–157.
  17. Leung SF, Mok E, Wong D. The impact of assessment methods on the learning of nursing students. Nurse Educ Today. 2008;28:711–719. doi: 10.1016/j.nedt.2007.11.004.
  18. McLachlan JC. The relationship between assessment and learning. Med Educ. 2006;40:716–717. doi: 10.1111/j.1365-2929.2006.02518.x.
  19. McManus IC, Richards P, Winder BC. Intercalated degrees, learning styles, and career preferences: Prospective longitudinal study of UK medical students. BMJ. 1999;319:542–546. doi: 10.1136/bmj.319.7209.542.
  20. McManus IC, Richards P, Winder BC, Sproston KA. Clinical experience, performance in final examinations, and learning style in medical students: Prospective study. BMJ. 1998;316:345–350. doi: 10.1136/bmj.316.7128.345.
  21. Newble DI, Jaeger K. The effect of assessments and examinations on the learning of medical students. Med Educ. 1983;17:165–171. doi: 10.1111/j.1365-2923.1983.tb00657.x.
  22. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Stud Higher Educ. 2006;31:199–218.
  23. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29:855–871. doi: 10.1080/01421590701775453.
  24. Prescott-Clements L, van der Vleuten CP, Schuwirth LW, Hurst Y, Rennie JS. Evidence for validity within workplace assessment: The Longitudinal Evaluation of Performance (LEP). Med Educ. 2008;42:488–495. doi: 10.1111/j.1365-2923.2007.02965.x.
  25. Ramsden P. Learning to teach in higher education. London: Routledge; 1992.
  26. Ringsted C, Henriksen AH, Skaarup AM, van der Vleuten CP. Educational impact of in-training assessment (ITA) in postgraduate medical education: A qualitative study of an ITA programme in actual practice. Med Educ. 2004;38:767–777. doi: 10.1111/j.1365-2929.2004.01841.x.
  27. Saldaña J. The coding manual for qualitative researchers. London: Sage Publications Ltd; 2009.
  28. Schuwirth L, Colliver J, Gruppen L, Kreiter C, Mennin S, Onishi H, Pangaro L, Ringsted C, Swanson D, van der Vleuten C, et al. Research in assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33:224–233. doi: 10.3109/0142159X.2011.551558.
  29. Schuwirth LW, van der Vleuten CP. General overview of the theories used in assessment: AMEE Guide No. 57. Med Teach. 2011;33:783–797. doi: 10.3109/0142159X.2011.611022.
  30. Scouller K. The influence of assessment method on students' learning approaches: Multiple choice question examination versus assignment essay. Higher Educ. 1998;35:453–472.
  31. Swanwick T. Understanding medical education: Evidence, theory and practice. Oxford: Wiley-Blackwell; 2010.
  32. Tang C. Assessment and student learning: Effects of modes of assessment on students' preparation strategies. In: Gibbs G, editor. Improving student learning: Theory and practice. Oxford: Oxford Brookes University, The Oxford Centre for Staff Development; 1994. pp. 151–170.
  33. Tavakol M, Dennick R. Making sense of Cronbach's alpha. Int J Med Educ. 2011;2:53–55. doi: 10.5116/ijme.4dfb.8dfd.
