MedEdPublish. 2021 Mar 19;9:153. Originally published 2020 Jul 31. [Version 2]. doi: 10.15694/mep.2020.000153.2

Practical Considerations for Online Open Book Examinations in Remote Settings

Hui Meng Er 1,2,a, Vishna Devi Nadarajah 1,2, Pei Se Wong 1,2, Nilesh Kumar Mitra 1,2, Zabibah Ibrahim 1,2
PMCID: PMC10702684  PMID: 38073814

Abstract


The COVID-19 outbreak has led to the lockdown of cities and restricted access to university campuses, and face-to-face delivery of education has therefore been disrupted worldwide. To continue teaching, learning and assessment activities, academic institutions have embarked on online delivery and assessment using technology. The online open book examination is one of the tools considered during the crisis period to ensure that students’ progression through their academic programmes and their graduation are not delayed. Its use is supported by evidence in the literature that it promotes critical thinking and problem-solving skills among students. The positive findings from our previous study on the impact of open book examinations on student performance and learning approach encouraged us to implement online open book examinations in various health professional programmes at our institution during the COVID-19 pandemic. While some successful progress has been made, there are areas that need further exploration to provide detailed insights into the practice and effectiveness of remote online open book examinations. The objective of this paper is to share practical tips for implementing online open book examinations remotely, in order to ensure the validity, reliability and fairness of the examinations.

Keywords: Online examination, Open book examination, Remote, Health professional programmes

Introduction

The COVID-19 pandemic has resulted in lockdowns of cities globally as a public health measure to mitigate the spread of the coronavirus. Technology has been rapidly deployed by institutions to enable the continuation of education delivery. This has accelerated buy-in for online learning among educators and students. Meanwhile, much attention has also been focussed on online assessments, particularly the platforms, tools and formats that can serve the assessment purpose and address its utility, i.e. validity, reliability, educational impact, acceptability and cost (Van der Vleuten, 1996).

While closed book examinations promote students’ test preparation and deep learning (Block, 2012; Durning, Dong, Ratcliffe, Schuwirth et al., 2016), open book examinations enhance their critical thinking and creative problem-solving skills in an environment that simulates real working scenarios (Johanns, Dinkens and Moore, 2017). In addition, open book assessments help students to develop competency in knowledge management (Rowlands and Forsythe, 2006). Despite the availability of various online proctoring tools in the market, many have cautioned against their use in view of data privacy and confidentiality concerns (Thompson, 2020). Moreover, some students may not have access to the webcam that is required for online proctoring. These factors limit the remote conduct of closed book examinations. Therefore, online open book examinations have been adopted by many academic institutions during this unprecedented period when campus access is restricted. The objective of this paper is to discuss the practical considerations for the implementation of online open book examinations, based on our experience in conducting these examinations in various health professional programmes at our institution.

Institutional Context and Programme Needs Analysis

The decision to convert conventional examinations to online open book examinations should be made based on the institutional context and educational programme needs. These include the availability of a secure online assessment platform, the administrative and logistic capacity of the institution, the academic calendar, the student assessment load as well as post-lockdown modifications to campus activities. As the COVID-19 situation continues to evolve, there is much uncertainty over the lockdown duration. Moreover, it is anticipated that conventional teaching, learning and assessment activities will need to be modified post-lockdown in order to ensure physical distancing on campus. Postponing the examinations until after the lockdown is lifted will inevitably increase the workload of the examination administrators and the logistic burden of scheduling assessment activities, which may exceed the university’s capacity (Figure 1). Examinations may need to be conducted in smaller groups, and hence a larger number of examination venues, extended examination schedules and quarantine procedures become necessary. Postponement is also likely to delay students’ graduation and entry into the healthcare workforce. Assessment overload is another consequence that could contribute to stress among the students. These implications have to be weighed realistically against the challenges faced in converting to online open book examinations within a short period of time.

Figure 1: Managing the impact of examination postponement and post-lockdown modifications to campus activities.


Figure 1 is based on the “Coronavirus Flattening the Curve PowerPoint Template” from SlideModel.com. The blue curve represents the consequence of postponement of examinations, while the green curve represents the effect of intervention through conduct of online open book examinations during and post-lockdown.
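To make the post-lockdown scheduling burden depicted in Figure 1 more concrete, the short Python sketch below estimates how many sittings a single postponed paper would require once physical distancing reduces usable venue capacity. The cohort size, venue capacity and distancing factor are purely hypothetical illustrations and do not represent institutional data.

```python
import math

def sessions_required(cohort_size: int, venue_capacity: int, distancing_factor: float) -> int:
    """Estimate the number of sittings needed when physical distancing
    reduces the usable capacity of an examination venue.

    distancing_factor is the fraction of seats that remain usable
    (e.g. 0.25 if students must be widely spaced in a tight hall).
    """
    usable_seats = max(1, math.floor(venue_capacity * distancing_factor))
    return math.ceil(cohort_size / usable_seats)

# Hypothetical example: a 200-student cohort, a 120-seat hall,
# and distancing that leaves only a quarter of the seats usable.
print(sessions_required(cohort_size=200, venue_capacity=120, distancing_factor=0.25))
# -> 7 sittings instead of 2, before adding quarantine or cleaning time
```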

Faculty Readiness

In open book examinations, students are allowed to refer to textbooks, online resources or reference materials during the examination (Ramamurthy, Er, Nadarajah and Pook, 2016). Faculty training is necessary to ensure that the examinations are designed to test higher order thinking skills, e.g. applying, analysing, evaluating and creating with resources (Brightwell, Daniel and Stewart, 2004). Other skills can also be tested, such as locating information, using computer applications and conducting internet research using real world scenarios (Johanns, Dinkens and Moore, 2017). These skills are indeed essential for healthcare professionals. However, some faculty may face challenges in preparing these types of questions within a short time, particularly in situations where clinical scenario or application-based teaching is limited. To support this transition, university guidelines, webinars and personalised online faculty mentoring clinics for setting online open book examinations are highly recommended. An example of a faculty guide on online open book assessment is available online for readers (IMU, 2020a). The guide describes the principles and considerations for remote online open book examinations. On the other hand, it is important to recognise the limitations of online open book examinations in assessing clinical and procedural skills. With some creativity it is possible to assess some of these competencies, for example by having students demonstrate the skills via live streaming, with the anticipation that some modifications may be required.

Students’ Readiness

Students have been reported to spend less time preparing for open book examinations compared to closed book examinations, due to the misconception that the answers can simply be found in books (Durning, Dong, Ratcliffe, Schuwirth et al., 2016). As a consequence of inadequate preparation, some spend more time looking for answers rather than producing quality answers (Block, 2012). These findings highlight the importance of communicating to students what online open book examinations are about and familiarising them with higher order thinking questions during teaching and formative assessments. In addition, students should be informed in advance of the rules and regulations for online open book examinations, including the consequences of academic misconduct. An example of a student guide on online open book assessment is available online for readers (IMU, 2020b). It guides students on preparation for online open book examinations. Students are also provided with information about the preparation and process for online examinations with online invigilation (IMU, 2020c).

Online open book examinations rely on students’ computer hardware, software and internet connection during the examination. Evidence has shown that students can experience a greater cognitive load during online examinations compared to traditional paper-based examinations, as a result of having to navigate the technology in addition to answering the examination questions (Cramp et al., 2019). This can lead to stress and anxiety. Hence, practice examinations are crucial for students to test their login to the online assessment platform, familiarise themselves with the features of the platform, and practise with questions that require them to draw diagrams or plot graphs, either using computer software or by hand, followed by uploading the answers for submission. Technical support should be available before and during the examinations. For example, if a student is disconnected from the online assessment platform during the examination due to internet connection problems, the technical support personnel can check the examination time remaining and the logs. Based on these reports, the examination coordinator can advise on extra time allocation if needed. In situations where students face intermittent internet access, an asynchronous online mode should be considered, for example by allowing the students to download the examination questions, work on the answers offline (without having to depend on the internet connection) and submit the answers within the specified duration.
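As an illustration of the extra-time advice described above, the following Python sketch sums the connectivity gaps recorded in a platform log and converts them into a recommended time extension. The log format, event names and timestamps are entirely hypothetical; an actual assessment platform will expose its own logs and reports.

```python
from datetime import datetime, timedelta

# Hypothetical platform log: each entry is (event, timestamp).
# Real assessment platforms have their own log formats and reporting tools;
# this only illustrates the extra-time calculation described above.
log = [
    ("connected",    datetime(2020, 5, 15, 9, 0)),
    ("disconnected", datetime(2020, 5, 15, 9, 40)),
    ("connected",    datetime(2020, 5, 15, 9, 48)),
    ("disconnected", datetime(2020, 5, 15, 10, 5)),
    ("connected",    datetime(2020, 5, 15, 10, 9)),
]

def time_lost(events) -> timedelta:
    """Sum the gaps between each disconnection and the next reconnection."""
    lost = timedelta()
    last_drop = None
    for event, ts in events:
        if event == "disconnected":
            last_drop = ts
        elif event == "connected" and last_drop is not None:
            lost += ts - last_drop
            last_drop = None
    return lost

extra = time_lost(log)
print(f"Recommend granting {int(extra.total_seconds() // 60)} extra minutes")  # 12 minutes
```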

Quality Assurance

It is important that the decision to convert conventional examinations to online open book examinations is made in compliance with institutional governance in order to uphold the credibility and integrity of the academic programmes. With programme and course learning outcomes as the cornerstone, the design of online open book examinations should be guided by the assessment blueprint. Vetting of examination questions is particularly critical to ensure the validity, reliability, fairness and integrity of the examinations. Besides verifying the content, construct and concurrent validities, standard setting and psychometric analysis should continue to be practised for quality assurance. In addition, policies and standard operating procedures are required for the entire spectrum of work processes, from preparing examination questions, vetting, uploading questions to the online assessment platform, authenticating test candidates, test taking and marking, to results processing, reporting and publishing. Not only do these ensure examination efficiency, they also safeguard examination security, which is a key concern since most of these activities are conducted online and remotely. Besides the professional support staff of the examination office, technical staff from the Information Technology (IT) and E-Learning departments will also be involved in setting up the online examinations and providing technical assistance remotely during the examinations. Hence, the roles and responsibilities of the stakeholders should be clearly defined. Timely feedback from the stakeholders is crucial for continuous improvement.
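As a minimal sketch of the routine psychometric analysis mentioned above, the Python snippet below computes two classical test theory statistics for each item: the facility (difficulty) index and a corrected item-total (point-biserial) discrimination. The response matrix is fabricated purely for illustration and does not reflect our institution’s data, platform or analysis toolchain.

```python
import numpy as np

# Rows = candidates, columns = items; 1 = correct, 0 = incorrect.
# This matrix is made up solely to demonstrate the calculation.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

# Facility (difficulty) index: proportion of candidates answering correctly.
facility = responses.mean(axis=0)

# Corrected item-total discrimination: correlation between an item's score
# and the total score on the remaining items.
n_items = responses.shape[1]
discrimination = np.array([
    np.corrcoef(responses[:, i], responses.sum(axis=1) - responses[:, i])[0, 1]
    for i in range(n_items)
])

for i, (p, r) in enumerate(zip(facility, discrimination), start=1):
    print(f"Item {i}: facility = {p:.2f}, discrimination = {r:.2f}")
```

In practice, items with very low facility or near-zero (or negative) discrimination would be flagged for review during vetting and post-examination analysis.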

Adaptability and Agility

In preparing for crises such as COVID-19, universities would have developed various business continuity plans (BCPs) to minimise the impact on operations and ensure that teaching and learning activities remain viable, taking into consideration personal and community safety. BCPs in this context will normally include plans to transition to online delivery and assessments, justified by transition risk assessments, needs prioritisation, resource readiness and communication strategies, while maintaining academic standards and governance. Most of these BCPs are prepared at the institutional level, but they need to be contextualised to individual programme requirements to enable autonomy according to the varying contexts and communities they operate in. On the other hand, decisions related to public health and safety are usually made at the national level, resulting in external factors that are beyond the institution’s control. The roll-out of BCPs can therefore be challenging in a constantly evolving situation like the COVID-19 pandemic. Frequent changes of plan are unavoidable, and these can affect all stakeholders, including the faculty, professional support staff and students. Reflecting on our experience, we conclude that while BCPs are useful guides, adaptability and agility among the stakeholders are crucial in a crisis situation. Factors that help to reduce the stress level of the stakeholders include data accessibility to guide decision making, preparedness of faculty for online delivery and assessments, availability of faculty development and mentoring, readiness and upskilling of professional support staff, as well as open and transparent communication and collaboration among all staff in a collegial manner, with the common goal of ensuring positive student experiences during this challenging time.

Limitations

The measures discussed in this paper have helped our institution to be better prepared for remote online open book examinations and have ensured continuity of assessments during the COVID-19 pandemic. Nevertheless, some faculty members argue that this format may not be appropriate for all learning outcomes. It may be more suitable for senior students, who have more exposure to real world scenarios than students in the earlier academic years. Potential misalignment between curriculum delivery and assessments, as well as access to resources during online open book examinations, could affect student performance. Debate remains on whether the same standard setting practice used for closed book examinations can be applied to open book examinations. A study is currently being undertaken to explore faculty perceptions of these areas. More evidence is necessary to support the validity and reliability of remote online open book examinations in both invigilated and non-invigilated environments. The impact of remote online open book assessment on students’ learning and performance is also being investigated.

Conclusion

The online open book examination is a viable and relevant option for health professions education in times of uncertainty. The COVID-19 pandemic has increased the uptake of this tool among higher education institutions. However, it is important to have a systematic and collaborative approach to its implementation, based on good practices, standards and evidence. Continuous studies on its validity, reliability and impact on learners should be carried out. The findings will help health professions educators to improve its implementation in order to ensure that graduates are competent and work-ready in an increasingly challenging healthcare environment.

Take Home Messages

  • The decision to convert conventional examinations to online open book examinations should be made based on the institutional context and educational programme needs.

  • Faculty training and support for setting online open book examinations are highly recommended.

  • Students should be familiarised with the conduct of online open book examinations through practice examinations.

  • Quality assurance is important to ensure the validity, reliability and fairness of online open book examinations.

  • Adaptability and agility among the stakeholders of online open book examinations are crucial in a crisis situation.

Notes On Contributors

Professor Hui Meng Er is the Acting Dean of Teaching and Learning at the International Medical University, Malaysia.

ORCID iD: https://orcid.org/0000-0001-9835-4840.

Professor Vishna Devi Nadarajah is the Pro Vice Chancellor, Education and Institutional Development, at the International Medical University, Malaysia.

ORCID iD: https://orcid.org/0000-0002-7126-7189.

Dr Pei Se Wong is the Associate Dean of Teaching and Learning at the International Medical University, Malaysia.

ORCID iD: https://orcid.org/0000-0002-3958-0001.

Nilesh Kumar Mitra is the Acting Director of Learning Resources at the International Medical University, Malaysia.

ORCID iD: https://orcid.org/0000-0002-8487-4607.

Ms Zabibah Ibrahim is the Assistant Manager of E-Learning at the International Medical University, Malaysia.

ORCID iD: https://orcid.org/0000-0003-0555-2106.

Acknowledgments

The authors would like to acknowledge that Figure 1 is based on the “Coronavirus Flattening the Curve PowerPoint Template” from SlideModel.com.

The authors would also like to thank the staff of the Departments of E-Learning, IT and Examination Unit at the International Medical University for the useful feedback on the conduct of online open book examinations.


Declarations

The authors have declared that there are no conflicts of interest.

Ethics Statement

The practical tips have been developed based on our experience in implementing online open book examinations. Ethics approval is not required.

External Funding

This article has not received any external funding.

Bibliography/References

  1. Block, R. M. (2012) A discussion of the effect of open-book and closed-book exams on student achievement in an introductory statistics course. PRIMUS, 22(3), pp. 228–238. https://doi.org/10.1080/10511970.2011.565402
  2. Brightwell, R., Daniel, J-H. and Stewart, A. (2004) Evaluation: Is an open book examination easier? Bioscience Education, 3(1), pp. 1–10. https://doi.org/10.3108/beej.2004.03000004
  3. Cramp, J., Medlin, J. F., Lake, P. and Sharp, C. (2019) Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1), Article 10. Available at: https://ro.uow.edu.au/jutlp/vol16/iss1/10 (Accessed: 15 May 2020).
  4. Durning, S., Dong, T., Ratcliffe, T., Schuwirth, L., et al. (2016) Comparing open-book and closed-book examinations: A systematic review. Academic Medicine, 91, pp. 583–599. https://doi.org/10.1097/ACM.0000000000000977
  5. International Medical University (IMU) (2020a) IMU faculty guide on online open book assessment. Available at: https://studentimuedu-my.sharepoint.com/:b:/g/personal/huimeng_er_imu_edu_my/EWuKmBCc20BGlPFcvGTwk1YBHwXK9Tjo6MrW1wfjBYZ26g?e=WnifhJ (Accessed: 23 Nov 2020).
  6. International Medical University (IMU) (2020b) IMU student guide on online open book assessment. Available at: https://studentimuedu-my.sharepoint.com/:b:/g/personal/huimeng_er_imu_edu_my/EX2ComYVkcpKq_P3Se8IUccBzHakJLh300h-zOHaUP4LKg?e=lIexJa (Accessed: 23 Nov 2020).
  7. International Medical University (IMU) (2020c) IMU student guide on online examination with online invigilation. Available at: https://studentimuedu-my.sharepoint.com/:b:/g/personal/huimeng_er_imu_edu_my/ESwKkyYm6xJMrdFuzgM6TE0Br7-IuFPxgUddWmuvbVSvmA?e=6i240o (Accessed: 23 Nov 2020).
  8. Johanns, B., Dinkens, A. and Moore, J. (2017) A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking. Nurse Education in Practice, 27, pp. 89–94. https://doi.org/10.1016/j.nepr.2017.08.018
  9. Ramamurthy, S., Er, H. M., Nadarajah, V. D. and Pook, P. C. K. (2016) Study on the impact of open and closed book formative examinations on pharmacy students’ performance, perception, and learning approach. Currents in Pharmacy Teaching and Learning, 8, pp. 364–374. https://doi.org/10.1016/j.cptl.2016.02.017
  10. Rowlands, J. E. and Forsythe, D. (2006) Open-book professional accountancy examinations. South African Journal of Higher Education, 20(5), pp. 703–717. https://doi.org/10.4314/sajhe.v20i5.25709
  11. Thompson, M. (2020) UCSB Faculty Association issues letter advising against the use of ProctorU testing services. Available at: https://dailynexus.com/2020-03-16/ucsb-faculty-association-issues-letter-advising-against-the-use-of-proctoru-testing-services/ (Accessed: 15 May 2020).
  12. Van Der Vleuten, C. P. M. (1996) The assessment of professional competence: Developments, research and practical implications. Advances in Health Sciences Education, 1, pp. 41–67. https://doi.org/10.1007/BF00596229
MedEdPublish (2016). 2021 Apr 19. doi: 10.21956/mep.20272.r31468

Reviewer response for version 2

Gominda Ponnamperuma 1

This review has been migrated. The reviewer awarded 4 stars out of 5.

This article provides useful practical information for any institute that plans to implement online learning. Out of all the sections, I particularly like the sections on quality assurance and student readiness. The open book assessment guides for faculty would also be quite useful for anyone planning to conduct open-book assessment. If the authors plan to revise this article further, I could offer the following suggestions:

1. Since the objectives in the abstract and the main article slightly differ, it would be good to repeat the objective stated in the abstract (in exactly the same way) in the main article. This would ensure that both objectives contain 'validity, reliability and fairness'.

2. The authors could develop a summary table that meshes the tips in relation to each of the sections (e.g. faculty readiness, student readiness, quality assurance, etc.) in rows against the assessment properties (e.g. validity, reliability, fairness) in columns.

Reviewer Expertise:

NA


MedEdPublish (2016). 2021 Mar 19. doi: 10.21956/mep.20272.r31469

Reviewer response for version 2

Ken Masters 1

This review has been migrated. The reviewer awarded 3 stars out of 5.

I am pleased to see that the authors have made some attempt at addressing the issues raised in response to the first version, but the changes are not great. Although a Limitations section has been added, many of the other changes appear to be minor. For example, in the Abstract, apart from the addition of a single sentence (“While some successful progress…”), there appears to have been no change made, and it remains a lead-in to the paper rather than a summary of the paper. Almost the entire rest of the paper is identical with only minor alterations and insertions. So, while the paper’s value has been somewhat increased, I'm afraid it is rather a missed opportunity.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Nov 3. doi: 10.21956/mep.19136.r27649

Reviewer response for version 1

Ken Masters 1

This review has been migrated. The reviewer awarded 2 stars out of 5.

The paper aims at the practical considerations for online open book examinations in remote settings. While the paper does raise important issues, it is frustratingly short on practicalities. More details are given below, but, essentially, each section highlights the stuff that needs to be done, but gives very little practical indication on HOW it is to be done.

  • The Abstract gives a useful background to the paper, but should really be used to summarise the paper, so it would have been better if it had given a summary of the main points made in the paper.

  • In the Faculty Readiness section, there is good information about the faculty being given training on open-book exams. Most readers who would benefit from this paper would be novices, and would probably not have access to such training, so it would be useful if the authors could supply some (preferably open-access) resources that get faculty up to speed as quickly as possible.

  • Similarly for students: are there more detailed resources that can be given to students to assist them with open-book exams?

  • Again (and I find myself repeating myself), the quality needs to be assured, and we need to set standards, yes, but how? Are there rubrics, recommendations, standard tools, etc.? How are these implemented?

Perhaps the authors could produce a Version 2 of this paper in which they go into some more details about the practicalities and resources that can be used to achieve what has been outlined.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Oct 28. doi: 10.21956/mep.19136.r27646

Reviewer response for version 1

Gominda Ponnamperuma 1

This review has been migrated. The reviewer awarded 4 stars out of 5.

In these uncertain times, with on-site testing becoming almost impossible for most institutions and with the controversies that surround proctored online closed-book exams, open-book exams offer a breath of fresh air to the testing community. In this light, this article is a very useful eye-opener to the world of open-book testing. Authors should be congratulated for penning the important considerations when moving towards open-book testing. An important consideration that the authors could probably explore a bit more is 'standard setting'. Since the level of testing (i.e. the difficulty of items) in open-book exams is different, when compared with closed-book exams, the standard setters may need to recalibrate their expectations of a passing (or a borderline) candidate.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Aug 28. doi: 10.21956/mep.19136.r27643

Reviewer response for version 1

Richard Fuller

This review has been migrated. The reviewer awarded 4 stars out of 5.

Open book testing, or 'takeaway exams', which tests skills right at the top of learning taxonomies, has been an established component of assessment in other disciplines for some time. Open book questions are deliberately designed to pose more complex problems that cannot be easily solved through recall, and employ wider learning strategies to access, synthesise and apply material from many sources. These questions require specific design and are very different from SBAs and SAQs, and may be more suitable for learners at later stages of programmes. They are a newer phenomenon in health professions education, although much has been written about them recently! Unfortunately, there has been a tendency to conflate them with online exams, and to link them with concerns about cheating and probity. This has led to confusion among staff and students alike about online exams with closed book questions against the clock, vs truly open book takeaway-type testing! A good introductory resource is from Twente University: https://www.utwente.nl/en/telt/online-lectures/remote-assessment/online-examination/open-book-exam/

This is a really useful review by colleagues from IMU and I commend the very practical focus on institutional readiness and, most importantly, preparing students for a very different test format. Arguably, developing these formats further will help assessment be more 'authentic' and reflect the way that health professionals deal with complex patient issues and multiple sources of data (lab results, imaging, multi-professional notes), and how we work, and solve problems, as teams.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Aug 24. doi: 10.21956/mep.19136.r27647

Reviewer response for version 1

Viktor Riklefs 1

This review has been migrated. The reviewer awarded 5 stars out of 5.

A very useful article containing practical tips on what to consider when switching to an online open-book quiz, probably the best option for remote settings. The authors give very practical advice on how to take into account the readiness of both students and teaching staff when designing the assessment, and how not to lose quality. Much of the advice is based on the authors' own experience, which certainly adds value to the article.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Aug 13. doi: 10.21956/mep.19136.r27645

Reviewer response for version 1

Ben Canny 1

This review has been migrated. The reviewer awarded 4 stars out of 5.

Given the way that medical practitioners now practise (with a computer (smartphone) in our hands), I am consistently surprised that we have generally failed to embrace online, open internet examinations. Like many things during the pandemic, we have changed with great rapidity, and will wonder why the change took us so long. This article gives a great summary of how online open book examinations can be conducted, and may provide guidance on how to proceed once the "new normal" arrives.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Aug 3. doi: 10.21956/mep.19136.r27642

Reviewer response for version 1

Hosam Eldeen Gasmalla 1

This review has been migrated. The reviewer awarded 4 stars out of 5.

This article is one of many that, in many ways, puts an emphasis on programmatic assessment; it also shows that we need to adopt more diverse methods and tools. This tool, the open-book examination, can also help focus more on low-stakes formative assessments.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Jul 31. doi: 10.21956/mep.19136.r27648

Reviewer response for version 1

Felix Silwimba 1

This review has been migrated. The reviewer awarded 5 stars out of 5.

This is a welcome alternative approach to student assessment. It is worth exploring.

Reviewer Expertise:

NA


MedEdPublish (2016). 2020 Jul 31. doi: 10.21956/mep.19136.r27644

Reviewer response for version 1

Hebat Allah A Amin 1

This review has been migrated. The reviewer awarded 5 stars out of 5.

This is an interesting article addressing a real problem. I think sharing the experience through a case study or a step-by-step guide would be more fruitful and allow the dissemination of the experience. Thanks for the important topic.

Reviewer Expertise:

NA


