PLOS One
. 2022 Aug 19;17(8):e0272927. doi: 10.1371/journal.pone.0272927

Students and examiners perception on virtual medical graduation exam during the COVID-19 quarantine period: A cross-sectional study

Nazdar Ezzaddin Alkhateeb 1,*, Baderkhan Saeed Ahmed 2, Namir Ghanim Al-Tawil 3, Ali A Al-Dabbagh 1
Editor: Vijayalakshmi Kakulapati
PMCID: PMC9390930  PMID: 35984844

Abstract

Background

With the emergence of the COVID-19 pandemic and the lockdown approach adopted all over the world, conducting assessments while maintaining integrity became a major challenge. This article aims to share the experience of conducting an online assessment with the academic community and to assess its effectiveness from both examiners’ and students’ perspectives.

Methods

An online assessment was carried out for the final-year medical students of Hawler Medical University, Iraq, during the lockdown period of the COVID-19 pandemic in June 2020. An online questionnaire was then sent to a sample of 61 examiners and 108 students who had been involved in the assessment process. Mann-Whitney and Kruskal-Wallis tests were used to compare the mean ranks of the overall satisfaction scores between categories of students and examiners. Categorical data were summarized and presented as frequencies and percentages.

Results

The response rates among examiners and students were 88.5% and 69.4%, respectively. The majority of the examiners were generally satisfied with the online examination process, compared to only around a third of the students. However, both examiners and students agreed that the online examination was not suitable for assessing physical examination skills.

Conclusion

Online assessment can be considered a good and acceptable alternative method for medical students’ assessment in unforeseen emergencies, though it was not suitable for testing physical examination skills.

Introduction

The COVID-19 pandemic has affected every part of society around the world; it has caused the largest interruption of education systems in history, involving almost 1.6 billion learners in over 200 countries [1]. The standard provision of undergraduate medical education had to change during this global crisis, and it is important to develop innovative approaches to assessment that uphold the standards of medical education while accounting for the social and environmental constraints imposed by the pandemic [2].

Assessment is the measurement of students’ learning, and the way students are assessed usually influences what and how they learn. In many countries, clinical and written examinations have been suspended, cancelled, or replaced by online examinations or new methods of assessment, including comprehensive high-stakes online exams [3], a modified national Objective Structured Clinical Examination (OSCE) [4], high-stakes modified OSCEs [5], web-based OSCEs [6], online oral examinations [7], and online assessment [8].

The College of Medicine at Hawler Medical University considered it essential to set up a modified online examination, not only to maintain the students’ academic progress but also to provide enough medical graduates. Therefore, this study aims to share the experience of implementing a comprehensive online examination for final-year students at the college during the pandemic and to assess its effectiveness and the satisfaction of examiners and students.

Materials and methods

Study design and setting

This was a cross-sectional study performed at the College of Medicine, Hawler Medical University, Kurdistan Region/Iraq during June 2020.

Study participants and sampling

The study population comprised 72 examiners and 150 sixth-year students who were involved in the online assessment. The sample size was estimated using the Epi Info 7 program (free software issued by WHO and CDC) with the following inputs: the population sizes mentioned above, a 95% confidence level, an estimated prevalence of 50%, and an absolute precision of 5%. Accordingly, the estimated sample size was 108 students and 61 examiners. Simple random sampling was used to select the samples from the examiner and student populations. The aim of the study was clarified to the participating students and examiners, and their consent was taken.
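The sample sizes above follow from the standard finite-population sample-size formula (Cochran’s formula with a finite-population correction), which is what Epi Info applies under these settings. The following Python sketch is an illustrative reconstruction, not Epi Info’s actual code:

```python
def sample_size(N, p=0.5, d=0.05, z=1.96):
    """Cochran's formula with finite-population correction.

    N -- population size
    p -- estimated prevalence (0.5 is the most conservative choice)
    d -- absolute precision
    z -- z-score for the chosen confidence level (1.96 for 95%)
    """
    num = N * z**2 * p * (1 - p)
    den = d**2 * (N - 1) + z**2 * p * (1 - p)
    return round(num / den)

print(sample_size(150))  # students -> 108
print(sample_size(72))   # examiners -> 61
```

With the study’s inputs (95% confidence, 50% prevalence, 5% precision), the formula reproduces the reported samples of 108 students and 61 examiners.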

Structure of the online assessment

In the years before the COVID-19 pandemic, the final-year medical students’ assessment was composed of two parts: an Objective Structured Clinical Examination (OSCE), which assessed skills, and a written component in the form of single best answer (SBA) and extended matching questions (EMQ), which assessed knowledge.

The attitude of the students is assessed as part of daily evaluation throughout all courses. The college developed a comprehensive multiform examination that includes all disciplines (pediatrics, obstetrics and gynecology, medicine, and surgery). This examination represents 40% of the students’ final graduation average grade in the MBChB program; the remaining 60% comes from students’ grades in years 1–5.

During the COVID-19 lockdown, a final assessment with the physical presence of students was not possible, so the curriculum and assessment committees of the college modified the assessment process, taking the examiners’ opinions into consideration.

Planning and preparation

A committee (including the dean, the heads of the clinical departments, the head of the Medical Education department, and the sixth-year director) was formed to decide on the forms and number of stations and the competencies to be assessed, to ensure that the blueprint properly sampled the essential parts of the curriculum, and to orient students and examiners on the final exam process during the COVID-19 pandemic. A grading score checklist was prepared for all sets of questions.

The 6th year director prepared 18 examination panel teams. Each team was composed of four members from major clinical specialties. A focal point was allocated for each team.

The 6th year director conducted a Zoom meeting with the 18 focal points to describe the implementation process, and discuss challenges expected and possible solutions.

To reduce possible internet problems, the exam panel teams were asked to attend the college and to meet the students through a single Zoom account (the focal point’s account).

Description of the online assessment

In usual situations, clinical competencies in history taking, examination, communication, and procedural skills besides data interpretation and management competencies were assessed through OSCE stations.

In the modified online assessment, all the above-mentioned competencies were assessed except for the examination and procedural skills competencies.

The oral exam was conducted as an online Zoom meeting between the student and the examination panel. Students were randomly distributed into eighteen online groups of 8–9 students each. As a result, the first student in each of the 18 groups was examined with the same set of questions, the second with the next set, and so on (Fig 1).

Fig 1. Examination panels setup.

Fig 1

All students assigned #1 in the 18 groups received the same set of questions; the same applied to students #2–8. Two extra sets of questions were kept in reserve to overcome any problems.

The blueprint was organized according to the competencies for each round of the student exams, and the questions consisted of emergency patient scenarios from different disciplines and different systems.

For exam security, 10 different sets of questions were prepared according to the blueprint competencies. The 9th and 10th sets were prepared as backups in case a connection problem arose during the exam. Any student who experienced a connection problem was shifted to the end of the schedule and given a new set of questions (the 9th or 10th) so as not to affect the sequence of the exam in the other examination panels.
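The question-set rotation described above can be sketched as follows. This is a hypothetical reconstruction of the scheduling logic for illustration only, not software used in the actual exam:

```python
NUM_PANELS = 18
REGULAR_SETS = [1, 2, 3, 4, 5, 6, 7, 8]  # one set per round, shared by all 18 panels
BACKUP_SETS = [9, 10]                    # reserved for students with connection problems

def question_set(round_index, disconnected=False, backups=BACKUP_SETS):
    """Return the question set for a given round.

    A disconnected student is moved to the end of the schedule and
    draws from the backup sets, so the other panels' sequence is
    unaffected."""
    if disconnected:
        return backups[0]
    return REGULAR_SETS[round_index]

# In any given round, every panel uses the same set simultaneously,
# so a set is never reused after students have seen it:
assert all(question_set(3) == 4 for _ in range(NUM_PANELS))
```

Running all panels in lockstep on the same set, then retiring it, is what keeps an exposed question set from benefiting later students.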

Implementation

To compensate for limited technical competence among some examiners, each examination panel contained one faculty member who was proficient with the Zoom application; that member was responsible for admitting students to the Zoom meeting, controlling the waiting list, and sharing the screen that showed the questions to the students. Student answers were graded according to a designed checklist.

The sequence of asking questions was also the same in each examination panel starting from Surgery case followed by Medicine, Obstetrics & Gynecology, and Pediatrics.

Each exam panel examined 8 or 9 students, with all panels working at the same time and using the same questions for the same round. The question set was changed with each student to maintain the security of the exam (Fig 2). The examination time for each student was 20 minutes: 5 minutes for each branch.

Fig 2. Examination process.

Fig 2

(A) Sequence of discipline questions asked by the examination panel; (B) question sets according to the assigned student numbers in each examination panel, i.e., 8 sets of questions used by each examination panel for the 8 students.

Stakeholder evaluation of the process

The online assessment process was evaluated from both the students’ and the examiners’ perspectives through a questionnaire designed by one of the authors on the basis of a literature review. The questionnaire was prepared and distributed to both examiners and students through a Google Form. It comprised 9 items for examiners and 8 for students, rated on a five-point Likert scale from 1 (strongly disagree) to 5 (strongly agree). The items related to the participants’ experience and satisfaction with the setting, implementation, and environment of the online assessment. Three open-ended questions were added to elicit the most commonly liked and disliked aspects of the online examination and suggestions to improve it. In addition, one more item was included in the students’ questionnaire about whether they faced any connection problem during the online assessment. An overall score for each individual was calculated by summing their Likert-scale scores across all items [4]. Internal consistency reliability of the responses was evaluated with Cronbach’s alpha (α).
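The overall score and Cronbach’s alpha are standard computations. As a hedged illustration (the Likert responses below are made up, not the study data), they can be reproduced as:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: items is a 2-D array,
    rows = respondents, columns = Likert items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Overall score per respondent = sum of that person's item ratings.
responses = np.array([[4, 5, 4],
                      [2, 2, 3],
                      [5, 4, 5],
                      [3, 3, 3]])
overall = responses.sum(axis=1)   # -> [13, 7, 14, 9]
```

Alpha rises toward 1 as the items covary (respondents who rate one item highly rate the others highly too), which is the sense in which the reported values of 0.883 and 0.881 indicate good internal reliability.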

Data analysis

The data obtained from the Google Forms were exported to statistical software, the Statistical Package for the Social Sciences (SPSS, version 25). Categorical data were summarized and presented as frequencies and percentages. Mann-Whitney and Kruskal-Wallis tests were used to compare the mean ranks of the overall satisfaction scores between categories of students and examiners. Thematic analysis was performed on the open-ended questions. P-values <0.05 were regarded as significant.
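The Mann-Whitney test compares two groups by ranking their pooled scores; its core U statistic counts, over all cross-group pairs, how often one group’s score exceeds the other’s. This pure-Python sketch shows the statistic only (the toy scores are illustrative, not the study data; in practice SPSS, or `scipy.stats.mannwhitneyu`, also derives the p-value):

```python
def mann_whitney_u(a, b):
    """U statistic for sample a versus sample b:
    the number of pairs (x, y), x from a and y from b,
    with x > y, plus half the tied pairs."""
    wins = sum(1 for x in a for y in b if x > y)
    ties = sum(1 for x in a for y in b if x == y)
    return wins + 0.5 * ties

# Toy overall-satisfaction scores for two groups:
male = [23, 30, 25]
female = [26, 24, 31]
u = mann_whitney_u(male, female)  # -> 3.0
```

U ranges from 0 (every score in `a` below every score in `b`) to `len(a) * len(b)` (the reverse); values near the middle of that range, as here, indicate heavily overlapping groups.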

Ethical consideration

Ethical approval was obtained from the ethics committee of Hawler Medical University (ethical approval number 8, dated 20/4/2021). Written informed consent to provide feedback on the online assessment was obtained through the Google Form. Students’ identities were not disclosed, and the confidentiality of the participants’ information was maintained.

Results

Data from 75 students and 54 examiners were available for analysis. The response rate for the electronic questionnaire was 69.4% for students and 88.5% for examiners.

The overall score for each individual was calculated, along with means and standard deviations. There were no significant differences in the mean ranks of the total scores between male and female students (p = 0.392) or between male and female examiners (p = 0.095), nor between the different specialties (p = 0.392) (Table 1).

Table 1. Mean satisfaction scores of the online assessment evaluation scales by gender and specialty.

Students
Gender: Male, n = 27, mean 23.9 (SD 6.9); Female, n = 48, mean 25.3 (SD 6.7); p = 0.392*

Examiners
Gender: Male, n = 26, mean 34.46 (SD 6.5); Female, n = 28, mean 33.25 (SD 5.7); p = 0.095*
Specialties: Medicine, n = 3, mean 35.7 (SD 0.6); Obstetrics and gynecology, n = 20, mean 34.5 (SD 5.2); Pediatrics, n = 15, mean 33.5 (SD 6.1); Surgery, n = 16, mean 33 (SD 7.6); p = 0.392**

*Mann-Whitney test

**Kruskal-Wallis test

The mean (SD) of the examiners’ overall satisfaction score was 3.76 (0.67) out of 5 which was significantly (p<0.001) higher than that of students (3.1 (0.85)).

The student online assessment survey had good internal reliability (Cronbach’s alpha = 0.883). Student responses most often fell in the "Agree" category, except for the statement "the assessment process evaluated the clinical skills of the students", with which only 10.7% agreed. According to 53.3% of students, the time allocated for answering questions was sufficient. About half of the students agreed that the examiners were professional and that the assessment questions were clear. Overall, 37.3% of the students were satisfied with the assessment process (Table 2).

Table 2. Students’ perception about the assessment process.

  Disagree and Strongly Disagree Neutral Agree and Strongly Agree
Indicators No. (%) No. (%) No. (%)
Students received comprehensive instructions on the online assessment process. 22 (29.3) 20 (26.7) 33 (44.0)
There was sufficient communication between the school administration and students. 20 (26.7) 26 (34.7) 29 (38.7)
The format of the online assessment was acceptable. 23 (30.7) 25 (33.3) 27 (36.0)
The examiners were professional. 15 (20.0) 22 (29.3) 38 (50.7)
The assessment questions were clear. 18 (24.0) 19 (25.3) 38 (50.7)
The conducted assessment evaluated students’ clinical skills. 40 (53.3) 27 (36.0) 8 (10.7)
The time allocated to answer each (medicine, surgery, OBGYN, pediatrics) was enough. 17 (22.7) 18 (24.0) 40 (53.3)
Overall, you were satisfied with the assessment process. 23 (30.7) 24 (32.0) 28 (37.3)

Notably, 33 (44%) of the students faced problems during the assessment: electricity shutdown (21.2%), internet connection failure (45.5%), or both (33.3%). Most of these issues were solved immediately; only 9 of the 33 students (27.3%) waited until the end of the exam, when the extra sets of questions were used for them.

Table 3 shows the main themes and subthemes that emerged from the students’ perceptions of the pros and cons of the online assessment. The two main areas highlighted as pros were the questions and format of the exam and the professional behavior of the examiners. The students mostly liked the type of questions, which were practical and reasonably covered common clinical and emergency cases (34.6%). The two main themes among the cons related to administrative issues and the questions and format of the assessment. The areas students disliked most were long waiting times and delays (22.7%), unfair distribution of marks (22.7%), and connection problems (20%). Other subthemes are presented in detail in Table 3.

Table 3. Areas that the students like (Pros) and dislike (Cons) regarding the assessment process.

Themes Subthemes No. of responses (%) N = 75
The most commonly liked aspects of the exam (Pros)
Questions and format of assessment • The questions were practical and reasonable covering common clinical and emergency problems. 26 (34.6)
• Students liked the online assessment as a new and safe solution. 11 (14.7)
Examiners behavior • The examiners behaved professionally. 18 (24.0)
Nothing specific. 18 (24.0)
The most commonly disliked aspects of the exam* (Cons)
Administration issues • Long waiting time and delay 17 (22.7)
• Connection problem 15 (20.0)
• Poor communication with the examiners (voice and face were not clear). 6 (8.0)
• Unfair distribution of questions and marks as only 20% was dedicated for the final exam 17 (22.7)
• The online assessment process 7 (9.3)
• Preparedness and organization for the assessment process 2 (2.7)
Questions and format of the assessment • The questions were not clear, and many of them were not emergency cases. 9 (12.0)
• Clinical skills cannot be assessed 5 (6.7)
• Short time allocated for the assessment process 5 (6.7)
Nothing specific 6 (8.0)

*Students may have more than one disliked area.

Around one quarter (24%) of the students had no specific suggestion for improvement; 17.3% suggested a method of assessment other than online, 13.3% suggested better training for students and examiners before the exam, and 13.3% suggested accurate timing and more time. The other suggestions are listed in Table 4.

Table 4. Students’ suggestions for improvement.

  No. (%)
No specific suggestion 18 (24.00)
To find a method of assessment, other than the online 13 (17.33)
Better training for the students and examiners before the exam 10 (13.33)
Accurate timing, and to give more time 10 (13.33)
Questions should focus on emergency conditions 8 (10.67)
Same closed-ended questions for all the students, more questions, and more examiners 8 (10.67)
Better internet and website 5 (6.67)
Others 3 (4.00)
Total 75 (100.00)

Sixty-one examiners were asked to participate in the study, and 54 (88.5%) did. Around half (48.1%) were male. The participating teaching staff belonged to the departments of medicine, surgery, pediatrics, and obstetrics and gynecology. The examiners’ online assessment survey was found to have good internal reliability (Cronbach’s alpha = 0.881).

It is evident in Table 5 that the majority of the examiners were generally satisfied with the online examination process; they agreed on all the items raised by the researchers except ’assessment of the clinical competence’, where many thought that the online exam could not assess clinical competence and did not reflect real clinical practice.

Table 5. Examiners’ perception about the examination process.

Disagree and Strongly Disagree Neutral Agree and Strongly Agree
The items No. (%) No. (%) No. (%)
You were clearly informed about the assessment process 3 (5.6) 11 (20.4) 40 (74.1)
During the assessment, the technical support by the cohost was adequate 3 (5.6) 12 (22.2) 39 (72.2)
During the assessment, health regulation, physical distancing, and mask wearing on the day of the exam (at college) was adequate. 6 (11.1) 11 (20.4) 37 (68.5)
Exam duration was acceptable. 1 (1.9) 7 (13.0) 46 (85.2)
Assessment reflected real clinical practice. 11 (20.4) 28 (51.9) 15 (27.8)
The question reflected proper sampling from the curriculum 2 (3.7) 18 (33.3) 34 (63.0)
You were satisfied with the provided assessment checklist 8 (14.8) 9 (16.7) 37 (68.5)
The online assessment could assess clinical competence. 17 (31.5) 14 (25.9) 23 (42.6)
Organization of the whole process met your expectation 3 (5.6) 12 (22.2) 39 (72.2)

Organization of the exam and teamwork were the areas the examiners liked most (38.9% and 27.8%, respectively). Four examiners (7.4%) liked the fairness of the exam, and another 7.4% liked the type of questions, which were clinical and covered the curriculum. The most disliked areas were the inability of the exam to reflect the clinical skills of the students (22.2%), connection problems (18.5%), and poor organization (14.8%). Details of the examiner-related themes and subthemes are presented in Table 6.

Table 6. Areas that the examiners like (Pros) and dislike (Cons) regarding the assessment process.

Themes Subthemes No. of responses (%)
The most commonly liked aspects of the exam (Pros)
Administration issues
• Organization of the examination process 21 (38.9)
• Teamwork 15 (27.8)
• The time of the exam was known for the students 3 (5.6)
• A new experience in assessing the students 1 (1.9)
Questions and format of assessment • Fair 4 (7.4)
• The questions were clinical, covering the curriculum 4 (7.4)
Safety • Being online, so no chance to get the COVID-19 3 (5.6)
Nothing specific 3 (5.6)
The most commonly disliked aspect of the exam (Cons)
Administration issues • Poor organization 8 (14.8)
• Connection problems 10 (18.5)
• No social distancing 1 (1.9)
Questions and format of assessment • It did not reflect the clinical skills of the students 12 (22.2)
• Direct questions that can’t differentiate between students 3 (5.6)
• Absence of a detailed checklist for each case 4 (7.4)
• The appearance of the answer key for some questions 3 (5.6)
Examiners • Not every examiner was involved in preparing the questions 2 (3.7)
• Delay of the examiners in attending on time 1 (1.9)
Nothing specific 10 (18.5)
Total 54 (100)

More than one third (36.4%) of the suggestions concerned better organization and having more questions. Nine (16.4%) suggestions concerned training the examiners before the exam, and eight (14.5%) suggested an on-campus exam with safety measures implemented where possible. The other suggestions are presented in Table 7.

Table 7. Examiners’ suggestions for improvement.

Suggestions No. (%)
Better organization, better preparation for the exam, and more questions. 20 (36.4)
Training of the examiners before the exam 9 (16.4)
On-campus assessment with the use of safety measures 8 (14.5)
Nothing specific 7 (12.7)
Ongoing assessment should have a role in the assessment process 3 (5.5)
Better internet connection 2 (3.6)
Involve more examiners in the process 2 (3.6)
Use of a detailed checklist for one of the departments 1 (1.8)
Use the same process of assessment in the future 1 (1.8)
Assessment by an individual examiner (not in committee) 1 (1.8)
Share the exam experience with other colleges 1 (1.8)
Total* 55 (100.0)

Note. *More than one suggestion is possible for each examiner.

Discussion

Online assessment is an unusual method for assessing graduating medical students, and careful decision-making and preparation, with the involvement of administration and teachers, are essential. This study documents one experience of online assessment during the COVID-19 lockdown.

The COVID-19 pandemic has affected medical education in many aspects. Creative ways of assessment were important to maintain the standards of medical education during the lockdown time [9], especially the assessment of clinical competencies which is a challenging area needing innovative modification [10].

As the lockdown was imposed by the governments of Iraq and the Kurdistan Region, the Ministry of Higher Education and Scientific Research decided in March 2020 to replace in-person classes with online equivalents in order to continue learning and save the academic year. These challenges were particularly distressing for final-year medical students awaiting their graduation assessments [11].

At the College of Medicine, Hawler Medical University, the college council adapted the distribution of assessment marks, assigning 80% of the total course mark to coursework-related activities and the remaining 20% to the final assessment, to ensure a fair ranking of students. A similar approach was taken in Jordan [12].

As a face-to-face final OSCE was prohibited at the start of the pandemic, the OSCE blueprint, instructions, and structured marking schedules were adapted. Physical examination and procedural skills could not be assessed in the absence of simulated or real patients, so an objective structured viva examination was planned instead. Questions were based on emergency-oriented short case vignettes from all major disciplines. Technological developments and access to a variety of digital tools, such as Zoom, Google Meet, Microsoft Teams, and WebEx, have supported education during the COVID-19 era throughout the globe. These tools enabled sharing of clinical photographs, laboratory results, and imaging via the screen-sharing feature, and facilitated synchronous assessment of clinical reasoning and data interpretation through examiner questioning. Problem-based questions enabled the assessment of clinical reasoning and higher-order thinking skills [13, 14]. Hawler Medical University used Zoom for the final-year assessment because staff and students had gained experience with it during the early COVID-19 period.

In several countries, clinical and theory examinations have been cancelled [9], postponed, or replaced by online or new methods of assessment [15, 16].

The results indicated that the overall satisfaction of both examiners and students with the assessment was high, with the examiners’ satisfaction being higher; similar results were obtained by Tan et al. [17]. On the other hand, concerns about the number of questions used, the mark allocated to the final assessment, technical problems, and poor nonverbal communication on video conferencing were disliked by a fraction of students (30.7%) and might have contributed to their dissatisfaction. According to Mak and colleagues, it is hard to capture the full range of nonverbal communication through an online platform because of screen size, the position of the student on the screen, and impaired video quality due to internet problems [18]. This also applied to our students, a good percentage of whom joined using mobile-phone screens. The online assessment process faces many implementation challenges, including connection problems [19]; variation in household internet access for both examiners and students was an expected problem, especially in a country like Iraq. To minimize this concern, two measures were taken. First, the examiners in each examination panel conducted the examination as a committee on campus, benefiting from the fact that the examiners (who are also physicians) had permission from the local authorities for free movement during the lockdown, so only the students needed to be at home. Second, if a connection problem occurred for any student, an extra set of questions was available for use at the end of the exam so as not to affect the sequence of the exam, as explained in detail in the methodology.

Exam security was another challenge, as it was necessary to ensure that students had met the required learning outcomes. Threats to assessment security can be alleviated by designing assessment methods that are resistant to cheating [20].

To overcome this problem, 18 committees conducted the exam at the same time and changed the question set with each round, as clarified in Fig 1, and a Zoom room with the waiting-room feature enabled was created for every committee.

A reduced chance of cheating was cited by many students as one of the aspects they liked about the assessment process; however, it led to longer waiting times for their turn to enter the Zoom room.

When using technology in a low-technology context, new approaches to solving local problems should be restricted to what is possible under the circumstances [21]. Poor technical ability of some examiners to use IT, especially Zoom (this exam was conducted at the beginning of COVID-19, in the 2019–2020 academic year), together with the lack of IT personnel in the college, put extra pressure on the assessment committee and the administrative team. This problem was alleviated by choosing 18 examiners who were skilled with technology, specifically Zoom, to act as focal points for the examining committees.

In medical education, it is important to prepare medical students for real-life scenarios by designing authentic assessments [15, 22] that ensure students’ ability to deal with emergency cases. Therefore, all the prepared case scenarios reflected real emergency cases. Nevertheless, around one third of examiners and nearly half of students disagreed that this online assessment evaluated clinical skills.

A strength of this study is that it reports both examiners’ and students’ perceptions of the online assessment experience. The mean overall satisfaction score of examiners was significantly higher than that of students. This agrees with Elshami et al., where satisfaction rates among students and examiners were 41.3% and 74.3%, respectively [23].

Limitations: This study examines a single institution’s experience, but it is important to share the modifications made to the assessment process during the pandemic with medical educators throughout the globe, especially in low-resource countries. Although it was not possible for students to practice a mock final online assessment because of the uncertainty at the beginning of the pandemic, good preparation, planning, prediction of expected problems and their solutions by the focal points, and effective communication with the students played a major role in the success of the process.

In conclusion, the findings of this study suggest that assessment via alternative methods during the lockdown has opened examiners’ eyes to the value of such methods even in normal situations.

There is therefore a definite need for information and communication technology training, including the effective use of distance-education tools, in student-oriented teaching and assessment strategies for all faculty members. This could take the form of an obligatory online course for preparing new university examiners and equipping them with digital tools. An important area for future research is to explore stakeholders’ perceptions of online assessments using focus groups and qualitative methods.

Supporting information

S1 File

(PDF)

S2 File

(PDF)

Acknowledgments

The authors appreciate all medical students and examiners who spent their time to participate in the study.

Data Availability

All relevant data are within the paper and its Supporting Information files.

Funding Statement

The authors received no specific funding for this work.

References

  • 1.Sani I, Hamza Y, Chedid Y, Amalendran J, Hamza N. Understanding the consequence of COVID-19 on undergraduate medical education: Medical students’ perspective. Ann Med Surg. 2020;58: 117–119. doi: 10.1016/j.amsu.2020.08.045 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Huber SG, Helm C. COVID-19 and schooling: evaluation, assessment and accountability in times of crises—reacting quickly to explore key issues for policy, practice and research with the school barometer. Educ Assessment, Eval Account. 2020;32: 237–270. doi: 10.1007/s11092-020-09322-y [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Khalaf K, El-Kishawi M, Moufti MA, Al Kawas S. Introducing a comprehensive high-stake online exam to final-year dental students during the COVID-19 pandemic and evaluation of its effectiveness. Med Educ Online. 2020;25(1). doi: 10.1080/10872981.2020.1826861 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Hytönen H, Näpänkangas R, Karaharju-Suvanto T, Eväsoja T, Kallio A, Kokkari A, et al. Modification of national OSCE due to COVID-19-Implementation and students’ feedback. Eur J Dent Educ. 2021;00: 1–10. doi: 10.1111/eje.12646 [DOI] [PubMed] [Google Scholar]
  • 5.Boursicot K, Kemp S, Ong TH, Wijaya L, Goh SH, Freeman K, et al. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish. 2020;9:54. doi: 10.15694/mep.2020.000054.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Major S, Sawan L, Vognsen J, Jabre M. COVID-19 pandemic prompts the development of a Web-OSCE using Zoom teleconferencing to resume medical students’ clinical skills training at Weill Cornell Medicine-Qatar. BMJ Simulation and Technology Enhanced Learning. BMJ Publishing Group; 2020. pp. 376–377. doi: 10.1136/bmjstel-2020-000629 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Akimov A, Malin M. When old becomes new: a case study of oral examination as an online assessment tool. Assess Eval High Educ. 2020;45: 1205–1221. doi: 10.1080/02602938.2020.1730301 [DOI] [Google Scholar]
  • 8.Elzainy A, El Sadik A, Al Abdulmonem W. Experience of e-learning and online assessment during the COVID-19 pandemic at the College of Medicine, Qassim University. J Taibah Univ Med Sci. 2020;15: 456–462. doi: 10.1016/j.jtumed.2020.09.005 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Papapanou M, Routsi E, Tsamakis K, Fotis L, Marinos G, Lidoriki I, et al. Medical education challenges and innovations during COVID-19 pandemic. Postgrad Med J. 2022;98: 321–327. doi: 10.1136/postgradmedj-2021-140032 [DOI] [PubMed] [Google Scholar]
  • 10.Nimavat N, Singh S, Fichadiya N, Sharma P, Patel N, Kumar M, et al. Online Medical Education in India – Different Challenges and Probable Solutions in the Age of COVID-19. Adv Med Educ Pract. 2021;12: 237–243. doi: 10.2147/AMEP.S295728 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Choi B, Jegatheeswaran L, Minocha A, Alhilani M, Nakhoul M, Mutengesa E. The impact of the COVID-19 pandemic on final year medical students in the United Kingdom: a national survey. BMC Med Educ. 2020;20: 206. doi: 10.1186/s12909-020-02117-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Elsalem L, Al-Azzam N, Jum'ah AA, Obeidat N. Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences. Ann Med Surg. 2021;62: 326–333. doi: 10.1016/j.amsu.2021.01.054 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Varutharaju E, Ratnavadivel N. Enhancing higher order thinking skills through clinical simulation. Malaysian J Learn Instr. 2014;11: 75–100. doi: 10.32890/mjli.11.2014.7666 [DOI] [Google Scholar]
  • 14.Kansal AK, Gautam J, Chintalapudi N, Jain S, Battineni G. Google Trend Analysis and Paradigm Shift of Online Education Platforms during the COVID-19 Pandemic. Infect Dis Rep. 2021;13: 418–428. doi: 10.3390/idr13020040 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.O’Byrne L, Gavin B, McNicholas F. Medical students and COVID-19: the need for pandemic preparedness. J Med Ethics. 2020;46: 623–626. doi: 10.1136/medethics-2020-106353 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Birch E, Wolf M de. A novel approach to medical school examinations during the COVID-19 pandemic. Med Educ Online. 2020;25. doi: 10.1080/10872981.2020.1785680 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Tan CK, Chua WL, Vu CKF, Chang JPE. High-stakes examinations during the COVID-19 pandemic: To proceed or not to proceed, that is the question. Postgrad Med J. 2021;97: 427–431. doi: 10.1136/postgradmedj-2020-139241 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Mak V, Krishnan S, Chuang S. Students’ and Examiners’ Experiences of Their First Virtual Pharmacy Objective Structured Clinical Examination (OSCE) in Australia during the COVID-19 Pandemic. Healthcare. 2022;10:328. doi: 10.3390/healthcare10020328 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Guangul FM, Suhail AH, Khalit MI, Khidhir BA. Challenges of remote assessment in higher education in the context of COVID-19: a case study of Middle East College. Educ Assessment, Eval Account. 2020;32: 519–535. doi: 10.1007/s11092-020-09340-w [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Gamage KAA, de Silva EK, Gunawardhana N. Online delivery and assessment during COVID-19: Safeguarding academic integrity. Educ Sci. 2020;10(11):301. doi: 10.3390/educsci10110301 [DOI] [Google Scholar]
  • 21.Yapa HM, Bärnighausen T. Implementation science in resource-poor countries and communities. Implement Sci. 2018;13: 1–13. doi: 10.1186/S13012-018-0847-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Karakus A, Şenyer N. The preparedness level of final year medical students for an adequate medical approach to emergency cases: computer-based medical education in emergency medicine. 2014;7(1):1–6. doi: 10.1186/S13012-018-0847-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Elshami W, Taha MH, Abuzaid M, Saravanan C, Al Kawas S, Abdalla ME. Satisfaction with online learning in the new normal: perspective of students and faculty at medical and health sciences colleges. Med Educ Online. 2021;26(1): 1920090. doi: 10.1080/10872981.2021.1920090 [DOI] [PMC free article] [PubMed] [Google Scholar]

Decision Letter 0

Vijayalakshmi Kakulapati

27 Apr 2022

PONE-D-22-08106
Students and examiners perception on virtual medical graduation exam during the COVID-19 quarantine period: a cross-sectional study.
PLOS ONE

Dear Dr. Alkhateeb,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jun 11 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Vijayalakshmi Kakulapati, Ph.D

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at 

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf.

2. Please provide additional details regarding participant consent. In the Methods section, please ensure that you have specified (1) whether consent was informed and (2) what type you obtained (for instance, written or verbal). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.

3. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. Please ensure that your ethics statement is included in your manuscript, as the ethics statement entered into the online submission form will not be published alongside your manuscript. 

4. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Comments to the Author

Reviewer #1: This article presents a description of the online assessment adopted at Hawler Medical University/Iraq during the lockdown period of the COVID-19 pandemic, mainly for final year medical students, followed by an evaluation based on questionnaires for both examiners and students. Afterwards, a discussion of the outcomes of the questionnaire is presented. 

- One cannot classify this paper as a rigorous study. The outcomes, results and conclusions are quite trivial to the extent that one could have easily predicted them. Not to add that authors themselves found that they had almost the same results with similar studies (lines 261-263 and 295-297).

- The description of the online assessment should be clear and descriptive. This is not the case here. Lines 116–120, which are supposed to explain the distribution of students, are not well presented, nor is Figure 1: I wasn’t able to map the information presented in the paragraph to the figure, and the figure itself is weak. Moreover, Figure 2 is simply not necessary; I cannot see any added value and would remove it. Figure 3 is a bit vague, with no explanation in the caption, and “Examination” is written as “Eamination”.

- As for the statistical part, the number of the participants as mentioned in line 168, (75 students and 54 examiners) is considered not a sufficiently large one to conduct a thorough study. And more importantly, the interpretation of p-value; Are the p-values presented in Table 1 considered ‘good’ enough from a statistical perspective?

- The questionnaire is not comprehensive and questions are very general. I would expect some very specific questions related to the quality/clearness of the X-rays and other pictures/figures, diverse type of tests (blood, urine…), comprehensiveness of case studies… does the scope of the questions cover a substantial area of knowledge in each discipline?

Moreover, here is a list of some minor suggestions, authors may take into consideration:

“Conclusions and recommendations” section is very small; I would merge them with “Discussions”.

Line 22: “while” instead of “with”.

Lines 23: “…aims at sharing…” instead of “…aims to share…”. Same applies to lines 57

Line 31: usage of “had been” is to be revised.

Line 51: “have been” to be removed.

Line 55: “Our medical school”, please specify since it is the first instance, so that reader does not have to refer to the affiliations.

Line 125: “were” instead of “was” in both instances.

Line 131, a comma to be inserted after “faculties”.

Line 132, “proficient” instead of “efficient”

Usage of “had been” to be reviewed in lines 161-166.

In line 206, “Around half (48.1%) were males, and the male: female ratio was 0.92: 1.” Redundancy! either part of the sentence is to be kept not both.

Line 235, “presented” instead of “present”

Line 260, “have been” to be removed.

Line 264, “has” instead of “have”

Line 265, a semicolon instead of a comma.

In general, English should be thoroughly reviewed

Reviewer #2: In this work, the authors conducted a research work that included an online questionnaire to assess the effectiveness of a virtual medical graduation exam from both examiners’ and students’ perspectives. Results showed that the majority of the examiners were generally satisfied with the online examination process, compared to only around a third of the students. Both examiners and students agreed that online examination is not suitable for assessing physical examination skills.

I recommend that researchers behind this work:

1- Add more details on reasons why third of students were not satisfied with the online exam.

2- Use of additional methods including interviews can be helpful to collect new /augment existing data.

3- Researchers seemed to rely more on quantitative methods. I recommend to include qualitative methods as it can capture different types of data/interactions that are hard to collect with questionnaires.

4- Researchers need to add more discussion on other tools/technologies that can be used other than Zoom for examination.

 

PLoS One. 2022 Aug 19;17(8):e0272927. doi: 10.1371/journal.pone.0272927.r002

Author response to Decision Letter 0


17 May 2022

Responses to editor and reviewers’ comments

Comments

Editor 1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming.

Response: Thank you for your comment. The manuscript was prepared according to the journal style (figures and tables).

2. Please provide additional details regarding participant consent. In the Methods section, please ensure that you have specified (1) whether consent was informed and (2) what type you obtained (for instance, written or verbal). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.

Response: An additional subtitle, “Ethical consideration”, was added to the Methods section; it includes more details on the informed written consent.

All the participants of the study were adults

3. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. Please ensure that your ethics statement is included in your manuscript, as the ethics statement entered into the online submission form will not be published alongside your manuscript.

Response: The ethics statement was moved to the Methods section.

4. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Response: Many thanks for informing me. I found that a retracted paper had been cited by mistake.

Reference no. 3 changed to another reference entitled

” Huber SG, Helm C. COVID-19 and schooling: evaluation, assessment and accountability in times of crises—reacting quickly to explore key issues for policy, practice and research with the school barometer. Educ Assessment Eval Account. 2020;32: 237–270. doi:10.1007/S11092-020-09322-Y/FIGURES/7. PMID: 32837626; PubMed Central PMCID: PMC7286213”

Reviewer #1

1. This article presents a description of the online assessment adopted at Hawler Medical University/Iraq during the lockdown period of the COVID-19 pandemic, mainly for final year medical students, followed by an evaluation based on questionnaires for both examiners and students. Afterwards, a discussion of the outcomes of the questionnaire is presented.

Response: No comment.

2. One cannot classify this paper as a rigorous study. The outcomes, results and conclusions are quite trivial to the extent that one could have easily predicted them. Not to add that authors themselves found that they had almost the same results with similar studies (lines 261-263 and 295-297).

Response: Thank you for your comment. The study was done at the beginning of the COVID-19 pandemic, when institutions all over the world, especially in developing countries with low resources and limited information technology, were struggling to find a way to assess final-year medical students; it was therefore necessary to share the experience of conducting an online assessment in such circumstances.

3. The description of the online assessment should be clear and descriptive. This is not the case here. Lines 116–120, which are supposed to explain the distribution of students, are not well presented, nor is Figure 1: I wasn’t able to map the information presented in the paragraph to the figure, and the figure itself is weak. Moreover, Figure 2 is simply not necessary; I cannot see any added value and would remove it. Figure 3 is a bit vague, with no explanation in the caption, and “Examination” is written as “Eamination”.

Response: Thank you for your comment. We agree that the figures were not very clear; we revised the paragraph (lines 116–120) to make it clearer, added figure legends to both Figure 1 and Figure 3, and removed Figure 2.

The revised part is marked with track changes.

4. As for the statistical part, the number of the participants as mentioned in line 168, (75 students and 54 examiners) is considered not a sufficiently large one to conduct a thorough study. And more importantly, the interpretation of p-value; Are the p-values presented in Table 1 considered ‘good’ enough from a statistical perspective?

Response: As the reviewer mentioned, the number of participants was 75 students and 54 examiners.

The sample size calculation is mentioned in lines 71–80; the response rates among examiners and students were 69.4% and 88.5% respectively, which is considered high.

The p-values in Table 1 showed non-significant differences.

5. The questionnaire is not comprehensive and questions are very general. I would expect some very specific questions related to the quality/clearness of the X-rays and other pictures/figures, diverse type of tests (blood, urine…), comprehensiveness of case studies… does the scope of the questions cover a substantial area of knowledge in each discipline?

Response: The reviewer is absolutely right that the questionnaire questions were very general and missed specific questions. This was because the aim of this study was to evaluate the first online final-year assessment conducted in a low-resource country, to help the administration plan future steps, and, ultimately, to share the experience of the implemented plan with the medical education community.

6. “Conclusions and recommendations” section is very small; I would merge them with “Discussions”.

Response: The conclusion section was merged with the discussion.

7. Line 22: “while” instead of “with”.

Lines 23: “…aims at sharing…” instead of “…aims to share…”. Same applies to lines 57

Line 31: usage of “had been” is to be revised.

Line 51: “have been” to be removed.

Line 55: “Our medical school”, please specify since it is the first instance, so that reader does not have to refer to the affiliations.

Line 125: “were” instead of “was” in both instances.

Line 131, a comma to be inserted after “faculties”.

Line 132, “proficient” instead of “efficient”

Usage of “had been” to be reviewed in lines 161-166.

In line 206, “Around half (48.1%) were males, and the male: female ratio was 0.92: 1.” Redundancy! either part of the sentence is to be kept not both.

Line 235, “presented” instead of “present”

Line 260, “have been” to be removed.

Line 264, “has” instead of “have”

Line 265, a semicolon instead of a comma.

In general, English should be thoroughly reviewed

Response: Many thanks for your valuable corrections. All the suggested changes were made.

Reviewer #2

1. In this work, the authors conducted a research work that included an online questionnaire to assess the effectiveness of a virtual medical graduation exam from both examiners’ and students’ perspectives.

Response: No comment

2. Add more details on reasons why third of students were not satisfied with the online exam.

Response: Thank you for your helpful comment. The reasons were added in the following paragraph:

“On the other hand, concerns about the number of questions used, the marks allocated for the final assessment, technical problems, and poor nonverbal communication on videoconferencing were the aspects most disliked by a certain fraction of students and might have contributed to their dissatisfaction. According to Mak and colleagues, it is hard to capture the full range of nonverbal communication through an online platform because of screen size (some students used their mobile phones), the position of the student on the screen, and impaired video quality due to internet problems [23]. This was also observed in our students, as a good percentage of them used mobile screens to join the exam.”

3. Use of additional methods including interviews can be helpful to collect new /augment existing data.

4. Researchers seemed to rely more on quantitative methods. I recommend to include qualitative methods as it can capture different types of data/interactions that are hard to collect with questionnaires.

Response to points 3 and 4: Thank you for raising this point. We relied mainly on quantitative data, but open-ended questions were also used and thematic analysis was performed to extract data (Tables 3–6).

5. Researchers need to add more discussion on other tools/technologies that can be used other than Zoom for examination.

Response: The requested discussion was added in the following paragraph:

“Technological development and access to a variety of digital tools have supported the education process throughout the globe in the COVID-19 era; Zoom, Google Meet, Microsoft Teams, and WebEx are examples. Overall, these tools enabled sharing clinical photographs, laboratory results, and imaging using the screen-sharing feature, and facilitated the assessment of the clinical reasoning and data interpretation domains via synchronous examiner questioning. Problem-based questions enabled the assessment of clinical reasoning and higher-order thinking skills [16,17]. Hawler Medical University used Zoom as the tool for the final-year assessment because the staff and students had experience with it from the early COVID-19 period.”

Attachment

Submitted filename: Response to reviewers.docx

Decision Letter 1

Vijayalakshmi Kakulapati

23 Jun 2022

PONE-D-22-08106R1
Students and examiners perception on virtual medical graduation exam during the COVID-19 quarantine period: a cross-sectional study.
PLOS ONE

Dear Dr. Alkhateeb,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

ACADEMIC EDITOR: The manuscript needs improvement; incorporate all reviewer comments and check the language corrections.

==============================

Please submit your revised manuscript by Aug 07 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Vijayalakshmi Kakulapati, Ph.D

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments (if provided):

Need to check language corrections.

Authors are advised to address all review comments.

Comments to the Author

Reviewer #2: The authors addressed my comments and provided clear feedback. The qualitative analysis might need more refinement/details. The authors mentioned conducting thematic analysis, but they didn't reveal the patterns/themes identified in the open-ended questions. How did it help, and what more information did it provide?

PLoS One. 2022 Aug 19;17(8):e0272927. doi: 10.1371/journal.pone.0272927.r004

Author response to Decision Letter 1


15 Jul 2022

Responses to editor and reviewers’ comments

Comments

Editor

1. Need to check language corrections

Response: Thank you for your comment. The manuscript underwent language editing, and grammar mistakes were corrected.

2. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Response: Many thanks for informing me.

Based on the Crossref metadata reference-linking function, some references were removed and some were added.

The removed references were:

Ref. no. 1

Pokhrel S, Chhetri R. A Literature Review on Impact of COVID-19 Pandemic on Teaching and Learning. High Educ Futur. 2021;8: 133–141.

Ref. no.4

Murphy J. Assessment in medical education. Ir Med J. 2007;100: 356. Available: http://archive.imj.ie//ViewArticleDetails.aspx?ContentID=3620

Ref.no.11

Sabzwari S. Rethinking Assessment in Medical Education in the time of COVID-19. MedEdPublish. 2020;9: 1–6.

Ref. no. 12

Shehata MH, Kumar AP, Arekat MR, Alsenbesy M, Mohammed Al Ansari A, Atwa H, et al. A toolbox for conducting an online OSCE. Clin Teach. 2021;18: 236–242.

Ref.no.14

Al-Mendalawi MD. Teaching Paediatrics in Iraq Amid the COVID-19 Pandemic. Sultan Qaboos Univ Med J. 2020;20: e408

Ref. no.19

Rezaei H, Haghdoost A, Javar HA, Dehnavieh R, Aramesh S, Dehgani N, et al. The effect of coronavirus (COVID-19) pandemic on medical sciences education in Iran. J Educ Health Promot. 2021;10. doi:10.4103/JEHP.JEHP_817_20

Ref. no.24

Pettit M, Shukla S, Zhang J, Sunil Kumar KH, Khanduja V. Virtual exams: has COVID-19 provided the impetus to change assessment methods in medicine? Bone Jt open. 2021;2: 111–118.

Ref. no.26

Derbel F. Technologically-Capable Teachers in a Low-Technology Context.

The following references were added

Choi B, Jegatheeswaran L, Minocha A, Alhilani M, Nakhoul M, Mutengesa E. The impact of the COVID-19 pandemic on final year medical students in the United Kingdom: a national survey. BMC Med Educ. 2020;20: 206.

Guangul FM, Suhail AH, Khalit MI, Khidhir BA. Challenges of remote assessment in higher education in the context of COVID-19: a case study of Middle East College. Educ Assessment, Eval Account. 2020;32: 519–535.

Yapa HM, Bärnighausen T. Implementation science in resource-poor countries and communities. Implement Sci. 2018;13: 1–13.

Reviewer #2

1. The authors addressed my comments and provided clear feedback.

Response: No comment

2. The qualitative analysis might need more refinement/details. The authors mentioned conducting thematic analysis, but they didn't reveal the patterns/themes identified in the open-ended questions. How did it help, and what more information did it provide?

Response: Thank you for your helpful comment.

Tables 3 and 6, which present the thematic analysis of the open-ended questions, were modified; themes and subthemes were added to them.

A paragraph was added in the results section as follows

“Table 3 shows the main themes and subthemes that emerged from the students’ perceptions of the pros and cons of the online assessment. The two main areas highlighted among the pros were the questions and format of the exam, and the professional behavior of the examiners. The students mostly liked the type of the questions, which were practical and reasonably covered the common clinical and emergency cases (34.6%). On the other hand, the two main themes that emerged among the cons were related to administration issues and to the questions and format of the assessment. The major areas that students disliked were long waiting time and delay (22.7%), unfair distribution of marks (22.7%), and connection problems (20%). Other subthemes are presented in detail in Table 3.”

Attachment

Submitted filename: Response to reviewer second round.docx

Decision Letter 2

Vijayalakshmi Kakulapati

29 Jul 2022

Students and examiners perception on virtual medical graduation exam during the COVID-19 quarantine period: a cross-sectional study.

PONE-D-22-08106R2

Dear Dr. Alkhateeb,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Vijayalakshmi Kakulapati, Ph.D

Academic Editor

PLOS ONE


Acceptance letter

Vijayalakshmi Kakulapati

10 Aug 2022

PONE-D-22-08106R2

Students and examiners perception on virtual medical graduation exam during the COVID-19 quarantine period: a cross-sectional study.

Dear Dr. Alkhateeb:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Vijayalakshmi Kakulapati

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 File

    (PDF)

    S2 File

    (PDF)

    Attachment

    Submitted filename: Response to reviewers.docx

    Attachment

    Submitted filename: Response to reviewer second round.docx

    Data Availability Statement

    All relevant data are within the paper and its Supporting Information files.


    Articles from PLoS ONE are provided here courtesy of PLOS
