Short abstract
With this issue we begin print publishing the responses received in our call for Medical Education Adaptations: Lessons learned from educators' experiences rapidly transforming practice on account of COVID‐19 related disruption.
1. WHAT PROBLEMS WERE ADDRESSED?
Despite the coronavirus disease 2019 (COVID‐19) pandemic, it remains necessary to develop good assessment tools to evaluate the competence of our students. Our medical education department therefore converted its face‐to‐face workshops on 'how to create multiple choice questions (MCQs)' into online ones. This was the first time our medical education department had delivered MCQ‐writing training online.
2. WHAT WAS TRIED?
Cognitive apprenticeship concepts were adopted when designing and implementing the training workshop. We began with full instructions on how to use Zoom™ cloud (Zoom Video Communications Inc., San Jose, CA, USA), delivered through a video and a technical training session offered before the workshop. A video‐recorded lecture explained the guidelines adopted by the assessment unit regarding MCQ design. A document addressing frequently asked questions about MCQ design was also disseminated 2 days before the synchronous meeting. During the meeting, we started with 'modelling': a short presentation on how to write a high‐quality MCQ. Then, 'coaching' and 'scaffolding' were achieved by dividing participants into groups based on their specialty, with each group having an online coach (facilitator). Using breakout rooms in Zoom cloud, each group worked together and received tips from their coach on correcting poorly constructed MCQs. One coach was assigned to each group of 3‐4 trainees. 'Reflection' was then performed individually and asynchronously: each trainee was emailed a file of MCQs they had previously designed, with a request to critique and reflect on their quality. Upon returning the file, the trainee received feedback. Additionally, 'articulation' was performed by asking learners to write down step‐by‐step instructions on how to modify their previously constructed MCQs, leading to a second round of feedback. Finally, learners engaged in 'exploration': they were asked to create new MCQs and send them via email for further feedback. The quality of the newly designed questions was judged against the checklist we had adopted for assessing MCQ quality, which revealed that the newly designed MCQs were higher in quality than those submitted pre‐workshop.
3. WHAT LESSONS WERE LEARNED?
Well‐designed hands‐on activities can change faculty members' minds.
Offering trainees multiple and varied opportunities for hands‐on practice with constructive feedback can improve the outcomes of faculty development activities.
It is important to provide appropriate scaffolding to faculty members as they perform assigned activities, in order to support their learning.
Providing additional opportunities to practise (and to check the ability to transfer newly acquired knowledge and skills) is valuable.
Unexpectedly, the online workshop not only achieved what previous face‐to‐face training sessions had; trainees also reported fewer distractions, and better participation was achieved by conducting the workshop online.