BMJ Simulation & Technology Enhanced Learning. 2018 Mar 23;4(2):47–50. doi: 10.1136/bmjstel-2017-000245

Systematic review of the implementation of audience response systems and their impact on participation and engagement in the education of healthcare professionals

Morkos Iskander 1,2
PMCID: PMC8936989  PMID: 35515888

Abstract

Background

Audience response systems provide a mechanism to engage larger groups as active participants in teaching sessions. However, they are traditionally based on ‘fixed’ closed-loop systems, which limit their functionality to a single geographical location and thus confine their use to universities and other large institutions with a primary focus on education. Conversely, in the healthcare context, the majority of formal education is undertaken through postgraduate training programmes, largely conducted in smaller cohorts in clinical settings.

Objective

The purpose of this review is to evaluate audience response systems in terms of feasibility of implementation and impact on participation within the education of healthcare professionals, in comparison with non-healthcare education.

Study selection

Systematic, structured searches were therefore conducted of the PubMed and Medline databases for healthcare education, and of the Scopus, Education Resources Information Center, British Education Index, Education Abstracts, Education Administration Abstracts and PsycINFO databases for non-healthcare education.

Findings and conclusions

Consistent and fundamental differences were found between the studies evaluating healthcare education and those in other fields, with more difficulties encountered in implementation and a less pronounced impact on engagement. Here we discuss the consequences of these findings for the use of audience response systems and beyond.

Keywords: audience response systems, review, clickers, classroom response systems

Introduction

Postgraduate medical education for junior doctors forms a core part of their continuing professional development and serves to guide the acquisition of core clinical knowledge.1 These seminars are delivered at a local level in individual hospitals to groups of between 20 and 40 junior doctors in the early years of their postgraduate careers. They are traditionally composed of a series of didactic lectures around critical topics, with a final summative component. The formative component largely takes the format of either a collection of key slides from previous presentations or questions on key points directed at individual audience members. The aim is to highlight key aspects of the topics, reinforce the acquisition of knowledge and emphasise areas of application to future practice, thereby leading to higher levels of learning, as discussed in Bloom’s taxonomy of learning.2 The disadvantage of the first approach is that it largely delivers the same information in the same style and can therefore prove unengaging. In the second approach, questions are usually directed to individual members of the audience, leading to an uneven level of engagement across the cohort and excluding the majority of the audience, contrary to evidence that higher levels of engagement lead to higher levels of meaningful learning.3

However, audience response systems, or classroom response systems (CRS), have recently been adopted as an effective way of enhancing levels of engagement, particularly in higher education.4 Their major drawback is the technological infrastructure required to administer and deliver them, which confines their use to larger institutions and universities and, within the context of medical education, effectively limits CRS to undergraduate training, excluding postgraduate seminars.

Advances in technology have recently enabled web-based CRS, such as Poll Everywhere and Conferences i/o among others, which are easy to design and administer as well as free at the point of use, and which may therefore facilitate more widespread adoption. Such platforms have been demonstrated to good effect in the undergraduate setting,5 although no demonstration of CRS on these platforms has been made in the postgraduate healthcare education setting.

This review aims to systematically analyse the available literature, both within and outside healthcare professional education, and to assess phenomenographically the differences between the two contexts.

CRS in education outside healthcare

A systematic review of the Scopus and EBSCOhost databases was conducted, using the search terms ‘audience response system’, ‘electronic voting system’, ‘clickers’ and ‘Poll Everywhere’, with the addition of ‘adult education’, ‘higher education’, ‘undergraduate’ and ‘postgraduate’. On the EBSCO platform the following databases were included: Education Resources Information Center, British Education Index, Education Abstracts, Education Administration Abstracts and PsycINFO. The search was limited to journal articles and reviews published from 2000 onwards. The references of included articles were additionally surveyed for further studies with additional narrative value. These choices were made to capture all relevant publications while maintaining relevance in a technologically dynamic context. Figure 1 presents a summary of the literature search strategy of this review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow chart.6

Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses flow chart of the search strategy on the use of classroom response systems in educational settings outside of healthcare. Adapted from Moher et al.6

Thirty-three studies were reviewed in full, of which 22 either attempted to evaluate the ease of instituting and implementing a CRS or addressed the use of CRS in relation to enhancing participation and engagement. Of these, six studies evaluated the ease of implementation, with all indicating that CRS can be incorporated into an existing teaching programme without difficulty. Interestingly, MacGeorge et al 7 highlighted that anxiety regarding the initial use of CRS exists in lecturers as well as students, with Walklet et al 8 emphasising that additional planning time may be required when utilising CRS. Additionally, Bunz9 and Smith et al 10 suggested that a more simplified system with streamlined features may offer greater ease at the point of use; Graham et al 11 added that increased lecturer experience with the technology may facilitate the integration of CRS more effectively into the teaching programme. The studies concerned with ‘cloud’-based CRS platforms suggested that the complexity centred on the appropriate setting of questions, rather than the use of the platforms themselves.12 13

Of the 19 studies evaluating student engagement and participation with CRS, 2 did not find a statistically significant link between the use of CRS and student engagement13 14; a further study found mixed results, with the suggestion that some groups may be resistant to the inclusion of the technology in lectures.15 The remaining studies indicated that subjectively reported engagement was aided by the use of CRS.

Notably, the use of CRS has been reported to provide a more captivating and fun experience for students.11 16 17 Curiously, the evidence as to whether this leads to a motivational boost to engage with the materials outside of CRS lectures is divided, with Hoekstra18 and Smith et al 19 suggesting that anticipating participation led to students undertaking preparatory work; in contrast, Laxman20 and Oigara and Keengwe21 did not find that engagement continued outside of the lectures.

Several studies suggested that CRS may be of specific benefit to groups that may otherwise be under-represented in other modalities of lectures and small group teaching.17 22 However, Heaslip et al 23 highlighted that students with physical disabilities can encounter difficulties in utilising CRS.

CRS in healthcare education

A systematic review of the Medline and PubMed databases was conducted using the search terms ‘audience response system’, ‘electronic voting system’, ‘clickers’ and ‘Poll Everywhere’. Figure 2 presents a summary of the literature search strategy according to the PRISMA flow chart.6 These databases were chosen as they include health education journals. The references of included articles were additionally surveyed for further studies with additional narrative value.
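For illustration only, the PubMed arm of such a search could be reproduced programmatically through the NCBI E-utilities. The sketch below uses the Biopython Entrez interface; the contact email, the phrase-quoted query syntax, the date range and the result cap are assumptions made for the example and are not part of the review's documented protocol.

```python
# Minimal sketch of the PubMed arm of the search via NCBI E-utilities (Biopython).
# Query construction, date limits and result cap are illustrative assumptions only.
from Bio import Entrez

Entrez.email = "reviewer@example.org"  # hypothetical contact address required by NCBI

# Combine the review's search terms into a single boolean query.
terms = [
    '"audience response system"',
    '"electronic voting system"',
    '"clickers"',
    '"Poll Everywhere"',
]
query = " OR ".join(terms)

# Search PubMed, restricting by publication date (assumed cut-offs of 2000-2017).
handle = Entrez.esearch(db="pubmed", term=query, datetype="pdat",
                        mindate="2000", maxdate="2017", retmax=200)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records found")
print(record["IdList"][:10])  # first ten PubMed IDs, e.g. for abstract screening
```

In practice the retrieved identifiers would then be screened against inclusion criteria, mirroring the abstract-screening step summarised in figure 2.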

Figure 2. Preferred Reporting Items for Systematic Reviews and Meta-Analyses flow chart of the search strategy on the use of classroom response systems in educational settings within healthcare. Adapted from Moher et al.6

Of the 149 abstracts screened, 8 were excluded as duplicates, and 36 studies were identified for full assessment. After evaluating the full articles, 13 studies were included in the review, with an additional 2 studies identified through screening the reference lists of included studies. All the studies included in the review used CRS ‘fixed’ to a set lecture hall or auditorium. Two studies did not find a clear suggestion of CRS improving participation24 25; in addition, three further studies stated that significant difficulties, either in the integration of CRS into the existing teaching programme or in the technical set-up, hampered its implementation.26–28 In contrast, Llena et al,29 while not explicitly setting out to assess the implementation of CRS, did find the platform easy to use. However, with a third of the studies in the review highlighting difficulties in implementing CRS, significant impediments to its adoption may exist in healthcare education.

Interestingly, the impact on participation exhibited wide variability, with Stevens et al 30 establishing only a minor benefit in student participation, while the majority of studies quantifying student participation demonstrated a greater benefit.31–33 Pettit et al 34 commented that increased participation was established with the addition of a competitive element to CRS. Furthermore, the effects on learning have been reported as mixed, with Sternberger35 suggesting that exposure to and utilisation of CRS foster the development of critical thinking earlier in training, in contrast to Filer27 and Rahman et al,36 who suggested that although increased participation occurred with CRS, this did not translate into more effective learning. Patterson et al 32 provided the sole study commenting on reduced participation following technical difficulties, citing this as the major drawback of CRS.

Only one study evaluated the use of CRS in the postgraduate education of healthcare professionals, following a small cohort of trainees through multiple CRS-enhanced sessions over the study period.26 Limitations of this study include the small number of participants, with a maximum of 10 trainees included at any point. This suggests that CRS is an under-researched area in healthcare education, particularly in postgraduate medical education, which may be a legacy of earlier technology.

Analysis and discussion

Owing to the large degree of heterogeneity in study design and research methodology, no meta-analysis was possible based on these systematic reviews. It is interesting to note that while CRS has been adopted easily and without significant obstacles outside healthcare, the incorporation of CRS into healthcare education programmes has been hampered by significant difficulties at various stages. There may be a variety of reasons for this discrepancy. One is the lack of support from dedicated information technology services within healthcare education, particularly in contrast to higher education institutions. A second is that the paradigm of healthcare education may not lend itself readily to a more time-consuming method of group learning.

It is also worth noting that there was a further significant incongruity with regard to the impact on participation between healthcare education and non-healthcare education, with a far greater benefit derived from CRS outside of healthcare education. While there is no clear reason for this divergence, it may be due to the baseline levels of participation and motivation in the two contexts.

Conclusions

The use of CRS may be seen as advancing active, rather than passive, learning. The advantages of pursuing an active learning approach were discussed by Cleveland et al 37 and Torre et al,38 who suggested that utilising this style leads to the development of a deeper understanding, enhances engagement with the material at hand, and encourages discussion and peer learning. This is against a background of a growing body of evidence that peer learning provides a unique learning opportunity, with a reciprocal benefit for both those receiving teaching and the individuals providing it.39–41 Therefore, fostering this relationship by encouraging group discussion and learning may be a useful side effect of using CRS.

It may be that the greatest strength of CRS is to encourage learning through an active, rather than a passive and receptive, process. It has been suggested that active learning leads to a greater understanding of the topic and appreciation of the impact of the information, as well as encouraging students to take more ownership of their own learning.37 42–44 When applied to professional education, active learning can nurture the development of competency and the growth of ‘professional identity’.45 When applied to healthcare education, active learning techniques lead to greater student satisfaction, in addition to improved academic performance and enhanced resilience.46 47

The incongruity between healthcare and non-healthcare education settings on both implementation and engagement in the literature suggests that findings in educational research in general may not always be directly applicable to healthcare education, although the reasons for this are unclear and could be investigated in more detail in the future. It would therefore be prudent to caution against accepting findings from educational research in settings outside healthcare without first trialling them.

Therefore, based on the findings presented here, the integration of CRS into postgraduate healthcare education programmes is likely to be beneficial. As these programmes, out of practical necessity, include didactic lectures, CRS may serve to promote a more active learning process, to the benefit of both the audience and their patients. Consequently, the additional preparation and planning required for successful implementation of CRS, with alignment to the learning objectives, may be justified. However, adequate preparation and technical support are vital to success. Furthermore, within healthcare educational settings, CRS should not be relied on singularly to improve audience participation, but may form part of a more comprehensive array of tools for this purpose.

Acknowledgments

Thanks to Dr K Lee, Ms K Stapleford and Ms J Ryan for helpful discussions and suggestions.

Footnotes

Competing interests: None declared.

Provenance and peer review: Not commissioned; externally peer reviewed.

References

  • 1. Beard J, Strachan A, Davies H, et al. Developing an education and assessment framework for the Foundation Programme. Med Educ 2005;39:841–51. 10.1111/j.1365-2929.2005.02236.x [DOI] [PubMed] [Google Scholar]
  • 2. Bloom BS. Taxonomy of educational objectives: the classification of educational goals handbook 1: Cognitive domain [S.l.]. Longman, 1974. [Google Scholar]
  • 3. Luscombe C, Montgomery J. Exploring medical student learning in the large group teaching environment: examining current practice to inform curricular development. BMC Med Educ 2016;16:184. 10.1186/s12909-016-0698-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4. Bruff D. Teaching with classroom response systems: creating active learning environments. 1st edn. San Francisco, Calif: Jossey-Bass; Chichester: John Wiley [distributor], 2009. [Google Scholar]
  • 5. Gubbiyappa KS, Barua A, Das B, et al. Effectiveness of flipped classroom with Poll Everywhere as a teaching-learning method for pharmacy students. Indian J Pharmacol 2016;48:S41–6. 10.4103/0253-7613.193313 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6. Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ 2009;339:b2535. 10.1136/bmj.b2535 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7. MacGeorge EL, Homan SR, Dunning JB, et al. Student evaluation of audience response technology in large lecture classes. Educ Technol Res Dev 2008;56:125–45. 10.1007/s11423-007-9053-6 [DOI] [Google Scholar]
  • 8. Walklet E, Davis S, Farrelly D, et al. The impact of Student Response Systems on the learning experience of undergraduate psychology students. Psychology Teaching Review 2016;22:35–48. [Google Scholar]
  • 9. Bunz U. Using scantron versus an audience response system for survey research: does methodology matter when measuring computer-mediated communication competence? Comput Human Behav 2005;21:343–59. 10.1016/j.chb.2004.02.009 [DOI] [Google Scholar]
  • 10. Smith LA, Shon H, Santiago R. Audience response systems: using “Clickers” to enhance bsw education. J Technol Hum Serv 2011;29:120–32. 10.1080/15228835.2011.587737 [DOI] [Google Scholar]
  • 11. Graham CR, Tripp TR, Seawright L, et al. Empowering or compelling reluctant participators using audience response systems. Activ Learn High Educ 2007;8:233–58. 10.1177/1469787407081885 [DOI] [Google Scholar]
  • 12. Shon H, Smith L. A review of poll everywhere audience response system. J Technol Hum Serv 2011;29:236–45. 10.1080/15228835.2011.616475 [DOI] [Google Scholar]
  • 13. Dunn PK, Richardson A, Oprescu F, et al. Mobile-phone-based classroom response systems: Students’ perceptions of engagement and learning in a large undergraduate course. Int J Math Educ Sci Technol 2013;44:1160–74. 10.1080/0020739X.2012.756548 [DOI] [Google Scholar]
  • 14. Green AJ, Chang W, Tanford S, et al. Student perceptions towards using clickers and lecture software applications in hospitality lecture courses. Journal of Teaching in Travel & Tourism 2015;15:29–47. 10.1080/15313220.2014.999738 [DOI] [Google Scholar]
  • 15. Trees AR, Jackson MH. The learning environment in clicker classrooms: student processes of learning and involvement in large university‐level courses using student response systems. Learn Media Technol 2007;32:21–40. 10.1080/17439880601141179 [DOI] [Google Scholar]
  • 16. Stowell JR, Nelson JM. Benefits of electronic audience response systems on student participation, learning, and emotion. Teach Psychol 2007;34:253–8. 10.1080/00986280701700391 [DOI] [Google Scholar]
  • 17. King SO, Robinson CL. ‘Pretty Lights’ and Maths! Increasing student engagement and enhancing learning through the use of electronic voting systems. Comput Educ 2009;53:189–99. 10.1016/j.compedu.2009.01.012 [DOI] [Google Scholar]
  • 18. Hoekstra A. Vibrant student voices: exploring effects of the use of clickers in large college courses. Learn Media Technol 2008;33:329–41. 10.1080/17439880802497081 [DOI] [Google Scholar]
  • 19. Smith MK, Trujillo C, Su TT. The benefits of using clickers in small-enrollment seminar-style biology courses. CBE Life Sci Educ 2011;10:14–17. 10.1187/cbe.10-09-0114 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20. Laxman K. A study on the adoption of clickers in higher education. Australas J Educ Technol 2011;27:1291–303. 10.14742/ajet.894 [DOI] [Google Scholar]
  • 21. Oigara J, Keengwe J. Students’ perceptions of clickers as an instructional tool to promote active learning. Educ Inf Technol 2013;18:15–28. 10.1007/s10639-011-9173-9 [DOI] [Google Scholar]
  • 22. Williamson Sprague E, Dahl DW. Learning to click: an evaluation of the personal response system clicker technology in introductory marketing courses. J Mark Educ 2009;32:93–103. [Google Scholar]
  • 23. Heaslip G, Donovan P, Cullen JG. Student response systems and learner engagement in large classes. Activ Learn High Educ 2014;15:11–24. 10.1177/1469787413514648 [DOI] [Google Scholar]
  • 24. Duggan PM, Palmer E, Devitt P. Electronic voting to encourage interactive lectures: a randomised trial. BMC Med Educ 2007;7:25. 10.1186/1472-6920-7-25 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25. De Gagne JC. The impact of clickers in nursing education: a review of literature. Nurse Educ Today 2011;31:e34–40. 10.1016/j.nedt.2010.12.007 [DOI] [PubMed] [Google Scholar]
  • 26. Arneja JS, Narasimhan K, Bouwman D, et al. Qualitative and quantitative outcomes of audience response systems as an educational tool in a plastic surgery residency program. Plast Reconstr Surg 2009;124:2179–84. 10.1097/PRS.0b013e3181bcf11f [DOI] [PubMed] [Google Scholar]
  • 27. Filer D. Everyone’s answering: using technology to increase classroom participation. Nurs Educ Perspect 2010;31:247–50. [PubMed] [Google Scholar]
  • 28. Jensen JV, Ostergaard D, Faxholt AK. Good experiences with an audience response system used in medical education. Dan Med Bull 2011;58:A4333. [PubMed] [Google Scholar]
  • 29. Llena C, Forner L, Cueva R. Student evaluation of clickers in a dental pathology course. J Clin Exp Dent 2015;7:e369–373. 10.4317/jced.52299 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30. Stevens NT, McDermott H, Boland F, et al. A comparative study: do “clickers” increase student engagement in multidisciplinary clinical microbiology teaching? BMC Med Educ 2017;17:70. 10.1186/s12909-017-0906-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Satheesh KM, Saylor-Boles CD, Rapley JW, et al. Student evaluation of clickers in a combined dental and dental hygiene periodontology course. J Dent Educ 2013;77:1321–9. [PubMed] [Google Scholar]
  • 32. Patterson B, Kilpatrick J, Woebkenberg E. Evidence for teaching practice: the impact of clickers in a large classroom environment. Nurse Educ Today 2010;30:603–7. 10.1016/j.nedt.2009.12.008 [DOI] [PubMed] [Google Scholar]
  • 33. de Oliveira-Santos C, Tirapelli C, Rodrigues CT, et al. Interactive audience response systems in oral and maxillofacial radiology undergraduate lectures. Eur J Dent Educ 2017;31. 10.1111/eje.12258 [DOI] [PubMed] [Google Scholar]
  • 34. Pettit RK, McCoy L, Kinney M, et al. Student perceptions of gamified audience response system interactions in large group lectures and via lecture capture technology. BMC Med Educ 2015;15:92. 10.1186/s12909-015-0373-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35. Sternberger CS. Interactive learning environment: engaging students using clickers. Nurs Educ Perspect 2012;33:121–4. 10.5480/1536-5026-33.2.121 [DOI] [PubMed] [Google Scholar]
  • 36. Rahman A, Jacker-Guhr S, Staufenbiel I, et al. Use of elaborate feedback and an audience-response-system in dental education. GMS Z Med Ausbild 2013;30:Doc35. 10.3205/zma000878 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37. Cleveland LM, Olimpo JT, DeChenne-Peters SE. Investigating the relationship between instructors' use of active-learning strategies and students' conceptual understanding and affective changes in introductory biology: a comparison of two active-learning environments. CBE Life Sci Educ 2017;16:ar19. 10.1187/cbe.16-06-0181 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38. Torre D, Manca A, Durning S, et al. Learning at large conferences: from the ‘sage on the stage’ to contemporary models of learning. Perspect Med Educ 2017;6:205–8. 10.1007/s40037-017-0351-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39. Benè KL, Bergus G. When learners become teachers: a review of peer teaching in medical student education. Fam Med 2014;46:783–7. [PubMed] [Google Scholar]
  • 40. Glynn LG, MacFarlane A, Kelly M, et al. Helping each other to learn-a process evaluation of peer assisted learning. BMC Med Educ 2006;6:18. 10.1186/1472-6920-6-18 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41. Secomb J. A systematic review of peer teaching and learning in clinical education. J Clin Nurs 2008;17:703–16. 10.1111/j.1365-2702.2007.01954.x [DOI] [PubMed] [Google Scholar]
  • 42. LaCosse J, Ainsworth SE, Shepherd MA, et al. An active-learning approach to fostering understanding of research methods in large classes. Teach Psychol 2017;44:117–23. 10.1177/0098628317692614 [DOI] [Google Scholar]
  • 43. Erickson SA. Empowering students in science through active learning: voices from inside the classroom. US: ProQuest Information & Learning, 2016. [Google Scholar]
  • 44. Keith N, Wolff C. In: Kraiger K, Passmore J, dos Santos NR, et al, eds. The Wiley Blackwell handbook of the psychology of training, development, and performance improvement. Wiley Blackwell handbooks in organizational psychology. Wiley-Blackwell, 2015:92–116. [Google Scholar]
  • 45. Niemi H, Nevgi A. Research studies and active learning promoting professional competences in finnish teacher education. Teach Teach Educ 2014;43:131–42. 10.1016/j.tate.2014.07.006 [DOI] [Google Scholar]
  • 46. Kilgour JM, Grundy L, Monrouxe LV. A Rapid review of the factors affecting healthcare students' satisfaction with small-group, active learning methods. Teach Learn Med 2016;28:15–25. 10.1080/10401334.2015.1107484 [DOI] [PubMed] [Google Scholar]
  • 47. Schmidt HG, Cohen-Schotanus J, Arends LR. Impact of problem-based, active learning on graduation rates for 10 generations of Dutch medical students. Med Educ 2009;43:211–8. 10.1111/j.1365-2923.2008.03287.x [DOI] [PubMed] [Google Scholar]
