American Journal of Pharmaceutical Education. 2010 Feb 10;74(1):6. doi: 10.5688/aj740106

Impact of a Student Response System on Short- and Long-Term Learning in a Drug Literature Evaluation Course

Flora C Liu, Jacob P Gettig, Nancy Fjortoft
PMCID: PMC2829154  PMID: 20221357

Abstract

Objective

To evaluate the effectiveness of a student response system on short- and long-term learning in a required second-year pharmacy course.

Method

Student volunteers enrolled in the course Drug Literature Evaluation were blinded and randomized to 1 of 2 groups. Group 1 attended a lecture in which the instructor used a student response system. Group 2 attended the same lecture by the same instructor an hour later, but no student response system was used. A 16-point unannounced quiz on the lecture material was administered to both groups at the end of class. Approximately 1 month later, both groups were given another unannounced quiz on the same material to test long-term student learning.

Results

One hundred seventy-nine (92.3%) students participated in both quizzes. Students who attended the class in which the student response system was used scored an average of 1 point higher on quiz 1 than students assigned to the control group (10.7 vs. 9.7; p = 0.02). No significant difference was seen between the quiz 2 scores of the 2 groups (9.5 vs. 9.5; p = 0.99).

Conclusions

The use of a student response system positively impacted students' short-term learning; however, that positive effect did not appear to last over time. Faculty members may want to consider using student response systems to enhance student learning in large lecture classes.

Keywords: audience response system, student learning, active learning

INTRODUCTION

Mentkowski, in her seminal work, calls for “learning that lasts,” defined as the successful integration of learning, development, and performance.1 To promote “learning that lasts” in pharmacy education, faculty members need to develop and provide educational experiences that engage students and give them opportunities to apply knowledge, solve problems, and evaluate their own learning. Lecture is the most commonly used teaching method and may be the most effective way to convey a large amount of information to a large number of students. Faculty members, therefore, are continually challenged to develop techniques for the large lecture setting that engage students and provide experiences that promote “learning that lasts.” The average student typically pays attention only for the first 10-20 minutes of a lecture.2 While this assumption is debatable,3 it behooves educators to be aware of it and to create methods that continually engage students during lectures and make learning more active. Technology such as student response systems allows for more student engagement and interaction, which may improve the quality of students’ learning.4-7

A student response system, also known as an audience response system, classroom response system, personal response system, or electronic voting system, is an automated system that allows for interaction and feedback between an audience and a speaker.4-8 This wireless system allows the speaker to ask the audience multiple-choice questions, receive responses transmitted from a remote-control-like device called a clicker, and immediately display the responses on a screen in the form of a chart. In short, it is a way for teachers to question students and see their responses immediately.
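
To make the mechanics concrete, the minimal Python sketch below illustrates the core tally-and-display step such a system performs. This is not TurningPoint's software; the device IDs and answer choices are hypothetical, and a real system receives submissions wirelessly rather than from a list.

```python
# Minimal sketch of the tally step a student response system performs,
# assuming responses arrive as (device_id, choice) pairs.
# Device IDs and choices are hypothetical; this is not TurningPoint code.
from collections import Counter

def tally_responses(responses):
    """Keep one answer per clicker (the last click wins), then count."""
    latest = {}
    for device_id, choice in responses:
        latest[device_id] = choice
    return Counter(latest.values())

# Hypothetical clicker submissions for one multiple-choice question.
clicks = [("dev01", "A"), ("dev02", "C"), ("dev03", "C"), ("dev02", "B")]
counts = tally_responses(clicks)
total = sum(counts.values())
for choice in sorted(counts):
    bar = "#" * counts[choice]
    print(f"{choice}: {bar} ({100 * counts[choice] / total:.0f}%)")
```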

The use of student response systems is increasing in popularity as a tool to aid student learning across disciplines, including the health sciences, and at various levels of education, including postgraduate education.8-13 Both students and teachers may benefit from the two-way communication this technology makes possible. Teachers are able to direct students’ attention to key concepts through the use of student response system questions in class.9 The instant feedback also allows teachers to evaluate whether students understand the material and to tailor lectures toward concepts with which students seem to be struggling.5,6 Students likewise receive immediate feedback with which to gauge their own understanding of the key concepts taught. In addition, students who are not confident or who may be less inclined to speak in public are more likely to participate using the student response system because all responses are anonymous.4

Research on the impact of the student response system on education has shown that its use can improve student attendance, attention, motivation,9 and engagement.8-10 Studies that examined the impact of student response systems on student learning found that students perceive their learning to be enhanced through the use of clickers.8-10 A few studies compared average class grades across cohorts from different academic years and found that students in the year in which a student response system was used performed better academically than students in previous years without it.9,11,14 Randomized controlled studies found that students who used the student response system had higher quiz scores than students who did not.15-17 These studies involved a small sample of radiology residents,15 medical students in tutorials during their first clinical year,16 and undergraduate students in an introductory computer science course taught in a discussion format.17

Pharmacy education is beginning to explore and examine the use of student response systems. One study found that students perceived the use of student response systems as beneficial and appreciated the ability to obtain immediate feedback and compare their results with those of the entire class.18 Students reported that the use of student response systems increased their motivation to prepare for and attend class, and they saw the potential to streamline quizzes and shorten the time needed to report results. Another study described the use of the student response system to increase student motivation, attention, and learning.14 At another college, students in the student response system group performed better academically than students taught in previous years without the student response system.19 Student response systems have also increased pharmacy students' confidence in their knowledge.20

While the research on student response systems supports their use to promote student engagement, few studies have examined the impact of the student response system on student learning using an experimental design that compares 2 groups of students in real time. The purpose of this study was to evaluate the effectiveness of a student response system on short- and long-term learning in a required second-year pharmacy drug literature evaluation class, using an experimental research design.

METHODS

Midwestern University Chicago College of Pharmacy implemented the TurningPoint Response System with ResponseCard XR (Turning Technologies, Youngstown, OH) at the beginning of the fall 2008 school year. All first- and second-year pharmacy students were required to purchase clickers and were trained to use them. This was also the first year many professors used the student response system. Faculty members were required to attend a 1-hour training session that covered how to use the student response system and how to incorporate student response system questions into lectures using TurningPoint. Pharmacy faculty members also attended a college-wide seminar presented by 3 pharmacy faculty members who had successfully integrated the student response system into their teaching. University Information Technology Services staff members were available to faculty members for specific questions throughout the year.

All second-year pharmacy students at Midwestern University Chicago College of Pharmacy enrolled in the course Drug Literature Evaluation were eligible to participate in the study. Participants were randomized into 1 of 2 groups and blinded to the study intervention; that is, students attended the lectures and used the student response system but were not aware of what the intervention was. The faculty member had used the system previously in the class, so its use was not novel to the students. Group 1, the student response system group, attended a lecture on non-inferiority trials that incorporated the use of the student response system. Group 2, the control group, attended the same lecture 1 hour later without the use of the student response system. In both groups, an 8-question, 16-point, unannounced quiz was administered at the end of class to evaluate students’ comprehension and retention of the lecture material. About 1 month later, both groups were given a second unannounced quiz covering the same material to test students’ comprehension and long-term retention. The quiz scores were not included in students' final grades; rather, students were given bonus points for participating in another class activity that day that was not related to the study.
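
For illustration only, a two-group randomization of this kind can be sketched in a few lines of Python. The roster and seed below are hypothetical; the authors' actual randomization procedure is not described at this level of detail.

```python
# Illustrative only: random assignment of a class roster to 2 groups.
# The roster size and seed are hypothetical, not the authors' procedure.
import random

def randomize_two_groups(roster, seed=None):
    """Shuffle a copy of the roster and split it in half."""
    rng = random.Random(seed)
    shuffled = list(roster)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

roster = [f"student_{i:03d}" for i in range(1, 181)]  # hypothetical roster
group1, group2 = randomize_two_groups(roster, seed=2008)
print(len(group1), len(group2))  # 90 90
```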

The lectures given to group 1 and group 2 were as similar as possible. The same faculty member delivered both lectures and followed the same notes. Group 1 was asked 5 questions using the student response system and given about 30 seconds to respond to each question using clickers. Group 2 was asked the same questions verbally throughout the lecture; the questions and answer choices were not displayed via TurningPoint, so students were expected to raise their hands to respond. The correct and incorrect answers were discussed in both group 1 and group 2 lectures.

To assure the validity of the quiz questions, the faculty member responsible for the lectures contacted colleagues via listservs to solicit validated questions about non-inferiority studies. In addition, the faculty member reviewed the performance (eg, percent correct and point biserial) of questions from past examinations. Questions that were good discriminators on past examinations were selected as models for the questions on the first and second quizzes. Both quizzes consisted of 8 questions: 2 broad questions about the content of the non-inferiority trials lecture and 6 questions regarding an abstract of a non-inferiority study. The 6 questions related to the abstract required higher-level application of the lecture material. Questions on quiz 1 and quiz 2 were very similar; however, a different study abstract was used in quiz 2, for which the faculty member selected an abstract of a comparable non-inferiority study. The quiz questions were distinctly different from the 5 student response system questions and verbal questions embedded in the intervention and control groups’ lectures, which were more factual in nature and less applied.

Students in each group were given 15 minutes to complete each quiz while test proctors monitored them to prevent academic dishonesty. After quiz 1, students in group 1 exited the classroom from the ground level of the lecture hall while group 2 students waited to enter from the first floor, to minimize communication between the 2 groups about the unannounced quiz. For quiz 2, both study groups were combined and took the quiz at the same time and location.

Alpha was set at 0.05 a priori. Student t tests were used to examine statistical differences in continuous variables between groups. Paired t tests were used to examine statistical differences within groups between quiz 1 and quiz 2. Chi-square tests were used to examine statistical differences in categorical variables. ParSCORE (version 6.1, Scantron Corporation, Eagan, MN) was used to analyze quiz scores and generate item analysis reports for quizzes 1 and 2.
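
As a hedged illustration of these analyses, the Python sketch below runs the three named tests with scipy.stats. The score vectors and contingency counts are simulated stand-ins, not the study data; the actual scoring and item analyses were done in ParSCORE.

```python
# Sketch of the three tests named above, using scipy.stats.
# All numbers are simulated stand-ins for the study data.
import numpy as np
from scipy import stats

ALPHA = 0.05  # set a priori, as in the study

rng = np.random.default_rng(1)
srs_q1 = rng.normal(10.7, 2.5, 88)           # hypothetical SRS-group quiz 1 scores
ctrl_q1 = rng.normal(9.7, 2.5, 91)           # hypothetical control-group quiz 1 scores
srs_q2 = srs_q1 - rng.normal(1.2, 1.5, 88)   # same (simulated) students 1 month later

# Between-group difference on quiz 1: independent-samples t test.
t, p = stats.ttest_ind(srs_q1, ctrl_q1)
print(f"quiz 1, SRS vs control: t={t:.2f}, p={p:.3f}, significant={p < ALPHA}")

# Within-group change from quiz 1 to quiz 2: paired t test.
t, p = stats.ttest_rel(srs_q1, srs_q2)
print(f"SRS group, quiz 1 vs quiz 2: t={t:.2f}, p={p:.3f}")

# Categorical demographics: chi-square on a 2x2 contingency table.
table = np.array([[40, 48],    # hypothetical counts, eg, trait present, by group
                  [48, 43]])   # trait absent, by group
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square: chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```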

This study was reviewed by the Midwestern University Institutional Review Board (IRB), and it met the criteria for exemption. Written consent was obtained from all study participants.

RESULTS

Statistical analysis was performed only on data from students who took both quiz 1 and quiz 2. The demographics of the student response system and control groups were similar, as shown in Table 1. One hundred seventy-nine eligible students (a 92.3% response rate) completed both quizzes: 88 in the student response system group and 91 in the control group. A significant difference was seen in quiz 1 scores between the 2 groups (10.7 vs. 9.7; p = 0.026) (Table 2); students in the student response system group scored, on average, 1 point higher than students in the control group. No significant difference was seen between the quiz 2 scores of the 2 groups (9.5 vs. 9.5; p = 0.989).

Table 1.

Demographics of Pharmacy Students Enrolled in a Drug Literature Evaluation Course


Abbreviation: SRS = student response system

a Cumulative GPA was obtained from the student's official transcript and was not self-reported.

b Number of participants with a bachelor's or master's degree

Table 2.

Quiz Scores of Pharmacy Students Who Attended a Lecture With and Without a Student Response System


Abbreviation: SRS = student response system

ap < 0.05 between student response system group and control group for quiz 1

bp < 0.05 between quiz 1 and quiz 2 scores in SRS group

cp < 0.05 between quiz 1 and quiz 2 scores in control group

A significant decrease from quiz 1 to quiz 2 scores was seen in both study groups, indicating that students may not have retained their knowledge of the lecture material over time (Table 2).

Both quizzes discriminated well between good and poor performers: the average point biserial was 0.4 for both quiz 1 and quiz 2 (Table 3). The discriminating ability of question 3 on quiz 2 is suspect, however, since about 99% of the class answered it correctly and its point biserial was 0.1.
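
For context, the point biserial is the correlation between a dichotomous item result (correct/incorrect) and the continuous total quiz score; values near 0.4 indicate good discrimination and values near 0 indicate poor discrimination. The sketch below computes it with scipy on made-up responses; the study's own item analyses came from ParSCORE.

```python
# Point-biserial correlation for a single quiz item, computed with scipy.
# The item results and total scores below are made up for illustration.
import numpy as np
from scipy import stats

item = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])           # 1 = item answered correctly
totals = np.array([14, 12, 7, 13, 8, 15, 11, 6, 10, 12])  # quiz totals out of 16

r_pb, p = stats.pointbiserialr(item, totals)
print(f"point biserial = {r_pb:.2f} (p = {p:.3f})")
# A value near 0.4 separates high and low scorers well; a value near 0
# (as for question 3 on quiz 2, which 99% answered correctly) does not.
```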

Table 3.

Item Analyses of Questions on Quiz 1 and Quiz 2 Administered to Pharmacy Students in Both Student Response System and Control Groups


No technical difficulties were experienced with the student response systems, and a high response rate to the student response system questions was seen in the group 1 lecture. The percentages of students responding to questions 1 through 5 were 54%, 80%, 83%, 86%, and 88%, respectively. The lower response rate to question 1 was attributed to the faculty member underestimating how long it would take students to respond. An example of a student response system question and the class's responses is shown in Figure 1.

Figure 1.


Screenshot of one of the in-class Student Response System questions. Abbreviations: sup = superior, NI = non-inferior.

DISCUSSION

The data suggest that the use of a student response system can enhance students’ short-term learning, as evidenced by the higher quiz scores of the student response system group. While the actual difference between the 2 groups' scores on the first quiz was only 1 point, the difference was significant, and students may argue that 1 additional point is academically significant. Palmer et al also found that a student response system group scored 1 point higher than a control group, a significant difference.16 A longer quiz with more questions might show larger differences between groups. The student response system provides students with the opportunity to interact and obtain immediate feedback on their learning. Whether the higher quiz scores are the result of student engagement or immediate feedback, or a combination of the 2 factors, is not clear. The data do suggest that, regardless of the underlying factors, the use of a student response system can impact learning positively.

The results of this study, however, suggest that this positive effect does not last. Students' scores on quiz 2 were almost identical in the student response system and control groups. Other factors may have affected this outcome. As noted in the results, 1 of the 8 questions on quiz 2 was not effective at discriminating among learners, as 99% of the students answered it correctly; this may have minimized the apparent effect of the student response system. All students may have spent time reviewing and learning the material in preparation for the final examination, which occurred 2 weeks after quiz 2. There was also a workshop activity about a non-inferiority study, in which all students participated, between the lecture and quiz 2. The workshops are designed as interactive group activities to reinforce concepts taught in lecture, so this activity may have provided both groups with an opportunity for engagement and feedback. This study did not control for any outside study or preparation by the students.

Regardless of whether the effect of the student response system lasts over time, the short-term impact on learning is noteworthy. Colleges of pharmacy and other disciplines struggle with the challenge of making large lecture classes engaging. The use of a student response system can keep students engaged, provide immediate feedback to both students and faculty members, and positively impact learning. In addition, the reported evidence supports the use of student response systems simply because students perceive real benefits from them.8-10,18,19 Perception alone can be a powerful tool in improving learning.

Research of this nature has obvious limitations. Students in group 1 may have used text messages to inform students in group 2 that an unannounced quiz was coming. Faculty fatigue may have caused slight differences in lecture delivery between group 1 and group 2, and there was an uncontrolled interval between quiz 1 and quiz 2 during which students may have studied the lecture material. Finally, quiz 1 and quiz 2 were not identical. Every effort was made to ensure that the abstracts used in the quizzes were similar in complexity, but this decision was based on the faculty member's expertise and judgment, and a second expert opinion was not sought. This may have confounded the results.

CONCLUSION

The use of a student response system positively impacted students' short-term learning; however, the positive effect did not last over time. Faculty members may want to consider the use of student response systems to enhance student learning in large lecture classes.

REFERENCES

1. Mentkowski M, Associates. Learning That Lasts. San Francisco, CA: Jossey-Bass; 2000.
2. Gross Davis B. Tools for Teaching. San Francisco, CA: Jossey-Bass; 1993.
3. Wilson K, Korn JH. Attention during lectures: beyond ten minutes. Teaching Psychol. 2007;32(2):85–89.
4. Caldwell JE. Clickers in the large classroom: current research and best-practice tips. CBE-Life Sci Educ. 2007;6:9–20. doi: 10.1187/cbe.06-12-0205.
5. Robertson LJ. Twelve tips for using a computerized interactive audience response system. Med Teacher. 2000;22(3):237–239.
6. Menon AS, Moffett S, Enriquez M, Martinez MM, Dev P, Grappone T. Audience response made easy: using personal digital assistants as a classroom polling tool. J Am Med Inform Assoc. 2004;11(3):217–220. doi: 10.1197/jamia.M1468.
7. Cain J, Robinson E. A primer on audience response systems: current applications and future considerations. Am J Pharm Educ. 2008;72(4):Article 77. doi: 10.5688/aj720477.
8. Uhari M, Renko M, Soini H. Experiences of using an interactive audience response system in lectures. BMC Med Educ. 2003;3:12. doi: 10.1186/1472-6920-3-12.
9. Gauci SA, Dantas AM, Williams DA, Kemm RE. Promoting student-centered learning in lectures with a personal response system. Adv Physiol Educ. 2009;33(1):60–71. doi: 10.1152/advan.00109.2007.
10. Trapskin PJ, Smith KM, Armitstead JA, Davis GA. Use of an audience response system to introduce an anticoagulation guide to physicians, pharmacists, and pharmacy students. Am J Pharm Educ. 2005;69(2):Article 28.
11. Conoley J, Moore G, Croom B, Flowers J. A toy or teaching tool? The use of audience-response systems in the classroom. Techniques. October 2006:46–48.
12. Eggert CH, West CP, Thomas KG. Impact of audience response system. Med Educ. 2004;38(5):576. doi: 10.1111/j.1365-2929.2004.01889.x.
13. Stowell JR, Nelson JM. Benefits of electronic audience response systems on student participation, learning and emotion. Teaching Psychol. 2007;34(4):253–258.
14. Cain J, Black EP, Rohr J. An audience response system strategy to improve student motivation, attention, and feedback. Am J Pharm Educ. 2009;73(2):Article 21. doi: 10.5688/aj730221.
15. Rubio EI, Bassignani MJ, White MA, Brant WE. Effect of an audience response system on resident learning and retention of lecture material. Am J Radiol. 2008;190:W319–W322. doi: 10.2214/AJR.07.3038.
16. Palmer EJ, Devitt PG, DeYoung NJ, Morris D. Assessment of an electronic voting system within the tutorial setting: a randomized controlled trial. BMC Med Educ. 2005;5:24. doi: 10.1186/1472-6920-5-24.
17. Martyn M. Clickers in the classroom: an active learning approach. Educause Q. 2007;2:71–74.
18. Medina MS, Medina PJ, Wanzer DS, Wilson JE, Er N, Britton ML. Use of an audience response system (ARS) in a dual-campus classroom environment. Am J Pharm Educ. 2008;72(2):Article 38. doi: 10.5688/aj720238.
19. Slain D, Abate M, Hodges BM, Stamatakis MK, Wolak S. An interactive response system to promote active learning in the doctor of pharmacy curriculum. Am J Pharm Educ. 2004;68(5):Article 117.
20. Kelley KA, Beatty SJ, Legg JE, McAuley JW. A progress assessment to evaluate pharmacy students' knowledge prior to beginning advanced pharmacy practice experiences. Am J Pharm Educ. 2008;72(4):Article 88. doi: 10.5688/aj720488.
