Abstract
Audience response systems (ARSs) are an increasingly popular tool in higher education for promoting interactivity, gathering feedback, preassessing knowledge, and assessing students' understanding of lecture concepts. Instructors in numerous disciplines are realizing the pedagogical value of these systems; however, research on ARS usage within pharmacy education remains sparse. In this paper, the health professions literature on uses of ARSs is reviewed, and a primer on the issues, benefits, and potential uses within pharmacy education is presented. Future areas of educational research on ARS instructional strategies are also suggested.
Keywords: audience response system, technology, technology-enhanced learning
INTRODUCTION
Although in existence for several years, audience response systems (ARSs) are just now enjoying widespread success as a teaching tool within higher education.1,2 Audience response systems are referred to by an assortment of names including personal response stations, interactive voting systems,3 electronic voting systems,4-6 student response systems,7,8 interactive student response systems,9 group response systems,10 group process support systems,11 and the more colloquial term clickers.12-15
Regardless of nomenclature, these systems typically consist of 3 elements: presentation software such as PowerPoint, receiver hardware, and response devices. Instructors present questions to which the audience responds via the handheld devices or computer software. The instructor has the option of displaying the aggregate results to the audience and/or simply collecting the results for further analysis. Most systems have the ability to collect responses either anonymously or in an individually identifiable format.
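To make the response-collection flow concrete, the brief Python sketch below tallies a hypothetical poll both as an anonymous aggregate and as individually identifiable results; the data structures are illustrative assumptions rather than any vendor's actual software interface.

```python
# Minimal sketch of the response-collection flow described above: responses
# arrive tagged with a device ID, and the instructor can view either the
# aggregate distribution or individually identifiable results. The data
# structures below are illustrative assumptions, not a vendor's actual API.

from collections import Counter

# (device_id, selected_option) pairs as they might arrive during one poll
responses = [("dev-014", "A"), ("dev-102", "C"), ("dev-033", "A"),
             ("dev-071", "B"), ("dev-058", "A")]

def aggregate(poll):
    """Anonymous view: counts per answer option, suitable for on-screen display."""
    return Counter(option for _, option in poll)

def identifiable(poll):
    """Identified view: map each device (student) to its answer for later analysis."""
    return dict(poll)

print(aggregate(responses))     # Counter({'A': 3, 'C': 1, 'B': 1})
print(identifiable(responses))  # {'dev-014': 'A', 'dev-102': 'C', ...}
```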
Literature Review
ARSs are used for a variety of reasons, such as collecting data and engaging the audience in a presentation.16 Instructors have reported that utilization of an ARS during instruction increased students' perceptions of learning course material,17 engagement with lecture content,18 class participation,19 interest in the course,7 and performance on examinations.7 In addition, several researchers4,11,20 have reported that an ARS enhances traditional lectures by promoting interactivity and initiating discussion.10
These systems are simply a tool, however, and must be used appropriately to achieve the desired results. Interactivity and general learning outcomes are influenced by the instructor's pedagogy and strategic use of the ARS.13,15,21 From an instructor's viewpoint, ARSs can be beneficial for peer-learning activities,15 gathering feedback on students' understanding of lecture material,4,6,22,23 identifying students' misconceptions about content,18,24 and enabling the instructor to adapt lectures to address those misconceptions.5,10 Instructors' major concern is that less material can be covered because of the time needed for student responses and subsequent discussion.24
Much of the ARS literature has focused on learner perceptions of ARS usage. Students appear to have positive attitudes regarding the use of an ARS in classes.15 Reinforcement of content,25 provision of feedback,10,15,22 anonymity in participation,4,6,11,19 increased interest in the course,7 and the ability to compare one's level of knowledge to the rest of the class6,25 have all been reported as positive characteristics of ARS use within lectures. Conversely, using an ARS simply for the sake of technology and not for a pedagogical benefit is troubling to students,4 as are potential problems with classroom time being used to set up the system.6
Research literature concerning the use and effectiveness of ARS utilization in pharmacy education is limited. Researchers26 used an ARS to measure knowledge of an anticoagulation guide before and after a short instructional presentation. Internal medicine residents (n = 15), clinical pharmacists and residents (n = 24), and third-year pharmacy students (n = 83) completed pre- and postintervention questionnaires. Significant gains in knowledge of the anticoagulation guide were found in all groups. Pharmacy students also completed a 4-statement survey concerning their perceptions of the ARS as a tool in the lecture. The majority agreed that the ARS increased involvement with and understanding of the content.
In order to determine the effects of an ARS on learning, pharmacy researchers9 compared examination scores from traditional lecture formats with examination scores from ARS-aided lectures in different courses. The study occurred over 2 academic years, comparing traditional lecture formats during the first year and ARS utilization in the second year. In the first course, Clinical Pharmacokinetics, students achieved significantly higher test scores in the ARS lectures than in the traditional format lectures. In the second course, Medical Literature Evaluation, students who received the ARS lectures scored significantly higher on the final examination than students receiving traditional lectures. In Pathophysiology and Therapeutics, students in the ARS lectures scored significantly higher than students in traditional lectures on test questions that required analytical-type thinking. A survey of faculty members who used ARSs revealed that they believed utilization of the ARS benefited student learning, but that less material was “covered” in those classes. Finally, in a survey of student satisfaction, students reported greater satisfaction with the ARS format.
Few studies on ARS usage within other health professions have been reported in the literature. Researchers27 studied the differences in factual information recall by family medicine residents (n = 22) taught using traditional lectures, interactive lectures without an ARS, and interactive lectures with an ARS. Quiz scores following interactive lectures with and without an ARS were significantly higher than quiz scores following traditional lectures. Mean quiz scores 1 month later declined in all 3 groups, but mean scores in the ARS group remained the highest. The researchers concluded that both interactive lectures without an ARS and lectures using an ARS were associated with improved learning outcomes.
Pradhan, Sparano, and Ananth28 conducted a randomized controlled trial comparing traditional delivery of information on contraceptive options with delivery utilizing an ARS. Eight obstetrics and gynecology residents were assigned to a presentation using ARS and 9 were assigned to a traditional lecture. The residents were tested for baseline knowledge and received a posttest evaluation 6 weeks later to assess knowledge retention. Residents in the ARS group achieved significantly higher test score gains than subjects in the traditional lecture group. The researchers concluded that an ARS might be an efficient teaching tool for residency education. An evaluation survey also revealed that all residents found the ARS easy to use and a majority of the residents (n = 14) perceived it to be a helpful learning tool.28
Another study examined the effects of ARS usage on presentation quality, speaker quality, and ability to maintain interest in continuing medical education (CME) clinical round tables. Participants (n = 164) at tables that used an ARS rated those areas significantly higher than did participants (n = 119) at tables not using an ARS. Researchers concluded that an ARS strategy may increase attention and enthusiasm in CME learners.29
Over 3 years, researchers collected data from attendees of CME lectures using an ARS (n = 148) and lectures not using an ARS (n = 67).30 On a scale from 1 to 4 with 4 being excellent, ARS lectures (mean = 3.47, SD = 0.38) were rated significantly higher by attendees than non-ARS lectures (mean = 3.32, SD = 0.37). Additional questions related to perceptions of the ARS as a teaching/learning tool were posed to attendees (n = 491) at CME lectures during the last year of the study. The number of attendees actually completing the surveys was not given; however, 92.4% rated ARS lectures as more useful for learning than non-ARS lectures.
Only a small number of research studies on ARS use within health professions education have been published. Previous research has tended to focus on either student/faculty perceptions of its use or on knowledge retention when using an ARS during lectures; however, there are numerous areas for further ARS research.
ARS SELECTION AND APPLICATION
The next few sections of this manuscript serve as a primer on audience response system selection, issues, usage, and applications to teaching and learning. They are written as an overview for those considering the adoption and implementation of an ARS at the institution, program, or course level. Particular attention is given to logistical issues, pedagogical issues, and considerations of optimal usage that are not typically found in the research literature. Individual faculty members, departments, and colleges/schools must decide for themselves which aspects of a system are most important.
Vendors
Because ARS vendors, features, and pricing schemes change rapidly, no particular vendor will be discussed. Instead, selection and implementation information that can be applied across all vendors is presented. Some of the more prominent ARS vendors include:
Comtec (http://www.comtecars.com)
eInstruction (http://www.einstruction.com)
iClicker (http://www.iclicker.com)
Interwrite PRS (http://www.interwritelearning.com)
Qwizdom (http://www.qwizdom.com)
TurningPoint (http://www.turningtechnologies.com)
Like other e-learning applications, this list is fluid and subject to change.
Reasons for Acquisition
Determining the reasons for acquiring an ARS is important since those reasons may be the guiding factor in vendor selection. These reasons might include, but are not limited to: learner engagement, ability to determine whether learners are appropriately assimilating course materials, retention of prior course content, formative and summative assessments (both low stakes and high stakes), and attendance checks.
Ownership Considerations
The issues and logistics associated with system ownership are another fundamental consideration in choosing an ARS. Understanding how an ARS works is necessary to understand the ownership issues. An ARS is the integration of several different facets, one of which is the software necessary to create the polling questions and run the system. This software is typically installed on faculty workstations, shared computer resources, and in the classroom. In order for the learner to respond, each individual needs a response device (clicker), and the learning environment (classroom) needs a receiver (antenna or USB key).
One challenge is to choose an ARS based on how well it integrates into both a program's teaching philosophy and the institution's technology plan. For example, a pharmacy school with a mandatory laptop program may highly value an ARS that can utilize laptops as response devices, rather than basing the decision on other features.
Because of the multifaceted nature of issues involved in implementing an ARS, numerous variables need to be considered and multiple questions answered. Who purchases (owns) the response stations? How does a program handle defective response cards? Who purchases the receivers (antennas)? Who purchases the software? Does the purchase of any hardware device include the software? Should the college have additional receivers (antennas) and response cards? The answers to these questions are vital to programmatic decisions regarding implementation and management of an ARS.
Financial Considerations
The cost of implementing and maintaining a system is one of the most obvious and important variables when selecting a system. Vendors offer different licensing models and pricing schemes that can complicate cost comparisons. Initial purchase cost, licensing fees, response device costs, support, and maintenance are just a few of the variables that should be included in calculating the total cost of ownership.
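As a simple arithmetic illustration of total cost of ownership, the Python sketch below sums one-time hardware purchases and recurring fees over an assumed ownership period; all figures, cost categories, and vendor labels are hypothetical and would be replaced with actual quotes.

```python
# Hypothetical total-cost-of-ownership comparison for two ARS vendors.
# Every figure below is an assumption for illustration, not actual pricing.

def total_cost_of_ownership(receiver_cost, receivers, clicker_cost, clickers,
                            license_per_year, support_per_year, years):
    """Sum one-time hardware costs and recurring fees over the ownership period."""
    one_time = receiver_cost * receivers + clicker_cost * clickers
    recurring = (license_per_year + support_per_year) * years
    return one_time + recurring

# Vendor A: cheaper clickers but an annual software license; Vendor B: receivers
# bundled at no charge when a minimum number of clickers is purchased.
vendor_a = total_cost_of_ownership(150, 6, 35, 120, 500, 300, 4)
vendor_b = total_cost_of_ownership(0, 6, 55, 120, 0, 450, 4)
print(f"Vendor A: ${vendor_a:,}   Vendor B: ${vendor_b:,}")
```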
The decision on whether the program or students should pay for the system may vary depending on which vendor is selected. ARS pricing and packages differ among the vendors. Frequency of use by students as well as the initial and/or annual cost should also be considered. Holding students responsible for purchasing response cards may be the easiest solution; however, the administration should purchase any required software to demonstrate programmatic support to faculty members and students. Warranties on the devices (typically 1 year in length) can sometimes be negotiated into student purchase plans.
Implementation Considerations
While cost may be a significant variable in a purchase decision, it should not come at the expense of ease of use or the ability to achieve the intended purpose. The adoption and use of an ARS represents a change to the teaching, learning, and assessment dynamic, which can be difficult for some. If a product has a steep learning curve or is not easy to use, adoption will be hindered or the product rejected altogether. Ease of use considerations when choosing an ARS should include, but are not limited to, the following: the integration of ARS software within PowerPoint or other presentation software, the system's ability to be used as a free-standing application outside of PowerPoint, the level of difficulty in creating and using polling slides in the learning environment, the complexity of the ARS response device, and the system's data recording and reporting features. Ease of use comparisons can be accomplished by using a matrix of the above considerations, as well as those specific to the institution. A basic comparison can be accomplished using manufacturer-provided information to get a general sense of functionality. It may also benefit all constituents to view and participate in an onsite demonstration.
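One way to operationalize such a matrix is with weighted criterion scores, as in the hypothetical Python sketch below; the criteria, weights, and 1-5 ratings are assumptions and should be replaced with institution-specific values gathered from demonstrations or vendor documentation.

```python
# Illustrative ease-of-use comparison matrix. The criteria, weights, and
# 1-5 ratings are hypothetical placeholders for institution-specific values.

criteria_weights = {
    "PowerPoint integration": 0.25,
    "Stand-alone operation": 0.15,
    "Polling-slide authoring": 0.20,
    "Response device simplicity": 0.20,
    "Data recording and reporting": 0.20,
}

candidate_ratings = {
    "System A": {"PowerPoint integration": 5, "Stand-alone operation": 3,
                 "Polling-slide authoring": 4, "Response device simplicity": 4,
                 "Data recording and reporting": 3},
    "System B": {"PowerPoint integration": 3, "Stand-alone operation": 5,
                 "Polling-slide authoring": 3, "Response device simplicity": 5,
                 "Data recording and reporting": 3},
}

for system, ratings in candidate_ratings.items():
    weighted = sum(weight * ratings[criterion]
                   for criterion, weight in criteria_weights.items())
    print(f"{system}: weighted ease-of-use score {weighted:.2f} of 5")
```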
The decision to adopt an ARS is usually an administrative one; however, obtaining extensive feedback from faculty and staff is advised. Establishing a small advisory committee to evaluate the different considerations and participate in the onsite demonstration is recommended. Involvement of the advisory committee should facilitate faculty member and student adoption of the ARS and promote the idea that it was a shared decision by users and not simply an administrative decision.
ARS Response Card (Clicker) Considerations
Response cards (clickers) vary in terms of functionality and cost. Some devices accept only a single alphanumeric input, while others allow multiple alphanumeric inputs and even short essay answers. Some ARSs work with a variety of response cards, while others require their own proprietary card. In general, response cards that allow complex input are more expensive than simpler ones. Proprietary software is now being developed that allows computers to serve as response devices. How instructors want to use the devices during class, along with the cost differential in response devices, may dictate the vendor and type of device chosen.
ARS Receiver Type (Infrared Versus Radio Frequency) Considerations
Receivers (antennas) are required to pick up the signal sent by each response card (clicker). There are 2 types of receivers currently in use, infrared (IR) and radio frequency (RF), each with its own advantages and disadvantages. IR units are generally less expensive, but they require “line of sight” similar to a television remote control and may not function well in larger venues or in rooms where reception might be obstructed. RF units typically cost more but do not require line of sight and handle larger audiences more reliably. Institutional computing groups, in addition to the advisory committee identified earlier, should determine which is most appropriate.
Purchase of the receivers (antennas) is usually the responsibility of the program and can be handled either by purchasing dedicated receivers for each instructional classroom or by purchasing a few receivers and moving them from room to room as needed. Receivers are usually relatively inexpensive; therefore, purchasing them and dedicating one for use in each instructional classroom is not a significant expense. Some vendors supply the receivers at no cost if a minimum number of response cards are purchased.
Need for Additional Receivers and Response Devices
Purchasing additional receivers and response cards that faculty members and administration can check out and use outside of the standard classrooms with other constituents (faculty, alumni, continuing education, and other external groups) may be beneficial. ARSs are portable, easy to use, and applicable to a variety of learning and presentation situations. For example, an ARS may be valuable for assessing understanding during a continuing education program, as opposed to handing out questions on paper. This can be a beneficial way to represent the program to external constituents of the school (preceptors, alumni, advisory board, and others) and give a positive view of the application of technology. Finally, using an ARS in faculty and/or committee meetings permits faculty members to experience the system from an end-user perspective and allows for anonymity during difficult faculty voting issues. Maintaining a central checkout system for the additional response cards and receivers helps preserve accountability for system usage. In addition, depending upon usage, multiple receivers with large blocks of response cards may be needed to ensure availability of the system.
Classroom Considerations
Each system has some implementation issues that are unique to that product. However, some general implementation issues should be considered in order to ensure a seamless application of an ARS in the classroom. It is important to have the software installed and the receiver (antenna) in place in every classroom where the ARS will be used. Programming receiver channels and frequencies in advance reduces the chances for failure. If equipping all rooms is financially unfeasible, consideration should be given to placing the receivers in rooms with the potentially best return on investment. Although less than ideal, a mobile cart equipped with an ARS receiver and any necessary audiovisual equipment can be shared among classrooms. Additional classrooms and facilities can be added incrementally as the budget allows.
Some systems require the receiver to be “initiated” before the system will work. If so, this must be completed first in order for the receiver to recognize the student response cards. Documenting ARS, computer, and room setup procedures is important to ensure that each classroom environment is consistent in the event something in the classroom is reset and/or future classrooms are used.
Faculty Training and Support
The availability of a designated primary support person to resolve problems, answer questions, and ensure resolution of any updates is crucial for faculty members. As with any new technology, it is important to ensure that all faculty members who use the ARS receive appropriate training in its use. Faculty development may need to be offered early and often to foster comfort and confidence with an ARS. One approach is to provide initial training, let faculty members use the system in the classroom, and then offer a follow-up session addressing any identified issues. This also gives other faculty members the opportunity to learn from peers and hear first-hand accounts of benefits and shortfalls of the system.
Student Considerations
Because students are the end users of the system, the impact of an ARS on them must be recognized. Any issues they might have should be considered during the preimplementation and implementation phases. Device costs are particularly important to students, and every effort should be made to make the response card purchase a one-time event, linking it to a programmatic requirement.
Finally, appropriate application of the ARS in the curriculum should be defined and encouraged. Overuse of the system or use for trivial purposes can lead to student burnout and apathy. If students lose motivation to participate in ARS sessions, the academic value is decreased and faculty members may become reluctant to use it.
APPLICATIONS TO TEACHING AND LEARNING
The uses of an ARS as related to teaching, learning, and assessment are widespread and applicable in a variety of situations. The following are some general applications of an ARS within the classroom.
Gauging and Improvising
One of the most useful applications of an ARS is gauging and improvising within lectures. As previously discussed, using an ARS to gauge student understanding at any point during a lecture is extremely useful and an easy application for which these systems were designed. Numerous strategies exist for obtaining different types of information. For example, an instructor can use an ARS at the start of a lecture to evaluate understanding from a previous course, lecture, or assigned readings. This pretesting provides information on retention and understanding of previously discussed material. Using an ARS in the middle or at the end of a lecture can gauge student understanding of the material just presented. The instructor can then use the results of ARS feedback to focus more on areas of misunderstanding. One factor for faculty members to consider is whether ARS exercises are graded or purely formative, with no grades attached; this determination may affect how students perceive the use of the system.
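As a minimal sketch of the gauging step, the Python example below summarizes exported polling responses to flag questions that most students missed; the record format, field names, and 70% threshold are assumptions, since export formats differ by vendor.

```python
# Minimal sketch: summarize exported ARS responses to flag questions that
# most students missed. The record format, field names, and 70% threshold
# are assumptions; actual export formats vary by vendor.

from collections import Counter

responses = [
    {"question": "Q1", "answer": "B", "correct": "B"},
    {"question": "Q1", "answer": "C", "correct": "B"},
    {"question": "Q1", "answer": "D", "correct": "B"},
    {"question": "Q2", "answer": "A", "correct": "A"},
    {"question": "Q2", "answer": "A", "correct": "A"},
]

totals, correct = Counter(), Counter()
for r in responses:
    totals[r["question"]] += 1
    if r["answer"] == r["correct"]:
        correct[r["question"]] += 1

for q in sorted(totals):
    pct = 100 * correct[q] / totals[q]
    note = "  <-- revisit before moving on" if pct < 70 else ""
    print(f"{q}: {correct[q]}/{totals[q]} correct ({pct:.0f}%){note}")
```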
Student Engagement
One of the primary benefits of ARS usage is the ability to increase student engagement and interactivity.31 Asking students to answer questions with the system actively engages them with the content and can also spark further class discussion. Using an ARS to increase interaction may reduce the number of times that an instructor asks a question in class and receives only a minimal response. Because of greater participation in answering the questions, an instructor can obtain accurate feedback from the class as a whole.
Eliciting “True” Opinions and Thoughts
With an ARS, it is possible to know how students truly feel about a given situation, while giving them the security of being anonymous. Anecdotally, one reason that students give for lack of participation in class is that they do not want to risk being wrong in front of their peers. An ARS gives students the chance to interact and offer opinions without fear of public scrutiny for incorrect answers or expressing an unpopular opinion. As a result, an instructor may receive more accurate feedback from students than when using the traditional “raise your hand” method.31
RECOMMENDATIONS FOR RESEARCH
The overall research base (particularly within the health professions) on ARS usage is relatively small compared with that for other forms of technology-based learning. Most of the existing ARS literature is descriptive and focuses on perceptions or attitudes of students and instructors. Also, studies in which learning or other outcomes are measured generally compare courses/lectures in which an ARS was used with courses/lectures that did not use an ARS. While these studies may be valuable to teaching/learning practitioners, they lack a very important element in terms of research: the instructional strategy that was used. Any effects from using an instructional medium come not from the medium itself, but from the instructional methods employed.32 The design of the questions and how the system is utilized within the overall instructional setting will dictate its success. An ARS is simply another tool for teaching and learning; like other media or instructional tools, how it is used determines its effectiveness.
Future research on ARSs should primarily focus on specific teaching and learning needs including analysis of different strategies to elicit feedback, interaction, and/or participation. Information is needed on specific strategies that promote desired levels of discussion and/or interaction. Another area for further research is the use of ARSs to facilitate discussions on topics of a sensitive nature, such as ethics, morality, and personal belief systems.
Researchers interested in adding to the knowledge base about teaching with ARSs have several questions from which to choose. How can an instructor take advantage of the anonymity and group comparison features of ARSs to delve into discussions of delicate topics? Does a particular style of ARS-aided questioning result in better discussions in which critical thinking and analysis are more prevalent? These types of questions need answers in order for teaching practitioners to take full advantage of ARSs as a teaching tool.
CONCLUSION/SUMMARY
With appropriate strategies, ARSs have the potential to make classroom lectures more engaging and interactive. In addition to increased participation by students, instructors can reap the benefits of having real-time feedback on students' understanding and misunderstanding of lecture material. As with most technologies, careful consideration needs to be given to the selection and use of an ARS at an individual course and/or programmatic level. Pedagogical, technical, and logistical issues should be addressed in order to achieve successful implementation in the educational environment.
ACKNOWLEDGMENTS
The authors would like to thank the many different faculty members who have shared experiences and comments with the authors concerning ARS usage. Also, we thank Ms. Belinda Morgan for providing editorial assistance on this manuscript.
REFERENCES
1. Abrahamson L. A brief history of networked classrooms: effects, cases, pedagogy, and implications. In: Banks DA, editor. Audience Response Systems in Higher Education: Applications and Cases. Hershey, PA: Information Science Publishing; 2006. pp. 1–25.
2. Burnstein RA, Lederman LM. The use and evolution of an audience response system. In: Banks DA, editor. Audience Response Systems in Higher Education: Applications and Cases. Hershey, PA: Information Science Publishing; 2006. pp. 40–52.
3. Brezis M, Cohen R. Interactive learning in medicine: Socrates in electronic clothes. Q J Med. 2004;97:47–51. doi:10.1093/qjmed/hch008.
4. Draper SW, Brown MI. Increasing interactivity in lectures using an electronic voting system. J Comput Assisted Learning. 2004;20:81–94.
5. Kennedy GE, Cutts QI. The association between students' use of an electronic voting system and their learning outcomes. J Comput Assisted Learning. 2005;21:260–8.
6. Stuart SAJ, Brown MI, Draper SW. Using an electronic voting system in logic lectures: one practitioner's application. J Comput Assisted Learning. 2004;20:95–102.
7. Preszler RW, Dawe A, Shuster CB, Shuster M. Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses. Life Sci Educ. 2007;6:29–41. doi:10.1187/cbe.06-09-0190.
8. Lowery RC. Teaching and learning with interactive student response systems: a comparison of commercial products in the higher-education market. Paper presented at: Annual Meeting of the Southwestern Social Science Association and its affiliates; March 23-26, 2005; New Orleans, LA.
9. Slain D, Abate M, Hodges BM, Stamatakis MK, Wolak S. An interactive response system to promote active learning in the doctor of pharmacy curriculum. Am J Pharm Educ. 2004;68:1–9.
10. Draper S, Cargill J, Cutts Q. Electronically enhanced classroom interaction. Aust J Educ Technol. 2002;18:13–23.
11. Jones C, Connolly M, Gear A, Read M. Group interactive learning with group process support technology. Br J Educ Technol. 2001;32:571–86.
12. Gentry D. Using audience response systems in FCS. J Fam Comp Sci. 2007;99:42–4.
13. Trees AR, Jackson MH. The learning environment in clicker classrooms: student processes of learning and involvement in large university-level courses using student response systems. Learn Media Technol. 2007;32:21–40.
14. Herreid CF. “Clicker” cases: introducing case study teaching into large classrooms. J Coll Sci Teach. 2006;36:43–7.
15. Caldwell JE. Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ. 2007;6:9–20. doi:10.1187/cbe.06-12-0205.
16. Steinert Y, Snell L. Interactive lecturing: strategies for increasing participation in large group presentations. Med Teach. 1999;21:37–42.
17. Johnson JT. Creating learner-centered classrooms: use of an audience response system in pediatric dentistry education. J Dent Educ. 2005;69:378–81.
18. Hatch J, Jensen M, Moore R. Manna from heaven or “clickers” from hell. J Coll Sci Teach. 2005;24:36–9.
19. Beekes W. The ‘Millionaire’ method for encouraging participation. Active Learning High Educ. 2006;7:25–36.
20. Uhari M, Renko M, Soini H. Experiences of using an interactive audience response system in lectures. BMC Med Educ. 2003;3:12. doi:10.1186/1472-6920-3-12.
21. Beatty ID, Gerace WJ, Leonard WJ, Dufresne RJ. Designing effective questions for classroom response system teaching. Am J Phys. 2006;74:31–9.
22. Fitch JL. Student feedback in the college classroom: a technology solution. Educ Technol Res Dev. 2004;52:71–81.
23. Elliott C. Using a personal response system in economics teaching. Int Rev Econ Educ. 2003;1:80–6.
24. Knight JK, Wood WB. Teaching more by lecturing less. Cell Biol Educ. 2005;4:298–310. doi:10.1187/05-06-0082.
25. Bunce DM, VandenPlas JR, Havanki KL. Comparing the effectiveness on student achievement of a student response system versus online WebCT quizzes. J Chem Educ. 2006;83:488–93.
26. Trapskin PJ, Smith KM, Armitstead JA, Davis GA. Use of an audience response system to introduce an anticoagulation guide to physicians, pharmacists, and pharmacy students. Am J Pharm Educ. 2005;69:190–7.
27. Schackow TE, Chavez M, Loya L, Friedman M. Audience response system: effect on learning in family medicine residents. Fam Med. 2004;36:496–504.
28. Pradhan A, Sparano D, Ananth CV. The influence of an audience response system on knowledge retention: an application to resident education. Am J Obstet Gynecol. 2005;193:1827–30. doi:10.1016/j.ajog.2005.07.075.
29. Miller RG, Ashar BH, Getz KJ. Evaluation of an audience response system for continuing education of health professionals. J Continuing Educ Health Professions. 2003;23:109–15. doi:10.1002/chp.1340230208.
30. Copeland HL, Stoller JK, Hewson M. Making the continuing medical education lecture effective. J Continuing Educ Health Professions. 1998;18:227–34.
31. Robinson E, Cain J. Do you know what your students know? Using response stations to engage your students. Paper presented at: Annual Meeting of the American Association of Colleges of Pharmacy; July 16, 2007; Orlando, FL.
32. Clark RE. Media will never influence learning. Educ Technol Res Dev. 1994;42:21–9.