. Author manuscript; available in PMC: 2019 Jul 1.
Published in final edited form as: Public Health Nurs. 2018 Mar 22;35(4):353–359. doi: 10.1111/phn.12397

Lessons Learned from using an Audience Response System in a Community Setting for Research Data Collection

Keneshia Bryant-Moore 1, Tiffany Haynes 1, Dennis Z Kuo 2, M Kathryn Stewart 1, Karen Hye-cheon Kim Yeary 1, Johnny Smith 3, Jerome Turner 4, Songthip T Ounpraseuth 1, Greer Sullivan 5, Stephanie McCoy 1, Brittany Hudson 1, Kimberly Harris 1
PMCID: PMC6055995  NIHMSID: NIHMS940059  PMID: 29566271

Abstract

A community-academic team implemented a study in which quantitative data were collected using a computer-based audience response system (ARS), with community partners leading the data collection efforts. After data collection, the team participated in a reflection exercise to evaluate the community partner-led process and identify best practices and lessons learned. A qualitative research consultant facilitated the reflection exercise, which consisted of two focus groups: one with academic and one with community research team members. The consultant then conducted content analysis. Nine members participated in the focus groups. The reflection identified the following themes: positive aspects of the ARS, challenges to overcome, and recommendations for the future. The lessons learned here can help community-academic research partnerships identify the best circumstances in which to use ARS for data collection and practical steps to aid in its success.

Keywords: Community Based Participatory Research, Community/Public Health, Computers/Technology, Data Collection Methods

Background

In community based participatory research (CBPR), brief surveys are often used to capture information from community participants, but some survey designs and administration techniques may not meet the needs of populations with low literacy rates. Suitable methods for designing and collecting quantitative survey data in low-literacy populations are well established. These methods include recommendations for the design of the survey itself, such as 1) writing the survey at a sixth-grade reading level or lower; 2) keeping sentences short; 3) avoiding words with multiple syllables; 4) using ample white space on printed surveys; and 5) using pictures or illustrations (National Institutes of Health, 2016). In addition, delivery options for the survey may include 1) reading the survey questions aloud to participants, either by a live person or via a recording; 2) providing assistance in completing the survey, such as an interviewer asking the questions and recording the responses for the participant; and 3) using iPads, tablets, mobile phones, computers, or other electronic assistive devices (Hearst, 2014; Ndebele, Wassenaar, Munalula & Masiye, 2012). These techniques can be used with individuals or with groups in the community setting, but delivering surveys by these methods presents its own challenges.

There are notable challenges to collecting data from large groups in a community setting. These include the limited availability of assistive electronic devices due to the cost of purchasing them. Though some programs allow participants to use their personal mobile phones, these programs require 1) an internet connection, 2) a smartphone, and/or 3) an app downloaded to the phone (Stowell, 2015). Other data collection programs are web-based and likewise require an internet connection at the community setting. Though researchers can purchase “hot spots” to support these programs, doing so adds to the cost and coordination of the process.

Because of these identified challenges, alternative data collection methods for this population may be sought. The Audience Response System (ARS) is a tool developed for classroom learning, but can be used to collect data in low-literacy populations. This approach offers several advantages – the program administrator presents the information and multiple-choice questions on a PowerPoint slide and participants use a handheld electronic key-pad device or “clicker” to transmit their responses to the computer program. Using a clicker to submit anonymous responses encourages participants to answer the questions honestly without fear of embarrassment from answering incorrectly (Mastoridis & Kladidis, 2010). ARS has been shown to increase engagement between program administrators (teachers, researchers, presenters) and their audience (students, research participants, conference attendees)(Mastoridis & Kladidis, 2010; Vana, Silva, Muzyka & Hirani, 2011; Thomas, Monturo & Conroy, 2011; Solecki, Cornelius, Draper & Fisher, 2010; Patel, Koegel, Booker, Jones & Wells, 2006). The program can display anonymous, aggregate responses on a PowerPoint slide, providing the program administrator and participants with immediate feedback. This feedback allows the administrator to discuss the material in the survey and provide further information (Mastoridis & Kladidis, 2010).

For researchers, ARS can be an efficient method to engage a group of participants and collect data from them. The data are automatically saved into a computer database that does not require an internet connection. This can not only reduce data entry errors but also alleviate the burden of processing paper-based surveys and of using internet “hot spots” (Gray et al., 2016; Riebl, Paone, Hedrick, Zoellner, Estabrooks & Davy, 2013).

Objective

An extensive literature search using CINAHL, Google Scholar, PubMed, MEDLINE, and PsycINFO identified few research manuscripts describing the use of ARS as a data collection method in research. The studies that did incorporate ARS were conducted by academic researchers and most often used it as an assessment tool to analyze research participants’ comprehension of the consent process or simply to collect demographic data (Gamito, Burhansstipanov, Krebs, Bemis & Bradley, 2005; Vohra, Chebl, Miller, Russman, Baker & Lewandowski, 2014; Keifer, Reyes, Liebman & Juarez-Carrillo, 2014). Gray et al. (2016) did use ARS to test the validity and reliability of the Food, Health, and Choices Questionnaire (FHC-Q) administered with ARS technology; that study occurred in a classroom setting and the ARS was administered by researchers. There is a gap in the literature describing the use of ARS as a data collection method for research in a community setting, administered by community members. Thus, the objective of this paper is to describe and evaluate the innovative and unique process of using ARS as a data collection method for a CBPR study in a community setting.

Methods

Target Population

Arkansas leads the nation in deaths related to cancer, cardiovascular health, and diabetes. The state ranks 48th in the nation in its citizens’ overall health status and ability to access health care (United Health Foundation, 2017). Additionally, individuals in Arkansas are more likely to report engaging in health behaviors that place them at risk for developing chronic health conditions, including physical inactivity and smoking (Kaiser Family Foundation, 2017). This is especially true for ethnic minorities in Arkansas. Health disparities based on race and ethnicity are striking in Arkansas, especially in the Delta region, which is disproportionally affected by poverty. There is a 31% disparity between African American and White Arkansans’ mortality rates for all causes (U.S. Department of Agriculture, 2011). Recognizing the pressing need to eliminate health disparities, community organizations and members, especially those from the faith community in the Delta, have partnered with university researchers over many years to develop and implement health interventions. In the rural South, churches play an essential role in communities, serving as key organizational entities that provide social support and leadership. Because of their position in the community, churches are potentially effective settings for implementing health interventions. Over 10 years ago, the Faith Task Force in Phillips County was developed to identify and address the health needs of the community using a CBPR approach in partnership with a university faculty member. Chaired by Pastor J.T., the Faith Task Force is made up of clergy, parishioners, and community leaders from a diverse representation of denominations. In 2013, the task force expanded to Jefferson County, led by Pastor J.S.
As a part of a funded research grant, university researchers in the fields of medicine, psychology, nursing, and nutrition, together with community partners, set out to identify the most pressing health concerns of congregants in the Arkansas Delta through surveys. The community-academic partnership team adapted an existing questionnaire to develop the final 72-item questionnaire that was administered to groups of church members from rural Black Churches. The questionnaire included questions associated with physical conditions (e.g. diabetes), mental health conditions (e.g. depression), and health behaviors (e.g. alcohol abuse). Examples of such questions included reported general health in adults and children, use of specific health care resources in the past year, and screening behaviors (e.g. screening for breast cancer). Instrument development is described elsewhere (Yeary et al., 2015; Stewart et al., 2015). Because of the low literacy rates in the region and the evidence of engagement and novelty associated with ARS, it was chosen as the data collection method. Another university community partner, Tri-County Rural Health Network (TCRHN), had previous experience using ARS in the community. TCRHN is a non-profit community-based organization whose home office is located in the Arkansas Delta region. TCRHN offers information and programs addressing health care. For this project it provided expertise in using the ARS system in community settings.

Community Engagement

The community partners were trained in teams of two to lead the ARS data collection sessions in their counties of residence. The Phillips County team consisted of Pastor J.T. and a community liaison, who had engaged in data collection before, but not in the ARS format. The other team, based in Jefferson County, consisted of Pastor J.S. and a community liaison. The teams were trained by a TCRHN staff member who had used the ARS in the community setting on a different project (Stewart et al., 2015). The trainings occurred at a location convenient for each team, in an unstructured format. The ARS equipment was owned by the university (used primarily in the classroom setting) and had to be checked out and returned for each training and subsequent data collection session. The TCRHN staff member also attended the first couple of data collection sessions in each county to provide technical support, and provided a couple of brief refresher trainings for the team members at their request.

Prior to the first data collection session, a research assistant developed the PowerPoint slides to be used with the ARS software for the sessions. The ARS software does not support continuous data or open-ended questions; therefore, a paper-and-pencil survey was developed to supplement the ARS survey. Due to limited research funds, a confined number of ARS “clickers,” and the goal of having a diverse sample, each church setting was limited to 20 participants. The community partners recruited all of the participating churches for the study. From August 2013 to February 2014, participants completed health assessments using the ARS. A 10-step data collection protocol was developed (see Table 1). A total of 461 participants across 30 churches completed the ARS survey. On average, data collection sessions took 2 hours. The results of the survey were used to guide research priorities, focusing on the health issues of greatest concern to the community.

Table 1.

Data Collection Process using ARS

ARS Research Team Training and Preparation
Adaptation of the health assessment questionnaire by the community and academic research team.
Questions were developed in PowerPoint for use in the ARS software and the paper-and-pencil questionnaire supplement was developed.
Training of the community partners on the ARS software and equipment.
Recruitment of churches to participate in the study by community partners.
Data collection sessions were then scheduled at each church. Generally, the sessions were during the week in the evening hours, but a few were held on a Saturday morning.
Protocol for Data Collection Sessions
Step 1: Participants were welcomed as they entered the church; before getting started, the hosting church pastor was asked to give a few words and introduce the research team.
Step 2: The research study was described, including participants’ rights; a “Participant Information Sheet” with this information was distributed and read verbatim. Signatures from the participants were not required.
Step 3: The ARS handheld devices/“clickers” were distributed and an orientation on how to use them was provided.
Step 4: A framing of the health assessment was then provided by either Pastor J.S. or Pastor J.T. This included a brief description of the health disparities experienced in the community.
Step 5: The ARS health assessment survey was conducted. Each question was read aloud and time was allotted for discussion and questions.
Step 6: At the end of the ARS health assessment, participants were asked their thoughts about the most concerning health issue in their community.
Step 7: Because the ARS does not support continuous data responses, a paper health assessment was also required. The paper health assessment survey and pens were distributed to participants. As the paper surveys were being distributed, the ARS handheld devices were collected and the corresponding device identification number was written on the paper survey for each participant. Prior to the data collection session, the session date and location had been written on the surveys.
Step 8: Each question on the paper survey was then read aloud and time was allotted for discussion and questions.
Step 9: When the paper survey was completed, a large envelope was passed around for participants to place their surveys inside.
Step 10: At the conclusion of the session, the participants and hosting church pastor were thanked for their time and input, then provided their incentives.
Post-Data Collection Session
After the data collection sessions were completed, the community partners downloaded the ARS data (in Excel format) to a password-protected USB flash drive and gave it to the research assistant along with the completed paper-and-pencil questionnaires for data entry. The paper survey data were entered into an Excel spreadsheet. The device identification number was included in order to merge the ARS data with the paper data for each participant.
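As an illustrative sketch only, the device-ID merge described above could be performed in Python with pandas. The file layout, column names, and data values here are hypothetical assumptions, not the study team's actual spreadsheets:

```python
# Hypothetical sketch of the post-session merge: combining the ARS
# Excel export with the manually entered paper-survey data using the
# clicker device identification number and session date as keys.
# All column names and values below are illustrative assumptions.
import pandas as pd

# ARS export: one row per clicker, multiple-choice responses only
ars = pd.DataFrame({
    "device_id": [101, 102, 103],
    "session_date": ["2013-09-12", "2013-09-12", "2013-09-12"],
    "q1_general_health": [2, 4, 3],
})

# Paper supplement: continuous data the ARS keypad could not capture
paper = pd.DataFrame({
    "device_id": [101, 102, 103],
    "session_date": ["2013-09-12", "2013-09-12", "2013-09-12"],
    "age": [54, 61, 47],
})

# Outer join on device ID and session date; the indicator column flags
# records that fail to match (e.g. a mislabeled survey or swapped clickers).
merged = pd.merge(ars, paper, on=["device_id", "session_date"],
                  how="outer", indicator=True)
unmatched = merged[merged["_merge"] != "both"]
```

An outer join with an indicator column, rather than a plain inner join, surfaces exactly the mismatch problems the team reported (mislabeled surveys, exchanged clickers) instead of silently dropping those records.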

Evaluation of the ARS in the Community Setting

Because of the unique research team arrangement and new data collection method, the team participated in a reflective exercise in order to evaluate the process (planning, training, and facilitating) and identify lessons learned. The reflective exercise was based upon the concept of reflective practice, which is a method or technique used in various fields that aids individuals and groups to reflect on their experiences and actions in order to promote the process of continuous learning (Moon, 1999). This can be particularly effective in CBPR to not only reflect on the research process but also the partnership.

An external qualitative research consultant was contracted to conduct two focus groups: one with the academic team members and one with the community team members; all participants had attended at least one health assessment using the ARS. The community and academic partners were interviewed separately to provide a ‘safe space’ in which to openly share their reflections on the data collection experience. The consultant developed the interview guide for the focus groups based on the study protocol and the team’s desired outcomes (e.g. lessons learned). The questions were developed to elicit information about the ARS, the data collection process, and the community-academic partnership within the context of this process. Questions included “When you think about the use of the ARS, what are the first thoughts that come to mind?”, “Describe what you set out to do and what actually occurred when using the ARS in the church setting.”, and “What would you say went well using the ARS?”

The consultant was accompanied by a note-taker who was not involved in the study. The note-taker captured non-verbal cues and an overall description of the focus group and group dynamics. The focus groups lasted approximately 90 minutes and were conducted on the university campus in a private conference room. The sessions were digitally recorded and transcribed verbatim. The consultant then reviewed and coded the transcripts and conducted content analysis using Atlas.ti software. The results were shared with the research team, and there was consensus that the consultant had adequately captured what was shared in the focus groups and presented it in the appropriate context. Because this was a reflection exercise, the university Institutional Review Board (IRB) determined it to be Non-Human Subjects Research.

Lessons Learned

Six academic and three community research team members participated in the focus groups. Data were organized into three categories: positive aspects of the ARS, challenging aspects of the ARS in a community setting, and recommendations for the future.

Positive Aspects of the ARS

The positive aspects of ARS as a data collection method in research fell into three major areas: training/preparation; recruitment and participant experience at the data collection sessions; and the benefits of CBPR when using the ARS in the community. The community partners described how they benefitted from being able to practice with the ARS equipment on their own time. The pastors were happy to have the equipment available to them well ahead of the first session in a church. Pastor J.T., who has more research experience, and his community liaison offered assistance to Pastor J.S., who has less research experience, and this was deeply appreciated.

Nobody in my county had been involved in this kind of initiative before, so there was nobody to call, nobody to contact, no other pastor. It was a kind of trailblazing for me and for our county. Rev. J.T. who had more experience…in this environment and research was able to give me some reassurance. Community Team Member

Pastor J.S. also received ongoing assistance from the TCRHN staff member even after her funding had ended. This additional technical assistance built confidence, and Pastor J.S. felt more comfortable with the ARS toward the end of data collection. Pastor J.T. stated that his prior exposure to the ARS helped him with the data collection process.

The community partners talked about two large “front end” expectations: recruitment and raising community members’ awareness about health issues. The academic team members credited the pastors’ outreach to, and respect within, the community for the successful recruitment, and felt the community partners enhanced the likelihood of people attending and participating in the sessions. The academic team members also expressed confidence in the community leaders’ outreach skills and techniques (e.g. community presentations). All research team members stated that the health awareness raised in the church community setting was positive. This was largely attributed to the community partners’ ability to engage participant audiences and to establish an atmosphere that allowed humor and talk about sensitive issues (e.g. sexuality). The research team felt the ARS system not only aided engagement by providing immediate feedback to the group to facilitate discussion, but was also a “less boring procedure” than a traditional paper-and-pencil method, and people of all generations were able to use the “clicker” system with ease after a brief orientation to the method.

People could appreciate the information that was given. They could understand the necessity and imperativeness of the health issues in our community as a result of using that [ARS] system. Community Team Member

This [ARS] really engaged the audience, the participants were interested, they would have conversation, they were discussing what was going on and the health of their community in a way that I think that if we had just given them a paper survey, they would have go to their own separate corners, then that wouldn’t have happened. Academic Team Member

Both academic and community team members talked about capacity and confidence building and considered true CBPR as part of “what worked” with the ARS data collection method. Several powerful testimonies to the benefits of CBPR were brought up by both groups.

They [community partners] know their community as far as recruitment and really selling it [the research study]. I think it was important for them to understand the ARS system and how it worked in order to actually share that with the pastors that they were recruiting, so they have a full understanding of exactly what we’re going to be doing. I think it is important for them to really understand that, and then how it would be conducted. Academic Team Member

Participant responses reinforced this impression. Though participants were not formally questioned about their experiences using ARS, informal conversations, comments, and observations indicated that they enjoyed the ARS and the health assessment process, and that they preferred it to paper surveys. The participants also felt that their responses were important and that they individually contributed to a larger process.

Overall, both community and academic team members stated that the ARS data collection method “worked” and was successful in church community health assessments, and that “good, clean quantitative data” was a positive measure of success. Additionally, they felt that the ARS system was user-friendly for research participants, an effective way to gather group data, and was suitable for visually and literacy-challenged audience participants.

We read every question, so that they clearly knew what they were responding to. And I think that worked very well. So I think that was another thing that contributed to the accuracy of the data that was collected because we made sure they knew what the questions were. Community Team Member

I think it was much easier to use for people who couldn’t read very well, because the pastor would read it aloud. I mean, they would have to be able to tell the numbers, but I think it’s definitely easier to take that if you’re illiterate or have literacy issues. Academic Team Member

Challenging Aspects of the ARS in a Community Setting

The research team identified four major challenge areas after using ARS in the community setting: 1) ARS training and technology support; 2) limitations of the ARS equipment and software; 3) the data collection protocol; and 4) potential pressure for congregants to participate.

Community partners stated that the biggest issue for them was becoming comfortable with the ARS equipment and software. They felt it would have been helpful to have access to technical and training resources until they felt comfortable, including comfort with medical terminology and definitions (e.g. colonoscopy). Because the ARS was owned by the university, there were instances when the research team was competing for the equipment with university students, who were given scheduling preference. This made it challenging to ensure the ARS equipment was available not only for the ARS trainings, but also for each data collection session.

That was the challenge, and that was an internal challenge, is that we use the clickers owned by the university and not…we did not purchase them for the actual study. When we initiated the assessments, it was also the beginning of the fall semester. We had to wait for all the students to check out their ARS “clickers” before we could check out ours. Academic Team Member

The community partners also noted that everyone is different and the level of training and follow-up needed varies depending on a person’s previous experiences and confidence-building needs. None of the academic team members participated in the ARS training. Some had previous exposure using the system as instructors or participants in professional conferences, but none felt they were competent in using the ARS equipment and software. This did not become an issue until technological “glitches” were experienced in the field. This is also when it became evident that continued technical support from the TCRHN staff member was needed, but due to budget constraints this was not possible.

During the data collection sessions, when electronic or technical “glitches” needed to be resolved, it was difficult to get immediate assistance or help from the Information Technology (IT) department at the university. This was largely because the university IT department was not included as part of the research team; the department is designed to support classroom use of ARS, not research. In addition, the data collection sessions were often held during non-business hours, which added another layer of complication to obtaining help. Unfortunately, these technical problems led to delays during some of the data collection sessions. Paper versions of the ARS survey were available at each data collection session, but were never needed as a result of technical issues.

For me [the challenge] was the initial stages…being able to correct whatever faults or defaults –whatever occurred, effectively and efficiently. I think at one point we had a defect and I had to try to get in touch with somebody. I couldn’t fix it myself…our screen going black or the system going down and what do I need to do and all that. Community Team Member

The research team identified limitations of the ARS system and software prior to beginning the data collection sessions. The decision to use ARS was made before the survey questions were finalized, and the types of questions developed did not take the limitations of the ARS system into account. The ARS cannot capture continuous or qualitative data, so questions requiring continuous or open-ended responses had to be asked in a paper-and-pencil format. In addition, the numerical 0-9 keypad on the ARS “clicker” limited the number of response options for each question. Because of this, a paper version of the survey was required as a complementary part of the ARS data collection. The team then had to match the electronic data with the paper-and-pencil data, which was difficult in practice because of the potential for mislabeled surveys or participants exchanging handheld ARS devices.

A data collection session protocol (see Table 1) was developed by the team prior to the initiation of data collection; unfortunately, not every scenario could be predicted before the sessions began. The greatest challenge was incorporating into the protocol the management of participants who arrived late and of sessions at which more than 20 community members arrived to participate. For late arrivals, three options were discussed: 1) they would be told they could not participate; 2) they would be given a paper version of the ARS survey; or 3) they would complete only a portion of the ARS survey. It was decided that, depending on how late participants were, they would either be told they could not participate or be given a paper version of the ARS survey to supplement data not captured with the ARS. Convenience sampling was used; therefore, participants were included at each church on a first-come basis. If there was a concern that more than 20 people would be present, numbers (1-20) were distributed as people arrived. Additionally, during some of the data collection sessions participants shared, exchanged, or mixed up their “clickers” during the survey. This often occurred when a participant had to excuse themselves for the restroom or another reason.

People would put their clicker down on the pew…and they would be mixing them up on the pews, “Which one was yours?” Academic Team Member

Lastly, though this was not expressed by any participant to the research team, another potential concern during the data collection sessions was a sense of pressure to participate. Although the data collection was anonymous and participation voluntary, some congregants may have felt obligated to participate due to the respect they have for the community partners and their pastors.

I got the impression that many of them [participants] showed up because the pastor said, “Come on in”. Academic Team Member

Participants were reminded of their rights and that they were not required to answer every survey question. This posed a challenge because the number of participant responses is displayed on the screen; therefore, it was unclear whether a participant had skipped a question intentionally or not.

Recommendations for the Future

The research team identified five major recommendations for using ARS in the future based upon the lessons learned from their experiences. The first recommendation was to ensure the data collected are compatible with the ARS equipment and software. The use of a paper-and-pencil survey to complement the ARS led to additional protocol steps and opportunities for data entry errors.

You have to think about what kind of data you want to collect at the front end…some things probably would’ve been better collected other ways. Academic Team Member

The second recommendation was to develop a formal ARS Training program for the entire research team (community and academic partners).

I think I needed to be trained to use it [ARS]. I had limited experience as a faculty member because we use them in some of our courses for our students…I took the training probably about three, maybe even four years ago…we want to be helpful to our teammates, I felt a lot of times I wasn’t adequate and able to actually help…I wish I knew more about it [ARS]. Academic Team Member

The training should incorporate a practice or mock data collection session prior to collecting research data, confirmation that all facilitators are comfortable with the medical terminology used in the survey, and trouble-shooting of “worst case scenarios” and technical glitches.

…Ensuring the facilitators get to a point where they feel comfortable in being able to say “okay, I believe I can do it,” and because I did go in several sessions a bit uncertain and still a bit timid about being able to be expedient and you know, all those kind of things. Community Team Member

The third recommendation was to purchase ARS equipment specifically for the research team. This would ensure that the equipment is available whenever the team needs it for training and data collection sessions. The research team could also modify some of the “clickers” to include braille for visually impaired participants. To address needed changes in the data collection protocol in a timely manner, the fourth recommendation was to hold a debriefing immediately after each data collection session or during the weekly research team meetings. The fifth and final recommendation was to include more staff able to provide technical support. This may include a person from the university, and hiring community liaisons with more technology experience to support the community partners not only during business hours, but also during evenings and weekends.

Conclusions

As with all CBPR projects, trust, respect, and dependability between community and academic partners were key. These principles were also the basis of success in completing the health assessments using the ARS. The research team identified benefits of using the ARS and the health assessment process in churches in other rural areas. Overall, both the academic and community research team members accepted the ARS as a suitable method for collecting data in a community population with low literacy. The ARS allowed the community investigators to engage the participants and promote discussion. Lastly, the ARS data were easily downloaded into an Excel spreadsheet, which provided a clean data set. The benefits of the ARS identified here support the previous literature (Gray, 2016; Keifer, Reyes, Liebman & Juarez-Carrillo, 2014). On the other hand, the team faced some challenges in collecting data rigorously. The inability to collect continuous data led to the use of a paper-and-pencil survey that then had to be matched with the ARS data. In addition, participants at times shared or exchanged “clickers” during the survey, which threatened the quality of the ARS data. Based upon the experience of the research team and the participants’ level of engagement with the ARS, it will likely be used in future studies, although its use will be limited by the type of data to be collected.

The lessons learned from this research project can help community-academic research partnerships identify the best circumstances in which to use an ARS for data collection, including key elements of the planning, training, and facilitation process. Overall, the ARS was viewed as successful and, with some minor changes and consideration of its limitations, is recommended for use in future CBPR and other projects in the community setting.

Acknowledgments

Funded by NIH Grant 1R24MD007923-01; the project described was also supported by the Translational Research Institute (TRI), grant UL1TR000039, through the NIH National Center for Research Resources and National Center for Advancing Translational Sciences. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. We would like to acknowledge Martha Tinney with Minded Solutions for conducting the focus groups and analyzing the data.

Footnotes

DR. KENESHIA J. BRYANT (Orcid ID : 0000-0002-1361-3215)

References

  1. Gamito E, Burhansstipanov L, Krebs L, Bemis L, Bradley A. The use of an electronic audience response system for data collection. Journal of Cancer Education. 2005;20:80–86. doi: 10.1207/s15430154jce2001s_16.
  2. Gray HL, Koch PA, Contento IR, Bendelli LN, Ang I, Di Noia J. Validity and reliability of behavior and theory-based psychosocial determinants measures, using Audience Response System technology in urban upper-elementary schoolchildren. Journal of Nutrition Education and Behavior. 2016;48(7):437–452. doi: 10.1016/j.jneb.2016.03.018.
  3. Hearst M. Using technology for data collection among low literacy populations. Internal Grant Awards. 2014;56. Retrieved from: http://sophia.stkate.edu/internal_awards/56.
  4. Kaiser Family Foundation. Analysis of the Centers for Disease Control and Prevention (CDC)’s Behavioral Risk Factor Surveillance System (BRFSS) 2013–2015 Survey Results. 2017.
  5. Keifer M, Reyes I, Liebman A, Juarez-Carrillo P. The Use of Audience Response System Technology with Limited-English-Proficiency, Low-Literacy, and Vulnerable Populations. Journal of Agromedicine. 2014;19(1):44–52. doi: 10.1080/1059924X.2013.827998.
  6. Mastoridis S, Kladidis S. Coming soon to a lecture theatre near you: the ‘clicker’. Clinical Teacher. 2010;7:97–101. doi: 10.1111/j.1743-498X.2010.00355.x.
  7. Moon J. Reflection in Learning and Professional Development: Theory and Practice. London: Kogan Page; 1999.
  8. National Institutes of Health. Clear Communication: Clear & Simple. 2016. Retrieved from: https://www.nih.gov/institutes-nih/nih-office-director/office-communications-public-liaison/clear-communication/clear-simple.
  9. Ndebele P, Wassenaar D, Munalula E, Masiye F. Improving understanding of clinical trial procedures among low literacy populations: an intervention within a microbicide trial in Malawi. BMC Medical Ethics. 2012;13:29. doi: 10.1186/1472-6939-13-29.
  10. Patel K, Koegel P, Booker T, Jones L, Wells K. Innovative approaches to obtaining community feedback in the Witness for Wellness experience. Ethnicity & Disease. 2006;16:S35–42.
  11. Riebl S, Paone A, Hedrick V, Zoellner J, Estabrooks P, Davy B. The comparative validity of interactive multimedia questionnaires to paper-administered questionnaires for beverage intake and physical activity: pilot study. JMIR Research Protocols. 2013;2(2):e40. doi: 10.2196/resprot.2830.
  12. Solecki S, Cornelius F, Draper J, Fisher K. Integrating clicker technology at nursing conferences: an innovative approach to research data collection. International Journal of Nursing Practice. 2010;16:268–273. doi: 10.1111/j.1440-172X.2010.01840.x.
  13. Stewart M, Felix H, Olson M, Cottoms N, Bachelder A, Smith J, et al. Community Engagement in Health-Related Research: A Case Study of a Community-Linked Research Infrastructure, Jefferson County, Arkansas, 2011–2013. Preventing Chronic Disease. 2015;12. doi: 10.5888/pcd12.140564.
  14. Stowell J. Use of clickers vs. mobile devices for classroom polling. Computers & Education. 2015;82:329–334.
  15. Thomas C, Monturo C, Conroy K. Experiences of faculty and students using an audience response system in the classroom. Computers, Informatics, Nursing. 2011;29:396–400. doi: 10.1097/NCN.0b013e3181fc405b.
  16. United Health Foundation. Arkansas. 2017. Retrieved from: http://www.americashealthrankings.org/AR.
  17. U.S. Department of Agriculture Economic Research Service. Rural America at A Glance: 2011 Edition. 2011. Retrieved from: http://www.ers.usda.gov/Publications/EIB85/EIB85.pdf.
  18. Vana K, Silva G, Muzyka D, Hirani L. Effectiveness of an audience response system in teaching pharmacology to baccalaureate nursing students. Computers, Informatics, Nursing. 2011;29:TC105–113. doi: 10.1097/NCN.0b013e3182285d71.
  19. Vohra T, Chebl R, Miller J, Russman A, Baker A, Lewandowski C. Improving Community Understanding of Medical Research: Audience Response Technology for Community Consultation for Exception to Informed Consent. Western Journal of Emergency Medicine. 2014;15(4):414–418. doi: 10.5811/westjem.2014.3.19426.
  20. Yeary K, Ounpraseuth S, Kuo D, Harris K, Stewart M, Bryant K, Haynes T, Turner J, Smith J, Williams S, Sullivan G. To what extent do community members’ personal health beliefs and experiences impact what they consider to be important for their community-at-large? Journal of Public Health. 2015:1–9. doi: 10.1093/pubmed/fdv118.