AMIA Annual Symposium Proceedings. 2008;2008:429–433.

Coupling Direct Collection of Health Risk Information from Patients through Kiosks with Decision Support for Proactive Care Management

David F Lobach 1, Garry M Silvey 1, Janese M Willis 1, Kevin R Kooy 1, Kensaku Kawamoto 1, Kevin J Anstrom 2, Eric L Eisenstein 3, Frederick Johnson 1
PMCID: PMC2655964  PMID: 18999181

Abstract

Data collection from patients for use in clinical decision making is foundational for medical practice. Increasingly, kiosks are being used to facilitate direct data collection from patients. However, kiosk-collected data are generally not integrated into the care process. In this project, 4,014 people initiated a kiosk-administered health risk assessment questionnaire using a free-standing public-access kiosk. For 201 of these initiated sessions, kiosk users supplied a Medicaid identification number which allowed their data to be integrated into a regional health information exchange and reviewed by a standards-based clinical decision support system. This system identified 479 survey responses which had been predetermined to warrant follow-up. Notices about these sentinel responses were emailed to care managers and sent to clinical sites. While this study demonstrates the feasibility of collecting and acting on patient-entered health data, it also identifies key challenges to providing proactive care management in this manner.

Introduction

Computers have been used for over four decades to collect clinical information from patients.1–3 The advantages and disadvantages of these data collection efforts have been extensively reviewed.4 With advances in technology, stand-alone computer kiosks are now increasingly being used to support data collection through direct human-computer interactions and to supply health information directly to patients, with demonstrable efficacy5–7 in diverse settings8,9 and across racial/ethnic barriers.10 The focus of many of these systems is to tailor educational content based on patient-entered data.6–9,11 It has been less common for patient-specific health information collected via kiosks to be used for patient management,12 in part because the independent, unassisted collection of clinically useful information requires the association of the collected information with a patient identifier. Most projects in which patient-identifiable clinical data have been collected through computer kiosks have generated information that was presented to a clinician, often through an intermediary, as printed text for interpretation and decision making.12 In this project, we collected information about health risks and barriers to accessing care directly from patients through free-standing public-access kiosks. We then automated the surveillance of these data through a clinical decision support system (CDSS) in order to notify care providers about sentinel and modifiable care issues without the involvement of intermediary facilitators or clinicians.

In this paper, we describe a two-year study in which data were collected directly from Medicaid beneficiaries through public-access kiosks, and then monitored by a CDSS to enable proactive intervention in response to modifiable health risks and addressable barriers to care access. We also discuss the challenges we encountered and lessons learned in developing and supporting proactive care management in this manner.

Methods

Kiosk Hardware

The kiosks used in this project were known as Distributed eHealth Resource Information Center Kiosks (DERICK). Each DERICK kiosk consisted of a hardened metal, impact-resistant enclosure configured with a Windows XP-based computer, laser printer, 15” LCD with a resistive touch screen overlay, rugged keyboard and trackball, video camera, amplified dual speaker system, built-in microphone, and telephone handset (KIS, Inc., Louisville, CO) (Fig. 1).

Figure 1. DERICK Kiosk.

Kiosk Software and Questionnaire Development

In developing the questionnaire for the kiosks, our goal was to create a 25-question survey that could be completed in under 5 minutes. To create this content, we interviewed care managers working with Medicaid beneficiaries to identify common risks and barriers that were particularly germane to a vulnerable, at-risk population. We also reviewed several commercially available health risk assessment (HRA) questionnaires and clinical practice guidelines (e.g., U.S. Preventive Services Task Force guidelines). Final selection of the questionnaire items was based on capacity for the risk/barrier to be modified, severity of the consequences if the risk/barrier were not addressed, and the estimated prevalence of the issues in our target population. All questions were translated into Spanish, and both English and Spanish versions were loaded into the questionnaire database. The flow of the user interaction with DERICK is summarized in Figure 2.

Figure 2. Summary of User Flow through a DERICK Kiosk Session.
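The branching, bilingual questionnaire described above lends itself to a simple data structure in which each item carries parallel English and Spanish text and a per-answer branching rule. The following Python sketch is illustrative only; the field names, language codes, and example item are assumptions rather than the actual DERICK questionnaire schema.

from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class Question:
    """One HRA item with parallel English/Spanish text and branching logic."""
    question_id: str
    text: Dict[str, str]                  # keyed by language code, e.g., "en", "es"
    answers: List[str]                    # allowed answer codes
    branch: Dict[str, str] = field(default_factory=dict)  # answer -> next question id
    default_next: Optional[str] = None    # used when the answer has no branch entry

def next_question_id(question: Question, answer: str) -> Optional[str]:
    """Return the id of the next question to display, given the user's answer."""
    return question.branch.get(answer, question.default_next)

# Hypothetical example item (not an actual DERICK survey question):
smoking = Question(
    question_id="smoke",
    text={"en": "Do you smoke cigarettes?", "es": "¿Fuma usted cigarrillos?"},
    answers=["yes", "no"],
    branch={"yes": "smoke_quantity"},     # follow-up question only for smokers
    default_next="alcohol",
)

assert next_question_id(smoking, "yes") == "smoke_quantity"
assert next_question_id(smoking, "no") == "alcohol"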

Kiosk Site Selection and Deployment

We deployed kiosks at four diverse community locations in Durham County, NC serving a high percentage of Medicaid patients: a primary care clinic (Duke Family Medicine Center, DFMC), a federally qualified health center (Lincoln Community Health Center, LCHC), a hospital emergency department (Durham Regional Hospital, DRH ED), and a local department of social services (Durham County Department of Social Services, DCDSS).

Kiosk Educational Resources

We developed three types of educational resources for patients that could be delivered through the kiosks. These resources were designed to respond specifically to the issues identified through the questionnaire. These contextually relevant educational resources included 33 pamphlets addressing medical and socioeconomic issues commonly faced by at-risk populations, 12 three-minute videos addressing common health conditions, and, for patients who provided Medicaid IDs, individually tailored printouts with practical recommendations to address their identified risks and barriers. All educational materials were available in both English and Spanish.
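To illustrate how an individually tailored printout might be assembled from a user's identified risks and barriers, the sketch below maps flagged issues to short, practical recommendations. The issue codes and recommendation text are hypothetical placeholders, not the actual content of the DERICK printouts.

# Hypothetical mapping from identified risk/barrier codes to practical advice.
RECOMMENDATIONS = {
    "no_transportation": "Ask your care manager about Medicaid transportation services.",
    "smoker": "Call the tobacco quit line for free help with quitting.",
    "overdue_flu_shot": "Ask about a flu shot at your next clinic visit.",
}

def build_tailored_printout(identified_issues):
    """Assemble the lines of a personalized printout for the issues a user reported."""
    lines = ["Your personalized health information:"]
    for issue in identified_issues:
        advice = RECOMMENDATIONS.get(issue)
        if advice:
            lines.append("- " + advice)
    return "\n".join(lines)

print(build_tailored_printout(["smoker", "no_transportation"]))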

Detection of Sentinel Events

In addition to collecting health risk and barrier information directly from patients and faxing a copy of this information to a patient’s assigned clinic, we used decision support to identify specific questionnaire responses that warranted attention.13 The items requiring notification were selected by a team of care managers and clinicians. We then programmed rules into an existing CDSS known as SEBASTIAN to detect when sentinel answers were entered into the record of a kiosk user who provided a Medicaid ID.14 No notices were sent for guest users, since no identifying information was collected from these individuals. To study the impact of the proactive notices about sentinel health risks and barriers to care, we incorporated their distribution into another project that was evaluating a variety of notices delivered either as weekly e-mail alerts to care managers or as quarterly feedback summary reports sent to clinic managers and medical directors. For this larger project, the study population was approximately 20,000 Medicaid beneficiaries who resided in Durham County, NC and were actively enrolled in Medicaid as of March 2006. These patients were randomly assigned by family unit to one of six groups in equal proportions. Group 1 was designated to have e-mail alerts sent to their care managers, and group 2 was designated to have feedback summary reports sent to their clinic sites. Group 3 had letters sent directly to the patients; these letters, however, did not include notices about self-reported data and are therefore not relevant to this project. Groups 4 through 6 were control groups in which no notifications were sent.

Three responses to kiosk questions were considered so serious that e-mail notices about these patients were sent to the program director of the care management team regardless of how the patient was randomized, and the patient was dropped from the evaluation. These responses were: 1) a woman indicating that she was pregnant but not receiving prenatal care; 2) a teenager responding affirmatively to depression screening questions; and 3) any kiosk user indicating that there was abuse in his/her home.
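A minimal sketch of this surveillance and routing logic follows. It checks a set of sentinel conditions against a user's responses, escalates the three critical responses to the program director regardless of randomization, and otherwise routes notices by study group. The rule definitions, data fields, and notification destinations are illustrative assumptions; in the study itself, the rules were executed by the SEBASTIAN CDSS against data in the health information exchange.

# Illustrative sentinel rules: each maps a rule name to a test over a response record.
SENTINEL_RULES = {
    "pregnant_no_prenatal_care": lambda r: r.get("pregnant") == "yes" and r.get("prenatal_care") == "no",
    "teen_depression": lambda r: r.get("age", 99) < 20 and r.get("depression_screen") == "positive",
    "abuse_in_home": lambda r: r.get("abuse_in_home") == "yes",
    "no_usual_source_of_care": lambda r: r.get("usual_source_of_care") == "no",
}

# The first three rules above correspond to the critical responses described in the text.
CRITICAL = {"pregnant_no_prenatal_care", "teen_depression", "abuse_in_home"}

def route_notices(responses, randomization_group):
    """Return (destination, rule_name) pairs for each sentinel answer detected."""
    notices = []
    for rule_name, triggered in SENTINEL_RULES.items():
        if not triggered(responses):
            continue
        if rule_name in CRITICAL:
            # Critical responses go to the care management program director
            # regardless of the patient's randomization group.
            notices.append(("program_director_email", rule_name))
        elif randomization_group == 1:
            notices.append(("care_manager_email", rule_name))
        elif randomization_group == 2:
            notices.append(("clinic_feedback_report", rule_name))
        # Group 3 letters did not cover self-reported data; groups 4-6 were controls.
    return notices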

Usability and Satisfaction Surveys

To assess the usability and acceptability of the kiosks, we administered usability and satisfaction surveys to sequential kiosk users. The surveys were derived from validated instruments for assessing usability and satisfaction with computer-based systems.15,16

This study was approved by the Duke University School of Medicine Institutional Review Board.

Results

Kiosk Availability and Use

All kiosk activity was automatically recorded in transaction files that tracked several usage metrics, including kiosk session time, health risk questions answered, pamphlets printed, videos viewed, and user demographics. During the two-year period between the initial kiosk deployment in March 2006 and February 2008, we provided 335 weeks of kiosk availability spread across the four sites (Table 1). During this period, 18,025 sessions were initiated on the kiosks, and 5,704 users progressed through the module assessing computer and language skills, which determined how subsequent questions were presented17 (Figure 4). Of the 4,808 users for whom skill assessment data are available (data on the initial 896 users were not recorded), 16% were evaluated as low literacy (less than fifth grade reading level) and the remainder as high literacy (Table 1). Users entered valid Medicaid IDs for only 203 sessions (4.5%); the remaining users were anonymous, or “guest,” users. Sixty-eight identified Medicaid beneficiaries (3.5% of completed sessions) completed the HRA survey, as did 1,853 guest users (Figure 4). Users who completed the survey printed 5,652 educational pamphlets and initiated 2,262 educational video sessions (Table 1).
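As an illustration of how usage metrics such as those in Table 1 and Figure 4 can be tallied from the kiosk transaction files, the sketch below computes a simple session funnel from per-session records. The record layout is a hypothetical stand-in for the actual DERICK transaction file format.

from collections import Counter

# Hypothetical per-session records; the actual transaction file layout differs.
sessions = [
    {"site": "LCHC", "skill_assessed": True,  "medicaid_id": None,        "completed_hra": False},
    {"site": "DFMC", "skill_assessed": True,  "medicaid_id": "123456789", "completed_hra": True},
    {"site": "LCHC", "skill_assessed": False, "medicaid_id": None,        "completed_hra": False},
]

def session_funnel(sessions):
    """Count how many sessions reached each stage of the kiosk interaction."""
    counts = Counter()
    for s in sessions:
        counts["initiated"] += 1
        if s["skill_assessed"]:
            counts["completed_skill_assessment"] += 1
        if s["medicaid_id"]:
            counts["entered_medicaid_id"] += 1
        if s["completed_hra"]:
            counts["completed_hra"] += 1
    return counts

print(session_funnel(sessions))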

Table 1.

Kiosk Deployment and Use between March 2006 and February 2008

Metric | LCHC | DFMC | DCDSS | DRH ED | Total
Deployment Date | 03/07/06 | 05/24/06 | 07/24/06 | 10/11/06 | —
Down Time (weeks) | 7 | 3 | 3 | 3 | 16
Total Weeks of Operation | 97 | 88 | 81 | 69 | 335
Initiated Sessions | 8,492 | 2,777 | 3,844 | 2,912 | 18,025
Completed Sessions | 595 | 380 | 496 | 450 | 1,921
Low Literacy – All Sessions | 204 | 145 | 190 | 226 | 765
High Literacy – All Sessions | 1,002 | 865 | 1,051 | 1,125 | 4,043
Low Literacy – Completed Sessions | 78 | 52 | 61 | 67 | 258
High Literacy – Completed Sessions | 239 | 217 | 322 | 320 | 1,098
Completed Sessions per Week | 6.2 | 4.3 | 6.2 | 6.5 | 5.7
Initiated Sessions with Medicaid ID | 67 | 65 | 46 | 25 | 203
Completed Sessions with Medicaid ID | 16 | 22 | 18 | 12 | 68
Completed Sessions with Medicaid ID per Week | 0.17 | 0.25 | 0.22 | 0.17 | 0.20
Number of Educational Pamphlets Printed | 1,655 | 1,178 | 1,417 | 1,402 | 5,652
Number of Educational Videos Initiated | 724 | 440 | 531 | 567 | 2,262
Number of Educational Videos Watched for > 1 Minute | 150 | 88 | 124 | 108 | 470
Number of Tailored Printouts Generated* | 16 | 22 | 18 | 12 | 68

LCHC = Lincoln Community Health Center; DFMC = Duke Family Medicine Center; DCDSS = Durham County Department of Social Services; DRH ED = Durham Regional Hospital Emergency Department

Explanations for downtime: redoing network, outages due to printer jams, CPU replacement, network cable plugged into wrong port

* Tailored printouts only available for patients who provided Medicaid IDs

Figure 4. User Distribution through Kiosk Sessions during 335 Weeks of Kiosk Availability. Numbers are counts of users who reached each stage in the kiosk session.

With regard to language selection, of the 18,025 initiated sessions, 7,640 (42%) were initiated in Spanish and 10,385 (58%) in English. Of the 1,921 completed sessions, 327 (17%) were in Spanish and 1,594 (83%) were in English.

Content and Duration of Kiosk Sessions

The median time to complete the introduction and skill assessment was 1 minute and 10 seconds, and the median time to complete the questionnaire was 3 minutes and 0 seconds. The median number of questions answered was 33, including the 5 mandatory skill assessment questions. The number of questions answered ranged from 20 to 39 based on question branching patterns.

Printing of Educational Materials

During the study period, 1,921 risk assessment questionnaires were completed, allowing these users access to the educational pamphlets. A total of 5,652 pamphlets were printed, drawn from 30 of the 33 available pamphlets. On average, patients who had access to the educational pamphlets (i.e., completed the HRA questionnaire) printed 3.4 handouts each. The most frequently printed pamphlets were “How to Eat Healthy Meals for Less,” “10 Ways to Help Your Child Stay Healthy,” and “Understanding and Dealing with Depression.”

For all the kiosk users who entered a Medicaid ID and completed the questionnaire, a total of 68 personalized, tailored printouts were generated. These printouts contained information specifically designed to enable Medicaid beneficiaries to overcome patient-entered health risks and barriers to healthcare access.

Use of Educational Videos

During the period from March 2006 to February 2008, 470 videos were viewed by kiosk users. A video was considered “viewed” by a kiosk user if the video was played for more than one minute. All of the 12 available videos were viewed at least once. The most viewed videos were “Vitamin News Flash: Folic Acid,” “Diabetes: Take Care of Yourself,” and “Get the Facts about the Pap.”

Usability and User Satisfaction

In order to assess users’ impressions of the kiosk system, we conducted user satisfaction and system usability surveys among 216 kiosk users. Patients were recruited randomly from the waiting rooms of DFMC, LCHC, and DCDSS. In general, users viewed the system favorably and found it easy and satisfying to use. Users also reported that they could learn to use the system quickly and navigate it efficiently, and that they were comfortable providing personal information to a computer. Users further indicated that the kiosk session length was appropriate and that the questions were understandable.

Detection of Sentinel Events

As described above, sentinel answers reported by patients were distributed by e-mail or feedback reports to care providers for a randomly selected subset of Medicaid beneficiaries. In total, 203 patients entered a valid Medicaid ID when they responded to the questionnaire. The responses of these users generated a total of 479 sentinel answers, from which 59 e-mail notices and 54 feedback report notices were sent. No notices were sent for the remaining 366 sentinel answers because these users were assigned to a control group or enrolled in Medicaid after the patient randomization in March 2006.

Discussion

In this project, we have developed and deployed four free-standing data collection kiosks at diverse publicly accessible community sites. We have demonstrated that these kiosks can be used to collect information about health risks and barriers to care access directly from patients. Furthermore, we have coupled this information with a CDSS to send out proactive notices about health risks. The kiosks have also provided a mechanism through which contextually relevant health education materials can be conveyed to patients. Additionally, our usability and satisfaction surveys indicated that many patients had a positive impression of the kiosk data entry session.

We have learned several valuable lessons through the development, deployment, and operational support of community-based patient data collection kiosks. With regard to system deployment and support, we discovered that community sites did not want to host kiosks that served only a specific subset of their client population (i.e., Medicaid beneficiaries), but instead requested that the kiosks be available to all users. In response, we created a “guest” kiosk user status for individuals who did not have, or chose not to enter, a Medicaid ID number. These individuals received the full kiosk functionality except that they received no tailored printout. Another lesson was the difficulty of keeping kiosks functional in public settings: we were surprised by the amount of damage inflicted on the hardened kiosks by the public. At one site, the tamper-resistant telephone receiver was ripped off the kiosk. At most sites, we routinely found items that had been shoved into the printer output slot, obstructing it.

With regard to system implementation, we have been very pleased by the rate at which the kiosk sessions have been initiated in the various community settings, but disappointed by the dropout rate of users during a session, leading to a significant reduction in the number of completed sessions. The rate at which users have supplied their Medicaid IDs at the start of a session has been lower than initially anticipated and has resulted in relatively few completed sessions by individuals supplying valid Medicaid IDs. The approximately three-fold drop-off between initiated sessions and sessions for which the skill assessment was completed likely represents curious casual users, especially children, interacting with the kiosk. This pattern appeared to be particularly prevalent at LCHC where the kiosk was deployed in a pediatric clinic waiting area. The drop-off of an additional ~1,200 users at the Medicaid ID entry screen may reflect the loss of users who are not seriously interested in completing the HRA or users who were deterred by the thought of entering identifiable personal health information into the kiosk. We observed a two-fold decrease between the number of users who responded to the first health risk assessment survey question and the last health risk assessment survey question. Some of these patients may have been called for their appointments before completing their kiosk session. Others may have encountered questions that they did not wish to answer. We plan to review the questionnaire data entry tracking log to see if there are characteristic dropout patterns among users who started the survey but did not finish it. Such a finding would suggest that some questions may be considered too revealing or sensitive. Of note, the observed drop-offs may be an expected outcome for kiosks designed for patient data entry. This possibility warrants further investigation, as prior studies have typically omitted data on the number of initiated (as opposed to completed) kiosk sessions.
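The planned review of the questionnaire data entry tracking log could take a form such as the following sketch, which counts, for each question, how many incomplete sessions ended there; a pronounced spike at a particular item would suggest that the question is perceived as too sensitive. The log format is an assumption made for illustration.

from collections import Counter

def dropout_points(tracking_log):
    """
    tracking_log: (session_id, question_id, timestamp) tuples for sessions that
    started but did not finish the questionnaire (assumed log format).
    Returns a count of how many incomplete sessions stopped at each question.
    """
    last_question = {}
    for session_id, question_id, timestamp in sorted(tracking_log, key=lambda row: row[2]):
        last_question[session_id] = question_id   # later entries overwrite earlier ones
    return Counter(last_question.values())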

The differential completion rates by language (42% of sessions were initiated in Spanish, but only 17% of completed surveys were in Spanish) may indicate that the Spanish speaking users dropped out at significantly higher rates than English speaking users. An alternative explanation is that curious casual users may have been more likely to “test out” the Spanish version and then stop after a few screens. We will need to assess these findings further to determine if they represent true attrition of legitimate Spanish-speaking kiosk users. If validated, such a finding would imply that the kiosk data collection technique may be less suitable for Spanish-speaking patients.
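One simple way to quantify this language difference, treating initiated sessions as the denominator (an assumption, since many initiations came from casual users), is a two-proportion comparison of the reported counts:

from math import sqrt

# Completed vs. initiated sessions by language, from the reported counts.
spanish_initiated, spanish_completed = 7640, 327
english_initiated, english_completed = 10385, 1594

spanish_rate = spanish_completed / spanish_initiated     # about 4.3%
english_rate = english_completed / english_initiated     # about 15.3%

# Two-proportion z-test (normal approximation) for the difference in completion rates.
pooled = (spanish_completed + english_completed) / (spanish_initiated + english_initiated)
se = sqrt(pooled * (1 - pooled) * (1 / spanish_initiated + 1 / english_initiated))
z = (english_rate - spanish_rate) / se

print(f"Spanish completion rate: {spanish_rate:.1%}")
print(f"English completion rate: {english_rate:.1%}")
print(f"z statistic: {z:.1f}")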

The small number of sessions with Medicaid IDs was disappointing and has impaired our capacity to notify providers proactively about sentinel survey responses that could benefit from prompt intervention. We sought to increase the entry of Medicaid IDs by adding a screen to the introduction that illustrated exactly where the Medicaid ID is located on the Medicaid enrollment card. We also gave reception personnel at the kiosk sites “invitations” to hand to Medicaid beneficiaries; each invitation encouraged kiosk use and included a copy of the beneficiary’s Medicaid ID. We considered using financial incentives as well, but decided against this option so as to simulate a sustainable operational clinical environment. As ongoing work, we are conducting focus groups to determine how to increase the number of users who enter valid Medicaid IDs.

In the context of proactive patient management using decision support, we have successfully demonstrated that CDSSs can be used for surveillance of a database containing patient-entered information in order to identify sentinel events that could benefit from proactive intervention. Unfortunately, the infrequent entry of Medicaid IDs by kiosk users resulted in a relatively small number of HRAs from which notices could be generated. The small number of notices has limited this aspect of our evaluation from being conclusive regarding the benefits of this technology.

In addition to developing the capacity for detecting sentinel events from patient-entered information as described above, we have also shown that we can incorporate these data into existing patient records in a community-oriented health information exchange. We have also demonstrated the capability of distributing these data to patients’ assigned clinics and of displaying these data through a Web-based health information system. However, the low number of HRAs that were associated with Medicaid IDs limited the extent to which we could demonstrate the impact of the information distribution capabilities that we developed.

The findings of this study are limited in that the kiosk deployment focused on Medicaid beneficiaries, which may restrict the applicability of our findings to other populations.

Conclusion

Data entered directly by patients into a public-access kiosk can be coupled with clinical decision support to enable proactive care management of health risks and barriers to care access. This coupling requires patients to enter a unique identifier, which ultimately may limit the extent to which patient-entered data can be used in decision support. Further research is needed to determine how to optimize the collection of patient identifiers and to assess the impact of this information technology-based approach on care quality and costs.

Acknowledgments

This study was funded in part by grants H2ATH00998 and H2ATH07753 from the Office for the Advancement of Telehealth of the Health Resources and Services Administration.

References

1. Slack WV, Hicks GP, Reed CE, Van Cura LJ. A computer-based medical-history system. N Engl J Med. 1966;274:194–198. doi:10.1056/NEJM196601272740406.
2. Collen MF. Patient data acquisition. Medical Instrumentation. 1978;12:222–225.
3. Mayne JG, Martin MJ. Computer-aided history acquisition. Med Clin North Am. 1970;54:825–33.
4. Bachman JW. The patient-computer interview: a neglected tool that can aid the clinician. Mayo Clin Proc. 2003;78:67–78. doi:10.4065/78.1.67.
5. Nicholas D, Huntington P, Williams P. Three years of digital consumer health information: a longitudinal study of the touch screen health kiosk. Information Processing & Management. 2003;39:479–502.
6. Gielen AC, McKenzie LB, McDonald EM, Shields WC, Wang MC, et al. Using a computer kiosk to promote child safety: results of a randomized, controlled trial in an urban pediatrics emergency department. Pediatrics. 2007;120:330–339. doi:10.1542/peds.2006-2703.
7. Thompson DA, Lozano P, Christakis DA. Parent use of touchscreen computer kiosks for child health promotion in community settings. Pediatrics. 2007. doi:10.1542/peds.2006-2669.
8. Kreuter MW, Black WJ, Friend L, Booker AC, Klump P, et al. Use of computer kiosks for breast cancer education in five community settings. Health Education & Behavior. 2006;33:625–642. doi:10.1177/1090198106290795.
9. Radvan D, Wiggers J, Hazel T. HEALTH C.H.I.P.s: opportunistic community use of computerized health information programs. Health Education Research. 2004;19:581–590. doi:10.1093/her/cyg080.
10. Jackson M, Peterst J. Introducing touchscreens to black and ethnic minority groups – a report of processes and issues in the three cities project. Health Information and Libraries Journal. 2003;20:143–149. doi:10.1046/j.1365-2532.2003.00425.x.
11. Connell CM, Shaw BA, Holmes SB, Hudson ML, Derry HA, Strecher VJ. The development of an Alzheimer’s disease channel for the Michigan Interactive Health Kiosk Project. Journal of Health Communication. 2003:11–22. doi:10.1080/10810730305732.
12. Porter SC, Cai Z, Gribbons W, Goldman DA, Kohane IS. The asthma kiosk: a patient-centered technology for collaborative decision support in the emergency department. J Am Med Inform Assoc. 2004;11:458–467. doi:10.1197/jamia.M1569.
13. Lobach DF, Kawamoto K, Anstrom KJ, et al. Proactive population health management in the context of a regional health information exchange using standards-based decision support. AMIA Annu Symp Proc. 2007:473–477.
14. Kawamoto K, Lobach DF. Design, implementation, use, and preliminary evaluation of SEBASTIAN, a standards-based Web service for clinical decision support. AMIA Annu Symp Proc. 2005:380–384.
15. Chin JP, Diehl VA, Norman KL. Development of an instrument measuring user satisfaction of the human-computer interface. CHI ’88 Conference Proceedings: Human Factors in Computing Systems. New York: ACM Press; 1988. pp. 213–218.
16. Webster J, Trevino LK, Ryan L. The dimensionality and correlates of flow in human-computer interactions. Computers in Human Behavior. 1993;9:411–426.
17. Lobach DF, Arbanas JM, Mishra DD, Campbell M, Wildemuth BM. Adapting the human-computer interface for reading literacy and computer skill to facilitate collection of information directly from patients. Medinfo. 2004:1142–46.
