Abstract
This article describes how a family caregiver lifestyle physical activity clinical trial uses research technology to enhance quality control and treatment fidelity. This trial uses a range of Internet, Blaise® Windows-based software, and Echo Server technologies to support quality control issues such as data collection, data entry and study management advocated by the clinical trials' literature, and to assure treatment fidelity concerning intervention implementation (i.e., design, training, delivery, receipt and enactment) as proposed by the National Institutes of Health Behavior Change Consortium. All research staff are trained to use these technologies. Strengths of this technological approach to support quality control and treatment fidelity include the comprehensive plan, involvement of all staff, and ability to maintain accurate and timely data. Limitations include the up-front time and costs for developing and testing these technological methods, and having support staff readily available to address technological issues if they occur.
Keywords: family caregivers, physical activity intervention, quality control, treatment fidelity
Approximately five million Americans are living with Alzheimer's disease (AD), and it is currently the seventh leading cause of death in the United States (Alzheimer's Association, 2010). Due to the progressive deterioration seen in persons with AD, many family caregivers experience negative mental and physical health consequences (Vitaliano, Zhang, & Scanlan, 2003). Current family caregiver clinical trials have addressed care-related stressors, focused on caregiver mental health outcomes, and used psycho-educational or skill-building models. Fewer interventions have focused on caregiver health promotion and physical health, even though caregivers must maintain optimal mental and physical health to sustain their caregiving responsibilities. Increasing physical activity is one of the most effective health behavior interventions with the potential to benefit caregiver mental and physical health (Christmas & Andersen, 2000; Mazzeo et al., 1998). Of importance in implementing these interventions is quality control regarding clinical trial guidelines and maintenance of treatment fidelity (Friedman, Furberg, & DeMets, 1998; Resnick et al., 2005). Although a variety of intervention technologies, including audiotapes, computers, the Internet, videotapes, and telephone-based systems, have been used to deliver caregiver interventions, less attention has been given to exploring how technology can support quality control and treatment fidelity in family caregiver trials (R. Schulz, Lustig, Handler, & Martire, 2002). This study draws from two bodies of literature designed to assure quality control concerning clinical trials and treatment fidelity concerning intervention implementation.
The clinical trials' literature addresses quality control including study design and development, study management and analysis of data. This literature has evolved out of large academic and industrial clinical trials that have focused on developing and testing pharmaceutical interventions designed to prevent and/or treat diseases such as those affecting the heart and lungs, cancer and AIDS. The clinical trial is seen as the definitive tool for evaluating the applicability of clinical research and represents a key research activity that has potential to improve quality of health care and control costs by carefully comparing alternative treatments (Friedman et al., 1998; Hulley, Cummings, Browner, Grady, & Newman, 2007; National Institutes of Health, 1979).
As intervention studies developed over time, behavioral scientists expressed concerns about the differences between treatment fidelity requirements in pharmaceutical clinical trials and health behavioral interventions conducted in field settings. In response to these concerns, an integrated model of treatment fidelity for health behavior interventions was developed by the Treatment Fidelity Workgroup of the National Institutes of Health Behavior Change Consortium (BCC) (Bellg et al., 2004). Key components of this BCC model and a tool for assessing treatment fidelity in these studies suggest that the following should be addressed: treatment design and training of providers; and treatment delivery, receipt, and enactment (Borrelli et al., 2005). Unfortunately, many intervention studies assume that treatment fidelity has been achieved because of the study design, but few studies use objective assessments to quantify the extent to which the components of treatment fidelity have actually been achieved (Resnick et al., 2005).
In the past 10–15 years, major changes have taken place in the availability of computer, web-based and other technologies that have altered the face of health care delivery; professional and consumer education; and how health behavior intervention research is conducted. These technologies provide opportunities for improving data collection and management; developing, delivering and disseminating health promotion interventions; integrating treatment fidelity into health behavior clinical trials; and assessing mental and physical health outcomes (Clancy, Glanz, Rimer, & Viswanath, 2008; Nguyen, 2010; Wantland, Portillo, Holzermer, Slaughter, & McGhee, 2004).
Purpose/Aims
The purpose of this paper is to describe how a family caregiver lifestyle physical activity clinical trial uses technology to support quality control, as proposed in the clinical trials' literature, and to assess treatment fidelity, as proposed by the NIH Behavior Change Consortium (Bellg et al., 2004; Resnick et al., 2005) (see Table 1). Specific aims are to: (a) describe the Telephone Resources and Assistance for Caregivers (TRAC) clinical trial and how technology supports quality control of study design and development, study management, and data analysis; and (b) articulate how technology is used to assure treatment fidelity concerning TRAC design and intervention implementation (i.e., training, delivery, receipt, and enactment).
Table 1.
Description of Clinical Trial Quality Control Strategies and TRAC Technology Used to Support these Strategies (Friedman et al., 1998)
Quality Control Strategies | Definition and Description | TRAC Study Technology that Supports Quality Control Strategies |
---|---|---|
Study Design and Development | ||
Design | Random assignment of participants to a control or intervention group to compare effectiveness of the treatment intervention. Clinical trials have clearly defined primary and secondary questions stated in advance. Sample size, determined prior to beginning the study, provides for adequate levels of significance and power. | Study design and intervention development occurs prior to study implementation. Study procedures are summarized in a Study Procedure Manual, available to team members on a shared computer drive. |
Outcome Variables | A single outcome answers the primary study question. Secondary outcomes are related to the primary outcome or to subgroup hypotheses. | Primary Outcome: Physical activity Measures: CHAMPS and Actical® direct physical activity monitoring device; Actical® Device Data Monitoring Form Secondary Outcome: CG mental and physical health, physical function and blood pressure, using OMRON IntelliSense™ digital blood pressure monitor |
Study Management | ||
Recruitment | Successful recruitment depends on careful planning, use of multiple strategies, flexibility, careful community entry and attention to multicultural issues, and time and resources devoted to acquiring the desired sample size. | http://www.clinicaltrials.gov/; Electronic Referral Form; Electronic Contact History Form |
Eligibility and Screening | Eligibility criteria focus on including participants who are most likely to benefit from the study and least likely to be harmed by the intervention. The study population is defined in advance and clear inclusion criteria are specified. These criteria impact generalizations made from study findings. | Electronic Community Prescreening Form; Electronic IQCODE Form; Electronic Eligibility and Screening Form |
Data Collection | Baseline data are assessed on all study participants prior to intervention initiation. Efforts are directed to ensuring the quality of data critical to interpreting the trial throughout the study. | Study procedures are summarized in a Study Procedure Manual, available to team members on a shared computer drive. Computer-Assisted Personal Interviewing (CAPI) |
Randomization | Randomization assures that each participant has the same chance of being assigned to the intervention group; tends to result in study groups that are similar; removes investigator bias in assigning participants to groups; and assures that statistical tests will be interpretable. | Electronic Randomization Form |
Data Management, Tracking and Report Generation | Assuring data quality is addressed during study planning and monitored throughout the study. Quality of data is maximized when: an operations protocol manual is prepared before the study begins; forms are well-designed and pretested; staff are trained to standardize all procedures; certification procedures are developed for collection of selected data; blinded assessment is used to reduce variability of the data; and data-entry programs identify missing, inconsistent values, and prohibit entry of out-of-range values. Quality monitoring assures that forms are completed and there is adherence to study procedures. | Study procedures are summarized in a Study Procedure Manual, available to team members on a shared computer drive. Electronic All Eligibility Form; Electronic Component Status Form |
Study Data Analysis | ||
Data Analysis | Data analytic approaches require attention to the following issues (a) assessment of adverse events and monitoring response variables, generally carried out by a Data Safety Monitoring Board/Committee; (b) determination of which participants should be analyzed, when/how covariate adjustments should be made, and how subgroup analyses should be conducted; and (c) presentation and interpretation of results with sufficient information so that readers can adequately evaluate the trial. | Data Analytic Requests, Output and Comments are posted on the web-based data management program. |
Note: All Forms are available on the TRAC Study web site. CAPI = Computer-Assisted Personal Interviewing; CG = Caregiver; CHAMPS = Community Healthy Activities Model Program for Seniors; IQCODE = Informant Questionnaire on Cognitive Decline in the Elderly; RADC = Rush Alzheimer's Disease Center; TRAC = Telephone Resources and Assistance for Caregivers.
Clinical Trials: Maintaining Quality Control
Three major categories of quality control outlined in the clinical trials' literature are vital to assuring that the TRAC study accomplishes its aims. These categories include: 1) study design and development, 2) study management, and 3) data analysis. The following paragraphs briefly describe each clinical trial quality control strategy, how these strategies apply to the TRAC study, and the technology that supports these methods of quality control. The TRAC study uses a range of web- and Windows-based programs along with other technologies to complete day-to-day study management procedures and assure treatment adherence. Technological programs are implemented in collaboration with the Rush Alzheimer's Disease Center (RADC) and the Rush Institute for Healthy Aging (RIHA), which originally developed these procedures for the implementation of large epidemiological studies. The TRAC study contracts with the RADC, a clinical diagnostic and NIH-funded Alzheimer's Disease Center (P30AG010161), for recruitment services and support, and with the RIHA for study management and data analytic services, using a centralized home page.
Study Design and Development
Design
A clinical trial compares a prospective treatment intervention to a control group and randomly assigns participants to either the intervention or control. Advantages of randomization include that it minimizes the potential for bias, results in comparable groups, and supports validity of statistical analyses and study significance. Clinical trials include primary and secondary questions that are stated in advance and are generally based on prior data. Sample size, calculated using either the investigators' or other existing data, is determined in advance of the study and ensures that adequate significance levels and power are attained (See Table 1; Friedman et al., 1998).
The TRAC study is an 18-month randomized clinical trial of family caregivers of persons with AD. The primary research question is whether the Enhancing Physical Activity Intervention (EPAI) treatment condition, which combines a lifestyle physical activity program and a caregiver skill-building intervention, is more effective in increasing physical activity than the Caregiver Skill-Building Intervention (CSBI) control. The CSBI focuses on caregiver skill only. Secondary research questions focus on determining the effects of the EPAI on caregiver mental and physical health and physical function, and on the process of engaging caregivers in a lifestyle physical activity intervention (i.e., intervention adherence and approaches). The study sample size (targeted at 190 caregivers) was based on prior pilot data (Farran et al., 2008). The TRAC study builds on the Stress Process Model (Lazarus & Folkman, 1984; Pearlin, Mullan, Semple, & Skaff, 1990), and both interventions are guided by Social Cognitive Theory (Bandura, 1986, 1989, 1997).
Participants receive a 12-month intervention consisting of one home visit at baseline and 19 regular phone calls, for a total of 20 contacts from a trained telephone counselor (TC). Phone calls are scheduled at progressively spaced intervals, beginning weekly and moving to bi-weekly and then monthly contacts. TCs selected to administer TRAC study interventions have master's and/or PhD preparation in areas such as aging, family caregiving, physical activity, and/or geropsychiatric nursing. Each TC is assigned to only one intervention and uses a separate protocol manual specific to that intervention. Caregivers participate in data collection interviews at baseline, 3, 6, 9, 12, and 18 months. All study contacts are completed in the caregiver's home or by telephone. Research assistants who collect data are blind to intervention assignment. The Study Protocol Manual, which documents study design and procedures, is available to research team members on a shared computer drive (Table 1).
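The contact schedule itself is simple to automate. The Python sketch below generates illustrative contact dates from a baseline home-visit date; the split of the 19 calls into weekly, bi-weekly, and monthly blocks (8/6/5) is an assumption for illustration only, since the exact breakdown is not specified here.

```python
from datetime import date, timedelta

def contact_schedule(baseline: date) -> list[date]:
    """Return illustrative dates for the 20 TRAC contacts: the baseline
    home visit plus 19 phone calls at progressively spaced intervals
    (weekly, then bi-weekly, then monthly). The 8/6/5 split below is
    hypothetical, not the study's actual breakdown."""
    contacts = [baseline]                            # contact 1: home visit
    offsets = ([7] * 8) + ([14] * 6) + ([28] * 5)    # 19 phone calls
    current = baseline
    for days in offsets:
        current += timedelta(days=days)
        contacts.append(current)
    return contacts

# Example: schedule for a caregiver whose home visit was January 10, 2011.
for i, d in enumerate(contact_schedule(date(2011, 1, 10)), start=1):
    print(f"Contact {i:2d}: {d.isoformat()}")
```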
Outcome variables
Per clinical trial guidelines, the primary outcome or response variable is the variable investigators are most interested in changing (i.e., physical activity). Primary outcomes must be assessable, determined as completely as possible, and able to provide an unbiased assessment. Secondary outcomes focus on questions that may be related to the primary outcome (i.e., mental and physical health) or to subgroup hypotheses (i.e., who is most likely to increase physical activity?). Primary and secondary outcome variables are identified in advance of the study, used to plan the study design and calculate sample size, and must be assessable in all study participants (Table 1) (Friedman et al., 1998).
The primary TRAC study outcome is caregiver lifestyle physical activity, measured using two methods to minimize potential bias: (a) self-reported caregiver physical activity, measured by the Community Healthy Activities Model Program for Seniors (CHAMPS), an assessment completed immediately post-intervention and at long-term follow-up (Stewart et al., 2001); and (b) objective assessment of physical activity using an Actical® accelerometer, which provides levels of physical activity as well as a quantifiable assessment of energy expenditure (kilocalories) for participants in each intervention group. The Actical® device is initialized via a laptop computer and placed on the caregiver's waist with a neoprene belt. Caregivers are instructed concerning device placement and to wear the device for five consecutive days (i.e., at baseline, 6, 12, and 18 months). All tracking of the Actical® device, including placement, anticipated retrieval date, and dates of expected device maintenance, is automated and monitored via a web-based Actical® Device Data Monitoring Form (Table 1).
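As a rough illustration of the bookkeeping the web-based Actical® Device Data Monitoring Form automates, the sketch below computes an anticipated retrieval date from a placement date and the five-day wear period; the record fields and names are assumptions, not the form's actual layout.

```python
from dataclasses import dataclass
from datetime import date, timedelta

WEAR_DAYS = 5  # caregivers wear the device for five consecutive days

@dataclass
class ActicalPlacement:
    """Hypothetical record mirroring the kind of fields the
    Actical Device Data Monitoring Form tracks."""
    participant_id: str
    device_serial: str
    placement_date: date

    @property
    def anticipated_retrieval(self) -> date:
        # Retrieval is expected once the five-day wear period ends.
        return self.placement_date + timedelta(days=WEAR_DAYS)

placement = ActicalPlacement("TRAC-0042", "ACT-1187", date(2011, 3, 1))
print(placement.anticipated_retrieval)   # 2011-03-06
```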
Secondary study outcomes are assessed with a broad range of commonly used caregiver mental and physical health and physical function measures (Lawton, Kleban, Moss, Rovine, & Glicksman, 1989; Rikli & Jones, 2001; Stewart, Hays, & Ware, 1988; Watson & Clark, 1988). Blood pressure is assessed and provides valuable objective information concerning caregivers' potential health risk and practical health information (Rogers, 2005). To minimize sources of error, resting blood pressure is assessed in triplicate over a 5-minute period and automatically averaged, using an OMRON IntelliSense™ Digital Blood Pressure Monitor (Omron Healthcare, 2001).
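The triplicate blood pressure readings are reduced to a single value by simple averaging, which the monitor performs automatically; a minimal sketch of that calculation is shown below, with the example readings chosen purely for illustration.

```python
def mean_blood_pressure(readings):
    """Average three resting (systolic, diastolic) readings taken over
    ~5 minutes, e.g. [(128, 82), (124, 80), (126, 81)] -> (126.0, 81.0)."""
    systolics = [s for s, _ in readings]
    diastolics = [d for _, d in readings]
    return (sum(systolics) / len(readings), sum(diastolics) / len(readings))

print(mean_blood_pressure([(128, 82), (124, 80), (126, 81)]))  # (126.0, 81.0)
```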
Study Management
Recruitment
Using clinical trial guidelines, recruitment focuses on acquiring sufficient subjects within a reasonable length of time and is often one of the greatest challenges faced by a clinical trial. Planning for recruitment occurs before the study begins and is vital to determining realistic estimates of the number of available participants who meet study criteria. Clinical trials generally use multiple recruitment strategies, including disease-specific clinical centers (i.e., Alzheimer's disease centers), physicians' offices, institutional and community-based health care facilities and groups, and formal associations that address specific clinical concerns (i.e., the Alzheimer's Association). Studies that target a specified percentage of multicultural participants also need to pay particular attention to strategies for recruiting these populations, such as establishing trust, entry into selected communities, and maintenance of ongoing relationships (Levkoff & Sanchez, 2003). Excellent communication with recruitment sites is vital to establishing important working relationships and acquiring the desired number of study participants. Effective recruitment generally combines multiple strategies and requires respect for and accommodation of participants, acknowledgement that the success of particular approaches is unpredictable, adequate time and resources for recruitment activities, and approaches that are not overly aggressive (Table 1) (Friedman et al., 1998).
The Rush Alzheimer's Disease Center (RADC) is the primary TRAC recruitment site, from which approximately 40–50% of participants are recruited. Other community providers, such as senior centers, adult day care, continuing care communities, and specific dementia-related services, refer the remaining caregiver participants to the study. Study information was also posted on the ClinicalTrials.gov website (http://www.clinicaltrials.gov/).
Technology is used for tracking referrals from all potential referral sites (i.e., Referral Form) and maintaining an ongoing record of contact history for potential study participants (i.e., Contact History Form) (Table 1). Advantages of using these web-based technological forms include the ability to consistently and reliably follow referrals made to the study, including contact information, dates, and total numbers of contacts.
Eligibility and screening
Clinical trial eligibility criteria are clearly stated in advance, relate to participant safety, and determine which participants are most likely to benefit and adhere to the intervention. Participants who have a greater chance of experiencing adverse events are not admitted to a trial (Table 1) (Friedman et al., 1998).
TRAC eligibility criteria focus on selecting family caregivers who indicate some strain in providing for the care recipient's activities of daily living needs (Schulz & Beach, 1999) and who are sedentary (i.e., less than 60 minutes of physical activity per week). To minimize adverse events, physician approval is obtained for persons over 70 years of age, those with uncontrolled hypertension, and those with other medical issues that could interfere with physical activity (i.e., diabetes, active treatment for cancer, use of assistive devices).
Two web-based forms provide support for determining eligibility and screening (Table 1). The Community Prescreening Form is used when referrals are received from community settings and not RADC. Staff use a web-based version of the IQCODE (Informant Questionnaire on Cognitive Decline in the Elderly; Jorm, 1994) which serves as a proxy for a dementia diagnosis, in the event one cannot be obtained. The Eligibility and Screening Form is used to complete screening and covers topics including caregiver/care recipient information, caregiver strain and caregiver medical conditions. Advantages of this system include its flexibility in handling multiple study tasks such as eligibility and screening, and enrollment of community subjects not seen through RADC.
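To illustrate the kind of logic the electronic screening forms enforce, the sketch below encodes the eligibility rules described above (sedentary status, caregiving strain, and the conditions that trigger physician approval). The thresholds come from the text, but the field names and the structure of the actual Eligibility and Screening Form are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ScreeningData:
    age: int
    weekly_activity_minutes: int
    reports_caregiving_strain: bool
    uncontrolled_hypertension: bool
    other_medical_concerns: bool   # e.g., diabetes, active cancer treatment, assistive devices

def screen(data: ScreeningData) -> dict:
    """Return a hypothetical screening decision: is the caregiver sedentary
    and strained, and is physician approval required before enrollment?"""
    sedentary = data.weekly_activity_minutes < 60
    eligible = sedentary and data.reports_caregiving_strain
    needs_physician_approval = eligible and (
        data.age > 70
        or data.uncontrolled_hypertension
        or data.other_medical_concerns
    )
    return {"eligible": eligible, "needs_physician_approval": needs_physician_approval}

print(screen(ScreeningData(age=74, weekly_activity_minutes=30,
                           reports_caregiving_strain=True,
                           uncontrolled_hypertension=False,
                           other_medical_concerns=False)))
# {'eligible': True, 'needs_physician_approval': True}
```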
Data collection
Clinical trial guidelines designate that baseline data be assessed on all participants prior to intervention initiation to determine if study groups are comparable at the beginning of the study. Data quality is monitored throughout the study, with particular attention given to identifying data that are missing, incorrect, or have excess variability (Table 1) (Friedman et al., 1998). TRAC approaches that assure data quality include the study design and use of the study protocol manual, well-designed forms that minimize the potential for data-entry errors, training and certification of data collection staff to assure standard procedures, periodic retraining, repeat and blinded assessment, and use of computer-based data-entry programs.
Caregiver interview data are collected using computer-assisted personal interviewing (CAPI) techniques implemented through Blaise®, Version 4.23. Blaise is a Windows-based software system designed to facilitate the preparation and management of stand-alone electronic data collection forms. The advantages of this approach are greater accuracy at the time of data collection, less checking and correction required later, rapid availability of data for study management and data analysis, and elimination of the cost of double-entry data-keying services. Blaise permits immediate, online quality control at the time of data collection through detailed range and logic checks built into the data entry program. After data are collected, routine error-checking programs identify errors, although few are identified because of the range and logic checks built into the programmed Blaise data entry forms. Routine programs also monitor data corrections and log changes to database tables to provide a traceable history of corrections. The program is loaded onto desktop and laptop computers, and research assistants administer the interview to all TRAC subjects either in their homes or by telephone. This eliminates paper-based surveys and reduces the potential for data transcription errors. Data collected using the CAPI interview techniques include caregiver and care recipient background variables, caregiver stressor variables, caregiver mental and physical health, and self-reported physical activity.
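Blaise itself is proprietary software, but the range and logic checks it applies at entry time, and the correction logging described above, can be illustrated with a short Python sketch; the field names, ranges, and log structure below are hypothetical, not the TRAC instrument's actual specification.

```python
import datetime

# Hypothetical range checks of the kind Blaise applies at data entry.
RANGES = {
    "caregiver_age": (18, 110),           # years
    "systolic_bp": (70, 250),             # mmHg
    "weekly_activity_minutes": (0, 2520), # minutes per week
}

def check_value(field: str, value: float) -> None:
    """Reject out-of-range entries so the interviewer must re-enter the value,
    mirroring the range checks built into the electronic data entry forms."""
    low, high = RANGES[field]
    if not (low <= value <= high):
        raise ValueError(f"{field}={value} outside allowed range {low}-{high}")

def log_correction(log: list, record_id: str, field: str, old, new) -> None:
    """Append a traceable correction entry, mirroring the routine programs
    that monitor corrections and log changes to database tables."""
    log.append({
        "record": record_id, "field": field, "old": old, "new": new,
        "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
    })

check_value("caregiver_age", 67)          # passes silently
corrections = []
log_correction(corrections, "TRAC-0042", "systolic_bp", 210, 120)
print(corrections)
```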
Randomization
Clinical trials are judged by whether samples are randomly assigned to groups; other designs are unfavorably viewed. Advantages of randomization include that (a) participants have the same chance of being assigned to the intervention group, (b) participants in each study group tend to be similar, (c) there is less bias because investigators are removed from assigning participants to groups, and (d) this approach assures that statistical tests are interpretable (Table 1) (Friedman et al., 1998). Following participant baseline assessment, the project manager uses a web-based randomization form on the TRAC website, enters a participant's project number, and the participant is automatically assigned to a group, thus eliminating the need for paper-and-pencil randomization (Table 1).
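The paper does not describe the allocation algorithm behind the web-based randomization form. As one plausible sketch only, permuted-block randomization keeps the two arms close to balanced while still assigning each participant by chance once a project number is entered.

```python
import random

def permuted_block_randomize(project_numbers, block_size=4, seed=20110101):
    """Assign participants to EPAI or CSBI using permuted blocks.
    A sketch only -- the TRAC form's actual allocation scheme is not specified."""
    rng = random.Random(seed)            # fixed seed keeps the list reproducible
    assignments = {}
    block = []
    for pn in project_numbers:
        if not block:                    # start a new balanced block
            block = ["EPAI"] * (block_size // 2) + ["CSBI"] * (block_size // 2)
            rng.shuffle(block)
        assignments[pn] = block.pop()
    return assignments

print(permuted_block_randomize(["1001", "1002", "1003", "1004", "1005"]))
```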
Data management, tracking, and report generation
Ensuring data quality is initially addressed during study planning and monitored throughout the study. Poor-quality data are minimized when a detailed operations protocol manual that addresses all study aspects is prepared before the study begins; forms are well-designed and pretested; investigators and staff are trained to standardize all procedures; certification procedures are initiated for collection of selected data; repeat and/or blinded assessment is used to reduce variability of the data; and data-entry programs identify missing data and inconsistent values and prohibit entry of out-of-range values. Quality monitoring includes verifying that forms are completed and that study procedures are followed (Table 1) (Friedman et al., 1998).
TRAC study staff use two web-based forms for ongoing monitoring and reporting. A weekly All Eligibility Form notifies staff of due dates for follow-up interviews and the window of time during which these interviews should be completed. This automated system assures that interview dates will not be missed due to human error. The Component Status Form allows staff to regularly view the intervention contacts and follow-up assessments completed by participants at any given time. With these forms, staff can be assured that participants are receiving the intervention contacts as intended. Other reporting functions enable staff to identify all new baseline interviews, timing of follow-up interviews, and creation of a current CONSORT table of study enrollment at any time (K.F. Schulz, Altman, & Moher, 2010).
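The due-date arithmetic behind the All Eligibility Form is straightforward. The sketch below computes follow-up interview due dates and a window around each from the baseline date; the ±14-day window and the 30.4-day month approximation are illustrative assumptions, since the study's actual window is not stated here.

```python
from datetime import date, timedelta

FOLLOW_UP_MONTHS = [3, 6, 9, 12, 18]
WINDOW_DAYS = 14   # hypothetical window; the study's actual window is not specified

def follow_up_windows(baseline: date):
    """Approximate each follow-up month as 30.4 days and return
    (months, due date, window opens, window closes) for each interview."""
    schedule = []
    for months in FOLLOW_UP_MONTHS:
        due = baseline + timedelta(days=round(months * 30.4))
        schedule.append((months, due,
                         due - timedelta(days=WINDOW_DAYS),
                         due + timedelta(days=WINDOW_DAYS)))
    return schedule

for months, due, opens, closes in follow_up_windows(date(2011, 1, 10)):
    print(f"{months:2d}-month interview due {due} (window {opens} to {closes})")
```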
Data Analysis
Clinical trial guidelines suggest that a number of complex issues, which go beyond the technology focus of this manuscript, need to be addressed concerning data analysis and interpretation of clinical trial findings, including (a) assessment of adverse events and monitoring response variables; (b) determination of which participants should be analyzed, when/how covariate adjustments should be made, and how subgroup analyses should be conducted; and (c) presentation and interpretation of results with sufficient information so that readers can adequately evaluate the trial (Table 1). The reader is referred to additional sources that address clinical trial guidelines in greater detail (Friedman et al., 1998; Meinert, 1986). Suffice it to say that assessment of TRAC adverse events and preliminary monitoring of response variables are carried out by a Data and Safety Monitoring Committee, as outlined by National Institutes of Health guidelines (National Institutes of Health, 1998).
In place for the TRAC study is a technological process that further supports quality assurance of data analytic processes. Data analytic requests are submitted via TRAC's centralized web-based data management program. The programmer assigned to the TRAC study accesses these requests and posts output on-line through this program. Correspondence between the programmer and TRAC staff concerning a particular data request is posted as a question-and-answer (Q&A) forum on the website and linked with the particular data request. Responses are posted on the website as well as sent via e-mail, which provides an on-going “paper trail” concerning issues that are addressed. The major advantage of this centralized system is that it eliminates the need to maintain print files of data requests and output and links any ongoing communication with a particular request.
Maintaining Treatment Fidelity
The following paragraphs address strategies for maintaining treatment fidelity developed by the National Institutes of Health Behavior Change Consortium (BCC; Bellg et al., 2004; Borrelli et al., 2005; Resnick et al., 2005). While the clinical trials literature addresses overall study design and development, study management, and data analytic strategies, the BCC strategies focus more specifically on treatment fidelity as related to a particular behavioral intervention.
Treatment Design
The BCC design guidelines focus on ensuring that a study adequately tests hypotheses related to an underlying theory or clinical processes (Table 2; Bellg et al., 2004). Four major BCC treatment fidelity strategies are recommended concerning treatment design: (a) information about dose for the intervention and comparison conditions, (b) mention of provider credentials, (c) acknowledgement of a theoretical model, and (d) clinical guidelines upon which the intervention is based (Borrelli et al., 2005). For succinctness, this information concerning the TRAC study is summarized in the Maintaining Quality Control, Study Design and Development section above. The Study Procedure Manual summarizes information about the intervention, including the theoretical model and clinical guidelines. Separate print-copy Intervention Protocol Manuals are used by the EPAI and CSBI TCs, respectively (see Table 2).
Table 2.
Behavior Change Consortium Treatment Fidelity Strategies and TRAC Technology Used to Support these Strategies (Borrelli et al., 2005)
Treatment Fidelity Strategies | Definition and Description | TRAC Study Technology used to Support Treatment Fidelity Strategies |
---|---|---|
Treatment Design | Treatment design issues focus on (a) information about dose for both intervention and control conditions, (b) provider credentials, (c) acknowledgement of a theoretical model, and (d) clinical guidelines upon which the intervention is based. | Intervention design and development occurs prior to study implementation. Study procedures are summarized in a Study Procedure Manual, available to team members on a shared computer drive. Telephone counselors have print-copies of their respective EPAI and CSBI intervention manuals. |
Training of Providers | Training issues (a) describe how providers are trained, (b) standardize provider training, (c) measure whether providers acquire the appropriate skills post-training, and (d) describe how intervention skills are maintained over time. | Echo server audio files
Delivery of Treatment | Determines whether (a) content is being delivered as specified, (b) dose is delivered as specified, (c) provider adhered to intervention plan, (d) nonspecific treatment effects are assessed, and (e) treatment manual is used. | Echo server audio files Care-Related Concerns Form Telephone Counselor Performance Evaluation Form
Receipt of Treatment | Determines whether the participant (a) comprehends intervention during the intervention, (b) understands intervention above and beyond intervention intention, (c) is able to perform intended skills, and (d) receives interventions concerning how to improve performance during the intervention. | Echo server audio files Program Adherence Goals Form Telephone Counselor Performance Evaluation Form Pedometer Steps and Physical Activity Log (EPAI, only) recorded on Program Adherence Goals Form |
Enactment of Treatment | Determines if subject can (a) perform skills in the setting in which intervention might be applied, and (b) improve performance in the intended setting. | Echo server audio files Treatment Compliance and Enactment Form Clinician Assessment of Caregiver Skills Form |
Note. All Forms are available on the TRAC study web-site. CSBI: Caregiver Skill Building Intervention (control); EPAI: Enhancing Physical Activity Intervention (treatment); TRAC: Telephone Resources and Assistance for Caregivers.
Technology That Supports Treatment Fidelity Related to TRAC Intervention Implementation
TCs and supervisors use Echo Server technology to efficiently and inexpensively monitor four aspects of treatment fidelity: training, delivery, receipt, and enactment (Table 2). The echo service is an Internet protocol (IP) service that carries voice from the transmitter of one telephone to the receiver of another telephone. The Echo Server receives a data packet from a client (the voice transmitter) and echoes it back, ensuring that each sound is received exactly as it was sent, without modification. Although there are other types of recording devices, the academic medical center already had the Echo Server system in place to record patient phone calls for quality monitoring. The Echo Server is configured so that TCs can save the audio file of each telephone intervention call to the research team's central server, similar to an electronic filing cabinet. Supervisors can then directly access these saved audio files from the central server. Every telephone intervention call is recorded using the Echo Server technology, and each audio file is time- and date-stamped and uploaded to a password-protected location on the central server, a SUN workstation. To maintain confidentiality of the audio files and intervention logs, access privileges are limited to designated personnel with individual, password-controlled accounts. Clinical supervisors review 20%–25% of randomly selected Echo Server audio files for each intervention on a weekly to bi-monthly basis and compare the content of the audio file to the corresponding TC web-based forms.
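The echo behavior described above (a server that returns each packet exactly as received) can be illustrated with a minimal TCP echo server in Python; this is a generic sketch, not the medical center's actual Echo Server configuration, and the host and port are placeholders.

```python
import socket

def run_echo_server(host="127.0.0.1", port=7007):
    """Minimal TCP echo server: whatever bytes a client sends are returned
    unchanged, which is the property the Echo Server relies on so that each
    sound is received exactly as it was sent."""
    with socket.create_server((host, port)) as server:
        conn, addr = server.accept()
        with conn:
            while True:
                data = conn.recv(4096)
                if not data:          # client closed the connection
                    break
                conn.sendall(data)    # echo the packet back unmodified

if __name__ == "__main__":
    run_echo_server()
```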
Using the Echo Server system to record phone calls has more advantages than disadvantages: 1) supervisors can listen to the audio files at any time and in any place they have internet access; and 2) calls do not need to be taped and transcribed. The major limitation of this system is that only calls made from the TC's university office can be recorded. Calls made from home or cell phones are not recorded, so supervisors cannot monitor these interactions. However, non-recorded calls represent a small percentage of intervention calls.
Supervisor preparation for calls. The audio file is stored on the TRAC study common drive as an MP3. When the file is opened, the supervisor can visually verify the caregiver's identification number, the call number, the date of the call, and the duration of the call in minutes. Prior to listening to an audio file, the supervisor reviews the TC's documentation logs for four types of information that help put the session in context: (a) Background: Are there any new caregiver or care receiver issues that could impact the intervention? If yes, what are they? (b) Care-related concerns: What content was covered during the session, and is this consistent with what should be covered according to the protocol? (c) Short- and long-term goals: What short-term goal is the caregiver focusing on for this session? What are the long-term goals? (d) Treatment compliance and adherence: For every session, the TC rates the caregiver's completion of homework assignments related to care-related issues (i.e., personal care, behavioral symptoms, caregiver stress, community resources contacted) and the caregiver's level of intervention enactment, that is, the extent to which the caregiver applied the intervention content and achieved his or her goals. The EPAI supervisor rates additional information as it applies to the physical activity intervention (Table 2).
Training Providers
Training, which is vital to maintaining treatment fidelity, determines if treatment providers have been adequately trained to provide the intervention to study participants (Table 2; Bellg et al., 2004). Four treatment fidelity strategies are generally considered to constitute what should be addressed in a description of the training process: (a) how providers were trained, (b) how training is standardized, (c) how skills are assessed post-training, and (d) how provider skills are maintained over time (Borrelli et al., 2005). Both TCs received 16 hours of initial training in a group setting using the standardized TRAC Study Procedure Manual. The TC for the treatment group (EPAI) received an additional 8 hours of training related to physical activity, including content developed in the "Active Choices" program (Wilcox et al., 2006), which builds on key constructs of social cognitive theory, such as tailoring the physical activity intervention to a person's readiness for change (Prochaska, DiClemente, & Norcross, 1992).
The telephone counselor/clinical supervisor relationship
This relationship is vital to assessing the impact of training on intervention providers, as well as determining intervention delivery, receipt and enactment. Two PhD-prepared clinical supervisors/co-investigators served in this role, one for each intervention arm. Clinical supervisors have expertise in care of older adults, family caregivers and/or physical activity interventions.
Monitoring TRAC fidelity related to training and retraining of telephone counselors. The audio files produced using Echo Server technology provide an easy way to conduct ongoing, weekly to bi-monthly monitoring of the intervention. Results of this monitoring are shared with TCs on a regular basis in face-to-face meetings, telephone calls, or email messages. Interactions between supervisors and telephone counselors are used to identify areas of drift from the protocol and retrain as needed, to identify and reinforce areas of competence and strength, and to help resolve problems that TCs encounter concerning the skill-building (control) and physical activity (treatment) components of the intervention.
Delivery of Treatment
Delivery focuses on using study procedures to standardize delivery of the intervention to determine if the intervention is delivered as intended (Bellg et al., 2004). Common strategies associated with treatment delivery include determining if (a) the intervention content is delivered as intended, (b) the dose of the intervention is as intended, (c) the provider adhered to the intervention plan, (d) the intervention has nonspecific treatment effects, and (e) if the treatment manual is being used (Table 2; Borrelli et al., 2005).
Monitoring TRAC fidelity related to intervention delivery
This aspect of fidelity focuses on the TCs and occurs when supervisors listen to audio files of the telephone-based intervention to determine whether the intervention was delivered as intended. Two aspects of telephone counselor delivery are assessed: (a) competence and skill in interacting with the caregiver; and (b) adherence to the protocol in terms of content, dose, and quality of the intervention. To balance individual caregiver needs with prescribed manual requirements, TCs make deliberate efforts to adhere to the intervention content. If a caregiver is facing a major decision (i.e., increasing medication for the care recipient, nursing home placement), the TC may focus on the issue at hand rather than the scheduled content, direct the caregiver to manual content already discussed ahead of schedule, or highlight content that is not being covered because of the caregiver's personal needs. These alterations from protocol are documented in the web-based Care-Related Concerns form (Table 2). For each audio file reviewed, the supervisor completes the Telephone Counselor Performance Evaluation Form, an electronic data entry form that focuses on telephone counselor competence and adherence. A sample of these items can be seen in Figure 1.
Receipt of Treatment
Treatment fidelity related to receipt concerns the degree to which caregivers receive and understand the treatment as intended (Table 2; Bellg et al., 2004; Resnick et al., 2005). Common strategies associated with treatment receipt include: (a) assessment of caregiver's comprehension of intervention during the intervention, (b) strategies designed to improve caregiver intervention comprehension above and beyond intervention content, (c) caregiver's ability to perform the intervention during the intervention period, and (d) strategies to improve caregiver intervention performance during intervention period (Borrelli et al., 2005).
Monitoring TRAC fidelity related to intervention receipt
In the TRAC study, this aspect of fidelity is monitored through the use of the electronic form, Program Adherence Goals (Table 2), a form that documents caregiver's short and long-term goals formulated at the end of each telephone session. Caregiver verbal feedback regarding personal reaction to this content is also documented.
This aspect of fidelity focuses on the caregivers and is designed to assure that the intervention has been received and understood. Supervisors listen to the audio files for evidence during the intervention that caregivers understand the content being delivered. For example: (a) Do caregivers' questions and comments demonstrate that they understand the material being taught? (b) Do caregivers' descriptions of the care they provide demonstrate that they can use the skills they are being taught? (c) To ascertain receipt, the supervisor assesses whether caregivers set appropriate goals, build on their strengths, and change their behavior. (d) The supervisor also assesses whether the telephone counselor appropriately rehearses new behaviors and skills with the caregiver and appropriately uses contracts and homework assignments. These aspects of receipt are documented by the supervisor in the web-based Telephone Counselor Performance Evaluation form (Table 2). Only EPAI participants use pedometers to monitor daily activity and set goals for physical activity improvement; they record their steps in a Physical Activity Log (Table 2).
Enactment of Treatment Skills
Treatment fidelity enactment strategies assess whether subjects actually perform the intervention skills in real-world settings and whether they improve their performance of these skills in their own setting (Borrelli et al., 2005). Enactment for CSBI and EPAI participants examines whether caregivers engage in positive caregiving skills. For EPAI participants only, enactment also examines whether caregivers can address barriers to increasing and maintaining their physical activity of choice (Table 2).
Monitoring TRAC fidelity related to treatment enactment
Enactment during the intervention is assessed with two web-based forms: the Treatment Compliance and Enactment and the Clinician Assessment of Behavioral Skill Forms (Table 2). These forms capture enactment of the intervention immediately following each session and at long term follow-up. Specifically, the forms track whether caregivers are learning and making skill-based improvements pertaining to caregiving and physical activity (in the EPAI, only).
Results
This manuscript describes the process of comprehensively integrating technology into a randomized clinical trial of a physical activity intervention for family caregivers of persons with Alzheimer's disease. This study builds upon two bodies of literature that provide direction for determining study quality control. Clinical trials' literature directs that attention be given to study design and development, study management, and data analysis (Friedman et al., 1998). The National Institutes of Health Behavior Change Consortium guidelines provide direction for assuring treatment fidelity (Bellg et al., 2004; Borrelli et al., 2005; Resnick et al., 2005). These issues are complex and go beyond the mere use of technology, but building behavior intervention studies upon these combined bodies of literature increases the likelihood that studies will be properly designed and implemented and that interventions are appropriately developed, implemented, and monitored. Of note is that the thinking process that occurs before, during, and after the study is implemented is vital to designing and testing appropriate health behavior interventions.
Integration of technology throughout all phases of the TRAC study resulted in at least three advantages. First, communication was enhanced at multiple levels: (a) among three institutional infrastructures within a large medical center (i.e., a college of nursing, a clinical treatment center, and a research center); (b) within the investigative team, which, although based in the Midwest, also includes a co-investigator on the West Coast; and (c) with varied community sites that, while primarily based within the Chicago metropolitan area, include subject recruitment in three contiguous states. Second, technologies that support study management procedures, data analysis, and tracking of study outcomes are comprehensive and suggest that technology, properly developed, implemented, and maintained, potentially enables research staff to "do more," and to do it more accurately and efficiently. Third, technologies used to support intervention delivery suggest that telephone-based strategies can be used to support family caregiver skill building and health promotion, as well as to provide one-on-one supervision of the telephone counselors who deliver these interventions.
Potential limitations and/or disadvantages of such expansive technological support rest primarily on the costs and on the need for staff who can address the broad range of issues that may arise in implementing and managing such a complex system. Another limitation of the current study is that outcomes and the process of intervention implementation have not yet been fully examined. Finally, technology should not be integrated into studies "just for the sake of technology"; it should have a clearly defined purpose and should make study monitoring, data collection, and treatment implementation easier, more efficient, and more accurate, rather than more difficult and costly to implement.
Next steps that need to be considered include the integration of technology into the delivery and dissemination of health behavior interventions (Glanz, Rimer, & Viswanath, 2008; Nguyen, 2010). The increasing number of persons who have access to and use information and communication technologies supports this direction of inquiry. Particular platforms available to advance this work include the Internet, social and participatory media, and increasingly common online and mobile formats (e.g., iPad and iPhone), which appeal to a wide range of users (Eysenbach, 2008; Fox, 2008). One challenge for investigators who wish to develop and test future technology-based health behavior interventions is that consumers are better informed than in the past, have greater access to information, and may have higher expectations for meaningful interventions. Another challenge is technology itself, an evolving target that may create extended delays between the conceptualization, development, and testing of a technology-based intervention.
Potential opportunities concerning the integration of technology into future behavioral interventions include increased possibilities for reaching more persons, given the broader range of available technologies; expanded possibilities for collecting interactive online data; and enhanced ability to track intervention exposure. Additional opportunities for intervention delivery include the ability to offer multi-component interventions using varied formats and greater flexibility for participant communication and exchange of information (Carrieri-Kohlman, 2005–2011). To assure that these opportunities are maximized, researchers need to develop collaborative ventures with individuals and/or small businesses whose technological skills and expertise augment researchers' ability to conceptualize, develop, and implement health promotion interventions.
Figure 1.
Supervisor/Telephone Counselor Performance Evaluation: Exemplar Items
Acknowledgement
The authors thank their collaborators at the Rush Alzheimer's Disease Center and the Rush Institute for Healthy Aging; current TRAC caregiver participants; and Research Associates Diane Marston and Deborah Monson.
Funding The authors disclosed receipt of the following financial support for the research and/or authorship of this article: National Institute of Nursing Research, R01 NR009543; National Institute of Aging, P30 AG010161
Footnotes
Declaration of Conflicting Interests The authors declared no conflicts of interest with respect to the authorship and/or publication of this article.
References
- Alzheimer's Association. 2010 Alzheimer's disease facts and figures. Alzheimer's and Dementia. 2010;6:158–194. doi: 10.1016/j.jalz.2010.01.009.
- Bandura A. Social foundations of thought and action: A social cognitive theory. Prentice Hall; Englewood Cliffs, NJ: 1986.
- Bandura A. Human agency in social cognitive theory. American Psychologist. 1989;44:1175–1184. doi: 10.1037/0003-066x.44.9.1175.
- Bandura A. Self-efficacy: The exercise of control. W.H. Freeman and Company; New York: 1997.
- Bellg AJ, Borrelli B, Resnick B, Hecht J, Minicucci DS, Ory M, Ogedegbe G, Orwig D, Ernst D, Czajkowski S. Enhancing treatment fidelity in health behavior change studies: Best practices and recommendations from the NIH Behavior Change Consortium. Health Psychology. 2004;23(5):443–451. doi: 10.1037/0278-6133.23.5.443.
- Blaise (Version 4.23) [Computer software]. Westat; Rockville, MD.
- Borrelli B, Sepinwall D, Ernst D, Bellg AJ, Czajkowski S, Breger R, DeFrancesco C, Levesque C, Sharp DL, Ogedegbe G, Resnick B, Orwig D. A new tool to assess treatment fidelity and evaluation of treatment fidelity across 10 years of health behavior research. Journal of Consulting and Clinical Psychology. 2005;73(5):852–860. doi: 10.1037/0022-006X.73.5.852.
- Carrieri-Kohlman V. Dyspnea self-management: Internet or face-to-face (R01NR008938). 2005–2011.
- Christmas C, Andersen RA. Exercise and older patients: Guidelines for the clinician. Journal of the American Geriatrics Society. 2000;48(3):318–324. doi: 10.1111/j.1532-5415.2000.tb02654.x.
- Clancy C. Achieving enhanced quality and care through health IT. Retrieved November 20, 2006, from http://www.ahrq.gov/news/sp110106.htm.
- Eysenbach G. Medicine 2.0: Social networking, collaboration, participation, apomediation, and openness. Journal of Medical Internet Research. 2008;10(3):e22. doi: 10.2196/jmir.1030.
- Farran CJ, Staffileno BA, Gilley DW, McCann JJ, Li Y, Castro CM, King AC. A lifestyle physical activity intervention for caregivers of persons with Alzheimer's disease. American Journal of Alzheimer's Disease and Other Dementias. 2008;23(2):132–142. doi: 10.1177/1533317507312556.
- Fox S. Pew Internet & American Life Project. 2008. Retrieved September 25, 2010, from http://www.pewinternet.org.
- Friedman LM, Furberg CD, DeMets DL. Fundamentals of clinical trials. 2nd ed. Springer; New York: 1998.
- Glanz K, Rimer BK, Viswanath K, editors. Health behavior and health education. 4th ed. Jossey-Bass, John Wiley & Sons; San Francisco, CA: 2008.
- Hulley SB, Cummings SR, Browner WS, Grady DG, Newman TB. Designing clinical research. 3rd ed. Lippincott Williams & Wilkins; Philadelphia, PA: 2007.
- Jorm AF. A short form of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE): Development and cross-validation. Psychological Medicine. 1994;24:145–153. doi: 10.1017/s003329170002691x.
- Lawton MP, Kleban MH, Moss M, Rovine M, Glicksman A. Measuring caregiving appraisal. Journal of Gerontology: Psychological Sciences. 1989;44(3):P61–P71. doi: 10.1093/geronj/44.3.p61.
- Lazarus RS, Folkman S. Stress, appraisal, and coping. Springer Publishing Company; New York: 1984.
- Levkoff S, Sanchez H. Lessons learned about minority recruitment and retention from the Centers on Minority Aging and Health Promotion. Gerontologist. 2003;43(1):18–26. doi: 10.1093/geront/43.1.18.
- Mazzeo RS, Cavanagh P, Evans WJ, Fiatarone M, Hagberg J, McAuley E. American College of Sports Medicine position stand: Exercise and physical activity for older adults. Medicine & Science in Sports & Exercise. 1998;30(6):992–1008.
- Meinert CL. Clinical trials: Design, conduct and analysis. 1986.
- National Institutes of Health. NIH inventory of clinical trials (Vol. 1). National Institutes of Health, Division of Research Grants, Research Analysis and Evaluation Branch; Bethesda, MD: 1979.
- National Institutes of Health. NIH policy for data and safety monitoring. 1998. Retrieved September 25, 2010, from http://grants.nih.gov/grants/guide/notice-files/not98-084.html.
- Nguyen HQ. Digital health consumers: Transforming the clinical research landscape. Western Institute of Nursing keynote presentation; Phoenix, Arizona: 2010.
- Omron Healthcare, Inc. HEM-907 digital blood pressure monitor. 2001. Retrieved from www.omronhealthcare.com. doi: 10.1097/00126097-200104000-00007.
- Pearlin LI, Mullan JT, Semple SJ, Skaff MM. Caregiving and the stress process: An overview of concepts and their measures. The Gerontologist. 1990;30(5):583–594. doi: 10.1093/geront/30.5.583.
- Prochaska JO, DiClemente CC, Norcross JC. In search of how people change: Applications to addictive behaviors. American Psychologist. 1992;47:1102–1114. doi: 10.1037//0003-066x.47.9.1102.
- Resnick B, Bellg AJ, Borrelli B, DeFrancesco C, Breger R, Hecht J, Sharp DL, Levesque C, Orwig D, Ernst D, Ogedegbe G, Czajkowski S. Examples of implementation and evaluation of treatment fidelity in the BCC studies: Where we are and where we need to go. Annals of Behavioral Medicine. 2005;29(Special Supplement):46–54. doi: 10.1207/s15324796abm2902s_8.
- Rikli RE, Jones CJ. Senior fitness test manual. Human Kinetics; Champaign, IL: 2001.
- Rogers ME. Pre-exercise and health screening. In: Jones J, Rose D, editors. Physical activity instruction of older adults. Human Kinetics; Champaign, IL: 2005. pp. 57–80.
- Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: Updated guidelines for reporting parallel group randomized trials. Obstetrics & Gynecology. 2010;115(5):1063–1070. doi: 10.1097/AOG.0b013e3181d9d421.
- Schulz R, Beach SR. Caregiving as a risk factor for mortality: The Caregiver Health Effects Study. Journal of the American Medical Association. 1999;282(23):2215–2219. doi: 10.1001/jama.282.23.2215.
- Schulz R, Lustig A, Handler S, Martire LM. Technology-based caregiver intervention research: Current status and future directions. Gerontechnology. 2002;2(1):15–47.
- Stewart AL, Hays RD, Ware JE. The MOS short-form general health survey: Reliability and validity in a patient population. Medical Care. 1988;26(7):724–735. doi: 10.1097/00005650-198807000-00007.
- Stewart AL, Mills KM, King AC, Haskell WL, Gillis D, Ritter PL. CHAMPS physical activity questionnaire for older adults: Outcomes for interventions. Medicine & Science in Sports & Exercise. 2001;33(7):1126–1141. doi: 10.1097/00005768-200107000-00010.
- Vitaliano P, Zhang J, Scanlan J. Is caregiving hazardous to one's physical health? A meta-analysis. Psychological Bulletin. 2003;129(6):946–972. doi: 10.1037/0033-2909.129.6.946.
- Wantland DJ, Portillo CJ, Holzermer WL, Slaughter R, McGhee EM. The effectiveness of web-based vs. non-web-based interventions: A meta-analysis of behavioral change outcomes. Journal of Medical Internet Research. 2004;6(4):e40. doi: 10.2196/jmir.6.4.e40.
- Watson D, Clark L. Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology. 1988;54(6):1063–1070. doi: 10.1037//0022-3514.54.6.1063.
- Wilcox S, Dowda M, Griffin SF, Rheaume C, Ory MG, Leviton L, King AC, Dunn A, Buchner DM, Bazzarre T, Estabrooks PA. Results of the first year of Active for Life: Translation of 2 evidence-based physical activity programs for older adults into community settings. American Journal of Public Health. 2006;96(7):1201–1209. doi: 10.2105/AJPH.2005.074690.