Abstract
The current standard in healthcare research is to maintain scientific fidelity of any intervention being tested. Fidelity is defined as the consistent delivery of interventions, ensuring that all participants are provided the same information, guidance, and/or materials. Notably, the methods for ensuring fidelity of intervention delivery must also be consistent. This manuscript describes our Intervention and Technology Delivery Fidelity Checklists, which are used to ensure consistency. These checklists were completed by trained nurse observers who rated the intervention implementation and the technology delivery. Across our clinical trials and pilot studies, the fidelity scores were tabulated and compared. Intervention information and materials were delivered by a variety of technology devices, including telehealth monitors, videophones, and/or iPads. Each of these devices allows audio-visual connections between health professionals in their offices and patients/participants in their homes. Our checklists guide the monitoring of technology delivery fidelity. Overall checklist ratings across our studies demonstrate consistent intervention implementation and technology delivery approaches. Uniquely, our fidelity checklist verifies the interventionist's correct use of the technology devices to ensure consistent audio-visual delivery. Our checklist methods for ensuring intervention fidelity and technology delivery are essential research procedures that can be adapted for use by researchers across multiple disciplines.
Keywords: intervention fidelity, science rigor, technology, telehealth, mHealth
Introduction
The term fidelity refers to a concept used widely in information science and healthcare research. Intervention fidelity is defined as staying true to the description of the intervention being tested and consistently delivering that information to all research participants in the same manner.1 Advancements in technology have increased nursing, medical, and allied health providers' ability to deliver interventions to patients in their homes via affordable internet options. Establishing intervention or treatment fidelity is challenging, especially when the intervention is delivered at a distance using telehealth technology systems. An important measure of intervention fidelity when using telehealth delivery is consistent delivery across technology platforms.2–3 If delivery varies or information is inconsistent, participants do not receive the same intervention. With such variation in a research study, the analysis of outcomes becomes impossible.4
The purpose of this article is to (1) describe the basic components and types of intervention and telehealth delivery fidelity procedures, and (2) illustrate how we use observation checklists to ensure our interventions are delivered consistently and as planned across all participants in each study.5 Our intervention fidelity procedures include having a professional who is not participating in the intervention delivery observe and rate interventions using checklists. Positive checklist ratings indicate that the implementation of an intervention was conducted as planned and consistently across each participant's delivery.
Thus, the focus of this article is how to use our research checklist procedures to ensure intervention and delivery fidelity. Further, we provide checklist examples and ratings from our pilot and other studies to illustrate observed fidelity. Checklists were designed for each study to measure consistent approaches specific to the intervention and the technology used in the delivery. The principles and components that guide our fidelity monitoring checklists are based on the National Institutes of Health's Best Practices national framework for intervention fidelity.6 This framework recommends strategies for maintaining and enhancing consistency of intervention delivery in the studies that NIH funds. The NIH Framework principles are: (a) consistency in intervention content, information, and delivery; and (b) training the interventionists (i.e. the nurse, psychologist, or health professional who implements the intervention) to deliver a consistent intervention. Maintaining these NIH principles through strategies such as observation checklists is essential to good intervention and delivery fidelity. Concrete examples of the use of these checklists are described across our series of studies with patients who must infuse intravenous (IV) nutritional fluids daily. Given that patient participants were observed in their homes from a distance via the audio and visual devices, fidelity was assessed for each technology delivery.
Fidelity in Interventions Delivered via Technology
Audio-visual connections allow interventionists to observe and support patients in learning and following through with prescribed home-care and health management tasks in their own settings. Delivery into the home reduces the risk of patients' exposure to infection or contagion associated with waiting rooms and hospital clinics and reduces patient travel. Technology to support patients in their homes is growing exponentially, yet there are few studies that test the fidelity of intervention delivery via audio-visual technology.3 The consistency and reliability of the method of delivery should be measured so that each patient participant's intervention is delivered in the same manner.7–9 Further, it is imperative that the intervention delivery be consistent so that other researchers can replicate the study.10–11 Thus, our checklist procedures for assessing delivery fidelity were used across all our studies.
All intervention sessions began with the nurse or psychologist interventionist guiding the participants in the correct set-up and use of the technology placed in their homes (see Table 1). These procedures ensured that the audio-visual delivery between participants' homes and professionals' offices was consistently understandable, with clear visual connections.
Table 1:
Checklist Items Regarding Set-up for Technical Delivery
| Observations of technical delivery with a videophone or iPad: |
|---|
| 1. The nurse interventionist chooses an appropriate location suitable for the videophone/iPad visit (e.g. private, away from excess noise) and guides the patient to be close to the telephone line jack and the internet connection. |
| 2. The videophone/iPad is placed on top of a stand, counter, desk, or table so that there is no camera movement, with the camera opening in the front facing where the patient sits 2 feet away. |
| 3. Nurse interventionist checks that there is sufficient lighting (can have lamp moved behind the subject to reduce shadows or glare). |
| 4. The nurse interventionist checks that the camera is in focus and the patient agrees that the videophone or iPad volume is easily heard. |
| 5. The nurse interventionist uses proper equipment directions to the participant for each assessment undertaken (i.e. directs patient to place finger in oximeter correctly). |
| 6. Troubleshoots any blurred visuals or poor audio as necessary. |
Rating scale ranged from 1= strongly disagree to 5= strongly agree.
Across all our studies, maintaining technology delivery fidelity included placing the technology device where the two-way visualization was clearest, using appropriate lighting, and checking for adequate sound.12–13 Effective lighting was essential for nurse interventionists to clearly interact with patient participants and observe their facial expressions and body language. Unique to our fidelity strategies is training interventionists on each technology device used per study so that all research participants receive consistent intervention delivery. Our intervention observation checklist aligns with previous methods developed for observing nursing care given in the home. An inconspicuous observer monitored each technology intervention delivery session and was trained to use the specific checklist rating scales. All rating scales were set up on a Likert (1 to 5) rating system. Observers rated whether the strategies for each delivery were completed with fidelity: for example, whether video calls were conducted in a private location with the technical device on a stable table/counter, the volume was adequate, and a clear picture was captured.
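To make these observer ratings concrete, the sketch below shows one way a single session's checklist could be represented and validated in code. This is our illustration only: the item wording paraphrases Table 1, and the data structure, names, and rating values are hypothetical rather than drawn from the studies.

```python
from dataclasses import dataclass

# Likert anchors from the checklists: 1 = strongly disagree ... 5 = strongly agree.
LIKERT_MIN, LIKERT_MAX = 1, 5

@dataclass
class ChecklistItem:
    text: str        # the delivery behavior the observer rates
    rating: int      # the observer's Likert rating (1-5)

    def validate(self) -> None:
        if not LIKERT_MIN <= self.rating <= LIKERT_MAX:
            raise ValueError(f"rating out of range for item: {self.text!r}")

# One hypothetical session's technical-delivery ratings (items paraphrase Table 1).
session = [
    ChecklistItem("Private, low-noise location chosen for the visit", 5),
    ChecklistItem("Device on a stable surface, camera facing the participant", 4),
    ChecklistItem("Sufficient lighting without shadows or glare", 4),
    ChecklistItem("Camera in focus; volume easily heard", 5),
    ChecklistItem("Troubleshoots blurred visuals or poor audio", 5),
]

for item in session:
    item.validate()  # reject out-of-range ratings before scoring
```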
Our fidelity checklist criteria assess various technology delivery devices such as videophones, telemedicine equipment, and audio-visual tablets such as iPads. The checklist rating items are written to apply to the specific technology used in each study.12 Thus, checklist procedures were designed to ensure each technology device functions at its highest possible level. Participants also evaluated the clarity of the delivery. Table 2 summarizes participants' anonymous evaluations of the early videophone technology used in our studies. Participant ratings used a Likert scale from 1 (strongly disagree) to 5 (strongly agree) regarding technology delivery of interventions. Participants rated it easy to ask the nurse interventionist questions, to see and understand what the interventionist was saying, and to clearly see the illustration materials projected to them over the telehealth monitor. The high midrange scores (>3) given by patient participants concerning their desire to use the videophone to talk with others were encouraging.
Table 2.
Participant Checklist Ratings of Videophone Use and Delivery of Information.
| Videophone Evaluation Questions | Mean (SD) out of 5 |
|---|---|
| It was easy to ask questions using the videophone. | 3.9 (1.54) |
| It was easy to understand what the nurse was saying using the videophone. | 4.0 (1.50) |
| It was easy to set up and use the videophone. | 4.4 (.73) |
| The videophone let the nurse find out how I was feeling just as well as if she had come to visit in person. | 3.6 (1.51) |
| I felt that the other person and I had good personal information exchange. | 4.3 (1.00) |
| It would have been better if the other person had been able to visit me in person. | 3.3 (1.50) |
| The videophone intruded on my privacy. | 1.3 (.71) |
| I would like to use the videophone to talk with my relatives or friends. | 3.4 (1.60) |
Rating scale ranged from 1= strongly disagree to 5= strongly agree.
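The mean (SD) values reported in Table 2 are ordinary descriptive statistics over the participants' Likert responses. A minimal sketch, using invented responses because the raw data are not reproduced here:

```python
import statistics

# Hypothetical raw Likert responses (1-5) for one Table 2 item.
responses = [5, 4, 2, 5, 3, 5, 4, 1, 5, 5]

mean = statistics.mean(responses)   # arithmetic mean
sd = statistics.stdev(responses)    # sample standard deviation (n - 1 denominator)
print(f"{mean:.1f} ({sd:.2f}) out of 5")
```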
Materials & Methods
Specially trained nurse observers used the checklist to rate whether the interventionists followed their training during the rare technical disconnections, managed any screen blurring (pixelation) caused by movement, and scheduled subsequent technology sessions.14 Checklist ratings also guided the observer in rating the delivery of the specified intervention information content, the topics to be discussed, and the health care skills demonstrated.15 Our fidelity checklists also include criteria-specific rating items for correct use of a variety of intervention approaches. These included behavioral skills training, resilience building strategies, psychologists' adherence to consistent counseling techniques and approaches with patients,16–17 teaching about medications, and home monitoring of prescribed medical treatments. A specific fidelity checklist was generated for each study based on the intervention content, the materials provided to participants, and the specific technology used.
In all our studies described here, the patient participants were drawn from the population of individuals prescribed life-long, daily IV nutrient infusions to sustain their health.18 The research question addressed by the fidelity ratings reported here for each study was: were the intervention and the technology delivery consistent across all participants in that study? Fidelity ratings were also summarized across the studies and are reported here to verify that each study had intervention and delivery implementation fidelity (consistency).19
Samples of Technology Devices and Telemedicine Equipment Used Across Studies
Our studies have used varying technology equipment. All devices were easily mailed or taken to the participants' homes for on-going, long-distance telecommunication visits. Across all studies, all participants gave consent to participate per IRB approval. Equipment loan agreement forms and image disclosure consents were obtained prior to scheduling the technology-delivered intervention visits. All long-distance telephone fees and encrypted internet provider fees were covered at no cost to participants. All studies used only our university medical center privacy encryption connections and the IRB-approved internet providers, which are firewall protected. It is important to note that the placebo control groups in these studies also had technology-based audio-visual sessions using the same device, on the same schedule, and for the same length of time as the intervention group. However, the control group visits did not include the interventions being tested. Thus, this group controls for influences of the novel technology visits and for the Hawthorne effect of being observed. Each study had a manual describing the specific intervention and approach to be used with that patient population. The research team observers rated interventions during each technology session.
Observation When Using the Fidelity Rating Checklists
The observer was trained to complete the checklist ratings while viewing the delivery of each intervention. An important disclaimer in the directions to the observer at the beginning of the fidelity checklist stated, “These ratings are used to evaluate the administering of the intervention by the technology used. Checklists are not used to rate the participants’ discussion comments or their reactions to the intervention content.”
The checklist had a total of six rating sections, each labeled with a section title. Each item in the rating scale was given a numeric rating; the item ratings within each section were summed and averaged, resulting in an overall section rating score. Checklists were unique to each study, related to that study's specific technology intervention approach, and were used each time a patient had an audio-visual session delivered.
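A minimal sketch of this scoring follows, with hypothetical rating values; the section names preview the six sections described below, and the within-section averaging matches the procedure just described:

```python
import statistics

# Hypothetical Likert ratings (1-5) grouped by checklist section.
session_ratings = {
    "Interventionist Technical Competence": [5, 4, 4, 5],
    "Follows Intervention and Approach Manual": [5, 5, 4, 4, 5],
    "Assesses Patient Comprehension": [4, 4],
    "Addresses Untoward Discussion": [5, 4, 4],
    "Effectively Communicates": [4, 5, 4, 4, 5],
    "Guides Home Use and Future Scheduling": [5, 5],
}

# Sum and average the items in each section to get a section score.
section_scores = {name: sum(items) / len(items)
                  for name, items in session_ratings.items()}

# Combine section scores into a session-level rating (an average here;
# the studies also totaled section scores for an overall rating).
overall = statistics.mean(section_scores.values())

for name, score in section_scores.items():
    print(f"{name}: {score:.2f}")
print(f"Overall session rating: {overall:.2f}")
```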
Sections on the Intervention Fidelity Observation Checklist
The first section of each checklist, entitled Interventionist Technical Competence, included questions which rated the interventionist's ability to set up the technology equipment correctly and make the necessary connections prior to and during the intervention session. Notably, each study had a rating scale regarding the consistent audio-visual delivery through each specific technology device. For example, in all the videophone/iPad interventions, Technical Competence was assessed by observing that the interventionist checked that the: a) videophone/iPad camera was in focus; b) technology device was sitting the correct distance in front of the participant for visual assessment; c) location was private, not public, and suitable for making a healthcare visit; and d) videophone/iPad audio was clear and at an acceptable volume.
The second section of the checklist, entitled Interventionist Follows Intervention and Approach Manual, included observations which assessed whether the interventionist presented the intervention information logically, as in the standardized manual, and engaged participants in the intervention discussion. Facilitation of discussion, not lecturing, was emphasized. These rating items assessed whether the interventionist: a) facilitated the participants' discussion; b) followed each step of the delivery approach; c) implemented information from the intervention guide; d) engaged the participant in discussing the intervention; and e) elicited a "verbal response" that the participant would use the intervention approach and respond to future emailed reminders. This second section ensured that, across the study, each participating patient received the same information and was engaged in discussions of when and how to use the approach in their own health/illness self-management.
The third section, Interventionist Assesses Patient Comprehension of the Information, rated how the interventionist sought clarification to ensure the information delivered was clear and understood. For example, for the Infection/Depression intervention, the fidelity items rated included "participant describes the principles of infection symptom monitoring" and "participants discuss strategies to avoid low moods."
The fourth section was Interventionist Overall Competence in Addressing Untoward Discussion during the audio-visual technology-based visit. Ratings on items for the interventionist included: “adequately managing interruptions or participants monopolizing the discussion,” “adequately addressing participant questions about the approach,” and “acknowledging participants’ concerns.” These ratings assessed the interventionist’s ability to draw participants back onto the information topic after untoward issues arise.
The fifth section of the checklist, Interventionist Effectively Communicates, used 5 rating items to assess the interventionist's use of effective communication techniques. Ratings included whether the interventionist "uses reflective listening" (e.g., listens carefully, then restates what the participant is saying to clarify) and "uses empathic responses" (e.g., "I follow you" or "I understand"). Other ratings observed whether the interventionist "asks the participant to get comfortable for the next 15 to 60 minutes" and, at the close of the discussion, "routinely asks the participant if they have any questions."
The sixth and final section, Interventionist Guides Intervention Home Use and Future Technology Session Scheduling, assessed the interventionist’s ability to reinforce participants’ use of the intervention information, skills or materials discussed, and to maintain scheduling of the technology-based discussions. The first rating item assessed whether the interventionist “guides participants to select their preferred time of day for use of the intervention in their daily health care.” The second rating assessed whether the interventionist demonstrated “ease of rescheduling future technology-based sessions” (e.g., interventionist asked, “What time is good for you? Let’s see if we can keep to the weekly schedule.”).
These sixth-section items were adapted to the technology being used in each study. Likewise, in each study the total sample (every patient participating in that study) was observed for the fidelity ratings of the intervention and the technology delivery.
Training Sessions with Research Teams for Maintaining Fidelity
To ensure fidelity of an intervention, it is crucial that all team members are trained in standard research procedures before administering the intervention to participants. For example, each team member needs to be able to explain encryption, firewall protection, intervention content, the research procedures, and guidelines for patient participants. It is crucial that all team members involved in technology interventions can demonstrate the knowledge and understanding needed to address technology questions or problems that may arise during an intervention.20 Fidelity training also ensures that procedures for visual projections of slides and/or handouts are coordinated throughout the interventions. Training includes practicing responses to possible technical problems during sessions. Additionally, the terms used to guide participants' use of the technology device should be defined during the original intervention so that varied wording does not confuse participants.21
The interventionists are taught to begin a session by setting "ground rules" for participants about not disclosing personal or health-related information or giving medical advice to others during group discussions. The interventionist must learn communication techniques for guiding the discussion back onto the intervention topic if it strays and for prompting other participants to share if one participant monopolizes the discussion. Duties should be discussed during initial training and reviewed annually to confirm the roles each research team member is to play throughout the intervention implementation. Lastly, annual staff training adds to the rigor of the study by ensuring all team members maintain consistent operating knowledge of the research regulations and IRB/HIPAA compliance.
Pilot Study Testing Using Fidelity Checklists
The first pilot study included 10 individual patient participants who were invited to test analog videophones mailed to their homes in 2010.22 In this study, nurse fidelity checklists also included observations of participants conducting their daily IV infusion care. Home care and self-management of IV infusions can be challenging for patients. Few fidelity studies have been conducted to determine the level of detail that interventionists can visually assess using small, plug-in, analog/cell videophone or iPad connections. Thus, our pilot studies tested the fidelity of nurses being able to observe participants and clearly see the details of patient participants conducting their home IV infusions via technology.
Equipment chosen for this study included a one-piece flip-top analog videophone weighing 1.5 pounds with a 4-inch color LCD TFT active matrix screen, a high-resolution (325K pixels) color CCD camera, and an embedded internal speaker. This device allowed two-way visuals so that the interventionist and participant could see each other simultaneously. The internal 33.6 kilobits per second (kbps) modem transmitted the telehealth audio and video signals via a Public Switched Telephone Network at a rate of approximately 18 frames per second. This rate allowed visual exchanges which looked similar to television viewing.
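A back-of-the-envelope calculation (ours, not from the equipment specifications) shows how small the per-frame data budget was, and why such transmissions depended on heavy compression:

```python
# 33.6 kbps split across ~18 video frames per second, ignoring the share
# of the line consumed by the audio channel.
modem_bps = 33_600          # 33.6 kilobits per second
frames_per_second = 18

bits_per_frame = modem_bps / frames_per_second
print(f"~{bits_per_frame:.0f} bits (~{bits_per_frame / 8:.0f} bytes) per frame")
# ~1867 bits (~233 bytes) per frame: far less than an uncompressed image
# from a 325K-pixel camera, hence the compressed, television-like picture.
```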
The purpose of this pilot study was to determine the utility of these compact, analog videophones to observe details of IV infusion care procedures conducted by patients at home and to evaluate patient participant and interventionist satisfaction with their audio-visual behavioral intervention from a distance.
The videophone was placed 8 to 12 inches away from the participants while they conducted their daily IV infusions. The nurse interventionist noted any procedural concerns observed and then made suggestions for improving procedures and maintaining aseptic technique. The fidelity ratings found that the interventionist consistently guided the participants' procedures. Notably, these ratings also identified that nurses could clearly observe participants performing their IV cleansing procedures. For example, the interventionist was able to clearly evaluate participants cleaning the skin around the IV infusion site, and 100% of participants were clearly observed covering their IV site without their fingers touching it, thus maintaining sterile bandaging. Furthermore, for 90% of participants the interventionist was able to clearly observe cleansing of the catheter tubing hub connection with antiseptic solution.
However, the ratings also found inconsistency in the nurses' ability to observe any appearance of infection around the participant's IV site. Even after instructing participants to move their technology device camera as close as possible to their IV site and directing the lighting, there was not enough visual clarity in all cases to determine the presence or absence of infection. Thus, the nurse needed to ask each participant about any redness, inflammation, or swelling at the IV site and whether there were any symptoms of discomfort or fever. From this fidelity check we learned that when tele-video technology is used to assess or support patients, the interventionists should be prepared to engage the patient/participant in describing their experiences and/or symptoms as well as conducting a visual assessment. The outcomes of this fidelity testing indicated that the technology allowed adequate assessment of some, but not all, of the details of patient IV home care (see Table 3). Importantly for clinical intervention fidelity, more recent technology has improved cameras that can zoom in close-up for better inspection.
Table 3.
Nurse interventionists' ability to clearly observe participants' IV sites and families' IV infusion care while using audio-visual technology.
| Items Evaluated by Nurse Interventionists | Clearly Observed: Number | Clearly Observed: % |
|---|---|---|
| Has dedicated IV clean area in the home (n=10) | 9 | 90 |
| Cleanses IV insertion hub with antiseptic solution (n=10) | 9 | 90 |
| IV location (n=8) | 8 | 100 |
| IV type (n=8) | 8 | 100 |
| Appearance of IV site (erythema, drainage, tenderness) (n=8) | 5 | 62.5 |
| Cleans skin with swabstick from exit site outward in a firm circular motion to a 3” diameter (n=8) | 6 | 75 |
| Covers exit site without touching catheter/skin (n=8) | 8 | 100 |
Note: Not all 10 participants completed all steps rated.
Clinical Trials Fidelity Testing
Our first clinical trial using telemedicine equipment included 30 individual patient participants. The telehealth units used in this study were connected through home residential telephone lines. These telehealth units had small, built-in, 2-way cameras that allowed the interventionist to see the patient at home and the patient to see the interventionist. The unit weighed 2.75 pounds and was easily installed by the interventionist when delivered to patients' homes. The in-home modem transmitted the tele-monitor audio and video signals via a single telephone line at 15 frames per second. This speed creates a 2 to 3 second delay between speech and reception, which was described to participants as minimal. This equipment was selected for its technical reliability, portability, and low cost per unit.23
This clinical trial tested participants' adherence to in-home breathing enhancement treatments.24 The adherence outcome came from each participant's breathing-assist ventilator timer-recorder measurement. After the telehealth interventions, a higher percentage of intervention participants than placebo control participants were adhering to the time prescribed for using their breathing machine. The participants and interventionists completed a technology fidelity survey about their opinions of the telehealth transmissions between each participant and the nurse interventionist.25
Another clinical trial, this one using iPad technology, included 126 participants who attended group audio-visual sessions conducted from 2013 through 2016.12 A team of 3 clinical expert professionals participated in each intervention session from their offices, with multiple participants (2 to 6 per session) joining from home.14 The iPad technology used in this trial was the 16GB iPad mini with Retina Display camera. Each iPad had a data plan allowing access to our encrypted, firewall-protected university medical center computer server. Using this teleconferencing iPad technology, multiple participants and professionals could see one another in separate windows on the iPad screen. From this study, a cost analysis of implementing these iPad sessions was completed.26
An additional clinical trial delivering another nursing intervention began in 2016 and is near completion.27 This trial also uses iPad technology delivery. As in the pilot studies, each clinical trial intervention had standardized information materials consistently provided to the participants.
Fidelity Checklists Data Analyses Summary
A team member summed the Fidelity Rating Scale scores for each section in each of these studies and totaled all section scores to obtain an overall rating. Specifically, the scores from each intervention session were combined into an overall total score, which was later used in the statistical regression analyses to determine how much of the intervention each patient was exposed to. Thus, fidelity rating scores were used to control for Type III error (when a lack of outcome effect occurs because an insufficient amount of the intervention was conveyed).6,28 Intervention fidelity scores were also used for calculating the amount of time taken with each participant, to control for Type II error.29 Researchers used the ratings data from the Fidelity Checklist in each study to discuss consistent delivery of information and best practices with technology delivery throughout each study.
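The sketch below illustrates this step with invented data and hypothetical variable names: per-participant fidelity totals (the intervention "dose" received) are summarized and would then enter the outcome regression as covariates. It is not the studies' actual analysis code.

```python
import statistics

# Hypothetical per-participant data: total fidelity score across sessions
# ("dose" of the intervention delivered) and minutes of contact time.
fidelity_totals = [28.5, 29.0, 27.0, 28.0]
contact_minutes = [180, 165, 150, 170]

# Low variability in fidelity totals supports the claim that participants
# received comparable amounts of the intervention (guarding against Type III
# error); contact time is summarized for the Type II error check.
print(f"Fidelity totals: mean {statistics.mean(fidelity_totals):.1f}, "
      f"SD {statistics.stdev(fidelity_totals):.2f}")
print(f"Contact minutes: mean {statistics.mean(contact_minutes):.0f}")

# In the full analysis, these totals would be entered as regression
# covariates, e.g. with statsmodels (assumed, not cited in the article):
#   smf.ols("outcome ~ fidelity_total + contact_minutes", data=df).fit()
```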
Results
Across all our technology-delivered intervention studies, the overall scores for Interventionists' Technical Competence Ratings (e.g., videophone/iPad placement, adequate lighting and volume, session scheduling choice, troubleshooting) ranged from 4 to 4.5 on a 5-point scale. In the iPad studies, technology fidelity was further assessed by having participants complete an anonymous survey about their use of the technology. A majority completed the survey, and the overall scores from participants were positive regarding their technology use.12
The rating scores in the other sections, on following the Intervention Manual, checking participant intervention comprehension, communicating effectively, and scheduling, were also between 4 and 5 across these studies. These scores indicate the planned interventions were consistently conducted as described in the research manuals. Further, the interventionists consistently delivered the interventions, assessed participants' comprehension, effectively facilitated discussions, communicated well, and guided home intervention use.
The fidelity ratings data from each study reveal that the interventions were administered in a consistent manner throughout all studies. Additionally, the fidelity observation ratings were consistent across all the studies reviewed, regardless of the technology used. Specifically, the data supported the conclusions that: a) there was proper use of the videophones/iPads; b) interventionists demonstrated competence in addressing untoward events such as technical disconnections during the visit; c) interventionists presented consistent information by following the standardized information scripts and used discussion strategies to engage families; d) assessment of the participants' comprehension of the information was validated; and e) reinforcement of participants' ease and use of the information content each day in their home management was affirmed. Furthermore, our studies systematically evaluated the use of the videophones to visually observe patients preparing and completing their daily complex home IV infusions.
Discussion
These observation data across studies confirmed that the training of interventionists for consistent health care information interventions and consistent technology delivery was successful. One essential aspect of maintaining fidelity is writing a script that guides the order of the intervention content and indicates where to insert the graphics/handouts delivered. The script is not a word-for-word document but rather a topical one, with bolded headings as reminders of the topics to be delivered and the questions to pose during delivery to engage the patients/participants in discussion and encourage their sharing.30 A written script is not to be read nor presented like a lecture, but rather to be used as a guide for discussing specific content that the health professionals want patients/participants to understand and use in their daily home care.31 The health professional delivering the content should always practice the script aloud, both for clarity and for ease of following it during delivery. Prior to delivering an intervention to patients/participants, the script flow can be rearranged so that content builds from the most health-protecting topics (e.g., infection prevention). Adjustments can then be made for timing, with possible rearranging of topics to ensure a good flow.
Techniques to Ensure Fidelity of Interventionist’s Intervention and Technical Delivery of Healthcare
Our research team discovered it is important to physically set up the technology the interventionist uses so the script can be easily followed while maintaining eye contact with patients/participants during the sessions. Good eye contact aids interventionists in developing rapport and engaging patients/participants. Our psychologist interventionist used dual computer screens during each audio-visual session, with the session script displayed on the desktop computer screen and a second, larger screen showing participants in individual thumbnails (small frames simultaneously displayed on the monitor). A dual-lens live-video camera hanging from the ceiling captured the interventionist and placed the interventionist in a thumbnail with all other participants on each iPad screen. The system then projected the script and the live-video thumbnails from each participant onto the wall-mounted 55-inch TV screen, allowing the displayed session script to be slightly enlarged and aiding simultaneous visualization of the script and each participant (see photograph).
Photograph.

The interventionist sat at a desk approximately 12 feet away from the wall-mounted screen, with the high-definition camera placed in between the interventionist and the screen. The high-definition camera was mounted to the ceiling with an extension pole, minimizing view obstructions and allowing the interventionist to easily read and scroll through the script while maintaining eye contact and visual engagement with participants (see photograph). The two white, sheet-like cloths hanging in the photograph were arranged for lighting so that clarity was achieved in the video capture. All participants on each iPad were able to see all other participants and the interventionist during each session.
A multidisciplinary team approach was developed for delivering the intervention while engaging participants in discussions and monitoring participant response. In one study, there were 3 professionals (a nurse, physician, and psychologist) involved in leading the intervention discussion.18 In another study, our psychologist led the delivery of the scripted intervention while another interventionist, who had initiated patient participants' enrollment into the study, monitored participants for discussion engagement. This team member clarified participant responses to address any concerns, maintained the pace of the session within the time allotted, and asked supportive questions based on her observations of participant reactions (e.g., discomfort, emotional reaction, fatigue, and possible illness). This approach was essential for maintaining engagement and participant "uptake" of the information being conveyed in the intervention.
Lastly, our researchers acknowledge that our high fidelity ratings came from team preparation. Technical specialists, though most often unseen by participants, were essential in maintaining fidelity in technology delivery. These technical staff experts worked to ensure clarity of the PowerPoint slides and resources displayed across the technology delivery. The intervention scripts promoted timely delivery of visual aids by technical staff throughout intervention sessions.
While analyzing the ratings data during each study, the researchers found that each interventionist facilitated engagement in discussion amongst participants in a number of ways. One facilitation method important for discussion engagement was pausing to give participants time to contribute. Interventionists used feedback from the participants to adjust future group information sessions to meet the needs of the group. Importantly, the interventionists developed their ability to encourage the participants' home health care activities. This technology delivery was rated highly by participants as beneficial for reaching out and following up about their care without miles of travel and waiting room time.
Implications for Future Research
Future research should be carried out using Intervention Fidelity Checklists to determine whether technology-delivered study interventions are followed consistently and to ensure that the delivery also meets fidelity.32–33 In addition, recent articles emphasize maintaining fidelity to ensure transparency of research procedures so that they can be used in other studies.34 Moreover, there are recommendations for systematically testing fidelity when assessing interventions which are tailored or individualized for patients.1,4 Further studies with the ever-improving technologies used in telehealth/mHealth are needed to establish the sustainability of using these devices to consistently deliver interventions. Using our specific checklists to observe and then rate the fidelity of intervention information and delivery via technology ensures consistency, an essential component of rigorous research. The authors will share these checklists upon request.
Acknowledgements
We are grateful to our University Medical Center for Telemedicine and to Dedrick Hooper and Jeremy Ko for their technical expertise in establishing each technology session. The authors extend their appreciation to all those who participated in these studies for their time, use of technology, and shared opinions and evaluations of technology-delivered healthcare information. Stark Wright is acknowledged for his photographic skills.
Conflicts of Interest and Source of Funding
The authors do not have personal financial interests related to the subject matter discussed in the manuscript. The project was partially supported by National Institutes of Health, 1R01NR015743-01A-1. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute of Nursing Research.
Contributor Information
Jaime Rachelle M Bonar, School of Nursing, University of Kansas Medical Center.
Shawna Wright, Dept. of Psychiatry & Behavioral Sciences, University of Kansas School of Medicine.
Donna Macan Yadrich, School of Nursing, University of Kansas Medical Center.
Marilyn Werkowitch, School of Nursing, University of Kansas Medical Center.
Lavonne Ridder, School of Nursing, University of Kansas Medical Center.
Ryan Spaulding, Department of Biostatistics & Data Science, University of Kansas School of Medicine.
Carol E. Smith, School of Nursing and, Preventive Medicine & Public Health, University of Kansas Medical Center.
References
- 1. Perez D, Van der Stuyft P, Zabala MC, et al. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions. Implementation Science. 2016;11(91). doi: 10.1186/s13012-016-0457-8.
- 2. Roen K, Arai L, Roberts H, Popay J. Extending systematic reviews to include evidence on implementation: Methodological work on a review of community-based initiatives to prevent injuries. Soc Sci Med. 2006;63(4):1060–1071.
- 3. Bosak KA, Pozehl B, Yates B, et al. Challenges of applying a comprehensive model of intervention fidelity. West J Nurs Res. 2012;34(4):504–519.
- 4. Calthorpe RJ, Smith S, Gathercole K, et al. Using digital technology for home monitoring, adherence and self-management in cystic fibrosis: A state-of-the-art review. Thorax. 2019;0:1–6. doi: 10.1136/thoraxjnl-2019-213233.
- 5. Murphy SL, Gutman SA. Intervention fidelity: A necessary aspect of intervention effectiveness studies. American Journal of Occupational Therapy. 2012;66:387–388. doi: 10.5014/ajot.2010.005405.
- 6. Bellg A, Resnick B, Minicucci DS, et al. Enhancing treatment fidelity in health behavior change studies: Best practices and recommendations from the NIH Behavior Change Consortium. Health Psychology. 2004;23(5):443–451. doi: 10.1037/0278-6133.23.5.443.
- 7. Santacroce SJ, Maccarelli LM, Grey M. Intervention fidelity. Nurs Res. 2004;53(1):63–66.
- 8. Ross VM, Smith CE. Longitudinal trends in quality of life after starting home parenteral nutrition: A randomized controlled study of telemedicine. Clin Nutr. 2008;27(2):314.
- 9. Piamjariyakul U, Smith CE. Telemedicine utilization reports and evaluation. In: Montoney L, Gomez C, eds. Telemedicine in the 21st Century. Hauppauge, NY: NOVA Science Publishers; 2008.
- 10. Lee CYS, August GJ, Realmuto GM, Horowitz JL, Bloomquist ML, Klimes-Dougan B. Fidelity at a distance: Assessing implementation fidelity of the Early Risers Prevention Program in a going-to-scale intervention trial. Prev Sci. 2008;9(3):215–229.
- 11. Munro CL, Savel RH. Rigor and reproducibility in critical care research. Am J Crit Care. 2017;26(4):265–267. doi: 10.4037/ajcc2017306.
- 12. Smith CE, Werkowitch M, Yadrich DM, Thompson N, Nelson EL. Identification of depressive signs in patients and their family members during iPad-based audiovisual sessions. Comput Inform Nurs. 2017;35(7):352–357.
- 13. Smith CE, Leenerts MH, Gajewski BJ. A systematically tested intervention for managing reactive depression. Nurs Res. 2003;52(6):401–409.
- 14. Smith CE, Spaulding R, Piamjariyakul U, et al. mHealth clinic appointment PC tablet: Implementation, challenges and solutions. J Mob Technol Med. 2015;4(2):21.
- 15. McGrew JH, Griss ME. Concurrent and predictive validity of two scales to assess the fidelity of implementation of supported employment. Psychiatr Rehabil J. 2005;29(1):41.
- 16. Smith CE, Cha JJ, Kleinbeck SV, Clements FA, Cook D, Koehler J. Feasibility of in-home telehealth for conducting nursing research. Clin Nurs Res. 2002;11(2):220–233.
- 17. Nuro K, Maccarelli L, Baker SM, Martino S, Rounsaville BJ, Carroll KM. Yale adherence and competence scale (YACSII) guidelines. West Haven, CT: Yale University Psychotherapy Development Center; 2005:161.
- 18. Smith CE, Piamjariyakul U, Werkowitch M, et al. A clinical trial of translation of evidence based interventions to mobile tablets and illness specific internet sites. International Journal of Sensor Networks and Data Communications. 2016;5(1–7).
- 19. Carroll C, Patterson M, Wood S, et al. A conceptual framework for implementation fidelity. Implementation Science. 2007;2(40).
- 20. Yadrich DM, Fitzgerald SA, Werkowitch M, Smith CE. Creating patient and family education websites: Assuring accessibility and usability standards. Comput Inform Nurs. 2012;30(1):46–54.
- 21. Fitzgerald SA, Yadrich D, Werkowitch M, Piamjariyakul U, Smith CE. Creating patient and family education web sites: Design and content of the home parenteral nutrition family caregivers web site. Comput Inform Nurs. 2011;30(1):46–54. doi: 10.1097/NCN.0b013e3182343eac.
- 22. Spaulding R, Smith CE, Piamjariyakul U, Fitzgerald S, Yadrich D, Prinyarux C. Configuring mHealth devices for secure patient home care and research. Featured interview presented at: International mHealth Scientific Conference; 2013; Washington, D.C.
- 23. Smith E, Cha J, Puno F, MaGee J, Bingham J, Van Gorp M. Quality assurance processes for designing patient education web sites. Comput Inform Nurs. 2002;20(5):503–512.
- 24. Smith CE, Dauz E, Clements F, Werkowitch M, Whitman R. Patient education combined in a music and habit-forming intervention for adherence to continuous positive airway pressure (CPAP) prescribed for sleep apnea. Patient Educ Couns. 2009;74(2):184–190.
- 25. Smith CE, Dauz ER, Clements F, et al. Telehealth services to improve nonadherence: A placebo-controlled study. Telemed J E Health. 2006;12(3):289–296.
- 26. Kim H, Spaulding R, Werkowitch M, et al. Costs of multidisciplinary parenteral nutrition care provided at a distance via mobile tablets. JPEN J Parenter Enteral Nutr. 2014;38(2 Suppl):50S–57S.
- 27. Nelson EL, Yadrich DM, Thompson N, et al. Telemedicine support groups for home parenteral nutrition users. Nutr Clin Pract. 2017;32(6):789–798.
- 28. Sidani S. Measuring the intervention in effectiveness research. West J Nurs Res. 1998;20(5):621–635.
- 29. Lipsey MW. Design sensitivity: Statistical power for experimental research. Vol. 19. Newbury Park, CA: Sage; 1990.
- 30. Carroll KM, Nich C, Sifry RL, et al. A general system for evaluating therapist adherence and competence in psychotherapy research in the addictions. Drug Alcohol Depend. 2000;57(3):225–238.
- 31. Dauz E, Moore J, Smith CE, Puno F, Schagg H. Installing computers in older adults' homes for patient education website: A systematic approach. Comput Inform Nurs. 2004;22(5):1–7.
- 32. O'Brien RA. Translating a research intervention into community practice: The nurse family partnership. J Prim Prev. 2005;26(3):241–257.
- 33. Dumas JE, Lynch AM, Laughlin JE, Smith EP, Prinz RJ. Promoting intervention fidelity: Conceptual issues, methods, and preliminary results from the Early Alliance prevention trial. Am J Prev Med. 2001;20(1):38–47.
- 34. French CT, Diekemper RL, Irwin RS, et al. Assessment of intervention fidelity and recommendations for researchers conducting studies on the diagnosis and treatment of chronic cough in the adult: CHEST guideline and expert panel report. CHEST. 2015;148(1):32–54. doi: 10.1378/chest.15-0164.
