Abstract
Wearable devices, like smartwatches, are increasingly used for tracking physical activity, community mobility, and symptoms. Data generated from smartwatches (PGHD_SW) are a form of patient-generated health data, which can benefit providers by supplying frequent temporal information about patients. The goal of this study was to understand providers’ perceptions towards PGHD_SW adoption and its integration with electronic medical records. In-depth, semi-structured qualitative interviews were conducted with 12 providers from internal medicine, family medicine, geriatric medicine, nursing, surgery, rehabilitation, and anesthesiology. Diffusion of Innovations was used as a framework to develop questions and guide data analysis. The constant comparative method was used to formulate salient themes from the interviews. Four main themes emerged: (1) PGHD_SW is perceived as a relative advantage; (2) data are viewed as compatible with current practices; (3) barriers to overcome to effectively use PGHD_SW; (4) assessments from viewing sample data. Overall, PGHD_SW was valued because it enabled access to information about patients that was traditionally unattainable. It can also initiate discussions between patients and providers. Providers considered PGHD_SW important, but data preferences varied by specialty. The successful adoption of PGHD_SW will depend on tailoring data, report frequencies, and visualization preferences to the demands of providers.
Subject terms: Health policy, Patient education
Introduction
Consultations in the clinic between patients and providers are the main opportunity to exchange information, discuss treatment options, and develop a trusting relationship1. However, patients with chronic conditions feel pressured by time limitations, which prevent them from sharing important details related to their care2. As a consequence, clinicians have a poor understanding of how disease and treatment affect patients in their day-to-day lives3. Furthermore, optimal evidence-based medicine is challenging because providers are usually unaware of whether patients are implementing management plans in their daily lives4.
Technological advances have the ability to improve healthcare delivery5. mHealth technologies, like wearable devices, are increasingly used to track community mobility, physical activity, and symptoms (e.g., pain)6,7. The data generated, otherwise known as patient-generated health data, have received significant attention because of their potential to foster better communication, improve care coordination, and strengthen patient engagement8–11. Patient-generated health data can be produced from wearable devices, such as smartwatches (PGHD_SW). Sixteen percent of U.S. adults own a smartwatch12, and more than 56 million U.S. adults were projected to use a wearable device at least once per month in 201913.
PGHD_SW differs from other types of data, like patient-reported outcomes (PROs), because PRO data are typically collected through standardized surveys designed by healthcare professionals to understand patients’ experiences of health. Patient-Reported Outcomes Measurement Information System (PROMIS) measures, for example, are obtained by surveying patients about topics such as fatigue, pain, anger, and satisfaction using traditional questionnaire formats. PGHD_SW often includes PROs but is patient-directed and collected through commercial tools, such as mobile phone applications and activity trackers14. Patient-generated health data, including data from smartwatches, contribute to patient empowerment15, help patients make sense of their disease, enhance trust with providers16, and enable autonomy10.
PGHD_SW supplies frequent temporal information about patients. Providers typically rely on patients’ recall to track and assess management routines for patients with chronic disease17. PGHD_SW contributes to providers’ decision-making18 and fosters a deeper and more accurate understanding of a patient’s illness between clinic visits19. However, adoption of PGHD_SW has been hindered by provider perceptions that it contributes to information overload, concerns about liability from lack of timely review20,21, and fears around patient privacy10. It also remains unclear which digital biomarkers are valuable for clinicians, and how they should be visualized22.
Providers have expressed favorable attitudes towards behavioral tracking technologies23, but preliminary studies have been limited to general sentiment towards PGHD_SW. As smartwatch technology continues to evolve, with the addition of new data points (e.g., heart rate monitoring, electrocardiogram tracking, and blood pressure) and more precise measures of existing health indicators, it is necessary to understand providers’ perceptions and preferences towards the technology. Therefore, the goal of this study was to perform qualitative interviews with various providers to identify the value of specific data points (e.g., physical activity, fatigue), discover effective techniques to present data (e.g., graphical charts or tables), and learn how to practically incorporate PGHD_SW into practice (e.g., frequency of reports).
Diffusion of Innovations (DoI)24 was used as a framework, since the theory helps to explain the factors that facilitate the adoption of new technology in health care25, including wearable devices26,27. Perceptions of an innovation’s attributes are a main indicator of whether the innovation will be accepted. The five main attributes are: (1) relative advantage; (2) compatibility; (3) complexity; (4) trialability; (5) observability24. Using four of these attributes (observability was not yet pertinent since we were only inquiring about perceptions), we conducted in-depth qualitative interviews to answer the following research questions:
What are the benefits and barriers to incorporating PGHD_SW into practice?
How should PGHD_SW be visually presented and what are the most clinically valuable data points?
What are PGHD_SW’s implications on patient-provider communication?
Results
Demographic characteristics
Twelve of the 20 providers approached (60% enrollment rate) gave written consent to participate between October 2018 and August 2019. Interviews averaged 27 min in length. Most participants were male (n = 7, 58%) and the average age was 45 years (SD = 9.8). One provider was in his last year of residency; the remaining providers averaged 12 years (SD = 9.4) of practice experience after residency. Geriatric medicine and orthopedic surgery were the most represented specialties (n = 4 each), and the most common practice setting was exclusively out-patient (n = 5; 42%). A full demographic summary is in Table 1.
Table 1. Demographic summary of participating providers (N = 12).
Variable | n (%) or mean |
---|---|
Sex | |
Female | 5 (42%) |
Male | 7 (58%) |
Average age, years | 45.1 |
Average years in practice | 12.4 |
Race | |
White | 8 (67%) |
Indian | 2 (17%) |
Latino | 1 (8%) |
Asian | 1 (8%) |
Specialty | |
Geriatric | 4 (33%) |
Orthopedic Surgery | 4 (33%) |
Anesthesiology | 2 (17%) |
Nursing | 1 (8%) |
Physical medicine and rehabilitation | 1 (8%) |
Patient setting | |
Out-patient | 5 (42%) |
In-patient | 3 (25%) |
Both | 4 (33%) |
Current smartwatch ownership | |
Yes | 5 (42%) |
No | 7 (58%) |
Themes
Four main themes and associated sub-themes emerged: (1) PGHD_SW is perceived as a relative advantage; (2) data are viewed as compatible with current practices; (3) barriers to overcome to effectively use PGHD_SW; (4) assessments from viewing sample data.
Theme 1: PGHD_SW is perceived as a relative advantage
Overall, providers were optimistic about PGHD_SW and viewed it as equivalent to other types of medical data. Five (42%) providers were familiar with similar data, such as PROMIS measures, and three (25%) had experience using such data. Providers acknowledged the potential advantages of PGHD_SW, such as revealing insights about patients that they would typically not be able to access. For instance, referring to measures like pain and physical activity, a surgeon said, “We don’t have a way of getting this stuff when patients are at home…It’s the biggest void in healthcare right now”. Another surgeon who had experience with PROMIS viewed PGHD_SW as extremely valuable and stated:
“We have historically put more stock into the stuff we generate, like blood tests, physical exam findings. But there is a definite value in paying attention to what the patient is generating…I pay a lot more attention to that stuff than I do to their vital signs, honestly.”
All but one provider echoed this sentiment and considered PGHD_SW commensurate with traditional types of medical data. The lone dissenter, an anesthesiologist, did not see as much value in PGHD_SW because he typically cares for patients in intensive care or emergency situations.
Theme 1a: Stimulates in-person communication
Aside from emergency situations, the same anesthesiologist acknowledged that PGHD_SW would help to keep him abreast of a patient’s status. He provided a hypothetical scenario: “I would start the conversation with ‘I see very high pain scores. I see you’re not moving a lot. I see we got an alert here. Did you fall?’ I think that’s a good starting point rather than ‘how are you?’”. Other providers appreciated the ability to use the data to enhance the consultation and provoke more productive discussions. A geriatrician preferred to review the data before the consultation, and then implement a new part of their care plan based on the data. Surgeons viewed PGHD_SW as a means to re-connect with patients after discharge. Referring to a recently discharged patient who might be struggling, an orthopedic surgeon envisioned:
“We [would] say, ‘please come in we need to talk. What’s going on? Why are you struggling?’ We show them that data. ‘You’re only walking 20 feet. The average for all our patients has been this much.’”
Theme 1b: Activation
In addition to using PGHD_SW as a platform to communicate with patients, some providers appreciated the ability to show patients data points. A nurse said, “[The data] can be powerful if patients can own it and take control”. Similarly, a surgeon suggested that the data can be used to form partnerships, enabling honest dialogue. The surgeon imagined a situation in the in-patient setting where data could be used to corroborate feedback from both nurses and the patient:
“[I could] ask the nurse, ‘how much did the patient get up today? The patient got up twice today.’ Then, you look at [the data] and they actually moved for three minutes. I’d be like, ‘hey look, you haven’t been moving today. Look at the graph.’”
Theme 2: Data are viewed as compatible with current practices
Providers from various specialties had different viewpoints about which data to use and how to use them. Few were concerned about possible ramifications of increased workload; instead, they believed that PGHD_SW may create efficiencies. Two-thirds (67%; n = 8) of providers did not express fears of data overload. This lack of concern was predicated on the assumption that providers would have control over the frequency of reports. A geriatrician clarified that a monthly report may be too frequent, but a quarterly report would be helpful. An orthopedic surgeon was excited by the data and considered additional data “Not a nuisance, that’s actually more convenience”.
All 12 (100%) participants believed that data reports should be integrated with the electronic medical record (EMR). Not only is the EMR a secure environment, but it is the most convenient method of viewing information since everything about the patient is consolidated in one place. A rehabilitation physician suggested that reports come as a secure message, which would initially be reviewed by a nurse. Since providers regularly interact with the EMR, they can instantly be made aware of incidents that need their attention, like a fall. Similarly, a surgeon stated, “I would most want push notification if you could set parameters on worrisome trends”.
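As a rough illustration of the provider-configured “parameters on worrisome trends” described above, the sketch below shows how threshold rules applied to a daily PGHD_SW summary could decide when a push notification or secure message is warranted. It is a minimal, hypothetical Python example: the metric names, thresholds, and the check_alerts helper are illustrative assumptions, not part of any existing EMR interface or of the study itself.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AlertRule:
    """A hypothetical provider-configured threshold on one PGHD_SW metric."""
    metric: str        # e.g., "pain_score" or "daily_steps" (assumed names)
    threshold: float
    direction: str     # "above" or "below"

def check_alerts(daily_summary: dict, rules: List[AlertRule]) -> List[str]:
    """Return a message for every metric that crosses its provider-set threshold."""
    alerts = []
    for rule in rules:
        value: Optional[float] = daily_summary.get(rule.metric)
        if value is None:
            continue  # metric not reported by the smartwatch that day
        crossed = (rule.direction == "above" and value > rule.threshold) or \
                  (rule.direction == "below" and value < rule.threshold)
        if crossed:
            alerts.append(f"{rule.metric} = {value} crossed the {rule.threshold} threshold")
    return alerts

# Hypothetical post-operative patient: high pain and low mobility would both alert.
rules = [AlertRule("pain_score", 7, "above"), AlertRule("daily_steps", 500, "below")]
print(check_alerts({"pain_score": 8, "daily_steps": 320}, rules))
```

In practice, rules of this kind would sit inside the EMR so that only summaries crossing a provider-set threshold generate a secure message or push notification, consistent with the reviewing workflow the rehabilitation physician described.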
Theme 2a: Data preferences differ by specialty
Each medical specialty required different types of data and had different uses for them. For example, PGHD_SW reports and their frequency would differ greatly between geriatricians caring for older individuals in the out-patient setting and surgeons or anesthesiologists caring for in-patients recovering from an operation. A surgeon conjectured about a scenario that might occur after conducting knee surgery. He said, “We typically bring in all our knees at two weeks, but if we can get daily reports for the first ten days, we can say ‘you don’t need to come in, you’re fine’”. An anesthesiologist preferred PGHD_SW delivered six times a day, whereas a nurse overseeing chronic disease patients said that monthly reports are beneficial to provide a “good picture” when patients come back for an appointment in 6 months. Specific data points also varied by specialty, which is discussed in theme four.
Theme 2b: Trust
Nine (75%) providers said they would trust PGHD_SW as much as information obtained directly from questioning patients. A surgeon summarized this sentiment by saying, “If the patient is reporting a high amount of pain, I have to trust what the patient says”. However, three providers (25%) were skeptical about the validity of PGHD_SW. The rehabilitation physician and nurse both thought that data such as physical activity could be easily manipulated and that patients might skew the data by entering information that makes them “look good”. Likewise, a geriatrician was doubtful about patients entering information about medication adherence. She said, “I think people are not often truthful about it, so I don’t know if a daily report would be helpful”.
Theme 3: Barriers to overcome to effectively use PGHD_SW
Fewer than half (n = 5; 42%) of providers owned a smartwatch, but all were generally familiar with their functionality. Providers without a smartwatch, or those who had never used one, expressed some trepidation about incorporating the devices into their practice. This reaction was encapsulated by an orthopedic surgeon who said, “Would I be comfortable [educating patients]? Yes. Is it a good use of my time? Probably not”. Only one provider, an anesthesiologist, was opposed to devoting any amount of time to discussing smartwatch functionality with patients, saying, “You run the risk of physicians becoming IT support”.
Theme 4: Assessments from viewing sample data
Providers reviewed a list of data points (e.g., falls, mood, cognition tests, and physical activity) and selected the three attributes that they considered most and least valuable. Pain appeared most frequently among the top three valuable attributes (n = 9; 25% of all top-three selections). Falls was second (n = 8, 22%), and mobility (n = 6, 17%) and physical activity (n = 6, 17%) were tied for third. However, providers considered mobility and physical activity (exercise) to be similar; if combined, they would become the attribute most frequently appearing in the top three (n = 12, 33%). When asked to select the overall most important attribute, falls (n = 3; 25%), pain (n = 3; 25%), and mobility (n = 3; 25%) were tied for first.
Providers had difficulty choosing the least valuable data points, with some declining to select any. Among those who did, driving appeared most frequently as a bottom-three selection (n = 5; 23% of selections), followed by fatigue (n = 4, 18%). Both attributes, along with mood, were also ranked as the overall least important attributes, each selected twice (17%). Several providers were intrigued by a cognitive test but did not consider a smartwatch an ideal device to administer such a test and, therefore, would not trust the results. A summary of the rankings is in Table 2.
Table 2. Frequency of data points ranked as most and least valuable, by provider specialty.
Data point | Geriatricians (freq) | Surgeons (freq) | Anesthesiologists (freq) | Phys. Med and rehab (freq) | Nurses (freq) | Total freq (%) |
---|---|---|---|---|---|---|
Top 3 valuable ranking | ||||||
Pain | 2 | 4 | 2 | – | 1 | 9 (25%) |
Falls | 4 | 2 | 1 | 1 | – | 8 (22%) |
Mobility | 1 | 3 | 1 | 1 | – | 6 (17%) |
Physical activity (exercise) | 1 | 2 | 2 | 1 | – | 6 (17%) |
Hydration | 2 | 1 | – | – | 1 | 4 (11%) |
Medication adherence | 1 | – | – | – | 1 | 2 (6%) |
Fatigue | 1 | – | – | – | – | 1 (3%) |
Bottom 3 least valuable ranking | ||||||
Driving | – | 3 | 1 | 1 | – | 5 (23%) |
Fatigue | 1 | 1 | 1 | 1 | – | 4 (18%) |
Cognition testing | 1 | 1 | 1 | – | – | 3 (14%) |
Medication adherence | 1 | 2 | – | – | – | 3 (14%) |
Hydration | – | 2 | – | 1 | – | 3 (14%) |
Mood | 1 | 1 | – | – | – | 2 (9%) |
Mobility | – | – | – | – | 1 | 1 (5%) |
Physical Activity (exercise) | – | – | – | – | 1 | 1 (5%) |
Note: Not all providers selected three attributes as most/least valuable.
Providers were also shown data visualization samples, including line charts, pie charts, bar graphs, and radial gauges. Line graphs were the most popular (n = 6, 50%) because they easily conveyed longitudinal data that could be tracked over time. Bright color schemes were also favored, because it was important that data could be quickly interpreted. After viewing red, yellow, and green line charts, a geriatrician said the color-coding “helps to get an instant idea of how things have changed”.
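To make these presentation preferences concrete, the following is a minimal sketch of a longitudinal line chart with red, yellow, and green bands, written in Python with matplotlib as an assumed tooling choice. The daily step counts and target ranges are invented for illustration only and are not data from the study.

```python
import matplotlib.pyplot as plt

# Hypothetical daily step counts over two post-discharge weeks (illustrative values only).
days = list(range(1, 15))
steps = [1200, 1500, 1800, 900, 2200, 2500, 2100,
         3000, 2800, 3500, 3200, 4000, 3800, 4200]

fig, ax = plt.subplots(figsize=(7, 3))
ax.plot(days, steps, marker="o", color="black", label="Daily steps")

# Red/yellow/green bands approximating the color scheme providers favored
# for at-a-glance interpretation (band limits are assumed, not clinical targets).
ax.axhspan(0, 1500, color="red", alpha=0.2, label="Below target")
ax.axhspan(1500, 3000, color="gold", alpha=0.2, label="Approaching target")
ax.axhspan(3000, 5000, color="green", alpha=0.2, label="At target")

ax.set_xlabel("Post-discharge day")
ax.set_ylabel("Daily steps")
ax.set_title("Longitudinal activity against recovery targets")
ax.legend(loc="upper left", fontsize=8)
plt.tight_layout()
plt.show()
```

A chart like this pairs the line graph providers preferred for tracking change over time with the color coding the geriatrician said “helps to get an instant idea of how things have changed”.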
Theme 4a: Baseline data needed
Providers expressed a need for baseline data to serve as a comparison. A surgeon commented, “We try to get [data] before and after, baseline and follow up. It’s really for us to show we’re doing a good job and to ensure that we’re hitting a target”. One surgeon currently collects such data using paper and pencil, which is later entered into a computer. Baseline data were particularly important for surgeons because they were the only way of knowing whether patients were approaching their pre-procedure level of function. A nurse also saw the need for baseline data because, “If we don’t have data before, we don’t know the ideal goal or range for that patient to be in. It becomes a guess”.
Discussion
Although the benefits of patient-generated health data have been recognized at the policy level9, and health system leaders believe such data are important for collecting biometric (weight, blood pressure, and blood glucose) and patient activity data (exercise and nutrition)28, successful adoption of PGHD_SW ultimately depends on the actual users of the data. Our in-depth qualitative interviews with providers from diverse specialties found that providers considered PGHD_SW potentially highly valuable because it enables access to information about patients that was traditionally unattainable. In addition, PGHD_SW can contribute to more productive discussions between patients and providers, in both the in-patient and out-patient settings.
Our findings indicate that PGHD_SW may be unique compared with other types of patient-generated health data. Other methods of capturing patient data, like PROMIS measures, typically exclude patients with low literacy29, while PRO data often impose a significant administrative burden on both patients and providers30. Data from smartwatches allow the patient, instead of the provider, to take ownership of generating and capturing data31. We found that few providers were concerned about PGHD_SW adding to workloads. Rather, providers in our study viewed PGHD_SW as a means of reducing workload because it enables monitoring of patients’ behaviors, which informs decision-making and can reduce patient and healthcare burden. Administrative burden can potentially be reduced because PGHD_SW can be integrated with electronic records, and its continuous monitoring can be more accurate and portable than collecting sensor data with other devices, like smartphones.
Consistent with previous studies that determined the importance of developing practice workflows32, our findings indicate that PGHD_SW should be tailored by medical specialty based on factors like patient population, medical setting, and procedure. Moreover, we discovered that providers view PGHD_SW as an opportunity to collaborate with patients. Patient engagement, and attention to maintaining healthy behaviors, could be enhanced if providers partner with patients around the meaning and context of their data33. Furthermore, it may be possible to improve the patient-provider relationship if PGHD_SW is integrated into interactions34. However, patients and providers might have different perceptions about the value of PGHD_SW; therefore, it is important to align patient and provider perceptions35.
Another patient-provider interaction issue that arose from our study was reservations about trusting the data generated from smartwatches. Previous studies have shown that providers sometimes have difficulty believing patients with surprising or unusual symptoms36 and may label such patients “unreliable”37. However, PGHD_SW has the potential to quell these suspicions, since most data are passively collected22 and are therefore more resistant to patient modification. Similarly, some providers expressed the need for baseline data. Such data are attainable, since collection of PGHD_SW begins as soon as a device is used.
Pain, falls, mobility, and physical activity were the most preferred and relevant data points, but providers saw value in all of the other data points presented. However, cognitive assessment triggered concerns about how such a test could be administered using a smartwatch. Technology to measure cognitive function has been developed and tested on smartphones38–40, but such tests are still in development for wearable devices41,42.
The Diffusion of Innovations framework proved helpful in contextualizing the study’s findings, revealing that PGHD_SW has advantages over other types of patient-generated data and can be compatible with current workflows across a variety of medical specialties. For instance, upon viewing samples of output data, providers suggested that PGHD_SW be integrated with electronic medical records. Experts agree that the meaningful use of patient-generated health measures in clinical practice should begin with willing providers rather than a large-scale implementation43. The trialability attribute of DoI would allow for a phased implementation using EHRs in a tested environment. Thus far, PRO data have been successfully implemented into the EHR to provide quantitative, objective data regarding patients’ health status44. DoI also promotes the use of opinion leaders, or champions, to advocate for the innovation24. Opinion leaders identified during the testing phase can accelerate the use of PGHD_SW and lead more providers to embrace the data. Other studies have demonstrated that physicians under the influence of an opinion leader were more likely to adopt an innovation than physicians not under an opinion leader’s influence45,46. Other aspects of DoI will also be relevant to the adoption of PGHD_SW, such as the observability attribute once a prototype is developed, and the innovation-decision process24 during the implementation phase.
Limitations of our study include its focus on perceptions of anticipated use, the small sample size, and possible selection bias, in that providers volunteered for the study. Also, due to the small sample size, the results cannot be generalized to reflect perceptions among other providers of the same specialty. Moreover, the feasibility of data-driven care resides largely in primary care; our study focused on secondary care, so greater attention to this topic should be directed towards primary care.
In addition to primary care, we plan to focus on other specialties to understand how providers manage the nuances involved with measurement specific to their specialty. We will also seek opinions and attitudes from patients about their perceptions of PGHD_SW collection. Ultimately, an application will be developed to enable seamless integration of PGHD_SW into electronic health records.
Methods
Sample and design
This study was conducted at UF Health, an 852-bed level I trauma center located in Gainesville, Florida. The study was approved by the University of Florida Institutional Review Board (IRB 201801446), and all methods were performed in accordance with the relevant guidelines and regulations. Informed consent was obtained from subjects before enrollment in the study. A combination of purposive and convenience sampling47 was used to recruit as many participants as possible from a wide variety of specialties. The goal was to obtain perspectives from providers in both the in-patient and out-patient settings. Providers were recruited from internal medicine, family medicine, geriatric medicine, nursing, surgery, rehabilitation, and anesthesiology.
An overview of the study was presented at periodic team meetings by members of the research team. Those interested in participating were sent an email describing the study. Additionally, individual providers who may not have been present at meetings were targeted through email because of their expertise and potential interest in participation. Providers interested in participating were scheduled for an interview in a private office.
A semi-structured interview guide was designed to gauge providers’ perceptions of PGHD_SW and how the data should optimally be presented. To construct the guide, four members of the research team (JA, PR, TM, MR) wrote an initial set of questions and then vetted them to align with the goals of the study. After modifications, the research team reached consensus on the structure and content of the questionnaire. The semi-structured interview guide consisted of 15 main questions but was written flexibly to allow follow-up questions based on individual responses48. Selected questions from the interview guide are in Box 1. In addition, sample data visualization graphs and possible data points were presented to the interviewees to collect preferences. All interviews were conducted by the first (JA) and third (MR) authors and were audio recorded and subsequently transcribed. Informed consent was reviewed and written consent was received before interviews were conducted.
Box 1 Sample interview guide questions.
• What are your views about viewing data generated by your patients via smartwatches?
• How often would you like to receive summaries of patient data?
• How do you think PGHD from smartwatches will contribute to patient care?
• How important is PGHD to you, compared with other medical information you interpret, such as blood tests, scans, X-rays, etc.?
• How will you modify your communication with patients/caregivers now that they can enter and view PGHD data?
Analytical process
Transcripts of interviews underwent independent, open coding by the first (JA) and third (MR) authors using the constant comparative method49. This method reduces the data to concepts50 through an iterative and inductive process of constant recoding51. Using the open codes, we created a comprehensive codebook. Interviews continued during data analysis until no new themes emerged and saturation was achieved52 through recurrence, repetition, and forcefulness of the data53. Transcripts were then re-read using the codebook and reviewed to identify emerging insights54. Codes were condensed and developed into preliminary themes, guided by DoI’s attributes of an innovation. However, one of the attributes, observability, was not included since fully functioning PGHD_SW was not yet available at the time of the study. Discrepancies among the research team were resolved by revisiting the codebook and through discussions by the entire research team until consensus was reached55. To validate our conclusions, counts were used to provide quantitative confirmation and make claims more precise56. Quantitative counts have several advantages in qualitative research, such as enabling the identification and characterization of diverse perceptions and providing evidence for interpretations57.
Reporting summary
Further information on research design is available in the Nature Research Reporting Summary linked to this article.
Acknowledgements
This work was supported by the Data Science and Applied Technology Core of the Claude D. Pepper Older Americans Independence Center at the University of Florida (P30 AG028740). Research reported in this publication was supported by the University of Florida Informatics Institute SEED Funds and the UF Clinical and Translational Science Institute, which is supported in part by the NIH National Center for Advancing Translational Sciences under award number UL1 TR001427. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Partial funding from R21 AG059207 was used to support staff and faculty during the project, and publication of this article was funded in part by the University of Florida Open Access Publishing Fund. We would like to thank the participants who volunteered and took the time to participate in interviews.
Author contributions
J.A. designed the study, supervised, and conducted the qualitative research, as well as led the analysis of results. T.M. coordinated project management, organized the recruitment of participants, helped write the qualitative questionnaire, and analyzed the results. M.R. oversaw the recruitment process, scheduled and conducted interviews, and analyzed results. S.P. and T.V.M. developed criteria for smartwatch data points and data visualization strategies. L.S. helped to develop the qualitative questionnaire by providing a clinical perspective, and P.R. acquired funding, participated in developing the questionnaire, and contributed to analysis plans. All authors critically reviewed manuscript drafts and approved the final version of the manuscript. J.A. is responsible for the overall content and attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.
Data availability
Selected quotes from participants are included in the article; supplementary tables include data about providers’ preferences. The full dataset is available from the corresponding author upon reasonable request.
Competing interests
The authors declare no competing interests.
Footnotes
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary information is available for this paper at 10.1038/s41746-020-0236-4.
References
- 1.Street RL. Analyzing communication in medical consultations: do behavioral measures correspond to patients’ perceptions? Med. Care. 1992;30:976–988. doi: 10.1097/00005650-199211000-00002.
- 2.Martin CM, Banwell CL, Broom DH, Nisa M. Consultation length and chronic illness care in general practice: a qualitative study. Med. J. Aust. 1999;171:77–81. doi: 10.5694/j.1326-5377.1999.tb123525.x.
- 3.Nelson E, et al. Functional health status levels of primary care patients. JAMA. 1983;249:3331–3338. doi: 10.1001/jama.1983.03330480037027.
- 4.Marrero DG, et al. Twenty-first century behavioral medicine: a context for empowering clinicians and patients with diabetes: a consensus report. Diabetes Care. 2013;36:463–470. doi: 10.2337/dc12-2305.
- 5.Institute of Medicine Committee on Quality of Health Care in America. Crossing the quality chasm: A new health system for the 21st century. (National Academy Press, 2001).
- 6.Case MA, Burwick HA, Volpp KG, Patel MS. Accuracy of smartphone applications and wearable devices for tracking physical activity data. JAMA. 2015;313:625–626. doi: 10.1001/jama.2014.17841.
- 7.Manini TM, et al. Perception of older adults toward smartwatch technology for assessing pain and related patient-reported outcomes: pilot study. JMIR Mhealth Uhealth. 2019;7:e10044. doi: 10.2196/10044.
- 8.US Department of Health and Human Services. 2015 Edition Health Information Technology (Health IT) Certification Criteria, 2015 Edition Base Electronic Health Record (EHR) Definition, and ONC Health IT Certification Program Modifications. (2015).
- 9.Centers for Medicare & Medicaid Services, HHS. Medicare and Medicaid Programs; Electronic Health Record Incentive Program–Stage 3 and Modifications to Meaningful Use in 2015 Through 2017. Final rules with comment period. Fed. Regist. 2015;80:62761.
- 10.Petersen C. Patient-generated health data: a pathway to enhanced long-term cancer survivorship. J. Am. Med. Inform. Assoc. 2015;23:456–461. doi: 10.1093/jamia/ocv184.
- 11.Woods SS, Evans NC, Frisbee KL. Integrating patient voices into health information for self-care and patient-clinician partnerships: Veterans Affairs design recommendations for patient-generated data applications. J. Am. Med. Inform. Assoc. 2016;23:491–495. doi: 10.1093/jamia/ocv199.
- 12.NPD Group. U.S. Smartwatch Sales See Strong Gains, According to New NPD Report. https://www.npd.com/wps/portal/npd/us/news/press-releases/2019/us-smartwatch-sales-see-strong-gains-according-to-new-npd-report/ (2019).
- 13.Wurmser, Y. Wearables 2019. (eMarketer, 2019).
- 14.Kumar RB, Goren ND, Stark DE, Wall DP, Longhurst CA. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology. J. Am. Med. Inform. Assoc. 2016;23:532–537. doi: 10.1093/jamia/ocv206.
- 15.Snyder CF, et al. Feasibility and value of PatientViewpoint: a web system for patient‐reported outcomes assessment in clinical practice. Psycho‐Oncol. 2013;22:895–901. doi: 10.1002/pon.3087.
- 16.Burns K, et al. Creating consumer-generated health data: interviews and a pilot trial exploring how and why patients engage. J. Med. Internet Res. 2019;21:e12367. doi: 10.2196/12367.
- 17.Rothman M, et al. Use of Existing Patient-Reported Outcome (PRO) Instruments and Their Modification: The ISPOR Good Research Practices for Evaluating and Documenting Content Validity for the Use of Existing Instruments and Their Modification PRO Task Force Report. Value Health. 2009;12:1075–1083. doi: 10.1111/j.1524-4733.2009.00603.x.
- 18.Islind, A. S., Lindroth, T., Lundin, J. & Steineck, G. Shift in translations: data work with patient-generated health data in clinical practice. Health Inform. J. 10.1177/1460458219833097 (2019).
- 19.Cohen DJ, et al. Integrating patient-generated health data into clinical care settings or clinical decision-making: lessons learned from Project HealthDesign. JMIR Hum. Factors. 2016;3:e26. doi: 10.2196/humanfactors.5919.
- 20.Shapiro, M., Johnston, D., Wald, J. & Mon, D. Patient-generated health data. (RTI International, 2012).
- 21.National eHealth Collaborative. Patient-generated health information technical expert panel. (Department of Health and Human Services, Office of the National Coordinator for Health Information Technology, 2013).
- 22.Genes N, et al. From smartphone to EHR: a case report on integrating patient-generated health data. npj Digital Med. 2018;1:23. doi: 10.1038/s41746-018-0030-8.
- 23.Holtz B, Vasold K, Cotten S, Mackert M, Zhang M. Health care provider perceptions of consumer-grade devices and apps for tracking health: a pilot study. JMIR Mhealth Uhealth. 2019;7:e9929. doi: 10.2196/mhealth.9929.
- 24.Rogers, E. Diffusion of innovations. (The Free Press, 2003).
- 25.Gattoni A, Tenzek KE. The practice: an analysis of the factors influencing the training of health care participants through innovative technology. Commun. Educ. 2010;59:263–273. doi: 10.1080/03634521003605808.
- 26.Sultan N. Reflective thoughts on the potential and challenges of wearable technology for healthcare provision and medical education. Int. J. Inf. Manag. 2015;35:521–526. doi: 10.1016/j.ijinfomgt.2015.04.010.
- 27.Sezgin E, Özkan-Yildirim S, Yildirim S. Understanding the perception towards using mHealth applications in practice: physicians’ perspective. Inf. Dev. 2018;34:182–200. doi: 10.1177/0266666916684180.
- 28.Adler-Milstein J, Nong P. Early experiences with patient generated health data: health system and patient perspectives. J. Am. Med. Inform. Assoc. 2019. doi: 10.1093/jamia/ocz045.
- 29.Hahn EA, et al. The talking touchscreen: a new approach to outcomes assessment in low literacy. Psycho-Oncol. 2004;13:86–95. doi: 10.1002/pon.719.
- 30.Makhni EC, et al. Patient Reported Outcomes Measurement Information System (PROMIS) in the upper extremity: the future of outcomes reporting? J. Shoulder Elb. Surg. 2017;26:352–357. doi: 10.1016/j.jse.2016.09.054.
- 31.Lai AM, Hsueh P-Y, Choi Y, Austin RR. Present and future trends in consumer health informatics and patient-generated health data. Yearb. Med. Inform. 2017;26:152–159. doi: 10.15265/IY-2017-016.
- 32.Lober W, Evans H. Patient-generated health data in surgical site infection: changing clinical workflow and care delivery. Surg. Infect. 2019;20:571–576. doi: 10.1089/sur.2019.195.
- 33.Miyamoto SW, Henderson S, Young HM, Pande A, Han JJ. Tracking health data is not enough: a qualitative exploration of the role of healthcare partnerships and mHealth technology to promote physical activity and to sustain behavior change. JMIR Mhealth Uhealth. 2016;4:e5. doi: 10.2196/mhealth.4814.
- 34.Loos, J. R. & Davidson, E. J. in 2016 49th Hawaii International Conference on System Sciences (HICSS), 3389–3399 (2016).
- 35.Zhu H, Colgan J, Reddy M, Choe EK. Sharing patient-generated data in clinical practices: an interview study. AMIA Annu Symp. Proc. 2017;2016:1303–1312.
- 36.Rogers WA. Is there a moral duty for doctors to trust patients? J. Med. Ethics. 2002;28:77–80. doi: 10.1136/jme.28.2.77.
- 37.Pelaccia T, et al. Do emergency physicians trust their patients? Intern. Emerg. Med. 2016;11:603–608. doi: 10.1007/s11739-016-1410-1.
- 38.Timmers C, et al. Ambulant cognitive assessment using a smartphone. Appl. Neuropsychol. Adult. 2014;21:136–142. doi: 10.1080/09084282.2013.778261.
- 39.Sangha S, George J, Winthrop C, Panchal S. Confusion: delirium and dementia—a smartphone app to improve cognitive assessment. BMJ Qual. Improvement Rep. 2015;4:u202580.w1592. doi: 10.1136/bmjquality.u202580.w1592.
- 40.Ramsey AT, Wetherell JL, Depp C, Dixon D, Lenze E. Feasibility and acceptability of smartphone assessment in older adults with cognitive and emotional difficulties. J. Technol. Hum. Serv. 2016;34:209–223. doi: 10.1080/15228835.2016.1170649.
- 41.Hafiz, P. & Bardram, J. E. in Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, 1162–1165 (ACM, London, United Kingdom, 2019).
- 42.Hafiz, P. & Bardram, J. E. in Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, 278–279 (ACM, London, United Kingdom, 2019).
- 43.Van Der Wees PJ, et al. Integrating the use of patient-reported outcomes for both clinical practice and performance measurement: views of experts from 3 countries. Milbank Q. 2014;92:754–775. doi: 10.1111/1468-0009.12091.
- 44.Gold, H. T. et al. Implementation and early adaptation of patient-reported outcome measures into an electronic health record: a technical report. Health Inform. J. 10.1177/1460458218813710 (2018).
- 45.Hao H, Padman R, Telang R. An empirical study of opinion leader effects on mobile information technology adoption in healthcare. AMIA Annu Symp. Proc. 2011;2011:537–542.
- 46.Hao H, Padman R. An empirical study of opinion leader effects on mobile technology implementation by physicians in an American community health system. Health Inform. J. 2018;24:323–333. doi: 10.1177/1460458216675499.
- 47.Palinkas LA, et al. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm. Policy Ment. Health. 2015;42:533–544. doi: 10.1007/s10488-013-0528-y.
- 48.Fylan, F. in A handbook of research methods for clinical and health psychology Vol. 5 (eds Miles, J. & Gilbert, P.) Ch. 6, 65–78 (Oxford University Press, 2005).
- 49.Glaser BG, Strauss AL. The constant comparative method of qualitative analysis. Soc. Probl. 1965;12:436–445. doi: 10.2307/798843.
- 50.Taylor, S. J., Bogdan, R. & DeVault, M. Introduction to qualitative research methods: A guidebook and resource. (John Wiley & Sons, 2015).
- 51.Glaser, B. G. & Strauss, A. L. The constant comparative method of qualitative analysis. In The Discovery of Grounded Theory: Strategies for Qualitative Research, 101–158 (1967).
- 52.Morse JM, Barrett M, Mayan M, Olson K, Spiers J. Verification strategies for establishing reliability and validity in qualitative research. Int. J. Qualitative Methods. 2002;1:13–22. doi: 10.1177/160940690200100202.
- 53.Owen WF. Interpretive themes in relational communication. Q. J. Speech. 1984;70:274–287. doi: 10.1080/00335638409383697.
- 54.Srivastava P, Hopwood N. A practical iterative framework for qualitative data analysis. Int. J. Qualitative Methods. 2009;8:76–84. doi: 10.1177/160940690900800107.
- 55.Auerbach, C. & Silverstein, L. B. Qualitative data: An introduction to coding and analysis. (NYU Press, 2003).
- 56.Becker, H. S. Sociological work. (Transaction Publishers, 1970).
- 57.Maxwell JA. Using numbers in qualitative research. Qualitative Inq. 2010;16:475–482. doi: 10.1177/1077800410364740.