Abstract
Background
Diabetes preventive care reduces complications, yet completion rates remain suboptimal. Patient portals may help close care gaps by enabling patients to initiate services, but their use for diabetes preventive care is understudied.
Objective
To evaluate the acceptability and feasibility of the Diabetes Care Gaps Intervention (DCGI), a portal-based tool that notifies patients when preventive care is due and enables them to initiate orders.
Methods
We conducted a single-arm pilot study at an academic medical center. Adult primary care patients with diabetes and ≥1 care gap (HbA1c, urine microalbumin, eye exam, or pneumococcal vaccine) received DCGI access for three months. Primary outcomes were usability (System Usability Scale, SUS) and user experience (qualitative interviews, Likert-scale items). Secondary outcomes included diabetes self-efficacy, distress, engagement (patient-initiated orders), and care gap closure. Participants received modest compensation for survey completion and first use of the DCGI to initiate or report care.
Results
Among 50 participants (median age 58.3 [IQR: 45.9, 67.0], 52% female, and 40% from minoritized racial/ethnic groups), 78% (39/50) initiated or reported care through the DCGI. The median SUS score was 78.8 (IQR: 70.0, 90.0), significantly above the “good” usability threshold of 71 (p=0.002). Interviews revealed two acceptability themes: (1) appreciation of the DCGI for helping keep up with preventive care, and (2) unease about using the DCGI due to limited portal confidence. Users showed improved diabetes self-efficacy and reduced distress. Of 110 total care gaps, 47 (43%) were addressed via patient-initiated orders, and 79% (37/47) led to completed care (gap closure).
Conclusions
The DCGI was acceptable and feasible for empowering patients to address diabetes care gaps. Future randomized studies should evaluate effectiveness and scalability.
Keywords: patient portal, diabetes mellitus, practice guidelines, usability, pilot study
Background
Approximately one in ten individuals in the U.S. has type 2 diabetes mellitus, and its prevalence is projected to continue rising.1,2 Diabetes-related complications can be prevented or delayed through adherence to recommended preventive care, such as annual diabetes eye exams.3 However, many patients remain overdue for these services. Depending on the specific type of preventive care, an estimated 25-50% of individuals with diabetes experience care gaps, with disproportionately higher rates among people from minoritized racial or ethnic groups.4–6 Low completion rates for diabetes preventive care stem from a combination of system-, clinician-, and patient-level barriers, including limited time during clinical encounters and lack of awareness and knowledge among patients.7,8 Leveraging the patient portal to support patient engagement outside of traditional visits may help to address these challenges.9,10 In particular, patient portals provide patients convenient access to their health information and the ability to engage with their care team, including reviewing results, messaging clinicians, and scheduling services.11,12
Prior work has shown that patient portal-based interventions can improve uptake of select preventive services, such as mammography or colorectal cancer screening, and may enhance patient activation and clinical efficiency.13–16 However, most portal-based interventions have focused on single services or preventive service reminders, with limited support for patients to initiate orders for multiple guideline-based services within routine primary care workflows.17 In the context of diabetes care, little attention has been paid to whether portals can be used to empower patients to directly address preventive care gaps.
To address these limitations in existing portal-based approaches, we developed the Diabetes Care Gaps Intervention (DCGI), a patient portal tool embedded in Epic’s MyChart platform that notifies patients when guideline-based diabetes preventive care becomes due and enables patients to initiate orders for the care directly through the portal. The DCGI advances prior approaches through patient-initiated ordering that leverages existing EHR infrastructure (e.g., Epic’s Health Maintenance and Reporting Workbench) and integrates into routine clinical workflows. Moreover, to our knowledge, no prior studies have evaluated patient portal functionality that enables patients to directly initiate clinician-cosigned orders to address multiple guideline-based diabetes preventive care services within routine care.
Objective
The objective of this study was to evaluate the acceptability (e.g., usability, user experience, and perceived effectiveness) and feasibility (e.g., engagement, care gap closure) of the DCGI.18,19 We previously conducted a design sprint and formative usability testing to develop and refine the DCGI.20 This pilot study represents the next phase in assessing the utility of the DCGI when integrated into routine clinical care.
Methods
Study setting, design, and eligibility
This research was conducted at Vanderbilt University Medical Center (VUMC), a large academic medical center in Nashville, TN, between May 2021 and May 2022. All clinical data at VUMC are stored in the electronic health record (EHR), supplied by Epic Systems Corp. Patients receive access to their clinical data via an integrated patient portal called My Health at Vanderbilt (MHAV),21 which is currently configured on Epic’s MyChart platform. MHAV is accessible via a web browser and via mobile app for iOS and Android. The DCGI was embedded in MHAV.
We used a single-arm pilot study to assess feasibility and acceptability. Participants were recruited from two VUMC-affiliated adult primary care clinics. Eligible patients were between 18 and 75 years of age, diagnosed with type 1 or type 2 diabetes mellitus, able to speak and read English, owned a mobile device with internet access, had an active MHAV account, were willing and able to use the MHAV app on their mobile device, and were due for one or more of the following four guideline-based diabetes preventive care services: (1) hemoglobin A1c [HbA1c], (2) urine microalbumin, (3) diabetes eye exam, and/or (4) pneumococcal polysaccharide vaccine (PPSV23). We excluded patients who had known cognitive deficits or functional impairment preventing use of a mobile device, were pregnant or planning to become pregnant during the study period, had severe difficulty seeing or hearing, or had a medical condition limiting speech or communication.
We queried the EHR for eligible patients with at least one of the four diabetes care gaps. To increase the likelihood that participants would have an active care gap at enrollment, we preferentially targeted patients without a primary care appointment scheduled in the subsequent 60 days. To ensure a diverse and representative study sample, we prioritized recruitment of patients aged 65 years or older and those from minoritized racial and ethnic backgrounds. On a rolling basis, we sent potentially eligible patients a letter describing the study. Interested patients could contact study personnel to learn more about the study and/or review an online eligibility screener. Those who were eligible could complete an electronic consent form and enroll online via REDCap (Research Electronic Data Capture).22,23 Supplementary file 1 – Figure S1 presents a CONSORT diagram that details the number of patients recruited, screened, enrolled, and analyzed.
Ethical approval
The Vanderbilt Institutional Review Board approved this research (IRB# 202281), and the study was registered on ClinicalTrials.gov (NCT04728620) under the title Evaluation of a Patient Portal Intervention to Address Diabetes Care Gaps.
Procedure
A research assistant reviewed study procedures with participants by phone and confirmed eligibility. Subsequently, enrolled participants were sent a baseline survey (T0). Upon completion of the baseline survey, enrolled participants were given access to the DCGI for three months. The 3-month intervention window was selected to allow adequate time for participants to initiate orders for diabetes preventive care via the DCGI and complete initiated care (e.g., complete their diabetes eye exam). Participants then completed surveys at two additional time points: after using the intervention for the first time to address care gap(s) (T1) and at 3-month follow-up (T2). All surveys were sent via email using REDCap. All survey measures included in this study are reported in Supplementary file 2.
After participants concluded their 3 months of access to the DCGI, we invited a subset to participate in semi-structured interviews. We used strategic purposive sampling to recruit at least two participants for interviews in each of the following categories: (a) age 65 or over, (b) limited health literacy, (c) minoritized racial or ethnic background, and (d) did not use the DCGI. Interviews lasted approximately 30 minutes and took place by phone or via videoconference (Zoom). Interviews were conducted by one member of the study team (WM), an experienced qualitative researcher who had no prior relationship with the participants. Additional participants were interviewed until thematic saturation was reached, defined a priori as no new themes emerging in two consecutive interviews.24 Interviews were audio-recorded with participant consent and transcribed verbatim. Qualitative study procedures followed the COREQ guidelines and evidence-based qualitative methodology.25
Participants were compensated $30 for completing each survey. In addition, participants received $10 for using the intervention for the first time. Participants who completed an interview were compensated an additional $40 for their time.
Intervention
Details on the design and formative testing of the DCGI are reported in Nelson et al.20 Example screenshots are provided in Figure 1. The DCGI leverages Epic’s MyChart platform and existing EHR infrastructure to notify patients of diabetes preventive care gaps and enable patient-initiated orders for the corresponding care within routine clinical workflows. The intervention was built using Epic’s Health Maintenance functionality, which applies guideline-based logic to identify when patients are overdue for specific diabetes preventive care (HbA1c testing, urine microalbumin testing, diabetes eye examination, or pneumococcal vaccination). When a care gap was detected, a corresponding patient-facing To Do item was automatically generated and displayed within the MHAV patient portal, along with an associated push notification (Figure 1(a)). Each To Do item included a brief description of the overdue care written at or below a sixth-grade reading level and the recommended completion interval (Figure 1(b)). Patients could click an actionable button to initiate the recommended care (Figure 1(c)).
Figure 1.
Diabetes care gaps intervention (DCGI) screenshots for hemoglobin A1c care gaps.
Using Epic’s Reporting Workbench, a care coordinator identified patient-initiated DCGI requests. The care coordinator used the integrated bulk order functionality within Reporting Workbench to generate the corresponding care orders (e.g., laboratory orders, vaccinations, and referrals) to be cosigned by the patients’ primary care physicians (PCPs), consistent with standard clinical workflows. Finally, the care coordinator used the Reporting Workbench bulk communication tool to send confirmation messages to patients via MHAV containing detailed next steps (e.g., directions to the lab) (Figure 1(d)). Once services were completed, results of patient-initiated preventive care (e.g., HbA1c and urine microalbumin test results, documentation of completed eye examinations) were routed to the PCP’s EHR in-basket for review and follow-up via existing workflows.
Because many patients receive diabetes eye exams outside of VUMC, patients could use DCGI to report the date of an external diabetes eye exam completed within the preceding 12 months. To address design limitations during the study period, we added: a “decline” response option to allow patients to clear To Do items if they preferred not to initiate care (e.g., if planning to address the care gap at an upcoming visit); additional reminder messages for unanswered To Do items; and language clarifying reporting of external eye exams.
Measures
Clinical and sociodemographic characteristics
We collected self-reported socio-demographics, duration of diabetes, health literacy,26,27 eHealth literacy,28 and subjective numeracy.29 Participants’ most recent HbA1c result was abstracted from the EHR.
Outcomes
We organized our primary and secondary outcomes into two overarching domains—acceptability and feasibility.18,19 A summary of all outcome measures is provided in Table 1.
Table 1.
Summary of outcome measures.
| Outcome | Type | Measures | Timepoint |
|---|---|---|---|
| Acceptability | |||
| Usability | Primary | • System Usability Scale30,31 | T1 |
| User Experience | Primary | • Participants’ willingness to recommend and continue to use the DCGI via Likert-scale items | T2 |
| • Qualitative interviews | |||
| Perceived Effectiveness | Secondary | • Diabetes self-efficacy via the Manage Disease in General Scale32 | T0, T2 |
| • Diabetes distress via the Problem Areas in Diabetes Scale-5 (PAID-5)33 |
| • Participants’ understanding of diabetes preventive care via Likert-scale items | |||
| Feasibility | |||
| Engagement | Secondary | • Patient-initiated orders | T2 |
| Care gap closure | Secondary | • Order completion | T2 |
DCGI, Diabetes Care Gaps Intervention; T0=baseline; T1=after using the intervention for the first time to address care gap(s); T2=3-month follow-up.
Acceptability measures
Usability and user experience
Our primary outcomes were usability and user experience. Usability was assessed by the 10-item System Usability Scale (SUS), which is scored from 0 (worst) to 100 (best).30,31 User experience was assessed by two Likert-scale items and qualitative interviews. The two Likert-scale items asked participants to rate (1 = “strongly disagree” to 5 = “strongly agree”) their willingness to recommend the intervention to other patients and to continue using the intervention in the future. Qualitative interviews with participants who used the intervention asked what they liked and did not like about the DCGI, why the DCGI was or was not helpful, and solicited suggestions for improvement. Interviews with participants who did not use the intervention asked about their reasoning. Then, following a brief explanation of the intervention’s functionality, non-users were asked to share their thoughts on its potential helpfulness and suggestions for improvement. Supplementary file 2 includes the semi-structured interview guide.
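The standard SUS scoring procedure referenced above can be sketched in a few lines: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the summed contributions are multiplied by 2.5 to yield a 0-100 score. The example responses below are illustrative, not study data.

```python
def sus_score(responses):
    """Convert ten 1-5 SUS item responses to a 0-100 score.

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum of the ten
    contributions is multiplied by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Illustrative respondent: agrees (4) with each positively worded odd item,
# disagrees (2) with each negatively worded even item.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```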
Perceived effectiveness
We also measured the intervention’s potential impact on psychosocial domains as part of an exploratory secondary analysis. We assessed diabetes self-efficacy (i.e., confidence in managing diabetes) using an adapted version of the 5-item Manage Disease in General Scale (MDGS) of the Chronic Disease Self-Efficacy Scales.32 In addition, we assessed diabetes distress using the 5-item Problem Areas in Diabetes Scale (PAID-5).33 Finally, we assessed participants’ understanding of how often they should receive each type of diabetes preventive care.
Feasibility measures
Engagement
We measured engagement via the number of participants who used DCGI to address a care gap. Specifically, we assessed the number of participants who used DCGI at least once during their 3-month participation to either initiate care and/or report outside care (e.g., an eye exam completed outside VUMC).
Care gap closure
We measured the proportion of care gaps that were closed during the study period after care was initiated through the intervention (i.e., order completion), both at the participant level and at the level of each diabetes preventive care service.
Analyses
Acceptability
We calculated descriptive statistics, such as the mean and median, for the SUS score at T1 among all participants who used the intervention (i.e., initiated and/or reported care), as well as for subgroups with limited and adequate health literacy.34 Assuming a standard deviation of 12 based on prior studies, with a sample size of 48, a one-sample t-test would detect an absolute difference in mean SUS scores of at least 5 points above the threshold score of 71, indicative of “good” usability, with 80% power. However, due to the non-normality of the data, we used a one-sample Wilcoxon signed-rank test to compare the median SUS score for each group to the threshold score of 71.34 If the median SUS score were 76.5 or greater, a one-sample Wilcoxon signed-rank test would have at least 80% power to detect a difference from the threshold score of 71, assuming a sample size of 48 and an SD of 12, truncated at the maximum SUS score of 100. For Likert items at T2 that asked whether participants would recommend and continue using the intervention, we calculated the proportion of participants who responded that they strongly agreed or agreed with each item, with 95% Wilson confidence intervals (CIs).35
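The two analyses above (the one-sample Wilcoxon signed-rank test against the threshold of 71 and the Wilson score interval for an agreement proportion) can be sketched as follows. The study analyses were run in R; this is a scipy-based equivalent with illustrative SUS values, not the study data, and the Wilson interval is implemented directly from its closed-form definition.

```python
import math

from scipy.stats import wilcoxon

# One-sample Wilcoxon signed-rank test: shift SUS scores by the "good"
# usability threshold (71) and test whether the distribution is above it.
sus_scores = [78, 80, 85, 90, 75, 88, 79, 82, 77, 84, 91, 86]  # illustrative
diffs = [s - 71 for s in sus_scores]
stat, p = wilcoxon(diffs, alternative="greater")
print(f"one-sample Wilcoxon p = {p:.4f}")

def wilson_ci(successes, n, z=1.959964):
    """95% Wilson score interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 31 of 38 respondents agreed with the "would recommend" item (see Results).
lo, hi = wilson_ci(31, 38)
print(f"95% Wilson CI: {lo:.0%} - {hi:.0%}")  # 67% - 91%
```

With all illustrative scores above 71, the exact one-sided p-value is very small; the Wilson interval for 31/38 reproduces the 67%-91% interval reported in the Results.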
Qualitative data coding and analysis were conducted by the Vanderbilt University Qualitative Research Core (VU-QRC). An initial coding system was developed deductively from the semi-structured interview guide and inductively following preliminary review of the transcripts. Two experienced qualitative coders independently applied the initial coding system to the same transcript to assess coding consistency.36 Intercoder agreement was reviewed collaboratively, and any discrepancies were resolved through reconciliation discussion sessions. After refining the codebook to ensure consistent and accurate coding, the remaining transcripts were divided and coded independently. Coded data were subsequently organized and synthesized by grouping related codes into higher-order themes focused on intervention acceptability and user experience. Transcripts, codes, and quotations were managed using Microsoft Excel version 16 and SPSS version 28.
As an exploratory secondary analysis, we assessed whether there was a significant improvement in our psychosocial measures from T0 to T2; we restricted these analyses to participants who used the intervention. For the MDGS and PAID-5, we performed Wilcoxon signed-rank tests on the pairwise differences. For participants’ understanding of how often they should receive diabetes preventive care, we assessed changes between T2 and T0 using McNemar’s tests. All quantitative analyses were completed using R version 4.2.2.37
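The paired tests above can be sketched as follows; again this is a scipy-based equivalent of the R analyses, using made-up paired scores rather than study data. The McNemar test reduces to an exact binomial test on the discordant pairs (participants whose correct/incorrect understanding changed between T0 and T2).

```python
from scipy.stats import binomtest, wilcoxon

# Paired Wilcoxon signed-rank test on T0 -> T2 change (illustrative
# MDGS-style scores, not the study data).
mdgs_t0 = [6.0, 7.2, 5.8, 6.4, 7.0, 5.2, 6.8, 7.4]
mdgs_t2 = [6.8, 7.6, 6.2, 6.6, 7.8, 6.0, 6.6, 8.0]
stat, p = wilcoxon([b - a for a, b in zip(mdgs_t0, mdgs_t2)])
print(f"paired Wilcoxon p = {p:.3f}")

# Exact McNemar test on paired binary outcomes: only discordant pairs
# (changed answers) carry information; under H0 each direction is equally
# likely, so test the smaller discordant count against Binomial(b+c, 0.5).
improved = 8   # incorrect at T0, correct at T2 (illustrative counts)
worsened = 2   # correct at T0, incorrect at T2
p_mcnemar = binomtest(min(improved, worsened), improved + worsened, 0.5).pvalue
print(f"McNemar exact p = {p_mcnemar:.4f}")  # 0.1094
```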
Feasibility
We performed additional secondary analyses focusing on feasibility and assessed the number of participants who used the intervention to address a care gap at least once during their 3-month participation and/or who reported outside care. Additionally, we measured the prevalence of care gaps at T0 and T2 after participants initiated care through the intervention; prevalence was measured both at the participant level and the level of each diabetes preventive care service.
Results
Participant characteristics
A total of 60 participants were enrolled in the study. Of these, 50 completed the baseline survey and had at least one diabetes care gap, making them eligible to receive the intervention and be included in the analysis. Table 2 shows participant characteristics. Participants’ median age was 58.3 (IQR: 45.9, 67.0), 52% (26/50) were female, and 40% (20/50) had a minoritized racial or ethnic background.
Table 2.
Participant characteristics, n (%) or median [Q1, Q3].
| | Full sample n=50 | DCGI users n=39 a |
|---|---|---|
| Gender | ||
| Female | 26 (52%) | 22 (56%) |
| Male | 24 (48%) | 17 (44%) |
| Age | 58.3 [45.9, 67.0] | 58.2 [43.7, 66.6] |
| Under 65 | 32 (64%) | 26 (67%) |
| 65 and over | 18 (36%) | 13 (33%) |
| Race and ethnicity | ||
| Black or African American | 15 (30%) | 12 (31%) |
| White or Caucasian | 30 (60%) | 23 (59%) |
| Asian | 2 (4%) | 2 (5%) |
| Hispanic or Latino | 1 (2%) | 1 (3%) |
| Other | 2 (4%) | 1 (3%) |
| Education | ||
| High school or GED | 7 (14%) | 7 (18%) |
| Some college | 9 (18%) | 7 (18%) |
| College degree | 14 (28%) | 10 (26%) |
| Graduate or professional degree | 20 (40%) | 15 (39%) |
| Insurance | ||
| Individual/Group plan only | 30 (60%) | 23 (59%) |
| Governmental (Medicaid/Medicare only) | 14 (28%) | 10 (26%) |
| Both | 6 (12%) | 6 (15%) |
| Annual Household Income | ||
| Less than $50,000 | 9 (18%) | 7 (18%) |
| $50,000-$99,999 | 23 (46%) | 19 (49%) |
| $100,000 and greater | 16 (32%) | 11 (28%) |
| Prefer not to answer | 2 (4%) | 2 (5%) |
| Health Literacy b | ||
| Adequate | 36 (72%) | 26 (67%) |
| Limited | 14 (28%) | 13 (33%) |
| eHealth Literacy c | 32.0 [31.0, 37.0] e | 33.0 [30.5, 38.5] |
| Subjective Numeracy d | 5.0 [4.1, 5.4] | 5.1 [4.1, 5.4] |
| Diabetes Type e | ||
| Type 1 | 4 (8%) | 3 (8%) |
| Type 2 | 40 (80%) | 30 (77%) |
| Unspecified | 6 (12%) | 6 (15%) |
| Hemoglobin A1c, % (mmol/mol) | 6.7 [6.2, 7.6] f (50 [44, 60]) | 6.6 [6.2, 7.4] f (49 [44, 57]) |
| < 9% (<75 mmol/mol) | 46 (94%) | 37 (97%) |
| ≥ 9% (≥75 mmol/mol) | 3 (6%) | 1 (3%) |
| Years diagnosed with diabetes | 10.0 [5.0, 19.0] f | 9.5 [5.0, 19.5] f |
DCGI, Diabetes Care Gaps Intervention; GED, General Education Development.
aParticipants who used the intervention at least once to initiate care and/or report outside care.
bAssessed using a validated 1-item scale to identify patients with limited health literacy: “How confident are you filling out medical forms?” Response options included 1 = not at all, 2 = a little bit, 3 = somewhat, 4 = quite a bit, and 5 = extremely. Consistent with prior research,23,32 participants reporting any lack of confidence (i.e., response options 1–4) were classified as having limited health literacy.
cAssessed using the eHealth Literacy Scale (eHEALS). Possible score range: 5–40.
dAssessed using the Subjective Numeracy Scale (SNS). Possible score range: 1-6.
eBased on participant’s problem list in electronic health record.
fOne missing value.
Acceptability
Usability and user experience
The median SUS score was 78.8 (IQR: 70.0, 90.0) among DCGI users (n=36, missing data=3), significantly above the threshold score of 71 that is indicative of “good” usability (p=0.002).34 The median SUS score was 80.0 (IQR: 70.0, 90.6) among participants with adequate health literacy (n=24) and 77.5 (IQR: 71.2, 90.0) among participants with limited health literacy (n=12) (p=0.88). Most participants (82%, 31/38, missing data = 1, 95% CI: 67%-91%) agreed they would recommend the intervention to other patients. In addition, most participants (84%, 32/38, missing data = 1, 95% CI: 70%-93%) agreed they would continue to use the intervention in the future.
Ten participants completed interviews, and their characteristics were comparable to those of the full sample: 40% (4/10) were >65 years of age, 20% (2/10) had limited health literacy, and 30% (3/10) were from a minoritized racial or ethnic background. We identified two main acceptability-related themes based on participants’ feedback, reported in Table 3. First, participants felt the DCGI had significant value because it facilitated keeping up with diabetes preventive care (n=7/10; 70%). Many participants mentioned feeling overwhelmed with the maintenance required for their diabetes care and that it was often difficult to manage both their diabetes care and personal life. The second theme from interviews was participant uncertainty about how to use the intervention, which led to delays in using it or to not using it at all (n=5/10; 50%). Some expressed not having the skills to use technology in general, and others mentioned wanting clearer instructions from the beginning on how to use the intervention. A few participants requested having someone at their doctor’s office walk through it with them.
Table 3.
Qualitative feedback themes and example quotes.
| Theme | Quotes |
|---|---|
| Helps facilitate diabetes management | Very, very excited and look forward to the feature…I think for me and my lifestyle and busy-ness, it would be much better when I have time, if I could set these [appointments] up myself. I like getting a prompter that says, “I need my eye appointment this month or my A1C.” …In fact, to me, it makes more sense to go into my appointment with my physician and we all know what my A1C is when I’m there. We don’t have to talk and then order the test and then I get a note a few days later saying whether I was good or bad. (White Male, >65 years old, adequate health literacy) a |
| I think in theory, it’s a great concept. Having diabetes, there’s a lot of maintenance that’s involved and it can get overwhelming…for a lot of folks. You have different providers, from your endocrinologists to your optometrist to potentially a podiatrist. There’s a lot to balance. (White Female, <65 years old, adequate health literacy) | |
| Well, my initial thoughts are, it sounds great. Because it’s kind of hard… I typically miss my A1C test anyway… I end up waiting till I go see my provider for a year and then they’re like, “You’re way past due for your A1C test.” I think that would be a great feature because if it would just pop up into my Vanderbilt health app, then I could just click on it. It would just give me the reminder, “Oh, hey, you got to do this.” And I could click on it, get an appointment, and kind of be done with it. (White Male, <65 years old, adequate health literacy) | |
| Uncertainty with using intervention | I kept delaying it, and delaying it, because I didn’t know how it was going to work. So I kept delaying it until I had more time to look into it properly. So yeah, if that had been clear from the beginning, or if it had been something that I had been doing for a long time, I think there wouldn’t have been that delay. (Asian Female, <65 years old, limited health literacy) |
| …I can’t tell you how many things I see that I have to give up on because I don’t have the skills and I don’t know what it is, but it does seem to be an age thing. (White Male, >65 years old, adequate health literacy) | |
| …Some patients might not even download an app like that. You know, [you need to] help them to download it, and show them exactly what it is, and where they can expect to see a message from the doctor about the test, and so on. I think that might be helpful to have somebody actually walk you through the first time, or first use. (Asian Female, <65 years old, limited health literacy) |
aHealth literacy was assessed using a validated 1-item scale: “How confident are you filling out medical forms?” Response options included 1 = not at all, 2 = a little bit, 3 = somewhat, 4 = quite a bit, and 5 = extremely. Participants reporting any lack of confidence (i.e., response options 1–4) were identified as having limited health literacy.
Perceived Effectiveness
Supplementary file 1 - Table S1 reports change in MDGS and PAID-5 scores from baseline to 3 months among participants who used the intervention. We found a significant increase in diabetes self-efficacy; the median difference in MDGS scores was 0.6 (p=0.03). In addition, we found a decrease in diabetes distress; the median difference in PAID-5 scores was -1.0 (p=0.05). When examining change in understanding of the recommended frequency for each diabetes preventive care service, we found no change in the proportion of DCGI users who identified the correct frequency from baseline to 3 months (Supplementary file 1 – Table S2).
Feasibility
Engagement
After receiving a DCGI notification, 78% (39/50) of participants used the intervention to address a care gap at least once during their 3-month participation. Specifically, 22 participants initiated care, 7 reported outside care, and 10 both initiated and reported outside care.
Participants in the full sample (n=50) and in the subset who used the intervention (n=39) had similar characteristics (Table 2). Those with limited health literacy were represented in nearly equal proportions in both groups, and scores on eHealth literacy and subjective numeracy were also very similar.
Care gap closure
Among the 50 participants, we identified a total of 110 diabetes care gaps including 36 for diabetes eye exam, 31 for urine microalbumin, 25 for HbA1c, and 18 for pneumococcal vaccination. Table 4 shows care initiated by participants and resulting care gap closure. Participants used the DCGI to initiate care for 43% (47/110) of the total care gaps, most commonly to address HbA1c care gaps (56%, 14/25) and urine microalbumin (55%, 17/31). When care was initiated via the DCGI, participants typically (79%, 37/47) proceeded to receive the corresponding care.
Table 4.
Diabetes care gaps intervention (DCGI) care initiated and received by type of diabetes preventive care.
| | Number of care gaps | No response/declined a | Care initiated | Of care initiated, care received (gap closed) |
|---|---|---|---|---|
| Hemoglobin A1c | 25 | 44% (11/25) | 56% (14/25) | 93% (13/14) |
| Urine microalbumin | 31 | 45% (14/31) | 55% (17/31) | 88% (15/17) |
| Pneumococcal vaccination | 18 | 67% (12/18) | 33% (6/18) | 67% (4/6) |
| Diabetes eye exam | 36 b | 25% (9/36) | 28% (10/36) | 50% (5/10) |
| Overall | 110 | 42% (46/110) | 43% (47/110) | 79% (37/47) |
aParticipants either did not respond to the request or chose to decline initiating care.
bAdditionally, 47% (17/36) of patients reported an external diabetes eye exam completed within the preceding 12 months.
Table 5 shows care gaps at the participant level at baseline and at 3-month follow-up among DCGI users and non-users. At 3-month follow-up, 36% (14/39) of DCGI users had no care gaps compared to 27% (3/11) among non-users.
Table 5.
Diabetes care gaps at participant level at baseline and 3-month follow-up.
| Number of gaps | Baseline (n=50) | 3-Month follow-up | |
|---|---|---|---|
| DCGI users (n=39) | Non-users (n=11) | ||
| 0 | --- | 36% (14) | 27% (3) |
| 1 | 24% (12) | 41% (16) | 36% (4) |
| 2 | 46% (23) | 15% (6) | 18% (2) |
| 3 | 16% (8) | 8% (3) | 18% (2) |
| 4 | 14% (7) | 0% (0) | 0% (0) |
Discussion
We evaluated the feasibility and acceptability of a novel patient portal intervention, the DCGI, designed to support patient-initiated ordering for guideline-based diabetes preventive care. Participants found the DCGI acceptable, reporting favorable usability and perceived helpfulness. Overall, engagement with the intervention was strong, with high rates of subsequent care gap closure among DCGI users. In addition, we observed preliminary evidence of perceived effectiveness, with modest improvements in participant-reported diabetes self-efficacy and distress, as well as interviews among DCGI users indicating the intervention helped participants keep up with preventive care and facilitated diabetes management.
As a pilot study, our findings provide preliminary support for expanding patient portal functionality to enable patient-initiated ordering of diabetes preventive care. Acceptability findings were encouraging across the full sample of participants including those with varying levels of health literacy. Consistent with prior studies examining self-ordering preventive care via the patient portal in other contexts, we found evidence that using DCGI was associated with fewer care gaps at three months, although causal inference cannot be made given our study design.16,38 Notably, we did not observe improvements in participants’ understanding of the recommended frequency for each diabetes preventive care service. This may reflect the short duration participants had access to DCGI or a need to enhance the education provided as part of the intervention.
Patient portal interventions have proliferated in recent years and provide opportunities to support patients and drive improved outcomes while improving clinical efficiency. Most available portal-based interventions focus on delivering education, facilitating collection of patient-reported information, or delivering preventive service reminders.17 Although a growing number of interventions facilitate self-scheduling for preventive care, relatively few have enabled patients to initiate orders for preventive care within routine clinical workflows. In one study that evaluated a patient portal intervention for improving colorectal cancer screening, the investigators found that offering the ability to self-order screening kits led to more kit completion;38 however, the kits were mailed directly to participants, whereas participants in our study had to go to a lab or clinic to receive care, enhancing the real-world applicability of our findings.
Service-specific patterns in diabetes preventive care completion may further inform how and when portal-enabled, patient-initiated orders may be most effective. Completion following patient initiation was highest for laboratory-based services that were internally delivered and required minimal additional coordination (e.g., HbA1c and urine microalbumin testing), whereas completion was lower for services requiring in-person scheduling or external coordination (e.g., pneumococcal vaccination and diabetes eye examinations). These differences likely reflect variation in task complexity and care pathways. Similar patterns have been observed in prior digital health studies, where simpler, single-step services embedded within health system workflows were more readily completed than services requiring multiple steps or external providers.39 Our findings suggest that patient-initiated ordering may be most impactful for services that can be completed with minimal downstream friction, and that additional system-level supports may further improve completion for more complex or externally delivered care.
Based on our qualitative findings, patient participants appreciated the intervention’s utility; however, some expressed uncertainty about using it, mainly due to a lack of skill and confidence with the patient portal. Although patient portal use has increased in recent years, disparities in use persist among more vulnerable groups,40–43 and it remains essential that all patients feel equipped to use portals to avoid worsening health disparities.44,45 A recent retrospective analysis of EHR data demonstrated that greater engagement with the patient portal is associated with increased odds of completing screening mammograms, influenza vaccinations, and fecal immunochemical test screening.46 Although few studies have evaluated programs or interventions designed to reduce disparities in patient portal use, technical training and assistance programs have the best evidence for increasing use in vulnerable populations.47 Ensuring equitable use of portals, alongside functionality that empowers patients to initiate guideline-based care, may ultimately help reduce health disparities by reducing common barriers to care completion.48,49
Limitations
Several limitations of this study should be acknowledged. First, eligibility required an active MHAV account and all data collection occurred electronically, which may bias the sample toward patients with higher digital access and literacy and limit generalizability. To mitigate this concern, we intentionally oversampled older adults and individuals from minoritized racial and ethnic backgrounds. Second, this was a single-arm pilot study, and participants continued to interact with the healthcare system during participation. Consequently, we cannot attribute observed care gap closures solely to the DCGI, as other clinical encounters or processes may have contributed. Although a higher proportion of DCGI users had no remaining care gaps at 3 months compared with non-users (36% vs. 27%), this comparison is descriptive and exploratory and does not establish causality given the lack of randomization and the potential for selection bias. Third, modest improvements in diabetes self-efficacy and distress were observed among DCGI users; however, these psychosocial outcomes were exploratory and evaluated without adjustment for multiple comparisons. Although PAID-5 scores have been associated with depression and poorer glycemic control in prior studies,50,51 the small effect size observed may not reflect a clinically meaningful difference. In addition, these analyses were restricted to participants who used the DCGI, which may introduce selection bias toward more digitally engaged and activated patients. Fourth, we preferentially recruited patients who had at least one diabetes care gap and no primary care appointment scheduled in the next 60 days to increase the likelihood that they would still have a care gap at enrollment, enabling them to engage with and provide feedback on the intervention. Patients meeting these criteria may differ systematically from those with upcoming visits, potentially representing a subset with different care-seeking behavior or competing demands.
However, the ability to engage patients who were overdue for care and not imminently scheduled for a visit may alternatively be viewed as a strength, as these individuals represent a population for whom visit-based strategies may be less effective. Fifth, financial compensation may have influenced engagement. Participants received modest compensation for survey completion and first use of the DCGI, which may have increased adoption beyond what would be observed in routine care. While promoting usage was crucial for gathering feedback at this formative stage, and a high percentage of participants reported that they would continue to use the DCGI if available, future work evaluating the effectiveness of the DCGI should do so without linked financial incentives to better assess real-world adoption and sustainability. Lastly, this was a pilot study with a relatively small number of participants recruited from a single, urban academic medical center, limiting generalizability. A small sample limits the ability to capture the diversity and variability of the target population and reduces power to detect small effects or to conduct adjusted analyses. Nonetheless, we observed a significant, moderate-sized difference between the median SUS score and the benchmark SUS score despite the small sample.
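A comparison of observed SUS scores against the “good” usability benchmark of 71 can be carried out with a one-sample Wilcoxon signed-rank test on the differences from the benchmark. The sketch below illustrates the general approach; the scores shown are hypothetical, not study data, and the study's exact test procedure may have differed.

```python
# Illustrative sketch: one-sample Wilcoxon signed-rank test of SUS scores
# against the "good" usability benchmark of 71. Scores below are
# hypothetical, not study data.
from scipy.stats import wilcoxon

def sus_vs_threshold(scores, threshold=71.0):
    """Test whether SUS scores differ from a benchmark via signed ranks."""
    diffs = [s - threshold for s in scores]  # differences from the benchmark
    stat, p = wilcoxon(diffs)                # exact test for small n, no ties
    return stat, p

# Hypothetical scores, all above the benchmark of 71
scores = [72.5, 75.0, 77.5, 80.0, 82.5, 85.0, 87.5, 90.0, 92.5, 95.0]
stat, p = sus_vs_threshold(scores)
```

The signed-rank test is a reasonable default here because SUS scores at pilot sample sizes cannot be assumed normally distributed, and the test compares against the benchmark without that assumption.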
Conclusions
In this single-arm pilot study, the DCGI was acceptable and feasible and supported patient-initiated ordering for diabetes preventive care. Future iterations of the DCGI should consider more direct support to improve patients’ confidence with both the patient portal and the DCGI, as well as tiered interventions for patients who do not respond to portal notifications. While engagement and subsequent completion of initiated care were encouraging, the single site and small sample limit generalizability. Future randomized studies are warranted to evaluate effectiveness, scalability, and impacts across patient subgroups. If successful, the technology, infrastructure, and patient empowerment associated with the DCGI may be used to address other types of guideline-based preventive services (e.g., cancer screening).
Supplemental material
Supplemental material for Pilot study of a patient portal intervention designed to empower patients to address diabetes care gaps by Lyndsay A. Nelson, Jared Cobb, Tom Elasy, Sapna S. Gangaputra, Amber J. Hackstadt, Kryseana Harper, Lindsay S. Mayberry, Neeraja Peterson, S. Trent Rosenbloom, Zhihong Yu, William Martinez, in Digital Health.
Acknowledgements
The authors would like to thank all participants for their involvement and the VUMC HealthIT department for supporting the development of the intervention. The authors also thank the Vanderbilt University Qualitative Research Core (VU-QRC) for conducting the qualitative analyses.
Author contributions: Lyndsay A. Nelson: Methodology, Writing – Original Draft. Jared Cobb: Software, Resources, Writing – Review & Editing. Tom Elasy: Conceptualization, Methodology, Resources, Writing – Review & Editing. Sapna S. Gangaputra: Methodology, Writing – Review & Editing. Amber J. Hackstadt: Methodology, Formal analysis, Visualization, Writing – Review & Editing, Supervision. Kryseana Harper: Project administration, Investigation, Writing – Review & Editing. Lindsay S. Mayberry: Methodology, Writing – Review & Editing. Neeraja Peterson: Resources, Writing – Review & Editing. S. Trent Rosenbloom: Methodology, Writing – Review & Editing. Zhihong Yu: Data Curation, Formal analysis, Visualization, Writing – Review & Editing. William Martinez: Conceptualization, Methodology, Writing – Review & Editing, Formal analysis, Resources, Supervision, Project Administration, Funding Acquisition.
Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported by the National Institutes of Health (NIH), National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) through R18 DK123373 and received support from the NIH’s National Center for Advancing Translational Sciences (NCATS) through UL1 TR000445. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Supplemental material: Supplemental material for this article is available online.
ORCID iDs
Lyndsay A. Nelson https://orcid.org/0000-0002-7304-5643
Jared Cobb https://orcid.org/0000-0002-3081-5556
Tom Elasy https://orcid.org/0000-0002-1651-8387
Sapna S. Gangaputra https://orcid.org/0000-0001-5461-8568
Amber J. Hackstadt https://orcid.org/0000-0002-0164-6589
Kryseana Harper https://orcid.org/0000-0002-3336-0278
Lindsay S. Mayberry https://orcid.org/0000-0002-0654-4151
Neeraja Peterson https://orcid.org/0000-0002-9910-4680
S. Trent Rosenbloom https://orcid.org/0000-0001-7455-2260
Zhihong Yu https://orcid.org/0000-0003-1927-9841
William Martinez https://orcid.org/0000-0002-3155-4386
Ethical considerations
The Vanderbilt Institutional Review Board approved this research (IRB# 202281).
Consent to participate
All participants provided written consent to participate in this research.
Data Availability Statement
The deidentified datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.
References
- 1. Centers for Disease Control and Prevention. Type 2 diabetes. 2024. Available from: https://www.cdc.gov/diabetes/about/about-type-2-diabetes.html?CDC_AAref_Val=https://www.cdc.gov/diabetes/basics/type2.html
- 2. GBD 2021 Diabetes Collaborators. Global, regional, and national burden of diabetes from 1990 to 2021, with projections of prevalence to 2050: A systematic analysis for the Global Burden of Disease Study 2021. Lancet 2023; 402(10397): 203–234. 10.1016/S0140-6736(23)01301-6
- 3. American Diabetes Association. Standards of Care in Diabetes-2023 abridged for primary care providers. Clin Diabetes 2022; 41(1): 4–31. 10.2337/cd23-as01
- 4. Delevry D, Ho A, Le QA. Association between processes of diabetes care and health care utilization in patients with diabetes: Evidence from a nationally representative US sample. Journal of Diabetes 2021; 13(1): 78–88. 10.1111/1753-0407.13109
- 5. Centers for Disease Control and Prevention. Preventive care practices. 2022. Available from: https://archive.cdc.gov/#/details?q=https://www.cdc.gov/diabetes/library/reports/reportcard/preventive-care-practices.html&start=0&rows=10&url=https://www.cdc.gov/diabetes/library/reports/reportcard/preventive-care-practices.html
- 6. Pu J, Chewning B. Racial difference in diabetes preventive care. Research in Social and Administrative Pharmacy 2013; 9(6): 790–796. 10.1016/j.sapharm.2012.11.005
- 7. Lewing B, Abughosh SM, Lal LS, et al. Patient, physician, and health system factors associated with five types of inadequate care during management of type 2 diabetes mellitus in the United States. Diabetes Epidemiology and Management 2022; 6: 100046. 10.1016/j.deman.2021.100046
- 8. Fairless E, Nwanyanwu K. Barriers to and facilitators of diabetic retinopathy screening utilization in a high-risk population. Journal of Racial and Ethnic Health Disparities 2019; 6: 1244–1249. 10.1007/s40615-019-00627-3
- 9. Tulu B, Trudel J, Strong DM, et al. Patient portals: An underused resource for improving patient engagement. Chest 2016; 149(1): 272–277. 10.1378/chest.14-2559
- 10. Ancker JS, Osorio SN, Cheriff A, et al. Patient activation and use of an electronic patient portal. Inform Health Soc Care 2015; 40(3): 254–266. 10.3109/17538157.2014.908200
- 11. Antonio MG, Petrovskaya O, Lau F. The state of evidence in patient portals: Umbrella review. J Med Internet Res 2020; 22(11): e23851. 10.2196/23851
- 12. Lyles CR, Nelson EC, Frampton S, et al. Using electronic health record portals to improve patient engagement: Research priorities and best practices. Annals of Internal Medicine 2020; 172(11_Supplement): S123–S129. 10.7326/M19-0876
- 13. Woodcock EW. Barriers to and facilitators of automated patient self-scheduling for health care organizations: Scoping review. J Med Internet Res 2022; 24(1): e28323. 10.2196/28323
- 14. North F, Nelson EM, Buss RJ, et al. The effect of automated mammogram orders paired with electronic invitations to self-schedule on mammogram scheduling outcomes: Observational cohort comparison. JMIR Med Inform 2021; 9(12): e27072. 10.2196/27072
- 15. Judson TJ, Odisho AY, Neinstein AB, et al. Rapid design and implementation of an integrated patient self-triage and self-scheduling tool for COVID-19. J Am Med Inform Assoc 2020; 27(6): 860–866. 10.1093/jamia/ocaa051
- 16. Miller DP Jr, Denizard-Thompson N, Weaver KE, et al. Effect of a digital health intervention on receipt of colorectal cancer screening in vulnerable patients: A randomized controlled trial. Ann Intern Med 2018; 168(8): 550–557. 10.7326/M17-2315
- 17. Gleason KT, Powell DS, Wec A, et al. Patient portal interventions: A scoping review of functionality, automation used, and therapeutic elements of patient portal interventions. JAMIA Open 2023; 6(3): ooad077. 10.1093/jamiaopen/ooad077
- 18. Bowen DJ, Kreuter M, Spring B, et al. How we design feasibility studies. Am J Prev Med 2009; 36(5): 452–457. 10.1016/j.amepre.2009.02.002
- 19. Sekhon M, Cartwright M, Francis JJ. Acceptability of healthcare interventions: An overview of reviews and development of a theoretical framework. BMC Health Serv Res 2017; 17(1): 88. 10.1186/s12913-017-2031-8
- 20. Nelson LA, Reale C, Anders S, et al. Empowering patients to address diabetes care gaps: Formative usability testing of a novel patient portal intervention. JAMIA Open 2023; 6(2): ooad030. 10.1093/jamiaopen/ooad030
- 21. Steitz BD, Wong JIS, Cobb JG, et al. Policies and procedures governing patient portal use at an academic medical center. JAMIA Open 2019; 2(4): 479–488. 10.1093/jamiaopen/ooz039
- 22. Harris PA, Taylor R, Minor BL, et al. The REDCap consortium: Building an international community of software platform partners. J Biomed Inform 2019; 95: 103208. 10.1016/j.jbi.2019.103208
- 23. Harris PA, Taylor R, Thielke R, et al. Research electronic data capture (REDCap): A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009; 42(2): 377–381. 10.1016/j.jbi.2008.08.010
- 24. Saunders B, Sim J, Kingstone T, et al. Saturation in qualitative research: Exploring its conceptualization and operationalization. Qual Quant 2018; 52(4): 1893–1907. 10.1007/s11135-017-0574-8
- 25. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007; 19(6): 349–357. 10.1093/intqhc/mzm042
- 26. Sarkar U, Piette JD, Gonzales R, et al. Preferences for self-management support: Findings from a survey of diabetes patients in safety-net health systems. Patient Education and Counseling 2008; 70(1): 102–110. 10.1016/j.pec.2007.09.008
- 27. Tieu L, Sarkar U, Schillinger D, et al. Barriers and facilitators to online portal use among patients and caregivers in a safety net health care system: A qualitative study. J Med Internet Res 2015; 17(12): e275. 10.2196/jmir.4847
- 28. Norman CD, Skinner HA. eHEALS: The eHealth Literacy Scale. J Med Internet Res 2006; 8(4): e27. 10.2196/jmir.8.4.e27
- 29. Fagerlin A, Zikmund-Fisher BJ, Ubel PA, et al. Measuring numeracy without a math test: Development of the Subjective Numeracy Scale. Medical Decision Making 2007; 27(5): 672–680. 10.1177/0272989X07304449
- 30. Brooke J. SUS: A “quick and dirty” usability scale. In: Usability Evaluation in Industry. Taylor & Francis, 1996, pp. 189–194.
- 31. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Intl Journal of Human-Computer Interaction 2008; 24(6): 574–594. 10.1080/10447310802205776
- 32. Lorig K. Outcome measures for health education and other health care interventions. Sage Publications, 1996.
- 33. McGuire B, Morrison T, Hermanns N, et al. Short-form measures of diabetes-related emotional distress: The Problem Areas in Diabetes Scale (PAID)-5 and PAID-1. Diabetologia 2010; 53(1): 66–69. 10.1007/s00125-009-1559-5
- 34. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of Usability Studies 2009; 4(3): 114–123.
- 35. Brown LD, Cai TT, DasGupta A. Interval estimation for a binomial proportion. Statistical Science 2001; 16(2): 101–133. 10.1214/ss/1009213286
- 36. Hemmler VL, Kenney AW, Langley SD, et al. Beyond a coefficient: An interactive process for achieving inter-rater consistency in qualitative coding. Qualitative Research 2022; 22(2): 194–219. 10.1177/1468794120976072
- 37. R Core Team. R: A language and environment for statistical computing. R Foundation for Statistical Computing, 2022. Available from: https://www.R-project.org/
- 38. Hahn EE, Baecker A, Shen E, et al. A patient portal-based commitment device to improve adherence with screening for colorectal cancer: A retrospective observational study. J Gen Intern Med 2021; 36(4): 952–960. 10.1007/s11606-020-06392-y
- 39. Salway R, Sillero-Rejon C, Forte C, et al. A service evaluation of the uptake and effectiveness of a digital delivery of the NHS Health Check service. BMJ Open 2024; 14(11): e091417. 10.1136/bmjopen-2024-091417
- 40. Chang E, Blondon K, Lyles CR, et al. Racial/ethnic variation in devices used to access patient portals. The American Journal of Managed Care 2018; 24(1): e1–e8.
- 41. Walker DM, Hefner JL, Fareed N, et al. Exploring the digital divide: Age and race disparities in use of an inpatient portal. Telemed J E Health 2020; 26(5): 603–613. 10.1089/tmj.2019.0065
- 42. Yoon E, Hur S, Opsasnick L, et al. Disparities in patient portal use among adults with chronic conditions. JAMA Netw Open 2024; 7(2): e240680. 10.1001/jamanetworkopen.2024.0680
- 43. Khatib R, Glowacki N, Chang E, et al. Disparities in patient portal engagement among patients with hypertension treated in primary care. JAMA Netw Open 2024; 7(5): e2411649. 10.1001/jamanetworkopen.2024.11649
- 44. Ganeshan S, Pierce L, Mourad M, et al. Impact of patient portal-based self-scheduling of diagnostic imaging studies on health disparities. Journal of the American Medical Informatics Association 2022; 29(12): 2096–2100. 10.1093/jamia/ocac152
- 45. Veinot TC, Mitchell H, Ancker JS. Good intentions are not enough: How informatics interventions can worsen inequality. J Am Med Inform Assoc 2018; 25(8): 1080–1088. 10.1093/jamia/ocy052
- 46. Rauhut MA. Digital engagement and the efficacy of patient portal-based preventive care interventions. Digit Health 2025; 11: 20552076251356013. 10.1177/20552076251356013
- 47. Grossman LV, Masterson Creber RM, Benda NC, et al. Interventions to increase patient portal use in vulnerable populations: A systematic review. J Am Med Inform Assoc 2019; 26(8-9): 855–870. 10.1093/jamia/ocz023
- 48. Sadeghi B, Tran J, Tsai IS, et al. Role of online patient portal self-scheduling and self-referral pathways to decrease health disparity for screening mammography. J Am Coll Radiol 2024; 21(1): 147–153. 10.1016/j.jacr.2023.06.027
- 49. Antonio MG, Petrovskaya O, Lau F. Is research on patient portals attuned to health equity? A scoping review. J Am Med Inform Assoc 2019; 26(8-9): 871–883. 10.1093/jamia/ocz054
- 50. Trief PM, Wen H, Burke B, et al. Psychosocial factors and glycemic control in young adults with youth-onset type 2 diabetes. JAMA Netw Open 2024; 7(4): e245620. 10.1001/jamanetworkopen.2024.5620
- 51. Vislapuu M, Brostrom A, Igland J, et al. Psychometric properties of the Norwegian version of the short form of the Problem Areas in Diabetes scale (PAID-5): A validation study. BMJ Open 2019; 9(2): e022903. 10.1136/bmjopen-2018-022903