ACR Open Rheumatology. 2022 Sep 13;4(11):964–973. doi: 10.1002/acr2.11498

Development and Testing of an Electronic Health Record‐Integrated Patient‐Reported Outcome Application and Intervention to Improve Efficiency of Rheumatoid Arthritis Care

Daniel H Solomon 1, Anuj K Dalal 1, Adam B Landman 1, Leah Santacroce 1, Hallie Altwies 1, Jackie Stratton 1, Robert S Rudin 2
PMCID: PMC9661861  PMID: 36099161

Abstract

Objective

Many patients with rheumatoid arthritis (RA) have difficulty finding clinicians to treat them because of workforce shortages. We developed an app to address this problem by improving care efficiency. The app collects patient‐reported outcomes (PROs) and can be used to inform visit timing, potentially reducing the volume of low‐value visits. We describe the development process, intervention design, and planned study for testing the app.

Methods

We employed user‐centered design, interviewing patients and clinicians, to develop the app. To improve visit efficiency, symptom tracking logic alerts clinicians to PRO trends: worsening PROs generate alerts suggesting an earlier visit, and stable or improving PROs generate notifications that scheduled visits could be delayed. An interrupted time‐series analysis with a nonrandomized control population will allow assessment of the impact of the app on visit frequency.

Results

Patient interviews identified the following needs for effective app and intervention design: a simple user interface facilitating rapid answering of PROs, the availability of condensed summary information with links to more in‐depth answers to common questions regarding RA, and the need for clinicians to discuss the PRO data during visits. Clinician interviews identified the following user needs: PRO data must be easy to view and use within the clinical workflow, and visit intervals should be shortened when PROs are trending worse. Some clinicians believed visits could be delayed for patients with stable PROs, whereas others raised concerns.

Conclusion

PRO apps may improve care efficiency in rheumatology. Formal evaluation of an integrated PRO RA app is forthcoming.

INTRODUCTION

The most recent American College of Rheumatology Committee on Workforce report predicts a shortage of approximately 4133 rheumatology providers (rheumatologists, nurse practitioners, and physician assistants) by 2030 (1). These shortages arise because of demographic shifts in the patient population and expected retirements in the rheumatology workforce. Most efforts to alleviate the expected shortage focus on enhancing the number of rheumatologists by increasing the number of rheumatology training programs and rheumatology trainees (2). Although these efforts may help, unless there is a radical increase in rheumatology fellowship programs, it is very unlikely that they can make up the deficit in time to address the problem. This situation poses an impending risk that many thousands of patients with rheumatological disease will be unable to access the care necessary to effectively treat their conditions in a timely manner, resulting in substantial preventable morbidity and mortality.

Efforts to address this problem must include improving the efficiency of rheumatology practice. To date, few efforts have attempted to systematically increase the efficiency of rheumatology care. With strong evidence that remote monitoring works (3), reducing the frequency of in‐person clinic visits offers a promising opportunity. Currently, follow‐up visit schedules are determined mostly by rules of thumb (eg, “every 3‐6 months for disease monitoring”) applied regardless of patients' clinical need for such visits. One recent experience at Geisinger Medical Center suggests that there is substantial potential to improve the efficiency of face‐to‐face visits through previsit telephone calls with trained nurses to determine the need for in‐person visits, enhancing timely care (4).

A more scalable possibility is to use patient‐reported outcomes (PROs) between visits. PROs are recommended for use in rheumatology care to improve quality, and investigators started developing PRO measures 30 years ago (5). The Food and Drug Administration guidance suggests that PRO instruments are most appropriate for assessing concepts best understood by patients from their perspective (eg, fatigue, pain, depression, and physical function) (6). Initial PRO measures were developed as generic tools, but more recently they have focused on specific diseases, like rheumatoid arthritis (RA) (7). The National Institutes of Health (NIH) has embraced PROs through the Patient‐Reported Outcomes Measurement Information System (PROMIS) measures (8, 9), developed as part of a trans‐NIH effort. Although these PROs primarily include generic measures that can be used across different diseases, the PROMIS measures have been widely used in rheumatology research and are recommended to be used in practice, including in RA (7).

There is ample evidence of the benefits to patients from regular use of PROs, including a better experience managing chronic diseases (10). In one recent study, 31 patients with RA met in focus groups and identified a willingness to use PROs to track disease activity and share this information with their health care providers, especially if providers were willing to act on the PRO data (11). These same patients expressed a strong interest in electronic communication between visits. Studies across other chronic diseases find similar patient satisfaction and willingness to engage with PROs (12).

Despite the potential benefits, most rheumatologists do not systematically collect PRO data for various reasons (13). Some collect PRO measures at the time of visits. Mobile health tools (ie, apps) could facilitate routine PRO collection between visits, providing richer data for decision making and offloading some work of symptom assessment from clinic visits. Our research team collected PRO data between visits through an app during the RA Flare trial (14), which queried patients every day through an app using several PRO questionnaires. We found strong and sustained patient adherence, with 70% of subjects continuing to use the app over 6 months (15). Another PRO app for RA demonstrated approximately 90% adherence over 12 weeks for patients who underwent several hours of training and personal assistance downloading the app (16). A PRO app from the United Kingdom (17) was tested in 20 patients, with 80%‐90% adherence over 12 weeks, supported by substantial one‐on‐one coaching. These patients reported positive emotional benefits of remote monitoring: they liked knowing that the data were integrated into their care, and the app allowed patients to see the “bigger picture” of their disease course. These results have encouraged development and testing of several apps, mostly in Europe (18).

These studies suggest that a clinically integrated app can help facilitate PRO collection and use in clinical care for RA and may be leveraged to improve care efficiency. If the app is effective at improving communication between patients and clinicians to improve the efficiency of care, then one might consider further dissemination. Herein, we describe the development of an app integrated within the electronic health record (EHR) and ongoing evaluation.

MATERIALS AND METHODS

Setting

The study was conducted at two rheumatology practices affiliated with Brigham and Women's Hospital, which is an academic health system in Boston, Massachusetts. Both practices used a single EHR (Epic Systems, Inc). Eleven clinically focused rheumatologists were engaged in the development of the integrated app and the current study of the app. There is a large RA population seen at both sites (>3000 patients based on estimates from 2019), with the majority being White and primarily English‐speaking. Both practice sites have trainees, but they were not involved in the app development or study.

RA PRO app and intervention design

We enhanced an early version of a custom‐developed RA PRO app that was not integrated into our institution's EHR. We employed user‐centered design (UCD) methods with both patients and clinicians (rheumatologists) to design an integrated version of this app and related workflows (19). Following standard UCD methods, we conducted several iterations of needs assessment, prototype design, app development, EHR integration, and testing with end users. Notably, we triangulated findings from both types of users (patients and providers) so that the intervention addresses both users' needs. We interviewed 10 patients with RA after the RA Flare Trial (see interview prompts) and learned that patients were glad to answer the PRO questionnaires (PROMIS pain interference short form, PROMIS fatigue short form, PROMIS function short form, and RA Disease Activity Index [RADAI]‐5) (7, 20, 21). However, they often felt that daily data collection was too frequent. Patients wanted to know that their clinicians were using the PRO data in decision making. They also wanted the app to provide simple information about RA and medications. We used these patient assessments to inform new features for the app.

Interview prompts for patients and clinicians:

Patients

  1. Do you use apps? If so, how often, what types? Any health care apps? Any for RA?

  2. Do you track your symptoms currently? If so, how? If so, why?

  3. If you used the previously developed app, what did you like about it? What did you find problematic about the app?

  4. How often would you be willing to engage with the app for your symptoms?

  5. What factors would make you use the app more or less frequently?

  6. Would you consult the app for information about RA? [Mockups of app shown using “think aloud” protocol]

Clinicians

  1. Do you currently use PROs for patients with RA? If so, which ones?

  2. Do you think your patients use apps for RA?

  3. What is a typical interval between face‐to‐face visits for your patients with RA?

  4. What dictates that interval?

  5. Would you use PRO data that could be viewed in the EHR? (A prototype was demonstrated so clinicians could respond to methods for viewing data in the EHR)

  6. Would you consider seeing patients less often if they reported stable symptoms? If no, why not?

  7. Would you consider seeing patients more often if PRO data looked to be worsening? If no, why not?

  8. What would be your preferred method for receiving messages about PRO data? Email? EHR inbox message? [Specific messages were tested using “think aloud” protocol]

The revised app was described to a group of 11 rheumatologists from our academic practice; we chose rheumatologists who have a clinical focus (vs. research) and conducted individual 30‐minute interviews (see interview guide). The interviewers described the app and then showed a mock‐up of how the data would be displayed in the EHR. We sought input from the rheumatologists regarding 1) the general concept of the integrated PRO app, 2) the proposed EHR display of data, 3) possible symptom tracking logic (see below) for triggers to communicate with the rheumatologist via EHR inbox messages to improve visit efficiency, 4) EHR inbox message content, and 5) barriers to using the PRO data for making determinations about offering patients early or delayed visits. As the interviewees contributed opinions, the app mock‐up was revised. This iterative process proceeded (interview, app revision, interview, etc) until all rheumatologists were interviewed and the app was felt to be responsive to the rheumatologists' suggestions. We then interviewed three patients with RA who had previously used an earlier version of the app to make sure that patient concerns and interests were addressed in the new version of the app. Themes were saturated in both sets of interviews with clinicians and patients.

Symptom tracking logic and EHR inbox messages

We identified two scenarios in which EHR inbox messages would be generated: patients who were doing poorly and might benefit from an earlier visit, and patients who were stable and might be considered for a delayed visit. We focused on PROs that correspond to commonly discussed areas of RA that might trigger early or delayed visits, such as disease activity, pain, and function. Thus, the RADAI‐5, PROMIS pain interference, and PROMIS function questionnaires were used to determine worsening of disease or disease stability.

To assess whether patients might need an earlier visit, each time a patient completes the Pain, Function, or RADAI questionnaire, the logic considers the prior 10 values for that PRO type. If the average of the second five values is more than one standard deviation worse than the average of the first five, an EHR inbox message is sent to the rheumatologist indicating a worsening of symptoms and suggesting that an early visit might be useful (see Figure 1A). No such message is sent if the patient is within 28 days of a scheduled visit, because of scheduling concerns.
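To make the logic concrete, here is a minimal sketch of the early‐visit trigger. The paper does not specify which standard deviation anchors the comparison or the score direction for each PRO, so this sketch assumes the spread of the earlier five values and that higher scores are worse unless flagged otherwise; the function name and signature are illustrative only.

```python
from statistics import mean, stdev
from datetime import date, timedelta

def early_visit_alert(scores: list[float],
                      next_visit: date | None,
                      today: date,
                      higher_is_worse: bool = True) -> bool:
    """Return True if an 'earlier visit' inbox message should be sent.

    scores: all completed values for one PRO type (Pain, Function, or RADAI),
    oldest first. Assumes the threshold is one SD of the earlier five values.
    """
    if len(scores) < 10:
        return False                      # need the prior 10 values
    window = scores[-10:]
    first_five, second_five = window[:5], window[5:]
    spread = stdev(first_five)
    if spread == 0:
        return False                      # no variability to compare against (assumption)
    change = mean(second_five) - mean(first_five)
    worsened = change > spread if higher_is_worse else change < -spread
    # Suppress alerts within 28 days of an already scheduled visit.
    if next_visit is not None and timedelta(0) <= next_visit - today <= timedelta(days=28):
        return False
    return worsened
```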

Figure 1. Inbox messages sent to rheumatologists by the app. Inbox messages are sent to rheumatologists through the electronic health record based on PRO data collected in the RA app. (A) The message sent when an early visit is recommended. (B) The message sent when a delayed visit is recommended. (C) The message sent within 2 days of a patient visit when a patient has data from the app in the electronic health record. PRO, patient‐reported outcome; RA, rheumatoid arthritis; RADAI, Rheumatoid Arthritis Disease Activity Index.

For the second scenario (a possible delayed visit), the logic estimates a “baseline” value for each PRO by averaging the five values for each measure recorded prior to a clinic visit. The possible delay is assessed 2 weeks before the next scheduled visit. At that time, if the average PRO values since the prior visit for Pain, Function, and RADAI have all remained stable, without worsening by more than one standard deviation from baseline, an EHR inbox message is sent to the rheumatologist indicating that a delayed visit might be considered (see Figure 1B).
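A comparable sketch for the delayed‐visit check follows. As above, the choice of standard deviation (here, the spread of each PRO's five baseline values) and the data layout are assumptions rather than the authors' specification.

```python
from statistics import mean, stdev

def delayed_visit_ok(pro_data: dict[str, dict]) -> bool:
    """pro_data maps a PRO name ('Pain', 'Function', 'RADAI') to a dict with
    'baseline' (the five values before the prior visit), 'since_visit'
    (values collected since that visit), and 'higher_is_worse' (bool)."""
    for name, d in pro_data.items():
        baseline, recent = d["baseline"], d["since_visit"]
        if len(baseline) < 5 or not recent:
            return False                      # not enough data to judge stability
        spread = stdev(baseline)              # zero spread makes any worsening count
        change = mean(recent) - mean(baseline)
        worse = (change > spread) if d.get("higher_is_worse", True) else (change < -spread)
        if worse:
            return False                      # any worsening PRO blocks the suggestion
    return True                               # all PROs stable: suggest a delayed visit
```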

In addition to these two scenarios, rheumatologists are alerted through an inbox message 2 days before a scheduled visit for patients who have PRO data in the EHR interface. This reminds them to open the data tab in the EHR and review the data to inform discussion during the visit (see Figure 1C).
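The previsit reminder itself reduces to a simple date check; the sketch below assumes the check runs daily and uses an exact two‐day offset, details the paper does not spell out.

```python
from datetime import date, timedelta

def previsit_reminder_due(visit_date: date, today: date, has_pro_data: bool) -> bool:
    """Send the Figure 1C reminder 2 days before a scheduled visit when the
    patient has PRO data available in the EHR view (message transport not shown)."""
    return has_pro_data and visit_date - today == timedelta(days=2)
```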

Question and answer library

Based on patients' interest during design sessions, we developed a “library” of questions and answers for a new “learn” tab in the app (see Figure 2). This library gives no treatment recommendations, but rather it includes well‐researched answers to common questions about RA with links to Arthritis Foundation and other standardized information. These questions include basic disease information (diagnosis, treatments, disease process), medication issues (timing, common side effects, common interactions), lifestyle concerns (diet and exercise), and prevention (vaccines).

Figure 2. Patient user interfaces of the rheumatoid arthritis app. This graphic shows the user interface that patients see when interacting with the app.

Integration in the EHR

As we have done previously for other apps (22), we integrated the app into the EHR. Although the data are not stored in the EHR, a “web frame” from the RA app is viewable from the chart. The EHR integration work involved the following three components:

  1. Making the app data viewable within each patient's chart in the EHR from one mouse click on a new tab created for this study (see Figure 3). The graphical displays were prototyped and tested with rheumatologists for ease of understanding and ability to use data from the app as part of their progress note.

  2. Providing the app software with the ability to send EHR inbox messages directly to the clinician.

  3. Reading visit schedule data from the EHR to inform the timing of EHR inbox messages.
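The paper states that the integration reads visit schedule data from the EHR to time the inbox messages but does not describe the interface. The sketch below assumes a FHIR R4 Appointment search with bearer‐token authentication; the base URL, token handling, and helper name are illustrative and not the authors' implementation.

```python
import requests
from datetime import date

FHIR_BASE = "https://ehr.example.org/fhir/r4"   # hypothetical endpoint

def next_scheduled_visit(patient_id: str, token: str) -> date | None:
    """Return the date of the patient's next booked appointment, if any."""
    resp = requests.get(
        f"{FHIR_BASE}/Appointment",
        params={"patient": f"Patient/{patient_id}",
                "date": f"ge{date.today().isoformat()}",
                "status": "booked",
                "_sort": "date",
                "_count": 1},
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    for entry in bundle.get("entry", []):
        start = entry["resource"].get("start")  # e.g. "2022-03-14T09:00:00-05:00"
        if start:
            return date.fromisoformat(start[:10])
    return None
```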

Figure 3. App graphical interface in the electronic health record. This graphic displays a typical set of data within the electronic health record for a patient using the RA app. Data from all four PROs can be viewed, and the most recent values are printed at the top in a text box that can be copied and pasted into a progress note. PRO, patient‐reported outcome; PROMIS, Patient‐Reported Outcomes Measurement Information System; RA, rheumatoid arthritis; RADAI, Rheumatoid Arthritis Disease Activity Index.

Study design

After the intervention design and successful integration of the app within the EHR, we designed an interrupted time‐series analysis (ITSA) (see Figure 4) to understand how the app might impact visit frequency and RA treatment changes. The study protocol was reviewed and approved by the Mass General Brigham Institutional Review Board. Although a randomized controlled trial (RCT) would be ideal, we did not pursue an RCT because of feasibility concerns. We chose an ITSA from possible observational designs because it allows one to assess changes in a given outcome that are measured over multiple time points. Furthermore, we are including a matched control group of patients with RA who did not receive the app.

Figure 4. Example interrupted time‐series design. This figure demonstrates a hypothetical set of data for an interrupted time series with intervention and control groups across two time periods, preintervention and postintervention. In the current ongoing study, the intervention group refers to the patients who received and downloaded the app, and the control group refers to the patients matched 1:1 to the app patients.

The eligible study population includes all patients with RA who have had at least two visits in the prior 12 months and who have a planned visit with one of the selected rheumatologists. As noted earlier, we selected rheumatologists with a clinical focus, and all were willing to participate. After rheumatologists reviewed the list of eligible patients, we contacted the patients through a secure email system explaining the study. Patients who did not opt out of further contact were approached at an upcoming visit or completed consent online with study staff. Consenting patients were given a link to the app and a private code that allowed them to download the app.

Because the study was not conducted as an RCT, we chose matched controls based on patients' age and sex, rheumatology provider, number of visits in the prior 12 months, and calendar date of the index visit. Controls were chosen from the group of potentially eligible patients who could not be contacted (see Figure 5).
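The paper lists the matching variables but not the matching algorithm or tolerances. The greedy 1:1 matching sketch below, with exact matching on sex and provider and assumed calipers (5 years for age, 1 visit, 30 days for index date), is one plausible illustration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Patient:
    pid: str
    age: float
    sex: str
    provider: str
    visits_prior_12m: int
    index_date: date

def match_controls(app_users: list[Patient],
                   candidates: list[Patient]) -> dict[str, str]:
    """Return {app_user_id: control_id} for 1:1 matched pairs."""
    pool = list(candidates)
    matches: dict[str, str] = {}
    for u in app_users:
        eligible = [c for c in pool
                    if c.sex == u.sex and c.provider == u.provider
                    and abs(c.age - u.age) <= 5
                    and abs(c.visits_prior_12m - u.visits_prior_12m) <= 1
                    and abs((c.index_date - u.index_date).days) <= 30]
        if not eligible:
            continue                                   # unmatched app user
        best = min(eligible,
                   key=lambda c: (abs((c.index_date - u.index_date).days),
                                  abs(c.age - u.age)))
        matches[u.pid] = best.pid
        pool.remove(best)                              # each control used once
    return matches
```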

Figure 5. Flow of patient enrollment during the first 3 months of recruitment (January 2022). This flow diagram demonstrates patient enrollment during the first 3 months of the study period. Recruitment is ongoing. PRO, patient‐reported outcome; RA, rheumatoid arthritis.

The intervention included patient‐level and rheumatologist‐level components. Patients received an app‐based push notification reminding them to complete available PRO questionnaires. The four PRO questionnaires were rotated so that a different one was available every 48 hours. Patients who did not download and start filling in questionnaires or who stopped completing questionnaires for 21 days were sent a reminder email. We sent a maximum of two reminders to each patient during the 12 months of study follow‐up.
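The 48‐hour questionnaire rotation and reminder rules come down to simple date arithmetic. In the sketch below, the rotation order, its anchoring to the enrollment date, and the 21‐day rule for patients who never started are assumptions.

```python
from datetime import date, timedelta

QUESTIONNAIRES = ["RADAI-5", "PROMIS Pain Interference",
                  "PROMIS Fatigue", "PROMIS Physical Function"]

def questionnaire_due(enrolled: date, today: date) -> str:
    """Return the questionnaire available in the current 48-hour window."""
    window = (today - enrolled).days // 2          # 48-hour windows since enrollment
    return QUESTIONNAIRES[window % len(QUESTIONNAIRES)]

def send_reminder(enrolled: date,
                  last_completion: date | None,
                  today: date,
                  reminders_sent: int) -> bool:
    """Email a reminder after 21 days of inactivity, capped at two per patient."""
    if reminders_sent >= 2:
        return False
    anchor = last_completion or enrolled           # never-started patients use enrollment date
    return today - anchor >= timedelta(days=21)
```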

Rheumatologists were sent the three types of EHR inbox messages described earlier. They were also sent email reminders every 4 to 6 weeks about their patients who were participating in the study. These reminders suggested that they review PRO data before visits.

The study included both primary and secondary outcomes. The primary outcome was visit frequency; we hypothesized that visit frequency would be reduced for patients receiving the app during the 1‐year study period after they initiated use of the app. The secondary outcomes included clinical, process, and satisfaction measures. Clinical measures of interest were changes in disease‐modifying antirheumatic drugs and PRO results among patients who used the app. The process measures included the number of EHR inbox messages triggered and how rheumatologists responded to them, in other words, whether the rheumatologist asked for an early or a delayed visit. Finally, usability of and satisfaction with the app were measured through a standardized survey for patients and clinicians.

As an ITSA, we repeatedly measured the primary outcome (ie, rheumatology visits) in the 12 months prior to baseline and the 12 months after baseline across all patients in both groups. Baseline was the date of the index visit for patients who received and used the app; the control patient's visit within 1 month of the matched app patient's visit was considered their baseline. The EHR was examined to determine the number of visits for each of the 12 months before and after baseline; visits were assessed separately for the intervention and control patients. This allows us to assess the slope (trend over time) for the two patient groups during the two periods (prebaseline and postbaseline). This data structure facilitates a segmented regression that will be conducted in R (version 4.1.2). The repeated measures regression will account for autocorrelation arising from the fact that the same patients were repeatedly assessed in the two time periods. In addition, we will account for small differences between the intervention and control patients by adjusting for relevant covariates. The secondary outcomes were measured in the patients who received the app and were exploratory; thus, we relied on descriptive statistics to assess them.
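The published analysis will be run in R (version 4.1.2); the Python sketch below only illustrates one common specification of a controlled interrupted time series. The column names, the use of patient‐clustered robust errors in place of an explicit autocorrelation term, and the helper name are assumptions, not the authors' model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Expected long-format data, one row per patient-month:
#   visits      monthly visit count (outcome)
#   month       -12 .. 11, centered on the baseline visit
#   post        1 if month >= 0, else 0
#   month_post  month * post (captures the post-baseline slope change)
#   app         1 = intervention (app) patient, 0 = matched control
#   patient_id  identifier for the repeated measures
def fit_controlled_its(df: pd.DataFrame):
    model = smf.ols(
        "visits ~ month + post + month_post"
        " + app + app:month + app:post + app:month_post",
        data=df,
    )
    # Cluster-robust errors acknowledge repeated measures on the same patients;
    # the planned R analysis instead models autocorrelation directly.
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["patient_id"]})

# The app:month_post coefficient is the difference in post-baseline visit-rate
# slope between app users and matched controls, the quantity of interest here.
```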

RESULTS

The trial described earlier is in progress, and final results are not yet available. We include two interim results here. First, we describe the results of the UCD process (see Table 1). Second, we describe the recruitment experience over the first 4 months (see Figure 5).

Table 1.

Prestudy clinician survey (N = 10 responses; 11 clinicians were sent the questionnaire)

Most (50%+) Some (10%–50%) A few (1%–10%) None (0%)
Percentage of visits that could be delayed with no impact on outcomes a 2 8 0 0
Percentage of patients who would be satisfied with option to delay a visit 4 4 2 0
Barriers to offering option to delay a visit Top ranked (N) b In top three (N) Indicated at least once
I am too busy to spend time deciding if visit could be delayed 1 3 Yes
Most of my patients should come in even if their PROs are stable 0 2 Yes
Most of my patients need to come in for labs anyway 0 4 Yes
Rescheduling visits makes scheduling more complicated for me and my staff 2 4 Yes
Rescheduling visits increases my workload by removing easier visits 0 0 Yes
I am already scheduling patients as far out as I am comfortable with 4 5 Yes
If the open slots from a delayed visit are not filled, there will be lost revenue 0 0 Yes
I do not trust the PRO data to inform the decision to delay a visit 0 0 Yes
Delaying a visit could complicate prior authorizations 0 1 Yes
My time spent determining whether the visit could be delayed is not reimbursed 0 0 Yes

Abbreviations: PRO, patient‐reported outcome; RA, rheumatoid arthritis.

a

Exact wording for questions:

1. In your view, what portion of your RA patient visits could be delayed for a few weeks or longer without any notable impact on their health care outcomes?

2. In your view, what portion of your RA patients would be satisfied with receiving the option to delay a visit for a few weeks or longer when they have stable PROs? (Assume the patient had a prior visit with you in the previous months and that they could always choose to keep their scheduled visit.)

3. Which of the following factors may influence your decision to choose to offer patients with stable PROs the option to delay a visit? Rank factors (1 = highest) that have an influence. Put “N/A” next to any factor that has no influence.

b

Not all responses indicated a top choice.

Findings for UCD

During one‐on‐one patient interviews, we found several consistent themes. First, patients wanted their rheumatologist to be in touch if they saw something clinically relevant in the PRO data. Second, patients wanted their rheumatologist to be aware of how they felt between visits. Third, they liked having basic disease and drug information in the app. Fourth, they hoped that the app would help them track symptoms in an organized and simple manner. Finally, patients all agreed that delaying visits when the PRO data were stable would be appropriate and were positive about the idea. Delaying visits was noted to reduce patients' costs of care (ie, lower parking fees and copayments). Some representative quotes are included below.

Option to delay a visit:

“That'd be fine because you are saying I'd have the option to come in if I wanted to…”

“If I'm feeling good I'd just be like ‘that's fine’ because it is kind of a… parking is a little expensive over there, would like to keep my money.”

Option for earlier visit:

“I would feel good that like a little bit more… extra care was being taken for my care, and I'd wanna go in.”

“If my doctor calls, I'd be a little concerned at first… I'd feel good about going in knowing that it would catch something quickly.”

Key findings from clinicians included the following. First, they believed that PRO data viewable directly within the patient chart may help them prepare for a visit, allow them to bring up issues that patients may have forgotten, and help them identify issues that patients may be reluctant to raise themselves during visits. Second, clinicians found the idea of previsit EHR inbox messages helpful for reminding them that the PRO data were available. Third, they all wanted to be notified when patients' PRO data showed a worsening trend, potentially warranting an earlier visit.

By contrast, the clinicians gave widely varying responses to being notified when patients' PRO data were stable for a possible delayed visit. Clinician survey results regarding delayed visits are shown in Table 1. All clinicians reported that at least some patient visits could be delayed with no impact on outcomes. About half believed that most patients would be satisfied with receiving a delayed visit option, whereas the other half believed some or few patients would be satisfied with that option. Several clinicians recognized benefits such as freeing up slots for patients who need them more and improved patient satisfaction.

When questioned about barriers to delaying visits for patients with stable PROs, clinicians identified four major barriers: they are already scheduling out as far as they are comfortable with (eg, 6 months), rescheduling visits adds complications for them and their staff, they are too busy to spend time deciding whether visits could be delayed, and most of their patients need to come more frequently for labs anyway. (All patients we asked about the option to delay a visit were positive about having the option, even if they had labs, because of the potential to avoid a copay and save time.) Clinicians also identified several other barriers: a belief that patients should come in when scheduled regardless of PROs, concerns that removing “easy” visits would increase their workload, potential for lost revenue if the open slot was not filled, lack of trust in the PRO data, potential for a delayed visit to complicate prior authorizations, and lack of reimbursement for the time spent deciding whether a visit could be delayed. Notably, some clinicians were skeptical of any effort to reduce visit frequency. One clinician believed that the main reason to do so was cost control. Others said that all visits have value; even if symptoms were stable, the visit could be used for coaching or education (eg, bone health). One said that visit schedules should not be changed once made.

Clinicians offered various suggestions to improve the intervention: help them determine whether a visit can be done virtually versus in person, extract all relevant data (eg, lab values, medications) needed to determine whether patient visits could be postponed, saving them time reviewing medical records, and ensure that the algorithm is accurate when it suggests patients for delayed visits so that the messages do not become added noise. We describe their comments and the resulting revisions to the RA app in Table 2.

Table 2.

Clinician feedback and changes made to the RA app

Clinician feedback RA app intervention changes
PRO data can help them prepare for a visit PRO data easily viewable in EHR
Reminders of visit data can be useful EHR inbox messages sent 1‐2 days before visit
Wanted to be notified of worsening symptoms Symptom logic‐triggered EHR inbox messages sent
Some visits could be delayed Symptom logic‐triggered EHR inbox messages sent
Ensure algorithm is accurate for delayed visit suggestions Plan to monitor symptom logic during implementation and adjust as needed
Need summary data to determine if visits can be postponed Inconsistent data locations in EHR make this infeasible in short‐term, planned to explore in future
Need help determining if visit can be virtual Out of scope but will consider for future

Abbreviations: EHR, electronic health record; PRO, patient‐reported outcome; RA, rheumatoid arthritis.

Recruitment experience

Through an automated chart review process (searching on diagnosis and billing codes for RA, target rheumatologists, and a given date range), we identified 659 potentially eligible patients during the first 3 months of recruitment. We then conducted a brief chart review to confirm the diagnosis and to ensure that patients had been seen in the prior 12 months. This list of potential subjects was sent to the target rheumatologists every 2 weeks, allowing them to remove patients from the recruitment list if they were deemed poor candidates (entirely at the rheumatologists' discretion). Relatively few patients (n = 54; 8.2%) were removed at that step.

We then attempted to contact this group of 605 patients through several approved methods: secure email within the patient portal, telephone call, or through face‐to‐face contact at an upcoming visit. These methods allowed contact with patients to describe the study and invite participation. Of those invited, 111 (18.3%) patients expressed interest and signed a consent form. All patients signing a consent form received a link via email or text, allowing them to download the app and begin use. Of this group, 62 (55.9%) downloaded and began using the app as of January 2022.

DISCUSSION

We have developed the PRO app and intervention for RA because of the clear importance of patient‐reported symptoms in clinical decision making for this chronic condition and the potential to improve the efficiency of RA care. The current app being tested has been modified from prior versions (14) based on end‐user feedback as described earlier. Importantly, the new version integrates data into the EHR, allowing rheumatologists to use the PRO data easily in clinical practice. Symptom tracking logic facilitates visit frequency decision making based on PRO results. To rapidly test the new version of the app, we designed a controlled ITSA that will allow for an early‐phase assessment of app usability for patients and rheumatologists and an understanding of whether the app's symptom tracking logic may reduce visit frequency; this could potentially improve access to rheumatologists. Our UCD findings suggest that there may be a disconnect between clinicians' view of the patient benefits of more efficient RA care, resulting from fewer visits, and the actual patient view; clinicians may undervalue patients' out‐of‐pocket costs for visits as well as the nonfinancial costs, such as time away from work and family. One could imagine a value‐based model of care in which between‐visit monitoring through the PRO app might be the responsibility of advanced practice providers such as physician assistants or nurse practitioners. Changes to the health care system to realign incentives and payments may be required to facilitate such a care model.

If we find that patients and rheumatologists use the app and that it appears feasible for more widespread dissemination, this may have important implications for rheumatology practice. First, if visit frequency is reduced, this may allow for better access to rheumatology care. There is currently an access problem in rheumatology that is predicted to become worse over the next several decades (1). Second, although visit frequency may be reduced, it is also possible that patients will be more likely to be seen and helped when RA is poorly controlled. EHR inbox messages that recommend visits ahead of scheduled visits could help with adherence to treat‐to‐target. Finally, by providing patients easy‐to‐use tools to follow their symptoms and to access answers to commonly asked questions, the app may help patients feel more in control of their RA. Improved self‐efficacy has been demonstrated in prior work to improve outcomes (23).

While developing the current app, we considered several design options that require further discussion. A key issue that we struggled with was the symptom tracking logic for the EHR inbox messages. There is no established, psychometrically proven method for establishing a baseline for the PROs and then determining what threshold should trigger a delayed and/or early visit message. We developed our logic through clinical judgment, assessment of minimal clinically important differences, and review of data from our prior trial. However, this is an important area for future investigation.

Another design option for the app was to add more features, such as a medication list, medication reminders, exercise videos, and a diet tracker. These options were suggested by some patients and are technically feasible. However, they would have added complexity to the patient interface. Such features could also distract from the PROs and would have added cost and time for the development.

We integrated the app into the EHR based on what we heard from patients and rheumatologists. The ability to use the PRO data in routine clinical practice was important to both groups of end users. In addition, routine collection of PROs is a recommended aspect of clinical care for patients with symptomatic conditions (5, 24). Based on our work in other chronic symptomatic conditions, for example, asthma (22, 25, 26), we believe that a PRO app can be developed as a platform that could facilitate tracking symptoms across multiple conditions. We are beginning to think through other rheumatic diseases in which a PRO app may be useful.

Even before the study has been completed and the data analyzed, we acknowledge some important limitations. We decided not to pursue an RCT; thus, our data will come from a nonrandomized intervention. A controlled ITSA is a strong study design that allows for moderate inferences (27), but the possibility of confounding remains. We plan to enroll 150 subjects to receive and begin using the app. Although this is a relatively small number of subjects, it should provide robust pilot data to assess the value of this integrated PRO app for RA. Moreover, the study is being conducted at one academic medical center. We recognize that rheumatologists in private practice may have other concerns that would limit transfer of the app to their practice. This may be a major concern if the app places new demands on rheumatologists outside of visits (eg, review of lab results and PRO results). Finally, patients are required to have a smartphone and be able to use a simple app, which presents barriers for some patients. However, a recent national survey found that more than 90% of US adults own a smartphone, with the numbers increasing among all demographic groups; smartphone adoption among Black, Hispanic, and less educated adults is only a few percentage points lower than average (28, 29).

In conclusion, we have employed a UCD method to develop a PRO app for RA that is integrated in the EHR. The app triggers EHR inbox messages to clinicians to facilitate early or delayed visits, with the intention of improving visit efficiency. Currently, the app is being tested in one academic rheumatology practice. Although the development, integration, and testing processes require substantial time and effort, a rigorous process increases the likelihood of success and the ability to assess findings. If this PRO app is found successful, it may have value for other rheumatic diseases, as well as other chronic conditions in which management is largely based on patient‐reported symptoms.

AUTHOR CONTRIBUTIONS

All authors were involved in drafting the article or revising it critically for important intellectual content, and all authors approved the final version to be published. Solomon had full access to the data and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study conception and design

Solomon, Dalal, Landman, Santacroce, Altwies, Stratton, Rudin.

Acquisition of data

Solomon, Altwies, Stratton, Rudin.

Analysis and interpretation of data

Solomon, Rudin.

Supporting information

Disclosure form

ACKNOWLEDGMENTS

None.

This work was supported by the Rheumatology Research Foundation.

REFERENCES

  1. Battafarano DF, Ditmyer M, Bolster MB, et al. 2015 American College of Rheumatology workforce study: supply and demand projections of adult rheumatology workforce, 2015‐2030. Arthritis Care Res (Hoboken) 2018;70:617–26.
  2. The Rheumatologist: American College of Rheumatology. ACR addresses the rheumatology workforce shortage. URL: https://www.the‐rheumatologist.org/article/acr‐addresses‐the‐rheumatology‐workforce‐shortage/.
  3. de Thurah A, Stengaard‐Pedersen K, Axelsen M, et al. Tele‐health followup strategy for tight control of disease activity in rheumatoid arthritis: results of a randomized controlled trial. Arthritis Care Res (Hoboken) 2018;70:353–60.
  4. Butt S, Newman E, Smith N. Nurse scheduled telephone visit: the right rheumatology care for the right patient at the right time. Arthritis Rheumatol 2016;68(suppl 10).
  5. Callahan LF. The history of patient‐reported outcomes in rheumatology. Rheum Dis Clin North Am 2016;42:205–17.
  6. Food and Drug Administration. Guidance for industry patient‐reported outcome measures: use in medical product development to support labeling claims. URL: https://www.fda.gov/regulatory‐information/search‐fda‐guidance‐documents/patient‐reported‐outcome‐measures‐use‐medical‐product‐development‐support‐labeling‐claims.
  7. Bartlett SJ, Orbai AM, Duncan T, et al. Reliability and validity of selected PROMIS measures in people with rheumatoid arthritis. PLoS One 2015;10:e0138543.
  8. Cella D, Riley W, Stone A, et al. The Patient‐Reported Outcomes Measurement Information System (PROMIS) developed and tested its first wave of adult self‐reported health outcome item banks: 2005‐2008. J Clin Epidemiol 2010;63:1179–94.
  9. Cella D, Yount S, Rothrock N, et al. The Patient‐Reported Outcomes Measurement Information System (PROMIS): progress of an NIH Roadmap cooperative group during its first two years. Med Care 2007;45(5 Suppl 1):S3–11.
  10. Nelson EC, Eftimovska E, Lind C, Hager A, Wasson JH, Lindblad S. Patient reported outcome measures in practice. BMJ 2015;350:g7818.
  11. Navarro‐Millan I, Zinski A, Shurbaji S, et al. Perspectives of rheumatoid arthritis patients on electronic communication and patient‐reported outcome data collection: a qualitative study. Arthritis Care Res (Hoboken) 2019;71:80–7.
  12. Valderas JM, Kotzeva A, Espallargues M, et al. The impact of measuring patient‐reported outcomes in clinical practice: a systematic review of the literature. Qual Life Res 2008;17:179–93.
  13. Krusche M, Klemm P, Grahammer M, et al. Acceptance, usage, and barriers of electronic patient‐reported outcomes among German rheumatologists: survey study. JMIR Mhealth Uhealth 2020;8:e18117.
  14. Lee YC, Lu F, Colls J, et al. Outcomes of a mobile app to monitor patient‐reported outcomes in rheumatoid arthritis: a randomized controlled trial. Arthritis Rheumatol 2021;73:1421–9.
  15. Colls J, Lee YC, Xu C, et al. Patient adherence with a smartphone app for patient‐reported outcomes in rheumatoid arthritis. Rheumatology (Oxford) 2021;60:108–12.
  16. Bingham CO 3rd, Gaich CL, DeLozier AM, et al. Use of daily electronic patient‐reported outcome (PRO) diaries in randomized controlled trials for rheumatoid arthritis: rationale and implementation. Trials 2019;20:182.
  17. Austin L, Sharp CA, van der Veer SN, et al. Providing ‘the bigger picture’: benefits and feasibility of integrating remote monitoring from smartphones into the electronic health record. Rheumatology (Oxford) 2020;59:367–78.
  18. Seppen BF, L'Ami MJ, Duarte Dos Santos Rico S, et al. A smartphone app for self‐monitoring of rheumatoid arthritis disease activity to assist patient‐initiated care: protocol for a randomized controlled trial. JMIR Res Protoc 2020;9:e15105.
  19. Shneiderman B. Designing the user interface: strategies for effective human‐computer interaction. Addison‐Wesley Longman Publishing Co., Inc.; 1997.
  20. Leeb BF, Haindl PM, Maktari A, Nothnagl T, Rintelen B. Patient‐centered rheumatoid arthritis disease activity assessment by a modified RADAI. J Rheumatol 2008;35:1294–9.
  21. Bingham CO III, Gutierrez AK, Butanis A, et al. PROMIS fatigue short forms are reliable and valid in adults with rheumatoid arthritis. J Patient Rep Outcomes 2019;3:14.
  22. Rudin RS, Perez S, Rodriguez JA, et al. User‐centered design of a scalable, electronic health record‐integrated remote symptom monitoring intervention for patients with asthma and providers in primary care. J Am Med Inform Assoc 2021;28:2433–44.
  23. Karlson EW, Liang MH, Eaton H, et al. A randomized clinical trial of a psychoeducational intervention to improve outcomes in systemic lupus erythematosus. Arthritis Rheum 2004;50:1832–41.
  24. Fautrel B, Alten R, Kirkham B, et al. Call for action: how to improve use of patient‐reported outcomes to guide clinical decision making in rheumatoid arthritis. Rheumatol Int 2018;38:935–47.
  25. Rudin RS, Fanta CH, Predmore Z, et al. Core components for a clinically integrated mHealth app for asthma symptom monitoring. Appl Clin Inform 2017;8:1031–43.
  26. Rudin RS, Fanta CH, Qureshi N, et al. A clinically integrated mHealth app and practice model for collecting patient‐reported outcomes between visits for asthma patients: implementation and feasibility. Appl Clin Inform 2019;10:783–93.
  27. Hudson J, Fielding S, Ramsay CR. Methodology and reporting characteristics of studies using interrupted time series design in healthcare. BMC Med Res Methodol 2019;19:137.
  28. Pew Research Center. Mobile fact sheet. URL: https://www.pewresearch.org/internet/fact-sheet/mobile/
  29. Rhoades H, Wenzel S, Rice E, Winetrobe H, Henwood B. No digital divide? Technology use among homeless adults. J Soc Distress Homeless 2017;26:73–7.



