Author manuscript; available in PMC: 2024 Oct 1.
Published in final edited form as: J Nurse Pract. 2023 Sep 4;19(9):104754. doi: 10.1016/j.nurpra.2023.104754

Evaluating an Advanced Practice Provider-Managed Coronavirus Disease 2019 Deterioration Program

Janey Kottler a, Shaveta Khosla a, Vicki Recio a, David Chestek a, Jacqueline Shanks b, Karen Larimer c, Terry Vanden Hoek a
PMCID: PMC10486239  NIHMSID: NIHMS1925676  PMID: 37693741

Abstract

Background:

COVID-19 changed how healthcare systems could safely provide quality healthcare to patients. An urban healthcare system created an advanced practice provider (APP)-managed continuous remote patient monitoring (cRPM) program.

Methods:

A mixed-methods study design was used to assess the usability and feasibility of the cRPM program. Guided interviews with APP and online questionnaires were analyzed.

Results:

APP feedback on the remote monitoring solution was overwhelmingly positive, including its support for providing quality healthcare, detecting early clinical deterioration, and the desire to adapt the solution to other acute or chronic diseases.

Implications:

Understanding clinical users’ feedback on the usability and feasibility of cRPM highlights the significance of rapid clinical assessment, urgent care escalation, and provider accessibility.

Keywords: Advanced practice provider, novel coronavirus 2019, continuous remote patient monitoring, nurse practitioner, feasibility, usability

1.0. Introduction

Healthcare personnel were redeployed to manage patients with novel coronavirus 2019 (COVID-19). Nurse practitioners (NP) adjusted their roles, transitioning to phone triage, outpatient monitoring, COVID-19 units, or testing centers. The NP scope of practice expanded, with many organizations pursuing remote monitoring.1,2 An urban health system developed an advanced practice provider (APP)-managed COVID-19 continuous remote patient monitoring (cRPM) program utilizing a continuous physiologic data analytics monitoring solution with applied artificial intelligence (AI) (physIQ’s pinpointIQ, Chicago, IL). The Personalized Analytics and a Wearable Biosensor Platform for Early Detection of COVID-19 Decompensation (DeCODe) study, supported by the National Institutes of Health (NIH), collected data identifying early COVID-19 decompensation to assist APP in preventing hospitalization or death.3,4 The results of the DeCODe study are published in another article.4

1.1. Background and Significance

Current COVID-19 literature with AI and machine learning analytics mostly focuses on registered nurse (RN) symptom tracking or contact-tracing transmission models.5–7 DeCODe differed from other COVID-19 monitoring programs by being clinically managed by APP (NP and physician assistants [PA]), who responded to AI-generated clinical alerts triggered by unique physiologic changes.3,4

Little has been published on cRPM feasibility or usability; however, several COVID-19 studies have demonstrated positive responses from various healthcare roles using less sophisticated technology.3,5,8–11 Concerns from users included limited utility for non-COVID-19 diagnoses and poor electronic medical record (EMR) and organizational integration.5,10–12

1.2. DeCODe Implementation and Population

APP monitored 1,000 patients between October 2020 and August 2021. Of these, 512 patients were monitored for at least 26 days (the goal being 28 days), and 856 were monitored for 9 to 28 days. Nearly twice as many patients were female as male, and the mean and median ages were 46.5 and 46 years, respectively. Non-Hispanic Black (39.4%) and Latinx (35.9%) populations were overrepresented.

While there are reports of COVID-19 remote monitoring programs utilizing triage RN, to the authors’ knowledge, none assess management by APP. Although APP are more costly than RN, their scope of practice allows for autonomous patient management, including critical decision-making without physician escalation. While backup emergency department (ED) physicians were available, the APP never utilized this resource.

Patients were identified from COVID-19 reports via the EMR or by referral and then consented. Consented patients received monitoring kits including a smart mobile phone, charger, pulse oximeter, five Bluetooth biosensor patches, and an instruction guide. If a patient was very ill during enrollment, the APP could escalate patient care. Supplementary Figure 1 shows the monitoring kit’s contents. Figure 1 shows the DeCODe workflow.

Figure 1. Overview of the clinical operational workflow processes for DeCODe.

During each monitoring shift (8-hour shifts, seven days per week), APP reviewed at least the previous 24 hours of data. Patients with clinical alerts were assessed first. Symptomatology and oxygen levels from the twice-daily symptom surveys were also reviewed. APP contacted patients with any alerts, concerning trends, data lapses, or new symptoms, and escalated care to the ED, urgent care, or primary care as needed. Supplementary Figure 2 shows an example of the clinical user’s monitoring platform interface.

1.3. Purpose

This study’s aim was to determine the usability of an APP-managed cRPM program. Usability was measured by the effectiveness, efficiency, and clinical user satisfaction constructs set by the International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) standard 62366:2007, Definition 3.17.13–15 An additional study aim was to determine feasibility of the program by describing its implementation, practicality, expansion, efficacy, demand, adaptation, integration, and acceptability.16 Assessing cRPM usability and feasibility is essential as the demand for various modalities of health technology surges.

2.0. Methods

To assess usability and feasibility, we used a mixed-methods study design focusing on the DeCODe study clinical users (APP). APP completed a questionnaire that reflected their perceptions, and five of the 14 participating APP were randomly chosen to complete a qualitative guided interview. Additionally, monitoring platform data reports were analyzed to assess clinical user interactions.

Two instruments were developed to determine usability and feasibility. We based our usability evaluation on the framework established by ISO 9241-11.14

The construct of feasibility used the framework by Bowen and colleagues to evaluate implementation, practicality, expansion, efficacy, demand, adaptation, integration, and acceptability.16 Table 1 compiles the definitions, constructs, and instruments measuring usability and feasibility.

Table 1.

Construct definition, variable and instrument table.

STUDY OBJECTIVES AND ENDPOINTS

USABILITY is the degree to which a platform can be used by specified consumers to achieve quantified objectives with effectiveness, efficiency, and satisfaction in a context of use.13 The monitoring solution and the DeCODe program must have a high level of usability to engage the APP in its use and in its ability to obtain accurate and timely information for use in care management. For this evaluation, “users” are the clinical team.

Effectiveness
 Definition: The accuracy and completeness with which users achieve specified goals.13 Effectiveness can be calculated by measuring the completion rate of a required activity in utilization of the platform. The completion rate is calculated by assigning a binary value of ‘1’ if the test user completes a task and ‘0’ if they do not.13
 Instrument: Time from alert to note; question on survey/interview; platform report on timing between alert and note (APP notes completed; APP alerts addressed)

Efficiency
 Definition: The resources expended in relation to the accuracy and completeness with which users achieve goals.13
 Instrument: Question on survey/interview

Test Level Satisfaction
 Definition: The user’s impression of the overall ease of use of the system being tested (comfort and acceptability of use).13
 Instrument: Question on survey/interview

FEASIBILITY is the determination of whether the intervention, the DeCODe program, is relevant and sustainable. For the purposes of this study, we evaluated implementation, practicality, expansion, efficacy, demand, adaptation, integration, and acceptability.16

Implementation
 Definition: To what extent can this program be successfully delivered to intended patients in some defined but not fully controlled context?
 Variables: 1. Degree of execution; 2. Success or failure of execution; 3. Amount and type of resources needed to implement
 Instrument: Number enrolled; number completed; staffing design

Practicality
 Definition: To what extent can the program be carried out with intended patients using existing means, resources, and circumstances and without outside intervention?
 Variables: 1. Factors affecting implementation ease or difficulty; 2. Efficiency, speed, or quality of implementation; 3. Positive/negative effects on target patients; 4. Ability of patients to carry out intervention activities
 Instrument: Question on survey/interview

Expansion
 Definition: To what extent can the program be expanded to a new program or service?
 Variables: 1. Fit with organizational goals/culture; 2. Positive/negative effects on organization; 3. Disruption due to expansion component
 Instrument: Question on survey/interview

Efficacy (see Usability–Effectiveness)
 Definition: Does the new program show promise of being successful with the intended population?
 Variables: 1. Intended effects of program or process on key intermediate variables; 2. Maintenance of change from initial change
 Instrument: Question on survey/interview

Demand
 Definition: To what extent is the program likely to be used again or continue?
 Variables: 1. Actual use (number enrolled vs. declined); 2. Expressed interest or intention to use; 3. Perceived demand
 Instrument: See Implementation; question on survey/interview

Adaptation
 Definition: To what extent does an existing program perform when changes are made for a new format or with a different population?
 Variables: 1. Comparison of process outcomes between intervention use in two populations
 Instrument: Question on survey/interview

Integration
 Definition: To what extent can a new program be integrated within an existing system?
 Variables: 1. Perceived fit with infrastructure; 2. Perceived sustainability
 Instrument: Question on survey/interview

Acceptability (see Usability–Satisfaction)
 Definition: To what extent is a new program judged as suitable or attractive to program deliverers?
 Variables: 1. Satisfaction; 2. Intent to continue use; 3. Perceived appropriateness
 Instrument: Question on survey/interview
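The completion-rate definition used for the effectiveness construct above can be expressed as a short calculation. This is only an illustrative sketch; the function name and sample outcomes are hypothetical and are not study data:

```python
# Effectiveness completion rate: each required task is scored 1 if the
# user completes it and 0 if not; the rate is the share of completed tasks.
def completion_rate(outcomes):
    """outcomes: iterable of 1 (task completed) or 0 (not completed)."""
    outcomes = list(outcomes)
    if not outcomes:
        raise ValueError("no task outcomes recorded")
    return sum(outcomes) / len(outcomes)

# Hypothetical example: 8 of 10 platform tasks completed
print(completion_rate([1, 1, 0, 1, 1, 1, 0, 1, 1, 1]))  # → 0.8
```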

2.1. Assessment and Analysis of Usability and Feasibility

2.1.1. Online Questionnaire

For the questionnaire, the reliable and valid System Usability Scale was used, with additional questions added to assess usability.17 This resulted in a 35-item questionnaire. The sample size was limited by the number of clinicians providing care to patients.

The questionnaire did not allow for a composite score. Each item was assigned a construct, and descriptive statistics were generated.
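Per-item descriptive statistics of this kind can be generated with a simple tally. The responses below are hypothetical, and the score-to-label mapping follows the standard five-point Likert convention rather than the study’s actual data:

```python
from collections import Counter

# Hypothetical Likert responses for one questionnaire item
# (1 = Strongly Disagree ... 5 = Strongly Agree), n = 10 respondents
responses = [5, 5, 4, 5, 4, 5, 5, 3, 5, 4]

LABELS = {5: "Strongly Agree", 4: "Somewhat Agree", 3: "Neutral",
          2: "Somewhat Disagree", 1: "Strongly Disagree"}

# Tally each response level and report it as a percentage of respondents
counts = Counter(responses)
for score in sorted(LABELS, reverse=True):
    pct = 100 * counts.get(score, 0) / len(responses)
    print(f"{LABELS[score]}: {pct:.0f}%")
```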

2.1.2. Guided Interview

To obtain APP’s opinions and experiences of cRPM, one-third of the APP were invited to complete a semi-structured interview. Questions were developed collaboratively by the research team (clinicians experienced in both patient care and the monitoring solution) and reflected the aforementioned concepts.

A researcher trained in interview techniques conducted all interviews. A nondirective interview style was used, allowing open-ended questions and freedom of subject matter. Probes and follow-up questions were used when indicated, allowing new constructs and ideas to emerge. These de-identified guided interview transcriptions were the primary source for data analysis.

We completed a systematic analysis of the interview transcripts, identifying themes while coding. After the interviews were transcribed, two researchers working independently created construct codebooks. The research team reviewed the codebooks and collaborated to achieve consensus, creating an overview description of the significant themes.

2.1.3. Platform Data

To better understand the timeliness of user interaction and clinical usability, we queried platform data reports. Reports such as calculated time from clinical alert notification to APP alert note documentation assisted in assessing effectiveness.

The measures of central tendency were assessed for all clinical events (N=3,387). Further, we analyzed the response time for clinical events that occurred between 08:00 AM and 04:00 PM (typical APP monitoring times on the platform, 7 days per week) and between 08:00 AM and 08:00 PM (typical APP work hours).
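The central-tendency analysis can be sketched as follows. The event timestamps here are hypothetical, and the study’s actual figures came from platform data reports rather than this code:

```python
from datetime import datetime
from statistics import mean, median

# Hypothetical clinical events: (alert generated, APP note documented)
events = [
    (datetime(2021, 3, 1, 9, 0),  datetime(2021, 3, 1, 22, 0)),
    (datetime(2021, 3, 1, 17, 0), datetime(2021, 3, 2, 10, 0)),
    (datetime(2021, 3, 2, 11, 0), datetime(2021, 3, 3, 2, 30)),
]

# Response time in days, the unit reported in the study
days = [(note - alert).total_seconds() / 86400 for alert, note in events]
print(f"mean={mean(days):.2f} d (~{mean(days) * 24:.0f} h), "
      f"median={median(days):.2f} d")

# Restrict to events that entered the platform during typical monitoring
# hours (08:00 AM-04:00 PM)
window = [(note - alert).total_seconds() / 86400
          for alert, note in events if 8 <= alert.hour < 16]
print(f"n={len(window)} events entered during monitoring hours")
```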

3.0. Results

3.1. Demographics

Of the 14 APP who worked with the monitoring solution, nine came from Urgent Care or the ED. Four APP came from the College of Nursing or the Clinical Decision-Making Unit. Eight were trained as family NP, three as acute care adult-gerontology NP, and three as urgent care PA. Ten APP completed the online questionnaire; five completed the guided interviews.

3.2. Online Questionnaire

Survey respondents were predominantly female (90%). Almost all were NP. Mean age was 41 (sd=6) years. Half of the respondents (50%) were very comfortable with their general knowledge of computers and technology, while 40% were somewhat comfortable. The APP varied in clinical experience, ranging from four to 22 years, with a mean of 10 (sd=5.5) years. Table 2 reports the results of the online questionnaire on a Likert scale, highlighting each question’s construct(s).

Table 2.

The Monitoring Solution User Questionnaire – DeCODe Use Case with Constructs and Results.

Strongly Agree Somewhat Agree Neutral Somewhat Disagree Strongly Disagree Construct
I needed to learn a lot of things before I could start using the monitoring solution. 10% 20% 20% 40% 10% Practicality Efficiency
The monitoring solution sometimes reminded me of things I otherwise would have forgotten. 50% 50% Satisfaction Acceptability Efficacy
The monitoring solution makes documentation easier. 90% 10% Satisfaction Acceptability Practicality Efficacy Integration
The monitoring solution often makes mistakes. 50% 20% 30% Satisfaction Acceptability
The monitoring solution has uncovered issues with patients that I might not otherwise have found out about. 20% 50% 30% Satisfaction Acceptability Efficacy
I would rather not use monitoring solution. 50% 50% Satisfaction Acceptability Expansion Demand
I felt I received adequate training to competently use the monitoring solution. 60% 30% 10% Practicality Integration
Patient privacy and confidentiality was well maintained using the monitoring solution. 80% 20% Efficacy Satisfaction Acceptability
Technical support for the monitoring solution was very good. 70% 20% 10% Satisfaction Acceptability Practicality
The monitoring solution made lots of errors. 10% 10% 70% 10% Satisfaction Acceptability Efficacy
The monitoring solution improved quality of care. 30% 70% Satisfaction Acceptability Practicality Efficacy
The monitoring solution integrated well with the EMR. 40% 20% 30% 10% Satisfaction Acceptability Practicality Expansion Integration
The monitoring solution did things for me that the EMR could not. 30% 30% 40% Satisfaction Acceptability Practicality Expansion Efficacy
The monitoring solution provided early detection of patient symptoms. 20% 80% Practicality Efficacy Satisfaction Acceptability
The monitoring solution helped me identify when a patient was getting sick from COVID-19. 50% 50% Satisfaction Acceptability Practicality Efficacy
The monitoring solution was a good remote monitoring platform to use in keeping an eye on patients with COVID-19. 80% 20% Acceptability Practicality Efficacy Satisfaction Demand
If I had the opportunity to use the monitoring solution again, I would be happy to use it. 80% 20% Acceptability Expansion Satisfaction Demand
The monitoring solution fit well into the workflow of managing the patients in this study. 70% 30% Integration Practicality Expansion Satisfaction Acceptability
The monitoring solution is a platform I could see using for a long time to manage patients. 60% 40% Integration Expansion Satisfaction Acceptability Demand
I think the health system should investigate using the monitoring solution in different cases (e.g., heart failure, COPD, etc.) 70% 30% Demand Expansion Integration Satisfaction Acceptability
I am glad the monitoring solution was available to use in managing patients with COVID-19. 80% 20% Demand Satisfaction
I think that I would like to use the monitoring solution frequently. 40% 40% 20% Expansion Satisfaction Acceptability Demand
I found the monitoring solution unnecessarily complex. 10% 60% 30% Practicality Satisfaction Acceptability
I thought the monitoring solution was easy to use. 50% 40% 10% Practicality Satisfaction
I felt very confident using the monitoring solution. 60% 40% Acceptability Satisfaction
I was able to quickly assess my patient’s health status prior to speaking with them because I could see their data in the monitoring solution. 50% 50% Practicality
I thought there was too much inconsistency in the monitoring solution. 20% 70% 10% Acceptability Satisfaction
I would imagine that most healthcare professionals could learn to use the monitoring solution very quickly. 50% 40% 10% Acceptability Satisfaction
I found the monitoring solution very cumbersome to use. 60% 40% Acceptability Satisfaction
The team was able to manage more patients effectively using the monitoring solution than if usual methods were used. 40% 60% Acceptability Efficiency
I got very good at using the monitoring solution and was able to review patient data accurately and completely in the platform. 60% 30% 10% Efficiency
Using the monitoring solution I was able to review large amounts of patient data quickly. 60% 40% Efficiency
Using the monitoring solution is something that could be done with patients with other chronic diseases. 70% 30% Adaptation
I could see using the monitoring solution in managing patients with other acute illnesses. 70% 30% Adaptation
The monitoring solution was a valuable tool in the management of patients with COVID-19. 80% 20% Satisfaction Acceptability

3.3. Guided Interview

The guided interview transcripts analyses were broadly categorized into two themes: (1) positives, and (2) challenges and need for improvement.

3.3.1. Theme 1: Positives

This theme relates to the APPs’ positive experiences.

Overall Experience

All the APP reported an overall positive experience managing patients through the monitoring solution. All stated the majority of patients remained engaged and appreciative. The APP used the platform to review and compare vital signs as a positive clinical decision-making tool but needed to combine the physiologic data with subjective information from the surveys to make decisions.

Interview 1 (I1): “It was pretty good, patients were appreciative of what we’re doing. So, they were really kind of grateful that we had something like this.”

The APP found the monitoring solution user-friendly, with most acknowledging its benefits to the APP clinical role.

I5: “I thought it was very easy to work with once we got the flow of it and it was not hard at all.”

Daily Patient Questionnaires/Built-in Alerts

All APP reported that the alerts and questionnaires were positive tools, allowing them to monitor patterns. Providers reported that the questionnaires, in conjunction with clinical alerts, reduced patient calls and encouraged patient prioritization.

I4: “They were the most helpful throughout the entire thing, and I used them daily for every patient. So those were kind of like my bread and butter to see how the patient was actually doing.”

Impacts

All APP found the system effective in early deterioration detection, allowing care escalation or preventing unnecessary ED trips and hospital admissions. A few providers also noted the monitoring solution gave patients a sense of security while they recovered.

I3: “by giving the patient reassurance, because we have solid data in how they are doing, it prevents them from needlessly going to the emergency room.”

Patient Privacy and Confidentiality

Providers did not have any issues regarding patient privacy and confidentiality.

3.3.2. Theme 2: Challenges/Scope for Improvement

This theme relates to the areas that need improvement as experienced by the clinical team.

Technical Difficulties

Some APP reported technical difficulties but described an easy learning curve. The technological incidents were minor, and most APP reported easy resolution.

I4: “The frustrating part was more the technical aspect as an NP… so there was kind of a learning curve in the beginning enrolling patients and getting them on the patch. Once I got used to the kit it was straightforward.”

Suggested Changes

APP requested a hover feature to allow quick review of important information, including a medication list. They also requested more patient information on the platform, or a link to the EMR, to avoid toggling between the platform and the EMR and to improve accessibility and proficiency. Providers also recommended improving some of the clinical alerts.

Patient Communication

Contacting patients was sometimes challenging. Language barriers were not considered a significant problem as interpreter services filled the gap. Sometimes, patients’ ages and/or technology savviness impacted communication.

I2: “the only challenges would be not being able to get ahold of patients. Some people are great about answering the phone, others were not.”

3.4. Platform Data

Data regarding the time from a clinical event alert to when the APP documented a note for the clinical event (N=3,387) were analyzed. The overall mean was 0.67 (sd=0.37, median=0.66) days, or about 16 hours. Because the APP typically performed monitoring checks on the platform from 08:00 AM to 04:00 PM, 7 days per week, a deeper analysis focused on clinical events that came through during those hours, where the mean response time was 0.82 (sd=0.41, median=0.87) days, or about 20 hours (n=1,143). Events that entered the platform between 08:00 AM and 08:00 PM had a mean response time of 0.79 (sd=0.39, median=0.80) days, or about 19 hours (n=1,985).

4.0. Discussion

Guided interviews and online questionnaires revealed both positive and negative experiences of clinical users utilizing cRPM in a real-use patient care scenario.

4.1. Usability

Clinical alerts were addressed on average 16 hours after generation. APP typically performed monitoring duties in 8-hour shifts and focused on addressing alerts that appeared within the previous 24 hours. Thus, alerts that fired later in a shift may not have been addressed until the next day. Considering the APP worked 7 days a week and patients were notified that the platform would be checked once daily, the turnaround time for addressing clinical alerts is appropriate. It is also possible that the APP documented the clinical note later than the patient contact, skewing the response time results.

While 30% of clinical users agreed there was a learning curve, this was the first time the monitoring solution had been used for a clinical use case. All users felt they could review large amounts of patient data accurately and completely, demonstrating the solution’s efficiency. At one point, more than 140 patients were being monitored simultaneously, demonstrating that a large patient volume can be assessed efficiently while maintaining quality.

The interviewed APP expressed that the alerts and daily questionnaires assisted with prioritizing a large patient population quickly and confidently. All clinical users felt it was a valuable tool in managing COVID-19 patients, and most felt they would like to use cRPM for other diseases daily, demonstrating the high level of user satisfaction.

4.2. Feasibility

The DeCODe team rapidly created and implemented a novel cRPM program at five US healthcare institutions with minimal workflow interruptions. Completion of the monitoring program was defined as monitoring for at least 26 days (the goal being 28 days), which 51.2% of enrollees achieved; 85.6% of DeCODe patients were monitored between nine and 28 days. This gave clinicians an extended window in which to follow compliant patients.

Both COVID-19 patients and APP experienced a sense of security as part of this study. APP felt secure knowing that the monitoring platform could provide a safety net when providing clinical care, with most users feeling strongly there were limited errors.

Most clinical users were pleased with the reminder alert function and ease of documentation, demonstrating the solution’s efficacy in this high-stress, high-patient-volume healthcare setting. Though a helpful resource for COVID-19, the APP felt strongly that a similar program could be developed for various diseases, acute or chronic.

Technical difficulties usually occurred among users new to the technology. Language barriers, as well as technology and communication barriers among elderly patients, were addressed by using video calls or an interpreter. EMR integration, which the APP recommended to support integration and expansion, would also have reduced toggling and clicking.

4.3. Limitations

There were some limitations to the study methods. Because additional questions were added to the System Usability Scale, the resulting tool was modified and had only face validity. Also, the study sample size was small, without a priori sample size determination.

Another limitation is with monitoring platform data analytics. The time that the APP addressed the clinical alert was defined as the time the clinical note was documented. While the APP was expected to document the time patient outreach occurred, documentation may have occurred later, thus the documented note time may not accurately reflect the correspondence time, potentially overestimating the clinical alert response time.

5.0. Conclusion

cRPM solutions can be feasible and usable systems for APP caring for patients experiencing either acute or chronic illnesses. Remote monitoring solutions can be effectively deployed to patients across multiple healthcare systems. cRPM offers widespread adaptability and generalizability across various users, locations, and illnesses. Health systems must continue to adapt their operational models in response to rapidly changing and unpredictable circumstances. Clinicians can utilize cRPM to allow for timely clinical decision-making, prompt accessibility between patients and providers, and consistent outpatient medical observation.

NP can be remote monitoring champions because of their clinical and leadership skillsets. Other institutions can develop programming such as DeCODe to encourage healthcare provider collaboration, quality cRPM, and limiting the spread of various diseases.18,19 Further research is necessary to understand the true cost-effectiveness and outcome impact of APP-led cRPM programming.

Supplementary Material

Supplementary Figure 1. Contents of monitoring kit (pulse oximeter not shown but included).

Supplementary Figure 2. Example of pinpointIQ monitoring platform. Alerts turn “green” when addressed by the APP; new alerts appear as “red”, notifying the clinical user that an alert needs to be addressed.

Manuscript highlights:

  1. This use-case shows the usability and feasibility of an advanced practice provider (APP)-managed continuous remote patient monitoring (cRPM) solution.

  2. cRPM programs can be expanded to acute and chronic diseases to evaluate patients remotely or in person.

  3. APP can utilize telemedicine while still providing safe and high-quality healthcare to patients remotely.

Acknowledgements

All authors are employed by either the NIH grant awardee or a sub-awardee. One author previously held stock options with the awardee but does not presently. All authors have worked on this NIH-funded research through their employing institutions.

Funding/Support

This project has been funded in whole or in part with Federal funds from the National Institutes of Health, Department of Health and Human Services, under Contract No. 75N91020C00040. This study was registered on clinicaltrials.gov (NCT04575532).

ABBREVIATIONS:

APP

Advanced practice providers

AI

Artificial intelligence

cRPM

Continuous remote patient monitoring

COVID-19

Coronavirus disease 2019

EMR

Electronic medical record

ED

Emergency department

IEC

International Electrotechnical Commission

ISO

International Organization for Standardization

NIH

National Institutes of Health

NP

Nurse practitioners

DeCODe

Personalized Analytics and a Wearable Biosensor Platform for Early Detection of COVID-19 Decompensation

PA

Physician assistants

RN

Registered nurses

US

United States

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

References

  • 1.Stucky CH, Brown WJ, Stucky MG. Covid 19: An unprecedented opportunity for nurse practitioners to reform healthcare and advocate for Permanent Full Practice Authority. Nursing Forum. 2020;56(1):222–227. doi: 10.1111/nuf.12515 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Golinelli D, Boetto E, Carullo G, Nuzzolese AG, Landini MP, Fantini MP. Adoption of digital technologies in health care during the COVID-19 pandemic: Systematic Review of early scientific literature. Journal of Medical Internet Research. 2020;22(11). doi: 10.2196/22280 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Larimer K, Wegerich S, Splan J, Chestek D, Prendergast H, Vanden Hoek T. Personalized analytics and a wearable biosensor platform for early detection of COVID-19 decompensation (decode): Protocol for the development of the COVID-19 decompensation index. JMIR Research Protocols. 2021;10(5). doi: 10.2196/27271 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Richards DM, Tweardy MJ, Steinhubl SR, et al. Wearable sensor derived decompensation index for continuous remote monitoring of COVID-19 diagnosed patients. npj Digital Medicine. 2021;4(1). doi: 10.1038/s41746-021-00527-z [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Lim HM, Abdullah A, Ng CJ, et al. Utility and usability of an automated COVID-19 Symptom Monitoring System (COSMOS) in primary care during COVID-19 pandemic: A qualitative feasibility study. International Journal of Medical Informatics. 2021;155:104567. doi: 10.1016/j.ijmedinf.2021.104567 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Adly AS, Adly AS, Adly MS. Approaches based on Artificial Intelligence and the internet of intelligent things to prevent the spread of covid-19: Scoping review. Journal of Medical Internet Research. 2020;22(8). doi: 10.2196/19104 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Budd J, Miller BS, Manning EM, et al. Digital Technologies in the public-health response to COVID-19. Nature Medicine. 2020;26(8):1183–1192. doi: 10.1038/s41591-020-1011-4 [DOI] [PubMed] [Google Scholar]
  • 8.Kricke G, Roemer PE, Barnard C, et al. Rapid Implementation of an Outpatient Covid-19 Monitoring Program. NEJM Catalyst. Published online June 16, 2020:1–12. doi: 10.1056/CAT.20.0214 [DOI] [Google Scholar]
  • 9.Bouabida K, Malas K, Talbot A, et al. Healthcare professional perspectives on the use of remote patient-monitoring platforms during the COVID-19 pandemic: A cross-sectional study. Journal of Personalized Medicine. 2022;12(4):529. doi: 10.3390/jpm12040529 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Alsharif AH. Cross sectional E-health evaluation study for telemedicine and M-health approaches in monitoring COVID-19 patients with chronic obstructive pulmonary disease (COPD). International Journal of Environmental Research and Public Health. 2021;18(16):8513. doi: 10.3390/ijerph18168513 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Shah SS, Safa A, Johal K, Obika D, Valentine S. A prospective observational real world feasibility study assessing the role of APP-based remote patient monitoring in reducing primary care clinician workload during the covid pandemic. BMC Family Practice. 2021;22(1). doi: 10.1186/s12875-021-01594-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Bouabida K, Malas K, Talbot A, et al. Remote Patient Monitoring Program for COVID-19 patients following hospital discharge: A cross-sectional study. Frontiers in Digital Health. 2021;3. doi: 10.3389/fdgth.2021.721044 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Mifsud J. Usability metrics - a guide to quantify the usability of any system. Usability Geek. September 13, 2019. Accessed June 13, 2022. https://usabilitygeek.com/usability-metrics-a-guide-to-quantify-system-usability/. [Google Scholar]
  • 14.ISO 9241–11:2018 - Ergonomics of human-system interaction — Part 11: Usability: Definitions and concepts. International Organization of Standardization (ISO). March 2018. Accessed March 2022. https://www.iso.org/standard/63500.html. [Google Scholar]
  • 15.IEC 62366:2007(EN), Medical devices — Application of usability engineering to medical devices. International Electrotechnical Commission (IEC). 2007. Accessed March 2022. https://www.iso.org/obp/ui/#!iso:std:iec:62366:ed-1:v1:en. [Google Scholar]
  • 16.Bowen DJ, Kreuter M, Spring B, et al. How we design feasibility studies. American Journal of Preventive Medicine. 2009;36(5):452–457. doi: 10.1016/j.amepre.2009.02.002 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Brooke J. A quick and dirty usability scale. ResearchGate. November 1995. Accessed March 2022. https://www.researchgate.net/publication/228593520_SUS_A_quick_and_dirty_usability_scale.
  • 18.Taiwo O, Ezugwu AE. Smart Healthcare support for remote patient monitoring during covid-19 Quarantine. Informatics in Medicine Unlocked. 2020;20:100428. doi: 10.1016/j.imu.2020.100428 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Gordon WJ, Henderson D, DeSharone A, et al. Remote Patient Monitoring Program for hospital discharged COVID-19 patients. Applied Clinical Informatics. 2020;11(05):792–801. doi: 10.1055/s-0040-1721039 [DOI] [PMC free article] [PubMed] [Google Scholar]
