Author manuscript; available in PMC: 2017 Jan 1.
Published in final edited form as: Acad Med. 2017 Jan;92(1):87–91. doi: 10.1097/ACM.0000000000001276

The Electronic Health Record Objective Structured Clinical Examination: Assessing Student Competency in Patient Interactions While Using the Electronic Health Record

Frances E Biagioli 1, Diane L Elliot 2, Ryan T Palmer 3, Carla C Graichen 4, Rebecca E Rdesinski 5, Kaparaboyna Ashok Kumar 6, Ari B Galper 7, James W Tysinger 8
PMCID: PMC5177541  NIHMSID: NIHMS799379  PMID: 27332870

Abstract

Problem

Because many medical students do not have access to electronic health records (EHRs) in the clinical environment, simulated EHR training is necessary. Explicitly training medical students to use EHRs appropriately during patient encounters equips them to engage patients while also attending to the accuracy of the record and contributing to a culture of information safety.

Approach

Faculty developed and successfully implemented an EHR objective structured clinical examination (EHR-OSCE) for clerkship students at two institutions. The EHR-OSCE objectives include assessing EHR-related communication and data management skills.

Outcomes

The authors collected performance data for students (n = 71) at the first institution during academic years 2011–2013 and for students (n = 211) at the second institution during academic year 2013–2014. EHR-OSCE assessment checklist scores showed that students performed well on EHR-related communication tasks, such as maintaining eye contact and stopping all computer work when the patient expressed worry. Findings indicated student EHR skill deficiencies in the areas of EHR data management, including medical history review, medication reconciliation, and allergy reconciliation. Most students’ EHR skills failed to improve as the year progressed, suggesting that they did not gain the EHR training and experience they need in clinics and hospitals.

Next Steps

Cross-institutional data comparisons will help determine whether differences in curricula affect students’ EHR skills. National and institutional policies and faculty development are needed to ensure that students receive adequate EHR education, including hands-on experience in the clinic as well as simulated EHR practice.

Problem

Medical educators have widely acknowledged the importance of training medical students to appropriately use electronic health records (EHRs).1,2 EHR competencies fall into two general domains. The first involves how effectively the clinician interacts with the EHR data (charting, ordering, etc.). Milano and colleagues3 have described a curriculum addressing these data management aspects of EHR use. A second, more challenging domain involves communication. This domain entails establishing and maintaining patient rapport while using the EHR. Using an EHR in the exam room can detract from or add to the patient experience4; thus, measures of, and training in, this latter competency domain are needed.5

Prior to EHRs, students updated chart histories, revised medication and allergy lists as appropriate, and wrote notes and orders that a supervising attending reviewed and signed. Since the advent of EHRs, students’ opportunities to work with patient medical records have been influenced by rotations at sites with different EHR platforms, by supervising physicians’ differing levels of comfort with and ability to use EHRs, and by policies that restrict student EHR use.6,7 A 2009 survey of clerkship directors showed that many students lacked full EHR access. To illustrate, only about a fourth of directors (27%) reported that students could use the EHR to view patient records, write notes, and enter orders (which a supervising physician had to cosign); less than half (41%) allowed students to view the EHR and write notes; and nearly a third (32%) allowed students view-only access.1 Limiting students’ access to the EHR diminishes their potential contributions to patient care teams.

Simulation allows students to use EHRs when clinical access is inadequate. Morrow and colleagues8 have described an EHR-specific, communication-skills-related objective structured clinical examination (OSCE). Using a sim-EHR, we extended Morrow and colleagues’ work to include EHR-related data management skills while interacting with patients. We describe the development, performance data, and participant perceptions of this EHR-OSCE at two institutions.

Approach

After a literature review, Oregon Health & Science University (OHSU) faculty created a list of EHR competencies (Table 1). Next, faculty created the EHR-OSCE scenario materials, including a rater assessment checklist (Table 2) that expanded on Morrow and colleagues’ work.8 OHSU piloted the EHR-OSCE with third-year family medicine (FM) clerkship students from January through July 2010 and incorporated it into this required FM clerkship in 2011. In 2013 the University of Texas Health Science Center at San Antonio (UTHSCSA) adopted the EHR-OSCE as part of its FM clerkship. The institutional review boards at both OHSU and UTHSCSA approved all the procedures necessary for the studies mentioned in this report.

Table 1.

Electronic Health Record (EHR)-Related Competencies for Medical Students, Listed by ACGME Competency Area

Each ACGME competency areaa below is followed by its EHR-related measures and the domain to which they map.

Interpersonal and communication skills (domain: communication)
  • Establish/maintain rapport with patients while using EHRs
  • Use EHRs to enhance patient interactions

Professionalism (domain: communication)
  • Attend to patient needs/concerns ahead of computing tasks when using EHRs
  • Maintain a professional demeanor while interacting with the patient while using EHRs

Medical knowledge/patient care (domain: data management)
  • Locate/interpret lab results in a patient’s EHR
  • Identify portions of an EHR critical to the safe care of the patient
  • Write and enter an EHR prescription

Systems-based practice (domain: data management)
  • Identify discrepancies between information provided by a patient and the patient’s EHR
  • Implement a routine method to reconcile EHR content with patients

Abbreviation: ACGME indicates Accreditation Council for Graduate Medical Education.

a Competencies are from the Accreditation Council for Graduate Medical Education (ACGME) common program requirements, IV.A.5. ACGME competencies. Effective July 1, 2007. http://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRs_07012015.pdf. Accessed April 26, 2016.

Table 2.

EHR-OSCE Rater Assessment Checklist and Student Performancea

Item no. and description, followed by the no. (%) of students who completed the item correctly at OHSU (n = 71) and at UTHSCSA (n = 211)
General
 1. Introduced self to the patient 70 (98.6) 210 (99.5)
 2. Used words and terms understandable to the patient 67 (94.4) 196 (92.9)
 3. Initially asked open-ended questions 63 (88.7) 208 (98.6)
 4. Allowed the patient to answer questions 70 (98.6) 209 (99.1)
 5. Asked questions in a nonjudgmental manner 69 (97.2) 211 (100.0)
 6. Summarized patient history 44 (62.0) 175 (82.9)
 7. Showed listening body language 67 (94.4) 207 (98.1)
 8. Appeared poised, professional, and confident even if frustrated by the EHR 68 (95.8) 206 (97.6)
 9. Clearly communicated the next step in patient care to the patient 66 (93.0) 193 (91.5)
EHR-related communication skills
 10. Established rapport BEFORE turning to the computerb 47 (66.2) 207 (98.1)
 11. Acknowledged/introduced the need to use the computer during the visitb 69 (97.2) 179 (84.8)
 12. Moved the computer/patient to facilitate communication while using the EHR (i.e., constructed a doctor/patient/computer triangle)b 64 (90.1) 177 (83.9)
 13. Maintained eye contact intermittently with the patient despite using the EHR 66 (93.0) 211 (100.0)
 14. Seemed supportive and concerned despite using the EHR 61 (85.9) 201 (95.3)
 15. Explained what he/she was doing in the EHR when or before typing/looking in the computerb 67 (94.4) 185 (87.7)
 16. Visually or verbally shared EHR information (i.e., included the patient in the computer work)b 67 (94.4) 195 (92.4)
 17. Visually or verbally shared EHR test results with the patient 67 (94.4) 189 (89.6)
EHR-related data management skills
 18. Verified the patient’s allergy to sulfa in the EHR and chose a medication other than trimethoprim/sulfamethoxazole 42 (59.2) 201 (95.3)
 19. Reviewed EHR medication list with the patient 27 (38.0) 186 (88.2)
 20. Reviewed EHR past medical history or problem list with the patient 15 (21.1) 165 (78.2)
 21. Reviewed the patient’s social history including sexual history (either by looking in the EHR or asking) 37 (52.1) 108 (51.2)
 22. Entered a prescription for the patient’s UTI 62 (87.3) 49c (90.7)

Abbreviations: EHR indicates electronic health record; OSCE, objective structured clinical exam; OHSU, Oregon Health & Science University; UTHSCSA, University of Texas Health Science Center at San Antonio; UTI, urinary tract infection.

a The authors developed and implemented this EHR-OSCE at OHSU in academic years 2011–2013 and at UTHSCSA in academic year 2013–2014.

b Derived from Morrow JB, Dobbie AE, Jenkins C, Long R, Mihalic A, Wagner J. First-year medical students can demonstrate EHR-specific communication skills: A control-group study. Fam Med. 2009;41:28–33.

c At UTHSCSA, 54 students were evaluated using this item’s exact wording. To better capture the faculty grading instructions, the item was reworded to “Entered a prescription for the appropriate patient, medication, strength, and dosage”; 145 of the 157 students evaluated with the reworded item performed it correctly.

Logistics

During the EHR-OSCE, students take a focused history, review EHR data with a standardized patient (SP), share their assessment with the “patient,” and enter a prescription in the EHR. The SP portrays a woman with a bladder infection, and the student accesses the sim-EHR on the in-room computer. A faculty member observes the encounter either from a chair in a corner of the room (OHSU) or from behind a two-way mirror (UTHSCSA). The faculty observer rates student performance by marking the assessment checklist during the encounter and by reviewing the accuracy of the EHR prescription (medication choice, dosage, instructions for taking the medication, and duration) after the EHR-OSCE. One faculty member–SP dyad observes students in serial fashion as they rotate through the EHR-OSCE station. A 17-minute cycle allows students 10 minutes to read and conduct the scenario, 2 minutes to wrap up conversations with the SP, and 5 minutes to receive feedback from both the SP and faculty member.
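For programs adapting these logistics, a minimal Python sketch (the start time and student count below are illustrative assumptions, not study parameters) shows how the 17-minute cycle translates into serial station slots:

```python
from datetime import datetime, timedelta

# Segment lengths of the EHR-OSCE cycle described above.
ENCOUNTER = timedelta(minutes=10)  # read and conduct the scenario
WRAP_UP = timedelta(minutes=2)     # wrap up the conversation with the SP
FEEDBACK = timedelta(minutes=5)    # feedback from the SP and faculty member
CYCLE = ENCOUNTER + WRAP_UP + FEEDBACK  # 17 minutes per student

def station_slots(first_start, n_students):
    """Yield (start, end) windows for students seen in series at one station."""
    for i in range(n_students):
        start = first_start + i * CYCLE
        yield start, start + CYCLE

# Example: four students starting at 8:00 a.m. (illustrative).
for start, end in station_slots(datetime(2014, 1, 6, 8, 0), 4):
    print(start.strftime("%H:%M"), "to", end.strftime("%H:%M"))
```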

The EHR-OSCE entails a straightforward diagnosis to assess students’ EHR skills rather than their medical knowledge. The scenario always starts with the computer and a chair positioned flush with the wall as far from the patient as possible (i.e., if a student has moved the furniture or if the SP has moved during a prior OSCE, the SP and furniture return to their starting locations before a new encounter begins); this positioning allows the raters to assess whether students move the computer monitor to foster patient engagement (see Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A373). Faculty observers assess chart review and medication reconciliation skills by having SPs divulge portions of their medical history only when the student reviews specific parts of the sim-EHR chart with them. For example, SPs will disclose that they have stopped taking oral contraceptives in an effort to become pregnant only if students ask if the SP is still taking the oral contraceptive listed in the EHR. SP training comprises both written case material and rehearsal. Faculty assessors receive case materials in advance and are trained by the lead program faculty. At the beginning of the clerkship, students are told that one OSCE station requires EHR use.
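For programs building a similar rater form, a minimal sketch (in Python; the field names and item wording are illustrative, not our instrument verbatim) shows how checklist items like those in Table 2 could be represented and tallied by domain:

```python
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    number: int
    description: str
    domain: str          # "general", "communication", or "data management"
    completed: bool = False

# Three items paraphrased from Table 2 (illustrative subset).
items = [
    ChecklistItem(10, "Established rapport before turning to the computer",
                  "communication", completed=True),
    ChecklistItem(18, "Verified sulfa allergy; avoided trimethoprim/sulfamethoxazole",
                  "data management", completed=False),
    ChecklistItem(19, "Reviewed EHR medication list with the patient",
                  "data management", completed=True),
]

def domain_scores(items):
    """Return the fraction of items completed correctly in each domain."""
    totals = {}
    for item in items:
        done, seen = totals.get(item.domain, (0, 0))
        totals[item.domain] = (done + int(item.completed), seen + 1)
    return {domain: done / seen for domain, (done, seen) in totals.items()}

print(domain_scores(items))  # {'communication': 1.0, 'data management': 0.5}
```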

Institutional differences

UTHSCSA includes the EHR-OSCE in the summative assessment that contributes to students’ overall clerkship grades, whereas OHSU uses it as a formative assessment that does not affect students’ grades. Additionally, at UTHSCSA, students receive some preclinical general EHR skills instruction and, later, during the FM clerkship, are taught EHR-related patient-interaction skills. OHSU students receive their EHR training during their second year, and, unlike at UTHSCSA, the EHR skills that this OSCE tests are not explicitly taught during the third-year clerkships.

Resources

Critical requirements in developing the EHR-OSCE included infrastructure for a realistic sim-EHR and skilled information technology (IT) personnel who programmed the sim-EHR charts in a training EHR environment. A training EHR environment (also known as a sandbox EHR) is identical to the EHR used in patient care but houses only simulated patient charts. Using the training EHR environment ensures that automatic updates occur in step with the hospital’s EHR platform (both institutions use Epic; Epic Systems Corporation, Verona, Wisconsin) and provides a place for students to practice EHR skills without accessing actual patient records.
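As a purely hypothetical illustration (this is not Epic’s chart-build format; the field names, and any details beyond the scenario’s sulfa allergy, listed oral contraceptive, and UTI, are invented), a sim-EHR seed chart for this scenario must encode at least the allergy flag and the deliberately outdated entries the student is expected to reconcile:

```python
# Hypothetical sim-EHR seed record for the EHR-OSCE scenario.
# Field names and values beyond the scenario details given in the
# text (sulfa allergy, listed oral contraceptive, UTI) are invented.
sim_chart = {
    "patient": {"id": "SIM-0001", "sex": "F"},
    "allergies": [
        # Drives the EHR warning flag the student must notice (item 18).
        {"agent": "sulfa", "reaction": "rash", "alert": True},
    ],
    "medications": [
        # Deliberately outdated: the SP has stopped this medication,
        # so the student must reconcile it during the encounter.
        {"name": "oral contraceptive", "status": "active"},
    ],
    "problem_list": ["example prior diagnosis"],
    "labs": [{"test": "urinalysis", "result": "consistent with UTI"}],
}
```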

Additional necessary resources included a moveable, Internet-enabled computer with EHR software (e.g., a computer mounted on a moveable arm or a laptop on a wheeled table); faculty who are comfortable with and skilled in using EHRs; and institutional policies supporting student EHR use. The greatest hurdles were garnering institutional support for student-friendly policies that grant students EHR access and securing funding for IT support. Collaboration between the institutions allowed us to surmount those barriers by sharing IT resources and student EHR use policies. For example, OHSU’s IT lead was available to train UTHSCSA’s IT personnel and field their questions as they implemented the project.

Outcomes

We provide performance data from OHSU students who experienced the OSCE in academic years 2011–2013 (n = 71) and from UTHSCSA students who experienced it in academic year 2013–2014 (n = 211; see Table 2). All third-year UTHSCSA students participated, and all were graded by four trained faculty observers. Only a subset of OHSU students participated in the EHR-OSCE (some were randomly assigned other stations), and we report assessment data from the faculty member who conducted the majority of the OHSU observations. At the end of each OSCE day, using a convenience sampling method, two faculty members (one from each institution; F.E.B. and J.W.T.) collected participant comments about the experience.

Using the EHR-OSCE assessment data, we identified areas in which students performed well and areas in which their abilities were lacking. Students at both institutions performed well on some items in the EHR-related communication domain: Over 90% maintained eye contact while using the EHR (n = 277 across both schools) and shared information from the EHR with the patient (n = 262 across both schools).
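Those pooled figures follow directly from the Table 2 counts; as a quick check (a sketch using the published counts):

```python
# Counts from Table 2: item 13 (eye contact) and item 16 (shared EHR info).
n = {"OHSU": 71, "UTHSCSA": 211}
eye_contact = {"OHSU": 66, "UTHSCSA": 211}
shared_info = {"OHSU": 67, "UTHSCSA": 195}

def pooled(correct):
    """Pooled count and percentage across both schools."""
    total_correct = sum(correct.values())
    total_n = sum(n.values())
    return total_correct, round(100 * total_correct / total_n, 1)

print(pooled(eye_contact))  # (277, 98.2)
print(pooled(shared_info))  # (262, 92.9)
```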

The EHR skill areas in which students performed less well included (1) confirming the EHR medical record data with the patient and (2) using the EHR in a manner that establishes patient rapport and fosters patient engagement.

Confirming EHR data and other EHR-related data management skills

As mentioned, students at both schools could improve their data confirmation skills. To illustrate, only 21% (n = 15) of OHSU students confirmed the medical history documented in the EHR with the patient. A greater percentage of UTHSCSA students (78%, n = 165) confirmed EHR medical history data with patients; nonetheless, UTHSCSA students performed more poorly on this item than on almost any other. UTHSCSA students performed well in reviewing medications (88%, n = 186) and noting allergies (95%, n = 201). In contrast, only 38% (n = 27) of OHSU students reviewed medications, and only 59% (n = 42) discovered the patient’s allergy despite an EHR warning flag. Although this relatively poor performance could indicate that the skill was not taught well, it might also demonstrate the pitfalls of alert fatigue, which occurs when too many EHR alerts dull the user’s reaction to any one in particular.9 In fact, one student reflected, “Even though it is bright yellow, I still didn’t see the (allergy) warning.”

EHR-related communication skills

Although, as mentioned, the great majority of students at both institutions maintained eye contact and included the patient in their computer work, many performed poorly on other EHR-related communication skills. A third of OHSU students (34%, n = 24) moved quickly to the computer, not allowing time to establish rapport with the patient; only 66% (n = 47) established rapport before turning to the computer (Table 2, item 10). When giving feedback about the EHR-OSCE, many students commented that their clinic preceptors often type during the entire patient visit; thus, clinical role modeling may have contributed to this behavior. While giving students feedback on their EHR-OSCE performance, faculty emphasized the importance of introducing the EHR only after connecting personally with the patient and, even then, using the computer only insofar as it augments care. One OHSU student captured this sentiment well: “I wouldn’t have a stethoscope in my ears the whole visit. You use it only when needed. So why should I use the computer the entire visit?” Although most UTHSCSA students established rapport (98%, n = 207) before working on the computer, fewer (84%, n = 177) moved the monitor to share EHR information with patients, even though doing so engages patients. One SP explained, “Moving the computer makes me feel as if we are working together.”

UTHSCSA’s students demonstrated better overall performance than OHSU students. Among UTHSCSA’s students, some EHR skills improved as the year progressed, notably introducing and moving the computer, sharing EHR information, and reconciling medications. In contrast, OHSU students who completed the EHR-OSCE later in the academic year did not outperform those who completed it in earlier rotations. Several factors may explain these institutional differences. The variance may relate to when students at each institution receive EHR training relative to the EHR-OSCE: UTHSCSA students receive didactic EHR training in the weeks immediately preceding the OSCE, whereas OHSU students receive theirs as many as 14 months before it. Formal EHR training may have a shortened half-life once students observe how practicing clinicians use the EHR in real-world settings; didactic training may need to be followed closely by hands-on practice in a clinical or simulated setting to cement the appropriate skills. Another possible contributor is the lack of well-defined standards for some EHR-OSCE skills (for example, exactly how much time is needed to establish rapport, and would all patients need the same amount of time? A new patient could require more than an established patient). Finally, the discrepancy may reflect UTHSCSA’s inclusion of the EHR-OSCE in each student’s grade.
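We report these contrasts descriptively; as an illustration only (not an analysis performed in this study), a chi-square test on the Table 2 counts for item 20 (reviewing the past medical history) could be sketched as follows:

```python
from scipy.stats import chi2_contingency

# Item 20 counts from Table 2: correct vs. not correct, by institution.
ohsu_correct, ohsu_n = 15, 71
uthscsa_correct, uthscsa_n = 165, 211

table = [
    [ohsu_correct, ohsu_n - ohsu_correct],
    [uthscsa_correct, uthscsa_n - uthscsa_correct],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")
# A gap of this size (21.1% vs. 78.2%) is very unlikely to be chance,
# though the test cannot say which institutional difference explains it.
```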

Checklist adaptations

As a result of the OHSU-UTHSCSA collaboration, we added two information-privacy-related measures to the EHR-OSCE rater checklist: (1) before sharing information in the EHR with the patient, students must verify the patient’s identity; and (2) students must secure the chart and log out on completion.

Perceptions

Faculty and students value the EHR-OSCE. After the implementation of the EHR-OSCE, one UTHSCSA faculty member reported, “This is the only time we have ever had students thank us for an OSCE.” Students also acknowledged the importance of learning these skills. One OHSU student observed, “We don’t get this education anywhere else.” We believe these comments reflect the need for more student EHR education, practice, and assessment.

Next Steps

We continue to collect data on student performance on the EHR-OSCE at both institutions. Further analysis of these data will help determine the validity of the tool. Additionally, the OHSU nurse practitioner and physician assistant programs are considering adopting the EHR-OSCE, and analyzing performance data from these programs will allow us to refine the tool. As institutions create EHR-related competencies and curricula, the EHR-OSCE could be used both to measure the effectiveness of curricular changes and to improve EHR training. The EHR-OSCE provides a practical way to measure an individual student’s ability and a means to identify institutional areas needing improvement, including new or better EHR-related curricula, guides for EHR use, and faculty development.

This Innovation Report describes using the EHR-OSCE to assess a portion of the class at one time. A larger-scale EHR-OSCE would require additional computer-equipped rooms and increased faculty time. Parts of the rater checklist require medical knowledge, which, in turn, requires knowledgeable faculty to observe and evaluate the student’s performance. This grading could be accomplished through asynchronous faculty review of video recordings and students’ sim-EHR charts.

Another critical need is a better, standardized definition of clinical EHR best practices. Whether a student has completed some EHR-OSCE items is obvious; for example, failing to discover a patient’s allergy is clearly an error. Completion of other items is more ambiguous and harder to measure objectively insofar as they relate to patient outcomes; for example, determining how much intermittent eye contact is adequate to establish patient rapport is difficult. Further research is needed to determine which measures best quantify humanistic EHR use.

The fact that some students’ EHR skills failed to improve as the year progressed is an interesting finding. This unexpected observation could be due to the lack of consistent hospital and clinic policies granting students access to EHRs for writing notes and entering orders. Other potential causes include limited faculty comfort with students using EHRs, students’ reluctance to alter “real” charts, and the perceived inadequacy of student EHR instruction. EHR-related faculty development, institutional policies regarding student use of EHRs, and research on how best to supervise student use of EHRs are necessary next steps. On a national level, the Liaison Committee on Medical Education does not yet have standards specific to EHR education.10 Efforts to create these standards, develop uniform national EHR curricula, and establish national educational policies that support supervised, hands-on student use of the EHR in patient care would be important contributions.

In summary, the EHR-OSCE can be feasibly adapted and implemented to provide a structured experience of using an EHR while interacting with patients. Feedback and guidance from SPs and faculty help students focus on patients instead of on the EHR and enter data correctly to avoid EHR errors. The areas of EHR skill deficit (confirming data in the EHR and establishing rapport with the patient) highlight the need for increased EHR education and faculty development. Medical educators should advocate for national standards, curricula, and policies that support student use of EHRs in patient care.


Acknowledgments

The authors wish to acknowledge and thank Oregon Health & Science University’s Gretchen Scholl, electronic health record (EHR) educational informaticist, for her support in creating and disseminating the simulated EHR charts, and Andrew Hamilton, MS, MLS, for his assistance with the literature search.

Funding/Support: Work at Oregon Health & Science University and the University of Texas Health Science Center at San Antonio was partially supported by National Cancer Institute grants 3K07CA121457-04S1 and 1R25CA158571-01A (Integrating Patient Centered EHR and HIT Curriculum Into BSS Medical Education; PI: Biagioli).

Footnotes

Other disclosures: None reported.

Ethical approval: The institutional review boards at both Oregon Health & Science University and the University of Texas Health Science Center at San Antonio approved all the procedures necessary for the studies mentioned in this report.

Previous presentations: Previous presentations on this topic include “Adapting OSCEs to Measure Patient-Centric EHR Use” at the 39th annual Society of Teachers of Family Medicine (STFM) Medical Student Education Conference (San Antonio, Texas, January 2013); “Strategies for Teaching Students to Efficiently and Effectively Use an Electronic Health Record” at the 40th annual STFM Medical Student Education Conference (Nashville, Tennessee, January 2014); and “Implementation of an Innovative Electronic Health Record OSCE at Two Institutions” at the 41st annual STFM Medical Student Education Conference (Atlanta, Georgia, February 2015).

Contributor Information

Frances E. Biagioli, Professor of family medicine, Oregon Health & Science University, Portland, Oregon.

Diane L. Elliot, Professor, Division of Health Promotion & Sports Medicine, Oregon Health & Science University, Portland, Oregon.

Ryan T. Palmer, Assistant professor, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.

Carla C. Graichen, 2016 graduate, Oregon Health & Science University, Portland, Oregon.

Rebecca E. Rdesinski, Research associate, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.

Kaparaboyna Ashok Kumar, Distinguished teaching professor and vice chair of medical student education, Department of Family & Community Medicine, University of Texas Health Science Center at San Antonio, San Antonio, Texas.

Ari B. Galper, Research assistant, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.

James W. Tysinger, Distinguished teaching professor and vice chair of professional development, Department of Family & Community Medicine, University of Texas Health Science Center at San Antonio, San Antonio, Texas.

References

1. Hammoud MM, Margo K, Christner JG, Fisher J, Fischer SH, Pangaro LN. Opportunities and challenges in integrating electronic health records into undergraduate medical education: A national survey of clerkship directors. Teach Learn Med. 2012;24:219–224. doi:10.1080/10401334.2012.692267.
2. Tierney MJ, Pageler NM, Kahana M, Pantaleoni JL, Longhurst CA. Medical education in the electronic medical record (EMR) era: Benefits, challenges, and future directions. Acad Med. 2013;88:748–752. doi:10.1097/ACM.0b013e3182905ceb.
3. Milano CE, Hardman JA, Plesiu A, Rdesinski RE, Biagioli FE. Simulated electronic health record (Sim-EHR) curriculum: Teaching EHR skills and use of the EHR for disease management and prevention. Acad Med. 2014;89:399–403. doi:10.1097/ACM.0000000000000149.
4. Frankel R, Altschuler A, George S, et al. Effects of exam-room computing on clinician–patient communication: A longitudinal qualitative study. J Gen Intern Med. 2005;20:677–682. doi:10.1111/j.1525-1497.2005.0163.x.
5. Assis-Hassid S, Heart T, Reychav I, Pliskin JS, Reis S. Existing instruments for assessing physician communication skills: Are they valid in a computerized setting? Patient Educ Couns. 2013;93:363–366. doi:10.1016/j.pec.2013.03.017.
6. Gliatto P, Masters P, Karani R. Medical student documentation in the medical record: Is it a liability? Mt Sinai J Med. 2009;76:357–364. doi:10.1002/msj.20130.
7. Knight AM, Kravet SJ, Kiyatkin D, Leff B. The effect of computerized provider order entry on medical students’ ability to write orders. Teach Learn Med. 2012;24:63–70. doi:10.1080/10401334.2012.641490.
8. Morrow JB, Dobbie AE, Jenkins C, Long R, Mihalic A, Wagner J. First-year medical students can demonstrate EHR-specific communication skills: A control-group study. Fam Med. 2009;41:28–33.
9. van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. J Am Med Inform Assoc. 2006;13:138–147. doi:10.1197/jamia.M1809.
10. Liaison Committee on Medical Education. Functions and Structure of a Medical School: Standards for Accreditation of Medical Education Programs Leading to the M.D. Degree. Standards and elements effective July 1, 2015. Published March 2014. https://med.virginia.edu/instructional-support/wp-content/uploads/sites/216/2015/12/2015_16_functions_and_structure_march_2014.pdf. Accessed April 18, 2016.
