Author manuscript; available in PMC: 2016 Feb 23.
Published in final edited form as: Rural Remote Health. 2015 Dec 3;15(4):3399.

The feasibility and acceptability of administering a telemedicine objective structured clinical exam as a solution for providing equivalent education to remote and rural learners

RT Palmer 1, FE Biagioli 1, J Mujcic 1, BN Schneider 1, L Spires 1, LG Dodson 1
PMCID: PMC4763875  NIHMSID: NIHMS755748  PMID: 26632083

Abstract

Introduction

Although many medical schools incorporate distance learning into their curricula, assessing students at a distance can be challenging. While some assessments are relatively simple to administer to remote students, other assessments, such as objective structured clinical exams (OSCEs), are not. This article describes a means to assess distance learners more effectively and efficiently, and evaluates the feasibility and acceptability of that assessment.

Methods

We developed a teleOSCE, administered it online in real time to two cohorts of students on a rural clerkship rotation, and assessed the feasibility and acceptability of using this approach to evaluate medical students’ clinical skills at rural locations. Project feasibility was defined as having development and implementation costs of US$5000 or less. Project acceptability was determined by analyzing student interview transcripts. A qualitative case study design framework was chosen due to the novel nature of the activity.

Results

The total development and implementation cost of the teleOSCE was US$1577.20, making it a feasible educational endeavor. Interview data indicated the teleOSCE was also acceptable to students.

Conclusions

The teleOSCE format may be useful to other institutions as a method to centrally administer clinical skills exams for assessment of distance medical students.

Keywords: clinical assessment, distance learning, medical education, OSCE, rural education, standardized patients, telemedicine

Introduction

Medical education is increasingly moving from the academic medical center to diverse settings in distant locations1. Increases in class size and faculty workload, as well as new experiential approaches to clinical training, are calling for learners to spend less time on campus and more time in different learning settings. An important example is training medical students in rural locations, which provides valuable educational experiences while also assisting in addressing rural workforce shortages. However, the distance from the home educational institution can affect student engagement with faculty and with other learners. Additionally, it is difficult for the home educational institution to assess clinical learning for distance students. Instead, the clinical experiences and preceptor feedback for distance students often become proxies for the clinical assessments administered to on-campus learners. As more students move off-campus for clinical training, medical institutions need more robust methods of assessing clinical competence for distance learners that are comparable to the assessment methods for on-campus students.

Although many medical schools incorporate distance learning into their curricula2, assessing students at a distance occurs much less often due to the inherent challenges3,4. For example, preceptor-proctored knowledge-based exams are relatively simple to administer to remote students, while clinical skills exams are not. Objective structured clinical exams (OSCEs) utilize standardized patient (SP) actors to simulate real-world clinical encounters in a safe teaching environment. The use of OSCEs in medical education is common and their efficacy in assessing clinical learning is well documented5-8. As medical schools increase the number of off-campus learners, it is important for institutions to consider how they will assess clinical skills development without requiring learners to return to campus.

Few options currently exist to centrally administer OSCEs to remote learners. Existing options are often proprietary and fee-based, which may be prohibitive for programs with limited funding9-11. Optimal educational options would include non-proprietary solutions that institutions can share and implement to meet their own distance learning assessment needs.

Methods

We conducted a pilot program to study implementation of a non-proprietary distance OSCE solution, the teleOSCE. In the spring of 2013 and the spring of 2014, the teleOSCE was administered online to nine rural distance learners using commercially available Adobe Connect video-conferencing software, cell phones, and a primary care-focused diabetes management case. Due to the novel nature of the activity, we sought to determine the feasibility and acceptability of the teleOSCE. We defined feasibility as requiring development and implementation costs of US$5000 or less, including faculty and staff time and effort, and defined acceptability in terms of student receptivity and evaluation of the usefulness of the experience.

All study activities were approved by the Institutional Review Board of Oregon Health & Science University (OHSU). Study participants were third-year medical students enrolled in the Rural Scholars Program (RSP) at OHSU. The RSP is a competitive admission program for students focused on careers in rural medicine. RSP students complete a significant portion of their family medicine coursework via distance learning, spending a minimum of 10 continuous weeks at a remote rural clinical site in Oregon. During the family medicine clerkship, on-campus learners participate in a ‘teaching’ OSCE, a formative assessment in a simulated clinical encounter completed early in the clerkship. Unlike traditional summative assessment OSCEs, the teaching OSCE is a ‘pass–no pass’ assessment, with an SP and a faculty member giving immediate formative feedback at the end of the simulated encounter. The formative nature of the feedback makes this a popular activity among participating students. Due to their remote locations, RSP students historically did not take part in the teaching OSCE. The teleOSCE was developed to bridge this educational gap.

Case development

A telemedicine scenario was chosen for the teleOSCE to mimic a real-world situation in which a physician interacts with a rural patient in a non-face-to-face manner. Telemedicine is ‘the remote delivery of healthcare services and clinical information using telecommunications technology’12. Since RSP students are located in rural settings, the teleOSCE simultaneously solved the logistical challenge of centrally assessing clinical skills in real time, while also exposing learners to a new model of rural patient care.

Three competency domains were assessed in the teleOSCE: (1) clinical knowledge: learners must identify diabetes management issues and recommend appropriate follow-up, (2) patient-centered use of technology: learners must remain patient-focused despite performing the clinical encounter using the telemedicine software, and (3) understanding of the geographic and socioeconomic realities of rural patients: learners must incorporate rural circumstances into the plan of care (travel distance, lack of an in-town pharmacy, and few nutritional options).
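To make the structure of these domains concrete, the sketch below organizes them as a simple pass–no pass checklist. This is a hypothetical illustration only, not the actual instrument used in the study (the published case is available via the Family Medicine Digital Library15); the item wording and the `passed` helper are our own.

```python
# Hypothetical pass-no pass checklist built around the three teleOSCE
# competency domains described above. Illustrative only; not the study's
# actual grading instrument.
CHECKLIST = {
    "clinical knowledge": [
        "identifies diabetes management issues",
        "recommends appropriate follow-up",
    ],
    "patient-centered use of technology": [
        "remains patient-focused while using the telemedicine software",
    ],
    "rural context": [
        "incorporates travel distance, pharmacy access and nutrition options into the plan of care",
    ],
}

def passed(observed):
    """Pass only if every checklist item in every domain was observed.

    `observed` maps a domain name to the set of items the faculty observer checked off.
    """
    return all(
        item in observed.get(domain, set())
        for domain, items in CHECKLIST.items()
        for item in items
    )
```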

Implementation logistics

We used Adobe Connect online meeting software as the teleOSCE ‘exam room’. Adobe Connect enables multiple live video and audio feeds as well as the ability to access documents from inside the digital meeting room, making it an ideal platform for a telemedicine simulation. RSP students were already familiar with the Adobe Connect technology, having used it for other curricular requirements in the program. Each student was given a specific appointment time to connect with the SP via the internet in the virtual exam room. One non-clinical faculty member served as the meeting operator to provide technical support for the session, and a clinical faculty member served as the observer. Each encounter lasted 20 minutes, with 15 minutes for the clinical encounter and 5 minutes for feedback. Learners, faculty members and the SP all participated in the teleOSCE from four separate locations, each using their own computer and cell phone. Figures 1 and 2 illustrate the teleOSCE setup.
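As a rough sketch of the scheduling logic described above, the snippet below lays out nine 20-minute appointment slots (15 minutes of encounter plus 5 minutes of feedback). The start time and student labels are illustrative, not taken from the study; the 3-hour total matches the implementation hours reported in Table 1.

```python
# Illustrative appointment schedule for nine distance learners, each in a
# 20-minute slot (15 min clinical encounter + 5 min formative feedback).
from datetime import datetime, timedelta

SLOT = timedelta(minutes=20)              # 15 min encounter + 5 min feedback
start = datetime(2014, 4, 1, 9, 0)        # hypothetical exam-day start time

students = [f"Student {i}" for i in range(1, 10)]
for i, student in enumerate(students):
    begin = start + i * SLOT
    print(f"{begin:%H:%M}-{begin + SLOT:%H:%M}  {student}")

print("Total SP/faculty time:", len(students) * SLOT)   # 3:00:00 for nine students
```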

Figure 1. Each objective structured clinical exam participant connects to the meeting room from a separate location via the internet with a laptop computer and a cell phone. All interactions take place online in an Adobe Connect virtual meeting room.

Figure 2. The standardized patient in the objective structured clinical exam is shown on the left and the student is on the right. The observing faculty member and technical operator are also present with their webcams turned off.

Data collection

Participants digitally signed consent forms prior to participating in the teleOSCE. A qualitative case study framework was chosen for the study, with two RSP student cohorts participating as a convenience sample. Cohort 1 (n=4) participated in spring 2013 and cohort 2 (n=5) participated in spring 2014. Cohort 1 data were collected by RP via phone interviews within one month of the conclusion of the teleOSCE. All students participated in the interviews. Interview audio was coded by RP using categorical aggregation13 and clustering14. Atlas.ti v10 (Atlas.ti; http://www.atlasti.com/index.html) was used as the qualitative analysis software. Interview analysis results were shared with participants to verify accuracy. To simplify the transcription and coding process, we converted the interview protocol from cohort 1 to a secure online survey form and emailed it to cohort 2 students within a week of their completing the teleOSCE; the response rate for cohort 2 was 100%. Coding methodology for the online survey responses matched the cohort 1 interview coding methodology. BS reviewed both interview audio and survey responses for coding accuracy. The following interview protocol was used in both telephone and online survey data collection:

  1. Was this an acceptable format for you to conduct an OSCE exercise? Explain why or why not.

  2. How realistic was it for you to assess a patient in the format of the telemedicine OSCE?

  3. What was your experience with the technology used to do this OSCE?

  4. Do you feel this educational activity was a good use of your time while on your rural rotation? Briefly explain why or why not.

We determined that the teleOSCE was financially feasible if development and implementation costs were at or below US$5000. This threshold was based on years of professional experience in curriculum development as well as an expert consensus process involving similarly experienced faculty peers. Faculty costs were calculated as the faculty hourly rate multiplied by the total hours spent on development and implementation. Standardized patient compensation and teleconferencing fees were also factored into the feasibility calculations.
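As a worked illustration of this feasibility calculation, the sketch below totals the line items later reported in Table 1 (the rates and hours are taken from that table) and checks the sum against the US$5000 threshold; the code itself is ours and was not part of the study protocol.

```python
# Feasibility check: total development + implementation cost vs. US$5000 threshold.
# Line items and rates follow Table 1 in the Results section.
FEASIBILITY_THRESHOLD = 5000.00  # US$

cost_items = {
    "case development (14 h combined faculty FTE)": 1039.00,
    "observer/grader (3 h clinician faculty at $100/h)": 3 * 100.00,
    "technical support (3 h non-clinical faculty at $45/h)": 3 * 45.00,
    "Adobe Connect phone charges (4 users x 3 h at $0.06/min/user)": 4 * 3 * 60 * 0.06,
    "standardized patient (1 h training + 3 h exam at $15/h)": 4 * 15.00,
}

total = sum(cost_items.values())
print(f"Total cost: ${total:.2f}")                       # $1577.20
print("Financially feasible:", total <= FEASIBILITY_THRESHOLD)
```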

Ethics approval

The study protocol was approved by Oregon Health & Science University's Institutional Review Board.

Results

Financial feasibility

TeleOSCE financial feasibility was determined by calculating the total cost of faculty full-time equivalents needed for case development and implementation, telephone charges for the Adobe Connect meeting room, and SP costs. As described in Table 1, the project entailed a total cost of US$1577.20, meeting our definition of financial feasibility. The ‘extrapolated cost’ column in Table 1 illustrates how the teleOSCE may be implemented at a feasible cost for an even larger group of students.

Table 1.

Cost of development and implementation of the tele-objective structured clinical exam.

TeleOSCE component | Cost for this study | Extrapolated cost
Development (14 h combined faculty FTE) | $1039 (fixed start-up cost) | $0 (case freely available)
Implementation: observer/grader, clinician faculty FTE (3 h for 9 students) | $300 ($100/h) | $500 (assuming SP observer at $15/h with three students/h)
Implementation: technical support, non-clinical faculty (3 h for 9 students) | $135 ($45/h) | $833 (assuming staff support at $25/h with three students/h)
Adobe Connect phone charges ($0.06/min/user for 3 h, four continuous users) | $43.20 | $0 (using Google Hangouts, Skype or other free group video-conferencing software)
Standardized patient at $15/h (1 h training + 3 h for OSCE) | $60 | See observer/grader row above: the SP can potentially both act and grade the session
Total | $1577.20 | $1333 per 100 students

All costs are in US dollars.
The extrapolated cost column assumes a 100-student institution using the least expensive methods.
FTE, full-time equivalent; OSCE, objective structured clinical exam; SP, standardized patient.
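The extrapolated figure can be reproduced with the arithmetic below, assuming a 100-student institution, three students per hour, a standardized patient who both acts and grades at $15/h, staff support at $25/h, a freely shared case, and free video-conferencing software. This is simply a sketch of the table's stated assumptions, not an additional analysis from the study.

```python
# Reproducing Table 1's extrapolated cost column for a 100-student institution
# using the least expensive options named in the table.
students = 100
students_per_hour = 3
exam_hours = students / students_per_hour      # ~33.3 hours of exam time

sp_observer_grader = exam_hours * 15.00        # SP acts and grades, $15/h -> ~$500
staff_support      = exam_hours * 25.00        # technical support, $25/h -> ~$833
development        = 0.00                      # case freely available
video_conferencing = 0.00                      # free software (e.g. Google Hangouts, Skype)

total = sp_observer_grader + staff_support + development + video_conferencing
print(f"Extrapolated cost for {students} students: ~${total:.0f}")   # ~$1333
```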

Acceptability

Transcript coding of student interviews revealed the teleOSCE to be an educational activity of acceptable quality and importance. Coding themes are illustrated in Table 2.

Table 2.

Coding themes and interview excerpts from the tele-objective structured clinical exam.

Theme | Interview excerpt
Quality of educational experience | • ‘This was a great use of time. It didn't take much time at all, but was an excellent assessment of my knowledge and clinical judgment.’
• ‘I thought this was a really valuable experience.’
• ‘It compares to the OSCEs I've done at [the study institution] other than the fact that it was over a computer and not in person.’
• ‘I really appreciate not having to drive all the way back to [my home institution] and it worked pretty well.’
• ‘I thought it [the teleOSCE] was very effective. It is kind of a novel way of teaching. Instead of see one, do one, teach one; we just DID one!’
• ‘I think it [the feedback] was just as good as feedback I have gotten in person.’
• ‘I thought this was a good experience. I think a lot of medical students today have at least cursory experience using online video communication, such as Skype, to communicate with people. This helps integrate technology in an effective way into learning how to care for patients.’
Use of technology | • ‘It was nice that the chatroom was in a format that we were familiar with from our other sessions.’
• ‘Video quality was really clear ... it was really easy to hear and see. It was just like Skyping with someone ... pulling up pictures was doable but harder.’
• ‘I had issues. When I attempted to open both the sugar log and the photo my laptop first had to download Google Chrome (instead of Internet Explorer I was using) and then download the files. This was just too much, along with the Adobe Connect meeting room up, for my little guy [computer] to pull off all at once. It took nearly 15 minutes before I finally saw the picture. However, it was a good experience because it forced me to make a decision about patient care based solely on hx [patient history] which was good.’
Exposure to new practice models | • ‘Before, I thought that telemedicine was mainly for [my home institution] or for big city physicians to kind of consult with rural physicians, you know, like a rural physician would have a patient in their office to, like, consult with a specialist ... but then, after this experience, it kind of taught me that you can actually do visits with patients in their houses. It's never crossed my mind before that patients would have the same technology as the physician in the office so that you could do a visit with the patient in their homes by themselves, like that ... That was new for me.’
• ‘It [the teleOSCE] fits pretty well with the theme that we do have a lot of patients who have a hard time getting in to see the doctor because even though this is a rural area, they live even farther out, so I can definitely see myself doing this, you know, later on in my career when I will have to do telemedicine with patients.’
• ‘I think getting used to the idea of telemedicine ... to provide a high level of patient care to their patients is really valuable.’

OSCE, objective structured clinical exam.

Discussion

Strengths

The strengths of the teleOSCE are its scalability and its ability to assess learners clinically from a distance. While Adobe Connect was used for this implementation, other video-conferencing software such as Skype, GoToMeeting, or FaceTime may be used as well. The teleOSCE case we developed is also freely available on the Family Medicine Digital Library database (Society of Teachers of Family Medicine) for any institution to use and share15. Additionally, the teleOSCE allows the SP and the faculty participants to ‘work from anywhere’, easing recruitment of faculty and actors, and reducing travel and scheduling time. Finally, the teleOSCE provides a time-efficient, financially feasible and educationally acceptable format for centrally assessing the clinical skills and competence of distance learners in a manner comparable to that used in evaluating on-campus learners. By using or modifying the teleOSCE, institutions can directly assess their learners using their own institutional faculty, SPs and educational competency metrics.

Limitations

Because this was a pilot, the sample size was small and drawn from only one institution. Also, the findings of this study are qualitative and may not be generalizable to learners at other institutions. Additional trials of the teleOSCE are needed to validate comparability, to assess acceptance by faculty and students in broader settings, and to ensure replicability.

Conclusions

Results of this study indicate that administration of the teleOSCE to remote learners is both financially feasible and acceptable to students. In addition to solving logistical issues, the teleOSCE seems well suited to expose students to telemedicine visits as a new model of rural care, while simultaneously increasing awareness of common issues in rural population health.

The next steps are to expand the teleOSCE to other health training programs and settings. Further validation of comparability is being undertaken with a modified version of the teleOSCE being used in the on-campus family medicine clerkship OSCE at the study institution. Development and validation of additional teleOSCE cases will also be important. Increased utilization of the teleOSCE cases and format by other programs and institutions will generate a larger framework to support future scholarly inquiries and build upon this initial exploration.

Acknowledgements

The authors are grateful for editing assistance from Dr Patty Carney, Professor, Department of Family Medicine, Oregon Health and Science University, Portland, Oregon.

References

  1. Kahn MJ, Maurer R, Wartman SA, Sachs BP. A case for change: disruption in academic medicine. Academic Medicine. 2014;89(9):1216–1219. doi: 10.1097/ACM.0000000000000418.
  2. Parisky A, Ortiz T, McCann K, Hoffmann E, Boulay R. How top US medical schools are using distance learning resources: an exploratory study of four institutions. In: Bastiaens T, et al., editors. Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education. Chesapeake, VA: Association for the Advancement of Computing in Education; 2009. pp. 2930–2935.
  3. Mattheos N, Schittek M, Attström R, Lyon HC. Distance learning in academic health education: a literature review. European Journal of Dental Education. 2000;5:67–76. doi: 10.1034/j.1600-0579.2001.005002067.x.
  4. Tangalos EG, McGee R, Bigbee AW. Use of the new media for medical education. Journal of Telemedicine and Telecare. 1997;3:40–47. doi: 10.1258/1357633971930184.
  5. Davidson R, Duerson M, Rathe R, Pauly R, Watson RT. Using standardized patients as teachers: a concurrent controlled trial. Academic Medicine. 2001;6:840–843. doi: 10.1097/00001888-200108000-00019.
  6. Dong T, Saguil A, Artino AR, Gilliland WR, Waechter DM, Lopreaito J, et al. Relationship between OSCE scores and other typical medical school performance indicators: a 5-year cohort study. Military Medicine. 2012;177(9 Suppl):44–46. doi: 10.7205/milmed-d-12-00237.
  7. May W, Park JH, Lee JP. A ten-year review of the literature on the use of standardized patients in teaching and learning: 1996–2005. Medical Teacher. 2009;31:487–492. doi: 10.1080/01421590802530898.
  8. McGraw RC, O'Conner HM. Standardized patients in the early acquisition of clinical skills. Medical Education. 1999;33(8):572–578. doi: 10.1046/j.1365-2923.1999.00381.x.
  9. WebOSCE.net. WebPatient encounter. (Internet) 2013. Available: http://webcampus.drexelmed.edu/webosce/ (accessed 13 November 2013).
  10. Daetwyler CJ, Cohen DG, Gracely E, Novack DH. eLearning to enhance physician patient communication: a pilot test of ‘doc.com’ and ‘WebEncounter’ in teaching bad news delivery. Medical Teacher. 2010;32(9):e381–e390. doi: 10.3109/0142159X.2010.495759.
  11. Novack DH, Cohen D, Peitzman SJ, Beadenkopf S, Gracely E, Morris J. Pilot test of WebOSCE: a system for assessing trainees’ clinical skills via teleconference. Medical Teacher. 2000;24:483–487. doi: 10.1080/0142159021000012504.
  12. American Telemedicine Association. Telemedicine FAQs. (Internet) 2012. Available: http://www.americantelemed.org/learn/what-is-telemedicine/faqs (accessed 13 November 2013).
  13. Creswell JW. Qualitative inquiry and research design. Thousand Oaks, CA: Sage Publications; 2007.
  14. Marshall C, Rossman GB. Designing qualitative research. Los Angeles, CA: Sage Publications; 2011.
  15. Society of Teachers of Family Medicine. Resource library. (Internet) 2015. Available: http://www.fmdrl.org/index.cfm?event=c.beginBrowseD&clearSelections=1&criteria=palmer#5045 (accessed 17 November 2015).
