Journal of Pathology Informatics. 2022 Dec 1;14:100162. doi: 10.1016/j.jpi.2022.100162

Maintaining informatics training learning outcomes with a COVID-19 era shift to a fully online flipped course

Heather TD Maness a, Hesamedin Hakimjavadi b,c, Srikar Chamala b,c
PMCID: PMC9714185  PMID: 36471780

Abstract

The emergence of the coronavirus disease 2019 pandemic forced us to adapt our recently developed informatics training serving a variety of students as well as faculty and staff. The successful flipped classroom course series (a hybrid format with both asynchronous online learning and in-person synchronous components) was shifted to a fully online format, with the synchronous portion now held via web-based video conference. We repeated our participant survey at the end of each of the 3 one-credit courses to compare student satisfaction and learning outcomes achievement to the original offering. The responses were again overall very positive, with a slight response distribution improvement in some measures of satisfaction. Likewise, students reported similar achievement of the learning outcomes across all courses, with some Unix course objectives receiving higher competency agreement in the new, fully online version. Overall, the fully online version of the course series was equally successful, if not more so, than the original version with a physical classroom session each week. Given that participants also strongly agreed with a new question that they would prefer online class meetings instead of in a classroom, even if there wasn’t a global pandemic (citing a variety of logistical reasons such as “convenience of screen sharing,” parking issues, and job-related time constraints), the fully online version of the informatics training will be retained.

Keywords: Competency, Continuing education, Flipped classroom, Graduate medical education, Informatics, Pathology, Online learning, Residency training

Background

Shortly after developing a successful hybrid-format (both online learning and in-person components) course series, named Informatics for Pathology Practice and Research (previously published1), the emergence of coronavirus disease 2019 (COVID-19) forced us (and every other education provider in the world2) to pivot to a fully online format. We were fortunate that the program structure was already a flipped classroom design, in which students prepared for the in-person class session each week by asynchronously completing the assigned reading material and some activities in the online environment (via Canvas Learning Management System, Instructure, Inc., “Canvas”). Once the pandemic began, the synchronous portion of the flipped classroom (previously held physically on campus) was easily transitioned to a synchronous, web-based video conference platform (via Zoom Video Communications, Inc., “Zoom”) to continue to provide the collaborative peer learning opportunities3 and instructor/teaching assistant (TA) support during the active learning components of the course. Zoom was familiar to the instructor and some students since it was an established vendor at the institution and was previously integrated into the course as an office hours tool.

Prior to the pandemic, there were minimal examples of virtual flipped classroom use in the literature, for any discipline.4,5 Previous work focused mostly on physical flipped classrooms, often with asynchronous online learning components. Some synchronous online course design research has been conducted, but there was scant evidence for also using a flipped classroom model in those studies. The evidence is further limited when looking for specific flipped classroom design principles, argued as necessary for meaningful comparisons.6 Yet, this strategy has now been proposed as a viable model, specifically in pathology training.2,7 Furthermore, herein we provide rare comparison data for the physical flipped classroom model versus a virtual flipped classroom design, where synchronous activity time was also held in an online environment. In particular, with our shift to the fully online flipped classroom model, we wanted to gauge whether the learning outcomes would still be successfully met by most students and whether the design was still satisfactory for our broad audience of students, faculty, and staff.

The Informatics for Pathology Practice and Research program began as a series of 3 one-credit courses offered sequentially for 5 weeks each, during a single semester. It was designed to meet modern training needs in anatomical/clinical and experimental pathology8,9 serving undergraduates, graduate students, medical students, post-doctoral fellows, residents, staff, and faculty. The series focuses on the Unix Operating System (Unix), Python Programming (Python), and Advanced Data Analysis and Visualization (ADAnV) with Pandas and Matplotlib. As the popularity of the program has grown, it was renamed Programming for Biomedical Research & Clinical Practice to expand outreach to other biomedical professionals and students across the health science colleges—Medicine, Pharmacy, Public Health & Health Professions, and Veterinary Medicine. Although this program was originally designed to serve the needs of the pathology department, it is applicable (as evident from our participant growth in various disciplines) to all researchers, clinicians, and other healthcare professionals interested in building the fundamental computational skills necessary for pursuing specialized and advanced informatics training in areas such as clinical bioinformatics, biomedical informatics, artificial intelligence, and digital pathology imaging. It is assumed students have no prior knowledge of programming or advanced computing.
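To illustrate the kind of fundamental skill the ADAnV course targets, the following is a minimal Pandas sketch; the dataset and column names are hypothetical illustrations, not drawn from the actual course materials.

```python
import pandas as pd

# Hypothetical lab-results table (illustrative only, not course material).
df = pd.DataFrame({
    "patient": ["A", "A", "B", "B", "C"],
    "test":    ["glucose", "sodium", "glucose", "sodium", "glucose"],
    "value":   [98, 140, 183, 137, 110],
})

# A one-line split-apply-combine aggregation: mean result per test
# across patients -- the style of data wrangling the course teaches
# before layering Matplotlib visualizations on top.
means = df.groupby("test")["value"].mean()
print(means)
```

Tasks like this require no prior programming background beyond the earlier courses in the series, which is consistent with the program's assumption that students start with no knowledge of programming or advanced computing.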

Methods

The original 40-item questionnaire (previous publication on the IPPR course series) was modified to a 31-item version to investigate the research question: How does the online version of the pathology informatics training program compare to the previous physical classroom version? Since this study was a continuation of the previous study,1 it was classified within the same institutional review board approved protocol (study #201602565). The course instructor was the same person (Chamala) but the TA was a different graduate student than the original TA. Beyond using Zoom for the synchronous meetings instead of the physical classroom environment, the only notable change to the Spring 2021 training program was a shift in the course order to ending with Unix instead of beginning with it (Table 1).

Table 1.

Comparison of registrants to respondents for the 2 different offering styles.

Flipped style Title Durationa Registrants Respondents
Physical (2018) Unix Operating System First 5 weeks 21 20
Python Programming Second 5 weeks 22 19
Advanced Data Analysis & Visualization Third 5 weeks 22 15
Online (2021) Python Programming First 5 weeks 34 17
Advanced Data Analysis & Visualization Second 5 weeks 33 7
Unix Operating System Third 5 weeks 32 8
a All 3 courses are held during a single semester that runs for 15 weeks.

Data visualization and statistical analyses were performed using R (version 4.01). Descriptive measures (including median and interquartile range) were calculated for all questions for all groups of participants. Given the non-parametric nature of the data, Mann–Whitney–Wilcoxon tests were used to examine differences between the 2 course formats for students’ evaluation of each prompt related to satisfaction and objective achievement. For each comparison, we reported the statistic (Mann–Whitney U), P value, and confidence interval. Differences with a P value smaller than .05 were considered significant.
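As a concrete sketch of the U statistic reported in Table 2, Table 3, the following pure-Python function computes the Mann–Whitney U with midranks for ties (Likert-type ratings are heavily tied). Note that the authors' actual analysis was done in R; the sample ratings below are hypothetical, not the study data.

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus sample y.

    Tied values (common in Likert-type data) receive the midrank,
    i.e., the average of the ranks they jointly occupy.
    """
    combined = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        # values at 0-based positions i..j-1 share 1-based ranks i+1..j
        ranks[combined[i]] = (i + 1 + j) / 2
        i = j
    rank_sum_x = sum(ranks[v] for v in x)
    n_x = len(x)
    # U_x = R_x - n_x(n_x + 1)/2
    return rank_sum_x - n_x * (n_x + 1) / 2

# Hypothetical 5-point Likert ratings (not the study data):
online = [5, 4, 4, 3, 5]
physical = [4, 3, 4, 2, 3]
u = mann_whitney_u(online, physical)
```

In practice, `scipy.stats.mannwhitneyu` in Python or `wilcox.test` in R (the authors used R) computes the same statistic and adds a P value with a tie-corrected normal approximation.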

Results

Course participants and survey respondents

Thirty-four individuals enrolled in one or more of the courses in the Spring 2021 semester, which was higher than in the initial course offerings (Table 1). Unique respondents (N = 20) to our latest surveys included undergraduates (n = 2), graduate students (n = 9), medical students (n = 3), a medical and graduate student (n = 1), post-doctoral students (n = 2), a medical fellow (n = 1), and faculty (n = 2). Respondents this semester came from a wider variety of colleges than in the original offering. While most respondents were still from the College of Medicine (n = 8), we also had respondents from the colleges of Public Health and Health Professions (n = 3), Pharmacy (n = 2), Engineering (n = 2), and Veterinary Medicine (n = 1). There were also 2 respondents who chose not to disclose their affiliation on the questionnaire.

Prior experience

Programming. The majority of respondents had no prior experience with Unix (n = 6) or Python (n = 9) before the course series. The rest chose “a little” (n = 6) or “a moderate amount” (n = 2) to describe their experience, except for 1 student who chose “a lot” for Unix and another who chose it for Python.

Online learning environment. Only 1 respondent had not previously taken a fully online course (either asynchronously or one with synchronous sessions). Therefore, the online learning environment was more familiar to most students than in the original offering where over half did not have previous experience with online learning.

Course satisfaction comparison

Once again, in this offering of the course series the responses were overall very positive. There was no median difference between the 2 semesters when aggregating all satisfaction measures (Fig. 1 and Table 2). When analyzing each of the 3 satisfaction metrics, there was no significant median difference for overall, interaction, or recommendation but there was a significantly higher agreement response distribution for recommending the online version of the course to a friend (Fig. 1 and Table 2).

Fig. 1. Satisfaction and learning objective ratings comparisons by format. Distribution and median comparison (* = P < .05) of respondents’ agreement levels (5-point Likert-type scale) for course satisfaction and learning objective metrics in the new, fully online format versus the original version with physical-presence synchronous sessions.

Table 2.

Course satisfaction metrics statistical comparison (fully online:physical format).

Course Satisfaction (median comparison) U (CI) P value
Total combined—Online vs Physical (4:4) 8774.5 (-0.07, 0.22) 0.281
Courses combined—Overall (4:4) 1087.0 (-0.06, 0.41) 0.113
Courses combined—Interaction (3.5:4) 700.5 (-0.44, 0.02) 0.066
Courses combined—Recommend course (4:4) 1120.5 (, 0.46) 0.046*
I-Unix Overall (4:4) 82.5 (-0.42, 0.47) 0.914
Interaction (4.5:4) 62.5 (-0.61, 0.25) 0.340
Recommend course (4:4.5) 94.5 (-0.23, 0.63) 0.308
All 3 combined (4:4) 721.5 (-0.25, 0.29) 0.888
II-Python Overall (4:3) 224.5 (0.03, 0.66) 0.028*
Interaction (3:4) 117.0 (-0.58, 0.10) 0.117
Recommend course (4:3) 219.5 (-0.01, 0.64) 0.060
All 3 combined (4:3) 1725.5 (-0.03, 0.39) 0.080
III-ADAnV Overall (4:4) 90.0 (-0.14, 0.68) 0.115
Interaction (4:4) 66.5 (-0.38, 0.55) 0.682
Recommend course (4:4) 89.5 (-0.15, 0.68) 0.161
All 3 combined (4:4) 744.0 (-0.00, 0.29) 0.040*

Per course, median comparisons between the formats of the aggregated satisfaction measures showed no significant difference in the Unix course and a slight, though not statistically significant, improvement in the Python Programming course. The new, fully online Advanced Data Analysis and Visualization course showed a slight response distribution improvement for satisfaction while maintaining the same median score (Fig. 2 and Table 2). Independently, the only satisfaction metric with a statistically significant difference was a higher overall satisfaction agreement for the online Python Programming course (Fig. 2 and Table 2). The median comparison between the 2 semesters showed no statistically significant difference in the level of interaction respondents had with others (classmates and instructor/TA) for any of the courses in the program (Fig. 2 and Table 2). Likewise, there was no statistically significant difference in recommending a specific course to a friend (Fig. 2 and Table 2).

Fig. 2. Course satisfaction ratings. Distribution and median comparison (* = P < .05) of respondents’ agreement levels (5-point Likert-type scale) for each course offering with the 3 course satisfaction elements (overall, interaction, and recommendation) and when combined.

Achievement of the course objectives comparison

Participants’ self-assessed ratings of their achievement of the learning objectives for each course in the new online format were mostly similar to the ratings by previous students, with the only statistically significant differences occurring in the Unix course (Fig. 3 and Table 3). There was an increase in the agreement distribution for overall learning objective achievement in the new online Unix course, which was moved to the end of the series. This effect stemmed specifically from the increase in agreement distribution for the third objective (3. Create/interpret Unix programs) as well as the median and distribution in achievement of the fourth objective (4. Make informed decisions on informatics server needs in a biomedical setting).

Fig. 3. Course learning objective achievement ratings. Distribution and median comparison (* = P < .05) of respondents’ agreement level (5-point Likert-type scale) that they achieved competency of the learning objectives in the new online format versus the original version with physical-presence synchronous sessions.

Table 3.

Course learning objectives statistical comparison (fully online:physical format).

Course Objective (median comparison) U (CI) P value
Total combined objectives—Online vs. Physical (4:4) 15606.0 (-0.05, 0.20) 0.195
I-Unix 1. Navigate through Unix/Linux environments (4.5:4) 86.0 (-0.39, 0.51) 0.758
2. Use high performance computing servers (4:3.5) 115.5 (-0.00, 0.74) 0.059
3. Create/interpret programs in Unix (4.5:4) 123.0 (0.12, 0.79) 0.023*
4. Determine biomedical informatics server needs (4:3) 131.0 (0.28, 0.85) 0.006*
All 4 Combined I-Unix Objectives (4:4) 1800.0 (0.19, 0.58) 0.0004*
II-Python 1. Think computationally about problem-solving (4:4) 147.5 (-0.36, 0.39) 0.926
2. Discuss software programming concepts (4:4) 125.0 (-0.49, 0.25) 0.474
3. Create high quality programs in Python (3:3) 154.5 (-0.31, 0.43) 0.731
4. Interpret Python programs written by others (4:4) 139.5 (-0.40, 0.34) 0.863
5. Determine biomedical programming needs (3:4) 132.5 (-0.44, 0.30) 0.672
All 5 Combined II-Python Objectives (4:4) 3510.5 (-0.20, 0.15) 0.734
III-ADAnV 1. Extract info. from Big Data using Python (4:4) 56.5 (-0.43, 0.54) 0.772
2. Extract info. from Big Data using Pandas (4:4) 52.5 (-0.49, 0.49) 1.000
3. Create data visualizations using Python (4:4) 53.5 (-0.47, 0.50) 0.968
All 3 Combined III-ADAnV Objectives (4:4) 489.0 (-0.26, 0.32) 0.796

Preference for synchronous online over a physical classroom

Participants were also asked to rate their level of agreement with the statement: “Even if there wasn't a global COVID-19 pandemic, I prefer to have synchronous class meetings at specific times online, instead of in a classroom (considering travel time savings, parking, learning/interaction preferences, technology availability, etc.).” All but 3 respondents agreed or strongly agreed with the statement (n = 19; median = 4; mean 4.16, SD 1.01). One of the two neutral respondents noted that while they slightly prefer in-person learning, to make the travel logistics worthwhile, they try to create their class schedule so they can take multiple classes back-to-back. The only respondent who strongly disagreed also cited a preference for in-person classes and found coding to be “a difficult topic to learn via Zoom, especially at a beginner level.” Those who agreed with the statement provided a variety of reasons including the “convenience of screen sharing,” job-related time constraints requiring flexibility in pacing and participation, as well as not having a car. A few of these respondents did also note that they may have learned more from others if they were face-to-face in a physical classroom. Yet, 1 person noted that there was no need for “a live portion” at all.

Conclusions

The increase in self-reported achievement of 2 of the learning objectives in the Unix course was a surprising finding. One hypothesis we have for why that occurred is related to the change in the course sequencing. Based on initial student feedback from the first offering, instead of starting the course series with Unix, it was moved to the end of the series so students could better appreciate the application use cases for Unix. Thus, this shift to strengthen the relatedness of Unix to the other course topics may have also increased student perception of learning objective achievement. Likewise, the placement of Python Programming as the first course in the series may have contributed to participants’ higher overall satisfaction ratings and the higher combined satisfaction metric for Advanced Data Analysis and Visualization in its new mid-point presentation. It may also be pertinent that the Unix learning objectives with improved median scores are both of the higher-order cognitive skills, per Bloom’s Taxonomy,10 in the set. There is an established need for experimental research on the long-term effects of flipped classrooms on higher-level outcomes, such as behavior,11 but more research is also warranted on the impact on lower-level versus higher-level thinking skills with consideration of learning environment modality. Regardless of the potential role of course order in learning achievement reporting, the fully online format certainly did not decrease perceptions of learning objective achievement or satisfaction ratings, and may have contributed to increases in both.

There is great variance in flipped classroom course designs, so it is important to consider the context of this study when considering the greater pedagogical implications. While previous research has shown favorable effects of flipped classroom design on outcomes when skill laboratories were not included in control groups11, 12, 13 and participant preference for flipped over traditional lecture formats,12,13 there were variances in effect sizes. Moreover, these meta-analyses did not consider online synchronous sessions as a variable, which has been previously reported as a detriment to the quality and effectiveness of pandemic-era learning.14 This contrast to our findings could be related to differences in topic or technologies, but another explanation is the difference in intentionality and preparation time behind our fully online version. Key elements of our flipped classroom design include pre-class videos and graded quizzes in a highly organized learning management system, weekly real-world assignments and feedback, and a diverse learning community with students in a variety of degree programs as well as continuing education learners. The work to create the online student experience and facilitate active learning sessions was done before the pandemic-forced fully online semester, likely leading to a higher quality experience. Additionally, since it is not yet well understood whether the general increased effect from flipped classroom designs is associated more, or equally, with the changes in student preparation for class or with the in-class activities themselves,12 active learning experiences during synchronous class time should also be considered in further research. Our activities were mainly groupwork-encouraged assignments with individual submission and question-and-answer sessions. Future studies should compare outcomes amongst different techniques, such as game-based learning15 and team-based learning,16 with consideration for the proportion of class time utilizing each strategy and variances in student compliance with pre-class requirements and in-class participation.

There are several logistical benefits that were identified with the transition to a fully online course design. Not only did students appreciate the time savings from not needing to travel to a campus classroom for the synchronous session, but the instructor also valued the time savings from the elimination of travel time and parking limitations since he has an off-campus office at a health science center community site. Additionally, the fully online version offers the ability to expand the learner reach to those outside of the college of medicine (those in colleges that are physically distant from the Health Science Center) as well as the potential to expand to interested learners outside of our institution. It also creates more opportunities to expand the pool of expert teachers contributing to the program from beyond our institution/location, which has been previously reported as positively contributing to training program designs.17 Lastly, some participants noted their appreciation for the ability to quickly share their screen during class discussion. This is an added benefit since it provides additional experience with using their own equipment during bioinformatics analysis tasks17 and likely enhances their confidence in completing these tasks independently.

Overall, the fully online version of the course series was equally successful, if not more so, than the original version with a physical classroom session each week. While acknowledging the response and recall biases that limit our study, our conclusions are supported by the participant preference for the fully online format and consistently high student satisfaction ratings for all courses, with occasionally higher ratings in the online version, such as for recommending the course to a friend. Further support comes from data showing no significant differences in the achievement of the student learning outcomes in Python Programming and Advanced Data Analysis and Visualization and an improvement in the Unix course. Given that conducting the pathology informatics training program was also logistically easier in the fully online environment, and that being exclusively online provides flexibility for broader participation across and beyond the institution, we will likely continue to offer the flipped classroom format of the course series using video conferencing for our weekly synchronous meetings.

Funding

The authors received no financial support for the research, authorship, and/or publication of this article.

ORCID iD

Srikar Chamala https://orcid.org/0000-0001-6367-7615.

Declaration of interests

The authors declare the following financial interests/personal relationships which may be considered as potential competing interests:

Srikar Chamala reports support was provided by the University of Southern California.

Acknowledgements

None.

Contributor Information

Heather T.D. Maness, Email: htdaniel@ufl.edu.

Hesamedin Hakimjavadi, Email: hhakimjavadi@chla.usc.edu.

Srikar Chamala, Email: schamala@chla.usc.edu.

References

1. Maness H.T.D., Behar-Horenstein L.S., Clare-Salzler M., Chamala S. Informatics training for pathology practice and research in the digital era. Acad Pathol. 2020;7:1–11. doi: 10.1177/2374289520911179.
2. Koch L.K., Chang O.H., Dintzis S.M. Medical education in pathology: general concepts and strategies for implementation. Arch Pathol Lab Med. 2021. doi: 10.5858/arpa.2020-0463-RA.
3. Bower M., Richards D. Collaborative learning: some possibilities and limitations for students and teachers. In: Australasian Society for Computers in Learning in Tertiary Education Conference; 2006. pp. 79–89.
4. Ismail S.S., Abdulla S.A. Virtual flipped classroom: new teaching model to grant the learners knowledge and motivation. J Technol Sci Educ. 2019;9(2):168–183. doi: 10.3926/jotse.478.
5. Phillips C., O’Flaherty J. Evaluating nursing students’ engagement in an online course using flipped virtual classrooms. Student Success. 2019;10(1):59–72.
6. Kim M.K., Kim S.M., Khera O., Getman J. The experience of three flipped classrooms in an urban university: an exploration of design principles. Internet High Educ. 2014;22:37–50. doi: 10.1016/j.iheduc.2014.04.003.
7. Mukhopadhyay S., Booth A.L., Calkins S.M., et al. Leveraging technology for remote learning in the era of COVID-19 and social distancing: tips and resources for pathology educators and trainees. Arch Pathol Lab Med. 2020;144(9):1027–1036. doi: 10.5858/ARPA.2020-0201-ED.
8. Black-Schaffer W.S., Morrow J.S., Prystowsky M.B., Steinberg J.J. Training pathology residents to practice 21st century medicine. Acad Pathol. 2016;3. doi: 10.1177/2374289516665393.
9. Clay M.R., Fisher K.E. Bioinformatics education in pathology training: current scope and future direction. Cancer Inform. 2017;16:1–6. doi: 10.1177/1176935117703389.
10. Anderson L.W., Krathwohl D.R., Airasian P.W., et al., editors. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman, Inc.; 2001.
11. Chen F., Lui A.M., Martinelli S.M. A systematic review of the effectiveness of flipped classrooms in medical education. Med Educ. 2017;51:585–597. doi: 10.1111/medu.13272.
12. Hew K.F., Lo C.K. Flipped classroom improves student learning in health professions education: a meta-analysis. BMC Med Educ. 2018;18(1):1–12. doi: 10.1186/S12909-018-1144-Z.
13. Huang H.L., Chou C.P., Leu S., You H.L., Tiao M.M., Chen C.H. Effects of a quasi-experimental study of using flipped classroom approach to teach evidence-based medicine to medical technology students. BMC Med Educ. 2020;20(31):1–9. doi: 10.1186/S12909-020-1946-7.
14. Hassell L.A., Peterson J.E., Pantanowitz L. Pushed across the digital divide: COVID-19 accelerated pathology training onto a new digital learning curve. Acad Pathol. 2021;8. doi: 10.1177/2374289521994240.
15. Attaway C.C., Mani M.M., Fortuna D. Are you ready to play Pathology Pyramid? An exploration of an alternative method of learning through gaming in pathology resident education. Acad Pathol. 2022;9. doi: 10.1016/J.ACPATH.2022.100033.
16. Hrynchak P., Batty H. The educational theory basis of team-based learning. Med Teach. 2012;34(10):796–801. doi: 10.3109/0142159X.2012.687120.
17. Eccher A., Fontanini G., Fusco N., et al. Digital slides as an effective tool for programmed death ligand 1 combined positive score assessment and training: lessons learned from the “Programmed death ligand 1 key learning program in Head-and-Neck squamous cell carcinoma.” J Pathol Inform. 2021;12(1). doi: 10.4103/JPI.JPI_63_20.
