MedEdPublish. 2024 Mar 27;14:19. [Version 1] doi: 10.12688/mep.20084.1

Utilization of marketing automation tools for delivery of a faculty development curriculum

Sarah H Michael 1,a, Cody Brevik 1, Danielle T Miller 1, Jessica Hitt-Laustsen 2, John L Kendall 1
PMCID: PMC11153984  PMID: 38846581

Abstract

Background

Physician clinical educators play important roles in teaching, providing feedback, and evaluating trainees, but they often have variable preparation and competing demands on their time that make universal participation in workshops, seminars, or short courses designed to foster these skillsets inefficient or impossible.

Methods

We designed and implemented a 52-week synchronous curriculum addressing faculty opportunities to improve teaching skills, feedback for residents and medical students, and evaluation skills, delivered using marketing automation tools, including text messaging and email. We evaluated programmatic impact and feasibility using an implementation science framework.

Results

Over a 104-week evaluation period, there were at least 10,499 total content impressions and 4558 unique recipient impressions, indicating substantial reach of this program among approximately 120 faculty members. Faculty engagement with continuing education materials remained stable or increased over the 2-year evaluation period, indicating that programs like ours can have sustainable impacts. Resident evaluations of faculty across six key domains also improved after the implementation of the program.

Conclusions

Our experience with digital marketing tools reflects that they can be used to deliver impactful curricular content to faculty for continuing education purposes and that faculty can use these resources in a sustainable way. However, because no single communication reaches the entire audience, this type of content delivery is not appropriate as the sole channel for material of critical importance. More research is needed to identify best practices and additional education-related uses of this technology.

Keywords: Faculty development, Email, Text message, Marketing automation platform, Digital marketing, Engagement

Introduction

Physician clinical faculty have important responsibilities for the provision of teaching, feedback, and evaluation to trainees, yet often have competing demands on their time and limited faculty development within these education domains. The Accreditation Council for Graduate Medical Education (ACGME), which oversees United States-based physician training, requires that residency programs offer faculty development but proffers little further guidance 1 . In the absence of standardization, the preparedness of individual faculty members for their roles as clinical educators is multifactorial, often depending on factors such as whether they had access to resident-as-teacher programming during training 2 , the availability of opportunities at their own institution, their intrinsic motivation, schedule flexibility, and the impact of competing demands on their time 3, 4 . Therefore, training programs are challenged to identify common opportunities for faculty improvement or to offer resources supporting faculty development 4– 6 .

We performed a needs assessment for a curriculum to address variation in emergency medicine (EM) faculty preparation for clinical education among approximately 120 faculty members who work clinically at one of two emergency departments (EDs) affiliated with our residency program. A qualitative analysis of annual rotation evaluations completed by residents in two prior academic years identified three priorities for faculty development: clinical teaching, feedback, and evaluation. After these three content areas were identified, we performed a literature search for possible educational strategies to deploy to our faculty. Interventions designed to ensure that faculty groups like ours have access to clinical education resources frequently take the form of workshops, seminar series, and short courses, but these formats have limitations that make their application to such large groups a challenge. First, the implementation of workshops and short courses carries a significant opportunity cost, particularly for the developers, whose time preparing and delivering the content must be accounted for, as well as for faculty participants, who generally have significant competing demands on their time 7 . Additionally, workshops, short courses, and seminar series are often offered infrequently, with variable faculty attendance due to competing time demands, making it difficult to include all faculty members. This infrequency may also contribute to accelerated decay of skills and knowledge 8 and risks creating an institutional culture in which emphasis on teaching feels cyclical rather than continuous and compliance-driven rather than part of the physician faculty professional identity. With an eye toward engagement, we sought content delivery modalities that were used regularly and were already accepted by the faculty.

Thus, we developed a robust faculty development curriculum targeting our needs assessment and coupled it with an email and text message content delivery strategy to provide high-quality, relevant, just-in-time resources and to create multi-institutional alignment around clinical teaching priorities and expectations. The resulting curriculum comprised just-in-time, synchronized content that allowed us to cover many relevant topics with relative depth while avoiding redundancy and creating a sense of perpetuity for the educational mission. The just-in-time strategy 9 , which delivered content via email and text message immediately before clinicians started their clinical shifts, allowed for cognitive priming, encouragement, and reinforcement immediately before faculty applied teaching and assessment strategies with learners. Each faculty development installment included brief, evidence-based content relevant to clinical teaching, feedback, and/or evaluation; links to additional resources; and a call to action. Emails were designed to be read in less than three minutes. Text messages were designed to reinforce the emailed content.

Methods

The implementation and evaluation of this curriculum were granted exempt status with a waiver of consent by the Colorado Multiple Institutional Review Board, as is typical for educational program implementation and evaluation at our institution.

Emails were scheduled and sent using a marketing automation platform (MAP) ( Mailchimp; Atlanta, GA, USA; available as web-based software with a “freemium” subscription model) to the faculty every Sunday evening. MAPs allow content developers to easily create engaging materials that are viewable as a web page, actively manage audience contacts, and estimate engagement among recipients. All faculty members were automatically enrolled to receive emails but were permitted to disenroll at any time without penalty. During the seven days following the content email, faculty members who were scheduled to work clinically with learners received a text message one hour before their shift start time. These messages included a brief reminder of the week’s content and a clickable link to the MAP-generated webpage for just-in-time review, priming faculty members for teaching. Text messages were scheduled by a program administrator using third-party software ( TextMagic, San Francisco, CA, USA). All faculty members practicing clinically at the affiliated institutions were automatically enrolled in text messages but were notified in advance that they could opt out at any time without penalty. Email and text message content is archived and linked in the Data Availability section 10 .
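To make the scheduling workflow concrete, the sketch below shows one way such pre-shift reminders could be queued from a weekly shift roster. This is a minimal, hypothetical Python illustration rather than the program's actual TextMagic integration: the roster file name and columns, and the send_sms placeholder, are assumptions made for the example; only the one-hour lead time comes from the workflow described above.

```python
import csv
from datetime import datetime, timedelta

# Hypothetical roster: one row per clinical shift with learners present.
# Assumed columns: faculty_name, phone, shift_start (ISO 8601), content_url
ROSTER_FILE = "shift_roster.csv"
LEAD_TIME = timedelta(hours=1)  # reminder goes out one hour before shift start


def send_sms(phone: str, body: str, send_at: datetime) -> None:
    """Placeholder for an SMS provider's scheduled-send call.

    A real implementation would hand the message to the provider's API;
    here we only print what would be queued.
    """
    print(f"{send_at.isoformat()} -> {phone}: {body}")


def schedule_week(roster_path: str, weekly_summary: str) -> None:
    """Queue one pre-shift text per scheduled shift in the roster."""
    with open(roster_path, newline="") as f:
        for row in csv.DictReader(f):
            shift_start = datetime.fromisoformat(row["shift_start"])
            send_at = shift_start - LEAD_TIME
            body = f"{weekly_summary} Review before your shift: {row['content_url']}"
            send_sms(row["phone"], body, send_at)


if __name__ == "__main__":
    schedule_week(ROSTER_FILE, "This week: giving actionable feedback.")
```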

After 24 months, we downloaded a campaign report from the MAP, which included the number of recipients, opening rate, click-through rate, number of unique views, unsubscribe rate, and bounce rate for each email. The reports were downloaded to a spreadsheet platform (Excel for Mac v. 16.58; Microsoft Corporation; Redmond, WA, USA), and data intended for business purposes or deemed otherwise not useful, including revenue and number of orders, were excluded for ease of analysis. A similar report was downloaded to determine the number of text messages sent. Engagement data are not available for text messages, except for opt-out rates. We then graphed the weekly opening rate and analyzed the data and program using an implementation science framework.
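For readers who want to reproduce this style of analysis, the sketch below shows how a downloaded campaign report might be reduced to a weekly open-rate series and plotted. It is a hypothetical Python example; the file and column names (campaign_report.csv, send_date, emails_sent, unique_opens, revenue, orders) are assumptions about the export layout rather than the MAP's exact field names.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed export layout: one row per weekly campaign.
report = pd.read_csv("campaign_report.csv", parse_dates=["send_date"])

# Drop business-oriented columns that are not useful for an education program.
report = report.drop(columns=["revenue", "orders"], errors="ignore")

# Weekly MAP-estimated open rate, analogous to Figure 1.
report["open_rate"] = report["unique_opens"] / report["emails_sent"]
report = report.sort_values("send_date")

plt.plot(report["send_date"], report["open_rate"] * 100, marker="o")
plt.xlabel("Week of content email")
plt.ylabel("Estimated open rate (%)")
plt.title("Weekly MAP-estimated email open rate")
plt.tight_layout()
plt.show()
```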

The RE-AIM implementation science framework was developed by Glasgow et al. 11 for the National Institutes of Health and is among the most validated and widely used models for the assessment of implementation. It is freely available for public use. The acronym stands for Reach, Effectiveness, Adoption, Implementation fidelity, and Maintenance. Although this framework was originally developed for use in public health interventions, it can also be an effective tool for examining the impact and sustainability of longitudinal educational programs, which are common features of public health initiatives.

Reach was defined as the number of individuals willing to participate in an intervention. To describe this impact, we report the number of individuals enrolled over time, opt-out rates, and number of unique content impressions generated by each dissemination modality.

Effectiveness is a measure of the effects of the intervention, which we assessed using faculty evaluation data generated by emergency medicine residents working with the participants.

Adoption reflects saturation of the intervention within the institution(s). We report the institutional support for continuing the program and the expansion of the program to include additional participants.

Implementation fidelity is a measure of how closely the implemented intervention resembles the planned intervention. We describe the tools and resources used to deliver the curriculum, barriers, and enablers.

Maintenance refers to the sustainability of the intervention. We describe ongoing programs and adaptations to improve sustainability.

Results

Reach

Over the first 24 months of the program (January 1, 2018, to December 31, 2019), a total of 120 academic physician faculty members at the primary sites received 11,560 weekly content emails and 9810 text messages 10 . The primary sites were emergency departments in a county-supported urban hospital with an emergency medicine residency program and an emergency medicine residency-affiliated university hospital. The MAP reported that 4558 (39.4%) emails had been opened. The MAP considers an email “opened” when an invisible, single-pixel image loads, so users who do not fully load images when reading email, a common setting in the participants’ institutionally supported email client, may inadvertently deflate the estimated rate. As such, it is difficult to ascertain the true impact in terms of content views, but the open rate determined by the MAP is likely an underestimate. Owing to technical limitations of the text messaging software, it is not possible to know the text message read or click-through rates. The percentage of faculty members who accessed the content using this modality alone or in conjunction with email is therefore unknown.

Despite the challenge of characterizing a true utilization rate, the technology and process used by the MAP to calculate it have not changed since the implementation of this program, and the rate itself has remained stable over time, averaging 38.5% (1059/2756) during months 1–6, 37.4% (1089/2932) during months 7–12, 40.5% (1135/2801) during months 13–18, and 41.6% (1275/3071) during months 19–24 ( Figure 1). One faculty member disenrolled from email and another 11 opted out of text messaging, but there was no overlap between the two groups, so all faculty members received the content through at least one modality during this time 10 .

Figure 1. Estimated percentage of content emails viewed by faculty.

The marketing automation platform-estimated email opening rate for each week of content demonstrated stable engagement (average 39.5%) during the 104-week evaluation period, ranging from 30.4% to 52.4%. The estimate does not include emails opened as text only, without images loaded.

Effectiveness

Quantitative metrics, including clinical teaching by faculty, faculty feedback, and faculty supervision, improved annually from 2017 to 2019 on the ACGME resident survey after curriculum implementation. Resident evaluations of faculty in several clinically relevant domains improved on the annually required ACGME resident surveys after curriculum implementation ( Table 1). These data are provided by the ACGME in aggregate form to protect resident confidentiality, which limits further analysis.

Table 1. Faculty teaching evaluation scores.

Resident evaluations of the faculty for the year prior to implementation (2017) and the following two years showed improvement across six domains relevant to the curriculum delivered. All scores are on a 5-point scale.

Domain | 2017 | 2018 | 2019 | Percent change 2017–2019
Survey response rate | 76% (51/67) | 87% (58/67) | 87% (58/67) |
Clinical teaching by faculty | 3.38 | 3.28 | 3.52 | 4.1%
Mentorship provided by faculty | 3.17 | 3.19 | 3.34 | 5.4%
Faculty supervision skills | 3.48 | 3.43 | 3.64 | 4.6%
Amount of supervision | 3.56 | 3.48 | 3.81 | 7.0%
Feedback provided by faculty | 2.90 | 2.88 | 3.09 | 6.6%
Resident-faculty interaction | 3.33 | 3.40 | 3.62 | 8.7%
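The percent change column is consistent with the relative change from the 2017 score to the 2019 score; the short Python check below, using only the published values, reproduces the reported figures.

```python
# Percent change from 2017 to 2019, computed from the published scores in Table 1.
scores = {
    "Clinical teaching by faculty": (3.38, 3.52),
    "Mentorship provided by faculty": (3.17, 3.34),
    "Faculty supervision skills": (3.48, 3.64),
    "Amount of supervision": (3.56, 3.81),
    "Feedback provided by faculty": (2.90, 3.09),
    "Resident-faculty interaction": (3.33, 3.62),
}

for domain, (y2017, y2019) in scores.items():
    change = (y2019 - y2017) / y2017 * 100
    print(f"{domain}: {change:.1f}%")  # matches the table's percent change column
```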

Adoption

The program remains an important departmental faculty development initiative with administrative support and expansion of the program over time. In month 13, we adapted the curricular content for Advanced Practice Providers (APPs) (n=32), who worked in an affiliated ED and occasionally taught APP undergraduate and postgraduate trainees. In their first 12 months, the APP faculty had a MAP-calculated opening rate of 40.9% (569/1456) 10 . We were subsequently approached by clinical faculty at an affiliated community hospital where EM residents rotate in the ED. Due to their less frequent shifts with residents, we used an opt-in procedure for this group, forwarding weekly emails for one month and then offering the opportunity to enroll. Of the 59 faculty members who could interact with residents at this institution, 33 (56%) opted for the program. In the first 6 months, the enrollees at the affiliate site had a MAP-calculated open rate of 58.2% (494/848) 10 . Neither the community physicians nor APPs received text messages due to the administrative time that would be required to facilitate that process.

Implementation fidelity

Using text messages and an MAP, we were able to implement a program with high fidelity to the curriculum we designed and to sustain delivery of content using both modalities for 24 months. However, the costs of these tools vary. The use of a marketing automation platform ranges from free to more than USD 300/month depending on the size of the intended audience. Text messages can be scheduled and delivered for a small fee. Additional resources are needed to synchronize texts with a clinical or educational schedule, either via integration of another automated tool or through manual administrative time. The greatest enabler for a curriculum such as ours is high-yield content that is relevant to the audience. Based on participant feedback, changes to the program after the initial 24-month implementation period included the development of an online content repository that allows faculty to quickly access relevant content and a decreased text message frequency with an emphasis on the immediate relevancy of the just-in-time content.

Discussion

Our experience with digital marketing tools such as email and text messaging reflects that they can be used to deliver curricular content to faculty for continuing education purposes and that faculty can use these resources in a sustainable way. At first glance, it is tempting to consider engagement with the content a relative failure because of open rates of 30.4%–52.4%. In their greater context, these rates reflect 10,499 total content impressions and 4558 unique recipient impressions, which, as noted above, likely represent an underestimation of the true impact. Organizing traditional faculty development programs or workshops to achieve content delivery of this magnitude, without marketing automation tools, would not have been feasible for our group.

While the cost of implementing such programs is not zero, curriculum developers may find that they offer outsized returns on investment when strategically employed. More recently, we found that the digital, synchronous content delivery we designed also offers important infection-prevention advantages in the context of the coronavirus pandemic. These strategies may supplement continuing education when opportunities to gather in groups are limited or undesirable. One limitation of using digital marketing tools to supplement other continuing education curricula is that the observed changes in the target audience’s performance are confounded by the presence of other curricula. Obviously, this could be scientifically mitigated with a randomized study design, which does not always align with the institutional goals related to content dissemination. While not addressed in this study, it may be feasible to employ an even broader application of these tools to the continuing education landscape. Theoretically, these tools can also be used to create institutional and organizational alignment around a variety of topics, including updated clinical recommendations and communication of upcoming or summarized educational programming. Additional studies exploring the potential roles of digital marketing tools in continuing education and faculty development would be welcome.

Funding Statement

The author(s) declared that no grants were involved in supporting this work.

[version 1; peer review: 1 approved, 2 approved with reservations]

Data availability

Underlying data

Data underlying the results of resident evaluations of faculty teaching are available as part of the article, and no additional source data are required. They were presented as provided to the residency program by the ACGME.

Open Science Foundation (OSF): Utilization of Marketing Automation Tools for Delivery of a Faculty Development Curriculum. https://doi.org/10.17605/OSF.IO/KJZ8U 10

This project contains the following underlying data:

  • Primary Sites MAP Data.xlsx. (number of recipients, opening rates, and total content impressions among 120 emergency medicine residency physician faculty members over 104 weeks)

  • APP MAP Data.xlsx. (number of recipients, opening rates, and total content impressions among 32 advanced practice provider faculty members over 52 weeks)

  • Community Site MAP Data.xlsx. (number of recipients, opening rates, and total content impressions among 33 emergency medicine residency physician community faculty over 26 weeks)

Extended data

Open Science Foundation (OSF): Utilization of Marketing Automation Tools for Delivery of a Faculty Development Curriculum. https://doi.org/10.17605/OSF.IO/KJZ8U 10

This project contains the following extended data:

  • Text Message Content.pdf (dates and content for text messages sent during 2018 calendar year)

  • 2018 Email Content.pdf (email content with redactions of images and media available in the public domain or as web resources not owned by the authors and for participant privacy)

Data are available under the terms of the Creative Commons Zero "No rights reserved" data waiver (CC0 1.0 Public domain dedication).

References

  • 1. Accreditation Council for Graduate Medical Education: ACGME Common Program Requirements. 2019 ed. Accessed 15 July 2019. Reference Source
  • 2. Wamsley MA, Julian KA, Wipf JE: A literature review of "resident-as-teacher" curricula: do teaching courses make a difference? J Gen Intern Med. 2004;19(5 Pt 2):574–581. 10.1111/j.1525-1497.2004.30116.x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3. Ramani S, McKimm J, Thampy H, et al. : From clinical educators to educational scholars and leaders: strategies for developing and advancing a career in health professions education. Clin Teach. 2020;17(5):477–482. 10.1111/tct.13144 [DOI] [PubMed] [Google Scholar]
  • 4. Crawford KA, Wood TJ, Lalonde KA, et al. : Faculty Development- Is Some Better Than None? [version 1]. MedEdPublish. 2019;8:18. 10.15694/mep.2019.000018.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5. O'Sullivan PS, Irby DM: Reframing research on faculty development. Acad Med. 2011;86(4):421–428. 10.1097/ACM.0b013e31820dc058 [DOI] [PubMed] [Google Scholar]
  • 6. Clyburn EB, Wood C, Moran W, et al. : Valuing the education mission: implementing an educational value units system. Am J Med. 2011;124(6):567–572. 10.1016/j.amjmed.2011.01.014 [DOI] [PubMed] [Google Scholar]
  • 7. Steinert Y, Mann K, Anderson B, et al. : A systematic review of faculty development initiatives designed to enhance teaching effectiveness: A 10-year update: BEME Guide No. 40. Med Teach. 2016;38(8):769–786. 10.1080/0142159X.2016.1181851 [DOI] [PubMed] [Google Scholar]
  • 8. Arthur W, Bennett W, Stanush PL, et al. : Factors that influence skill decay and retention: a quantitative review and analysis. Human Performance. 1998;11(1):5–10. 10.1207/s15327043hup1101_3 [DOI] [Google Scholar]
  • 9. Novak GM: Just-in-time teaching. New Directions for Teaching and Learning. 2011;2011(128):63–73. 10.1002/tl.469 [DOI] [Google Scholar]
  • 10. Michael SH, Hitt-Laustsen J, Kendall JL: Utilization of Marketing Automation Tools for Delivery of a Faculty Development Curriculum. [data set].2023. 10.17605/OSF.IO/KJZ8U [DOI]
  • 11. Glasgow RE, Harden SM, Gaglio B, et al. : RE-AIM Planning and Evaluation Framework: Adapting to New Science and Practice With a 20-Year Review. Front Public Health. 2019;7:64. 10.3389/fpubh.2019.00064 [DOI] [PMC free article] [PubMed] [Google Scholar]
MedEdPublish (2016). 2024 Jun 5. doi: 10.21956/mep.21507.r36838

Reviewer response for version 1

Michael Cassara 1

Thank you for your submission to MedEdPublish. There are no actual or perceived conflicts of interest to report.

Summary: The authors report on their development and implementation of an educational innovation (“synchronous” longitudinal “just-in-time” faculty development curriculum delivered by email and text message pushes using a marketing automated tool) to overcome barriers and challenges to content delivery using more traditional methods (CME courses, asynchronous on-demand learner-initiated content pulls, etc.). The intended audience for this faculty development curriculum were clinical faculty in emergency medicine located at two emergency departments in the United States. The authors’ aim of this faculty development curriculum was to enhance teaching, feedback, and learner assessment (presumably in the patient care environment). The authors gathered data largely centered on the metrics associated with the software (total number of email/texts sent, number/percent opened, disenrollments, etc.) and quantitative metrics aggregated from deidentified responses to items from the Accreditation Council for Graduate Medical Education (ACGME) annual resident survey concerned with the quantity and quality of faculty teaching.

As I am interpreting this manuscript as an educational innovation, I am framing my review on the DoCTRINE guidelines as proposed by Blanco et al. (2022).

Introduction

1. The authors provide an adequate description of the need and justification for the curriculum/innovation. The use of insights and data gained from a needs assessment/gap analysis is in keeping with curricular best practices (e.g., Thomas et al. 2022). This enhances attainment of the DoCTRINE criteria. Thank you. However, I find the authors’ use of the descriptors “synchronous” and “just-in-time” in the manuscript misleading. Synchronous, in my understanding of the term as applied to health professions education, commonly implies the implementation of an event “contemporaneous” with other learners (within a social learning context, whether the learning environment is a physical space or virtual milieu). A longitudinal series of posts to a discussion board, for example, is not usually referred to as “synchronous”. The authors appear to use this term to indicate that content pushes (by email and text) occur simultaneously for all recipients; “synchronized”, a term also used by the authors in the manuscript, seems more appropriate and conveys the intended effect. I believe the distinction I am making is important: a simultaneously received automated push does not guarantee or equate with the simultaneous opening of content, nor does it represent a synchronous, in-the-moment collaboration and social interaction with, by, and involving the content by multiple learners (whether co-located or distributed). Recommendation: I would not refer to this curriculum or its delivery method as “synchronous” but rather “synchronized”, “coordinated”, or “timed”. Likewise, I find the phrase “just-in-time” has many meanings across education in general and specifically in health professions education. I had initially believed the authors’ use of the term more aligned with the thoughts discussed by Yilman et al. (2021). After reading the Novak article and examining the authors’ content (especially the email content; see Reference 10, Michael et al. [2023] below), I better understood the alignment between this manuscript’s design and the concepts described by Novak (2011). (For readers of this review unfamiliar with Novak’s description, I would summarize by stating that it embraces a “flipped classroom” approach to content dissemination that cognitively primes and prepares recipients in advance of clinical shifts to enhance student-student and student-teacher interaction and time on task when students and teachers are present together in the learning environment.) Recommendation: As the alignment with Novak’s “just-in-time” framework was not immediately understandable to me as a reader, I propose the authors review the Novak framework in better detail and compare it with other constructs of “just-in-time” (e.g., Yilman et al. [2021]); consider comparing and contrasting the two.

2. The authors provide an adequate review of the literature within the Introduction, including a recent Best Evidence in Medical Education report (a systematic review of faculty development curricula by Steinert et al. [2016]). Later in the manuscript (Methods), the authors also incorporate a previously described and recognized framework to assess their technology-enhanced curricular implementation (RE-AIM). These examples support attainment of the DoCTRINE criterion. I acknowledge the limitations imposed by the brief report format. However, I believe there is room for the authors to provide 1-2 sentences on the theory/framework/model that informs the Novak model of “just-in-time” and supports the authors’ selection of pre-shift educator priming or knowledge dissemination via content pushes in favor of content pulls (see comments above and below this one). Recommendation: To even further enhance fulfillment of the DoCTRINE criterion, the authors should better explain the theory that underlies the Novak framework/model to support why they chose it as part of their curricular design and implementation.

3. Unique contribution: The authors adequately describe how this curricular innovation (utilization of marketing automation tools to provide synchronized delivery of faculty development content) is unique from conventional methods of content dissemination for faculty development. They miss the opportunity to explain briefly how content pushes differ from content pulls. Recommendation: To enhance fulfillment of the DoCTRINE criterion, the authors could compare and contrast their innovative method, which involves scheduled pushes to its audience, with others that leverage on-demand pulls by the audience (consider Orner et al. [2022]).

Curriculum development

4. The authors clearly state the purpose/goals of the educational innovation. Thank you.

5. The authors imply that the objective of this curriculum is to enhance teaching, feedback, and learner assessment (presumably in the patient care environment). The authors do not provide specific detail for readers to understand exactly what learning outcomes (related to the faculty) are expected through engagement with this curriculum. Does the content build in a scaffolded way so that its intended recipients (the faculty as learners) grow developmentally as the longitudinal curriculum unfolds?  Readers would love to see the curricular map of how the content was scheduled to unfold (or was it random?). Recommendation: While the authors provide supplemental digital access to their text messages/emails and the data, they do not provide a curriculum map that lists or describes the goals and objectives related to each faculty development content push by week. I believe this is feasible and would enhance attainment of this DoCTRINE criterion.

6. The authors clearly state the target audience of the educational innovation, supporting attainment of the DoCTRINE criterion. Thank you.

Curriculum implementation and Results

7-8. The authors adequately describe the key features of the software and the resources needed to implement the curriculum, supporting attainment of the DoCTRINE criterion. Thank you.

9. The authors adequately describe the instructional method. In the supplemental materials they provide through their registration with the Open Science Foundation (see Reference 10, Michael et al. [2023] below), the authors provide readers with access to the actual curricular tools, supporting attainment of the DoCTRINE criterion. Thank you.

10. While the authors provide outcomes data on their implementation, most of their evaluation data center on program- and process-centered outcomes. The authors indeed provide adequate proof-of-concept to answer the question: does the marketing automated tool work to effectively disseminate information? Regarding the more important question of curricular efficacy, I do not feel the authors gathered sufficient evidence of curricular efficacy (as described in the RE-AIM) related to the learners. The responses to the ACGME resident survey are not specific enough. While the data is clearly depicted in the table and figure, the level of statistical analysis is minimal and descriptive. We are not truly able to infer correlation between this curriculum and any improvement of scores; clearly, causation is impossible. Even if the intent were to provide descriptive data, I believe the authors could have gathered more and provided a slightly more detailed level of analysis to support their story. They missed an opportunity to include mixed methods approaches to gathering richer data on curricular efficacy (e.g., self-reflection by faculty, free text comments by residents regarding change in the culture of clinical teaching/feedback, etc.). Recommendation: Not sure if the authors are able to address this concern. If the authors have learner-specific outcomes (faculty outcomes, or indirectly, learner outcomes), they should include it to enhance attainment of this DoCTRINE criterion.  

11. The authors describe RE-AIM, one of their primary evaluation frameworks, supporting attainment of the DoCTRINE criterion. Thank you.

12-14. The authors describe the number of participants. The authors provide raw data on the process- and program-centered outcomes (see Reference 10, Michael et al. [2023] below). See comment 10.

Discussion

15. The authors provide an adequate summary of findings, meeting the minimum expectations of the DoCTRINE criteria. However, I do believe they could have enhanced attainment of the DoCTRINE criteria for the Discussion section by expanding the discussion. I would have liked more reflective critique on the successes and opportunities they encountered over the period of implementation and observation. What happened during 2020-2022 with the experience of the pandemic? Finally, the authors could have provided more reflection and commentary, based on their experience, on the types of content (static v. dynamic, infographics v. video or audio media) that worked best using the Novak model and the marketing automated tool platform. They also could have provided more commentary comparing the push-pull aspects (as mentioned earlier).

Have any limitations of the research been acknowledged?

Partly

Is the study design appropriate and does the work have academic merit?

Partly

Is the work clearly and accurately presented and does it cite the current literature?

Partly

If applicable, is the statistical analysis and its interpretation appropriate?

No

Are all the source data underlying the results available to ensure full reproducibility?

Yes

Are the conclusions drawn adequately supported by the results?

Partly

Are sufficient details of methods and analysis provided to allow replication by others?

Yes

Reviewer Expertise:

health professions education, healthcare simulation

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

References

  • 1. : The DoCTRINE Guidelines: Defined Criteria To Report INnovations in Education. Acad Med. 2022;97(5):689–695. 10.1097/ACM.0000000000004634 [DOI] [PubMed] [Google Scholar]
  • 2. : Just in Time Teaching (JiTT) Infographics App for Teacher Development. J Grad Med Educ. 2022;14(3):344–345. 10.4300/JGME-D-21-01011.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3. : Utilization of marketing automation tools for delivery of a faculty development curriculum. MedEdPublish. 2024;14:19. 10.12688/mep.20084.1 [DOI] [Google Scholar]
  • 4. : Just-in-Time Teaching. Routledge. 2009.
  • 5. : Curriculum development for medical education: A six-step approach. Johns Hopkins University Press. 2015.
  • 6. : The Learning Loop: Conceptualizing Just-in-Time Faculty Development. AEM Educ Train. 2022;6(1):e10722. 10.1002/aet2.10722 [DOI] [PMC free article] [PubMed] [Google Scholar]
MedEdPublish (2016). 2024 Jun 4. doi: 10.21956/mep.21507.r36837

Reviewer response for version 1

Rory Merritt 1

This article highlights challenges and potential solutions to several common problems with faculty development (FD). Primary challenges in FD include lack of participant time, scheduling challenges, variability in content, and unclear participation metrics. A potential solution proposed by the authors is to leverage marketing tools to deliver bite-sized teaching content to EM faculty using weekly emails and pre-shift text messages. Results are somewhat limited but nonetheless important: the number of total emails/texts sent and estimated email engagement. Results also include data related to teaching effectiveness. I am not a statistician but would like to know whether the changes in teaching effectiveness (resident ratings) are statistically significant. The use of an implementation science framework to support novel faculty development tools is important, and this is a well-done paper.

Have any limitations of the research been acknowledged?

Yes

Is the study design appropriate and does the work have academic merit?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Yes

Are the conclusions drawn adequately supported by the results?

Yes

Are sufficient details of methods and analysis provided to allow replication by others?

Yes

Reviewer Expertise:

Medical Education

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

MedEdPublish (2016). 2024 May 16. doi: 10.21956/mep.21507.r36841

Reviewer response for version 1

Giovanna Sirianni 1,2

Summary: This manuscript outlines a just-in-time faculty development intervention. The intervention included automated email and text messages pushed to clinical supervisors weekly prior to working with learners. The messages included reminders of weekly content and links to faculty development materials. Outcomes included the number of emails opened; engagement with text messages could not be determined. The authors suggest faculty teaching evaluation scores improved during the period of this intervention. Descriptive statistics are provided for faculty teaching evaluations over a two-year period; statistical significance is not reported. The authors conclude that digital marketing tools (email/text) can be used to effectively deliver faculty development content.

Feedback:

1) Faculty development is critical to support effective teaching, feedback and assessment in health professions education. The team should be commended on developing an intervention to support ongoing, just-in-time, faculty development to support clinical supervisors and to address commonly known barriers in this educational sphere. 

2) Limitations of the study include:

i) The introduction could be strengthened with more recent literature on the use of digital technology in the domain of faculty development. In addition, clarification of the 'needs assessment' would be helpful. It appears this was not a true needs assessment of faculty, but rather feedback on areas for improvement from the resident perspective. Changing the 'needs assessment' terminology would help clarify the data source that prompted the intervention. 

ii) The reach of the emails/texts reported does not equate with engagement/review/implementation of the content by the recipients. The study would be strengthened with quantitative and qualitative feedback from the clinical faculty on their engagement with the materials and how they applied them during supervision. This is an important piece of data that is missing.

iii) The data presented on teaching evaluations include descriptive statistics on percentage change over the study period. However, the team does not address whether other faculty development interventions during the study period may have impacted these results. This section would be strengthened by addressing other factors that may have impacted scores and by applying appropriate statistical tests to determine whether the difference in scores is significant.

Have any limitations of the research been acknowledged?

Partly

Is the study design appropriate and does the work have academic merit?

Partly

Is the work clearly and accurately presented and does it cite the current literature?

Partly

If applicable, is the statistical analysis and its interpretation appropriate?

No

Are all the source data underlying the results available to ensure full reproducibility?

Yes

Are the conclusions drawn adequately supported by the results?

No

Are sufficient details of methods and analysis provided to allow replication by others?

Yes

Reviewer Expertise:

Faculty development, competency-based medical education

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.
