Abstract
Background
To support diversity in biomedical science, the American Heart Association launched the Supporting Undergraduate Research Experiences program to provide mentorship and high‐level research exposure for undergraduate students from underrepresented backgrounds at 5 leading medical institutions. Here we describe the initial formation of the partnership and the alterations made to the program to accommodate COVID‐19 safety precautions.
Methods and Results
We outline how programming shifted from local, in‐person programming in the summer of 2019 to a collaborative, mainly virtual curriculum in 2020, using students’ self‐reported preprogram and postprogram surveys from both 2019 (n=33) and 2020 (n=42). Students in both the in‐person (2019) and virtual (2020) programs self‐reported significant gains in scientific proficiency. A qualitative directed content analysis of student open‐response questions was performed. Students reported extensive benefits from the 2020 virtual training, including Personal Gains, Research Skills, Thinking and Working Like a Scientist, and Attitudes and Behaviors. Notably, we observed increases in the Attitudes and Behaviors category. We outline the pros and cons of in‐person and virtual programming and make recommendations for moving forward in a postpandemic world with hybrid work and learning systems.
Conclusions
Our effort informs the development of future undergraduate research training programs, particularly those that maximize a hybrid training modality. The American Heart Association Supporting Undergraduate Research Experiences program serves as a model for building multi‐institutional partnerships and providing research experiences that overcome institutional barriers and support students' interests, commitment, and ability to persist in science, technology, engineering, and math fields.
Keywords: American Heart Association (AHA), multi‐institutional programming, online, Supporting Undergraduate Research Experiences (SURE), training, undergraduate, virtual
Subject Categories: Disparities, Quality and Outcomes, Statements and Guidelines, Basic Science Research
Nonstandard Abbreviations and Acronyms
- AHA SURE
American Heart Association Supporting Undergraduate Research Experiences
- AHA
American Heart Association
Clinical Perspective
What Is New?
We built multi‐institutional digital partnerships and provided research experiences for undergraduate students from underrepresented backgrounds.
These experiences provide benefits, including Personal Gains, Research Skills, Thinking and Working Like a Scientist, and Attitudes and Behaviors.
What Are the Clinical Implications?
The American Heart Association Supporting Undergraduate Research Experiences serves as a model for building multi‐institutional partnerships.
Our effort informs the development of future undergraduate research training programs, particularly those that maximize a hybrid training modality.
We outline the pros and cons of in‐person and virtual programming and make recommendations moving forward in a postpandemic world with hybrid work and learning systems.
Cardiovascular disease remains the leading cause of death in the United States; however, its impact on different communities has been inequitable. Minoritized communities and communities of lower socioeconomic status are disproportionately burdened with cardiovascular risk factors and cardiovascular disease mortality. Students from these communities who become physicians are more likely to practice primary care and serve underserved communities.1 Research has shown that the number of students from underrepresented backgrounds in biomedical science remains low,2 including individuals from Black, Hispanic, Native American, Alaska Native, Native Hawaiian, and Pacific Islander backgrounds. Although these groups represented ≈33% of the US population in 2019,3 according to 2016 to 2020 data from the Association of American Medical Colleges, <12% of the total number of US medical school graduates were from these groups.4, 5, 6 A survey also performed by the Association of American Medical Colleges showed that <13% of active physicians in 2018 identified as underrepresented.4, 5
In cardiology, the numbers are more disproportionate, with <3% of cardiologists identifying as Black in 2016.7 In 2019, of the doctoral degrees awarded in science and engineering to US citizens or permanent residents, 7.1% went to Hispanic candidates, 0.2% went to Native American candidates, and 4.1% went to Black candidates.8 Systemic changes are needed to diversify the biomedical workforce. Research has highlighted the importance of the educational process in increasing the representation of women and underrepresented people in science, technology, engineering, and math (STEM) by demonstrating that the likelihood of persistence across postsecondary training depends on race, ethnicity, and gender,9, 10 but less research has focused on the nature of educational experiences and their relationship to career plans within specific STEM disciplines.11
In 2018, the American Heart Association (AHA) commissioned the RAND Corporation to identify factors associated with this disparity and produce actionable recommendations to promote and foster diversity within the biomedical sciences. From this effort, it was recommended that the AHA develop a mentored research program specifically for undergraduate students from underrepresented backgrounds, with the aims of providing mentorship and high‐level research exposure for these students. In 2019, the AHA launched the Supporting Undergraduate Research Experiences (SURE) pilot program to help promote diversity in cardiovascular biomedical research. Since that time, the AHA SURE institution syndicate was formed to enable cross‐institutional programming collaboration. The program was designed to recruit students interested in biomedical research but did not select students based on specific graduate education plans (eg, MD, MD/PhD, PhD). The current study uses the AHA SURE program student data in addition to data from the National Institutes of Health–funded student programs that shared programming to examine programmatic impact. We report on the structure and components of the AHA SURE program, the feasibility and challenges of cross‐institutional collaboration, and its impact on students’ self‐reported scientific proficiency. Ultimately, the goal of this effort is to maximize impact and training practices for underrepresented students in a hybrid (remote/in‐person) training environment.
Methods
As per the AHA journals’ implementation of the Transparency and Openness Promotion Guidelines, the data that support the findings of this study are available from the corresponding author upon reasonable request.
Planning and Recruitment
The AHA SURE program launched as a collaboration between the AHA and 5 American medical schools (Boston University, Medical College of Wisconsin, Northwestern University, Stanford University, and Vanderbilt University). These medical schools were selected in part for their locations, which capture the geographic heterogeneity of the country. The AHA awarded funding to these participating institutions to work directly with regional historically Black colleges and universities and Hispanic‐serving institutions to recruit scholars from underrepresented backgrounds. In the pilot, the 5 participating institutions were given 3 to 4 undergraduate student trainee slots. Students were recruited both from local populations (eg, students enrolled at the host institution) and from open applications available to both local and nonlocal students (Figure S1). Almost all of the students entering the programs intended to pursue graduate‐level degrees such as MD, PhD, or MD/PhD (Table S1). To direct program design and implementation, the AHA assembled an Oversight Advisory Committee of AHA staff and university representatives. The Oversight Advisory Committee is responsible for overall program guidance, implementation, milestones, and evaluation metrics.
AHA SURE Program Elements
Selected trainees were paired with an AHA‐funded investigator who served as a research mentor for the duration of the 10‐week summer experience. The exact program start date differed slightly for each institution, but the program generally ran from early June to mid‐August for both the 2019 and 2020 iterations. AHA SURE scholars were awarded an educational stipend of $6000. Scholars were required to participate in a research project in which they could be a lead junior investigator or make a significant contribution. The research could have a cardiovascular disease focus or span a wide variety of other specialties in basic, population, or clinical science. The site directors at each institution were charged with quality control and oversight. The AHA also provided travel costs to attend/present at a scientific meeting. In addition to mentored research projects for the scholars, each host institution also had a uniquely designed learning curriculum for its students. However, across all participating institutions, each curriculum included compliance training elements (eg, use of human and/or animal subjects, radioactivity, biohazards), responsible conduct in research, health equity, and stress/time management.
In 2019 the programs were run traditionally, with each institution locally structuring the program and providing the program's content. An overview of the programming in 2019 is shown in Figure 1A. There was a significant amount of overlap in the individually implemented institutional programs in 2019. The common programming among the participating sites included mentored in‐person research projects, social events, and responsible conduct of research training. Additionally, each program hosted an independent symposium for the student research presentations at the end of the program.
Figure 1. Shifting from local programming to collaborative curriculum.
A, Overview of 2019 common and site‐specific programming. In 2019, each institution ran the summer undergraduate program locally and all provided mentored independent research projects. Other elements included journal clubs, professional development seminars, biotechnology company site visits, and other training and professional development opportunities. B, Overview of 2020 shared and unique programming with a collaborative curriculum that leveraged each institution’s successful previous programming. Although all programs retained an independent research component and local social activities, a substantial amount of programming was shared across institutions and therefore available to all students across the sites. Collaborative programming was coordinated by the American Heart Association. GRE indicates Graduate Record Examinations; and VSSA, Vanderbilt Summer Science Academy.
However, several programming elements were unique to each institution in 2019 (Figure 1A). For example, Boston University hosted a professional development seminar series and accompanied the students on site visits to biotechnology companies in the Boston area. Boston University also arranged for SURE scholars to meet local AHA staff and interact with project leads funded by the AHA Social Impact Fund in Boston; the funded projects are community‐based enterprises designed to mitigate social and economic barriers to improve health outcomes. Stanford University and Vanderbilt University offered cardiovascular science seminars and several career‐path conversations to demonstrate the wide variety of opportunities available to students within the biomedical discipline. Like Boston University, Stanford University also hosted biotechnology company site visits. Northwestern University offered a Cardiology 101 series that discussed basic clinical cardiovascular disease principles. Northwestern University also hosted a medical professionals panel for students to learn more about medical school and residency experiences. Vanderbilt University hosted the Vanderbilt‐Meharry Alliance, a larger umbrella program that included the health disparity, health equity, and advocacy piece from the AHA Southeastern region. The students had access to the Vanderbilt Summer Science Academy, which brings together all undergraduates conducting biomedical research at Vanderbilt University School of Medicine and provides programming. The students at Vanderbilt also had access to a Graduate Record Examination preparation course.
Transition to Virtual Programming in 2020
Although most in‐person programs across the country were canceled in the summer of 2020, the SURE program switched to a virtual training model and decided to use each participating institution’s strengths to create a collaborative program (Figure 1B). In 2020, each site program offered research opportunities for the scholars; however, the format of these opportunities was unique to each institution. For example, Northwestern University was able to provide in‐person training. Both Boston University and Stanford University opted for individual, mentored virtual research experiences with individual faculty members. Vanderbilt University used a group‐based science communication project model for research exposure, including developing and submitting an institutional review board protocol and administering surveys to evaluate video science communication products. Additionally, each site hosted virtual social events.
Furthermore, the AHA‐supported institutions collaborated to share their unique program offerings with all the scholars across sites. For example, Boston University offered their professional development training in a virtual format. They also provided virtual seminars in data visualization. Vanderbilt University offered virtual events with advocacy and community engagement groups, and online courses in planning your scientific journey, experimental design and hypothesis testing, and digital science communication. Northwestern University offered Cardiology 101 virtually and hosted virtual medical student and residency career path panels. Stanford University hosted shared cardiovascular research seminars, a virtual technique demonstration, design thinking sessions, and virtual exposure to a local biotechnology company. At the end of the summer experience, the scholars presented their work at the SURE Scholars online symposium, attended by all students and stakeholders at the AHA and at participating institutions.
Overall, the AHA SURE program served as the syndicate that enabled cross‐institutional collaboration. The current study uses the AHA SURE program student data in addition to data from National Institutes of Health–funded student programs to examine programmatic impact.
Evaluation
To better understand the impact of the virtual program implementation, we compared program evaluation data from 2019 and 2020. In general, the goal of the program evaluation was to assess the implementation and effectiveness of programming spearheaded by AHA SURE, which aimed to prepare students for STEM careers at Boston University, Stanford University, and Vanderbilt University. In 2019, evaluations were performed independently by 2 host institutions: Boston University and Vanderbilt University. In 2020, because of the virtual nature of the research experience, the AHA centralized data collection from 3 institutions (Boston University, Stanford University, and Vanderbilt University) at the AHA Center for Health Metrics and Evaluation. Preprogram and postprogram surveys were administered online using Microsoft Forms. The preprogram survey served as the control within each year of programming.
To consolidate data and see thematic changes through the program span, we generated thematic categories based on the Undergraduate Research Student Self‐Assessment survey (see Data S1 for details). 12 Questions were interpreted based on their examination of authentic scientific inquiry compared with more traditional, rote class‐based science laboratories. 13
Statistical Analysis
Comparisons of pre‐ and postsurvey responses were made using standard statistical methods: 2‐tailed t tests, paired or unpaired, with an α of 0.05. We used a directed content analysis approach on the open‐ended survey responses from all students, using the Undergraduate Research Student Self‐Assessment thematic categories.14 Directed content analysis is appropriate for this purpose because it describes the phenomenon within the present theory of experiential training.15 Overall, 3 coders were used in the directed content analysis, and the Online Kappa Calculator was used to calculate κ values for agreement between reviewers (http://justusrandolph.net/kappa/). This measurement provides a chance‐adjusted measure of agreement for any number of cases, categories, or raters. The online calculator reports 2 variations of κ: Fleiss’s fixed‐marginal multirater κ and Randolph’s free‐marginal multirater κ,16, 17 with Gwet’s variance formula.18 The κ values are reported in the figure legends.
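As an illustration of the agreement statistic, the sketch below (in Python; not the authors' code, and the coder counts are hypothetical) computes Randolph's free‐marginal multirater κ, the calculation performed by the Online Kappa Calculator.

```python
# Illustrative sketch: Randolph's free-marginal multirater kappa.
# ratings[i][j] = number of raters who assigned case i to category j.
def free_marginal_kappa(ratings):
    n_cases = len(ratings)
    n_categories = len(ratings[0])
    n_raters = sum(ratings[0])  # assumes every case is rated by all raters
    # Observed agreement: proportion of agreeing rater pairs per case.
    p_o = sum(
        (sum(c * c for c in case) - n_raters) / (n_raters * (n_raters - 1))
        for case in ratings
    ) / n_cases
    p_e = 1.0 / n_categories  # expected agreement under free marginals
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: 3 coders, 4 thematic categories, 5 coded comments.
counts = [
    [3, 0, 0, 0],  # all 3 coders agree
    [0, 3, 0, 0],
    [2, 1, 0, 0],  # partial agreement
    [0, 0, 3, 0],
    [0, 1, 1, 1],  # complete disagreement
]
print(round(free_marginal_kappa(counts), 3))  # → 0.556
```

Fleiss's fixed‐marginal variant differs only in estimating the expected agreement from the observed category marginals rather than assuming equal category probabilities.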
For more details about the thematic category development, Likert data analysis, qualitative analysis methods, and individual program demographics, see Table S2. Although 5 institutions participated in the 2019 program and 4 participated in 2020, not all institutions completed institutional review board–approved evaluations in both years. The institutional review boards at Boston University Medical Campus, Stanford University, and Vanderbilt University reviewed and approved the proposals. The participants gave informed consent, and all data were deidentified. In addition, the Medical College of Wisconsin opted out of the shared programming in 2020.
Results
Participant Characteristics
The AHA SURE program’s primary goal was to provide undergraduate research experiences for underrepresented students in biomedical science and enable the collaborative efforts of multiple institutions to develop a multisite training experience. This led to an impact not only for the AHA SURE fellows, but also for other students in similar programs funded by additional training grants (National Institutes of Health R25, AHA Undergraduate Institutional) at some of the institutions. See Table S3 for information about the number of students funded by different mechanisms at each institution.
The programming was the same for all students, regardless of the funding mechanism supporting them. To capture the AHA SURE program’s collaborative efforts, we collected survey responses from all undergraduates who participated in these programs, whether funded by AHA SURE, a National Institutes of Health R25, or an AHA Undergraduate Institutional award. In total, data were analyzed from 69 students who completed the surveys. Not all programs that participated in the summer programming participated in this research study. Additionally, not all students who were involved in the programs completed a survey. Thirty‐four percent of the students surveyed were men and 66% were women. In total, 54% were non‐Hispanic Black, 25% were Hispanic or Latino/Latina, 11% were non‐Hispanic Asian, 7% were non‐Hispanic White, and 3% were non‐Hispanic Native American or Alaska Native (see Table S2 for participant ethnicity breakdown by institution). The majority of the students who participated in the program came from primarily minority‐serving home institutions (Figure S1) and had varying levels of research experience (Figure S2). Students who identified as sexual and gender minorities or from disadvantaged backgrounds also participated in the programs, but these data were not included because demographic information about these identities was not collected consistently across institutions.
Both In‐Person (2019) and Virtual Programs (2020) Provided Students With Significant Gains in Scientific Proficiency and Confidence
We pooled student responses across the participating sites with available data for 2019 and 2020 (Table 1). In 2019, students participated in a single postprogram survey in which they were asked to rank their proficiency in different domains of scientific research training on a scale of 1 to 5. Students were asked 13 questions across 5 thematic categories in 2019 and 33 questions across 6 thematic categories in 2020. We averaged the responses across all questions for each student. In 2019, the median student self‐reported preprogram proficiency (reported postprogram) was 3.46, and the median postprogram proficiency was 4.0 (Figure 2A, upper panel). There was a significant gain in self‐reported proficiency (P=0.003, 2‐tailed paired t test, N=12). In 2020, the median preprogram response values were 3.42, and the median postprogram response was 3.82 (Figure 2A, lower panel). The increase in confidence was significant when only the data from students who completed both the 2020 preprogram and postprogram survey were evaluated (P=0.0002 with Bonferroni correction, 2‐tailed paired t test, N=30), and when the full data were evaluated (P=0.002 with Bonferroni correction, 2‐tailed unpaired t test, N=36 presurvey respondents, N=33 postsurvey respondents; Table S4).
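The paired comparison described above can be sketched as follows (hypothetical averaged student scores, not the study data). Only the t statistic is computed here; the 2‐tailed P value would then be obtained from a t distribution with n−1 degrees of freedom.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    # t statistic for a paired t test: mean difference over its standard error.
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical per-student proficiency scores averaged across questions (1-5 scale).
pre = [3.2, 3.5, 3.0, 3.8, 3.4]
post = [3.9, 4.0, 3.6, 4.1, 3.8]
print(round(paired_t(pre, post), 2))  # → 7.07
```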
Table 1.
Institutional Survey Completion Data
| Institution | No. of program participants (2019) | Postsurvey response, n (%) (2019) | No. of program participants (2020) | Presurvey response, n (%) (2020) | Postsurvey response, n (%) (2020) |
|---|---|---|---|---|---|
| Boston University | 3 | 3 (100) | 3 | 3 (100) | 3 (100) |
| Vanderbilt University | 13 | 9 (90) | 20 | 14 (70) | 11 (55) |
| Stanford University | 20 | 0 (0) | 21 | 19 (90) | 19 (90) |
Survey completion data for the 3 organizing institutions are shown for the 2019 and 2020 summer programs. For 2019, the number of participants and the number of postsurvey responses are listed. For 2020, the number of participants, presurvey responses, and postsurvey responses are listed.
Figure 2. Self‐reported gains in students’ proficiency and confidence in research.
A, Upper panel: 2019 self‐reported presurvey and postsurvey proficiency of students as averaged across 13 questions (N=12 students). Lower panel: 2020 self‐reported presurvey and postsurvey confidence of students as averaged across 33 questions (N=30 students). B, Upper panel: 2019 difference in postsurvey and presurvey proficiencies for individual questions, averaged across students. Questions are divided into 5 thematic categories. Lower panel, right: 2020 difference in postsurvey and presurvey confidence for individual questions, averaged across students. Questions are divided into 6 thematic categories. Avg. indicates average.
We categorized these questions into 6 domains as described in Data S1. By subtracting the average across‐student preprogram proficiency or confidence from the average across‐student postprogram proficiency or confidence, we measured the average gain assessed by each question. In 2019, the highest gains in student self‐reported proficiency were in the categories of communication and thinking like a scientist, specifically, developing a scientific abstract and reading and evaluating scientific articles. Lower gains were observed in the categories of Personal Gains and Research Skills (Figure 2B, upper panel). In 2020, the highest gains were also in the domains of communication and thinking like a scientist, as well as in scientific Research Skills (Figure 2B, lower panel).
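The per‐question gain computation can be sketched as follows (hypothetical Likert scores, not the study data): the preprogram average across students is subtracted from the postprogram average for each question.

```python
# Hypothetical pre/post Likert scores: rows = students, columns = questions.
pre = [[3, 4, 2], [4, 3, 3], [2, 3, 4]]
post = [[4, 5, 3], [5, 4, 4], [3, 3, 5]]

n_students, n_questions = len(pre), len(pre[0])

# Average postprogram score minus average preprogram score, per question.
gains = [
    sum(s[q] for s in post) / n_students - sum(s[q] for s in pre) / n_students
    for q in range(n_questions)
]
print([round(g, 2) for g in gains])  # → [1.0, 0.67, 1.0]
```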
Figure 3. Directed content analysis of open‐ended responses based on Undergraduate Research Student Self‐Assessment survey themes.
Scatter plots are shown for all 3 coders' counts, with the bar graph indicating the mean. The error bars indicate the standard deviation from the mean. A, For the primary codes, the average number of codes for Personal Gains, Thinking and Working Like a Scientist, Skills, and Attitudes and Behaviors was 18, 15, 23, and 39, respectively. B, The averages for the secondary codes were 20, 8, 5, and 10. C, When the primary and secondary codes are combined, the averages are 38, 24, 28, and 50. A Kruskal‐Wallis test was performed, followed by Dunn's multiple comparison test. Overall, there was no significant difference in the number of codes per theme for the primary codes (P=0.15). However, there was a significant difference for the secondary codes (P=0.008): responses coded as Personal Gains were significantly more frequent than those coded as Skills (P=0.03). When the codes were combined, there was no significant difference (P=0.10). D, Representative quotes for each theme, including primary and secondary themes, are shown. A indicates Attitudes; N/A, not applicable; P, Personal Gains; PI, Principal Investigator; S, Skills; and T, Thinking and Working Like a Scientist.
Notably, in 2020, students reported lower postsurvey than presurvey rankings in 4 aspects of their training: working well independently, comfort in collaborating with others, managing their time effectively, and feeling encouraged by individuals in their life to pursue careers involving research and medicine. Overall, students exhibited significant increases in proficiency and confidence in both the 2019 in‐person program and the 2020 virtual program. Students also exhibited the largest gains in similar training categories: communication and thinking like a scientist.
2020 Student Cohort Experienced Extensive Personal Gains, Research Skills, Thinking and Working Like a Scientist, and Attitudes and Behaviors in a Virtual Environment
To consolidate data and see thematic changes through the program span, we generated thematic categories. These categories were based on those defined by the Undergraduate Research Student Self‐Assessment survey. The Personal Gains related to research work category of the survey is meant to assess "affective characteristics of confidence, comfort, and general self‐efficacy with conducting research and working on a research team and in a lab." The Skills category of questions refers to "traditional academic outcomes associated with more direct assessments." Understanding and defining the skills necessary for science laboratory work varies dramatically across disciplines, but a set of common skills was identified in an earlier study. For this study, we further refined the category of Skills into Communication Skills, such as writing a scientific abstract or presenting a poster; Scientific Research Skills, such as using statistics to analyze data; and science‐adjacent Research Skills, such as time management. Direct quotes from the survey data are provided for each theme in Figure 3, with additional examples available upon request.
To examine the frequency of the qualitative data categories, we conducted an analysis to determine whether there were statistically significant differences in the frequency of the thematic categories across all responses (Figure 3). The theme Attitudes and Behaviors had the most representation, followed by Skills. Comments coded as Thinking and Working Like a Scientist and as Personal Gains appeared less often. Although there was only a trend toward differences between categories in the primary codes (P=0.06), in the secondary codes Personal Gains appeared more often than the other 3 categories, with a significant difference between Personal Gains and Skills (P=0.03). When the primary and secondary codes were pooled, there was also a nonsignificant trend (P=0.06) toward an increase in the presence of comments coded as Attitudes and Behaviors. For the primary codes, the κ agreement scores were 0.42, 0.62, 0.72, 0.47, 0.52, and 0.65 for open‐response questions 1 through 6, respectively. The average κ value was 0.56, indicating a moderate degree of agreement between coders.
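For illustration, the Kruskal‐Wallis H statistic used to compare code counts across themes can be computed as below. The coder counts are hypothetical (not the study data), and this sketch omits the tie‐correction factor; the P value would then come from a χ² distribution with (number of groups − 1) degrees of freedom.

```python
from itertools import chain

def kruskal_h(groups):
    # Pool all observations; assign 1-based ranks (average rank for ties).
    pooled = sorted(chain.from_iterable(groups))
    n = len(pooled)
    ranks, i = {}, 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of tied ranks i+1..j
        i = j
    # H statistic (no tie-correction factor applied here).
    return 12 / (n * (n + 1)) * sum(
        sum(ranks[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)

# Hypothetical per-coder code counts for 4 themes (3 coders each).
themes = [[17, 18, 19],   # Personal Gains
          [14, 15, 16],   # Thinking and Working Like a Scientist
          [22, 23, 24],   # Skills
          [38, 39, 40]]   # Attitudes and Behaviors
print(round(kruskal_h(themes), 2))  # → 10.38
```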
Directed Content Analysis From 2020 Students on In‐Person Versus Virtual Opportunities and Multisite Collaborations Versus Local Institutional Programming
Student open‐response comments were also coded for inclusion in the directed content analysis with either a yes or a no if they specifically mentioned virtual programming, compared virtual with in‐person programming, or mentioned multisite programming. Of the 198 comments analyzed, 34 were coded as referencing virtual or multisite programming. We analyzed the codes associated with these 34 virtually relevant comments (Figure 4). There was a significant difference for the primary codes (P=0.004); however, no pairwise difference between code themes remained after correction for multiple comparisons. For the secondary codes, there was also a significant difference (P=0.02): responses coded as Personal Gains were significantly more frequent than those coded as Skills in the posttest (P=0.03). When the primary and secondary codes were combined, there was still a significant difference (P=0.02): the number of codes corresponding to Attitudes and Behaviors was significantly greater than the number corresponding to Thinking and Working Like a Scientist. Representative quotes for each theme, including primary and secondary themes, are shown in Figure 4D.
Figure 4. Directed content analysis of open‐ended responses with virtual relevance.
During the review process, quotes related to the virtual nature of the program were flagged. Overall, these represented 34 of the 198 comments. We evaluated the codes associated with these 34 comments. Shown are scatter plots for all 3 coders' counts, with the bar graph indicating the mean. The error bars indicate the standard deviation from the mean. A, For the primary codes, the average number of codes for Personal Gains, Thinking and Working Like a Scientist, Skills, and Attitudes and Behaviors was 1, 1, 8, and 12, respectively. B, The averages for the secondary codes were 6, 2, 0, and 2. C, When the primary and secondary codes are combined, the averages were 7, 3, 8, and 14. A Kruskal‐Wallis test was performed, followed by Dunn's multiple comparison test. There was a significant difference for the primary codes (P=0.0043); however, no pairwise difference between code themes remained after the multiple comparison corrections. For the secondary codes, there was also a significant difference (P=0.0164): responses coded as Personal Gains were significantly more frequent than those coded as Skills (P=0.0340). When the primary and secondary codes were combined, there was still a significant difference (P=0.0153): coding for comments that corresponded to Attitudes and Behaviors was significantly greater than the number of codes corresponding to Thinking and Working Like a Scientist. D, Representative quotes for each theme, including primary and secondary themes, are shown. A indicates Attitudes and Behaviors; AHA, American Heart Association; CVI, Cerebral Visual Impairment; N/A, not applicable; P, Personal Gains; S, Skills; and T, Thinking and Working Like a Scientist.
Student‐Reported Program Strengths Include Exposure to Information That Helped Them Solidify Their Career Plans, but Mentorship Can Still Be Improved
The qualitative review of the students’ responses to open‐ended questions about the program provided additional insight into the aspects of the program they found especially beneficial and those they thought could be improved. Students frequently reported that they appreciated the diversity of topics and that it helped them identify the research domains and career types they would like to pursue. For example, one student highlighted the exposure to drug development and how it helped them make an informed decision about pursuing a career in bioengineering. Another student described how the exposure to disparities research resulted in them pursuing the topic as their primary career goal. Another student described how their experience helped them realize they want to pursue more translational as opposed to more basic research. The students also described how much they valued hearing from medical students, graduate students, and Principal Investigators about their career experiences, again because it helped steer them in the right direction for their own career goals. Additionally, the admission committee panels were specifically called out by students as valuable because they exposed the students to different career options. In terms of what could be improved, students requested even more guidance in career choices and preparing for graduate applications, but by far the most frequent recommendation was improved mentorship and project planning. Many students would have liked more interactions with their Principal Investigators, a challenge that was likely exacerbated by the virtual programming.
Students Report That the Virtual Program Posed Unexpected Benefits and Challenges
Overwhelmingly, the most prevalent unexpected benefit students experienced was the sense of community they felt and the connections they made with their peers and other scientists at their institutions, even though the program was virtual. Students also specifically described improved facility with digital communication. One student reported that remote learning requires collaboration and clear communication, and that the program helped them learn how to exchange ideas and work effectively in a new way. Another student explained how the virtual research experience helped them become more adaptable to change by requiring them to be resourceful in accomplishing their work with the limited resources available at home. Despite these benefits, the virtual nature of the program also made students feel less connected to their laboratory's overall mission and research. Students noted that engagement was lower because the program was virtual, and that in some instances the virtual format made it harder for them to feel they could speak up. Additional challenges specific to the virtual format were connectivity issues and time differences, which limited some students' participation in the program.
Discussion
The AHA partnered with 5 leading medical schools (Boston University, Medical College of Wisconsin, Northwestern University, Stanford University, and Vanderbilt University) to implement an undergraduate research experience for underrepresented students in biomedical science. We have (1) outlined the transition from in‐person, local programming in 2019 to virtual, multisite, collaborative programming in 2020, and (2) qualitatively investigated programming and suggested that shared virtual programming can lead to a collaborative experience that enhances training for students, particularly students from underrepresented backgrounds. We explored how both iterations of programming (individualized and in‐person in 2019 versus virtual and collaborative in 2020) relate to the major themes of undergraduate training (Figures 3 and 4) for 3 of 5 institutions under the AHA SURE program. We summarize our recommendations for the implementation of future hybrid programming in Table 2.
Table 2.
Hybrid Program Model Recommendations
|  | Local | Multisite |
| --- | --- | --- |
| In‐person: Pros |  | Currently not applicable because of time and resource needs. |
| In‐person: Cons |  |  |
| Virtual: Pros |  |  |
| Virtual: Cons |  |  |

Outline of the pros and cons of both in‐person and virtual programming in local and multisite models of undergraduate research programs.
Many excellent programs aim to support universities and colleges in diversifying the nation's STEM workforce. For example, the McNair Scholars Program and the Minority Health and Health Disparities International Research Training program support research experiences for underrepresented students. 19, 20 The Louis Stokes Alliances for Minority Participation program is an alliance‐based program that uses the Tinto model of student retention, which emphasizes that a student's degree of success influences their level of commitment to an institution and to academic and career goals. 21, 22 Additionally, the Society for Advancement of Chicanos/Hispanics and Native Americans in Science and the Annual Biomedical Research Conference for Minority Students have created excellent platforms to drive change, broaden networks, and strengthen opportunities for underrepresented students in STEM. 23, 24 Although the Louis Stokes Alliances for Minority Participation and other groups support alliances or consortia of multiple degree‐granting institutions, to our knowledge this is the first time an entire summer research program has been intricately connected across partner institutions with extended networking opportunities. Overall, this study builds on the successful models of these other programs in a combined approach focused on research experiences and on building an extended, integrated institutional community network. This was made possible by the AHA SURE program, and the programming was made available to all students, including AHA SURE–supported students, R25‐funded students, and others.
Many external factors affecting students, faculty, and program directors may confound our study's evaluation. The most apparent is that the students participated in the 2020 program during a global pandemic that exposed profound health inequities across race and ethnicity. Students were also coping with the extreme racism exemplified by the killings of George Floyd, Breonna Taylor, Ahmaud Arbery, and many others, as well as the intense periods of social unrest that followed. These external forces affected the students' physical and mental health and influenced their experience of the summer program. Additionally, students' participation in multiple programs, courses, and events outside the AHA SURE programming may also have affected their experiences and reporting. In 2020, 4 aspects of the students' self‐reported experiences decreased between pre‐ and postprogram evaluations: working well independently, comfort in collaborating with others, managing their time effectively, and feeling encouraged by individuals in their life to pursue careers involving research and medicine. Notably, all 4 of these aspects would be especially challenging in a virtual rather than an in‐person program. The program's virtual nature in 2020 may have posed additional challenges to students' ability to manage their time, collaborate, and feel supported in their career choices. It is also notable, and perhaps surprising, that students expressed significant gains in Research Skills despite the virtual nature of the 2020 program.
Another challenge for our study was that the programs did not use the same evaluation instrument in 2019 and 2020, making direct comparisons between the 2 years difficult. Our evaluation was possible because 3 of the 5 sites used a standardized evaluation method; in the future, all sites are encouraged to use the same survey instruments to standardize evaluation and support more generalizable assertions. To determine whether this program improves the representation of underrepresented students in biomedical sciences in the long term, we will need to study longitudinally whether AHA SURE students pursue careers in biomedical research and the health sciences.
The factors described above do not detract from the evinced benefits of virtual programming. Comparing the 2019 surveys with those from 2020, we observed that students reported positive gains in scientific training similar to those achieved through in‐person experiences. Like others, we report that the transition to a virtual setting successfully recreated and addressed the core goals of a summer internship. 25 We also provide a data‐driven approach demonstrating that virtual programming touches on all themes outlined in the Undergraduate Research Student Self‐Assessment survey. Our data revealed that virtual research‐supplemental activities can complement traditional mentorship, even as we transition back to in‐person programming. Virtual experiences may circumvent several common barriers to access encountered by underrepresented groups, such as work commitments, financial barriers, geographic constraints, childcare, or family commitments. Virtual programming could even be used to provide additional mentorship opportunities. 26, 27, 28
Additionally, the AHA SURE summer program was unique in that a multi‐institutional team worked collaboratively to deliver content. The virtual nature of the 2020 programming allowed content to be widely disseminated and accessed by more students than a single institution could reach, and it allowed expertise unique to a given site to be accessed by the other participating sites. As student comments noted, multisite programming exposed the students to a range of interdisciplinary research, clinical, and career development perspectives that will benefit them in their biomedical careers. Furthermore, the collaborative virtual programming increased opportunities to network, find mentors, and collaborate with peers, all of which are integral to effective career development. Overall, the AHA SURE program evaluation material presented here supports the benefit of multisite programming. Program directors across multiple sites can collaborate on programming to celebrate and enhance the unique offerings of each institution. Supporting virtual multisite programming as a complement to, rather than a replacement for, undergraduate research training will likely diversify perspectives and networks while also providing opportunities to share content and decrease individual site workload across participating programs. Institutional support across the partner sites, and recognition of the virtual research program's success, will be essential for implementing multisite models in the future. For a hybrid program in 2021 and beyond, we envision an approach that uses the best features of local, in‐person, and multisite programming (Table 2): hands‐on experience in a laboratory environment (local and in‐person), flexibility in learning with resource efficiency (virtual), and exposure to a wide network of peers and mentors across institutions.
As noted above, our study has limitations, including the inherent difficulty of disentangling internal and external forces: it is difficult to isolate the inherent value of the program when comparing the on‐site 2019 and virtual 2020 experiences. We also recognize that this study includes only a modest number of participants and institutions, so the findings may not generalize to other institutions. Nevertheless, the study demonstrates that multisite programming may further enhance the training experience of students, even in a hybrid format.
Conclusions
The AHA SURE program can serve as a model for building multi‐institutional partnerships and maximizing virtual programming. The AHA SURE consortium collaboratively provided research experiences that diminished institutional barriers and supported students' interests, commitment, and ability to persist in STEM fields. 29 We also describe a novel model of training for undergraduate students to engage in biomedical research, brought about by the rapid virtual transition caused by the COVID‐19 pandemic. The multisite success of AHA SURE was demonstrated by the overall gains of students who participated in the program, which were similar in magnitude to those observed in the 2019 in‐person iteration, despite the challenges of conducting research training virtually. We propose that multisite training creates unique networking opportunities, allows greater transfer of knowledge between student peers, provides more training opportunities overall, and capitalizes on the nationwide availability of experts. As research programs continue to adapt after COVID‐19, we strongly recommend using the virtual platforms developed during the pandemic to collaborate across institutions and provide broader and better training opportunities for students pursuing STEM careers.
Moving forward, we will continue robust program evaluation with the goal of increasing the program's value to the scholars each successive year. We also plan to track the scholars' career progression to explore long‐term outcomes via LinkedIn, Open Researcher and Contributor ID, and email, and we will continue to encourage them to be AHA volunteers because of the AHA's advocacy for public health and health equity. Overall, this program supports the AHA's goal of addressing health equity and structural racism by expanding research opportunities for those from underrepresented racial and ethnic groups.
Sources of Funding
This study was supported by the AHA SURE award from the American Heart Association. The content is solely the responsibility of the authors and does not necessarily represent the official views of the American Heart Association. The authors also report additional funding: Dr Benjamin (R01HL092577, U54HL120163, AHA AF SFRN: AHA_18SFRN34110082), Dr Wu (R25HL147666; AHA_18UFEL33960207), Dr Barnett (R25HL145330), and the Vanderbilt University School of Medicine Basic Sciences Dean's Office.
Disclosures
None.
Supporting information
Data S1
Tables S1–S4
Figures S1–S2
Acknowledgments
The authors acknowledge the contribution of Dr Shay, who orchestrated the 2020 pre‐ and postprogram evaluations, developed and administered the surveys, collated the results, and ensured that the students' information was secure and deidentified. The authors also acknowledge Y. Dwarakanath for organizing the Stanford University Undergraduate Summer Research Program events shared across institutional sites. S. Sours‐Brothers contributed to research and decisions on the evaluation survey and our use of the Undergraduate Research Student Self‐Assessment model. The authors also acknowledge that Dr Ajayi has recently accepted a role as a medical resident at Boston Medical Center.
Supplemental Material for this article is available at https://www.ahajournals.org/doi/suppl/10.1161/JAHA.121.022380
References
- 1. Xierali IM, Nivet MA. The racial and ethnic composition and distribution of primary care physicians. J Health Care Poor Underserved. 2018;29:556–570. doi: 10.1353/hpu.2018.0036
- 2. Populations Underrepresented in the Extramural Scientific Workforce | SWD at NIH. Available from: https://diversity.nih.gov/about‐us/population‐underrepresented
- 3. Population Distribution by Race/Ethnicity | KFF. Available from: https://www.kff.org/other/state‐indicator/distribution‐by‐raceethnicity/?activeTab=graph&currentTimeframe=0&startTimeframe=11&sortModel=%7B%22colId%22:%22Location%22,%22sort%22:%22asc%22%7D
- 4. AAMC Media. Available from: https://www.aamc.org/media/6121/download
- 5. Figure 18. Percentage of all active physicians by race/ethnicity, 2018 | AAMC. Available from: https://www.aamc.org/data‐reports/workforce/interactive‐data/figure‐18‐percentage‐all‐active‐physicians‐race/ethnicity‐2018
- 6. U.S. Census Bureau QuickFacts: United States. Available from: https://www.census.gov/quickfacts/fact/table/US/PST045219
- 7. Mehta LS, Fisher K, Rzeszut AK, Lipner R, Mitchell S, Dill M, Acosta D, Oetgen WJ, Douglas PS. Current demographic status of cardiologists in the United States. JAMA Cardiol. 2019;4:1029–1033. doi: 10.1001/jamacardio.2019.3247
- 8. Doctorate Recipients from U.S. Universities: 2019 | NSF ‐ National Science Foundation. Available from: https://ncses.nsf.gov/pubs/nsf21308/data‐tables
- 9. Riegle‐Crumb C, King B, Irizarry Y. Does STEM stand out? Examining racial/ethnic gaps in persistence across postsecondary fields. Educ Res. 2019;48:133–144. doi: 10.3102/0013189X19831006
- 10. Ferreira M. Gender issues related to graduate student attrition in two science departments. Int J Sci Educ. 2003;25:969–989. doi: 10.1080/09500690305026
- 11. Gibbs KD Jr, McGready J, Bennett JC, Griffin K. Biomedical science Ph.D. career interest patterns by race/ethnicity and gender. PLoS One. 2014;9:e114736. doi: 10.1371/journal.pone.0114736
- 12. Weston TJ, Laursen SL. The undergraduate research student self‐assessment (URSSA): validation for use in program evaluation. CBE Life Sci Educ. 2015;14:1–10.
- 13. Seymour E, Hunter A‐B, Laursen SL, DeAntoni T. Establishing the benefits of research experiences for undergraduates in the sciences: first findings from a three‐year study. Sci Educ. 2004;88:493–534. doi: 10.1002/sce.10131
- 14. Hsieh H‐F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–1288. doi: 10.1177/1049732305276687
- 15. Holloway I, Galvin K. Qualitative Research in Nursing and Healthcare. John Wiley & Sons; 2016.
- 16. Randolph JJ. Free‐marginal multirater kappa (multirater K[free]): an alternative to Fleiss' fixed‐marginal multirater kappa. Online Submission. 2005.
- 17. Warrens MJ. Inequalities between multi‐rater kappas. Adv Data Anal Classif. 2010;4:271–286. doi: 10.1007/s11634-010-0073-4
- 18. Wongpakaran N, Wongpakaran T, Wedding D, Gwet KL. A comparison of Cohen's Kappa and Gwet's AC1 when calculating inter‐rater reliability coefficients: a study conducted with personality disorder samples. BMC Med Res Methodol. 2013;13:61. doi: 10.1186/1471-2288-13-61
- 19. McNair Scholars. Available from: https://mcnairscholars.com/
- 20. MHIRT Program Overview – MHIRT. Available from: https://mhirt.com/program‐overview/
- 21. Cosentino De Cohen C, Tsui L, Deterding N. Revitalizing the Nation's Talent Pool in STEM.
- 22. Louis Stokes Alliances for Minority Participation | NSF ‐ National Science Foundation. Available from: https://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13646
- 23. Home ‐ SACNAS. Available from: https://www.sacnas.org/
- 24. ABRCMS. Available from: https://www.abrcms.org/
- 25. Samad T, Fleming HE, Bhatia SN. Virtual undergraduate research experiences: more than a pandemic stopgap. Med. 2021;2:118–121. doi: 10.1016/j.medj.2021.01.007
- 26. Bangera G, Brownell SE. Course‐based undergraduate research experiences can make scientific research more inclusive. CBE Life Sci Educ. 2014;13:602–606. doi: 10.1187/cbe.14-06-0099
- 27. Pierszalowski S, Vue R, Bouwma‐Gearhart J. Overcoming barriers in access to high quality education after matriculation: promoting strategies and tactics for engagement of underrepresented groups in undergraduate research via institutional diversity action plans. J STEM Educ. 2018;19.
- 28. Petersen S, Pearson BZ, Moriarty MA. Amplifying voices: investigating a cross‐institutional, mutual mentoring program for URM women in STEM. Innov High Educ. 2020;45:317–332. doi: 10.1007/s10755-020-09506-w
- 29. Estrada M, Burnett M, Campbell AG, Campbell PB, Denetclaw WF, Gutiérrez CG, Hurtado S, John GH, Matsui J, McGee R, et al. Improving underrepresented minority student persistence in STEM. CBE Life Sci Educ. 2016;15:1–10. doi: 10.1187/cbe.16-01-0038