AEM Educ Train. 2021 Sep 29;5(Suppl 1):S87–S97. doi: 10.1002/aet2.10664

Defining “county”: A mixed‐methods inquiry of county emergency medicine residency programs

Jennie A Buchanan 1, Maria Moreira 1, Taku Taira 2, Richard Byyny 3, Zachary Jarou 4, Todd Andrew Taylor 5, W Gannon Sungar 1, Christy Angerhofer 6, Sean Dyer 7, Melissa White 5, Dhara Amin 7, Michelle D Lall 5, David Caro 8, Melissa E Parsons 8, Teresa Y Smith 9,10
PMCID: PMC8480508  PMID: 34616979

Abstract

Introduction

There is no clear unified definition of “county programs” in emergency medicine (EM). Key residency directories vary in this designation, despite it being one of the most important match factors for applicants. The Council of Residency Directors EM County Program Community of Practice consists of residency program leadership from a unified collective of programs that identify as “county.” This paper's framework was spurred by numerous group discussions seeking to better understand the unifying themes that define county programs.

Methodology

This institutional review board–exempt work provides qualitative descriptive results via a mixed‐methods inquiry utilizing survey data and quantitative data from programs that self‐designate as county.

Unique treatment, analysis and critique

Most respondents currently work at a county program, identify their program as county, and trained at one. The majority defined county programs by a commitment to care for the underserved, funding from the city or state, low resources, and an urban setting. Major qualitative themes included mission, clinical environment, research, training, and applicant recommendations. Comparing the attributes of programs by self‐described type of training environment, county programs are typically larger, older, located in central metro areas, more likely to be 4 years in duration, and have higher patient volumes than community or university programs. When comparing hospital‐level attributes of primary training sites, county programs are more likely to be owned and operated by local governments or governmental hospital districts and authorities and to see more disproportionate‐share hospital patients.

Implications for education and training in EM

To be considered a county program, we recommend that some or most of the following attributes be present: a shared mission to serve medically underserved and vulnerable patients, an urban location with city or county funding, an ED with high patient volumes, a training ethos supportive of resident autonomy, and research expertise focusing on underserved populations.

INTRODUCTION

Since the 1970s, the numbers of U.S. categorical emergency medicine (EM) programs, residents, and graduates have steadily increased.1 The Model of the Clinical Practice of Emergency Medicine is an accepted guide for EM training and is considered the backbone for all certifying examinations, yet it does not clearly address the clinical variation in training sites among EM residency programs.2 Every year medical students around the country embark on a journey to learn about EM programs and find the best fit for their training. Many of these students have participated in medical school programs focused on urban medically underserved communities, with an emphasis on clinical training and community service.3 They obtain information from online resources, residency websites, and mentors and through visits to programs in search of data that help them distinguish between training sites. Through their search, students will come across the term “county program” and will question what qualifies a program for this designation.

There is no unified definition of what it means to be a county program despite common use of this term in EM graduate medical education. This definition is extremely important for applicants as an Emergency Medicine Residents Association (EMRA) survey of fourth‐year EM applicants revealed the type of hospital (community vs. county vs. academic) as the most important factor in ranking decisions.4 Another study revealed that the training environment was the fifth most commonly used filter by EMRA Match users.5 The American Academy of Emergency Medicine publication on rules of the road for medical students suggests that “county” is made up of training programs that are city or county owned, have high patient volumes, are ethnically diverse, and serve specific underserved and at‐risk patient populations.6 Academic Life in Emergency Medicine describes “county programs” as programs that “share a special brand of team‐centered training, and residents learn how to make the most of limited resources while caring for patients who are also often resource‐poor.”7 There are a variety of different EM training programs that have either self‐designated as, or been given the label of, a county program. These programs constitute a minority of overall programs with 34 of the 265 residencies (12.8%) listed on EMRAmatch.org self‐designating as having a primary training site that is county.8, 9 In contrast, within the Society for Academic Emergency Medicine (SAEM) Residency Directory website, 15 of the 186 (8.1%) listed residencies self‐identify as county.10 The Fellowship and Residency Electronic Interactive Database Access System (FREIDA) designates programs as university‐based, community‐based/university‐affiliated, community‐based, and military, but does not offer a county designation.11 This lack of clarity creates difficulty for students who are selecting programs, advisors who are mentoring applicants, and residencies who are recruiting students. It may also alienate some programs that share features associated with county programs, but do not have some of the usual monikers connected with the label.

The Council of Residency Directors in Emergency Medicine (CORD) County Program Community of Practice consists of residency program leadership from a unified collective of diverse self‐identified county programs. The framework of this paper was spurred by numerous discussions within this CORD group, prompting the creation of a committee to identify unifying themes that elucidate what it means to be a county program. Ultimately, the aim of this working group is to better define what it means to be a county residency training program, which in turn will help medical students, residents, and faculty alike describe what this style of training entails. To obtain a better understanding of the collective view of characteristics specific to county programs, we surveyed program directors (PDs) and academic faculty in EM via the CORD listserv and quantitatively analyzed program attributes according to the self‐selected type of training environment of the primary site specified in EMRA Match. This mixed‐methods inquiry provides qualitative descriptive results of the survey, a quantitative analysis of program attributes, and an in‐depth discussion with summary recommendations on what it means to be identified as a county program in EM.

METHODOLOGY

We performed an exploratory analysis to inform this special contribution. This work was granted exempt status by the local multiple institutional review board and sanctioned by the CORD Board of Directors. We developed a mixed‐methods survey instrument to better understand perceptions of the term county program, along with a quantitative analysis comparing program attributes. Mixed‐methods research, in which both quantitative and qualitative data are deployed, has been shown to produce numerous benefits, including confirmation of gathered data, substantial detail, and new paradigms of reasoning, and it can be efficacious in eliminating bias in quantitative and qualitative data gathering.12

The survey instrument, shown in Table 1, was developed using a modified Delphi approach by a core group (MM, TS, JB, CA, RB) collectively representing 72 years of practice at three different county programs. The research group first identified key areas that might differentiate a county program from a noncounty program. These included hospital‐based (funding, hospital mission, resource allocation, location), patient population–based (volume, insurance status, rates of English proficiency), clinical (emphasis on trauma care, indigent care), and departmental focus (emphasis on research vs. service) characteristics. The group also identified a future need to characterize both the personal characteristics of a county program's ideal medical student applicants and the ideal application composition.

TABLE 1.

Survey questions

Question Answer options
(Q1) What does being a “county program” mean to you?
  • Urban setting

  • City‐ or state‐funded

  • Serving the underserved

  • Level I trauma

  • Low‐resourced

  • Other (please specify)

    • Open‐ended response

(Q2) Why do you work at a “county program” (*if you work at one)?
  • Open‐ended response

(Q3) Do you identify your program as a “county” (urban safety net/public hospital) program? Please explain why in the comments.
  • Yes

  • No

  • Option to explain

(Q4) What contributes to your definition? (check all that apply)
  • Volume

  • Payer‐mix status

  • Training experience

  • Location

  • Other (please specify)

    • Open‐ended response

(Q5) What do “county programs” look for in an ideal residency applicant?
  • Mission‐driven

  • SLOE from another county program

  • Experience serving the underserved

  • Research in health care disparity

  • Diverse background

  • Community service experience

  • Medicine not first career

  • Diversity and Inclusion contributions

  • Other (please specify)

    • Open‐ended response

(Q6) How can these applicants boost their application early in medical school?
  • Open‐ended response

(Q7) What are some helpful suggestions to those candidates interested in applying to “county programs?”
  • Open‐ended response

(Q8) What do you think attracts applicants to “county” programs?
  • Clinical training (volume and acuity)

  • Patient population served

  • Community Involvement

  • Public health interest

  • Global health interest

  • Rural health interest

  • Other (please specify)

    • Open‐ended response

(Q9) What do you think is the main educational goal for a “county program?”
  • Clinical training (volume and acuity)

  • Patient population served

  • Community Involvement

  • Public health interest

  • Global health interest

  • Rural health interest

  • Other (please specify)

    • Open‐ended response

(Q10) Does “county” mean that there is no academics?
  • Yes

  • No

  • Other (please specify)

    • Open‐ended response

(Q11) Is your “county” hospital affiliated with a private/academic hospital?
  • Yes

  • No

  • Other (please specify)

    • Open‐ended response

(Q12) Can a “county program” be at an academic center?
  • Yes

  • No

  • Other (please specify)

    • Open‐ended response

(Q13) Have you trained at a “County program?”
  • Yes

  • No

  • Other (please specify)

    • Open‐ended response

(Q14) Do you currently work at a “county program?”
  • Yes

  • No

(Q15) If yes, for how many years?
  • Open‐ended response

(Q16) List the two most commonly translated languages at your hospital?
  • Open‐ended response

(Q17) Please list fellowship training if applicable
  • Open‐ended response

(Q18) Where are you in your career?
  • Fellow

  • Early career

  • Mid‐career

  • Late career

(Q19) Gender
  • Female

  • Male

  • Nonbinary/third gender

  • Prefer to self‐describe

  • Prefer not to say

  • Other (please specify)

    • Open‐ended response

(Q20) Age
  • 25–35

  • 36–45

  • 46–55

  • 56–65

  • 66–older

(Q21) Race or ethnicity
  • White or Caucasian

  • Black or African American

  • Hispanic or Latino

  • Asian or Asian American

  • American Indian or Alaskan Native

  • Native Hawaiian or other Pacific Islander

  • Another race

  • Prefer not to answer

  • Other (please specify)

    • Open‐ended response

(Q22) Is your residency based out of your program or is it affiliated with another program?
  • Yes

  • No

  • Other (please specify)

    • Open‐ended response

(Q23) Location: (link to census bureau designated regions)
  • Northeast

  • Midwest

  • South

  • Central

  • West

  • Other (please specify)

    • Open‐ended response

After the identification of key content areas, the instrument questions were developed through an iterative process. Questions were refined for clarity and relevance and to minimize bias. Because this content area has not been well defined, in an effort to capture unanticipated responses, respondents were allowed to provide qualitative responses for all multiple‐choice questions. Additionally, the instrument included multiple qualitative open‐ended responses.

The survey was distributed via Survey Monkey Inc. (www.surveymonkey.com) to the CORD faculty and PD listserv. To capture the entire EM community's definition of county program, the survey was sent to members regardless of their experience working and/or training at a county program. Responses were collected for 6 weeks, with both an email reminder and a final in‐person reminder at the CORD Academic Assembly annual meeting.

We applied grounded theory to all qualitative data. Qualitative answers for each question were coded (by TT, JB, RB). Codes were then analyzed and categorized to extract major themes. Quantitative data were presented as proportions with associated 95% confidence intervals (95% CIs).
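The interval method was not specified in the analysis plan; as a rough reproducibility check, an exact (Clopper–Pearson) binomial interval appears to match the intervals reported below. A minimal sketch in Python, using one of the reported counts, with the choice of interval method being our assumption:

```python
# Minimal sketch: proportion with a 95% CI. The interval method (Clopper-
# Pearson "exact") is an assumption; the paper does not name one.
from statsmodels.stats.proportion import proportion_confint

k, n = 37, 46  # e.g., respondents currently working at a county program
low, high = proportion_confint(k, n, alpha=0.05, method="beta")  # Clopper-Pearson
print(f"{k}/{n} = {k / n:.0%} [95% CI = {low:.0%} to {high:.0%}]")
# Expected output close to the reported 80% [95% CI = 66% to 91%]
```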

In addition to the survey sent to the CORD listserv, a quantitative analysis was performed comparing attributes of training programs according to the self‐selected type of training environment of the primary site specified in EMRA Match (university, community, or county). Program attributes were collected from the public records available through the Accreditation Council for Graduate Medical Education (ACGME; total EM residents, program length, year of original accreditation, and original accrediting organization [ACGME, American Osteopathic Association, or dually accredited]), EMRA Match (ED volume, primary training environment), and the Centers for Medicare and Medicaid Services (hospital ownership, resident‐to‐bed ratio, disproportionate‐share hospital [DSH] patient percentage, and the percentage of inpatient days attributable to Medicare and Medicaid patients).13, 14, 15 Hospitals were classified as urban or rural based on the 2013 classification scheme proposed by the Centers for Disease Control and Prevention's National Center for Health Statistics.15 Military programs, those outside the continental United States, and those with an incomplete primary training environment field in EMRA Match were excluded. Categorical variables were compared using Fisher's exact test; continuous variables were compared using Kruskal–Wallis tests. We additionally performed a classification and regression tree (CART) analysis with 10‐fold cross‐validation to categorize programs by their primary training environment using the program‐ and hospital‐level attributes compared in univariate analysis.
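As an illustration of this analytic pipeline, the sketch below shows how the univariate comparisons and the CART classification might be implemented in Python. The data file and all column names are hypothetical stand‐ins for the attributes described above, and because scipy's fisher_exact accepts only 2 × 2 tables, a binary (county vs. non‐county) contrast is shown:

```python
# Hypothetical sketch of the univariate comparisons and CART analysis;
# "programs.csv" and every column name below are invented for illustration.
import pandas as pd
from scipy.stats import fisher_exact, kruskal
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

programs = pd.read_csv("programs.csv")
county = programs["environment"] == "county"

# Categorical attribute: Fisher's exact test on a 2 x 2 contrast
# (e.g., 4-year program length, county vs. non-county).
table = pd.crosstab(county, programs["length_years"] == 4)
odds_ratio, p_cat = fisher_exact(table)

# Continuous attribute: Kruskal-Wallis across the three environments.
groups = [g["dsh_pct"].dropna() for _, g in programs.groupby("environment")]
stat, p_cont = kruskal(*groups)

# CART (decision tree) classifying county vs. not-county,
# evaluated with 10-fold cross-validation.
features = programs[["dsh_pct", "ed_volume_thousands", "resident_bed_ratio"]]
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
cv_accuracy = cross_val_score(tree, features.fillna(features.median()), county, cv=10)
print(p_cat, p_cont, cv_accuracy.mean())
```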

ANALYSIS, UNIQUE TREATMENT, AND CRITIQUE

We received 46 responses from the CORD faculty and PD listserv, which had 1,786 members at the time of survey release. The demographic data of the respondents are reported in Table 2. Of the respondents, 37 (80% [95% CI = 66% to 91%]) currently work at a county program, 36 (78% [95% CI = 64% to 89%]) identify their program as a county program, and 39 (87% [95% CI = 71% to 94%]) reported training at a county program, although specifics of location were not collected. Respondents were 56% male and 44% female, with an age range of 25 to 65 years. Respondents who work at county programs had done so for an average of 9.5 years (range, 1 to 23 years). Respondents noted that within their county program Spanish was the most commonly translated language, followed by Vietnamese, Chinese (various dialects), French Creole, Arabic, and Amharic; available fellowship opportunities ranged from many to none.

TABLE 2.

Demographics of respondents

Region
Northeast 18.6%
Midwest 25.6%
South 37.2%
Central 0%
West 16.3%
Other 2.3%
Ethnicity
White or Caucasian 72%
Black or African American 4.7%
Hispanic or Latino 7%
Asian or Asian American 11.6%
American Indian or Alaska Native 0%
Native Hawaiian or other Pacific Islander 0%
Another race 0%
Prefer not to answer 2.3%
Other 2.3%

The majority of respondents defined county programs by their commitment to care for the underserved (87% [95% CI = 74% to 95%]), funding from the city or state (61% [95% CI = 45% to 75%]), low resources (59% [95% CI = 43% to 73%]), and an urban setting (56% [95% CI = 41% to 71%]). A minority of respondents defined county programs by their designation as Level I trauma centers (35% [95% CI = 21% to 50%]). Six of seven qualitative answers focused their definitions on the source of funding (city‐, county‐, or state‐funded). One respondent drew a contrast between tertiary centers and county hospitals, where county hospitals are focused on service to the community and “hands‐on training, heavier on the procedures.” One additional respondent defined county programs as “high [patient] volume, busy department.”

These quantitative answers were reflected in the free‐text answers to the question “Do you identify your program as a county (urban/safety net) program? Please explain why in the comments.” Several themes arose from the qualitative data, clustering around “public funding,” “underserved population,” “urban setting,” “trauma care,” and “responsibility to the local population.” Several answers highlighted the nuance surrounding hospital funding. One respondent stated, “We are an urban safety net/public hospital but we don't identify as a county program since we are state funded and not city/county funded.” Another respondent partially rejected the definition by funding model and instead focused on the mission of the hospital: “we identify as ‘county‐like.’ We aren't directly funded by the city or state, but we are a nonprofit teaching hospital, Level I trauma center and serve as an urban safety net hospital for the surrounding area.” When respondents were asked what contributed to their definition, 43 answered: 34 (79% [95% CI = 64% to 90%]) reported payer mix, 58% (95% CI = 42% to 73%) identified location, and 56% (95% CI = 40% to 71%) identified training experience as factors in their definition of a county program.

When asked “Is your ‘county program’ hospital affiliated with a private/academic hospital?,” 30 of 42 respondents (71% [95% CI = 55% to 84%]) reported that their program was affiliated with a private/academic hospital. When asked “Can a ‘county program’ be at an academic center?,” 44 of 44 respondents (100% [95% CI = 92% to 100%]) responded affirmatively. Consistent with these responses, when asked “Does a ‘county program’ mean that there is no academics?,” respondents were unanimous that this was not the case (45/45, 100% [95% CI = 92% to 100%]).

Many of the above themes were echoed in the responses to the question “Why do you work at a ‘county program’?” Responses clustered around the major theme of “mission,” which manifested itself in multiple ways. The role of the county program in the care for the underserved and service to the community was widely discussed and exemplified by the quote: “service to my community. Able to treat everyone and get them follow‐up care without dumping them as soon as stable.” The theme of “mission” underlies the foundation for a shared sense of community and shared values both among colleagues and consultants, exemplified by this response: “Desire to make a real difference. Colleagues in all specialties with like‐mindset so few issues with consults, etc.” Similarly, other respondents identified a shared identity among their colleagues: “Love the patient population, love the diversity, love the grittiness of the people I work with.” The theme of mission also extended to personal values: “I am committed to serving an underserved patient population.” A theme closely related to the mission was “satisfaction and meaning.” Several respondents discussed the sense of purpose, satisfaction, and meaning that come from working in county programs: “Having a well‐defined mission at my workplace adds to my sense of purpose.” “I find it more satisfying to work hard for an organization with a strong mission of social justice.” Similarly, several respondents cited an increased sense of connection to the community.

There were additional themes of “clinical environment” and “residency training.” Respondents cited a diversity of pathology, a large amount of trauma care, patient diversity, and high patient volumes. Other respondents drew a distinction from private hospitals, with one citing the ability to “advance [patient] care and retain autonomy in our practice” and another noting “fewer demanding patients.” In addition, several respondents described the clinical environment as “underresourced” or “doing more with less.”

Many of the themes reflected in the qualitative answers were directed at providing guidance to applicants. Many respondents cited that identification with the mission of the hospital was a major factor, for example: “Compassion; have a higher calling to serve an underserved population.” Respondents suggested that volunteer experiences, health disparity work and research, and specific work with the underserved were all concrete ways that applicants could demonstrate identification with the mission of county programs. In line with the mission to care for the underserved, one respondent stated “Avoid complaining about ‘frequent flyers,’ substance abuse, and/or prisoners.” Similarly, another respondent stated, “Make sure that they have a good understanding of the psychosocial issues that often need to be considered in this population and the personal biases that many of us [have] about this patient population.” These themes were nicely summarized by this response: “If your application is mission, vision, and goal‐oriented with the program's values, you will be golden and sought after.”

In addition to extracurricular activities, multiple respondents suggested that students should rotate at a county program both to gain experience with the environment and to demonstrate their commitment and interest. For those students who are unable to rotate at a county hospital, one respondent suggested “utilize [the] personal statement as an opportunity to express how [the] county training environment is [a] good fit.” Multiple respondents provided an important caveat: “Rotate at a county program and at a noncounty program so you really understand the difference. Lot[s] of pros/cons each way, so knowing that a county program is what you really want is important.” This concept was reiterated by this response “know what it is you like about the county programs and why you are interested in going there as opposed to a community or university setting.”

Other major themes that arose were work ethic and a history of success in team settings. In line with the large percentage of respondents who cited the importance of a “county SLOE [standardized letter of evaluation],” one respondent specifically cited the importance of the reliability of the recommendation: “… or SLOE for another busy site even if [in a] community [setting] where the ED retains decent autonomy of their practice and is busy—ideally not an ivory tower academic center where many consultants are used and patients are quaternary care and not a slow ED. [P]erformance there does not translate to our environment.”

When comparing the attributes of programs by their self‐described type of primary training environment on EMRA Match, county programs are typically larger and older, more likely to have been originally accredited by the ACGME and to be 4 years in duration, and have higher ED volumes than university and community programs. Additionally, county programs are more likely to be owned and operated by local governments or by governmental hospital districts and authorities. County hospitals see more DSH patients than university or community hospitals, seeing significantly more Medicaid patients and fewer Medicare patients. Both county and university programs have higher resident‐to‐bed ratios, a metric considered reflective of a hospital's teaching intensity, than community programs.16 County hospitals are also more likely to be located in large central metro areas than university and community hospitals (Table 3).

TABLE 3.

Characteristics of EM residency programs by self‐described type of primary training environment

| Characteristic | Community (n = 91) | County (n = 33) | University (n = 92) | Total (N = 216) | p-value |
|---|---|---|---|---|---|
| Program characteristics | | | | | |
| Year of original accreditation | | | | | <0.001 |
| Median | 2016 | 1987 | 1994 | 1998 | |
| Q1, Q3 | 1994, 2017 | 1982, 2006 | 1986, 2005 | 1986, 2016 | |
| Original accreditation type | | | | | <0.001 |
| ACGME | 53 (58.2%) | 30 (90.9%) | 90 (97.8%) | 173 (80.1%) | |
| AOA | 33 (36.3%) | 3 (9.1%) | 2 (2.2%) | 38 (17.6%) | |
| Dual | 5 (5.5%) | 0 (0.0%) | 0 (0.0%) | 5 (2.3%) | |
| Length of training | | | | | 0.002 |
| 3-year | 73 (80.2%) | 17 (51.5%) | 73 (79.3%) | 163 (75.5%) | |
| 4-year | 18 (19.8%) | 16 (48.5%) | 19 (20.7%) | 53 (24.5%) | |
| ED volume (in thousands) | | | | | 0.024 |
| Median | 80 | 100 | 84 | 85 | |
| Q1, Q3 | 65, 106 | 72, 130 | 68, 100 | 67, 105 | |
| N-Miss | 2 | 1 | 0 | 3 | |
| Hospital characteristics | | | | | |
| Hospital ownership | | | | | <0.001 |
| For-profit | 14 (15.4%) | 1 (3.0%) | 2 (2.2%) | 17 (7.9%) | |
| Government—district/authority | 5 (5.5%) | 9 (27.3%) | 4 (4.3%) | 18 (8.3%) | |
| Government—local | 1 (1.1%) | 15 (45.5%) | 0 (0.0%) | 16 (7.4%) | |
| Government—state | 0 (0.0%) | 1 (3.0%) | 23 (25.0%) | 24 (11.1%) | |
| Nonprofit | 71 (78.0%) | 7 (21.2%) | 63 (68.5%) | 141 (65.3%) | |
| % Medicare patients (Medicare inpatient days/total inpatient days) | | | | | <0.001 |
| Median | 0.33 | 0.14 | 0.27 | 0.27 | |
| Q1, Q3 | 0.26, 0.40 | 0.10, 0.19 | 0.24, 0.31 | 0.22, 0.34 | |
| % Medicaid patients (Medicaid inpatient days/total inpatient days) | | | | | <0.001 |
| Median | 0.25 | 0.58 | 0.29 | 0.30 | |
| Q1, Q3 | 0.18, 0.34 | 0.46, 0.65 | 0.24, 0.35 | 0.22, 0.38 | |
| N-Miss | 0 | 0 | 2 | 2 | |
| DSH patient percentage (Medicare SSI days/Medicare days + Medicaid days/total inpatient days) | | | | | <0.001 |
| Median | 0.34 | 0.72 | 0.39 | 0.39 | |
| Q1, Q3 | 0.28, 0.41 | 0.58, 0.82 | 0.32, 0.47 | 0.30, 0.52 | |
| Resident-to-bed ratio | | | | | <0.001 |
| Median | 0.23 | 0.65 | 0.54 | 0.40 | |
| Q1, Q3 | 0.16, 0.34 | 0.39, 0.93 | 0.42, 0.72 | 0.23, 0.64 | |
| Urban–rural classification | | | | | <0.001 |
| Large central metro | 30 (33.0%) | 27 (81.8%) | 51 (55.4%) | 108 (50.0%) | |
| Large fringe metro | 16 (17.6%) | 1 (3.0%) | 7 (7.6%) | 24 (11.1%) | |
| Medium metro | 29 (31.9%) | 4 (12.1%) | 27 (29.3%) | 60 (27.8%) | |
| Micropolitan | 2 (2.2%) | 0 (0.0%) | 1 (1.1%) | 3 (1.4%) | |
| Small metro | 14 (15.4%) | 1 (3.0%) | 6 (6.5%) | 21 (9.7%) | |

Abbreviations: ACGME, Accreditation Council for Graduate Medical Education; AOA, American Osteopathic Association; DSH, disproportionate share hospital; Q1, Q3, first and third quartiles; SSI, Supplemental Security Income.
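The DSH patient percentage in Table 3 follows the standard CMS definition, the Medicare SSI fraction plus the Medicaid fraction of inpatient days. A worked example with invented day counts:

```python
# Hypothetical day counts; the formula is the standard CMS DSH definition
# given in the Table 3 row label.
medicare_ssi_days = 3_000       # Medicare inpatient days for SSI-entitled patients
medicare_days = 20_000          # total Medicare inpatient days
medicaid_days = 60_000          # Medicaid inpatient days
total_inpatient_days = 100_000  # all inpatient days

dsh_pct = medicare_ssi_days / medicare_days + medicaid_days / total_inpatient_days
print(f"DSH patient percentage = {dsh_pct:.2f}")  # 0.75, near the county median of 0.72
```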

Using a binary classification of “county” versus “not county,” CART accurately classifies 201 of 216 (93.1%) programs relative to self‐identification on EMRA Match, with 72.7% sensitivity and 96.7% specificity, using DSH patient percentage as the sole splitting variable (Figure 1).

FIGURE 1. Classification tree of DSH. DSH, disproportionate‐share hospitals; EMRA, Emergency Medicine Residents Association
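As a quick consistency check, the reported sensitivity and specificity, applied to the 33 county and 183 non‐county programs from Table 3, reproduce the stated overall accuracy:

```python
# Consistency check of the reported CART performance; all inputs come from
# counts reported in the text (33 county programs, 216 total).
county_n, noncounty_n = 33, 216 - 33
true_pos = round(0.727 * county_n)     # sensitivity 72.7% -> 24 county correct
true_neg = round(0.967 * noncounty_n)  # specificity 96.7% -> 177 non-county correct
accuracy = (true_pos + true_neg) / 216
print(true_pos + true_neg, f"{accuracy:.1%}")  # 201 programs, 93.1%
```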

EM training has undergone numerous changes since the specialty was recognized by the American Board of Medical Specialties and the ACGME, including a sharp increase in the number of available training programs and in medical student interest in the specialty. The EM Residency Review Committee requirements have continued to evolve and prescribe training expectations and requirements, establishing a minimum educational standard for all training programs. However, significant variability in training styles still exists across programs, and identifying the characteristics of the various styles is important to help mentors and students accurately identify the optimal learning environment for an individual applicant's residency training. Traditionally, different EM training styles were identified by grouping categories such as county, community‐based, and university‐based.

While comparative definitions and distinctions have never been formally published, the term “county program” historically carries a number of preconceived notions: increased patient volumes; a focus on social determinants of health; an underserved patient population; mission‐driven care; payer‐mix imbalance; varied demographics; a lack of internal and external resources available to patients and medical providers; increased resident autonomy (and less supervision than other training styles); increased service requirements; more abundant procedural opportunities; research focused on supporting the mission‐driven philosophy; and city, county, or state ownership of the hospital. Safety‐net hospitals or health systems provide care to low‐income, uninsured, and vulnerable populations.17 Ownership does not necessarily distinguish them as county, since they may be publicly owned by local or state governments and/or nonprofits. They are unified by a commitment to provide access to care for people with limited access to other health systems because of their health, financial, or insurance status. Our survey results confirm that there does not seem to be a single definition of what constitutes a county program, although themes of funding (source of funding, lack of resources, and uncompensated care), patient population (uninsured, underserved, medically in need), and mission of the hospital (role as safety net, inner city) seemed to be most significant in defining a program as county:

  • Commitment to care for the underserved (87% [95% CI = 74% to 95%]);

  • Funding from the city or state (61% [95% CI = 45% to 75%]);

  • Low‐resourced (59% [95% CI = 43% to 73%]);

  • Urban setting (56% [95% CI = 41% to 71%]).

Interestingly, many of the above factors identified by our survey respondents were echoed in the quantitative analysis comparing features of self‐designated county programs on the EMRA Match site to publicly available data sources. Self‐designated county programs were more likely to be based at hospitals that are located in urban centers, receive government funding (local, district/authority), have higher ED patient volumes, and care for a more underserved population as defined by a significantly higher percent of DSH patients when compared to self‐designated community and university programs. Additionally, county training programs tend to be older, perhaps pointing to the role that these mission‐driven institutions have in the foundation and ongoing history of our specialty.

However, there were some differences noted between the themes pulled from the survey respondents and the characteristics identified from those programs that self‐designate as county on the EMRA Match. While survey respondents noted a government funding source as a distinction of county programs, the comparative data found an association between county programs and local or district/authority funding, but not state funding. Distinction by funding source is complicated by the fact that many university systems receive funding from state governments and that governmental hospital districts/authorities have significant variation in size and scope, from operating a single hospital to operating multiple hospitals across a state or region. Other themes from the survey were notably absent in the comparative data set, including the level of resources available at the county program primary training site and that site's trauma center designation. While survey respondents felt that county programs tend to be underresourced, there is no agreed‐upon measure of resource availability that is easily extracted from public data.

Our survey is not the only evidence that the definition of a “county program” is unclear. When examining the residency directories (SAEM, EMRA, and FREIDA) commonly used by medical students, significant discrepancies exist in which programs are listed as county (Table 4).9, 10, 11 FREIDA has no “county program” designation, and some programs designated as county in the EMRA and SAEM directories appear in both the academic and the community categories in the FREIDA directory. This is important because EM‐interested students list the style of training program as their number one deciding factor when choosing an EM residency program and use training environment as a common filter on EMRA Match.4, 5 This discrepancy across the various residency directories has the potential to create significant confusion for interested applicants and faculty mentors.

TABLE 4.

County program by directory listing

Listed as county in both the SAEM and EMRA directories (self‐designated):

  • Alameda Health System–Highland Medical Center (Highland)

  • Arrowhead Regional Medical Center

  • Emory University (Grady Memorial Hospital)

  • Harbor–UCLA Medical Center (UCLA–Harbor)

  • Hennepin County Medical Center (Hennepin)

  • Jackson Health System/University of Miami (Jackson Memorial)

  • John Peter Smith Health Network (JPS)

  • Kings County Hospital/SUNY Downstate (Kings County)

  • Los Angeles County + University of Southern California (LAC+USC)

  • MetroHealth Medical Center, Cleveland

  • NYU Bellevue Hospital Center (Bellevue)

  • Texas Tech University Health Sciences Center, El Paso

  • University of Florida College of Medicine/Jacksonville (UF Jacksonville)

  • University of Nevada/Las Vegas (Las Vegas)

  • University of Texas Southwestern (UTSW)

Listed as county in the EMRA directory only (self‐designated):

  • Baylor College of Medicine

  • Boston University

  • Case Western Reserve University/Metro Health

  • Comanche County Memorial Hospital

  • Cook County Hospital

  • Denver Health & Hospital Authority

  • Detroit Medical Center, Wayne State

  • Jacobi/Montefiore

  • Kern Medical Center

  • Lincoln Medical and Mental Health Center

  • Louisiana State University New Orleans

  • Maricopa Medical Center (Maricopa)

  • Metro Health University of Michigan Health

  • New York Medical College Metropolitan

  • Rutgers New Jersey Medical School

  • University of Arizona, Tucson

  • University of California Fresno

  • University of Central Florida Ocala

  • University of Missouri Truman Medical Center

  • University of Washington

FREIDA directory: no county program designation.

IMPLICATIONS FOR EDUCATION AND TRAINING IN EM

Both the survey respondents and the factors associated with self‐designated county programs identified common themes across programs considered county, including a shared mission, both clinical and academic, to serve a medically underserved and diverse patient population; a local funding source; and an overall lower‐resourced setting than other hospital systems. Additional refinement of the descriptive characteristics of various EM residency training styles might further assist applicants and advisors in deciding which programs best fit their learning styles. Programs referring to themselves as county programs should meet the shared mission concept and be able to justify how they meet the definition of a county program, perhaps by using the above CART results to demonstrate care for an underserved population, as evidenced by a high percentage of DSH patients. Historically, the CORD County Program Community of Practice has embraced this mission‐driven, egalitarian philosophy by starting the unified medical student interview release date, open to all program types, which creates an equitable distribution of interview invitations and a clear timeline that provides maximum opportunity for both programs and applicants.

For applicants interested in county programs, these programs want to know that applicants are driven by a similar mission to serve, and applicants should make sure their application reflects this. A SLOE from another county program is important for assuring fit between applicant and program, as noted by the more than 50% of respondents who recommended rotating at a county program.

The term “county program” is colloquially linked to a funding source; however, the preliminary data revealed a communal definition centered on mission in the survey versus payer mix in the CART analysis, further contributing to a heterogeneous definition. Additionally, standardization across the various residency directories should be sought. These results clearly show the need for a broader qualitative analysis to better establish the definition of a county program. Reconsideration of the moniker “county program” as a descriptive term should also be addressed at a national level; in this work, the historic moniker was kept in an attempt to define it. Recent literature also questions whether the distinction will matter as much in the future, given the frequent consolidation and mergers of large hospital systems; these divisions may fade in this day and age.18 This exploratory analysis is not in any way meant to definitively establish a definition of county programs, but rather to identify common themes, offer some direction to residency applicants, and most importantly, begin a discussion among the EM community.

Notable limitations of this study include a small survey sample, largely biased toward self‐identified county program respondents. We did not collect data on the specific program each respondent represented and thus cannot comment on overrepresentation from a single institution. We did not widely test the survey questions prior to implementation, and some responses exposed ambiguity in the wording of the initial question, which may have resulted in lower quality or more variable answers. Overall, we had a limited number of responses, and the vast majority of respondents self‐identified as working at a county program, perhaps skewing the identified themes toward county programs defining themselves rather than capturing the perception of county programs by noncounty respondents. The response rate was exceedingly low, which could have been related to the survey title including the word “county”; some listserv members may have thought only county programs should respond. That being said, given that this is a preliminary analysis to discover themes linking county programs, even this low response rate offers a meaningful number of responses and provides valuable insight into perceptions of county programs. Because the majority of the respondents either currently work at or have worked at a county program, the results reflect more of a self‐designation than the views of the EM community as a whole, which limits our exploratory inquiry and preliminary data.

The primary limitation of the analysis of program characteristics by training environment and of the CART analysis is that, of the 257 primary training sites that would have been eligible for inclusion, 41 had to be excluded because they did not self‐classify their type of primary training environment. Additionally, because county programs were defined solely by self‐designation, there may be disagreement among the academic community about which programs ought to be identified as county. Some programs were misclassified by CART due to limitations of the methodology, while others were likely misclassified because they do not share the same set of core characteristics.

More quantitative and qualitative work is needed in this area, specifically focusing on medical students. This concept paper represents a first look at defining what county means as described by survey respondents, who were primarily faculty working at programs self‐identified as county, supported by objective characteristics associated with self‐identified county training programs. The next step may be a more directed survey of faculty at programs not self‐identified as county, and of medical students, to help further inform the definition. With the changes to the interview process brought by COVID‐19 and the potential implications of future pandemics, having clear definitions will provide students with more accurate information to help them determine their “fit” with programs. In the near future, many students may not have the chance to rotate at a county program to determine whether it is the kind of training environment in which they will thrive. Providing them with other opportunities to obtain a better understanding of what county programs offer will be of utmost importance.

Key residency directories list varying programs as county training facilities, and some lack the designation altogether. This work provides qualitative descriptive results via a mixed‐methods inquiry utilizing survey data and quantitative data about self‐designated county programs, along with an in‐depth discussion and summary recommendations on what it means to be identified as a county program in EM. To be considered a county program, we recommend that some or most of the following attributes be present:

  1. A shared mission, both clinical and academic, to medically underserved and vulnerable patient populations.

  2. An urban location with a city, county, or district/authority funding source.

  3. An ED with high patient volumes.

  4. A training ethos that supports significant resident autonomy.

  5. Departmental research expertise and a track record of publication in supporting care for the underserved.

CONFLICT OF INTEREST

TYS is a consultant for the Novartis sickle panel and has received funding personally from Novartis for this consulting. The other authors have no potential conflicts to disclose.

AUTHOR CONTRIBUTION

Study concept and design: Jennie A. Buchanan, Maria Moreira, Taku Taira, Richard Byyny, Todd Andrew Taylor, W. Gannon Sungar, Christy Angerhofer, Sean Dyer, Melissa White, Dhara Amin, Michelle D. Lall, David Caro, Melissa E. Parsons, Teresa Y. Smith. Acquisition of the data: Christy Angerhofer, Jennie A. Buchanan, Teresa Y. Smith, Maria Moreira. Analysis and interpretation of the data: Richard Byyny, Taku Taira, Zachary Jarou, Jennie A. Buchanan, Christy Angerhofer. Drafting of the manuscript: Jennie A. Buchanan, Maria Moreira, Taku Taira, Richard Byyny, Zachary Jarou, Todd Andrew Taylor, W. Gannon Sungar, Christy Angerhofer, Sean Dyer, Melissa White, Dhara Amin, Michelle D. Lall, David Caro, Melissa E. Parsons, Teresa Y. Smith. Critical revision of the manuscript for important intellectual content: Jennie A. Buchanan, Maria Moreira, Taku Taira, Richard Byyny, W. Gannon Sungar, Zachary Jarou, Teresa Y. Smith. Statistical expertise: Richard Byyny, Taku Taira, Zachary Jarou. Acquisition of funding: N/A.

Buchanan JA, Moreira M, Taira T, et al. Defining “county”: A mixed‐methods inquiry of county emergency medicine residency programs. AEM Educ Train. 2021;5(Suppl. 1):S87–S97. 10.1002/aet2.10664

This work has not been nationally presented but originated from the CORD County Program Community of Practice working group. No funding was provided for this work.

Supervising Editor: Alden Landry, MD, MPH.
