Oral and maxillofacial surgery residents and faculty/attending surgeons receive far too many surveys. Every month we receive new links to SurveyMonkey, Google Forms, or REDCap, followed by reminder electronic mails (e-mails) whether or not we complete the form. Questionnaires are not limited to research requests, either; they are generated by our hospitals, our dental schools, and any and all businesses that have access to our e-mail addresses. Needless to say, we all have survey fatigue. The concern for the Journal of Oral and Maxillofacial Surgery (JOMS), however, is that low response rates are leading to unreliable results and limited scientific value for publication purposes.
Because our institution accounts for a fair number of the survey-related publications in the JOMS, we are all too familiar with these flaws. The impetus for this perspectives piece originates from our own frustration with poor response rates to recent survey requests that aimed to evaluate significant issues in our training institutions. For example, to identify the purpose and value of noncategorical internships, we recently surveyed 101 accredited oral-maxillofacial surgery programs by e-mail; we received 1 complete response (<1.0%).
As residents and academicians, we are familiar with the busy workflow that prevents us from completing a short 5-minute survey. However, we also know that when properly constructed, surveys offer a sound methodological tool for examining residency programs. Anonymous surveys are one of the few means we have of gathering information directly from residents.
With this in mind, we aimed to review the surveys in the JOMS during the last 10 years. What is the quality of our publications with regard to the number of surveys distributed and the number of surveys completed?
We conducted a PubMed search on September 14, 2020 [((survey) OR (surveys)) AND (journal of oral and maxillofacial surgery)], limited to the years 2010 to 2020. The search returned 212 results. Articles of interest included surveys that queried oral-maxillofacial surgery residents and oral-maxillofacial surgery program directors (PDs). Articles querying patients, as well as cross-sectional surveys, were excluded. We also excluded surveys of oral-maxillofacial surgery providers at large because sampling this much larger population entails an inherently different study design.
We located 18 articles published in the JOMS during the last 10 years that used surveys to gather data on oral-maxillofacial surgery residents and PDs (Table 1). Ten of these surveys focused on oral-maxillofacial surgery residents, 7 focused on oral-maxillofacial surgery PDs, and 1 queried both residents and PDs.
Table 1.
| Title | First Author | Surveys Distributed | Included Surveys | Response Rate (%) | Notes |
|---|---|---|---|---|---|
| PD surveys | | | | | |
| How important are letters of recommendation? A survey of oral and maxillofacial surgery residency program directors | Laskin | 122 | 41 | 33.6 | ∗Included former PDs |
| Early effects of COVID-19 on oral and maxillofacial surgery residency training—Results from a national survey | Huntley | 101 | 13 | 12.9 | |
| Trends and attitudes regarding head and neck oncologic surgery: A survey of United States oral and maxillofacial surgery programs | Clark | 101 | 63 | 62.4 | |
| Attitudes and opinions of residency directors and residents about the importance of research in oral and maxillofacial surgery residencies | Mohammed | 101 | 44 | 43.6 | |
| How many temporomandibular joint total joint alloplastic implants will be placed in the United States in 2030? | Onoriobe | 101 | 53 | 52.5 | |
| Characteristics of oral and maxillofacial surgery residencies that result in graduating residents entering academic positions | Sarraf | 89 | 44 | 49.4 | ∗Civilian programs only |
| Sum | | | 258 | | |
| Average | | | 43 | 42.4 | |
| Resident surveys | | | | | |
| Residency interview experiences in oral and maxillofacial surgery differ by gender and affect residency ranking | Lee | 1,150 | 165 | 14.3 | |
| Oral and maxillofacial surgery resident perception of personal achievement and anxiety: A cross-sectional analysis | Al Atassi | 1,150 | 238 | 20.7 | |
| Early effects of COVID-19 on oral and maxillofacial surgery residency training—Results from a national survey | Huntley | 1,150 | 160 | 13.9 | |
| Factors associated with the mental health and satisfaction of oral and maxillofacial surgery residents in the United States: A cross-sectional study and analysis | Smith | 1,150 | 300 | 26.1 | |
| What a shame: Increased rates of OMS resident burnout may be related to the frequency of shamed events during training | Shapiro | 1,150 | 217 | 18.9 | |
| Does a difference exist in Comprehensive Basic Science Examination scores of 4-year versus 6-year oral-maxillofacial surgery residents? | James | 1,150 | 68 | 5.9 | |
| Oral and maxillofacial surgery residents have poor understanding of biostatistics | Best | 1,150 | 112 | 9.7 | |
| Sum | | | 1,260 | | |
| Average | | | 180 | 15.7 | |

Abbreviations: COVID-19, coronavirus disease 2019; OMS, oral and maxillofacial surgery; PDs, program directors.

∗ Modified population.
Of the 11 resident surveys, 3 focused on senior residents and 1 sampled only 535 residents; these 4 were not included in the statistical analysis. The remaining 7 surveys queried all oral-maxillofacial surgery residents at Commission on Dental Accreditation (CODA)-accredited oral-maxillofacial surgery programs, which amounts to approximately 1,150 residents, depending on the specific year. The average number of surveys returned was 180 ± 72.6 (range, 68 to 300), for an average response rate of 15.7% (range, 5.9 to 26.1%).
Of the 8 surveys of PDs, 1 focused on dental school-based oral-maxillofacial surgery programs and 1 focused on integrated oral-maxillofacial surgery-medicinae doctor programs; these 2 were not included in the analysis. The remaining 6 surveys queried all PDs at CODA-accredited oral-maxillofacial surgery programs in the United States, which amounts to approximately 101 programs, depending on the year. The average number of surveys returned was 43 ± 15.3 (range, 13 to 63), for an average response rate of 42.4% (range, 12.9 to 62.4%).
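For transparency, the summary figures quoted above can be recomputed directly from the counts in Table 1. The brief Python sketch below shows one way to do so; the counts are transcribed from Table 1, and the use of a population (rather than sample) standard deviation is our assumption about how the ± values were derived.

```python
import statistics

# (surveys returned, surveys distributed) transcribed from Table 1
resident_surveys = [(165, 1150), (238, 1150), (160, 1150), (300, 1150),
                    (217, 1150), (68, 1150), (112, 1150)]
pd_surveys = [(41, 122), (13, 101), (63, 101), (44, 101), (53, 101), (44, 89)]

def summarize(label, surveys):
    returned = [r for r, _ in surveys]
    rates = [100 * r / d for r, d in surveys]
    print(f"{label}: {statistics.mean(returned):.0f} ± {statistics.pstdev(returned):.1f} returned "
          f"(range {min(returned)} to {max(returned)}), "
          f"mean response rate {statistics.mean(rates):.1f}%")

summarize("Resident surveys", resident_surveys)  # 180 ± 72.6 (range 68 to 300), 15.7%
summarize("PD surveys", pd_surveys)              # 43 ± 15.3 (range 13 to 63), 42.4%
```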
The concern with these studies is not simply the low number of responses; indeed, a well-designed survey can garner significant, reliable information even from a small sample. Rather, our concern relates to the low response rate, particularly when we aim to distribute surveys to all 1,150 residents and receive only 68 to 300 responses. This creates substantial potential for nonresponse bias. For example, if we are trying to gain insight into differences in Comprehensive Basic Science Examination (CBSE) scores among oral-maxillofacial surgery residents, the 68 residents who respond to the survey may have significantly different characteristics from the 1,082 residents who did not respond. Nonresponse bias becomes particularly concerning when questionnaires involve sensitive information (eg, CBSE scores).
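To make the potential magnitude of this bias concrete, consider a back-of-the-envelope calculation. The sketch below uses entirely hypothetical score values (not real CBSE data) to show that, at a 5.9% response rate, nearly the whole difference between respondents and nonrespondents flows into the published estimate.

```python
# Hypothetical illustration of nonresponse bias; the scores below are invented
# for illustration and are not real CBSE data.
n_total = 1150                 # all OMS residents surveyed
n_resp = 68                    # residents who responded (5.9%)
n_nonresp = n_total - n_resp   # 1,082 nonrespondents

mean_resp = 70.0               # hypothetical mean CBSE score among respondents
mean_nonresp = 62.0            # hypothetical (unobservable) mean among nonrespondents

# The true population mean is the response-weighted average of both groups.
true_mean = (n_resp * mean_resp + n_nonresp * mean_nonresp) / n_total
bias = mean_resp - true_mean
print(f"survey estimate {mean_resp:.1f}, true mean {true_mean:.1f}, bias {bias:+.1f}")
# Prints: survey estimate 70.0, true mean 62.5, bias +7.5
# With 94% of residents unaccounted for, an 8-point respondent/nonrespondent gap
# carries roughly 7.5 points of bias into the reported mean.
```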
Standard epidemiologic principles indicate that response rates should be close to 80% to prevent nonresponse bias. Some medical journals require a 60% minimum response rate for publication of surveys, which increases to 80% if the survey is for the purpose of health science education.1
Given our own experience at the University of Illinois at Chicago, an 80% response rate seems completely unattainable. We know from evaluating the journals of related specialties, such as otolaryngology and plastic surgery, that they report similarly low response rates (∼20 to 40%) in their resident survey publications. By contrast, a recent review of medical education journals found a 71.3% response rate for resident/trainee surveys (range, 26.6 to 100.0%).2
Should we eliminate these survey studies? Should we perform more cross-sectional surveys? For instance, we could request data from the National Board of Medical Examiners (NBME) (re: CBSE), the American Dental Education Association Postdoctoral Application Support Service, or the American Board of Oral and Maxillofacial Surgery (re: Oral and Maxillofacial Surgery In-Service Training Examination). Indeed, one of the largest population groups that has not been adequately surveyed is applicants who fail to match into an oral-maxillofacial surgery program; typically, there is a 50% failure rate (approximately 200 unmatched applicants of 400 total applicants). It would be interesting to determine their average NBME CBSE scores, class ranks, and the number of programs they applied to and interviewed at, to help guide future applicants. However, although these organizations may have these useful data stored in their databases, they are not at liberty to provide this information to third-party investigators.
So, how do we change our approach to surveys to improve response rates?
Currently, the JOMS author guidelines do not set minimum standards for surveys. Certainly, a minimum response rate, however arbitrary, could be selected by the editors and editorial board. Future surveys could also align with national and international standards for calculating acceptable response rates, such as those of the American Association for Public Opinion Research (AAPOR).3 Other proven strategies for improving response rates include financial incentives for respondents; at least 3 automatic reminder notifications; prenotification of the survey intent; and the use of multiple survey modalities (eg, paper forms, e-mail responses, and dedicated survey Web sites).4
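As an illustration of what aligning with the AAPOR standard definitions3 might look like in practice, the sketch below computes 2 of the standard response rates (RR1, completes only; RR2, completes plus partials) from tracked case dispositions. The disposition counts are hypothetical and are shown only to illustrate the calculation.

```python
# Hypothetical case dispositions for an e-mailed resident survey (counts are invented).
complete = 160        # I: complete questionnaires returned
partial = 25          # P: partial questionnaires returned
refusal = 40          # R: explicit refusals or break-offs
non_contact = 900     # NC: eligible residents who never responded
other = 0             # O: other eligible nonrespondents
unknown = 25          # U: unknown eligibility (eg, bounced e-mail addresses)

denominator = complete + partial + refusal + non_contact + other + unknown
rr1 = complete / denominator              # AAPOR minimum response rate
rr2 = (complete + partial) / denominator  # counts partial returns as responses
print(f"RR1 = {rr1:.1%}, RR2 = {rr2:.1%}")  # RR1 = 13.9%, RR2 = 16.1%
```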
However, we propose an alternative option: the annual resident survey.
For the past 102 years, our specialty has enjoyed a regular event that reliably engages oral-maxillofacial surgeons, residents, faculty, and programs across the country: the American Association of Oral and Maxillofacial Surgeons (AAOMS) Annual Meeting and Scientific Sessions. This meeting marks the debrief from the prior year, with a concerted plan for the year(s) ahead. We propose that this time also serve as the deadline for completion of the annual resident survey by all PDs and residents.
Oral-maxillofacial surgery PDs already complete 2 separate yearly surveys, 1 from CODA and another from AAOMS. These surveys require an enormous amount of time and effort to complete. However, the CODA and AAOMS surveys query only the PDs, who provide basic demographic data, CBSE scores, and yearly training experience for each resident. Neither survey collects any information from residents directly, and residents currently do not complete any yearly survey. The annual resident survey would be the only 1 of these 3 to probe the subjective emotions or individual reasoning of the respondents. In a unique fashion, this survey would help assess why oral-maxillofacial surgery programs function in a certain manner and offer us the ability to track how these characteristics change over time. Only 1 prior example of this is known to us: a 1976 questionnaire repeated in 2000, which showed that residents had altered their selection criteria when applying to residency.5
We propose a small study group/task force, composed of residents and PDs, to construct a 20- to 30-question survey at the annual AAOMS meeting; perhaps the committee meetings of the Resident Organization of the American Association of Oral and Maxillofacial Surgeons would be an ideal forum for such an activity. The group could focus on a different topic each year, combine multiple questions of interest, or repeat a prior survey to obtain more information. Planning the upcoming survey at this time would help ensure a solid study design. This could become a concerted effort of our specialty and one from which we could all eventually benefit. If a sponsor were identified, financial incentives could also be offered to oral-maxillofacial surgery programs (PDs, faculty, and residents) that achieve a response rate >80%.4
Certainly, other surveys could be distributed throughout the year to special interest groups or AAOMS clinical interest groups, as well as to oral-maxillofacial surgery surgeons in private practice or part-time faculty positions. This annual resident survey, however, would focus on contributing to the understanding and advancement of oral-maxillofacial surgery training programs using a unique approach: obtaining information about resident training directly from the residents themselves! This effort can be publicized at the AAOMS annual meeting as well as in the JOMS. Yearly publication of the survey results in the JOMS could further ritualize the successful completion of this annual resident survey.
Can we ever realistically expect a response rate of more than 60%, more than 70%, or even more than 80%? It seems that without a response rate approaching 80%, we are not producing scientifically valid information, and much time is wasted in these efforts. It is time for a change.
Footnotes
Conflict of Interest Disclosures: Dr Miloro is a consultant for AxoGen, Inc, Alachua, FL. None of the other authors have any relevant financial relationships with a commercial interest.
References
1. Fincham J.E. Response rates and responsiveness for surveys, standards, and the Journal. Am J Pharm Educ. 2008;72:43. doi: 10.5688/aj720243.
2. Phillips A.W., Friedman B.T., Utrankar A. Surveys of health professions trainees: Prevalence, response rates, and predictive factors to guide researchers. Acad Med. 2017;92:222. doi: 10.1097/ACM.0000000000001334.
3. The American Association for Public Opinion Research (AAPOR). Standard definitions: Final dispositions of case codes and outcome rates for surveys. 2016. https://www.aapor.org/AAPOR_Main/media/publications/Standard-Definitions20169theditionfinal.pdf
4. Phillips A.W., Reddy S., Durning S.J. Improving response rates and evaluating nonresponse bias in surveys: AMEE Guide No. 102. Med Teach. 2016;38:217. doi: 10.3109/0142159X.2015.1105945.
5. Spina A.M., Smith T.A., Marciani R.D., Marshall E.O. A survey of resident selection procedures in oral and maxillofacial surgery. J Oral Maxillofac Surg. 2000;58:660 (discussion 666-667). doi: 10.1016/s0278-2391(00)90162-9.