Abstract
Objective
To assess whether a combination of Internet‐based and postal survey methods (mixed‐mode) compared to postal‐only survey methods (postal‐only) leads to improved response rates in a physician survey, and to compare the cost implications of the different recruitment strategies.
Data Sources/Study Setting
All primary care gynecologists in Bremen and Lower Saxony, Germany, were invited to participate in a cross‐sectional survey from January to July 2014.
Study Design
The sample was divided into two strata (A; B) depending on the availability of an email address. Within each stratum, potential participants were randomly assigned to the mixed‐mode or the postal‐only group.
Principal Findings
In Stratum A, the mixed‐mode group had a lower response rate than the postal‐only group (12.5 vs. 20.2 percent; RR = 0.61, 95 percent CI: 0.44–0.87). In Stratum B, no significant differences were found (15.6 vs. 16.4 percent; RR = 0.95, 95 percent CI: 0.62–1.44). Total costs (in €) per valid questionnaire returned (Stratum A: 399.72 vs. 248.85; Stratum B: 496.37 vs. 455.15) and per percentage point of response (Stratum A: 1,379.02 vs. 861.02; Stratum B: 1,116.82 vs. 1,024.09) were higher, whereas variable costs were lower in mixed‐mode than in the respective postal‐only groups (Stratum A cost ratio: 0.47; Stratum B cost ratio: 0.71).
Conclusions
In this study, primary care gynecologists were more likely to participate by traditional postal‐only than by mixed‐mode survey methods that first offered an Internet option. However, the lower response rate for the mixed‐mode method may be partly due to the older age structure of the responding gynecologists. Variable costs per returned questionnaire were substantially lower in mixed‐mode groups and indicate the potential for cost savings if the sample population is sufficiently large.
Keywords: Physician survey, mixed‐mode, methods, response rate, cost implications
Introduction
Physician surveys play an important role in health care research. They are an essential source of information on physicians' knowledge, attitudes, beliefs, and practices related to new or controversial technologies, decision making regarding specific interventions, or the implementation of public health interventions (Kellerman and Herold 2001; Klabunde et al. 2012). Previous reviews have shown that, on average, response rates of physician surveys are substantially lower than those of nonphysician surveys (Cummings, Savitz, and Konrad 2001; VanGeest, Johnson, and Welch 2007). Consequently, physician surveys have limitations in the validity and generalizability of their findings (Kellerman and Herold 2001; Cull et al. 2005; VanGeest, Johnson, and Welch 2007; Martins et al. 2012; Willis, Smith, and Lee 2013). Furthermore, there is evidence suggesting that response rates in physician surveys may even be declining further (Cull et al. 2005; Cook, Dickinson, and Eccles 2009; McLeod et al. 2013).
Several systematic reviews have examined how best to increase response rates among physicians, mostly focusing on traditional approaches, that is, postal, fax, or telephone surveys (Kellerman and Herold 2001; VanGeest, Johnson, and Welch 2007). For example, a review including 24 studies examining survey methodology (until the end of the 1990s) found that prenotification, personalized mail‐outs, and nonmonetary incentives were not associated with improved response rates among physicians (Kellerman and Herold 2001). More recent reviews, however, have shown that prenotification letters and sponsorship or endorsement by organizations salient to the survey's target population are important factors in establishing the survey's relevance and in facilitating participation (Field et al. 2002; Edwards et al. 2009; McLeod et al. 2013). Regarding the mode of survey contact, postal and telephone surveys achieved higher participation rates than fax or Internet‐based surveys (VanGeest, Johnson, and Welch 2007; McLeod et al. 2013). Similarly, mixed‐mode, telephone‐only, and postal‐only surveys obtain higher response rates than email‐only or Internet‐only surveys (VanGeest, Johnson, and Welch 2007; McLeod et al. 2013).
Although researchers have started to examine response rates in physician surveys using the Internet, for example, email or Internet‐only surveys (Braithwaite et al. 2003) and mixed‐mode surveys, that is, initial postal survey followed by Internet‐based or vice versa (McMahon et al. 2003; Beebe et al. 2007; Matteson et al. 2011), evidence is still limited on how best to increase response rates in physician surveys while keeping resource needs low. Mixed‐mode surveys may improve response rates, as long as the different survey modes are presented sequentially: A fixed sequence is important because providing a choice of different modes simultaneously can lead to complexity in the decision‐making process for potential respondents and may reduce response rates (Beebe et al. 2007; Klabunde et al. 2012). However, it is still unclear whether this finding also holds when surveying physicians: Do mixed‐mode surveys lead to improved response rates under certain circumstances, that is, if a primary email address is available for delivering a survey link?
The purpose of this study was, first, to assess whether a combination of Internet‐based and postal survey methods, that is, mixed‐mode, leads to improved response rates among primary care gynecologists compared to postal‐only survey methods and, second, to compare the cost implications of the different recruitment strategies.
Methods
This physician survey was part of a case–control study entitled “Care‐related factors associated with antepartal diagnosis of intrauterine growth restrictions—A case‐control study.” The aims and the study design of the overall project are described elsewhere (Ernst et al. 2014).
Study Design and Participants
Primary care gynecologists in two German federal states (Bremen and Lower Saxony) were eligible for the survey. We invited all gynecologists contracted to the Association of Statutory Health Insurance Physicians (ASHIP) in Bremen (Association of Statutory Health Insurance Physicians Bremen 2015) or Lower Saxony (Association of Statutory Health Insurance Physicians Lower Saxony 2015) at the time of study conduct. ASHIP is part of the statutory health insurance (SHI) system in Germany. The SHI insures roughly 90 percent of the German population, and virtually all primary care gynecologists are members of the ASHIP (Federal Association of Sickness Funds [GKV Spitzenverband] 2016). Gynecologists based only in hospitals in the study region were not eligible, whereas those providing services at hospitals in addition to their primary care work were considered eligible. The study region covers a geographical area of 48,033 km2 (Bremen: 419 km2; Lower Saxony: 47,614 km2) with 8,447,950 (Bremen: 657,391; Lower Saxony: 7,790,559) residents as of December 31, 2013.
We conducted a randomized trial to assess the effect of mixed‐mode survey methods on response rates and cost implications when conducting a physician survey. This type of survey may also be described as a staged Internet‐postal survey, but following other publications (e.g., Beebe et al. 2007), we use the term "mixed‐mode survey." Postal and email addresses were collected for all eligible primary care gynecologists via Internet search. Postal addresses of all eligible gynecologists are provided on the ASHIP's websites (publicly available lists). To ensure the validity of collected email addresses, we exclusively accessed officially provided websites: First, we accessed publicly available lists provided by the ASHIP and the Federal Association of Gynecologists (Bundesverband der Frauenärzte e.V.) (Federal Association of Gynecologists 2016). Second, if available, we accessed gynecologists' practice websites for adequate email addresses. Both gynecologists' private and practice‐related email addresses were included. Email addresses were not available for all participants. To avoid bias based on the availability of email addresses, we split the sample into two strata: Stratum A included all potential participants with a publicly available email address, and Stratum B those without an email address. Within each stratum, the potential survey participants were randomly assigned to the mixed‐mode or the postal‐only group (Figure 1). Consequently, the mixed‐mode approach used in Stratum A differed from that in Stratum B, due to the availability of email addresses. The allocation ratio was 1:1 in each stratum, using a random number algorithm in Microsoft Excel.
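The stratified 1:1 allocation described above can be sketched as follows. This is an illustrative re-implementation in Python (the study itself used a random number algorithm in Microsoft Excel); the function name `allocate_one_to_one` and the fixed seed are our own choices and not part of the original procedure.

```python
import random

def allocate_one_to_one(ids, seed=42):
    """Randomly assign the participants of one stratum to two study
    arms with a 1:1 allocation ratio (illustrative sketch only)."""
    ids = list(ids)
    rng = random.Random(seed)  # fixed seed for reproducibility
    rng.shuffle(ids)
    half = len(ids) // 2  # with an odd count, one arm gets one extra case
    return {"mixed_mode": ids[:half], "postal_only": ids[half:]}

# Stratum A (email address available): 691 eligible gynecologists
stratum_a = allocate_one_to_one(range(691))
# Stratum B (no email address): 450 eligible gynecologists
stratum_b = allocate_one_to_one(range(450))
```

With these stratum sizes, the sketch reproduces the arm sizes reported in the Results (345 vs. 346 in Stratum A; 225 vs. 225 in Stratum B).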
Figure 1.
Allocation to Study Arms and Recruitment Strategies
Recruitment Strategies
The survey was conducted from January to July 2014. To establish the survey's relevance, gynecologists were informed about the survey in advance through newsletters and announcements in the institutional bulletins of the ASHIP in Bremen and Lower Saxony and of the Federal Association of Gynecologists (Bundesverband der Frauenärzte e.V.) 2 weeks before the survey started. Eligible gynecologists were invited to participate using a multiple‐stage approach with up to three survey contacts.
In Stratum A, all participants in the mixed‐mode group received an invitation to the Internet‐based survey via email, containing an embedded link and a personalized access key. If respondents had not accessed and completed the survey after 3 weeks, we sent a reminder email with the same link and access key. If the survey had not been completed after 6 weeks, a postal letter with a paper‐based questionnaire was sent out (mode switch). In Stratum B, the mixed‐mode group received an initial postal invitation to the Internet‐based survey, containing a personalized access key and the written link to the survey. Three weeks after the first contact, a reminder letter was sent if the survey had not been completed. After 6 weeks, the potential participants received a postal letter with a paper‐based questionnaire (mode switch). The same recruitment strategy was applied to the postal‐only groups in Stratum A and Stratum B: The potential survey participants received a postal invitation including a copy of the paper‐based questionnaire. If the questionnaire had not been returned after 3 weeks, a reminder letter was sent out without the questionnaire. A second reminder letter and a new copy of the questionnaire were sent if the potential survey participants had not responded after 6 weeks. Figure 1 provides an overview of the strategies.
Survey Questionnaire
The survey comprised a self‐administered Internet‐based or paper questionnaire containing 21 items on gynecologists' practice routines regarding intrauterine growth restriction (IUGR). Based on pretesting, the estimated time for completing the survey was 5–10 minutes.
To avoid mode effects, that is, changes of responses due to differences in the visual appearance of the questionnaires (Dillman 2000), the online survey design and layout were made as comparable to the postal version as possible. The questions were displayed in the same order and format as in the paper version. A professional web designer developed the study website that hosted the survey. The landing page of the website invited gynecologists to enter their assigned personal access key before the questionnaire was displayed. The study website provided information on the aims and objectives of the project. However, the invited gynecologists were blinded to the fact that they were taking part in a methodological experiment.
Descriptions of Cost Types
We recorded recruitment and study conduct costs in two categories: fixed and variable costs. Fixed costs are virtually independent of sample size. We split fixed costs further into general costs—that is, document layout, preparation of the dataset for the analyses, database maintenance, and technical support—and expenses for information technology—that is, programming and testing of the questionnaire for each survey, programming of the website, and testing of the databases. Variable costs included field costs—that is, expenses for printing and materials (letters, questionnaires, envelopes, and stamped addressed envelopes), expenses for response processing (telephone hotline, checking, scanning data, and sending letters), and postage. We did not include the costs of the Internet search for gynecologists' email addresses in the cost analysis because these costs will vary strongly by setting, depending on the availability of email distribution lists.
Statistical Analyses
We calculated response rates for each wave and compared those between the study arms within each stratum by calculating risk ratios (RRs) with 95 percent confidence intervals (CIs). In addition to reporting the cost items, we calculated the costs per valid questionnaire returned—that is, total costs divided by the number of valid questionnaires returned—and costs per percentage point of response (PPR)—that is, total costs divided by the response rate—for each cost category. We calculated PPR to account for differences in total sample size.
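The risk ratio comparison described above can be illustrated with the standard log-scale (Wald-type) confidence interval for two independent proportions; the paper does not state which CI formula was used, so rounding may differ marginally from the published figures. The function name `risk_ratio` is our own.

```python
import math

def risk_ratio(events_1, n_1, events_2, n_2, z=1.96):
    """Risk ratio of group 1 vs. group 2 with a Wald-type 95% CI
    computed on the log scale (a standard epidemiological method)."""
    p1, p2 = events_1 / n_1, events_2 / n_2
    rr = p1 / p2
    # Standard error of log(RR) for two independent proportions
    se_log = math.sqrt(1/events_1 - 1/n_1 + 1/events_2 - 1/n_2)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# Stratum A totals: 43/345 responses (mixed-mode) vs. 70/346 (postal-only)
rr, lower, upper = risk_ratio(43, 345, 70, 346)
```

With the Stratum A totals this yields RR ≈ 0.62 with a CI of roughly (0.43, 0.87), in close agreement with the reported RR = 0.61 (0.44–0.87); the small discrepancies are rounding artifacts.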
Ethics Statement and Consent
Ethical approval for all study procedures was obtained from the ethics review board of the Bremen Medical Association.
Results
A total of 1,141 primary care gynecologists were eligible to participate in the study (Figure 1) after removal of three ineligible or duplicate cases. Overall, email addresses could be identified for 60.4 percent (n = 691) of the primary care gynecologists (Stratum A); of these, 56.6 percent (n = 391) were personal and 43.4 percent (n = 300) were practice‐related email addresses. For 450 primary care gynecologists, no valid email address could be identified (Stratum B). In Stratum A, 345 primary care gynecologists were randomized to the mixed‐mode group and 346 to the postal‐only group. In Stratum B, 225 primary care gynecologists were randomized to each group.
In the overall sample, the mean age of the participants was 53.6 ± 7.6 years; more women (67.0 percent) than men (33.0 percent) participated in the survey. The majority of participating gynecologists had more than 5 years of practical experience (6–15 years: 41.1 percent, n = 76; >15 years: 38.9 percent, n = 72), and only a few had 5 years or less of practical experience (≤5 years: 15.7 percent, n = 29) (data not shown). Selected characteristics of responding gynecologists according to stratum and data collection mode are shown in Table 1.
Table 1.
Characteristics of Responding Gynecologists According to Stratum and Data Collection Mode
Values are % (n) unless otherwise indicated.

| | Stratum A: Mixed‐Mode (n = 43) | Stratum A: Postal‐Only (n = 70) | Stratum B: Mixed‐Mode (n = 35) | Stratum B: Postal‐Only (n = 37) |
|---|---|---|---|---|
| Sex: male | 34.9 (15) | 44.9 (31) | 20.0 (7) | 21.6 (8) |
| Sex: female | 65.1 (28) | 55.1 (39) | 80.0 (28) | 78.4 (29) |
| Age <50 years | 41.5 (18) | 33.8 (24) | 20.0 (7) | 20.0 (7) |
| Age >50 years | 58.5 (25) | 66.2 (46) | 80.0 (28) | 80.0 (30) |
| Age in years, mean ± SD | 52.4 ± 7.7 | 53.7 ± 8.7 | 54.4 ± 7.0 | 54.1 ± 5.5 |
| Practice location: urban* | 86.0 (37) | 71.0 (50) | 77.1 (27) | 59.5 (22) |
| Practice location: rural* | 14.0 (6) | 29.0 (20) | 22.9 (8) | 40.5 (15) |
| Practice type: single | 32.6 (14) | 40.6 (28) | 57.1 (20) | 45.9 (17) |
| Practice type: group | 53.5 (23) | 46.4 (32) | 37.1 (12) | 48.6 (18) |
| Practice type: clinic | 7.0 (3) | 10.1 (7) | 5.7 (2) | 2.7 (1) |
| Practice type: other | 7.0 (3) | 2.9 (2) | – | 2.7 (1) |
| DEGUM† certification: yes | 27.9 (12) | 23.2 (16) | 2.9 (1) | 5.4 (2) |
*Urban: >20,000 residents; rural: ≤20,000 residents.
†Refers to the ultrasonography diagnostics certification issued by the German Society of Ultrasound in Medicine (DEGUM).
Response Rates
In total, 185 primary care gynecologists completed the survey questionnaire, resulting in an overall response rate of 16.2 percent. In Stratum A, the mixed‐mode group had a significantly lower response rate than the postal‐only group (12.5 vs. 20.2 percent; RR = 0.61 [95 percent CI: 0.44–0.87]). In Stratum B, we observed no statistically significant difference between the study arms (15.6 vs. 16.4 percent; RR = 0.95 [0.62–1.44]; Table 2).
Table 2.
Response Rates by Study Arms and Data Collection Phase
Total N = 1,141 (Stratum A: n = 691; Stratum B: n = 450). Responses are n (%).

| Data Collection Phase | Stratum A: Mixed‐Mode (n = 345) | Stratum A: Postal‐Only (n = 346) | Risk Ratio (95% CI) | Stratum B: Mixed‐Mode (n = 225) | Stratum B: Postal‐Only (n = 225) | Risk Ratio (95% CI) |
|---|---|---|---|---|---|---|
| 1st wave | 9 (2.6) | 36 (10.4) | 0.25 (0.12–0.51) | 18 (8.0) | 22 (9.8) | 0.82 (0.45–1.48) |
| 2nd wave | 8 (2.3) | 12 (3.5) | 0.62 (0.25–1.48) | 4 (1.8) | 6 (2.7) | 0.65 (0.18–2.28) |
| 3rd wave* | 26 (7.5) | 22 (6.4) | 1.07 (0.62–1.85) | 13 (5.6) | 9 (4.0) | 1.40 (0.61–3.20) |
| Total | 43 (12.5) | 70 (20.2) | 0.61 (0.44–0.87) | 35 (15.6) | 37 (16.4) | 0.95 (0.62–1.44) |

*The cross‐over (mode switch) took place after the 2nd survey wave.
The difference in response rates in Stratum A was particularly striking after the first wave, with only 2.6 percent responding in the mixed‐mode group versus 10.4 percent in the postal‐only group (RR = 0.25 [0.12–0.51]). After the mode switch (third wave), when we sent postal copies of the questionnaire, the response rate in the mixed‐mode group in Stratum A increased by 7.5 percentage points, more than in the first (2.6 percentage points) and second waves (2.3 percentage points) in the same group (Table 2). In Stratum B, the response rates were quite similar between the study arms across all three waves.
Recruitment Costs
The total costs per valid questionnaire returned and per PPR were higher in the mixed‐mode groups than in the postal‐only groups (Stratum A: cost ratio = 1.60; Stratum B: cost ratio = 1.09; Table 3). However, when analyzing costs per category (i.e., fixed vs. variable costs), the picture changes slightly: Total fixed costs varied only slightly between survey modes, being €1,072 higher for the mixed‐mode groups (cost ratio: 1.07). This difference was caused by the higher costs for the required information technology. Fixed costs per valid questionnaire returned and per PPR were considerably higher in the mixed‐mode groups, with cost ratios ranging from 1.13 to 1.74. However, variable costs were lower in the mixed‐mode groups than in the respective postal‐only groups: Cost ratios for variable costs per valid questionnaire returned and per PPR were 0.77 in Stratum A and 0.76 in Stratum B.
Table 3.
Fixed and Variable Costs for Each Cost Category (in EUR)
| Costs | Stratum A: Mixed‐Mode | Stratum A: Postal‐Only | Cost Ratio | Stratum B: Mixed‐Mode | Stratum B: Postal‐Only | Cost Ratio |
|---|---|---|---|---|---|---|
| Fixed costs | | | | | | |
| General (e.g., database maintenance, technical support, document layout) | 12,108.56 | 12,108.56 | | 12,108.56 | 12,108.56 | |
| Information technology (e.g., programming software and website) | 3,911.20 | 2,839.00 | | 3,911.20 | 2,839.00 | |
| Total fixed costs | 16,019.76 | 14,947.56 | 1.07 | 16,019.76 | 14,947.56 | 1.07 |
| Fixed costs per valid questionnaire returned* | 372.55 | 213.54 | 1.74 | 475.71 | 403.99 | 1.17 |
| Fixed costs per percentage point of response† | 1,285.31 | 738.84 | 1.74 | 1,029.84 | 908.97 | 1.13 |
| Variable costs | | | | | | |
| Printing and material (e.g., letters, survey, reminder) | 261.06 | 767.88 | | 299.06 | 638.69 | |
| Response processing (e.g., telephone hotline, checking and scanning data) | 431.37 | 491.23 | | 460.73 | 459.77 | |
| Postage | 475.60 | 1,212.80 | | 593.25 | 794.60 | |
| Total variable costs | 1,168.03 | 2,471.91 | 0.47 | 1,353.04 | 1,893.06 | 0.71 |
| Variable costs per valid questionnaire returned* | 27.16 | 35.31 | 0.77 | 38.66 | 51.16 | 0.76 |
| Variable costs per percentage point of response† | 93.71 | 122.18 | 0.77 | 86.98 | 115.12 | 0.76 |
| Fixed and variable costs | | | | | | |
| Total fixed and variable costs | 17,187.79 | 17,419.47 | 0.99 | 17,372.80 | 16,840.62 | 1.03 |
| Total fixed and variable costs per valid questionnaire returned* | 399.72 | 248.85 | 1.60 | 496.37 | 455.15 | 1.09 |
| Total fixed and variable costs per percentage point of response† | 1,379.02 | 861.02 | 1.60 | 1,116.82 | 1,024.09 | 1.09 |
*Numbers of valid questionnaires returned are 43 (Stratum A) and 35 (Stratum B) in the intervention (mixed‐mode) groups, respectively, and 70 (Stratum A) and 37 (Stratum B) in the control (postal‐only) groups, respectively.
†Response rates are 12.5 percent (Stratum A) and 15.6 percent (Stratum B) in the intervention (mixed‐mode) groups, respectively, and 20.2 percent (Stratum A) and 16.4 percent (Stratum B) in the control (postal‐only) groups, respectively.
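As an arithmetic cross-check of Table 3, the two cost metrics defined in the Statistical Analyses section can be reproduced directly from the cost totals and response counts; `cost_metrics` is an illustrative helper name of our own, not part of the study's tooling.

```python
def cost_metrics(total_costs_eur, questionnaires_returned, sample_size):
    """Cost per valid questionnaire returned and cost per percentage
    point of response (PPR = total costs / response rate in percent)."""
    per_questionnaire = total_costs_eur / questionnaires_returned
    response_rate_pct = 100 * questionnaires_returned / sample_size
    per_ppr = total_costs_eur / response_rate_pct
    return per_questionnaire, per_ppr

# Stratum A, total fixed and variable costs (EUR), from Table 3
mm_q, mm_ppr = cost_metrics(17_187.79, 43, 345)   # mixed-mode
po_q, po_ppr = cost_metrics(17_419.47, 70, 346)   # postal-only
cost_ratio = mm_q / po_q  # ~1.61 before rounding; reported as 1.60
```

This reproduces the Stratum A entries of Table 3 (399.72 vs. 248.85 per valid questionnaire returned; 1,379.02 vs. 861.02 per PPR).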
Discussion
We conducted a randomized trial examining whether mixed‐mode survey methods are an alternative to more traditional postal‐only survey methods for surveys among physicians. The results showed that using a combination of Internet‐based and postal survey methods in a survey among primary care gynecologists with available email addresses led to lower response rates compared to traditional postal‐only survey methods.
Recruitment Costs
The potential to reduce costs is often put forward as a major advantage of Internet‐based survey methods over the more traditional postal survey methods. In our study, the total costs in the mixed‐mode groups were higher than in the respective postal‐only groups, both per valid questionnaire returned and per PPR. Variable costs, however, were substantially lower in the mixed‐mode groups, owing to the lower costs for printing/material and postage. Conversely, fixed costs per valid questionnaire returned and per PPR were considerably higher in the mixed‐mode groups, a consequence of their lower response rates. The savings in variable costs in the mixed‐mode groups thus did not offset the low response rates. A similar study examining (cost) differences between a mixed‐mode survey and a postal‐only survey found that costs per returned questionnaire were lower in the mixed‐mode survey; the authors concluded that the larger the sample, the greater the potential cost savings of a mixed‐mode survey (Zuidgeest et al. 2011). Our findings of lower variable but higher fixed costs in the mixed‐mode groups are in line with the results of that study. Depending on the additional fixed costs and the overall sample size, a mixed‐mode approach can be more cost‐effective in terms of costs per valid questionnaire returned or per percentage point of response, provided the sample is large. However, when surveying physicians, the target population, that is, specialists, is naturally limited in number. Thus, it may not be possible to attain a sample size large enough to achieve cost savings.
Strengths and Limitations
Our study had several strengths. First, our trial was conducted under real‐world conditions. Second, adequate email addresses could be identified beforehand for 60.4 percent of all eligible gynecologists. We received only three "bounce‐back" emails out of 345 sent, and only a few postal surveys were returned as undeliverable. However, we cannot rule out that surveys were delivered to incorrect email or postal addresses without generating a "bounce‐back" or mail return. Third, the potential survey participants in each stratum were randomly assigned to one of two recruitment strategies to control for unmeasured characteristics between potential participants with and without an email address.
The main limitation of our study is the low response rate in all groups compared to other physician surveys examining response rates with Internet‐based and postal survey methods (McMahon et al. 2003; Seguin et al. 2004; VandenKerkhof et al. 2004; Akl et al. 2005; Leece et al. 2006; Beebe et al. 2007; Grava‐Gubins and Scott 2008; Matteson et al. 2011), in which response rates of 24–80 percent were achieved in the postal groups and 29–63 percent in the Internet‐based experimental groups. In one of our earlier physician surveys in Germany, we obtained a higher response rate of 36.4 percent (Merzenich et al. 2012), similar to another physician survey at the national level with a response rate of 33 percent (Velasco et al. 2012). However, low response rates similar to the current survey are also an issue for other physician surveys in Germany, for example, one with a response rate of 16 percent (Gutsfeld et al. 2014). The specific objectives of the survey, targeting gynecologists' practice routines regarding pregnancies affected by IUGR, might have limited responses. Earlier studies have pointed to quality problems in antenatal care in Germany with regard to IUGR detection (Jahn, Razum, and Berle 1998, 1999; Bais et al. 2004). Thus, given the sensitive topic of our survey, gynecologists might have felt unwarrantedly inspected in their routine practice.
Due to limited resources we did not use monetary incentives. Previous studies have shown that physician participation in surveys could be successfully improved using incentives, in particular via monetary incentives offered in advance (prepaid; Kellerman and Herold 2001; VanGeest, Johnson, and Welch 2007; Flanigan, McFarlane, and Cook 2008). However, we used other proven strategies to increase the response in our survey, that is, prenotification and a letter of support from an influential professional organization.
Interpretation
There are several plausible explanations why the mixed‐mode approach did not increase response rates in our experiment. First, it is cumbersome to obtain valid and frequently accessed email contacts of physicians. Receptionists or assistants often refuse to provide email addresses without the physicians' permission. Furthermore, email messages from unfamiliar senders may be diverted by spam filters or simply ignored by the recipients (Klabunde et al. 2012). Our experience showed that primary care gynecologists are unlikely to provide their personal email addresses in officially provided lists. Similarly, it is currently not feasible to obtain a comprehensive listing of eligible primary care gynecologists (or other physicians) for survey purposes in Germany. Nevertheless, we identified email addresses for 60.4 percent of the sample through a multistage approach using online resources. In this regard, we are aware that gynecologists in practices or clinics may access their email accounts irregularly or not at all, or may have a receptionist or assistant screen and select incoming email messages. The older age structure of our study sample suggests that respondents in older age groups may be less inclined than younger ones to use an Internet‐based survey tool. A U.S. national survey among primary care physicians, for example, found that participants older than 60 years of age preferred postal survey methods, whereas those younger than 40 years of age preferred Internet‐based survey methods (Smith et al. 2011). Our results are in line with another study comparing mail and web‐based survey modes, which found that younger age groups tended to participate more often in the web survey (Kwak and Radler 2002). Because we surveyed primary care gynecologists only, our results may have limited generalizability to other physician specialties or to physicians in general. However, beyond the medical specialization, this group is not likely to differ substantially from other primary care physicians.
Overall, our presumption that mixed‐mode survey methods lead to increased response rates in a survey among primary care gynecologists could not be confirmed. It seems that sending the survey by email first has a negative effect on the overall response rate (Stratum A: mixed‐mode). However, the response to the postal follow‐up (7.5 percentage points; after the mode switch) was almost of the same order as in the first wave of the postal‐only group (10.4 percentage points). It is possible that the response rate in this group would have increased further if we had sent additional postal reminders, similar to the postal‐only arm.
Conclusion
Physician surveys will continue to be an essential tool for gathering information on physicians' knowledge, attitudes, beliefs, and practices. At least in our study sample, primary care gynecologists were more likely to participate via traditional postal‐only than via mixed‐mode survey methods. The lower response rate for the mixed‐mode survey methods may be partly due to the age structure of our study sample. Despite the lower overall response rate, the variable costs per returned questionnaire were substantially lower in the mixed‐mode groups and (depending on the magnitude and difference of fixed costs) indicate the potential for cost savings if the sample population is sufficiently large. Future research is needed to investigate whether the same response tendencies are seen for other physician specialties and across different age groups. It may be useful to repeat similar experiments (including with different sequences of survey modes) at regular intervals, as information technology is expected to change daily medical and research practice substantially over time.
Supporting information
Appendix SA1: Author Matrix.
Acknowledgments
Joint Acknowledgements/Disclosure Statement: We gratefully acknowledge the support of the Association of Statutory Health Insurance Physicians (ASHIP) in Bremen and Lower Saxony and the Federal Association of Gynecologists (Bundesverband der Frauenärzte e.V.). We thank all participating gynecologists in Bremen and Lower Saxony and Beate Schütte for her contribution to the data collection. Funding for this research was provided by the German Federal Ministry of Education and Research (BMBF) as a part of the BIUS project under contract No. 01GY1131.
Disclosures: None.
Disclaimer: None.
References
- Akl, E. A. , Maroun N., Klocke R. A., Montori V., and Schunemann H. J.. 2005. “Electronic Mail Was Not Better Than Postal Mail for Surveying Residents and Faculty.” Journal of Clinical Epidemiology 58 (4): 425–9. [DOI] [PubMed] [Google Scholar]
- Association of Statutory Health Insurance Physicians Bremen . 2015. [accessed on January 30, 2016]. Available at http://www.kvhb.de/
- Association of Statutory Health Insurance Physicians Lower Saxony . 2015. [accessed on January 30, 2016]. Available at http://www.kvn.de/Startseite/
- Bais, J. M. , Eskes M., Pel M., Bonsel G. J., and Bleker O. P.. 2004. “Effectiveness of Detection of Intrauterine Growth Retardation by Abdominal Palpation as Screening Test in a Low Risk Population: An Observational Study.” European Journal of Obstetrics, Gynecology, and Reproductive Biology 116 (2): 164–9. [DOI] [PubMed] [Google Scholar]
- Beebe, T. J. , Locke G. R., Barnes S. A., Davern M. E., and Anderson K. J.. 2007. “Mixing Web and Mail Methods in a Survey of Physicians.” Health Services Research 42 (3): 1219–34. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Braithwaite, D. , Emery J., De Lusignan S., and Sutton S.. 2003. “Using the Internet to Conduct Surveys of Health Professionals: A Valid Alternative?” Family Practice 20 (5): 545–51. [DOI] [PubMed] [Google Scholar]
- Cook, J. V. , Dickinson H. O., and Eccles M. P.. 2009. “Response Rates in Postal Surveys of Healthcare Professionals between 1996 and 2005: An Observational Study.” BMC Health Services Research 14 (9): 160. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cull, W. L., O'Connor K. G., Sharp S., and Tang S. F. 2005. “Response Rates and Response Bias for 50 Surveys of Pediatricians.” Health Services Research 40 (1): 213–26.
- Cummings, S. M., Savitz L. A., and Konrad T. R. 2001. “Reported Response Rates to Mailed Physician Questionnaires.” Health Services Research 35 (6): 1347.
- Dillman, D. A. 2000. Mail and Internet Surveys: The Tailored Design Method. New York: Wiley.
- Edwards, P. J., Roberts I., Clarke M. J., DiGuiseppi C., Wentz R., Kwan I., Cooper R., Felix L. M., and Pratap S. 2009. “Methods to Increase Response to Postal and Electronic Questionnaires.” Cochrane Database of Systematic Reviews (3): MR000008.
- Ernst, S. A., Reeske A., Spallek J., Petersen K., Brand T., and Zeeb H. 2014. “Care‐Related Factors Associated with Antepartal Diagnosis of Intrauterine Growth Restriction: A Case‐Control Study.” BMC Pregnancy Childbirth 14: 371.
- Federal Association of Gynecologists. 2016. Available at http://www.bvf.de/ [accessed on January 30, 2016].
- Federal Association of Sickness Funds (GKV Spitzenverband). 2016. “Kennzahlen der Gesetzlichen Krankenversicherung” [Key Figures of the Statutory Health Insurance]. Available at https://www.gkv-spitzenverband.de/english/statutory_health_insurance/statutory_health_insurance.jsp [accessed on January 30, 2016].
- Field, T. S., Cadoret C. A., Brown M. L., Ford M., Greene S. M., Hill D., Hornbrook M. C., Meenan R. T., White M. J., and Zapka J. M. 2002. “Surveying Physicians: Do Components of the ‘Total Design Approach’ to Optimizing Survey Response Rates Apply to Physicians?” Medical Care 40 (7): 596–605.
- Flanigan, T., McFarlane E., and Cook S. 2008. “Conducting Survey Research among Physicians and Other Medical Professionals—A Review of Current Literature.” Proceedings of the Survey Research Methods Section, American Statistical Association, pp. 4136–47.
- Grava‐Gubins, I., and Scott S. 2008. “Effects of Various Methodologic Strategies on Survey Response Rates among Canadian Physicians and Physicians‐in‐Training.” Canadian Family Physician 54 (10): 1424–30.
- Gutsfeld, C., Olaru I. D., Vollrath O., and Lange C. 2014. “Attitudes about Tuberculosis Prevention in the Elimination Phase: A Survey among Physicians in Germany.” PLoS ONE 9 (11): e112681.
- Jahn, A., Razum O., and Berle P. 1998. “Routine Screening for Intrauterine Growth Retardation in Germany: Low Sensitivity and Questionable Benefit for Diagnosed Cases.” Acta Obstetricia et Gynecologica Scandinavica 77 (6): 643.
- Jahn, A., Razum O., and Berle P. 1999. “Routine‐Ultraschall in der Deutschen Schwangerenvorsorge: Ist die Effektivität Gesichert?” [Routine Ultrasound in German Antenatal Care: Is Its Effectiveness Established?] Geburtshilfe und Frauenheilkunde 59 (3): 97–102.
- Kellerman, S. E., and Herold J. 2001. “Physician Response to Surveys: A Review of the Literature.” American Journal of Preventive Medicine 20 (1): 61–7.
- Klabunde, C. N., Willis G. B., McLeod C. C., Dillman D. A., Johnson T. P., Greene S. M., and Brown M. L. 2012. “Improving the Quality of Surveys of Physicians and Medical Groups: A Research Agenda.” Evaluation and the Health Professions 35 (4): 477–506.
- Kwak, N., and Radler B. 2002. “A Comparison between Mail and Web Surveys: Response Pattern, Respondent Profile, and Data Quality.” Journal of Official Statistics 18 (2): 257–74.
- Leece, P., Bhandari M., Sprague S., Swiontkowski M. F., Schemitsch E. H., and Tornetta P. 2006. “Does Flattery Work? A Comparison of 2 Different Cover Letters for an International Survey of Orthopedic Surgeons.” Canadian Journal of Surgery 49 (2): 90.
- Martins, Y., Lederman R. I., Lowenstein C. L., Joffe S., Neville B. A., Hastings B. T., and Abel G. A. 2012. “Increasing Response Rates from Physicians in Oncology Research: A Structured Literature Review and Data from a Recent Physician Survey.” British Journal of Cancer 106 (6): 1021–6.
- Matteson, K. A., Anderson B. L., Pinto S. B., Lopes V., Schulkin J., and Clark M. A. 2011. “Surveying Ourselves: Examining the Use of a Web‐Based Approach for a Physician Survey.” Evaluation and the Health Professions 34 (4): 448–63.
- McLeod, C. C., Klabunde C. N., Willis G. B., and Stark D. 2013. “Health Care Provider Surveys in the United States, 2000–2010: A Review.” Evaluation and the Health Professions 36 (1): 106–26.
- McMahon, S. R., Iwamoto M., Massoudi M. S., Yusuf H. R., Stevenson J. M., David F., Chu S. Y., and Pickering L. K. 2003. “Comparison of e‐Mail, Fax, and Postal Surveys of Pediatricians.” Pediatrics 111 (4): e299–303.
- Merzenich, H., Krille L., Hammer G., Kaiser M., Yamashita S., and Zeeb H. 2012. “Paediatric CT Scan Usage and Referrals of Children to Computed Tomography in Germany – A Cross‐Sectional Survey of Medical Practice and Awareness of Radiation Related Health Risks among Physicians.” BMC Health Services Research 12: 47.
- Seguin, R., Godwin M., MacDonald S., and McCall M. 2004. “E‐Mail or Snail Mail? Randomized Controlled Trial on Which Works Better for Surveys.” Canadian Family Physician 50 (3): 414–9.
- Smith, A. W., Borowski L. A., Liu B., Galuska D. A., Signore C., Klabunde C., Huang T. T.‐K., Krebs‐Smith S. M., Frank E., and Pronk N. 2011. “US Primary Care Physicians' Diet‐, Physical Activity‐, and Weight‐Related Care of Adult Patients.” American Journal of Preventive Medicine 41 (1): 33–42.
- VandenKerkhof, E. G., Parlow J. L., Goldstein D. H., and Milne B. 2004. “In Canada, Anesthesiologists Are Less Likely to Respond to an Electronic, Compared to a Paper Questionnaire.” Canadian Journal of Anaesthesia 51 (5): 449–54.
- VanGeest, J. B., Johnson T. P., and Welch V. L. 2007. “Methodologies for Improving Response Rates in Surveys of Physicians: A Systematic Review.” Evaluation and the Health Professions 30 (4): 303–21.
- Velasco, E., Noll I., Espelage W., Ziegelmann A., Krause G., and Eckmanns T. 2012. “A Survey of Outpatient Antibiotic Prescribing for Cystitis.” Deutsches Ärzteblatt International 109 (50): 878–84.
- Willis, G. B., Smith T., and Lee H. J. 2013. “Do Additional Recontacts to Increase Response Rate Improve Physician Survey Data Quality?” Medical Care 51 (10): 945–8.
- Zuidgeest, M., Hendriks M., Koopman L., Spreeuwenberg P., and Rademakers J. 2011. “A Comparison of a Postal Survey and Mixed‐Mode Survey Using a Questionnaire on Patients' Experiences with Breast Care.” Journal of Medical Internet Research 13 (3): e68.
Supplementary Materials
Appendix SA1: Author Matrix.