Abstract
Little is known about what strategies are cost-effective in increasing participation among physicians in surveys that are conducted exclusively via the web. To assess the effects of incentives and prenotification on response rates and costs, general internists (N = 3,550) were randomly selected from the American Medical Association (AMA) Masterfile and assigned to experimental groups that varied in the amount of a promised incentive (none, entry into a $200 lottery, $50, or $100) and prenotification (none, prenotification letter only, or prenotification letter containing a $2 preincentive). Results indicated that response rates were highest in the groups promised $100, followed by those promised $50. While the postal prenotification letter increased response rates, the inclusion of a small token $2 preincentive had no effect on participation. Further, unlike in mail surveys of physicians, the $2 preincentive was not cost-effective. Among physicians, larger promised incentives of $50 or $100 are more effective than a nominal preincentive in increasing participation in a web-only survey. Consistent with prior research, there was little evidence of nonresponse bias among the experimental groups.
Keywords: physicians, response rates, web surveys, incentives, costs
Introduction
While maximizing response rates in a cost-effective manner is an important goal in studying physicians, past research indicates that physicians are a particularly difficult group to contact and recruit (Flanigan, McFarlane, & Cook, 2008; Kellerman & Herold, 2001). Consequently, response rates in physician surveys are lower than those in surveys of nonphysicians (Asch, Jedrziewski, & Christakis, 1997). Web surveys offer potential advantages over mail and phone surveys including flexibility in designing questionnaires, reduced costs, shorter field periods, quicker data processing, and potential gains in data quality (Couper, 2008; McMahon et al., 2003; Schleyer & Forrest, 2000). In addition, there are designs for which a web survey may be the most viable option, such as when questions ask for sensitive information that might be reported less honestly to an interviewer, or when questionnaires include features, such as extensive skip patterns or long lists of response options, that would be difficult to administer in a paper survey.
Potential advantages of web surveys, however, may be offset by response rates among physicians that are often lower than for mail, phone, or mixed-mode (e.g., mail/web) designs (Beebe, Locke, Barnes, Davern, & Anderson, 2007; VanGeest, Johnson, & Welch, 2007). While there is substantial variability in the range of response rates reported in Internet surveys of physicians (e.g., Braithwaite, Emery, de Lusignan, & Sutton, 2003), rates of under 20% are not uncommon (Golnik, Ireland, & Borowsky, 2009; Rodriguez et al., 2006; Yusuf & Baron, 2006). Higher response rates (e.g., in excess of 50%) tend to occur in studies designed to include additional techniques for securing participation, such as attaching a copy of the questionnaire to the e-mail invitation (e.g., McLean & Feldman, 2001), or in studies in which the sample includes physicians with prior and demonstrated experience using the Internet (e.g., Potts & Wyatt, 2002). More research is needed to isolate techniques that improve response rates in web surveys with physicians.
Summarizing results from mail surveys of physicians that test the impact of incentives, researchers find that small (e.g., $2 to $5), prepaid monetary incentives are particularly effective in increasing participation (Flanigan et al., 2008; VanGeest et al., 2007). Promised monetary incentives, nonmonetary incentives, and lotteries have proved less effective and more costly. Compared to mail surveys, however, it is more difficult to deliver prepaid incentives using the Internet, and the impact of a prepaid versus promised incentive may be different. Further, more current research indicates that incentives of increasingly larger amounts (i.e., $50 or $100) may be needed to secure participation, even for mail surveys (Keating, Zaslavsky, Goldstein, West, & Ayanian, 2008; Malin, Rideout, & Ganz, 2000; Peugh, Sirovich, & Applebaum, 2010). In addition to incentives, research has examined the effectiveness of prenotifying respondents in a mode other than the web, such as using a postal letter. While prenotification has increased response rates in web surveys of nonphysicians (Couper, 2008), the effectiveness of precontacting physicians by mail for a web survey is unknown.
There is essentially no research that examines the effectiveness of incentives in physician surveys conducted exclusively by web. We tested different combinations of methods for increasing participation by randomly assigning physicians to groups created by varying the amount of a promised incentive (none, cash lottery, $50, and $100) and the type of prenotification (none, prenotification letter only, and a prenotification letter containing a $2 preincentive). We examined the impact of incentives and prenotification on response rates, costs, and nonresponse bias in a national web survey of physicians. We focused on determining which incentive combinations would be most cost-effective in increasing response rates.
Methods
Participants
A national, random sample of 3,550 general internists was selected by a private vendor from the AMA Masterfile. Sample criteria specified that the physician be currently practicing, board-certified with internal medicine as their primary specialty, and reside in the United States. Physicians were eligible if they had both an e-mail and postal address on file. Roughly half (47.3%) of those listed in the Masterfile provided an e-mail address. E-mail addresses were less likely to be provided by residents, board-certified physicians, and doctors in certain specialties.
Survey Instrument
The survey was administered by the University of Wisconsin Survey Center between March and May 2009 on behalf of the "Physicians Understanding of Human Genetic Variation Study" (PUHGV) at the Social and Behavioral Research Branch, National Human Genome Research Institute of the National Institutes of Health (NIH). The survey sought to measure physicians' knowledge of human genetic variation and their use of patients' characteristics, including race and ethnicity, in clinical and genetic diagnostic, treatment, and referral decisions. The 80-item survey was conducted exclusively via the web in order to control the order in which questions were asked and to reduce social desirability effects for sensitive questions.
Survey Administration
Study procedures included up to seven points of contact with sample members. First, selected respondents were mailed a postal prenotification letter that used study-specific stationery, bore sponsors' names, described the study's purpose, and noted respondents would be sent an e-mail invitation to complete the web survey. We enclosed a $2 bill for respondents assigned to the prepaid group. Approximately 1 week later, all respondents were sent an e-mail invitation to participate that included a hot-linked (clickable) URL. For the third to sixth contacts, e-mail reminders containing the hot-linked URL were sent. For the seventh contact, nonresponding physicians were sent a postal letter that included a manual URL, which the physician could type into a browser.
Experimental factors included the amount of the promised incentive (none, entry into a $200 lottery, $50 check, or $100 check) and prenotification (none, a prenotification letter only, or a prenotification letter containing a $2 cash preincentive). From the 12 groups formed by crossing these factors, we selected the combinations the literature indicated would be most effective, focusing on the effects of prepaid and promised incentives. Consequently, we omitted the no-prenotification-letter treatment from the promised incentive groups. Because of the expense of the promised incentives, we also omitted the prenotification-letter-only treatment from the $100 group and sampled fewer cases for the $50 and $100 groups. The eight experimental groups that were fielded thus constituted an unbalanced design (Table 1). Respondents assigned to a promised incentive group were reminded of the incentive at every point of contact. The study was approved by the institutional review boards at the University of Wisconsin-Madison and NIH.
Table 1. Overview of Experimental Design and Number of Cases Sampled in Each Group.
| Prenotification | Promised incentive: None | $200 lottery | $50 check | $100 check |
|---|---|---|---|---|
| No prenotification letter | 500 | --- | --- | --- |
| Prenotification letter only | 500 | 500 | 350 | --- |
| Prenotification letter and $2 prepaid incentive | 500 | 500 | 350 | 350 |
Statistical Analysis
Response rates were calculated and analyzed for each contact (RR1; AAPOR, 2011). We included completed surveys in the numerator and all cases fielded in the denominator. Comparisons between experimental groups were analyzed using logistic regression. To test for significant differences between groups, we fitted a baseline model in which we regressed an indicator for whether the respondent completed the survey on indicators for the experimental groups, omitting the $100 group, which served as the reference group. The remaining 21 pairwise contrasts formed among the 8 experimental groups were evaluated using the postestimation command lincom (Stata, Version 11; Long & Freese, 2006).
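The pairwise contrasts that the authors computed with Stata's lincom can be approximated directly from the completion counts as Wald tests on the difference in log-odds between two groups. The sketch below is illustrative only (it is not the authors' Stata code), and the example counts are taken from Table 2:

```python
import math

def logodds_contrast(x1, n1, x2, n2):
    """Wald z statistic for the difference in log-odds (log odds ratio)
    between two groups -- equivalent in spirit to a pairwise contrast of
    group indicators after a logistic regression on survey completion."""
    diff = math.log(x1 / (n1 - x1)) - math.log(x2 / (n2 - x2))
    se = math.sqrt(1 / x1 + 1 / (n1 - x1) + 1 / x2 + 1 / (n2 - x2))
    return diff / se

# Table 2 counts: $100 group (89/350) vs. no letter/no incentive (15/500)
z_big = logodds_contrast(89, 350, 15, 500)    # |z| > 3.29, i.e., p < .001

# Lottery letter only (33/500) vs. lottery letter + $2 (43/500)
z_ns = logodds_contrast(33, 500, 43, 500)     # |z| < 1.65, i.e., p > .10
```

Both example contrasts reproduce the pattern reported in the Results: a highly significant difference between the $100 group and the no-letter/no-incentive group, and no significant difference between the two lottery groups.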
We also examined total costs and cost per completed survey across experimental groups. Our analysis included only variable costs, including mailing costs for the prenotification letter and follow-up letter (e.g., printing, stuffing, postage), and costs associated with the incentives (e.g., their monetary value and administration). We omitted fixed costs that were consistent across the experimental groups.
To test for nonresponse bias, we compared the proportion of respondents with various characteristics (e.g., gender, birth year, geographic location, and practice type) to the distribution of the characteristics in the administrative data, which included both respondents and nonrespondents, comparing across each of the experimental groups. Differences were tested using one sample z tests for proportions.
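The one-sample z test described above compares each respondent group's proportion against the known proportion in the administrative data. A minimal sketch (again not the authors' code), using figures from Table 3 as the example:

```python
import math

def one_sample_z(x, n, p0):
    """z statistic for testing a sample proportion x/n against a known
    population proportion p0 (here, from the AMA administrative data)."""
    return (x / n - p0) / math.sqrt(p0 * (1 - p0) / n)

# Table 3 example: 33 of 89 respondents in the $100 group were in
# nonoffice-based practices, versus 19.3% in the administrative data.
z_practice = one_sample_z(33, 89, 0.193)   # |z| > 1.96: significant at p < .05

# Gender in the same group: 59 of 89 male, versus 67.9% in the frame.
z_gender = one_sample_z(59, 89, 0.679)     # |z| < 1.96: not significant
```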
Results
A total of 343 surveys (9.6%) were completed (Table 2).^1 We found a small but significant improvement in response among respondents who received the prenotification letter compared with those who did not: 6.2% versus 3.0%. The highest response rate, 25.4%, occurred in the group promised $100, which responded at a significantly higher rate than any of the other groups. Based on the 21 pairwise comparisons formed among the remaining 7 experimental groups, we found that both groups promised $50 had significantly higher response rates than either of the groups not promised incentives or either of the lottery groups. The two lottery groups and the two groups with prenotification but no promised incentives responded at similar levels. The $2 preincentive did not significantly improve response rates for any of the relevant pairwise comparisons, including the contrasts between the letter only and the letter plus $2 groups with no promised incentive, within the lottery groups, and within the $50 groups.
Table 2. Response Rates and Costs by Experimental Groups.
| | No incentive: No letter | No incentive: Letter only | No incentive: Letter + $2 | $200 lottery: Letter only | $200 lottery: Letter + $2 | $50 check: Letter only | $50 check: Letter + $2 | $100 check: Letter + $2 |
|---|---|---|---|---|---|---|---|---|
| Number of surveys fielded (n) | 500 | 500 | 500 | 500 | 500 | 350 | 350 | 350 |
| Number of surveys completed (n) | 15 | 31 | 31 | 33 | 43 | 47 | 54 | 89 |
| Overall response rate (%) | 3.0^a | 6.2^b | 6.2^b | 6.6^b | 8.6^b | 13.4^c | 15.4^c | 25.4^d |
| Incremental response rate by contact attempt (%): | | | | | | | | |
| E-mail invitation | 0.4 | 2.6 | 2.4 | 1.2 | 2.8 | 3.4 | 3.1 | 6.9 |
| First e-mail reminder | 0.4 | 1.2 | 1.6 | 1.6 | 1.8 | 2.3 | 3.4 | 5.7 |
| Second e-mail reminder | 1.0 | 0.8 | 1.2 | 1.2 | 1.0 | 1.4 | 2.0 | 1.7 |
| Third e-mail reminder | 0.0 | 0.0 | 0.0 | 0.2 | 0.2 | 0.0 | 0.0 | 2.0 |
| Fourth e-mail reminder | 0.2 | 0.6 | 0.0 | 1.0 | 0.6 | 0.3 | 1.4 | 1.1 |
| Postal letter with URL | 1.0 | 1.0 | 1.0 | 1.4 | 2.2 | 6.0 | 5.4 | 8.0 |
| Total variable costs | $559 | $1,488 | $2,542 | $1,811 | $2,859 | $3,173 | $4,526 | $10,338 |
| Cost per completed survey | $37 | $48 | $82 | $55 | $66 | $68 | $84 | $116 |
Note. Response rates with different superscripts are significantly different based on the results from logistic regression tests.
The response rate for the group that did not receive a letter and was not promised an incentive was significantly lower than those of the groups that received letters but no promised incentive (p < .05), the lottery group that received the letter only (p < .01), the lottery group that received the letter plus $2 (p < .001), either of the groups promised $50 (p < .001), and the group promised $100 (p < .001).
The response rates for the groups that received letters but no promised incentive were significantly lower than those of any of the groups promised $50 or $100 (p < .001), but not significantly different from either of the groups promised inclusion in the lottery (p > .10). The response rate for the lottery group that received the letter only was not significantly different from that of the lottery group that received the letter plus $2 (p > .10), but it was significantly lower than those of the $50 group that received the letter only (p < .01), the $50 group that received the letter plus $2 (p < .001), and the $100 group (p < .001). The response rate for the lottery group that received the letter plus $2 was significantly lower than those of the $50 group that received the letter only (p < .05), the $50 group that received the letter plus $2 (p < .01), and the $100 group (p < .001).
The response rates for the $50 groups that received a letter only versus a letter plus $2 did not significantly differ from each other (p > .10) but were lower than that of the $100 group at the levels of p < .001 and p < .01, respectively.
The response rate for the group promised $100 was significantly different, at the p < .001 level, from those of the groups promised no incentive, entry into a lottery, or $50 with a letter only, and significantly different, at the p < .01 level, from that of the group promised $50 and prepaid $2.
The postal letter with the URL increased the overall response rate by 2.8 percentage points (from 6.8% to 9.6%). The letter was particularly effective in the groups promised incentives of $50 and $100, increasing overall response rates for these groups by 5 to 8 percentage points. The final letter accounted for 45% of the completed surveys in the $50 letter only group, 35% of the completed surveys in the $50 group with the letter plus $2, and 31% of the completed surveys in the $100 group. The letter with the URL was much less effective for the groups that received prenotification letters but no promised incentives and for the two lottery groups; within these groups, the letter increased response rates by only 1 to 2 percentage points.
Table 2 also presents the results of our cost analysis. Variable costs ranged from a low of $559 for the group that did not receive a prenotification letter and was not promised an incentive to a high of $10,338 for the $100 group. The costs per completed survey ranged from a low of $37 per complete to a high of $116, but did not always follow a predictable pattern. While the two groups that received a prenotification letter but no promised incentive had similar response rates of 6.2%, the cost per completed survey increased from $48 to $82 with the addition of the $2 preincentive. The cost per complete of $82 for the group that received a prenotification letter plus $2 but no promised incentive was roughly equal to the cost per complete of $84 for the group that received a prenotification letter plus $2 and the promise of a $50 incentive. The $50 group that received prenotification without $2 was even less expensive at $68 per complete. While the $10,338 spent overall on variable costs for the group prepaid $2 and promised $100 was substantially higher than the $4,526 spent for the group prepaid $2 and promised $50, the costs per complete for these two groups were $116 and $84, respectively, and differed by less than the $50 difference in incentives.
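The cost-per-complete figures discussed above follow directly from dividing each group's total variable costs by its number of completed surveys. A quick check, with the costs and completes transcribed from Table 2:

```python
# Total variable costs and completed surveys, transcribed from Table 2
groups = {
    "No letter, no incentive":   (559, 15),
    "Letter only, no incentive": (1_488, 31),
    "Letter + $2, no incentive": (2_542, 31),
    "Letter only, lottery":      (1_811, 33),
    "Letter + $2, lottery":      (2_859, 43),
    "Letter only, $50":          (3_173, 47),
    "Letter + $2, $50":          (4_526, 54),
    "Letter + $2, $100":         (10_338, 89),
}

# Cost per completed survey, rounded to the nearest dollar
cost_per_complete = {g: round(cost / n) for g, (cost, n) in groups.items()}
# Reproduces the reported $37, $48, $82, $55, $66, $68, $84, and $116
```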
Table 3 presents the results for tests of nonresponse bias. Because the $2 preincentive had no significant effect on response rates, we collapsed across the experimental groups that included a prenotification letter with and without the preincentive to increase statistical power. We found no differences by gender. In contrast, in the groups without any promised incentives, physicians under 60 years (born after 1949) and physicians outside the Northeast were underrepresented among respondents, although these results were significant only for the group that received prenotification but no promised incentive. In the group promised $100, we found that physicians in nonoffice-based practices were more likely to respond.
Table 3. Analysis of Nonresponse Bias: Distributions of Key Variables From the Administrative Data and by Experimental Group.
| | Administrative Data^b (n = 3,550) | No Promised Incentive / No Prenotification (n = 15) | No Promised Incentive (Letter Only and Letter + $2) (n = 62) | $200 Lottery (Letter Only and Letter + $2) (n = 76) | $50 Check (Letter Only and Letter + $2) (n = 101) | $100 Check (Letter + $2) (n = 89) |
|---|---|---|---|---|---|---|
| Gender (%)^a | | | | | | |
| Male | 2,410 (67.9) | 11 (73.3) | 44 (71.0) | 49 (64.5) | 62 (61.4) | 59 (66.3) |
| Female | 1,140 (32.1) | 4 (26.7) | 18 (29.0) | 27 (35.5) | 39 (38.6) | 30 (33.7) |
| p level | | n.s. | n.s. | n.s. | n.s. | n.s. |
| Year of birth (%) | | | | | | |
| After 1949 | 3,024 (85.2) | 11 (73.3) | 45 (72.6) | 66 (86.8) | 81 (80.2) | 76 (85.4) |
| On or before 1949 | 526 (14.8) | 4 (26.7) | 17 (27.4) | 10 (13.2) | 20 (19.8) | 13 (14.6) |
| p level | | n.s. | * | n.s. | n.s. | n.s. |
| Geographic location (%) | | | | | | |
| Midwest/South/West | 2,602 (73.3) | 9 (60.0) | 36 (58.1) | 55 (72.4) | 75 (74.3) | 63 (70.8) |
| Northeast | 948 (26.7) | 6 (40.0) | 26 (41.9) | 21 (27.6) | 26 (25.7) | 26 (29.2) |
| p level | | n.s. | * | n.s. | n.s. | n.s. |
| Type of practice (%) | | | | | | |
| Other | 686 (19.3) | 4 (26.7) | 14 (22.6) | 16 (21.1) | 25 (24.7) | 33 (37.1) |
| Office based | 2,864 (80.7) | 11 (73.3) | 48 (77.4) | 60 (78.9) | 76 (75.3) | 56 (62.9) |
| p level | | n.s. | n.s. | n.s. | n.s. | * |
Note. Comparisons are made between each experimental group and the administrative data using one-sample z tests for proportions. Differences significant at the p < .05 level are indicated with *; "n.s." indicates the difference was not significant.
^a Experimental groups were formed by collapsing across the "Letter Only" and "Letter + $2" treatments for the groups that were not promised an incentive but received a prenotification letter, the lottery groups, and the $50 groups.
^b The sample for the administrative data includes all respondents and nonrespondents.
Discussion
Small prepaid incentives, which tend to increase response rates in mail surveys of physicians, may not be effective for web surveys. This may be because the incentive is usually delivered with the mail survey, thus allowing physicians to simultaneously consider the survey, incentive, and task. With web surveys, it is more difficult to couple the incentive with the request to complete the task, and a larger inducement may be necessary to entice participation. In contrast, the response rate among physicians offered a check for $100 (25.4%) was significantly higher than among physicians offered $50 (13.4–15.4%), inclusion in a $200 lottery (6.6–8.6%), or no promised incentive (3.0–6.2%).
These results offer direction for the use of prenotification in web surveys of physicians. Consistent with findings from web surveys of nonphysicians, a postal prenotification was effective in increasing response rates. We speculate that the prenotification letter highlighted the survey, added legitimacy to it, and reduced the tendency to regard the e-mail invitation as spam. Prenotification (either with or without the $2 preincentive) was not enough, however, to yield satisfactory response rates. The final postal letter that included the manual URL increased the overall response rate by almost one third and was much more effective for the $50 and $100 groups. These findings suggest that for web surveys, if prenotification includes an incentive, it should also include a manual URL to allow users to complete the survey upon receipt. This would entail sending a “prenotification letter” that simultaneously invites participation and notifies respondents of the forthcoming e-mail that will allow them to click a link to complete the survey.
Cost analyses must be considered in the context of response rates and data quality. We had hoped to secure an adequate response rate using only an inexpensive token preincentive, but we did not. In contrast to results from mail surveys of physicians (Asch, Christakis, & Ubel, 1998), we found that a prepaid $2 incentive was not cost-effective. By comparison, adding a promised incentive of $50 for the group that also received a prenotification letter and $2 was surprisingly cost-effective: while costs increased only slightly, from $82 to $84 per completed survey, the response rate increased by 9 percentage points (15.4% vs. 6.2%). Even more striking, when we compare the group prepaid $2 but not promised an incentive to the group that was promised $50 but not prepaid $2, we find that eliminating the $2 preincentive while adding the $50 promised incentive actually reduced the cost per complete from $82 to $68 while still increasing the response rate by 7 percentage points.
We had also hoped to find a lower threshold for our promised incentives, such that $50 would perform as well as $100, but we did not. Increasing the promised incentive from $50 to $100 increased the response rates by 10 percentage points. While the $100 promised incentive was not as cost-effective as $50, it did not increase costs by $50 per completed survey. Because of the increased effectiveness of the $100 incentive in securing participation, the increase in cost per completed survey was only $32 when comparing the $100 group to the group that received a $2 preincentive and a promise of $50. This was partially due to less follow-up being required for the $100 group, particularly for the final postal letter with the manual URL. Given the ineffectiveness of the $2 preincentive, we might have found even greater cost benefit had we sent the prenotification without the $2.
A frequently cited advantage of web surveys is that they are often cheaper than mail or phone surveys (Couper, 2008). However, this may not hold for physicians. If substantial incentives are needed to ensure participation, it may be more cost-effective to conduct surveys using a different mode or a mixed-mode design (Beebe et al., 2007). Concerns about cost-effectiveness must also be examined in the context of the extremely low response rates obtained for the experimental groups not promised an incentive, and the low response rates overall.^2 The cost savings that web-only administration offers may be quickly eroded by the need for large incentives or a mixed-mode design.
While past research indicates low nonresponse bias in surveys of physicians (Flanigan et al., 2008; Kellerman & Herold, 2001), little research has explored nonresponse bias specifically in web-based physician surveys (Beebe et al., 2007). Although our ability to draw conclusions is hampered by small sample sizes for some of the comparisons, we find that consistent with previous studies, there is little evidence of nonresponse bias across the incentive groups. For some variables, we see bias reduced with the use of larger monetary postincentives. An exception is that the $100 incentive drew more physicians in positions other than office-based practices (e.g., physicians with positions in administration, research, and teaching), possibly because they interact more with computers on the job.
In their review of methods for improving response rates in physician surveys, VanGeest et al. (2007) describe several factors that are associated with higher response rates, including incentives, questionnaire design, sponsorship, and mode of administration. In this study, we manipulated two factors, prenotification and incentives, that have been shown to affect participation in surveys. While these are the only factors that account for differences between the experimental groups, we believe several other features of our study were associated with the lower overall response rates we observed. First, as discussed earlier, response rates among physicians are often lower for surveys administered via the web than for other modes. Second, the topic of the study (knowledge about genetic variation and its influence on clinical practice) may not have been salient to some physicians and may have been seen as sensitive by others. Third, the questionnaire, which potential sample members were told would take 30 min to complete, was long for a web survey. Finally, because we used the AMA Masterfile and targeted a nationally representative sample, we were not able to appeal to a local sponsor or reference a specific professional organization in our e-mail invitations or postal letters.
Future research should continue to explore methods to increase participation among physicians in web surveys. Based on our findings, we suggest that more studies are needed to evaluate the impact on response rates and costs of larger prepaid versus larger promised incentives, and the effectiveness of inserting a manual URL in a postal prenotification letter that includes varying amounts of prepaid incentives.
Acknowledgments
The authors wish to thank Nora Cate Schaeffer and John Kennedy for their comments.
Funding: This research was supported in part by the Intramural Research Program of the National Human Genome Research Institute, National Institutes of Health.
Footnotes
^1 Among the remaining 3,207 sample members who did not complete a survey, 33 explicitly refused to participate in the study; 97 logged on to the web survey, 86 of whom completed at least one question but not the entire survey; 81 had e-mails bounce back because the addresses were invalid; and 2,996 were noncontacts who did not respond to any of the contacts. Rates of refusals, partials, and bounce-backs did not vary disproportionately among the experimental groups.
^2 Another limitation of studies like ours, in which researchers wish to generalize to all physicians residing in the U.S., is that coverage (the proportion of physicians who can be contacted using e-mail addresses) is incomplete. While the AMA Masterfile is the most complete frame for physicians in the U.S., only about half of the physicians listed at the time of our study had both e-mail and postal addresses. Thus, our results for the effects of prenotification and incentives are generalizable only to physicians who provided this information.
Authors' Note: This article represents the views of the authors and not necessarily those of the National Human Genome Research Institute, the National Institutes of Health, or the U.S. Department of Health and Human Services. An earlier version of this article was presented at the 2009 annual meeting of the American Association for Public Opinion Research.
Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interests with respect to the authorship and/or publication of this article.
Financial Disclosure: The author(s) disclosed receipt of the following financial support for the research and/or authorship of this article.
References
- The American Association for Public Opinion Research. Standard definitions: Final dispositions of case codes and outcome rates for surveys (7th ed.). AAPOR; 2011. Retrieved April 11, 2011, from http://aapor.org/Content/NavigationMenu/AboutAAPOR/StandardsampEthics/StandardDefinitions/StandardDefinitions2011.pdf
- Asch DA, Christakis NA, Ubel PA. Conducting physician mail surveys on a limited budget: A randomized trial comparing $2 bill versus $5 bill incentives. Medical Care. 1998;36:95–99. doi: 10.1097/00005650-199801000-00011
- Asch DA, Jedrziewski MK, Christakis NA. Response rates to mail surveys published in medical journals. Journal of Clinical Epidemiology. 1997;50:1129–1136. doi: 10.1016/s0895-4356(97)00126-1
- Beebe TJ, Locke GR, Barnes SA, Davern ME, Anderson KJ. Mixing web and mail methods in a survey of physicians. Health Services Research. 2007;42:1219–1234. doi: 10.1111/j.1475-6773.2006.00652.x
- Braithwaite D, Emery J, de Lusignan S, Sutton S. Using the Internet to conduct surveys of health professionals: A valid alternative? Family Practice. 2003;20:545–551. doi: 10.1093/fampra/cmg509
- Couper MP. Designing effective web surveys. Cambridge: Cambridge University Press; 2008.
- Flanigan TS, McFarlane E, Cook S. Conducting survey research among physicians and other medical professionals: A review of current literature. ASA Proceedings of the Section on Survey Research Methods. 2008:4136–4147.
- Golnik A, Ireland M, Borowsky IW. Medical homes for children with autism: A physician survey. Pediatrics. 2009;123:966–971. doi: 10.1542/peds.2008-1321
- Keating NL, Zaslavsky AM, Goldstein J, West DW, Ayanian JZ. Randomized trial of $20 versus $50 incentives to increase physician survey response rates. Medical Care. 2008;46:878–881. doi: 10.1097/MLR.0b013e318178eb1d
- Kellerman SE, Herold J. Physician response to surveys: A review of the literature. American Journal of Preventive Medicine. 2001;20:61–67. doi: 10.1016/s0749-3797(00)00258-0
- Long JS, Freese J. Regression models for categorical dependent variables using Stata (2nd ed.). College Station, TX: Stata Press; 2006.
- Malin JL, Rideout J, Ganz PA. Tracking managed care: The importance of a cash incentive for medical director response to a survey. American Journal of Managed Care. 2000;6:1209–1214.
- McLean SA, Feldman JA. The impact of changes in HCFA documentation requirements on academic emergency medicine: Results of a physician survey. Academic Emergency Medicine. 2001;8:880–885. doi: 10.1111/j.1553-2712.2001.tb01148.x
- McMahon SR, Iwamoto M, Massoudi MS, Yusuf HR, Stevenson JM, Chu SY, Pickering LK. Comparison of e-mail, fax, and postal surveys of pediatricians. Pediatrics. 2003;111:e299–e303. doi: 10.1542/peds.111.4.e299
- Peugh J, Sirovich B, Applebaum S. Getting a physician's attention: Results of an upfront incentive mail survey experiment. Paper presented at the American Association for Public Opinion Research; Hollywood, FL; May 2010.
- Potts HWW, Wyatt JC. Survey of doctors' experience of patients using the Internet. Journal of Medical Internet Research. 2002;4:e5. doi: 10.2196/jmir.4.1.e5
- Rodriguez HP, von Glahn T, Rogers WH, Chang H, Fanjiang G, Safran DG. Evaluating patients' experiences with individual physicians: A randomized trial of mail, internet, and interactive voice response telephone administration of surveys. Medical Care. 2006;44:167–174. doi: 10.1097/01.mlr.0000196961.00933.8e
- Schleyer TKL, Forrest JL. Methods for the design and administration of web-based surveys. Journal of the American Medical Informatics Association. 2000;7:416–425. doi: 10.1136/jamia.2000.0070416
- VanGeest JB, Johnson TP, Welch VL. Methodologies for improving response rates in surveys of physicians: A systematic review. Evaluation & the Health Professions. 2007;30:303–321. doi: 10.1177/0163278707307899
- Yusuf TE, Baron TH. Endoscopic transmural drainage of pancreatic pseudocysts: Results of a national and an international survey of ASGE members. Gastrointestinal Endoscopy. 2006;63:223–227. doi: 10.1016/j.gie.2005.09.034