Author manuscript; available in PMC: 2015 Oct 1.
Published in final edited form as: J Public Health Dent. 2014 Apr 7;74(4):276–282. doi: 10.1111/jphd.12054

Supplementing online surveys with a mailed option to reduce bias and improve response rate: The National Dental PBRN

Ellen Funkhouser 1, Jeffrey L Fellows 2, Valeria V Gordan 3, D Brad Rindal 4, Patrick J Foy 5, Gregg H Gilbert 1, for the National Dental PBRN Collaborative Group
PMCID: PMC4352546  NIHMSID: NIHMS663818  PMID: 24707895

Abstract

Objective

Dentists in the National Dental Practice-Based Research Network are offered online and mailed options for most questionnaire studies. We sought to quantify differences in (1) characteristics of dentists who completed a questionnaire online compared with those who used a mailed option offered to non-responders, and (2) prevalence estimates for certain practice characteristics.

Methods

Invitation letters to participants provided an identification number and log-in code to complete the online survey. Non-respondents received a reminder letter after the fourth week, and after an additional four-week period, a final reminder was sent, along with a paper questionnaire version, allowing completion online or by paper.

Results

Of 632 US dentists who completed the survey, 84 (13%) used the paper version. Completion by paper was more common among males, older dentists, and those in general practice (p<0.05). The proportions of dentists who had electronic dental records, who consistently used a rubber dam, and who had either worked with or employed expanded-function auxiliaries were lower among dentists who completed the mailed paper version than among those who completed the survey online; these differences remained significant in models adjusted for gender, age, and practice type.

Conclusion

Even in an era of increasingly electronic communication among dentists, omitting a paper option would have led to overestimation of key dental practice characteristics.

Keywords: health surveys, bias, questionnaires, online, dental care

INTRODUCTION

Response rate is the most common single standard for measuring the quality of surveys (1,2). Response rates have been declining, especially in the past 10 years (3,4). McLeod et al. (3) reviewed surveys of health care providers reported between 2000 and 2010. Using a 60% response rate as a benchmark, the percentage of surveys that met this benchmark decreased from 61% (11/18 surveys) in 1998–2000 to 36% (9/25) in 2005–2008. With regard to mode of administration, telephone surveys had the highest response rates, followed by mail, then mixed-mode, then online (also referred to as internet or web-based) surveys.

Low response rate is commonly cited as a weakness of online surveys (5,6). Recent examples specific to dentists also manifest this weakness. For example, Olabi et al. examined office-based sedation by board-certified pediatric dentists practicing in the United States (7). Their survey was sent to all board-certified pediatric dentists who made their email address available on the membership roster provided by the American Academy of Pediatric Dentistry (N=2,586); 659 were returned as undeliverable. Of the 1,917 dentists emailed with a request to participate in the online questionnaire, only 494 (26%) completed the survey. Goodchild and Donaldson reported on a web-based survey about sedation in the dental outpatient setting; email requests were sent to 7,276 dentists, specialists and non-specialists alike, drawn from a list compiled by the authors of American Dental Association (ADA) and Academy of General Dentistry members and non-members (8). Requests were sent in three separate email ‘blasts’ at one-week intervals to the same dentists, of whom only 716 (9.8%) completed the short web survey. Henry et al. examined the use of social media by U.S. dentists using an online survey sent to a purchased list of email addresses for 22,682 dentists (9). An initial email with a link to the survey was sent to all dentists on the sample list. Two weeks later, a reminder email was sent, followed one week later by a final email. A total of 573 (2.5%) dentists completed the survey. None of these surveys compared responders with non-responders, so bias could not be assessed other than by response rate; with response rates of 2.5% to 26%, these surveys were vulnerable to response bias.

Hardigan et al. conducted a randomized trial of survey mode among 6,000 practicing dentists in the 2009 Florida Tobacco Control Survey (10). Dentists were randomly assigned to one of three groups: choice (mail or online), postal mail, or online. A total of 1,232 surveys were returned across the three groups (21% overall response rate). Response rates were highest for postal mail (26%), similar for choice (25%), and lowest for the online group (11%). In the choice group, 94% chose mail. Mode of completion did not differ by gender, age, years in practice, or general dentist/specialist status. In terms of cost per completed survey, online surveys were more cost-effective.

An exception to the low response rates reported by the above surveys of dentists is a study by Schleyer and Forrest (11). They surveyed 438 dental professionals on their use of the Internet in clinical practice, of whom 292 (67%) completed the survey online and 52 (12%) returned the survey by email or fax. Their sampling frame was the largest (email) discussion list for general dentistry at the time, the Internet Dental Forum. The report demonstrated that within segments of the dental community it was possible to conduct a survey online; however, even within this segment of the dental community, alternative options (email and fax) were included. No study sample descriptions were presented from which to evaluate to what population of dentists the findings might generalize, nor were responders compared by response mode.

Although response rate is the single most common indicator of survey quality, it is not the sole indicator. Data quality, cost, timeliness, sampling frame, and non-response bias are also indicators to consider. In general, online surveys have advantages over postal-mail surveys in terms of quality, speed, and, depending on circumstances, cost (5, 6, 12). The primary costs for online surveys are start-up and infrastructure costs; once the initial forms (with data entry) have been programmed, the marginal costs of expanding the sample or sending follow-up reminders are small. Quality often is superior in online surveys compared with postal-mail versions because of the ability to program range checks, skip patterns, and similar edit logic; however, this improvement has not translated into any reported improvement in reliability. Because online surveys often have low response rates, yet provide high-quality data in a cost-effective manner, mixed-mode approaches have been recommended (13). We report here on a mixed-mode approach.
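As an illustration of the kind of edit logic an online form can enforce automatically (and a paper form cannot), the sketch below applies a simple range check and a skip-pattern check as a SAS data step; the dataset and variable names are hypothetical and are not items from the IUS.

/* Hypothetical post-hoc edit checks mirroring what an online form enforces at entry. */
data edit_flags;
  set ius_raw;                                   /* hypothetical raw survey data */
  /* Range check: flag implausible counts of operatories. */
  if not missing(n_operatories) and (n_operatories < 1 or n_operatories > 30)
    then flag_range = 1;
  /* Skip pattern: rubber-dam items apply only to dentists who perform root canals. */
  if performs_root_canal = 0 and not missing(rubber_dam_pct) then flag_skip = 1;
run;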

Objectives of the present study

The current study was conducted by the National Dental Practice-Based Research Network, a consortium of dental practitioners who have affiliated to investigate research questions and to share experiences and expertise (14). The over-arching goal of practice-based research is to improve the oral health of patients, so the typical network study involves clinical data collection during the patient care process. However, survey information obtained by querying practitioners directly can provide important information for designing clinical studies and assessing network capabilities; this was the overall purpose of conducting the network “Infrastructure Update Survey” (IUS), the survey on which the analyses in this report are based. Dentists in the network are offered online and mailed options for most network questionnaires. We quantified differences in (1) characteristics of dentists who completed the questionnaire online compared with those who used the mailed option, and (2) prevalence estimates for three key practice characteristics between online and mailed completers. The three characteristics were (1) use of an electronic dental record, (2) use of a rubber dam during root canal treatment, and (3) experience with expanded-function auxiliaries. We focused on these three characteristics because estimating them was the primary objective of the IUS.

METHODS

The sample frame for this study consisted of all U.S. dentists in the network who had previously participated in one or more network studies and who were in current practice with an active practice address (n=828). At the time of the survey, practices were predominantly located in Alabama, Mississippi, Florida, Georgia, Minnesota, Oregon, and Washington. This project was approved by the human participants institutional review boards at the University of Alabama at Birmingham and all of the network’s regional centers. The reporting conforms to the STROBE Guidelines (http://www.strobe-statement.org).

The design was cross-sectional, consisting of a single administration of the IUS. The results of this questionnaire were combined with those of an Enrollment Questionnaire that all practitioners completed when they enrolled in the network. Both questionnaires are publicly available (please see the reference to this publication at http://nationaldentalpbrn.org/peer-reviewed-publications.php).

Thirty-four practitioners and network staff pre-tested a pilot version of the survey to assess the feasibility and comprehension of each questionnaire item. Items regarding electronic dental record use were taken from a study of clinical computing (15). Pilot testing started with network staff, across the regions and using different browsers, and then proceeded in groups of 6–8 practitioners. The practitioners were queried regarding their experience taking the survey, including access, clarity, and ease of use. Pilot testing was considered complete when all practitioners in the group reported no problems with access, clarity, or ability to complete.

Subsequently, letters were sent by the main network administrative site to eligible practitioners, inviting them to participate and providing a unique identification number and log-in code to complete the online survey. Practitioners were asked to complete the questionnaire within three weeks. Non-respondents received a reminder letter after the fourth week. After an additional four weeks, a final reminder was sent, along with a printed version of the questionnaire, allowing completion either online or on paper. Network staff entered data received on mailed paper versions through an online portal. Individuals who had not responded after a final three-week waiting period were considered non-respondents. Practitioners or their business entities could request a $50 remuneration as a gesture of appreciation for completing the questionnaire; 92% did so. The practitioners who chose the mail option had internet access but did not complete the survey online, whether because of personal preference or perhaps because of network or connectivity problems. Prior to and during the conduct of the IUS, network staff received queries from practitioners and their staff regarding the ability to log on or reconnect, for both the Enrollment Questionnaire and the IUS. These queries were few and were not documented, nor was the specific reason for using the mail option rather than completing the survey online determined.
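Purely as an illustration of the contact schedule just described, the sketch below computes the mailing dates as a SAS data step; the start date, dataset name, and variable names are assumptions for illustration, not the network's actual tracking system.

/* Hypothetical schedule: invitation, week-4 reminder, week-8 final reminder with a */
/* mailed paper copy, and a three-week close-out for non-respondents.               */
data contact_schedule;
  set roster;                               /* one record per eligible practitioner      */
  invite_date  = '01DEC2010'd;              /* illustrative mailing date                 */
  remind1_date = invite_date  + 4*7;        /* reminder letter after the fourth week     */
  remind2_date = remind1_date + 4*7;        /* final reminder plus printed questionnaire */
  close_date   = remind2_date + 3*7;        /* non-respondent cutoff                     */
  format invite_date remind1_date remind2_date close_date date9.;
run;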

The overall intent of the IUS was to quantify (1) use of electronic dental records; (2) use of rubber dam during root canal treatment; and (3) utilization of dental staff, specifically expanded-function auxiliaries, for specific clinical procedures. The questionnaire comprised 25 primary questions, with over 100 branching questions. The survey was completed between December 2010 and June 2011. Survey findings regarding use of electronic dental records, use of rubber dams during root canal procedures, and utilization of dental staff have been published (16–18).

Bivariate cross-tabulations were calculated to examine associations of completion of the survey by mail with (1) respondent and practice characteristics and setting; (2) use of electronic dental records and use of rubber dams during root canal procedures; and (3) experience with, and attitudes toward, working with expanded-function auxiliaries. The chi-squared test was used to assess the significance of differences in bivariate analyses; logistic regression was used for adjusted analyses. Odds ratios (OR) and 95% confidence intervals (CI) were calculated from the models. Statistical significance was assumed for a p-value less than 0.05. All analyses were performed using SAS (SAS/STAT version 9.3, SAS Institute, Inc.).
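For concreteness, the following sketch shows one way the bivariate and adjusted analyses described above could be specified in SAS; the dataset and variable names (ius, mail_mode, male, gradpre1970, genprac) are illustrative assumptions, not the authors' actual program.

/* Hypothetical dataset ius with mail_mode (1 = mailed paper, 0 = online) and */
/* binary 0/1 predictors male, gradpre1970, and genprac.                      */

/* Bivariate cross-tabulations with chi-squared tests */
proc freq data=ius;
  tables (male gradpre1970 genprac)*mail_mode / chisq;
run;

/* Adjusted logistic regression; odds ratio estimates with 95% Wald */
/* confidence limits are produced by default.                       */
proc logistic data=ius;
  model mail_mode(event='1') = male gradpre1970 genprac;
run;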

RESULTS

Overall, 632 (76%) of the 828 U.S. dentists surveyed completed the questionnaire. Response rates did not differ by gender, general practice status, or year of graduation from dental school, but did differ by region of the country (p<0.001). Response rates were lowest in Alabama/Mississippi (270/415=65%), intermediate in the other southeastern states (198/260=76%), and highest in the West and Midwest regions (128/153=84%). Of the 632 U.S. dentists who completed the survey, 84 (13%) used the paper, mailed version; this increased the overall response rate from 66% (548/828 online only) to 76% (632/828).

The majority of participants were male, non-Hispanic white, general practitioners, and located in the Southeast, and about half were in solo practices (Table 1). Completion by paper was more common among males and older dentists (graduated before 1970). Response mode did not differ significantly by race/ethnicity, general practitioner status, practice setting (solo, large group practice, or public health), or region of the country (Table 1). In adjusted analysis (Table 2), significant predictors of completing the survey on paper were male gender (OR=2.1; 95% CI: 1.0–4.6, p=0.05), general practice (OR=4.6; 95% CI: 1.1–20.2, p=0.04), and graduation before 1970 (OR=5.6; 95% CI: 2.5–12.4, p<0.001).

Table 1.

Distribution of 632 U.S. dentists according to practice/practitioner characteristics and whether they completed the survey using the mailed-paper version instead of online.

Characteristic    All (N=632): n, %    Mail/Paper (N=84): n, %    Online (N=548): n, %    p
Sex 0.02
Male 512 81% 76 90% 436 80%
Female 120 19% 8 10% 112 20%
Race-Ethnicity 0.07
Non-Hispanic White 502 79% 73 87% 429 78%
Other/unknown 130 21% 11 13% 119 22%
Graduation year <0.001
<1970 30 5% 13 15% 17 3%
1970–79 171 27% 28 33% 143 26%
1980–89 208 33% 21 25% 187 34%
1990–99 140 22% 20 24% 120 22%
2000 or later 83 13% 2 2% 81 15%
General practitioner 0.07
Yes 584 92% 82 98% 502 92%
No 48 8% 2 2% 46 8%
US Region 0.2
Midwest/Northeast 84 13% 6 7% 78 14%
Southeast 495 78% 72 86% 423 77%
West 53 8% 6 7% 47 9%
Practice setting 0.3
Solo private 351 56% 55 65% 296 54%
Group private 168 27% 17 20% 151 28%
PDA/HP 82 13% 9 11% 73 13%
Other 31 5% 3 4% 28 5%

PDA/HP: Permanente Dental Associates / HealthPartners Dental Group

Table 2.

Associations of practice/practitioner characteristics with completion of the survey using the mailed-paper version instead of online.

Characteristic    Bivariate: Odds Ratio, 95% Confidence Interval, p    Adjusted(1): Odds Ratio, 95% Confidence Interval, p
Male vs. Female 2.4 1.1 – 5.2 0.02 2.1 1.0 – 4.6 0.05
Non-Hispanic White vs. other/unknown 1.8 0.9 – 3.6 0.07 x
Graduated: <1970 vs. 1970 or later 5.7 2.7 – 12.3 <0.001 5.6 2.5 – 12.4 <0.001
General practitioner vs. specialist 3.8 0.9 – 15.8 0.07 4.6 1.1 – 20.2 0.04
Southeast region vs. other 1.8 0.9 – 3.4 0.08 x
Solo, private practice vs. other 1.6 1.0 – 2.6 0.05 x
(1) Adjusted for the characteristics listed that had p≤0.05.

The proportions of dentists who had electronic dental records (65% vs. 78%, p=0.01), who consistently used a rubber dam (70% vs. 78%, p=0.007), and who had either worked with or employed expanded-function auxiliaries (30% vs. 46%, p=0.008) were lower among dentists who completed the mailed paper version than among those who completed the survey online (Table 3). These differences remained significant in models adjusted for gender, whether or not in general practice, and whether or not the dentist graduated before 1970 (Table 4).

Table 3.

Distribution of U.S. dentists according to specific characteristics and whether they completed the survey using the mailed-paper version instead of online.

Characteristic    All (N=632): n, %    Mail/Paper (N=84): n, %    Online (N=548): n, %    p
Has electronic dental records 0.01
Yes 482 76% 54 65% 428 78%
No 149 24% 29 35% 120 22%
  missing – 1
Performs root canal procedures 0.09
Yes 489 77% 59 70% 430 78%
No 143 23% 25 30% 118 22%
Uses rubber dam 100% of the time (among those who perform root canal procedures) 0.02
Yes 228 47% 19 32% 209 49%
No 260 53% 40 68% 220 51%
  missing – 1
Ever worked with or employed expanded function auxiliaries 0.008
Yes 266 44% 25 30% 241 46%
No 338 56% 57 70% 281 54%
  missing – 28
Believes expanded function auxiliaries have positive impact on care 0.4
Yes 335 54% 41 49% 294 54%
No 291 46% 42 51% 249 46%
  missing – 6

Table 4.

Associations of specific characteristics with completion of the survey using the mailed-paper version instead of online.

Characteristic    Bivariate: Odds Ratio, 95% Confidence Interval, p    Adjusted for practitioner characteristics(1): Odds Ratio, 95% Confidence Interval, p
Has electronic dental records 0.5 0.3 – 0.8 0.02 0.6 0.4 – 1.0 0.04
Ever worked with or employed expanded function auxiliaries 0.5 0.3 – 0.8 0.009 0.6 0.4 – 1.0 0.04
Uses rubber dam 100% of the time (among those who perform root canal procedures) 0.5 0.3 – 0.9 0.02 0.5 0.3 – 0.9 0.04
(1) Adjusted for practitioner gender, whether in general practice, and whether graduated before 1970.

DISCUSSION

Use of a postal-mail option to supplement an online survey increased the response rate from 66% to 76%. In addition, and more importantly, participants who responded using the mail option differed in some key characteristics from respondents who completed the survey online. Omitting this option would have reduced the generalizability of the findings in ways that could not have been measured without it: the proportions of practitioners who use electronic dental records, who consistently use rubber dams during root canal treatment, and who have experience with expanded-function auxiliaries would all have been overestimated. Estimating these proportions was a primary objective of the IUS.

A number of studies conducted among physicians have examined response rates for mixed-mode designs, concluding that both modes should be used with physicians in order to enhance survey coverage (1, 19–21). The physician study most comparable to the current dental study is that of Kroth et al., who examined clinicians’ response rates across three medical practice-based research networks (22). Their survey, like ours, was of active network members with valid email addresses. They examined the effect on response rates of combining online and postal-mail surveys: the survey was first implemented online, with five rounds of electronic solicitation for the online questionnaire, followed by two rounds of a paper-based version mailed only to non-responders. The electronic solicitations (emails) were personalized, came from the physician’s home practice-based network, and contained a customized link to the online survey that provided automatic log-in. Their overall response rate was 61% (398/653): 46% (n=301) online and 15% (n=97) on paper. Despite intense promotion of the survey in the online phase, 24% of the total survey responses were received in the paper mode. They concluded that not including a paper option would have resulted in a response rate insufficient to support the internal validity of the survey results.

The present study is limited in that it was not a randomized trial of response modes. It is possible that some of the 84 dentists who completed the survey using the mailed-paper option sent with the second reminder might have completed the survey online had the mailed-paper option not been offered. However, considering the response rates of 11% online and 26% mailed-paper in a recent randomized trial among dentists (10), we believe it unlikely that many would have done so.

Our sampling frame consisted of network members, specifically active members who had participated in one or more network studies. We have previously reported that the characteristics of network dentists are similar to those of dentists at large (23). The same similarities can be observed when comparing the present study sample to the 2010 ADA survey (24). Two comparisons illustrate this: the percentages of female dentists were 19% (present study) and 17% (ADA), and the percentages in solo practice were 56% and 55%, respectively. Although we did not collect age per se, based on year of graduation from dental school, the network members in the prior report (23) were younger than the ADA sample, as is our study sample when compared with the 2010 ADA survey. We had an established relationship with these dentists, they are all interested in research to at least some degree, and they are younger, on average, than dentists in ADA surveys. These characteristics should lead to higher response rates, particularly for online surveys, than in typical surveys. Our findings are consistent with this: the response rate was 66% for the online survey and increased to 76% with the paper, mail option. Thus, even with a slightly younger population of dentists, and dentists more interested in research than a random sample of dentists would likely be, there was still a need to include a mailed-paper option. This option increased the response rate (important for face validity) and, more importantly, reduced bias for the key measures the survey was designed to assess.

The recent randomized trial of response mode and rate among dentists (10) found a difference between online and paper-based response similar in magnitude (15%) to the one we observed, although its overall response rate was much lower (21%) than ours (76%). We also believe that the American Dental Association’s continued use of a mailed-paper format for its annual survey supports the need to include a mailed-paper option when conducting surveys among dentists (24).

The mixed-mode design, online initially but then allowing a mailed-paper option, is a way to reduce overall costs and achieve an adequate response rate while reducing the probability of non-response bias. Not including an option to complete the survey on paper would have resulted in overestimation of key practice characteristics among network dentists.

References

1. Cull WL, O’Connor KG, Sharp S, Tang SS. Response rates and response bias for 50 surveys of pediatricians. Health Services Research. 2005;40:213–26. doi: 10.1111/j.1475-6773.2005.00350.x.
2. Dillman DA. Mail and Internet Surveys: The Tailored Design Method, 2007 update with new internet, visual, and mixed-mode guide. Hoboken, New Jersey: John Wiley & Sons, Inc; 2007.
3. McLeod CC, Klabunde CN, Willis GB, Stark D. Health care provider surveys in the United States, 2000–2010: A review. Evaluation & the Health Professions. 2013;36:106–126. doi: 10.1177/0163278712474001.
4. Galea S, Tracy M. Participation rates in epidemiologic studies. Annals of Epidemiology. 2007;17:643–653. doi: 10.1016/j.annepidem.2007.03.013.
5. de Leeuw ED. Counting and measuring online: The quality of internet surveys. Bulletin of Sociological Methodology. 2012;114:68–78.
6. van Selm M, Jankowski NW. Conducting online surveys. Quality & Quantity. 2006;40:435–56.
7. Olabi NF, Jones JE, Saxen MA, Sanders BJ, Walker LA, Weddell JA, et al. The use of office-based sedation and general anesthesia by board certified pediatric dentists practicing in the United States. Anesthesia Progress. 2012;59:12–17. doi: 10.2344/11-15.1.
8. Goodchild JH, Donaldson M. The use of sedation in the dental outpatient setting: a web-based survey of dentists. Dental Implantology Update. 2011;22:73–80.
9. Henry RK, Molnar A, Henry JC. A survey of US dental practices’ use of social media. Journal of Contemporary Dental Practice. 2012;13:137–141. doi: 10.5005/jp-journals-10024-1109.
10. Hardigan PC, Succar CT, Fleisher JM. An analysis of response rate and economic costs between mail and web-based surveys among practicing dentists: A randomized trial. Journal of Community Health. 2012;37:383–394. doi: 10.1007/s10900-011-9455-6.
11. Schleyer TK, Forrest JL. Methods for the design and administration of web-based surveys. Journal of the American Medical Informatics Association. 2000;7:416–25. doi: 10.1136/jamia.2000.0070416.
12. Shih TH, Fan X. Response rates and mode preferences in web-mail mixed-mode surveys: A meta-analysis. International Journal of Internet Science. 2007;2:59–82.
13. de Leeuw ED. To mix or not to mix data collection modes in surveys. Journal of Official Statistics. 2005;21:233–55.
14. Gilbert GH, Williams OD, Korelitz JJ, Fellows JL, Gordan VV, Makhija SK, et al. Purpose, structure and function of the United States National Dental Practice-Based Research Network. Journal of Dentistry. 2013;41:1051–1059. doi: 10.1016/j.jdent.2013.04.002.
15. Schleyer TK, Thyvalikakath TP, Spallek H, Torres-Urquidy MH, Hernandez P, Yuhaniak J. Clinical computing in general dentistry. Journal of the American Medical Informatics Association. 2006;13:344–352. doi: 10.1197/jamia.M1990.
16. Schleyer T, Song M, Gilbert GH, Rindal DB, Fellows JL, Gordan VV. Electronic dental record use and clinical information management patterns among practitioner-investigators in The Dental Practice-Based Research Network. Journal of the American Dental Association. 2013;144:49–58. doi: 10.14219/jada.archive.2013.0013.
17. Anabtawi MF, Gilbert GH, Bauer MR, Reams G, Makhija SK, Benjamin PL, et al. Rubber dam use during root canal treatment: findings from The Dental Practice-Based Research Network. Journal of the American Dental Association. 2013;144:179–186. doi: 10.14219/jada.archive.2013.0097.
18. Blue CM, Funkhouser E, Riggs S, Rindal DB, Worley D, Pihlstrom DJ, et al. Utilization of non-dentist providers and attitudes toward new provider models: findings from The Dental Practice-Based Research Network. Journal of Public Health Dentistry. 2013. doi: 10.1111/jphd.12020. (In press)
19. McMahon SR, Iwamoto M, Massoudi MS, Yusuf HR, Stevenson JM, David F, et al. Comparison of e-mail, fax and postal surveys of pediatricians. Pediatrics. 2003;111:e299–303. doi: 10.1542/peds.111.4.e299.
20. Beebe TJ, Locke GR 3rd, Barnes SA, Davern ME, Anderson KJ. Mixing web and mail methods in a survey of physicians. Health Services Research. 2007;42:1219–34. doi: 10.1111/j.1475-6773.2006.00652.x.
21. Scott A, Jeon SH, Joyce CM, Humphreys JS, Kalb G, Witt J, Leahy A. A randomized trial and economic evaluation of the effect of response mode on response rate, response bias, and item non-response in a survey of doctors. BMC Medical Research Methodology. 2011;11:126. doi: 10.1186/1471-2288-11-126.
22. Kroth PJ, McPherson L, Leverence R, Pace W, Daniels E, Rhyne RL, Williams RL. Combining web-based and mail surveys improves response rates: a PBRN study from PRIME Net. Annals of Family Medicine. 2009;7:245–8. doi: 10.1370/afm.944.
23. Makhija SK, Gilbert GH, Rindal DB, Benjamin PL, Richman JS, Pihlstrom DJ, for The DPBRN Collaborative Group. Dentists in practice-based research networks have much in common with dentists at large: Evidence from the Dental Practice-Based Research Network. General Dentistry. 2009;57:270–275.
24. American Dental Association, Survey Center. The 2010 Survey of Dental Practice. Chicago: American Dental Association; 2012.
