ABSTRACT
BACKGROUND
Important changes are occurring in how the medical profession approaches assessing and maintaining competence. Physician support for such changes will be essential for their success.
OBJECTIVE
To describe physician attitudes towards assessing and maintaining competence.
DESIGN
Cross-sectional internet survey.
PARTICIPANTS
Random sample of 1,000 American College of Physicians members who were eligible to participate in the American Board of Internal Medicine Maintenance of Certification program.
MAIN MEASURES
Questions assessed physicians’ attitudes and experiences regarding: 1) self-regulation, 2) feedback on knowledge and clinical care, 3) demonstrating knowledge and clinical competence, 4) frequency of use and effectiveness of methods to assess or improve clinical care, and 5) transparency.
KEY RESULTS
Surveys were completed by 446 of 943 eligible respondents (47 %). Eighty percent reported it was important (somewhat/very) to receive feedback on their knowledge, and 94 % considered it important (somewhat/very) to get feedback on their quality of care. However, only 24 % reported that they receive useful feedback on their knowledge most/all of the time, and 27 % reported receiving useful feedback on their clinical care most/all of the time. Seventy-five percent agreed that participating in programs to assess their knowledge is important to staying up-to-date, yet only 52 % reported participating in such programs within the last 3 years. The majority (58 %) believed physicians should be required to demonstrate their knowledge via a secure examination every 9–10 years. Support was low for Specialty Certification Boards making information about physician competence publicly available, with respondents expressing concern about patients misinterpreting information about their Board Certification activities.
CONCLUSIONS
A gap exists between physicians’ interest in feedback on their competence and existing programs’ ability to provide such feedback. Educating physicians about the importance of regularly assessing their knowledge and quality of care, coupled with enhanced systems to provide such feedback, is needed to close this gap.
KEY WORDS: board certification, physician attitudes, feedback, medical education, transparency
INTRODUCTION
Society grants the medical profession many privileges and, in return, expects the profession to set and enforce standards for safe practice.1,2 Historically, physicians would obtain a state medical license, sit for specialty boards once, and receive a lifetime certificate. Ongoing competence was maintained through Continuing Medical Education (CME) courses and conversations with colleagues. This approach had several limitations.3 Importantly, it did not allow direct evaluation of clinicians’ practice. Little information about physician competence was publicly available.
Important changes are underway in how the medical profession approaches maintaining and assessing competence, with an emphasis on self-directed learning.4,5 CME programs are moving from the traditional passive approach (e.g. didactic lectures), which has limited ability to help physicians acquire and implement new knowledge, to more interactive approaches (e.g. workshops).5,6 There is growing recognition of the critical role that multisource feedback can play in promoting learning.7 Specialty boards have changed the certification process and no longer issue lifetime certificates.3 Over time, Maintenance of Certification (MOC) has expanded to include more self-assessment and measures of performance in practice (e.g. the care being delivered), evolving towards more ongoing, continuous assessment.
As these changes are implemented, it is important to understand physicians’ attitudes regarding assessing and maintaining clinical competence.8,9 Therefore, we surveyed practicing internal medicine physicians who are involved in recertification processes to understand their attitudes and behaviors in these areas.
METHODS
Survey Subjects
We conducted an internet survey among a random sample of 1,000 physician members of the American College of Physicians (ACP) using the following criteria: ACP member living in the US with known contact information, younger than 67 years, not retired, and Board Certified in or after 1990. This sample was chosen to ensure that issues of Maintenance of Certification were salient to respondents. Of these 1,000 potential respondents, emails to 57 could not be delivered, leaving 943 subjects who were eligible to complete the survey.
Survey Content
The survey was developed by experts in medical education, quality of care, and physician assessment, and then pilot tested on eight physicians. The survey took approximately fifteen minutes to complete. Survey questions explored a range of topics related to physicians’ attitudes about assessing and maintaining clinical competence. The complete survey instrument is available on request.
Survey Implementation
After receiving institutional review board approval, data were collected between 15 September 2010 and 30 December 2010. Informed consent was implied by completion of the survey, and a token incentive of a $25 gift card was offered to participants upon completion of the survey.
Statistical Analysis
We calculated descriptive statistics for all study variables, as well as measures of association (e.g., Spearman ordinal rank-order correlations) between variables, as appropriate. Differences between proportions were used to explore group differences on specific variables (e.g., do generalists differ from medicine subspecialists in their beliefs about the desirability of feedback?). Spearman rank-order correlations were used to assess the association between pairs of ordinal variables (e.g., does the rated importance of feedback correlate with the rated desirability of more frequent clinical assessments?).10 We used the Bonferroni adjustment to control for the possible Type-I error associated with multiple comparisons.
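To make these analyses concrete, the sketch below illustrates the two techniques in Python with scipy; all response values and group counts are invented for illustration and are not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert responses for two survey items (1 = lowest, 5 = highest)
importance_of_feedback = np.array([5, 4, 5, 3, 4, 5, 2, 4, 3, 5])
desired_assessment_frequency = np.array([4, 4, 5, 2, 3, 5, 2, 3, 3, 4])

# Spearman rank-order correlation between two ordinal variables
rho, p_value = stats.spearmanr(importance_of_feedback, desired_assessment_frequency)

# Bonferroni adjustment: with m planned comparisons, test each at alpha / m
m_comparisons = 20
alpha = 0.05
is_significant = p_value < alpha / m_comparisons

# Difference between proportions (two-sided z-test), e.g., generalists vs.
# subspecialists endorsing the importance of feedback (counts are invented)
x1, n1 = 210, 248
x2, n2 = 62, 82
p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)                      # pooled proportion under H0
se = np.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_two_sided = 2 * stats.norm.sf(abs(z))             # two-sided p-value

print(f"rho={rho:.2f}, p={p_value:.4f}, Bonferroni-significant={is_significant}")
print(f"proportion difference z={z:.2f}, p={p_two_sided:.4f}")
```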
Because many studies have demonstrated the value of feedback for learning and performance improvement,11–14 we constructed a scale to reflect respondents’ attitudes regarding the importance of feedback (the Importance of Feedback scale). These four items comprise the scale: 1) “How important or unimportant is it to you to get formal feedback on your medical knowledge in your specialty”; 2) “How important or unimportant is it to you to get feedback on the quality of your clinical care”; 3) “Participating in structured programs that assess my medical knowledge is an important component of staying up-to-date”; and 4) “Participating in structured programs that assess my practice performance is an important component of staying up-to-date.” Each question used a 5-point Likert response scale. The items comprising the Importance of Feedback scale were scored from 1 to 5 points, with 1 point representing a “low” reported importance of feedback and 5 points representing a “high” reported importance of feedback. The aggregate scale was the arithmetic average of the four individual item scores, ranging from a low value of 1.0 to a high value of 5.0.
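As an illustration of the scale construction and the internal-consistency check reported in the Results, here is a minimal Python sketch; the response matrix is hypothetical, not the survey data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) matrix of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Invented 5-point Likert responses to the four Importance of Feedback items
responses = np.array([
    [5, 4, 4, 3],
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
], dtype=float)

# Aggregate scale score: arithmetic average of the four item scores (range 1.0-5.0)
scale_scores = responses.mean(axis=1)

print("scale scores:", scale_scores)
print("Cronbach alpha:", round(cronbach_alpha(responses), 2))
```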
RESULTS
Surveys were completed by 446 (47 %) of 943 eligible physicians (Table 1). Sixty-two percent practiced as internal medicine generalists, compared to 20 % practicing in internal medicine subspecialties. Nearly half (48 %) of respondents were due for recertification after 2011. The sociodemographic profile of our respondents was generally similar to the overall ACP membership (Table 1).
Table 1.
Characteristics of Survey Respondents (n = 446) and Overall American College of Physicians (ACP) Membership
| Characteristic | Survey Respondents, Sept–Dec 2010 | ACP Membership, Jan 2011 |
|---|---|---|
| Response rate n (%) | 446/943 (47) | |
| Mean age (years) | 44 | 45 |
| Women, n (%) | 164 (39)* | 34 % |
| Specialization, n (%)† | ||
| Internal medicine generalist | 248 (62) | 56 % |
| Internal medicine subspecialist | 82 (20) | 34 % |
| Other | 71 (18) | 10 % |
| Professional activities, n (%) | ||
| Clinical practice | 377 (85) | 77 % |
| Research | 18 (4) | 4 % |
| Administration | 17 (4) | 6 % |
| Teaching | 16 (4) | 4 % |
| Residency, Fellowship | 13 (3) | 7 % |
| Other | 3 (1) | 2 % |
| Primary work setting (most time spent), n (%) | ||
| Private ambulatory care office | 118 (27) | 26 % |
| Academic medical center (AMC)/medical school | 114 (26) | 27 % |
| Private community hospital (excluding AMC) | 110 (25) | 23 % |
| Multispecialty clinic | 36 (8) | 7 % |
| Community health center/clinic | 17 (4) | 4 % |
| Federal government hospital | 15 (3) | 6 % |
| Nursing home | 7 (2) | 1 % |
| Other | 23 (5) | 5 % |
| Primary work location, n (%) | ||
| Urban area | 215 (49) | 52 % |
| Suburban area | 150 (34) | 32 % |
| Small town or rural area within 60 miles of urban/suburban area | 54 (12) | 11 % |
| Small town or rural area farther than 60 miles from urban/suburban area | 18 (4) | 5 % |
| Uncertain | 3 (1) | 1 % |
| Size of practice, n (%) | ||
| 1 physician (solo) | 53 (12) | 17 % |
| 2 physicians | 27 (6) | 8 % |
| 3–5 physicians | 61 (14) | 19 % |
| 6–10 physicians | 84 (19) | 16 % |
| 11–20 physicians | 82 (19) | 11 % |
| 21–50 physicians | 42 (10) | 9 % |
| 51–100 physicians | 29 (7) | 5 % |
| Over 100 physicians | 65 (15) | 16 % |
| Timeline for maintenance of ABIM certification in Internal Medicine, n (%) | ||
| Certified before 1990 and do not plan to recertify | 1 (0) | N/D |
| Plan to recertify in Internal Medicine in 2010 or 2011 | 87 (20) | N/D |
| Up for recertification in 2010 or 2011, but do not plan to maintain IM certification | 13 (3) | N/D |
| Recertified in 2006, 2007, 2008, or 2009 | 105 (24) | N/D |
| Recertification year is after 2011 | 213 (48) | N/D |
| Other | 21 (5) | N/D |
*No information on 28 respondents
†No information on 45 respondents
General Attitudes About Self-Regulation
Respondents believed the medical profession was succeeding with regard to supporting provider competence. Eighty percent reported the profession was doing a “good” or better job of ensuring that physicians are practicing high-quality medicine, with 10 % reporting it was doing an “excellent” job, 37 % a “very good” job, and 33 % a “good” job. Seventy-six percent believed that specialty boards were effective (somewhat or very) at ensuring physicians practice high quality medicine, compared with 70 % who rated hospital medical staffs as effective (somewhat or very) and 57 % who rated professional societies as effective (somewhat or very).
Attitudes About Board Certification and Maintenance of Certification
Eighty-four percent considered Board Certification an important (somewhat or very) marker of a high quality physician, and 73 % agreed or strongly agreed that Board Certification is important to their patients. Ninety-two percent were committed to maintaining their Board Certification. Fewer respondents considered recognition by a medical professional society (60 %), patient experience ratings from an external entity (59 %), or recognition by a payer-sponsored program such as pay-for-performance (29 %) as important (somewhat or very) markers of a high-quality physician.
Feedback on Medical Knowledge and Quality of Care
Feedback Gap
Respondents considered it important to receive feedback on their medical knowledge and clinical care. Eighty-eight percent stated it was important (somewhat or very) to get formal feedback on specialty-specific medical knowledge, with 94 % reporting it was important (somewhat or very) to get feedback on the quality of their clinical care. However, only 24 % reported that they receive useful feedback on their medical knowledge most or almost all of the time, and only 27 % reported receiving useful feedback on the quality of their clinical care that often. Moreover, nearly all (98 %) agreed that they regularly incorporate new medical knowledge into the clinical care they provide patients, but over half (62 %) also agreed that they have colleagues who seem unaware of important gaps in their competence. Ninety-three percent reported the quality of care they provide to patients to be above average (somewhat or considerably).
Resources Used to Assess or Improve Quality of Clinical Care
Respondents reported extensive use of resources to assess or improve the quality of their clinical care in the last 3 years (Table 2). Nearly all respondents relied on reading the medical literature (99 %), informal feedback from patients (89 %), national (88 %) and local (83 %) CME programs, and informal feedback from peers (79 %). Somewhat fewer reported making use of specialty society knowledge self-assessment programs (68 %) or results from MOC knowledge self-assessment programs (64 %).
Table 2.
Frequency and Usefulness of Different Resources to Assess or Improve the Quality of Clinical Care, 15 September 2010 to 30 December 2010 (n = 446)
| Resource | Used in the last 3 years to assess/improve clinical care (%) | Very/extremely useful (%)* |
|---|---|---|
| Reading the medical literature | 99 | 77 |
| Informal feedback from patients | 89 | 53 |
| National CME programs | 88 | 68 |
| Local CME programs | 83 | 52 |
| Informal feedback from peers | 79 | 63 |
| Specialty society knowledge self-assessment program (e.g., MKSAP) | 68 | 75 |
| Results from MOC Program knowledge self-assessment | 64 | 62 |
| Results from MOC Program performance in practice module | 53 | 57 |
| Results from MOC Program secure exam | 51 | 64 |
| Performance feedback from medical group | 49 | 49 |
| Practice audits | 40 | 49 |
| Performance feedback from insurer | 38 | 27 |
| Performance feedback from national quality organization | 30 | 46 |
| Formal study group with peers | 26 | 63 |
*Perceived usefulness measured among those respondents who reported using the resource in the past 3 years
Respondents were willing to participate in multi-component assessment programs as part of staying current with medical advances. Seventy-five percent agreed that participating in such programs to assess their medical knowledge is an important component of staying up-to-date. However, only 52 % reported participating in such programs at least once every 3 years, with reported frequency of participation varying from 16 % stating several times per year to 34 % stating every 5–10 years. Similar trends were noted for assessing practice performance. Sixty-three percent agreed that participating in structured programs is an important component of staying up-to-date, but participation varied from 15 % participating several times per year to 26 % participating only every 5–10 years, and only 47 % reported participating in such programs at least every 3 years.
Requirements for Formal Participation in Demonstration of Knowledge/Competence
Recognition among physicians about the importance of feedback did not necessarily translate into support for requirements that physicians regularly demonstrate their competence. Fifty-eight percent believed that physicians should demonstrate their clinical knowledge via a secure, written exam only every 9–10 years, while 12 % supported requiring physicians to demonstrate their knowledge every 4–6 years. Similarly, nearly half (47 %) supported a 9–10 year interval for physicians to demonstrate their clinical competence. In contrast, physicians believed that the quality of care in their practices should be measured more frequently, with 48 % recommending such assessment every 4–6 years.
Factors Associated with Belief in the Importance of Feedback
The Importance of Feedback scale had a mean of 2.3 (sd = 0.87), and had acceptable internal consistency, with a Cronbach alpha of 0.71. Spearman rank-order correlations showed a variety of factors were positively correlated with higher Importance of Feedback scores (Table 3). For example, respondents who rated Board Certification (rs = 0.46, p < 0.001), scores on a self-assessment program (e.g., MKSAP) (rs = 0.40, p < 0.001), recognition by the NCQA (rs = 0.38, p < 0.001), and recognition by a payer-sponsored program (“pay-for-performance”) (rs = 0.34, p < 0.001) as important markers of a high-quality physician also had higher Importance of Feedback scores. Similarly, respondents who agreed it is important to patients that their physicians are Board Certified (rs = 0.40, p < 0.001), and agreed that they were committed to maintaining their Board Certification (rs = 0.48, p < 0.001), also had higher Importance of Feedback scores. Higher Importance of Feedback scores were correlated with support for more frequent requirements that physicians demonstrate their clinical knowledge via a secure written examination (rs = 0.50, p < 0.001), demonstrate their clinical competence (rs = 0.41, p < 0.001), and assess the quality of care in their practice (rs = 0.32, p < 0.001).
Table 3.
Selected Correlations with Importance of Feedback Scale*, 15 September 2010 to 30 December 2010 (n = 446)
| Item (response scale) | Correlation coefficient | P value |
|---|---|---|
| Importance of Markers of High Quality Physician (5-point scale from “very unimportant” to “very important”) | ||
| Board Certification | 0.464 | < 0.001 |
| Score on self-assessment program (e.g., MKSAP) | 0.404 | < 0.001 |
| Recognition by the NCQA | 0.375 | < 0.001 |
| Recognition by a Medical Professional Society (e.g. ACP Fellowship or Mastership) | 0.355 | < 0.001 |
| Recognition by a payer-sponsored program (“pay-for-performance”) | 0.337 | < 0.001 |
| Good patient experience ratings from an external entity | 0.212 | < 0.001 |
| Attitudes Regarding Board Certification (5-point scale from “strongly agree” to “strongly disagree”) | ||
| I am committed to maintaining my Board Certification | 0.484 | < 0.001 |
| It is important to my patients that I am Board Certified | 0.397 | < 0.001 |
| Recommended Frequency of Assessment (4–6 years, 7–8 years, 9–10 years, once through initial certification, never) | ||
| How frequently, if at all, should physicians be expected to demonstrate their clinical knowledge via a secure, written exam? | 0.495 | < 0.001 |
| How frequently, if at all, should physicians be expected to demonstrate their clinical competence? | 0.411 | < 0.001 |
| How frequently, if at all, should physicians be expected to examine the quality of care in their practice? | 0.316 | < 0.001 |
| Frequency of Participation (several times per year, yearly, every 2–3 years, every 4–5 years, every 5–10 years, never, n/a) | ||
| How frequently do you participate in structured programs to assess your overall practice performance? | 0.273 | < 0.001 |
| How frequently do you participate in structured programs to assess your medical knowledge? | 0.237 | < 0.001 |
| Usefulness of methods to assess or improve quality of clinical care (among those who reported using these methods in last 3 years) (5-point scale ranging from “not at all useful” to “extremely useful”) | ||
| Results from MOC Program secure exams | 0.586 | < 0.001 |
| Results from MOC Program performance in practice module | 0.518 | < 0.001 |
| Results from MOC Program knowledge self-assessment | 0.451 | < 0.001 |
| Performance feedback from national quality organizations | 0.369 | < 0.001 |
| Specialty society knowledge self-assessment programs (e.g., MKSAP) | 0.296 | < 0.001 |
| Performance feedback from medical group | 0.271 | < 0.001 |
| Practice audits | 0.249 | 0.001 |
| Performance feedback from insurer | 0.222 | 0.007 |
| Demographics: no association with work setting, professional activities, work location, number of physicians in practice, age, gender, or specialization | ||
*Four-item scale described in detail in Methods, Cronbach alpha 0.71
Importance of Feedback scores were also associated with respondents’ reports of different self-assessment behaviors. Respondents who reported more frequent participation in structured programs to assess their medical knowledge (rs = 0.24, p < 0.001) and participation in structured programs to assess their overall practice performance (rs = 0.27, p < 0.001) also had higher Importance of Feedback scores. Among respondents who reported participating in different methods to assess or improve the quality of their clinical care, higher Importance of Feedback scores were also correlated with increased ratings of the methods’ usefulness. Notably, these correlations were strongest between the Importance of Feedback scores and the reported usefulness of results from MOC Program secure examination (rs = 0.59, p < 0.001), results from MOC Program performance in practice module (rs = 0.52, p < 0.001), results from MOC Program knowledge self-assessment (rs = 0.45, p < 0.001), and performance feedback from national quality organizations (rs = 0.37, p < 0.001).
After controlling for the increased risk of Type-I error with the Bonferroni adjustment, there were no interpretable significant relationships between Importance of Feedback scores and physician demographics or the timeline for maintenance of American Board of Internal Medicine (ABIM) certification.
Transparency
Transparency of information about physicians’ qualifications and performance is increasingly being used to supplement self-regulation. Our respondents were unsure about whether such transparency would be desirable (Table 4). Less than half (41 %) reported that it was likely (somewhat or very) that providing information to the public about the status of a physician’s Board certification would increase the quality of healthcare. Only 30 % considered it likely (somewhat or very) that providing information to the public about the status of a physician’s involvement in MOC would increase the quality of healthcare.
Table 4.
Attitudes Regarding Transparency of Board Certification and Maintenance of Certification activity, 15 September 2010 to 30 December 2010 (n = 446)
| Information | Should information be public? (% yes) | Might be misinterpreted by patients? (% yes) |
|---|---|---|
| Board Certification Information | ||
| Whether a physician is Board certified | 91 | 41 |
| Dates when a physician passed the exam | 55 | 55 |
| Dates when a physician's certificate expires | 49 | 65 |
| Adverse actions against a physician by the Board | 55 | 78 |
| Maintenance of Certification (MOC) | ||
| Whether a physician is enrolled in MOC | 28 | 75 |
| Whether a physician is actively participating in MOC | 26 | 76 |
| List of years in which a physician had MOC activity | 19 | 77 |
| How many MOC points a physician has earned per year | 12 | 81 |
| Which MOC modules a physician has done | 10 | 79 |
DISCUSSION
The rapid growth in medical knowledge, coupled with awareness of significant deficits in the quality of healthcare, is stimulating important changes in how physicians’ competence is supported and measured. Our study suggests that the current system may not be providing sufficient quantity or frequency of feedback to physicians about their competence.15,16 Only one-quarter of respondents reported receiving useful feedback on their medical knowledge and quality of care most or almost all of the time. Despite respondents’ strong support for Board Certification, and emerging evidence of an association between participation in Maintenance of Certification activities and the quality of actual care,17–20 only slightly more than half of respondents had used MOC resources in the last 3 years. Refinements of MOC products, coupled with other strategies for educating and engaging physicians in ongoing activities to maintain competence, are needed.21
Respondents’ desire for feedback did not translate into support for requiring that physicians demonstrate their knowledge or competence more frequently than the current once-per-decade interval. Of interest, physicians supported more frequent examinations of clinical competence than of knowledge. Physicians’ beliefs in this area are not well aligned with the rapidity of knowledge turnover or the robust body of system improvement science, emphasizing the need for continuous attention to quality improvement. A 2003 survey also showed limited physician engagement in quality improvement, suggesting that progress has been slow.22 Because most physicians in this study believe they are providing above-average care, they may not see a need for more frequent self-assessment. However, research shows that self-assessment abilities are limited in the absence of external data.6,14,23,24 As specialty boards move towards more continuous MOC processes, educational campaigns may be needed to help physicians understand the rationale for this change.
While some elements of assessing and maintaining competence can be imposed on physicians, the success of self-regulation ultimately hinges on physicians engaging in self-regulated learning, which Schunk and Zimmerman define as “learning that results from students’ self-generated thoughts and behaviors that are systematically oriented toward the attainment of their learning goals.”25 MOC, through its self-directed assessment activities, can be a meaningful component of physicians’ self-regulated learning. However, the resources physicians reported using most commonly to assess or improve the quality of their clinical care, such as reading the medical literature, informal feedback from patients, and national or local CME programs, vary widely in the evidence supporting their effectiveness.5 Physicians’ ratings of the importance of feedback were most strongly correlated with Board Certification activities, such as participation in structured programs to assess knowledge and practice performance. These correlations do not allow for causal conclusions. Nonetheless, they suggest that Board Certification activities, while representing only a narrow slice of self-directed learning, may promote physicians’ engagement in ongoing activities to maintain their competence.
Major changes have also been taking place around transparency, with patients and other stakeholders now having much greater access to physician-level information. Our results highlight physicians’ discomfort with such transparency. Only half of physicians supported making public information that is already available on most specialty board websites (e.g., the date when a physician passed their exam or any adverse action taken against a physician by the Board). Physician education regarding the benefits of transparency, as well as efforts by specialty boards to ensure that reported information is useful to patients, may help specialty boards share more information about physician competence with the public in ways that are acceptable to physicians.
This study has important limitations. While our response rate is adequate given the challenges associated with recruiting busy clinicians,26 non-response bias may have affected our findings. In particular, we were unable to assess the degree to which non-respondents differed from respondents on key attitudes, such as the value attached to self-assessment. In addition, the data are all self-reported, and we cannot determine whether physicians’ reported behaviors correspond to actual behaviors. Social desirability bias may have prompted respondents to report more positive attitudes and behaviors than they actually hold. Finally, we surveyed practicing internists who are ACP members and eligible for Maintenance of Certification programs, limiting generalizability to physicians from other specialties, non-ACP member physicians, those who hold lifetime specialty certificates, or those who are not Board Certified.
Rapid evolution is taking place in medical knowledge. A similarly dramatic change is needed in how physicians assess and maintain their knowledge and competence. A broad-based educational campaign should reshape how physicians approach maintaining and improving their competence, tapping into physicians’ high intrinsic motivation to provide exemplary patient care. All physicians should understand the significant limitations of self-assessment in isolation, and the value of seeking feedback, both through data and from others (peers, co-workers and patients). Most physicians find participating in improvement activities to be satisfying and a source of professional pride.
In addition, enhanced tools are needed to maintain physicians’ competence, such as feedback methods that are “real-time” and available at the point of care. Registries embedded in electronic medical records can provide longitudinal and ongoing patient data as a form of real-time performance audit and feedback. Methods to identify and close knowledge gaps while caring for patients, akin to a rapid plan-do-study-act cycle, can function as ongoing assessments to help “keep up” with the burgeoning medical literature.27,28 New research is needed to understand which tools are most effective for self-directed learning and maintaining provider competence.
The medical profession’s ability to regulate the competence of its providers should be seen by physicians as a privilege. Greater participation by physicians in activities to assess and maintain their competence, coupled with improved approaches to providing feedback to physicians, is needed for the medical profession to fulfill its share of this vital social contract.
Acknowledgements
Funding
This study was supported by the American Board of Internal Medicine Foundation. Dr. Gallagher is also supported by a Robert Wood Johnson Foundation Investigator Award in Health Policy Research.
Contributors
Thanks to Lorie Slass, MA, for her support of this project, Sara Kim, PhD, for her thoughtful comments on this manuscript, and to Ben Dunlap for assistance with manuscript preparation.
Prior Presentations
None.
Conflict of Interest
Dr. Weissman is an employee of the American College of Physicians (ACP), and Dr. Holmboe is an employee of the American Board of Internal Medicine (ABIM). Both ACP and ABIM develop and sell educational products to physicians in support of the ABIM Maintenance of Certification process. Dr. Holmboe also reports royalties from Mosby Elsevier related to authorship of a textbook on assessment. No other author declares a conflict of interest.
REFERENCES
1. Starr P. The social transformation of American medicine. New York: Basic Books; 1984.
2. Levinson W, Holmboe E. Maintenance of certification: 20 years later. Am J Med. 2011;124(2):180–5. doi: 10.1016/j.amjmed.2010.09.019.
3. Levinson W, Holmboe E. Maintenance of certification in internal medicine: facts and misconceptions. Arch Intern Med. 2011;171(2):174–6. doi: 10.1001/archinternmed.2010.477.
4. Weiss KB. Future of board certification in a new era of public accountability. J Am Board Fam Med. 2010;23(Suppl 1):S32–9. doi: 10.3122/jabfm.2010.S1.090283.
5. Davis D, O'Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282(9):867–74. doi: 10.1001/jama.282.9.867.
6. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–102. doi: 10.1001/jama.296.9.1094.
7. Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010;44(1):101–8. doi: 10.1111/j.1365-2923.2009.03546.x.
8. Levinson W, King TE Jr, Goldman L, Goroll AH, Kessler B. Clinical decisions. American Board of Internal Medicine maintenance of certification program. N Engl J Med. 2010;362(10):948–52. doi: 10.1056/NEJMclde0911205.
9. Drazen JM, Weinstein DF. Considering recertification. N Engl J Med. 2010;362(10):946–7. doi: 10.1056/NEJMe1000174.
10. Caruso J, Cliff N. Empirical size, coverage, and power of confidence intervals for Spearman's rho. Educ Psychol Meas. 1997;57:637–54. doi: 10.1177/0013164497057004009.
11. Boud D, Molloy E, editors. Feedback in higher and professional education: understanding it and doing it well. London; New York: Routledge; 2013.
12. Crommelinck M, Anseel F. Understanding and encouraging feedback-seeking behaviour: a literature review. Med Educ. 2013;47(3):232–41. doi: 10.1111/medu.12075.
13. Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81–112. doi: 10.3102/003465430298487.
14. Sargeant J, Armson H, Chesluk B, Dornan T, Eva K, Holmboe E, et al. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med. 2010;85(7):1212–20. doi: 10.1097/ACM.0b013e3181d85a4e.
15. Norcini J. The power of feedback. Med Educ. 2010;44(1):16–7. doi: 10.1111/j.1365-2923.2009.03542.x.
16. Mazor KM, Holtman MC, Shchukin Y, Mee J, Katsufrakis PJ. The relationship between direct observation, knowledge, and feedback: results of a national survey. Acad Med. 2011;86(10 Suppl):S63–7. doi: 10.1097/ACM.0b013e31822a6e5d.
17. Bernabeo EC, Conforti LN, Holmboe ES. The impact of a preventive cardiology quality improvement intervention on residents and clinics: a qualitative exploration. Am J Med Qual. 2009;24(2):99–107. doi: 10.1177/1062860608330826.
18. Chen J, Rathore SS, Wang Y, Radford MJ, Krumholz HM. Physician board certification and the care and outcomes of elderly patients with acute myocardial infarction. J Gen Intern Med. 2006;21(3):238–44. doi: 10.1111/j.1525-1497.2006.00326.x.
19. Holmboe ES, Lipner R, Greiner A. Assessing quality of care: knowledge matters. JAMA. 2008;299(3):338–40. doi: 10.1001/jama.299.3.338.
20. Holmboe ES, Meehan TP, Lynn L, Doyle P, Sherwin T, Duffy FD. Promoting physicians' self-assessment and quality improvement: the ABIM diabetes practice improvement module. J Contin Educ Health Prof. 2006;26(2):109–19. doi: 10.1002/chp.59.
21. Eva KW, Regehr G. "I'll never play professional football" and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28(1):14–9. doi: 10.1002/chp.150.
22. Audet AM, Doty MM, Shamasdin J, Schoenbaum SC. Measure, learn, and improve: physicians' involvement in quality improvement. Health Aff. 2005;24(3):843–53. doi: 10.1377/hlthaff.24.3.843.
23. Jamtvedt G, Young JM, Kristoffersen DT, O'Brien MA, Oxman AD. Does telling people what they have been doing change what they do? A systematic review of the effects of audit and feedback. Qual Saf Health Care. 2006;15(6):433–6. doi: 10.1136/qshc.2006.018549.
24. Sargeant J. 'To call or not to call': making informed self-assessment. Med Educ. 2008;42(9):854–5. doi: 10.1111/j.1365-2923.2008.03142.x.
25. Schunk D, Zimmerman B. Self-regulation and learning. In: Reynolds WM, Miller GE, editors. Handbook of psychology. New Jersey: Wiley and Sons; 2003. p. 59–75.
26. James KM, Ziegenfuss JY, Tilburt JC, Harris AM, Beebe TJ. Getting physicians to respond: the impact of incentive type and timing on physician survey response rates. Health Serv Res. 2011;46(1 Pt 1):232–42. doi: 10.1111/j.1475-6773.2010.01181.x.
27. Green ML, Reddy SG, Holmboe E. Teaching and evaluating point of care learning with an Internet-based clinical-question portfolio. J Contin Educ Health Prof. 2009;29(4):209–19. doi: 10.1002/chp.20039.
28. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259. doi: 10.1002/14651858.CD000259.pub3.
