Geriatric Orthopaedic Surgery & Rehabilitation. 2011 Sep-Nov;2(5-6):163–171. doi: 10.1177/2151458511423646

Early Experience in Implementation of a Learning Assessment Toolkit in the AOTrauma Geriatric Fracture Course

Natasha T O’Malley, Michael Cunningham, Frankie Leung, Michael Blauth, Stephen L Kates
PMCID: PMC3609399  PMID: 23569686

Abstract

Background: Surgical education is continually expanding to encompass new techniques and technologies. To improve the quality of learning, it is vital that educational activity is directed at gaps in knowledge and ability. Aim: The aim of this study is to describe the application of a published learning assessment toolkit to participants attending AOTrauma orthogeriatric fracture courses. Methods: Before each course, participants received a questionnaire covering 10 competencies to assess knowledge gaps and a 20-question clinical knowledge test. The gap between perceived and desired knowledge was correlated with the clinical knowledge test results to help course faculty focus the curriculum on the identified educational needs. A commitment to change survey was also administered. Results: Over 3 courses, 48% of registered attendees responded to the precourse survey and 44.5% responded postcourse. Precourse gap scores were generally highest for 2 competencies (“address secondary prevention” and “build a system of care”), indicating a higher motivation to learn in these topics, and lowest for a variety of competencies (eg, “restore function early” and “comanage patient care” in the US surgeons group), indicating lower motivation to learn. These precourse gap scores guided adaptations in the course structure. Postcourse gaps were reduced in all 4 cohorts, with large improvements in “address secondary prevention” and “build a system of care” in many of the cohorts. Competencies with the lowest precourse knowledge test scores were identified in each cohort, highlighting the need for faculty to place appropriate emphasis on these topics when delivering the course content. Conclusion: Evaluating and identifying gaps in knowledge and ability allows course designers to focus on areas of deficit. Measurable success was shown by a decreased subjective gap score and objectively improved clinical knowledge, as demonstrated by improved test results after course completion.

Keywords: geriatric trauma, continuing medical education, aging population, geriatric medicine, geriatric fracture

Introduction

Lifelong learning is a vital element of the continuum of medical education,1 from graduation through to retirement. Continued advances in surgical techniques and equipment mean that surgical education must continually evolve and adopt new methods.

Given the time constraints of clinical practice, it is critical that all postgraduate educational activity is directed toward gaps in knowledge to improve the quality of learning.2 Continuing Medical Education (CME) activities are a cornerstone of the process of lifelong learning. The Institute of Medicine (IOM) issued a report3 in 2010 that recommended development of “a better system … to identify effective learning activities based on standardized, transparent pilot testing and evaluation of resultant learning, performance improvement and patient outcomes.”

The Accreditation Council for Graduate Medical Education (ACGME) in its “Educating Physicians for the 21st Century” program2 has also identified the need for outcome measure-focused learning and highlighted the need to identify competencies, implement an assessment system, and develop a competency-based curriculum with an evaluation plan. Moore et al4 in 2009 highlighted the need for CME activities to be planned according to principles that can demonstrate improvement in physician competence, physician performance, or patient health status, and for CME to be based on assessed need.5,6

The Learning Assessment Toolkit7 is a published set of educational instruments specifically designed to ensure the goals listed above are prospectively identified and subsequently addressed during educational events. This is accomplished through the development of specific competencies (see Glossary Box for definition) for a targeted area of practice and the assessment of knowledge gaps (precourse and postcourse knowledge testing). The authors of the Learning Assessment Toolkit freely encourage use of their techniques, and we now outline the results of applying these techniques to 3 AOTrauma orthogeriatric fracture courses.

The resulting educational research aimed to provide data on the implementation of this particular toolkit (subjective and objective measurement of needs, commitment to change practice survey, etc) to evaluate the impact of an educational activity with practicing physicians.

Background

The Learning Assessment Toolkit was initially designed and tested on a fracture management course aimed at residents and trainees in their first years of orthopedics. The AOTrauma orthogeriatrics course is founded on the same principles of improving patient care on a foundation of scientific knowledge. However, the orthogeriatrics course is directed at more advanced practitioners (most having more than 15 years of practice experience), with a curriculum dedicated to improved medical, surgical, and preventative management of fragility fractures in elderly individuals. The orthogeriatrics course also differs in that, in addition to the majority audience of orthopedic surgeons, both faculty and participants include geriatricians and other medical specialists. Given the experience and needs of the participants, and in particular that this is a cross-specialty forum, it is vital that participants' expectations and learning needs are anticipated and met.

After publication of the Learning Assessment Toolkit,7 a decision was made to trial these educational methods at 3 AOTrauma orthogeriatrics courses offered during 2010. These courses were offered in 3 different countries, varied in duration from 2.5 to 4 days, and were delivered by expert international faculty. An expert panel (see Glossary Box for definition) was assembled including physician educators, educationalists, and a sampling of physician learners. This group identified 10 competencies for the overall program on orthogeriatrics (Table 1). Each competency (eg, “recognize comorbidities and polypharmacy,” “prevent, identify, and treat complications”) describes what the surgeon must be able to do in order to address the patient problems and to provide optimal care in this area of practice. For assessing knowledge in the various topics, a multiple choice question (MCQ) bank of 40 questions (4 per competency) was created with the preferred answers and rationale for the correct answer supported by published literature. These questions were developed and refined by the expert panel, including 4 educationalists, a group of senior surgeons, and an expert in question psychometrics.

Table 1.

Competencies for the AOTrauma Orthogeriatrics Program

Correctly fix fragility fractures as indicated
Adapt treatment in line with aging
Recognize comorbidities and polypharmacy
Prevent, identify, and treat complications
Address secondary prevention (osteoporosis, falls)
Restore function early
Apply the basic science of fracture fixation and biology of aging bone in fixation
Initiate and comanage patient care with those involved in the health care team
Build a system of care appropriate to fragility fracture care
Seek to restore and preserve functionality, independence, and quality of life

Additionally, the expert panel developed a questionnaire to evaluate the participants' self-reported gap between their present level of knowledge and ability and their desired level of knowledge and ability, on a Likert scale of 1 to 5 for each of the 10 competencies. This difference between the desired level and the present level is referred to as the gap score and offers an indication of the participants' motivation to learn.

In order to validate the course competencies, a Change Readiness Inventory (CRI)8 survey was conducted with a random sample of orthopedic trauma surgeons from the AOTrauma database who practice in North America, Latin America, Europe, the Middle East, and Asia-Pacific to determine the level of gaps in the 10 competencies. The data obtained from this survey were used to guide the design of the planned educational experience for the AOTrauma orthogeriatrics courses.

Methods

The protocol for the survey questionnaire and the assessment items was reviewed and approved by the local Research Subjects Review Board.

The study population consisted of practicing orthopedic surgeons with 1 to 35 years of practice experience, as well as geriatricians and other internal medicine physicians attending one of the courses. Many of the surgeons had selected the course for the purpose of CME. An online precourse self-assessment was undertaken by participants attending 3 AOTrauma orthogeriatrics courses run between September 2010 and December 2010.

The following program components were implemented for each course:

  1. The competencies, which guided the faculty in developing content

  2. An online participant self-assessment before the course

  3. A precourse data report for faculty on the needs of the participants

  4. A practice change survey on the last day of the course, with 3-month follow-up

  5. An online participant self-assessment after the course

  6. An online forum with cases, questions, and resources before and after the course

All self-assessment questions were administered as an online survey. Participants registered for the courses were sent an e-mail 4 weeks before their course date asking them to complete a precourse self-assessment, with a live link to the Web site hosting the questions; 2 follow-up reminder e-mails were also sent. The e-mail specifically explained that “the information gathered will help the chairs and faculty to prepare the course to meet your needs,” that it was not an individual examination, and that the results would only be viewed in aggregate.

Precourse Assessment

Gap Assessment

On accessing the link, participants were presented with a list of the 10 competencies (Table 1) and asked to state, for each one, their:

  • Present level of ability: What is your current level of expertise?

  • Desired level of ability: At what level would you like to be able to apply these competencies within your practice?

The response to each question was scored from 1 to 5 on a Likert scale, with 5 indicating the highest level. The difference between the 2 responses for each competency is defined as the “gap score.” In addition to these questions, participants also provided information on their years in orthopedic practice and the number of geriatric fracture cases they treat per year.
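As a worked illustration of the scoring just described, the short sketch below computes per-competency gap scores and a cohort mean. The data structures and function names are our own and are not part of the published toolkit; the sample ratings are hypothetical.

```python
from statistics import mean

# Illustrative responses: for each participant, competency -> (present level, desired level),
# both rated on the 1-5 Likert scale described above. Values here are hypothetical.
responses = [
    {"Address secondary prevention": (2, 4), "Build a system of care": (2, 4)},
    {"Address secondary prevention": (3, 5), "Build a system of care": (2, 5)},
]

def gap_score(present: int, desired: int) -> int:
    """Gap score for one competency: desired level minus present level."""
    return desired - present

def mean_gap_by_competency(responses: list) -> dict:
    """Average gap score across respondents, computed per competency."""
    competencies = responses[0].keys()
    return {
        c: mean(gap_score(present, desired) for present, desired in (r[c] for r in responses))
        for c in competencies
    }

print(mean_gap_by_competency(responses))
# Mean gaps: Address secondary prevention -> 2, Build a system of care -> 2.5
```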

Clinical Knowledge Test

On accessing the link to the questionnaire, candidates were advised that “This is not an exam or a test. Topics in this assessment will be discussed during the course. All data are examined from the participants as a group and not analyzed by individual participant.” The clinical knowledge questions were designed by the previously described expert panel to assess knowledge corresponding to the previously identified competencies. Each question comprises a clinical vignette, a question, and 4 answer options. A total of 20 questions (2 questions per competency, with each question rated by the expert group as either easy or difficult) were presented to participants in the precourse self-assessment. Upon completion of the questions, candidates had the opportunity to review the correct answers and were provided with an explanation and references from the current literature.
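As an illustration of how anonymous multiple-choice responses can be aggregated into the per-competency percentages reported later (Table 3), a minimal sketch follows. The question identifiers, answer key, sample responses, and function name are hypothetical.

```python
from collections import defaultdict

# Hypothetical answer key: question id -> (competency, correct option).
answer_key = {
    "Q1": ("Restore function early", "B"),
    "Q2": ("Restore function early", "D"),
    "Q3": ("Address secondary prevention", "A"),
    "Q4": ("Address secondary prevention", "C"),
}

# One dict of chosen answers per respondent (anonymous, analyzed only in aggregate).
responses = [
    {"Q1": "B", "Q2": "A", "Q3": "A", "Q4": "C"},
    {"Q1": "C", "Q2": "D", "Q3": "A", "Q4": "B"},
]

def percent_correct_by_competency(answer_key: dict, responses: list) -> dict:
    """Percentage of correct answers per competency across all respondents."""
    correct = defaultdict(int)
    attempted = defaultdict(int)
    for r in responses:
        for qid, chosen in r.items():
            competency, right = answer_key[qid]
            attempted[competency] += 1
            correct[competency] += chosen == right
    return {c: 100 * correct[c] / attempted[c] for c in attempted}

print(percent_correct_by_competency(answer_key, responses))
# {'Restore function early': 50.0, 'Address secondary prevention': 75.0}
```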

Precourse

Prior to the start of each course, the faculty convened to discuss the precourse assessment results and to focus the course on the areas identified by the gap scores and clinical knowledge test results. Methods of integrating these participant issues into the predefined curriculum were based on areas of greatest need as determined by the largest precourse gap and lowest knowledge test results.
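One way to make “areas of greatest need” concrete is to combine a large precourse gap with a low knowledge test score into a single ranking. The sketch below is our illustration of that idea using the US surgeon precourse figures from Tables 2 and 3; the need() function and the 50/50 weighting are hypothetical and are not prescribed by the toolkit.

```python
# Precourse mean gap score (difference on the 1-5 scale) and knowledge test score (% correct)
# per competency; the figures follow the US surgeon cohort in Tables 2 and 3.
precourse = {
    "Address secondary prevention": {"gap": 1.8, "test": 10},
    "Build a system of care": {"gap": 2.2, "test": 55},
    "Restore function early": {"gap": 1.3, "test": 35},
    "Comanage patient care": {"gap": 1.3, "test": 70},
}

def need(stats: dict) -> float:
    # A larger gap and a lower test score both raise the need for course emphasis.
    # Gap is normalized by its maximum possible value (4); the equal weighting is arbitrary.
    return 0.5 * (stats["gap"] / 4) + 0.5 * (1 - stats["test"] / 100)

for competency in sorted(precourse, key=lambda c: need(precourse[c]), reverse=True):
    print(f"{competency}: need = {need(precourse[competency]):.2f}")
# "Address secondary prevention" ranks first (large gap, lowest test score).
```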

Course

Minor alterations in the course curriculum were implemented based on the needs assessment and included emphasizing the areas of greatest participant need with small-group case discussions and didactic lectures.

On the final course day, participants were asked to complete a 2-question “commitment to change” survey asking whether they would change their clinical practice upon returning home as a result of participating in the course and to rate their level of commitment to implementing this change. This tool is widely used and is based on the experience that physicians who make a commitment to change their practice behavior are more likely to make changes.9-11 In addition, this survey tool provides some information to help assess whether the educational activity changes participants at the performance level of outcome assessment, identified by Miller12 as what the surgeon “does” in practice. It also provides additional valuable information to guide the design of future educational events.

Postcourse Assessment

The self-assessment process was repeated after the course, with the candidates being invited by e-mail to respond to a different set of 20 clinical knowledge questions and the same gap assessment. The online assessment became available 1 day after the course and remained open for 3 weeks. Three months after the course, an e-mail was sent specifically enquiring whether the participants had been able to make their intended practice change. Participants categorized each intended change as “fully implemented,” “partially implemented,” or “not implemented.” For the changes that were implemented, participants were asked how many patients they believed had benefited. For the intended changes that had not been implemented, participants were asked to identify the barriers to implementation.

Results

During the study period, 3 AOTrauma orthogeriatrics courses with a total of 151 participants were held at different international sites (United States, South Korea, and Switzerland). The groups were predominantly orthopedic surgeons (85%), while the US event was also attended by a group of geriatricians/internal medicine physicians. Most participants were practicing surgeons at the attending or consultant level, and some were residents in training/registrars (Figure 1A and B).

Figure 1. A, Years of practice as reported by course attendees. B, Average number of geriatric fracture cases treated per year by attendees.

The voluntary response rates to the precourse and postcourse questionnaires were highest at the US course, with a 57% precourse and 60% postcourse response rate from the surgeons, and a 60% precourse and 66% postcourse response from the medical physicians. The surgeons at the South Korea course had a 33% precourse and 33% postcourse response rate, while surgeons attending the international course in Switzerland had a 42% precourse and a 19% postcourse response rate.

The results of each course were compiled in order to demonstrate the differences between the precourse and postcourse gap scores (Table 2 and Figure 2) and the clinical knowledge question scores (Table 3). Gap and test scores for each competency are shown for each cohort, as well as the mean value across all competencies for each cohort. Competencies with high motivation and high test scores indicated that participants were eager to learn but that prior knowledge was already good, implying that these topics required a different approach; that is, dedicating formal lecture time to them was not merited.

Table 2.

Precourse and Postcourse Gap Scores (With Calculated Postcourse Reduction in the Gap Scores)a

US (surgeons)
US (medicine physicians)
Korea (surgeons)
Switzerland (surgeons)
Competency Before course (N = 24 [57%]) After course (N = 26 [60%]) Decrease in gap Before course (N = 9 [60%]) After course (N = 10 [66%]) Decrease in gap Before course (N = 12 [33%]) After course (N = 12 [33%]) Decrease in gap Before course (N = 24 [42%]) After course (N = 11 [19%]) Decrease in gap
Correctly fix fragility fractures as indicated 1.3 1.1 –0.2 0.8 0.5 –0.3 1.5 1.1 –0.4 1.6 1 –0.6
Adapt treatment in line with aging 1.7 1.3 –0.4 1.2 0.9 –0.3 1.7 1 –0.7 1.8 1.2 –0.6
Recognize comorbidities and polypharmacy 1.4 1.1 –0.3 0.8 0.7 –0.1 1.9 1.2 –0.7 1.8 1.5 –0.3
Prevent, identify, and treat complications 1.4 1 –0.4 1.2 1.1 –0.1 1.7 0.9 –0.8 1.5 1.2 –0.3
Address secondary prevention (osteoporosis, falls) 1.8 1.3 –0.5 1.1 1.1 0 2 1 –1 1.8 1 –0.8
Restore function early 1.3 1.1 –0.2 1 0.9 –0.1 1.6 0.8 –0.8 1.4 1 –0.4
Apply the basic science of fracture fixation and biology of aging bone in fixation 1.8 1.2 –0.6 0.8 1.1 0.3 2 1 –1 1.8 1 –0.8
Initiate and comanage patient care with those involved in the health care team 1.3 1.2 –0.1 1.5 0.9 –0.6 2.1 1.5 –0.6 1.9 1.4 –0.5
Build a system of care appropriate to fragility fracture care 2.2 1.5 –0.7 1.7 1.4 –0.3 2.2 1.5 –0.7 2.1 1.8 –0.3
Seek to restore and preserve functionality, independence, and quality of life 1.4 1.1 –0.3 1.2 0.6 –0.6 2 1 –1 1.6 1.3 –0.3
Mean 1.56 1.19 –0.37 1.13 0.92 –0.21 1.87 1.1 –0.77 1.73 1.24 –0.49

a A gap score of <1 represents low motivation to learn, a score between 1 and 2.5 indicates good motivation, and a score of >2.5 may indicate that learners are anxious about their low level of ability.
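The interpretation bands in the footnote above can be expressed as a small helper function; this is an illustrative sketch only, with the function name chosen here for convenience.

```python
def interpret_gap_score(gap: float) -> str:
    """Map a mean gap score to the interpretation bands given in the Table 2 footnote."""
    if gap < 1:
        return "low motivation to learn"
    if gap <= 2.5:
        return "good motivation to learn"
    return "possible anxiety about a low level of ability"

print(interpret_gap_score(1.56))  # good motivation to learn (US surgeons, precourse mean)
```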

Figure 2. Example of a precourse data report.

Table 3.

Pre- and Postcourse Test Scores (Based on 2 Multiple Choice Questions per Competency), With Calculated Changesa

US (surgeons)
US (medicine physicians)
Seoul (surgeons)
Switzerland (surgeons)
Competency Before course (N = 24 [57%]) After course (N = 26 [60%]) Change Before course (N = 9 [60%]) After course (N = 10 [66%]) Change Before course (N = 12 [33%]) After course (N = 12 [33%]) Change Before course (N = 24 [42%]) After course (N = 11 [19%]) Change
Correctly fix fragility fractures as indicated 55% 70% +15% 40% 25% –15% 50% 70% +20% 50% 45% –5%
Adapt treatment in line with aging 75% 70% –5% 65% 80% +15% 60% 70% +10% 60% 40% –20%
Recognize comorbidities and polypharmacy 50% 45% –5% 65% 60% –5% 35% 45% +10% 45% 55% +10%
Prevent, identify, and treat complications 75% 35% –40% 40% 35% –5% 90% 30% –60% 85% 60% –25%
Address secondary prevention (osteoporosis, falls) 10% 90% +80% 15% 55% +40% 30% 75% +45% 25% 80% +55%
Restore function early 35% 95% +60% 40% 90% +50% 35% 90% +55% 60% 85% +25%
Apply the basic science of fracture fixation and biology of aging bone in fixation 35% 70% +35% 35% 55% +20% 45% 55% +10% 45% 90% +45%
Initiate and comanage patient care with those involved in the health care team 70% 80% +10% 90% 95% +5% 85% 85% +0% 80% 75% –5%
Build a system of care appropriate to fragility fracture care 55% 100% +45% 50% 95% +45% 15% 100% +85% 30% 95% +65%
Seek to restore and preserve functionality, independence, and quality of life 90% 70% –20% 75% 55% –20% 80% 80% +0% 65% 75% +10%
Mean 55% 73% +18% 52% 65% +13% 53% 70% +18% 55% 70% +16%

a Negative numbers indicate a decrease in the clinical knowledge test score from precourse to postcourse; positive numbers indicate an increase.

Gap Score Results

The precourse gap scores were generally highest for 2 competencies (“address secondary prevention” and “build a system of care”), indicating a higher level of motivation to learn in these topics, and lowest for a variety of competencies (eg, “restore function early” and “comanage patient care” in the US surgeons group), indicating lower motivation to learn in these competencies. These precourse gap scores guided adaptations in the course structure, as shown in the examples in Table 4. Postcourse gaps were reduced by an average of 0.37, 0.21, 0.77, and 0.49 across all 10 competencies in the 4 cohorts (US surgeons, US physicians, Korean surgeons, and surgeons at the course in Switzerland), respectively. Large postcourse gap score improvements were seen in “address secondary prevention” and “build a system of care” in many of the cohorts.

Table 4.

Examples of Educational Suggestions for Addressing Gaps and Knowledge Test Scores on US Course

High motivation/good test score High motivation/poor test score
 • Affirmation of current practices  • Emphasize in lectures
 • Optional educational lunch to discuss issues  • Emphasize in practical exercises
 • Keynote lecture
Low motivation/good test score Low motivation/poor test score
 • Demonstrate and add case examples  • Emphasize topics in lectures
 • Multiple talks to repeat the concept  • Highlight in cases during small-group discussions
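The logic of Table 4 amounts to a two-axis classification of each competency by motivation (gap score) and prior knowledge (test score). The sketch below is a hypothetical rendering of that mapping; the thresholds (gap ≥ 1 for high motivation, ≥ 50% correct for a good score) are our assumptions, and the action strings paraphrase, rather than quote, the table.

```python
def quadrant(gap: float, test_pct: float,
             gap_threshold: float = 1.0, test_threshold: float = 50.0) -> str:
    """Classify a competency into one of the four cells of Table 4 (assumed thresholds)."""
    motivation = "High motivation" if gap >= gap_threshold else "Low motivation"
    score = "good test score" if test_pct >= test_threshold else "poor test score"
    return f"{motivation}/{score}"

# A paraphrase of the suggested actions in Table 4 (not a verbatim mapping).
suggested_action = {
    "High motivation/good test score": "Affirm current practices; optional educational lunch to discuss issues",
    "High motivation/poor test score": "Emphasize in lectures and practical exercises",
    "Low motivation/good test score": "Demonstrate and add case examples; repeat the concept across multiple talks",
    "Low motivation/poor test score": "Emphasize topics in lectures; highlight in small-group case discussions",
}

# US surgeon precourse figures for "Address secondary prevention": gap 1.8, test score 10% (Tables 2 and 3).
cell = quadrant(1.8, 10)
print(cell, "->", suggested_action[cell])
# High motivation/poor test score -> Emphasize in lectures and practical exercises
```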

Clinical Knowledge Test Results

Competencies with the lowest precourse knowledge test scores were identified in each cohort, as illustrated in Table 3; in the US surgeons group, for example, these were “restore function early,” “address secondary prevention,” and “apply the basic science of fracture fixation and biology of aging bone.” These low test scores highlighted the need for faculty to consider increasing the focus on these topics in the delivery of the course content.

Competencies showing the greatest improvements in the postcourse scores were “restore function early” and “address secondary prevention.” Overall, the participants' postcourse knowledge improved compared with the precourse level for all 3 course cohorts, with a greater improvement noted in the surgeon groups compared with the physician group.

Commitment to Change Results

A commitment to change survey can identify what learners take away from an activity and intend to use in practice, while enabling an assessment of the impact on clinical practice. Table 5 shows participants' intended changes from the US course, highlighting the differing priorities and ambitions of surgeons and geriatricians. The key data from the application of the commitment to change survey in the courses in this study are outlined in Table 6. The response rate at the end of each course was approximately 50% for all 3 events, and the response rates for the 3-month follow-up ranged from 29% to 63% of those who completed the first survey. For the 3 courses combined, a total of 162 intended practice changes covering a wide variety of surgical, medical, and system of care issues were committed to by 114 participants (including faculty).

Table 5.

Participants' Intended Changes From One Course

Topic areas where change of practice is intended Surgeons (n = 28) Geriatricians (n = 23)
Osteoporosis, metabolic bone, and so on 7 5
Vitamin D 8 2
Comanaged care (or some part of teamwork, communication, etc) 3 8
Surgical techniques, timing, weight bearing, and so on 12 2
Develop geriatric fracture center program 6 1
Misc medical/medication issues 2 5
Pre/postoperative prep and orders 2 6
Hydration 1 2
Delirium 3

Table 6.

“Commitment to Change” Survey Data From the 3 Courses

US course South Korea course Switzerland course
[Charts of commitment to change survey responses and implementation status for each course; images not reproduced in this text version.]
Estimated number of patients who have benefited 213-241 190 70-90

As illustrated in Table 6, at 3-month follow-up, a total of 15 intended practice changes had been “fully implemented” and 32 had been “partially implemented.” Various barriers were identified where participants were unable to implement the intended change by the end of the 3-month period (eg, lack of time, a need to involve more of the local health care team, and lack of “buy-in”). The participants from all 3 events self-reported that a total of 521 patients had benefited from the practice changes they had made as a result of attending the AOTrauma orthogeriatric fracture course.

Discussion

Given the recommendations by both the Institute of Medicine3 and the Accreditation Council for Graduate Medical Education,2 the publication of the Learning Assessment Toolkit provides a concise, timely, and cost-effective approach to implementing an outcome-focused, measured educational process. By following this method, we demonstrate an improvement in participants' gap scores and an increase in their knowledge test scores as a result of attending one focused course.

Though participant numbers in our study are small, the measurable difference before and after the course is indicative of improvement and of achieving the defined learning goals within the competencies measured. There are some instances where the postcourse results showed a decrease in the knowledge score. This can in part be attributed to our use of only 2 questions per competency, chosen in an effort to increase the participant response rate. Additional data collection is planned for future AOTrauma orthogeriatrics courses to enable further study of the techniques of the learning assessment toolkit and to further evaluate regional differences, as well as differences between physician and surgeon participants.

The data obtained on the US course correlated with the mandatory CME reports generated by the course chairman and evaluator, as well as with the standard ACCME questionnaires administered to all participants, reflecting the reported impression of an improvement in the quality of the course.

We followed the components of the learning assessment toolkit with one exception. de Boer et al7 advise that, for the precourse assessment, each half of the group receives one of 2 different sets of questions and that, for the postcourse assessment, each individual receives the other set, thus removing any “test–retest” bias while still utilizing the same questions. This approach would have required us to identify respondents to ensure no crossover between the groups; because responses in our implementation were voluntary and anonymous, a completely different set of questions was used postcourse to eliminate test–retest bias. Data on the performance of each question are analyzed so that any question that is too easy or too difficult is identified for editing to improve its validity and reliability. Questions not meeting high standards will be identified and replaced in an ongoing quality improvement process.

Einstein stated, “In the middle of difficulty lies opportunity.” The correlation between the gap scores (motivation to learn) and the knowledge test results in each competency gave course leaders the opportunity to make adjustments and to focus on learning opportunities within the course schedule as indicated (Table 4).

We readily acknowledge that one limitation of our method is the fact that the pre- and postcourse data may not be from the same participants. Given late registration for the course and also higher postcourse response rates for some centers, we recognize this as a weakness in our data collection. The orthogeriatrics expert panel will explore various methods to increase participation rates overall and to ensure that participants understand that the assessments are a valuable component of the overall course.

The data from our first implementation of the commitment to change survey suggest that this tool may be appropriate for assessing whether participation in our educational activities results in benefits to patient care. Our early results are limited by participation rates and their self-reported nature, and further investigation of this educational strategy seems warranted. The barriers to change identified by the participants (lack of time, a need to involve more of the local health care team, lack of “buy-in”) offer future participants the opportunity to discuss and prepare for the obstacles they are likely to encounter, improving the likelihood of successful implementation.

Conclusion

As Calman et al stated, “In considering how best to improve the quality of medical education and individual doctor’s educational experience, it is important not to lose sight of the overall objective which must be about improving patient care.”1 The Learning Assessment Toolkit helps us achieve this goal by demonstrating to the course faculty areas of participant perceived needs, both objective need through the clinical question/knowledge test and subjective need through the gap score. The faculty can then focus and adapt the curriculum as indicated, with the success of the course supported by the improvements in the postcourse question scores.

Also, while each course is improved using the precourse results associated with that specific event, the summary results provide cumulative data trends and information to help the design of future courses. This method also ultimately helps improve patient care as suggested by the follow-up responses to the commitment to change questionnaire.

The main advantages and take-home points we have established from our use of the learning toolkit are the following:

  • It enables course designers to ensure all educational effort is tailored to knowledge gaps. Clear understanding of the learners' needs cannot be established without this type of assessment process.

  • The learning toolkit method is focused on the goal of better patient care, rather than CME or industry incentives.

Thus, we demonstrate, through our early experience in implementation of the learning assessment toolkit, improvements in the AOTrauma orthogeriatric fracture course.

Glossary

Competency The unique combination of knowledge, skills, and attitudes that enables one to perform in practice; describes what a physician must be able to do to diagnose and treat patients with a specific clinical problem or issue
Expert panel Educational group comprising experts, educationalists, and target learners
Gap score The difference between a participant's self-reported “Desired level” of ability minus his or her “Present level” for a given competency; provides an indication regarding motivation to learn
Likert scale 5-point self-report scale (1 = low, 5 = high)
Learning assessment toolkit A set of learning and assessment instruments developed by the AO to provide insights into the effectiveness of their educational activities

Footnotes

Michael Cunningham, Ph.D. is an employee of the AO Foundation.

Stephen Kates, MD has received unrelated research grant support from the AO Foundation and Synthes USA.

References

  • 1. Calman KC, Temple JG, Naysmith R, Cairncross RG, Bennett SJ. Reforming higher specialist training in the United Kingdom—a step along the continuum of medical education. Med Educ. 1999;33(1):28–33.
  • 2. ACGME Outcome Project. http://www.acgme.org/outcome/e-learn/e_powerpoint.asp
  • 3. Institute of Medicine. Redesigning Continuing Education in the Health Professions. Washington, DC: Institute of Medicine; 2010.
  • 4. Moore DE Jr, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1–15.
  • 5. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995;274(9):700–705.
  • 6. Fox RD, Mazmanian PE, Putnam RW. Changing and Learning in the Lives of Physicians. New York, NY: Praeger; 1989.
  • 7. de Boer PG, Buckley R, Schmidt P, Fox R, Jupiter J. Learning assessment toolkit. J Bone Joint Surg Am. 2010;92(5):1325–1329.
  • 8. Fox RD. Using theory and research to shape the practice of continuing professional development. J Contin Educ Health Prof. 2000;20(4):238–246.
  • 9. Lockyer JM, Fidler H, Ward R, Basson RJ, Elliott S, Toews J. Commitment to change statements: a way of understanding how participants use information and skills taught in an educational session. J Contin Educ Health Prof. 2001;21(2):82–89.
  • 10. Wakefield JG. Commitment to change: exploring its role in changing physician behavior through continuing education. J Contin Educ Health Prof. 2004;24(4):197–204.
  • 11. White MI, Grzybowski S, Broudo M. Commitment to change instrument enhances program planning, implementation, and evaluation. J Contin Educ Health Prof. 2004;24(3):153–162.
  • 12. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 suppl):S63–S67.
