Abstract
Training in quality improvement (QI) and patient safety for clinicians is needed for continued progress in health care quality. A project-based QI curriculum that has trained faculty, residents, and staff in an academic health center for >10 years is reviewed and evaluated. The didactic curriculum covers QI knowledge domains, and QI methods are applied to a project during the course. There have been 638 graduates and 239 projects since implementation. Most projects (84%) effected behavior change, changes in clinical practice, and benefits to patients. Faculty have used the training to develop formal QI programs for Graduate Medical Education (GME). Graduates value the skills for their professional and personal lives and for career enhancement. Experiential QI training for practicing professionals is valuable and effective. Collaboration and support from stakeholders are key factors in success. The Clinical Safety & Effectiveness course is a reproducible and relevant model of interprofessional QI education for practicing professionals and staff.
Keywords: quality improvement, quality improvement education, medical education, health system science
Introduction
Progress has been made in the 2 decades since the Institute of Medicine’s (IOM) report, To Err is Human: Building a Safer Health System,1 but challenges remain. When this course was implemented 10 years ago, the National Health Care Quality Report showed slow improvement.2 The 2018 National Health Care Quality and Disparities Report showed improvement for many person-centered care and patient safety measures3; however, fewer than half of effective treatment measures, only 30% of care coordination measures, and no affordability measures were improving. Training front-line professionals, clinical leadership, and trainees in quality improvement (QI), patient safety, and health care value is necessary to effect change. QI skills have only recently been routinely included in medical training, and practicing professionals need an organized effort to learn these skills.4 Project-based, experiential learning in QI is a challenge for practicing faculty and staff because of the time commitment and resources needed.
A recent review found that only about half of published QI curricula report clinical process or outcome results.5 The same review found that the number of curricula requiring QI projects with clinical process or outcome metrics decreased from 2005 to 2013 compared with the previous decade. Although many QI curricula have been published, few provide outcomes of behavioral change, changes in clinical practice, and benefits to patients.5,6
The Clinical Safety & Effectiveness (CSE) course at UT Health San Antonio (UTHSA) Long School of Medicine (LSOM) is described, including its process, outcomes, and adaptations over 10 years, as an effective, attainable, and sustainable model of QI training for practicing professionals in an academic medical center with its major health system partner, University Health (UH). UH is a nationally recognized 700-bed teaching hospital and network of outpatient health care centers, owned by the people of Bexar County.
Methods
The activities associated with this article were reviewed by the UTHSA IRB and were determined to not require IRB approval under DHHS regulations at 45 CFR 46.
In 2005, the University of Texas M.D. Anderson Cancer Center (MDACC) began a project-based QI course, the CSE course, based on the Intermountain Healthcare Advanced Training Program initiated by Brent James, MD.7 The MDACC CSE course was successful, and in 2007 the then Executive Vice Chancellor for Health Affairs in the University of Texas (UT) System, Dr Kenneth Shine, supported implementation of the course throughout UT System health centers.8 Each UT health campus developed its own adaptation of the course. In 2009, the CSE course was implemented at the LSOM campus. Support for the course transitioned from the UT System to the local campus in 2015. The LSOM CSE course goals were to train a core of clinical leaders in QI methods and to improve quality and safety through project-based learning.
The interprofessional course is open to academic faculty and staff of LSOM and to employees of UH, the supporting stakeholders of the course, and is provided free of charge to these participants. Resident and fellow trainees participate with program director approval. Current projects are based at UH (hospital or community clinics) or at the LSOM outpatient clinics and cancer center. The course has been open to external participants, including a community-based long-term care facility, a federally qualified health center, and other hospitals, when space was available; tuition was required for those participants.
The LSOM Office of Continuing Medical Education (CME) operates the course. The Associate Dean for Quality and Lifelong Learning is the course director. She has been involved in adult health care education for >30 years, is a certified health care continuing professional development professional (CHCP), and has led the LSOM CME program for >10 years. A contracted QI consultant/educator with a master’s degree in education and many years of experience in health care QI education serves as course faculty and as the QI coach coordinator. A senior CME conference coordinator supports the course. The curriculum was developed using the Kern method of curriculum development,9 with topics and skills for successful QI as shown in Table 1.
Table 1. Curriculum Development for the CSE Course Using the Kern Method

| Kern step | Application in the CSE course |
|---|---|
| Problem identification and general needs assessment | General: need for continued improvement of health care and training for health care workers in quality improvement |
| | Specific project: identified problem/gap in a clinical care or health care educational program |
| Targeted needs assessment | Participants: the nominated participant is assessed for training, experience, appropriateness, and desire for training |
| | Project: Is the project a priority for the institution? Feasible within the timeframe? Quality improvement and not research? |
| Goals and objectives | Aim statement, QI tools (process flow and cause-effect diagram), baseline data, intervention, postintervention data, return on investment, analysis, future plans, reflective group presentation |
| Educational strategies | Didactic lectures, interactive group learning, QI coach facilitation, team development, teamwork, periodic team updates using storyboards, final group presentation |
| Implementation | Identification of funding from stakeholders; identification and recruitment of national, regional, and local experts and QI coaches; work with health systems to recruit projects and participants |
| Evaluation and feedback | Teams are assessed monthly for progress in a large group setting. QI coaches assess progress in team meetings. Opportunities for feedback at each monthly session and in evaluations. Final presentation documents required elements and reflection |
Curriculum topics are outlined in Table 2 and include the 8 knowledge domains recommended by the Institute for Healthcare Improvement for competency in health care improvement and for inclusion in QI curricula.10,11 The curriculum framework is Kolb’s Experiential Learning Theory,12 since the cycle of Abstract Conceptualization (Think) – Active Experimentation (Plan) – Concrete Experience (Do) – Reflective Observation (Observe) closely parallels the improvement model used in the course, Plan-Do-Study-Act (PDSA).13 The 7 didactic course days are spread over several months to allow implementation and completion of the actual QI project. Course participants form a QI project team, including team members who are not taking the course, and conduct team meetings in their health care microsystem; these meetings are independent of the didactic course sessions.
Table 2. CSE Course Curriculum Topics and Faculty

| Day | Topic | Faculty |
|---|---|---|
| Day 1 | Course introduction, quality and safety overview | Local course director |
| | Introduction to quality improvement tools: aim statement, process flow, team effectiveness | National expert, experienced in QI tools and projects |
| Day 2 | Understanding data variation | QI statistician/industrial engineer |
| Day 3 | The state of health care system quality in the United States; quality and cost relationship | National expert in health care quality |
| Day 4 | Adaptive leadership: effecting change in the health care system | National expert in leading health care change |
| Day 5 | Lean in health care | National expert in Lean health care |
| Day 6 | Root cause analysis and failure mode and effects analysis | Local expert |
| | TeamSTEPPS | Local expert |
| | Building safety culture: lessons from aviation | Local military expert |
| | Measuring error in health care | National patient safety expert |
| | CSE alumni project presentation | Local CSE alumni |
| | Return on investment | Local health care financial analyst |
| Day 7 | Graduation: all teams present projects, with stakeholders, sponsors, and team members in attendance | All course participants |

Abbreviations: CSE, Clinical Safety and Effectiveness Course; QI, quality improvement; TeamSTEPPS, Team Strategies and Tools to Enhance Performance and Patient Safety.
Course faculty include national and local experts. National experts inspire and demonstrate the institutional commitment and gravity of the content. Local experts are CSE course alumni and local resources for QI expertise.
The QI consultant/educator coordinates the QI coaches, each of whom facilitates 1–3 project teams. The coaches are CSE alumni who have led successful QI projects, want further QI experience, and participate with their supervisor’s approval. The QI coaches check in with the teams to ensure that team meetings occur, participants stay involved, and milestones are achieved. They assist in problem solving when barriers arise, help with data analysis and QI software, and encourage accountability. A training program was implemented for the internal QI coaches to provide additional training in adaptive leadership and QI facilitation.14
Course participants and projects are nominated by clinical leadership: UTHSA LSOM department chairs and division chiefs, and UH leadership, including the chief medical officer and nursing leadership. Self-nominations are accepted, but projects must be approved by clinical leadership. Projects aim to improve at least one dimension of health care as defined by the IOM in 2001: safe, timely, effective, efficient, equitable, and patient-centered.15 The participant’s supervisor signs a nomination form acknowledging the course time commitment, and the participant signs a commitment form covering the course schedule, requirements for team meetings, and graduation requirements. A typical course cohort includes 30–40 participants and 10–12 projects.
Requirements for course graduation include attendance, team meeting participation, and implementation of PDSA and QI tools, including an aim statement, process flow, cause-effect diagram, intervention, data collection, statistical process control chart (or appropriate equivalent), projected return on investment (ROI) calculation, and sustainment plans. An online platform contains a template for each course requirement, and participants are given a schedule of when each requirement is due. Project team report-outs were implemented at each class session to report on progress with requirements, obstacles, and reflections.
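For readers unfamiliar with control charts, the following is a minimal sketch of how the limits for one common variant, an individuals (XmR) chart, might be computed from project data; the weekly values are hypothetical, and this illustrates the general technique rather than any tooling used in the course.

```python
# Minimal sketch: center line and control limits for an individuals (XmR) chart.
# The weekly_values below are hypothetical project measurements.
from statistics import mean

weekly_values = [12, 15, 11, 14, 18, 13, 16, 12, 17, 14]

center_line = mean(weekly_values)
moving_ranges = [abs(b - a) for a, b in zip(weekly_values, weekly_values[1:])]
mr_bar = mean(moving_ranges)

# 2.66 is the standard constant that converts the average moving range
# into approximate three-sigma limits for an individuals chart.
ucl = center_line + 2.66 * mr_bar
lcl = center_line - 2.66 * mr_bar

print(f"CL={center_line:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
```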
Each course participant takes part in the project presentation at course graduation, to acknowledge their contribution and enable self-reflection. Course graduation is attended by project sponsors, course stakeholders, and project team members outside the course. Selected project presentations are posted on the CME website to facilitate reference and spread of best practices.16 Course participants receive a framed certificate and lapel pin at graduation. Physicians are eligible for AMA PRA Category 1 CME credit corresponding to the number of didactic hours attended, and for 20 hours of AMA PRA Category 1 Performance Improvement CME if requirements are met. The LSOM CME department joined the American Board of Medical Specialties Portfolio Program17 in 2013, which enables approval of projects for Maintenance of Certification Part IV credit for participating specialties.
Course expenses include honoraria and travel for nationally recognized course faculty, partial full-time equivalent support for the course director and course coordinator, the contracted QI consultant/educator, books and supplies, catering (didactic sessions are full days), and room charges. Expenses are supported by the UTHSA physician practice and President’s Office, the LSOM Dean’s office, and UH. Each year, a report summarizing the projects and their projected ROI is submitted to UH and to the UT Health chief operating officer.
The course director, QI consultant/educator, and QI coaches screen projects to ensure that they are QI projects and not research. Once aim statements are formed, each project, with its aim statement and a list of participants, is submitted to the UTHSA IRB and to the UH Research Committee. The IRB reviews and responds with a statement acknowledging that the projects are QI and not research.
Program Evaluation
Learning outcomes were based on the Kirkpatrick Model.18 Levels 1 and 2a (learner satisfaction and attitudes) were measured from Likert scale scores provided by learners at the end of the course for the 9 cohorts with available evaluations. A Likert scale score of 4 or 5 was considered satisfaction with content. Level 2b (demonstration of knowledge and skills) was measured by course personnel (course director, QI consultant/educator, QI coaches, and conference coordinator) for the 609 participants by observing the application of QI tools, implementation of the PDSA cycle, and completion of other course requirements (aim statement, QI tools analysis, pre- and postintervention data, results, ROI, and future plans) during team meetings and project team report-outs, and by observing completion and presentation of course elements at graduation. Level 3 (behavior change) was measured in the 609 participants during project team report-outs and at graduation by course personnel, and during team meetings by QI coaches. Levels 4a (changes in clinical practice) and 4b (benefits to patients) were measured in the 609 participants by evaluation of the 230 project results by the course director and QI consultant/educator at graduation and by course director review of each project team’s graduation presentation.
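As a simple illustration of the satisfaction criterion described above (a Likert score of 4 or 5 counted as satisfied), the short sketch below classifies a set of hypothetical end-of-course ratings and summarizes them; it does not reproduce the course’s actual evaluation data or tooling.

```python
# Minimal sketch: classify Likert ratings (1-5) as "satisfied" when the score
# is 4 or 5, then summarize. The ratings are hypothetical.
from statistics import mean, stdev

ratings = [5, 4, 5, 3, 5, 4, 4, 5, 2, 5]

satisfied = sum(1 for r in ratings if r >= 4)
print(f"Satisfied: {satisfied}/{len(ratings)} ({100 * satisfied / len(ratings):.0f}%)")
print(f"Mean score: {mean(ratings):.2f} (SD {stdev(ratings):.2f})")
```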
In 2019, an email survey was sent to CSE course alumni to determine if they had continued in QI work (behavior change), and what impact their QI training has had on their career.
Results
From 2008 to June 2019, there were 638 graduates and 239 projects in 24 course cohorts. Graduates come from multiple professions and include physicians (300/638; 47%), nurses (131/638; 21%), pharmacists (16/638; 3%), dentists (13/638; 2%), other health professionals (36/638; 6%), and others (health administrators, medical assistants, and technicians; 136/638; 21%).
Learning outcomes based on the Kirkpatrick Model are shown in Table 3. For levels 1 and 2a, 112 of 260 participants (43%) responded to these evaluations; 98% (110/112) were satisfied with the learning. The mean ± standard deviation of the Likert scale score for overall course satisfaction was 4.77 ± 0.24. Examples of “Other Comments,” some of which indicate behavior change in the participant, included: “The great thing about the CS&E course is that the concepts, and national thought leaders, are presented in real time. Quality and Safety becomes real and evident in daily work. Doing the project cements that. The concepts are applied and change happens before their very eyes. That is very powerful. ROI exceeds the course cost…more far-reaching is the human ROI.” “One of the best experiences I’ve had. It redirected my career.” “Extremely valuable.” “It has served faculty and trainees beyond any expectation.”
Table 3. Learning Outcomes Based on the Kirkpatrick Model

| Kirkpatrick level | Outcome |
|---|---|
| Levels 1 and 2a (responses) | 98% of participants: learner satisfaction and attitudes |
| Level 2b (learning) | 98% of participants demonstrated knowledge and skills |
| Level 3 (behavior) | 98% of projects effected behavior change |
| Level 4a (results) | 90% of projects with changes in clinical practice |
| Level 4b (results) | 84% of projects with benefits to patients |

Levels 1 and 2a were measured from available Likert scale evaluations at the end of the course. Level 2b was measured by course personnel by observing the application of QI tools, implementation of the PDSA cycle, and other course requirements. Level 3 was measured during project team report-outs and graduation by course personnel, and during team meetings by QI coaches. Levels 4a (changes in clinical practice) and 4b (benefits to patients) were measured by evaluation of project results by the course director and QI consultant/educator at graduation and by course director review of the graduation presentations of each project team.
For level 2b, 9 participants (9/638; 1.4%) were not able to demonstrate knowledge as outlined above. Four participants (4/638; 0.6%) did not complete the course: 3 due to time constraints and 1 due to a job change. Thus, demonstration of knowledge was achieved by 98% (625/638) of participants.
For level 3, behavior change occurred in 98% (225/230) of projects, reflecting the work of 94% (598/638) of participants. The behavior change of QI project team members typically led to change in clinical practice in the clinical microsystem where they had influence. In addition, teams often influenced behavior change of people in another discipline or profession. An example was a team from vascular surgery that changed the clinical practice of internal medicine physicians ordering CT pulmonary angiography to evaluate for pulmonary embolism. The use of a clinical decision tool resulted in higher diagnostic yield and fewer inappropriate angiograms.19
For level 4, changes in clinical practice (4a) occurred in 90% (206/230) of projects and benefits to patients (4b) occurred in 84% (194/230) of projects. One alumna credits the skills learned in the course with helping her develop a community-based, population health program to improve asthma care for children.20 Examples of positive results from selected QI projects achieving Kirkpatrick levels 3 and 4 are listed in Supplemental Digital Content, Appendix 1, available at http://links.lww.com/AJMQ/A51. Reasons for lack of level 3 or 4 results included nonclinical projects, such as educational curriculum or revenue cycle projects. The frequency of achievement of the project’s aim statement was also analyzed in the 194 projects achieving levels 3 and 4 by comparing the aim statement with the results achieved. Even though positive changes in practice and patient benefits were achieved in 90% and 84% of projects, respectively, the desired aim statements were fully achieved in 79% (154/194) of projects.
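For readers who wish to verify the proportions reported above, the sketch below recomputes each Kirkpatrick-level percentage from the counts given in this section; it is a worked-arithmetic illustration only, not analysis code used by the course.

```python
# Worked arithmetic: Kirkpatrick-level percentages recomputed from the
# counts reported in the Results section.
outcomes = {
    "Levels 1/2a: satisfied learners": (110, 112),   # satisfied / survey respondents
    "Level 2b: demonstrated knowledge": (625, 638),  # demonstrated / all graduates
    "Level 3: behavior change": (225, 230),          # projects with change / projects evaluated
    "Level 4a: practice change": (206, 230),
    "Level 4b: patient benefit": (194, 230),
}

for label, (num, den) in outcomes.items():
    print(f"{label}: {num}/{den} = {100 * num / den:.0f}%")
```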
Results of the 2019 CSE alumni survey were used to reflect longer-term learner satisfaction. An estimated 32% of graduates (201/638) are no longer at the institution after this 11-year period, due to graduating resident departures, retirements, faculty/staff turnover, and changes in affiliated institutions. There were 437 potential alumni respondents. The response rate was 26% (114/437). Responses are shown in Table 4, reflecting that alumni continue to find the course valuable to them professionally and personally, and valuable to their institutions. Many were able to present their work at professional society meetings, and at least 5 peer-reviewed publications resulted from the projects. The course also enhanced careers, as reflected in the comments in Table 5.
Table 4. CSE Alumni Survey Responses, 2019
Question | Response |
---|---|
Did your CSE project, or a related one, continue beyond the CSE course? | 76% (87/114) Yes |
Did your work unit and/or patient population benefit from the CSE course project? | 93% (106/114) Yes |
Has your CSE project spread to other areas of your institution? | 51% (58/114) Yes |
Has any participant from your team presented your CSE project at a regional, national, or international professional meeting? | 36% (41/114) Yes |
Has any participant from your team published your project as a manuscript in a professional, peer-reviewed journal? | 4% (5/114) Yes |
Has your project resulted in a positive return on investment for your unit and/or your institution? | 75% (85/114) Yes |
Have you received any awards, recognition, grants, or funding as a quality improvement or patient safety champion? | 22% (25/114) Yes |
Were your quality improvement efforts considered in your evaluation or promotion process? | 60% (68/114) Yes |
Have you continued quality improvement and/or patient safety work after the CSE course? | 93% (106/114) Yes |
Are the quality improvement/patient safety tools and skills learned in the course valuable to you professionally? | 98% (112/114) Yes |
Are the quality improvement/patient safety tools and skills learned in the course valuable to you personally? | 97% (111/114) Yes |
Abbreviation: CSE, Clinical Safety and Effectiveness Course.
Table 5. Selected Alumni Comments on the Value of the CSE Course
“This interprofessional education course is key to continuous quality improvement and innovation science.” |
“CSE taught excellent lessons and provided the students with tools to successfully implement a quality/process improvement project and, in some cases, continue after graduation.” |
“The course was invaluable to me. It taught me the skills of Quality Improvement that, as an analyst, I have used as a catalyst for my career and to become a valuable member of each of the departments I have worked for.” |
“The course has allowed us to provide QI training [to] all of our residency faculty members who had not received adequate training during residency. Basically, this included all but the most recent residency graduates on the faculty.” |
“All physicians should be required to take this course because it enhances the level of care they provide within the system we work in! Great skills acquired personally and professionally.” |
“Amazing course. Helps the student develop strong foundation in QI tools so that they can continue to effectively initiate QI projects independently. Strongly recommended to make professionals more effective. I use the principles I learned during this course on a routine basis.” |
“Excellent course. Well run, provides tools needed to carry on meaningful projects in the future. It has been looked at favorably by my employer and given me opportunities I otherwise would not have had.” |
Abbreviations: CSE, Clinical Safety and Effectiveness Course; QI, quality improvement.
ROI has also been substantial. In recent years, an estimated projected ROI has been required as part of each project. The sum of projected ROIs for a cohort of projects has always exceeded the financial investment in the course. It is not always possible to measure whether projected ROI has been achieved, but examples of realized ROI include a successful outpatient hypertension project showing sustained population health management improvement with substantially increased reimbursement, and outpatient antibiotic therapy projects with pharmacist support at 2 different hospitals that have continued to show decreased outpatient complications and hospital readmissions. Another example is sustained improvement in guideline-concordant care for chronic obstructive pulmonary disease in primary care clinics for >2 years. This work was recognized at national and regional meetings for excellence in educational outcomes.21,22
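The article does not specify how projected ROI is calculated, so the sketch below uses one common convention (net financial benefit divided by program cost) with entirely hypothetical figures; it only illustrates the kind of cohort-level comparison described above, not the course’s actual financial model.

```python
# Hypothetical illustration: comparing a cohort's summed projected benefit
# against the course's financial investment. All figures are invented.
projected_annual_savings = [40_000, 15_000, 60_000, 25_000]  # per project, hypothetical
course_cost = 100_000  # hypothetical cohort cost

total_projected_benefit = sum(projected_annual_savings)
roi = (total_projected_benefit - course_cost) / course_cost  # net benefit / cost

print(f"Total projected benefit: ${total_projected_benefit:,}")
print(f"Projected ROI: {roi:.0%}")  # positive when projected benefit exceeds cost
```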
The Accreditation Council for Graduate Medical Education (ACGME) added the Clinical Learning Environment Review (CLER) in 2012.23 CLER site representatives conduct site visits to environments where residents and fellows train. Focus areas include QI and patient safety. This focus increased the recruitment of residents, fellows, and program directors to the CSE course. Of the 300 physicians who have taken the course, 50 (17%) were residents. In addition, residents, fellows, and medical students are often recruited to the QI project teams, even if they are not course participants. There have been 33 program directors and 11 associate program directors who have completed the program. Several program director course graduates have gone on to develop structured QI curricula for their residency programs, including didactic topics that are addressed in the course and longitudinal QI projects involving residents.24–26 This expanded the impact of the course beyond the faculty and residents directly in the course. UH and LSOM GME leadership recognized the course as an example of resident QI and patient safety involvement.
The course also aligned QI with CME and continuing professional development, which has been recognized as a value of this type of training.27 There were 9676 AMA PRA Category 1 CME credits claimed and 5420 AMA PRA Category 1 Performance Improvement CME credits claimed, and 12,234 hours of continuing education were awarded to nonphysicians. In addition, since joining the ABMS Portfolio Program, 30 physicians have been awarded Maintenance of Certification Part IV credit for the QI project done in the course.
Discussion
Training in QI and patient safety is still needed to improve health care. While QI and patient safety experience is now standard in GME, many faculty who train residents, fellows, and students have not received training in the relevant principles. Training is needed for practicing academic physicians, not only to teach trainees but also to improve health care at a faster pace. This QI curriculum includes a real-time clinical improvement project with an interprofessional team and a QI coach. As noted in the literature, a curriculum that includes these components is challenging to implement and maintain.5 Yet the combination of didactic and project work, as well as the link to health system improvement efforts, reflects general principles for education in health care improvement.28
The CSE curriculum has been sustained for 10 years due to course satisfaction and support by alumni and clinical leaders, support (including financial support) of LSOM and UH, consistency in course directorship, and positive ROI. The support from alumni, clinical leaders, and health care institutions has largely been due to the positive results in clinical processes and outcomes for the health system and patients. This not only improves the patient experience but also teaches providers that, while change is difficult, care can be improved with QI tools and methods.
The CSE curriculum also includes many characteristics of successful QI curricula that involve physicians, such as choosing topics of clinical importance, accounting for the time the endeavor requires, resident involvement, use of data, and interprofessional engagement.6 The evaluation of Kirkpatrick levels from this group’s experience is also valuable, since few published QI curricula provide outcomes of behavioral change, changes in clinical practice, and benefits to patients.29
Interprofessional engagement is key in a QI curriculum. Course alumni are 47% physicians and 53% nonphysicians. Many project teams include >1 physician, and all project teams, which include persons outside the course, comprise multiple professions. Including physicians is key for health care change,30 and learning the importance of an interprofessional team is critical to the QI training experience.
There have been multiple lessons learned since the beginning of the course. The course curriculum has been adapted over time, benefitting from input of participants and stakeholders. Topics deemed necessary in the early years, such as evidence-based medicine, finding evidence, clinical decision support, and disclosing adverse events, have become standard in other training opportunities at the study institution and no longer need to be addressed in the course. Sessions on ROI and the driver diagram have been added. ROI has been particularly important in sustaining financial support from stakeholders for the program. Overall, the curriculum design has shifted from subject-centered to more learner-centered and project-centered. The QI topics and tools are focused on the participant’s QI project.
Communicating the time commitment for the course is crucial. A nomination form and a commitment form were implemented at the time of participant nomination. This lessens misunderstandings about time commitment.
Moving a real-time QI project along with busy, practicing clinicians is also a challenge. Several interventions have helped with this dilemma. Project group progress updates, in the form of team report-outs, are now done during class time; this increases the curriculum’s focus on learner needs and goals and on the project. The updates help with accountability as well as with seeking support and ideas from others in the cohort and from visiting faculty when barriers arise. Based on participant input, team workgroup time is included at the end of each class day, further centering the curriculum on the learner and the project.
QI coaches are an important success factor in the program, making sure that team meetings occur and that the project is progressing. In response to feedback that there was insufficient time to complete interventions and collect postintervention data for a substantial number of projects, the overall duration of the course was extended from the original 5–6 months to 7 months to allow more time for implementation and postintervention data collection.
Another important lesson is the method of project selection. Initially, participants were selected with only a vague idea of their project, and the project was solidified after the course was underway. In recent years, projects and participants have been nominated/recruited primarily by clinical leadership, who also review and endorse each project before the course. This has been the most important success factor for projects, since clinical leadership support enables the data access and collection necessary to measure improvement, as well as implementation of change.
There are some limitations. The course is not easily adaptable to medical student schedules. Some projects have involved medical students, however, as noted above, and opportunities to involve students are still sought. Also, although this training is needed for front-line clinicians, it is hard for practicing clinicians and busy staff to find time for the course. This prevents some from participating, and those who do participate occasionally have to miss class time due to urgent clinical assignments. To address this, lessons are often recorded for asynchronous viewing. The course is also not designed to provide sustainment coaching or follow-up assessments. This is largely due to structure, since the course is run by the Office of CME in the LSOM and not by the quality departments of the UTHSA practice or UH. It is also due to resources; the course is supported for training and to facilitate a QI project as part of that training, not for the long-term sustainment of a project. Now that clinical leadership is highly involved in project selection and endorsement, however, the projects are more often institutional priorities that continue to be monitored over time as priority clinical metrics.
It is acknowledged that the methods used to determine levels 2b and 3 depend on course personnel assessments of the demonstration of knowledge and skills and of the behavior change of the learners. These assessments are based on the expert opinion of course personnel, who have been stable over a decade and have extensive experience evaluating QI project implementation in many venues. Other institutions may not have access to this expertise, in which case validated assessment tools might be used to increase the reliability of learner assessment. In addition, behavior change was measured by formal but unstructured observation, comments, and survey of the learners, and long-term behavior change was not assessed with a validated tool.
For level 3 (behavior change) in particular, the alumni survey result for whether the participant has continued QI and patient safety work after the CSE course (93%) may be a better measurement of sustained behavior change, although the response rate to the survey was low, as discussed below.
The survey regarding longer-term learner satisfaction and effect, conducted 10 years after the program began at our institution, had a low response rate. The results, while positive, could be biased, since those with positive experiences may have been more likely to respond. More frequent follow-up surveys could yield a better response rate.
While many participants have presented their QI work at professional society meetings, the publication rate for projects is low and could be improved. The SQUIRE guidelines for QI publications are now discussed in the course31 and will be emphasized in future course sessions.
Conclusion
Participants in the CSE course at UTHSA LSOM have achieved changes in clinical practice, benefits to patients, and positive ROI for the institution. Faculty have used the training to develop formal QI training for GME programs. Course graduates have valued the skills for their professional and personal lives, and careers have been enhanced. The course has evolved based on input from participants and stakeholders and has sustained support for >10 years. It is a reproducible and relevant model for interprofessional education of practicing professionals and staff to improve the health care system.
Acknowledgments
The authors thank Peg McNabb, Joe Cepeda, and Trey Ximenez for assistance with course operations and for gathering course data for this review.
Presented at the American Board of Medical Specialties Organizational Forum on Quality Improvement, June 2014, Chicago, IL, and the Association of American Medical Colleges Integrating Quality Meeting, June 2013, Chicago, IL.
Conflicts of Interest
The authors have no conflicts of interest to disclose.
Funding
The Clinical Safety and Effectiveness Course has received support from the UT Health San Antonio Long School of Medicine physician’s practice and Dean’s Office, the UT Health San Antonio President’s Office, and University Health. Dr Patterson received a UT Health Care Safety Effectiveness Grant for Medical Education in Quality, Safety, Medical Errors and Patient Outcomes (2010–2013) that provided partial support for this course. The other authors received no funding for this project.
Author Contributions
Dr Patterson oversaw the course, participated in curriculum design and program evaluation, and was the primary author of the manuscript. Ms Martin participated in design of the course curriculum and program evaluation and contributed to the manuscript. Dr Hutcherson participated in the course and program evaluation and contributed to the manuscript. Dr Toohey participated in the course, program evaluation, and manuscript review. Ms Bresnahan participated in design and operation of the course curriculum, program evaluation, and manuscript review. Ms Garza participated in design and operation of the course curriculum, program evaluation, and manuscript review. Dr Alsip participated in selection of course projects, support of data collection, program evaluation, and manuscript review. Dr Shine participated in course content, program evaluation, and manuscript review.
This evaluation project was submitted to the UT Health San Antonio IRB and received the following statement on April 29, 2020. The activities associated with this manuscript were reviewed by the IRB and were determined to not require IRB approval under DHHS regulations at 45 CFR 46.
Supplementary Material
Footnotes
Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal’s website (www.AJMQonline.com).
References
- 1. Institute of Medicine. To Err is Human: Building a Safer Health System. National Academies Press; 2000.
- 2. Agency for Healthcare Research and Quality. National Healthcare Quality Report 2008. Accessed December 4, 2019. https://archive.ahrq.gov/research/findings/nhqrdr/nhqr08/nhqr08.pdf.
- 3. Agency for Healthcare Research and Quality. National Healthcare Quality and Disparities Report 2018. Accessed December 4, 2019. https://www.ahrq.gov/research/findings/nhqrdr/nhqdr18/index.html.
- 4. Myers JS, Tess A, Glasheen JJ, et al. The Quality and Safety Educators Academy: fulfilling an unmet need for faculty development. Am J Med Qual. 2014;29:5–12.
- 5. Starr SR, Kautz JM, Sorita A, et al. Quality improvement education for health professionals: a systematic review. Am J Med Qual. 2016;31:209–216.
- 6. Jones AC, Shipman SA, Ogrinc G. Key characteristics of successful quality improvement curricula in physician education: a realist review. BMJ Qual Saf. 2014;0:1–12.
- 7. Intermountain Healthcare. Advanced Training Program. Accessed December 5, 2019. https://intermountainhealthcare.org/about/transforming-healthcare/institute-for-healthcare-delivery-research/courses/advanced-training-program/.
- 8. Thomas EJ, Patterson J, Martin S, Quinn D, Reed G, Shine K. The University of Texas Clinical Safety and Effectiveness Course. AHRQ Perspectives on Safety. February 1, 2011. Accessed December 5, 2019. https://psnet.ahrq.gov/perspective/university-texas-system-clinical-safety-and-effectiveness-course.
- 9. Thomas PA, Kern DE, Hughes MT, Chen BY. Curriculum Development for Medical Education: A Six-Step Approach. 3rd ed. Johns Hopkins University Press; 2016:6–9.
- 10. Institute for Healthcare Improvement. Eight Knowledge Domains for Health Professional Students. Accessed December 5, 2019. http://www.ihi.org/education/IHIOpenSchool/resources/Pages/Publications/EightKnowledgeDomainsForHealthProfessionStudents.aspx.
- 11. Boonyasai RT, Windish DM, Chakraborti C, Feldman LS, Rubin HR, Bass EB. Effectiveness of teaching quality improvement to clinicians. JAMA. 2007;298:1023–1037.
- 12. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. 2nd ed. Pearson Education; 2015.
- 13. Langley GL, Moen R, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. Jossey-Bass; 2009.
- 14. Garza C, Bresnahan L, Patterson JE. Creating a Facilitator Preparation Program for a Project-based Longitudinal Quality Improvement Course. Society for Academic CME; 2020.
- 15. Institute of Medicine, Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. National Academy Press; 2001.
- 16. UT Health CME website. Clinical Safety & Effectiveness Projects. Accessed December 6, 2019. http://cme.uthscsa.edu/cseprojects.asp.
- 17. American Board of Medical Specialties. Portfolio Program. Accessed December 5, 2019. https://mocportfolioprogram.org/about-us/abms-member-board-participation/.
- 18. Kirkpatrick DL. Evaluation of training. In: Craig RL, Bittel LR, eds. Training and Development Handbook. McGraw-Hill; 1967:87–112.
- 19. Baer HE, Hicks TD, Haidar GM, et al. Getting to choosing wisely: the value of a PE clinical decision tool to enhance appropriateness of care (abstract). J Vasc Surg. 2017;66:e58–e59.
- 20. SA Kids B.R.E.A.T.H.E.: Building Relationship, Effective Asthma Teaching in Home Environments. City of San Antonio Metropolitan Health District. Accessed November 5, 2021. https://www.sanantonio.gov/Health/HealthServices/Asthma.
- 21. Adams SG, Bresnahan LZ, Bowen EO, Patterson JE. Best in class in outcomes, keynote presentation: COPD educational interventions improve clinical practice. Oral presentation at the Alliance Industry Summit (AIS) meeting; May 11, 2015; Philadelphia, PA.
- 22. Adams S, Bresnahan L, Patterson JE. UT Health San Antonio quality improvement and WipeDiseases™ education interventions facilitate sustainable clinical practice changes in primary care. 14th Annual UT Shine Academy of Health Science Education Meeting; February 2018; Austin, TX.
- 23. Accreditation Council for Graduate Medical Education. Clinical Learning Environment Review. Accessed November 5, 2021. https://www.acgme.org/What-We-Do/Initiatives/Clinical-Learning-Environment-Review-CLER.
- 24. Wathen P, Freeman M. The PS2 rotation: making patient safety real. Annual AAMC Integrating Quality Meeting; June 2013; Chicago, IL.
- 25. Wathen P. Leveraging a block schedule to teach patient safety and QI. ACGME Annual Educational Conference; February 2016; Washington, DC.
- 26. Svatek M. Resident quality improvement: educating, engaging, implementing. University of Texas System Conference Shared Visions: Improving Systems to Improve Lives; April 2016.
- 27. Shojania KG, Silver I, Levinson W. Continuing medical education and quality improvement: a match made in heaven? Ann Intern Med. 2012;156:305–308.
- 28. Armstrong G, Headrick L, Madigosky W, et al. Designing education to improve care. Jt Comm J Qual Patient Saf. 2012;38:5–14.
- 29. Wong BM, Etchells EE, Kuper A, et al. Teaching quality improvement and patient safety to trainees: a systematic review. Acad Med. 2010;85:1425–1439.
- 30. Li J, Hinami K, Hansen LO, et al. The physician mentored implementation model: a promising quality improvement framework for health care change. Acad Med. 2015;90:303–310.
- 31. Davidoff F, Batalden P, Stevens D, et al; SQUIRE Development Group. Publication guidelines for improvement studies in health care: evolution of the SQUIRE Project. Ann Intern Med. 2008;149:670–676.