Learning Health Systems. 2016 Dec 15;1(1):e10018. doi: 10.1002/lrh2.10018

Creating a purpose‐driven learning and improving health system: The Johns Hopkins Medicine quality and safety experience

Peter J Pronovost 1, Simon C Mathews 1, Christopher G Chute 1, Antony Rosen 1
PMCID: PMC6516722  PMID: 31245554

Abstract

Health care has often relied on independent silos of medical research to drive progress and innovation. However, this approach does not adequately address the complexities and opportunities within the modern health care environment. We posit that creating a learning and improving health system that is purpose‐driven will ultimately lead the next transformation in health care. We share the experience within Johns Hopkins Medicine that established a learning and improving health system in quality and safety. The system is built around a clear and compelling patient‐centered purpose and leverages a fractal framework that provides horizontal links for peer learning and vertical links for accountability. It dismantles traditional research and clinical silos and combines basic and applied research with health system operations. As a result, the system aligns the goals and strengths of a diverse set of stakeholders including clinicians, patients, researchers, and administrators toward a common goal.

Keywords: quality and safety, learning health system

1. INTRODUCTION

What do Pasteur, Bell Labs, and the race to the Moon have in common? They were remarkably productive at learning, and they did it by combining basic and applied research to achieve a clear, compelling purpose. They started with a purpose and worked backward. History shows us that the engine of progress often runs best when it is focused on a purpose. We believe that creating a learning and improving health system that is purpose driven will ultimately lead the next transformation in healthcare. This transformation is supported by 2 major trends: an ever‐increasing need to enhance the value of healthcare and an expanding ability to precisely measure and analyze a variety of variables. We describe how Johns Hopkins Medicine (JHM) sought to integrate these 2 trends through its quality and safety structure to work toward becoming a learning and improving health system.

2. BACKGROUND

The traditional model of medical research, predicated on the exploration of basic fundamental principles in independent silos that feed into applied research, grew out of a 1945 report to President Franklin Roosevelt entitled Science, the Endless Frontier,1 by Vannevar Bush, then Director of the Office of Scientific Research and Development. With little evidence, the report asserted that basic research comes first, applied science follows, and the 2 shall not meet. With this backdrop, it is no surprise that the US healthcare system strives to be one that prioritizes curiosity and the identification of new knowledge. However, knowledge generation is only part of establishing a learning health system (LHS). The Institute of Medicine's vision of an LHS includes the key characteristics of integrating science and informatics, patient‐clinician partnerships, incentives, and culture to facilitate continuous learning, best care, and lower cost.2 Embedded in these defining characteristics is the concept of rigorous self‐improvement, which is often separated from the generation of knowledge and learning in real‐world practice. This gap is similarly present in quality and safety efforts, which have traditionally focused on identifying individual problems and solutions instead of also creating an operating management system3 whose byproduct is quality and safety. Traditional functional roles within health systems have reinforced this separation, with researchers focused on discovery while administrators and executives focused on improvement and performance. We realized that quality and safety represented an opportunity to combine these roles. Our goal at JHM was to redefine the modern health delivery and discovery ecosystem by dismantling traditional silos and combining basic and applied research with health system operations, anchored and driven by analytic systems that enable discovery.
We have strived to create an LHS by partnering with patients, their loved ones, and others to end preventable harm, to continuously improve patient outcomes and experience, and to eliminate waste in healthcare. By‐products of this system have included knowledge creation, support of continuous improvement, and the spread of learning to the rest of our institution.

The process of creating this new learning and improving system for quality and safety at JHM (Box 1) began with a diverse group of leaders across the institution and health system first defining a clear and compelling purpose. Ours is to help patients thrive: to prevent disease when possible, to cure when we cannot prevent, and to care when we cannot cure. The next step at JHM was to establish a framework for realizing our purpose. Our framework was informed by theory and experience and includes 5 steps: (1) formalize a system to declare and communicate goals, (2) create an enabling infrastructure, (3) engage clinicians and connect them in peer learning communities, (4) develop analytic tools to interpret and manage data, and (5) ensure accountability.

BOX 1. Establishing a quality and safety learning and improving health system.

Step 1: Leadership to define a clear and compelling purpose

Step 2: Formalize a system to declare and communicate goals

Step 3: Create an enabling infrastructure

Step 4: Engage clinicians and connect them in peer learning communities

Step 5: Develop analytic tools to interpret and manage data

Step 6: Ensure accountability

3. COMMUNICATION AND ORGANIZATIONAL INFRASTRUCTURE

We declared and communicated goals both broadly across the health system and for each vertical entity within it. Specifically, the Armstrong Institute for Patient Safety and Quality (AI),4 a transdisciplinary group that coordinates research, training, and operations for quality improvement and patient safety, spread health system goals across JHM. In addition, AI provided direct project management, analytic, technical, and research expertise. Its faculty and staff lead the quality and safety team within our institution and provided management and implementation science training for the health system through a variety of approaches, including e‐learning materials for all hospital staff on the science of safety, dedicated patient safety certificate programs for interested faculty and staff, specialized management and quality training in Lean and Six Sigma methodology, TeamSTEPPS5 teamwork master training for patient safety leaders, and human factors workshops for those involved in hospital quality. In addition, the director of AI, who is also the health system senior vice president for quality and safety (and first author), met directly with departmental leadership as well as hospital and health system executive leadership to assist in creating a unified and strategic quality and safety plan for each respective area that supplemented the goals of individual area leaders. More broadly, AI coordinated quality and safety efforts by linking to numerous Hopkins entities, including researchers from 18 different disciplines across every school in Johns Hopkins University plus the Applied Physics Lab, to contribute their knowledge and learning to a common purpose. AI interfaced with every branch within the health system because it is part of the enabling infrastructure for governance and leadership in quality and safety.

The quality and safety infrastructure at our institution draws its inspiration from the idea of a fractal: an elegant structure in nature, such as a fern, composed of structures of identical shape but varying size, providing horizontal links for peer learning and vertical links for accountability. While there is a hierarchical organizational structure for quality and safety, its foundation is the integration of smaller units that are similar in structure (composition of faculty and staff), process (use of similar tools), and approach (use of a common framework to address issues). This organization stresses the importance of having local leadership and structure for quality that is linked to the broader management and administration of quality at the entity level, providing horizontal connections for peer learning and vertical connections for broader organizational learning and accountability. Through this fractal structure,6 JHM leadership has oversight for the quality of care delivered anywhere in the JHM system, just as it has oversight for every dollar spent or received throughout JHM. This means that at every level (eg, unit, division, department, hospital, and health system), there is an organizational structure for quality and safety that interacts with the levels above and below it. In addition, there are connections across different geographic and functional areas to take advantage of learning. For example, at the unit level, dedicated comprehensive unit‐based safety teams bring together physicians, nurses, staff, and administrators to discuss quality and safety issues. These conversations can form the basis of division‐ and department‐wide discussions that are addressed formally in regularly scheduled quality and safety meetings.
These, in turn, are integrated in a common departmental and hospital tool for organizing quality and safety content called the management's discussion and analysis (MD&A)7 (Figure 1), modeled after a concept in financial reporting that provides a broad overview and assessment of performance and risk.8 This document pairs quantitative performance data with a narrative that helps readers interpret those data, shaping the discussion at the hospital, health system, and board levels. In addition to safety teams, structured meetings, and tools, we also facilitate new learning and improvement in quality and safety through clinical communities.

Figure 1.

Figure 1

Management's discussion and analysis. Reproduced with permission according to Elsevier Publishing author guidelines

4. CLINICAL COMMUNITIES

The institution engages clinicians in peer learning communities called clinical communities.9 These groups are tasked with translating our health system purpose into the context of focused clinical domains. They are generally co‐led by academic and community physicians and include all areas where care is delivered. Johns Hopkins now has 40 clinical communities that are diverse, including geography based, such as the hospitalist unit; topic based, such as transfusion; and service line based, such as heart failure. Each community achieves its purpose within a shared organizational framework that addresses 4 categories: patient safety (which represents internal risks), performance on externally reported measures, patient experience, and value. Communities include clinicians and applied researchers, and, thanks to the support of the enabling infrastructure described above, basic and clinical researchers are being integrated as well. These communities are built on robust analytic platforms and tools, which translate vast amounts of data into usable information (see Boxes 2 and 3 for case studies). Armed with these data and supported by an analytics platform, the clinicians and researchers in our clinical communities can help to apply precision medicine in their work to solve problems and achieve our purpose. These teams are able to examine the individual patient, provider, and environmental factors that may affect meaningful clinical outcomes. In this context, precision measurement applies to everything that the institution does—recognizing that every patient and clinical encounter is unique and accordingly identifying the challenges and solutions that accommodate this diversity in experience.
The institution must assume responsibility for the full range of care delivered or not delivered and account for the inherent variation in our patients, providers, and environments.

BOX 2. Learning and improving health system case study—surgical site infections.

Context: The Department of Obstetrics and Gynecology was not reaching its performance target for surgical site infections after cesarean sections. This issue was first identified as a unit‐level concern and then promoted as an area of improvement at the department level. It was subsequently addressed through the institution's quality and safety infrastructure.

Action: The OB/GYN department chair presented infection data at multiple quality forums, including the hospital's Patient Safety and Quality Committee. The department chair and vice chair for quality subsequently met with all faculty and staff to communicate clear goals for improvement. In addition, the vice chair for quality worked with the hospital's infection prevention team and the Armstrong Institute to complete a root cause analysis on every infection. The team subsequently implemented stratified antibiotic dosing for patients with a higher body mass index and introduced a new vaginal cleansing technique.

Outcome: The number of infections decreased by over 50% in the first year and is on track for another 50% decline the following year.

BOX 3. Learning and improving health system case study—patient experience.

Context: The hospital was underperforming in various categories of the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey. This issue was identified at our hospital's quality and safety board meeting and subsequently addressed using the health system's quality and safety infrastructure. Hospital leadership, in discussion with our patient safety institute's team of experts, recognized that patient experience was an issue that needed both a centralized and a decentralized approach to improvement.

Action: Subsequently, a chief patient experience officer position was created. This individual is responsible for overseeing operations across the hospital and providing support and resources to departments. At the department level, leaders now review HCAHPS scores and unit‐level patient satisfaction reports monthly at departmental quality meetings. Departments are able to take advantage of the fractal quality infrastructure by identifying patient experience challenges that are specific to the local unit level while also having access to the expertise and resources of a central support structure. As a result, a variety of common interventions (eg, language of caring, executive rounding, etc) have been disseminated and tailored to all departments using centralized training and education resources.

Outcome: A steady improvement in HCAHPS scores has been seen across all departments throughout the hospital.

5. ANALYTIC TOOLS AND DATA MANAGEMENT

The opportunity to integrate learning and improvement has intensified because of 2 concurrent revolutions: (1) an information revolution, in which the speed and power of data analysis are unprecedented and (2) a data measurement revolution, in which the number and type of parameters that can be measured are enormous. These revolutions have the power to improve the lives of patients and decrease costs by synthesizing vast patient care data to guide prevention, prediction, and diagnosis of disease.

At JHM, we leveraged these trends by developing a corresponding standards‐based infrastructure. For quality and safety, we developed Project Emerge, a tablet application that coordinates and integrates data from all monitoring equipment and information systems in the intensive care unit setting to create a dashboard for surveillance of preventable harm and management of daily tasks. It rests on the principle of establishing comparable and consistent clinical data as the foundation for reliable and relevant interpretation. Furthermore, we have built this foundation on a secure, open data management platform (dataFascia), which can ingest a variety of data sources and accommodate both the real‐time challenges of information generated within intensive care units and the burgeoning scale of clinical data generated by physiologic monitors, personal devices, and the healthcare process. Specifically, we are invoking the emerging generation of health information technology standards involving state‐of‐the‐art computer messaging technologies (HL7 FHIR with RESTful interfaces) and highly scalable, big‐data management environments that embrace data security and authorization down to the data‐cell level (Apache Accumulo).
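To make the standards‐based approach concrete, the sketch below flattens a single HL7 FHIR Observation resource (the message standard named above) into a dashboard‐ready row. This is a minimal, hypothetical illustration, not code from Project Emerge or the dataFascia platform; the field names follow the published FHIR Observation structure, and the sample heart‐rate payload is invented for the example.

```python
def observation_to_row(resource: dict) -> dict:
    """Flatten a FHIR Observation into a flat row suitable for a dashboard.

    Assumes a single quantitative result (valueQuantity), as feeds from
    physiologic monitors typically provide.
    """
    coding = resource["code"]["coding"][0]          # first (primary) code
    quantity = resource.get("valueQuantity", {})    # may be absent
    return {
        "patient": resource["subject"]["reference"],
        "code": coding["code"],                     # eg, a LOINC code
        "display": coding.get("display", ""),
        "value": quantity.get("value"),
        "unit": quantity.get("unit"),
        "time": resource.get("effectiveDateTime"),
    }

# A minimal FHIR Observation: a heart-rate reading (LOINC 8867-4)
obs = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org",
                         "code": "8867-4", "display": "Heart rate"}]},
    "subject": {"reference": "Patient/123"},
    "effectiveDateTime": "2016-12-01T08:30:00Z",
    "valueQuantity": {"value": 72, "unit": "beats/minute"},
}

row = observation_to_row(obs)
print(row["display"], row["value"], row["unit"])  # Heart rate 72 beats/minute
```

Because every Observation carries the same coded structure regardless of its source device, one such function can normalize data from many monitors into comparable rows, which is the "comparable and consistent clinical data" principle the paragraph describes.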

Our adoption of a robust and scalable infrastructure enabled Hopkins to accommodate the speed and volume of the current information revolution. Our embrace of standards‐based principles allows our data‐measuring algorithms and processing routines to be shared among the community of organizations pursuing LHS objectives, creating a shared intellectual marketplace where the best analytic approaches will emerge. The implication of these efforts is that Hopkins, together with the institutions and organizations with which it partners to exchange analytic protocols and techniques, can efficiently identify best practices, improve outcomes, and lower costs at scale.

6. ENSURING ACCOUNTABILITY

In our learning and improving approach for quality and safety, we created a structure for ensuring accountability for realizing our purpose. We applied a model of shared leadership accountability in which senior leaders held others accountable because these leaders empowered their employees to be successful by providing the time, skills, and resources required for improvement. Health system resources were explicitly committed to supporting the Armstrong Institute, health system and hospital level Board of Trustees for quality and safety, a senior vice president for patient safety and quality, and vice chairs for quality across all departments. As a result, at JHM, the concept of quality and safety has formal and financial support that extends from the executive board level to the individual unit.

Within this background of mutual support, we set the expectation of regular and transparent reporting of results, accompanied by an explicit chain of accountability modeled on the same rigor and reporting as our finance structure. This is accomplished by each functional unit and/or clinical community regularly presenting data through dashboards. These data are then aggregated and shared at the division and department levels using additional tools (sample dashboard, Figure 2). Accountability is supported by hierarchical relationships: division directors meet with department directors, department directors meet with hospital leaders, and hospital leaders meet with health system leaders and trustees to ensure we are achieving our purpose (Figure 3). However, it is our cascading fractal model, with its consistent organizational staffing, shared tools, and common framework for supporting peer learning and for addressing issues, that provides the foundation for assessing performance and responsibility fairly. When department or hospital‐specific goals are not met, we leverage the shared experience of the health system and the expertise of the Armstrong Institute to create an improvement plan. Executives and other hospital leaders regularly review progress and are responsible for presenting updates to their peers (including fellow hospital presidents) and to those in the accountability chain above and below them (eg, department chiefs).
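The dashboard roll‐up described above can be sketched as a toy aggregation: unit‐level metrics keep their numerators and denominators so that rates remain comparable when rolled up the fractal hierarchy to the department level. The departments, units, and counts below are illustrative only, not JHM data.

```python
from collections import defaultdict

# Hypothetical unit-level safety metrics:
# (department, unit, infection events, central-line days)
unit_metrics = [
    ("Medicine", "MICU",   4, 1200),
    ("Medicine", "Ward A", 2,  900),
    ("Surgery",  "SICU",   3, 1100),
]

def roll_up(rows):
    """Aggregate unit-level counts to the department level.

    Sums events and denominators separately, then computes the rate,
    so department rates are consistent with their constituent units.
    """
    totals = defaultdict(lambda: [0, 0])  # department -> [events, line-days]
    for department, _unit, events, line_days in rows:
        totals[department][0] += events
        totals[department][1] += line_days
    # Infection rate per 1000 line-days at the department level
    return {d: round(1000 * e / n, 2) for d, (e, n) in totals.items()}

print(roll_up(unit_metrics))  # {'Medicine': 2.86, 'Surgery': 2.73}
```

The same pattern repeats at each level of the hierarchy (unit to division, division to department, department to hospital), which is what keeps the dashboards at every level term‐by‐term comparable.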

Figure 2.

Figure 2

Sample dashboard. Reproduced with permission according to Wolters Kluwer Publishing author guidelines

Figure 3.

Figure 3

Chain of accountability. Reproduced with permission according to Wolters Kluwer Publishing author guidelines

7. EVALUATING PROGRESS

Our model for quality and safety is still very much a system in evolution in that it builds on and learns from the regular feedback of its constituent members. Its success can be measured in both tangible and intangible ways. As 1 example of progress, we have successfully applied this model to improve performance across several JHM core measures (eg, percutaneous coronary intervention in less than or equal to 90 minutes in acute myocardial infarction and discharge instructions for heart failure patients).10 In addition, individual departments have made significant strides in multiple areas, such as hand hygiene and infection rates (see Box 2). However, perhaps more importantly, success can also be measured by the significant interest and motivation within our health system to adopt this system across nearly all areas. We started implementing this quality and safety performance system at our primary academic hospital, beginning within its largest departments and most active functional areas. However, as leadership as well as unit‐level providers recognized the power of having a structured system for driving improvement, they soon requested to be part of the process. Previously, individual departments and their constituent units would create ad hoc committees and processes to address quality and safety issues. Now these efforts are standardized and institutionalized as well as linked to common resources, making it more efficient to solve problems. This has resulted in the expansion of the system across departments and hospitals within our health system. Notably, there was buy‐in for this concept at the top and bottom of the health system based on discussion with senior leadership and unit staff. Internally, we continue to monitor our progress as a health system by leveraging the tools and structure previously described.
We are also currently in the process of sharing and implementing this framework with other institutions. The ability to reproduce and generalize our experience elsewhere would also be a marker of success.

8. CHALLENGES AND LESSONS LEARNED

Throughout this multiyear process, our institution evolved in its understanding of quality and safety. We made a fundamental change in our institution's approach, from previously viewing quality and safety as a project to now seeing it as an integrated operating management system.11 The senior leadership's mentality changed from depending on the individual heroic actions of faculty and staff to depending on the design of a safe system.

However, this transition has been accompanied by several challenges. Although the importance of quality and safety was widely appreciated at our institution, the financial commitment to support dedicated infrastructure came gradually. Building this momentum would not have been possible without philanthropic support, strong executive leadership, and the demonstration of "early wins." The patient safety institute was partially funded through a private donation as well as an institutional commitment from the Board of Trustees. We initially focused our improvement efforts on external measures, those defined by and reported to outside agencies (eg, Centers for Medicare and Medicaid Services and state agencies), such as readmissions and mortality, because these measures had some existing quality infrastructure in place in terms of staff and data collection. In addition, we realized that our resources in quality and safety were widely distributed and, as a result, not operating efficiently. Centralizing our resources and staff through a patient safety institute while working interdependently with local teams throughout the health system has been a major facilitating factor in making our overall transition. A significant part of our learning has been finding the right balance of independence and interdependence and creating a fractal structure6 to support achieving that balance. Despite these efforts, we also quickly recognized that we did not have sufficient capacity, with respect to specialized staff and expertise, to support the broad infrastructure we hoped to build. We therefore recruited faculty and staff and created training programs within our patient safety institute to attract talent and foster internal development. As a result, we now have vice chairs for quality in every major department, with plans to expand to the division level.
Other industries have full‐time positions dedicated to quality and safety; at our institution, however, we have had to invest in this process gradually, so that experience now extends from the executive to the unit level. This degree of depth and breadth has taken time and significant resources and is ongoing. With respect to accountability, we recognized that without the proper infrastructure and appropriate resources, it was unreasonable to expect sustainable progress. Our shared leadership accountability model is based on leaders understanding that improvement and change require resources. Before holding others to account, higher‐level leaders must first hold themselves accountable by ensuring that those reporting to them are set up to be successful: the supporting faculty and staff must know the goal and their role; they must have the skills, resources, and time to improve; and they must receive performance feedback.

In addition, whereas other industries have well‐defined standards and a common way of reporting data, health care currently lacks universal standards and the ability to aggregate quality into a commonly accepted "bottom line." As a result, at our institution we have had to develop our own tools (eg, the MD&A) to consistently communicate and present the various aspects of quality and safety in a way that every unit and every hospital can understand. Developing this common basis for understanding and communicating quality and safety has made ensuring accountability much easier.

We have also learned to adapt our tools to reflect the still evolving progress in quality and safety research. For example, we are now starting to incorporate the concept of health care equity,12 specifically viewing our data in quality and safety through the lens of different populations and demographics to understand barriers to progress.

9. CONCLUSION

Health systems strive to be learning organizations, but they should also strive to be improving ones guided by a common purpose. We describe 1 model for creating a learning and improving health system for quality and safety that is driven by a patient‐oriented purpose. It is supported by a robust model of transparency and accountability and is strengthened by a broad network of allied entities and organized into focused clinical communities. As a result, this system aligns the goals and strengths of a diverse set of stakeholders, including clinicians, patients, researchers, and administrators toward a common goal. The pathways of scientific discovery and improvement often have a prescribed beginning, but their destinations continue to evolve. The journey to creating a learning and improving health system similarly should not end when early goals are reached but rather build upon early milestones and evolve to address the challenges that lie ahead. Learning and improving should not be seen as a siloed activity of a few but rather as part of an integrated and interdependent operating management system of all staff.

DISCLOSURES

Dr. Pronovost reports receiving grant or contract support from the Agency for Healthcare Research and Quality, the Gordon and Betty Moore Foundation (research related to patient safety and quality of care), the National Institutes of Health (acute lung injury research), and the American Medical Association Inc. (improve blood pressure control); honoraria from various health care organizations for speaking on patient safety and quality (the Leigh Bureau manages these engagements); book royalties from the Penguin Group for his book Safe Patients, Smart Hospitals; and stock and fees to serve as a director for Cantel Medical. Dr. Pronovost is a founder of Patient Doctor Technologies, a startup company that seeks to enhance the partnership between patients and clinicians with an application called Doctella.

Pronovost PJ, Mathews SC, Chute CG, Rosen A. Creating a purpose‐driven learning and improving health system: The Johns Hopkins Medicine quality and safety experience. Learn Health Sys. 2017;1:e10018. doi: 10.1002/lrh2.10018

REFERENCES

  1. National Science Foundation. Science the Endless Frontier. https://www.nsf.gov/od/lpa/nsf50/vbush1945.htm. Accessed November 1, 2015.
  2. Committee on the Learning Health Care System in America, Institute of Medicine. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. http://www.nationalacademies.org/hmd/Reports/2012/Best‐Care‐at‐Lower‐Cost‐The‐Path‐to‐Continuously‐Learning‐Health‐Care‐in‐America.aspx. Accessed May 15, 2016.
  3. Grote G. Safety management in different high‐risk domains—all the same? Safety Science. 2012;50(10):1983–1992.
  4. Pronovost PJ, Holzmueller CG, Molello NE, et al. The Armstrong Institute: an academic institute for patient safety and quality improvement, research, training, and practice. Acad Med. 2015;90(10):1331–1339.
  5. Epps HR, Levin PE. The TeamSTEPPS approach to safety and quality. J Pediatr Orthop. 2015;35(5 Suppl 1):S30–S33.
  6. Pronovost PJ, Marsteller JA. Creating a fractal‐based quality management infrastructure. J Health Organ Manag. 2014;28(4):576–586.
  7. Mathews SC, Demski R, Pronovost PJ. Management's discussion and analysis: a tool for advancing quality and safety. Healthcare. 2016. doi: 10.1016/j.hjdsi.2016.02.006
  8. Federal Accounting Standards Advisory Board. Management's Discussion and Analysis. http://www.fasab.gov/pdffiles/15_md%26a.pdf. Accessed May 15, 2016.
  9. Gould LJ, Wachter PA, Aboumatar H, et al. Clinical communities at Johns Hopkins Medicine: an emerging approach to quality improvement. Jt Comm J Qual Patient Saf. 2015;41(9):387–381.
  10. Pronovost PJ, Armstrong CM, Demski R, et al. Creating a high reliability health care system: improving performance on core process of care measures at Johns Hopkins Medicine. Acad Med. 2014;90:165–172.
  11. Sutcliffe KM, Paine L, Pronovost PJ. Re‐examining high reliability: actively organising for safety. BMJ Qual Saf. 2016; [Epub ahead of print]. doi: 10.1136/bmjqs-2015-004698
  12. Braveman P, Gruskin S. Defining equity in health. J Epidemiol Community Health. 2003;57(4):254–258.
