BMJ Qual Saf. 2015 Dec 23;25(3):147–152. doi: 10.1136/bmjqs-2015-005076

The problem with Plan-Do-Study-Act cycles

Julie E Reed 1, Alan J Card 2,3
PMCID: PMC4789701  PMID: 26700542

Introduction

Quality improvement (QI) methods have been introduced to healthcare to support the delivery of care that is safe, timely, effective, efficient, equitable and cost effective. Of the many QI tools and methods, the Plan-Do-Study-Act (PDSA) cycle is one of the few that focuses on the crux of change, the translation of ideas and intentions into action. As such, the PDSA cycle and the concept of iterative tests of change are central to many QI approaches, including the model for improvement,1 lean,2 six sigma3 and total quality management.4

PDSA provides a structured experimental learning approach to testing changes. Concerns have previously been raised regarding the fidelity with which the PDSA method is applied, which may undermine learning efforts,5 the complexity of its use in practice5 6 and the appropriateness of the PDSA method for addressing the significant challenges of healthcare improvement.7

This article presents our reflections on the full potential of using PDSA in healthcare, but in doing so we explore the inherent complexity and multiple challenges of executing PDSA well. Ultimately, we argue that the problem with PDSA is the oversimplification of the method as it has been translated into healthcare and the failure to invest in a rigorous and tailored application of the approach.

The value of PDSA in healthcare improvement

The purpose of the PDSA method lies in learning as quickly as possible whether an intervention works in a particular setting and in making adjustments accordingly to increase the chances of delivering and sustaining the desired improvement. In contrast to controlled trials, PDSAs allow new learning to be built into this experimental process. If problems are identified with the original plan, the theory can be revised to build on this learning, and a subsequent experiment can be conducted to see whether the problem has been resolved and to identify any further problems that need to be addressed. In the complex social systems of healthcare, this flexibility and adaptability of PDSA are important features that support the adaptation of interventions to work in local settings.

A successful PDSA process does not equal a successful QI project or programme. The intended output of PDSA is learning and informed action. Successful application of the PDSA methodology may enable users to achieve their QI goals more efficiently or to reach QI goals they would otherwise not have achieved. But it is also successful if it saves wasted effort by revealing QI goals that cannot be achieved under realistic constraints or if it identifies new problems to tackle instead of the originally identified issue. A well-conducted PDSA promises learning. But it does not, and cannot, promise that users will achieve their desired outcomes.

As PDSA has been translated into healthcare from industrial settings, an emphasis has been placed on rapid small-scale tests of change, often on one, three and then five patients in ‘ramps’ of increasing scale, with responsibility delegated to frontline staff and improvement or quality managers. This pragmatic approach has been embraced and has been seen as providing a new freedom for healthcare staff to lead change and improvement in local care settings.

However, the process of change rarely progresses in simple linear ramps.6 8 The conduct of PDSAs can reveal other related issues that need to be addressed in order to achieve the improvement goal. Such issues may relate to minor changes to current practices or processes of care, but can often reveal larger cultural or organisational issues that need to be addressed and overcome.

Recent evaluations have reported on the failure of the PDSA method to help frontline staff address the multiple improvement challenges they faced as the scale of investigation and range of issues they needed to address increased.7 9 A report evaluating the Safer Clinical Systems programme in the UK identified ‘the need for clarity about when improvement approaches based on PDSA cycles are appropriate and when they are not’, viewing some challenges as ‘too big and hairy’ for the PDSA method and beyond the scope of small-scale tests of change run by local clinical teams.7

We argue that any improvement situation, no matter how big and hairy, is conducive to application of the PDSA method. The four stages of PDSA mirror the scientific experimental method of formulating a hypothesis, collecting data to test this hypothesis, analysing and interpreting the results and making inferences to iterate the hypothesis.5 10
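
The four-stage structure can also be read as an explicit iterative loop. The sketch below is a purely illustrative model of that loop, not a method prescribed by this article or by any QI framework: every name (Cycle, run_pdsa, change_theory and its predict/revise methods) is hypothetical, and the planning, testing, analysis and judgement work is assumed to be supplied by the improvement team.

```python
from dataclasses import dataclass, field


@dataclass
class Cycle:
    """One pass through Plan-Do-Study-Act (illustrative only)."""
    prediction: str                                    # Plan: what we expect to happen and why
    observations: list = field(default_factory=list)   # Do: data collected, plus anything unexpected
    learning: str = ""                                 # Study: results compared with the prediction
    decision: str = ""                                 # Act: adopt, adapt, abandon or rescope


def run_pdsa(change_theory, test_change, study, decide, max_cycles=5):
    """Iterate small tests of change until the team decides to adopt or abandon.

    The change_theory object and the test_change, study and decide callables are
    hypothetical stand-ins for work the improvement team must do and resource.
    """
    history = []
    for _ in range(max_cycles):
        cycle = Cycle(prediction=change_theory.predict())   # Plan: state the hypothesis
        cycle.observations = test_change(cycle.prediction)  # Do: run the test, note surprises
        cycle.learning = study(cycle)                        # Study: analyse results against the prediction
        cycle.decision = decide(cycle, history)              # Act: decide what to do next
        history.append(cycle)
        if cycle.decision in ("adopt", "abandon"):
            break
        change_theory.revise(cycle.learning)                 # carry learning into the next Plan
    return history
```

The only point the sketch is meant to make is structural: each cycle ends with an explicit decision, and the theory of change is revised before the next ‘plan’, mirroring the iteration of a hypothesis.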

Whether improvement initiatives have been planned at national level to support standardisation of care or planned over a cup of coffee to solve a minor local problem, we believe there will always be a role for PDSA. In moving from planning to implementing a change in practice, PDSA provides a structure for experimental learning to know whether a change has worked or not, and to learn and act upon any new information as a result.

But it is not a magic bullet. Increasingly complex problems require increasingly sophisticated application of the PDSA method, and this is where we believe the problem with the PDSA method lies.

Its simplicity belies its sophistication

One of the main narratives surrounding the use of PDSA in healthcare is that it is easy, and can be applied in practice by anyone. At one level this is true, and the simplicity of the PDSA method and its applicability to many different situations can be viewed as one of its main strengths. However, this simplicity also creates some of the greatest challenges to using PDSA successfully. Users need to understand how to adapt the use of PDSA to address different problems and different stages in the lifecycle of each improvement project. This requires an extensive repertoire of skills and knowledge to be used in conjunction with the basic PDSA model.

One of the main problems encountered in using PDSA is the misperception that it can be used as a standalone method. PDSA needs to be used as part of a suite of QI methods, the exact nature of which may be influenced by the broader methodological approach that is being followed (eg, model for improvement, lean). An important role of the wider methodological approach is to conduct investigations prior to starting the use of PDSA to ensure that the problem is correctly understood and framed. Investigations can include process mapping, failure mode effects analysis, cause and effect analysis, stakeholder engagement and interviews, data analysis and review of existing evidence.

A second misperception is that PDSA is limited to small-scale tests of change on one, three and five patients. PDSA is an extremely flexible method that can be adapted to support the scale-up of interventions and used in conjunction with monitoring activities to support sustainability. But this flexibility gives rise to a number of key dimensions that require careful consideration. For instance, the scope and scale of change, the amount of preparation prior to use, the rigour of the evaluation, time, expertise, management support and funding must be carefully aligned. Often these needs must be rebalanced over the project's lifecycle. If managed well, these adjustments enable the use of PDSA to adapt to new learning and support the design and conduct of ‘tests of change’ as they increase in scale, and often complexity, to achieve the desired improvement goal.
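
As a purely illustrative sketch of what aligning these dimensions might look like, the hypothetical ‘ramp plan’ below records, for each planned test of change, its scale alongside the evaluation rigour and support it is expected to need; all names and values are invented for the example.

```python
# Hypothetical ramp plan (illustrative values only): each planned test of change
# records its scale together with the evaluation rigour and support it is expected
# to need, so that mismatches (eg, large scale but minimal evaluation or support)
# are visible up front and can be rebalanced as learning accumulates.
ramp_plan = [
    {"cycle": 1, "scale": "1 patient, 1 ward",
     "evaluation": "informal observation and staff debrief",
     "support": "ward team"},
    {"cycle": 2, "scale": "5 patients, 1 ward",
     "evaluation": "structured audit against agreed success criteria",
     "support": "ward team plus a QI facilitator"},
    {"cycle": 3, "scale": "all admissions, 3 wards",
     "evaluation": "weekly measure plotted on a run chart",
     "support": "senior management sponsorship and protected staff time"},
]

for step in ramp_plan:
    print(f"Cycle {step['cycle']}: {step['scale']} | "
          f"evaluation: {step['evaluation']} | support: {step['support']}")
```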

Using PDSA as an iterative design framework to help solve ‘big hairy problems’ or ‘big hairy audacious goals’11 is, therefore, entirely appropriate. In fact, developing solutions to large-scale ‘wicked problems’12 may require ‘an iterative explorative and generative’13 approach of the sort PDSA provides, in which ‘knowledge is built through designing’.13 The key is to understand that this framework will need to be implemented (and resourced) very differently for large and complex problems than for smaller and more ‘tame’ problems. One size does not fit all.

While frontline staff with little training or support may successfully address some quality problems, the complexity of many problems demands greater organisational support, with direct involvement of senior managers to facilitate adequate planning. Projects in which frontline staff must fend for themselves also run the risk of insufficient usage of theory and existing evidence to develop the intervention and a suboptimal evaluation.

Quick (not dirty) tests of change

In healthcare, PDSA training often overemphasises the conceptual simplicity of the framework and underemphasises the different ways in which the method can be adapted to solve increasingly complex problems. This frequently leads people to leap into PDSA with insufficient prior investigation and framing of the problem, to delegate management of the process to frontline staff who have little influence over broader systemic concerns that need to be addressed, and to provide these staff with little support to overcome the obstacles and barriers they face. The resources, skills and expertise required to apply PDSA in the real world are often significantly underestimated, leading to projects that are destined to fail.

This has led to the impression that PDSA cycles involve ‘quick and dirty’ tests of change. In the rush to empower healthcare staff, there is a danger that the scientific rigour of the PDSA method is frequently compromised. A systematic review5 revealed that the core principles of PDSA are often not executed in practice, with ‘substantial variability with which they are designed, executed and reported in the healthcare literature’.6 A failure to properly execute PDSAs can undermine learning efforts… ‘if data collection does not occur frequently enough, if iterative cycles are few, and if system-level changes are not apparent as a result of these cycles, the improvement work is less likely to succeed’.6 While its scientific principles differ from those of controlled trials, rigour in the application of PDSA is still required for PDSA to maximise the learning obtained from tests of change.

In addition to a lack of fidelity with PDSA guiding principles, there is the need to ensure that each stage of the cycle is conducted well. But the frenetic culture endemic in healthcare organisations can make it difficult to achieve sustained engagement in the deliberative processes of PDSA.

Just get on with it

While ‘planning paralysis’ can be an issue in healthcare organisations, the more common problem is a serious underinvestment in the planning phase. The pervasive cultural compulsion to ‘just get on with it’14 leads many teams to move too quickly from ‘plan’ to ‘do’. The consequences of skipping this up-front work can include wasted PDSA cycles or projects that fail altogether. Table 1 describes some of the key failure modes for the planning and preplanning (ie, investigation and problem-framing) steps of the PDSA process.

Table 1. Key failure modes for the investigation/problem framing and plan steps

Investigation and problem framing (define the problem; determine its causes/contributing factors; identify stakeholders; set the criteria for success):

  • Failure mode: poor definition of the problem and its causes/contributing factors.1 5 21–25 Potential consequence: time, money and goodwill may be wasted trying to solve the wrong problem or to solve it in the wrong way.
  • Failure mode: failure to clearly define the criteria for success and how performance will be measured.5 22 26 Potential consequence: a poor match between the design of the intervention and its intended impact; inability to assess success during the ‘study’ phase.
  • Failure mode: failure to identify key stakeholders.22 27 Potential consequence: important knowledge may be left out of the planning process.

Plan (design an intervention and data collection plan; specify how the intervention will be implemented (Do), evaluated (Study) and sustained, if successful):

  • Failure mode: no theory of change/programme theory connecting the intervention to its intended outcomes.28–31 Potential consequence: poorly targeted interventions that may be inefficient or may fail altogether; poor buy-in due to a perceived lack of legitimacy.
  • Failure mode: a planned intervention, implementation plan and study protocol that are not in proportion to one another and to the problem to be solved.22 32 33 Potential consequence: underinvestment leading to projects that do not achieve their goals or that cannot be proven to have achieved their goals; overinvestment leading to wasted resources.
  • Failure mode: designing a data collection and analysis plan that is incapable of providing the required answers.26 Potential consequence: impossible to know if the intervention was effective; excessive PDSA cycles required; aggravation among frontline staff that the administrative burden of data collection was wasted.
  • Failure mode: not consulting key stakeholders during the planning stage.21 27 34 35 Potential consequence: proceeding with an intervention that is predictably doomed to fail; disengagement among frontline staff.
  • Failure mode: not planning for the ‘who, what, where, when, and how’ of implementation (the ‘do’ phase).5 22 36 Potential consequence: poor understanding of resource requirements and cost-effectiveness; poor execution of the ‘do’ and ‘study’ phases.
  • Failure mode: adopting weak interventions (eg, administrative controls, such as training and policies) without considering more robust options.37–41 Potential consequence: interventions that do not achieve their goals or do not sustain them.
  • Failure mode: not assessing cultural and structural barriers/facilitators related to the intervention.14 21 42–44 Potential consequence: ‘fish out of water’ interventions put in place without attention to the broader changes required to make them successful; systemic issues not tackled and only superficial change attempts made.
  • Failure mode: failure to plan for how the intervention will be sustained in practice, if successful.16 7 38 45 46 Potential consequence: performance reverts to previous standards; staff become frustrated with the unsuccessful change effort and disengage from future attempts.
  • Failure mode: failure to consider the intervention's failure modes and potential side effects (positive and negative).21 45 47 Potential consequence: interventions that are designed to fail or that create more problems than they solve; failure to select the most cost-effective solutions.

PDSA, Plan-Do-Study-Act.

Why do planning failures present such a challenge to the successful use of PDSA? It is much more difficult to correctly execute and learn from a plan that has not been well thought out. And even perfect execution cannot ensure success if the plan, itself, is wrong.

The iterative nature of PDSA enables course corrections, but this feature of the approach is much more effective if there was a clear and reasoned course in the first place. Many of the barriers to success in the do, study and act phases can be predicted and mitigated through more effective planning.

Overcoming the prevailing culture of ‘Do, Do, Do’

The structured, reflective practice required for PDSA runs counter to the main mode of operation in healthcare organisations, ‘doing’, with the time required for planning and reflection regarded as a luxury rather than a necessity. As a result, teams often get ‘stuck’ in the ‘do’ phase, failing to progress to the ‘study’ phase. While these problems may reflect poor planning, they may also be caused by problems beyond the control of the project team, such as the challenges of creating time to conduct tests of change, staff turnover and changing or competing priorities. To stop at the ‘do’ phase is to throw away the core contribution of PDSA: its support for iterative design as a way of making improvement interventions more successful.15 Another important but frequently overlooked part of the ‘do’ phase is inductive learning, noticing the unexpected and feeding these observations into the study phase.

Poor planning or conduct of the ‘do’ phase can in turn significantly undermine the ‘study’ phase. In some cases, improvement teams appear to bypass the ‘study’ phase altogether, moving directly from ‘do’ to ‘act’.5 In other cases, the ‘study’ phase may collect insufficient data, or may not collect the right type of data, to answer questions about the intervention's effectiveness and acceptability. For instance, quantitative data can assess the impact of a given change, but without qualitative feedback the reasons for the results, and staff attitudes and ideas about what could be improved, will remain unknown. It is also possible that teams draw the wrong conclusions from the data they have collected or fail to notice unanticipated consequences, which may lead to incorrect actions.
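
A hypothetical ‘study’ phase record illustrates the point: the quantitative comparison shows whether the measure moved, while the qualitative notes are what allow the team to say why, and what to change in the next cycle. All field names and values below are invented for the example.

```python
# Hypothetical 'study' phase record (all values invented for illustration).
study_record = {
    "measure": "admissions with medication reconciliation completed within 24 h",
    "baseline": 14 / 30,       # 30 admissions observed before the change
    "after_change": 22 / 30,   # 30 admissions observed during the test
    "staff_feedback": [
        "checklist easy to use on day shifts, often skipped overnight",
        "pharmacist availability was the real bottleneck",
    ],
    "unanticipated": ["duplication of a check already completed in the emergency department"],
}

# The quantitative result alone cannot explain these numbers; the feedback and
# unanticipated observations are what inform the next 'act' decision.
absolute_change = study_record["after_change"] - study_record["baseline"]
print(f"Change in measure: {absolute_change:+.0%}")
```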

Failure to take appropriate action based on what was learned from the ‘study’ phase and previous PDSA cycles is another common concern.5 Inappropriate actions may include adopting or scaling up an intervention that has not proven effective and acceptable,16 or ending a project that has proved successful, or is on track to do so. An important part of the act phase consists of reviewing and revising the theory of how the intervention is intended to achieve its desired impact. This iterative refinement of theory is a key component of PDSA methodology, which is often overlooked in practice.

Effectively managing the PDSA process is about more than individual PDSA steps or cycles. Connecting PDSA cycles together is a messier and far more complicated endeavour than most of the literature on the approach suggests.6 Progression across cycles is seldom linear, and double-loop learning17 may lead to revised goals, as well as revised interventions, and requires significant oversight to manage emergent learning and coordination of PDSA activities over time.

Table 2 describes some of the key failure modes for the execution of the do, study and act steps of the PDSA process.

Table 2. Key failure modes for executing the do, study and act steps

Do (implement the plan, including both the QI intervention and the data collection plan):

  • Failure mode: failure to implement the QI intervention as intended.27 36 Potential consequence: impossible to learn whether the planned QI intervention works as expected; wasted effort; disillusionment among staff involved with intervention design.
  • Failure mode: failure to collect the data as intended.27 36 Potential consequence: undercuts the ‘study’ phase; may be difficult or impossible to tell whether the intervention worked as expected; difficult or impossible to learn about the effectiveness of the original data collection plan.
  • Failure mode: failure to capture unanticipated learning.17 22 27 Potential consequence: missed learning opportunities (especially for qualitative learning about how and why the intervention did/did not work); project failure; unnecessary PDSA cycles.
  • Failure mode: failure to abandon the ‘do’ phase despite manifest failure or severe negative side effects.24 Potential consequence: wasted effort; excessive disruption; adverse outcomes from side effects.

Study (analyse data and compare results to the definition of success; distil and communicate what has been learned from the formal data analysis and from unanticipated learning):

  • Failure mode: failure to conduct a study,5 or inappropriate failure to follow the study plan. Potential consequence: no or limited opportunity to learn whether the intervention works as intended; potential for biased and misleading results.
  • Failure mode: failure to communicate what has been learned.27 46 Potential consequence: loss of stakeholder engagement; reinventing the same broken wheel in the service of other QI projects; loss of institutional knowledge if there is turnover among project leaders.

Act (based on what has been learned, either: (1) revisit the investigation and problem framing phase; (2) begin a new PDSA cycle at the ‘plan’ phase; (3) fully implement and sustain the intervention; or (4) end the project without investing further effort):

  • Failure mode: failure to engage in ‘double loop learning’17 that questions the goals of the project in light of what has been learned. Potential consequence: wasted effort continuing to work on the wrong problem, or one that cannot realistically be solved; excessive PDSA cycles spent trying to achieve a goal that is set too high, when a more realistic goal would deliver real improvement.
  • Failure mode: moving too quickly from small-scale tests of change to full-scale implementation and sustainment.5 Potential consequence: failure to uncover barriers to broader use prior to implementation; project failure; disruption associated with deimplementation; wasted resources/goodwill.

PDSA, Plan-Do-Study-Act; QI, quality improvement.

The problem with PDSA: failure to invest in rigorous and tailored application

While the PDSA method is conceptually simple, simple does not mean easy. That said, PDSA is a powerful approach, and projects that make successful use of PDSA can solve specific quality problems and also help shape the culture of healthcare organisations for the better. So the effort required to apply PDSA successfully has a substantial return on investment. But the resources and supportive context required for success (including funding, methodological expertise, buy-in and sustained effort)18 are often underestimated. Inadequate human resources and financial support doom many projects to failure and also undermine organisational culture, contributing to change fatigue and disillusionment as yet another project produces no real improvement. It is therefore crucial, at both the project level and the programmatic level, that the resource requirements for successful application of PDSA for a given project are well understood and that the process is well managed.

The barriers to ensuring this type of practice in a healthcare culture of ‘just get on with it’ and ‘do, do, do’ are difficult to overcome. To be successful, the use of PDSA must be supported by a significant investment in leadership, expertise and resources for change.

Academia and researchers have a potential role to play in supporting appropriate rigour in planning and studying, and in understanding how to manage emergent learning while engaging diverse stakeholder groups. Working in partnership will be beneficial in supporting effective use of PDSA and is essential to establishing genuine learning organisations.19 20

Footnotes

Twitter: Follow Julie Reed at @julie4clahrc and Alan Card at @AlanJCard

Competing interests: None declared.

Disclaimer: This article presents independent research commissioned by the National Institute for Health Research (NIHR) under the Collaborations for Leadership in Applied Health Research and Care (CLAHRC) programme for North West London. The views expressed in this publication are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Provenance and peer review: Commissioned; internally peer reviewed.

References

  • 1. Langley GJ, Moen R, Nolan KM, et al. Changes that result in improvement. In: The improvement guide: a practical approach to enhancing organizational performance. 2nd edn. San Francisco: Jossey-Bass, 2009:15–25.
  • 2. Toussaint JS, Berry LL. The promise of lean in health care. Mayo Clin Proc 2013;88:74–82. 10.1016/j.mayocp.2012.07.025
  • 3. Schroeder RG, Linderman K, Liedtke C, et al. Six sigma: definition and underlying theory. J Oper Manag 2008;26:536–54. 10.1016/j.jom.2007.06.007
  • 4. Brannan KM. Total quality in health care. Hosp Mater Manage Q 1998;19:1–8.
  • 5. Taylor MJ, McNicholas C, Nicolay C, et al. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Saf 2014;23:290–8.
  • 6. Ogrinc G, Shojania KG. Building knowledge, asking questions. BMJ Qual Saf 2014;23:265–7. 10.1136/bmjqs-2013-002703
  • 7. Dixon-Woods M, Martin G, Tarrant C, et al. Safer Clinical Systems: evaluation findings. London, 2014. http://www.health.org.uk/sites/default/files/SaferClinicalSystemsEvaluationFindings_fullversion.pdf
  • 8. Taylor-Adams S, Vincent C. Systems analysis of clinical incidents: the London protocol. Clin Risk 2004;10:211–20. 10.1258/1356262042368255
  • 9. Benning A, Ghaleb M, Suokas A, et al. Large scale organisational intervention to improve patient safety in four UK hospitals: mixed method evaluation. BMJ 2011;342:d195. 10.1136/bmj.d195
  • 10. Berwick DM. Developing and testing changes in delivery of care: a cardiologist's perspective. Ann Intern Med 1998;128:651–6. 10.7326/0003-4819-128-8-199804150-00009
  • 11. Nanji KC, Ferris TG, Torchiana DF, et al. Overarching goals: a strategy for improving healthcare quality and safety? BMJ Qual Saf 2013;22:187–93. 10.1136/bmjqs-2012-001082
  • 12. Rittel H, Webber M. Dilemmas in a general theory of planning. Policy Sci 1973;4:155–69. 10.1007/BF01405730
  • 13. Sevaldson B. Discussions & movements in design research: a systems approach to practice research in design. Form Akad 2010;3:8–35.
  • 14. Dixon-Woods M. Why is patient safety so hard? A selective review of ethnographic studies. J Health Serv Res Policy 2010;15:11–16. 10.1258/jhsrp.2009.009041
  • 15. Padula WV, Duffy MP, Yilmaz T, et al. Integrating systems engineering practice with health-care delivery. Health Syst 2014;3:1–11. 10.1057/hs.2013.18
  • 16. de Saint Maurice G, Auroy Y, Vincent C, et al. The natural lifespan of a safety policy: violations and system migration in anaesthesia. Qual Saf Health Care 2010;19:327–31. 10.1136/qshc.2008.029959
  • 17. Argyris C. Double loop learning in organizations. Harv Bus Rev 1977;55:115–26.
  • 18. Kaplan HC, Provost LP, Froehle CM, et al. The Model for Understanding Success in Quality (MUSIQ): building a theory of context in healthcare quality improvement. BMJ Qual Saf 2012;21:13–20. 10.1136/bmjqs-2011-000010
  • 19. Marshall M, Pagel C, French C, et al. Moving improvement research closer to practice: the Researcher-in-Residence model. BMJ Qual Saf 2014;23:801–5. 10.1136/bmjqs-2013-002779
  • 20. National Advisory Group on the Safety of Patients in England. A promise to learn – a commitment to act: improving the safety of patients in England. London, 2013. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/226703/Berwick_Report.pdf
  • 21. Batalden P. Making improvement interventions happen—the work before the work: four leaders speak. BMJ Qual Saf 2014;23:4–7. 10.1136/bmjqs-2013-002440
  • 22. Card AJ, Ward JR, Clarkson PJ. Rebalancing risk management—part 1: the Process for Active Risk Control (PARC). J Healthc Risk Manag 2014;34:21–30. 10.1002/jhrm.21155
  • 23. Dorst K. The core of “design thinking” and its application. Des Stud 2011;32:521–32. 10.1016/j.destud.2011.07.006
  • 24. Walley P, Gowland B. Completing the circle: from PD to PDSA. Int J Health Care Qual Assur 2004;17:349–58. 10.1108/09526860410557606
  • 25. Harvey G, Jas P, Walshe K. Analysing organisational context: case studies on the contribution of absorptive capacity theory to understanding inter-organisational variation in performance improvement. BMJ Qual Saf 2015;24:48–55. 10.1136/bmjqs-2014-002928
  • 26. Portela MC, Pronovost PJ, Woodcock T, et al. How to study improvement interventions: a brief overview of possible study types. BMJ Qual Saf 2015;24:325–36. 10.1136/bmjqs-2014-003620
  • 27. Platts N, Shepherd S. Managing change in critical care: a toolkit for practice. Critical Care Outreach; 2006:243–68.
  • 28. Davidoff F, Dixon-Woods M, Leviton L, et al. Demystifying theory and its use in improvement. BMJ Qual Saf 2015;24:228–38. 10.1136/bmjqs-2014-003627
  • 29. Foy R, Ovretveit J, Shekelle PG, et al. The role of theory in research to develop and evaluate the implementation of patient safety practices. BMJ Qual Saf 2011;20:453–9. 10.1136/bmjqs.2010.047993
  • 30. Dixon-Woods M, Tarrant C, Willars J, et al. How will it work? A qualitative study of strategic stakeholders’ accounts of a patient safety initiative. Qual Saf Health Care 2010;19:74–8. 10.1136/qshc.2008.029504
  • 31. Reed JE, McNicholas C, Woodcock T, et al. Designing quality improvement initiatives: the action effect method, a structured approach to identifying and articulating programme theory. BMJ Qual Saf 2014;23:1040–8. 10.1136/bmjqs-2014-003103
  • 32. Card AJ, Ward JR, Clarkson PJ. Trust-level risk evaluation and risk control guidance in the NHS east of England. Risk Anal 2014;34:1471–81. 10.1111/risa.12159
  • 33. Hillson D. Developing effective risk responses. Proceedings of the 30th Annual Project Management Institute 1999 Seminars & Symposium, Philadelphia, Pennsylvania, USA, 1999.
  • 34. Mills PD, Neily J, Kinney LM, et al. Effective interventions and implementation strategies to reduce adverse drug events in the Veterans Affairs (VA) system. Qual Saf Health Care 2008;17:37–46. 10.1136/qshc.2006.021816
  • 35. Mills PD, Neily J, Luan DD, et al. Using aggregate root cause analysis to reduce falls and related injuries. Jt Comm J Qual Patient Saf 2005;31:21–31.
  • 36. Wandersman A, Alia K, Cook B, et al. Integrating empowerment evaluation and quality improvement to achieve healthcare improvement outcomes. BMJ Qual Saf 2015;24:645–52. 10.1136/bmjqs-2014-003525
  • 37. Card AJ, Ward J, Clarkson PJ. Successful risk assessment may not always lead to successful risk control: a systematic literature review of risk control after root cause analysis. J Healthc Risk Manag 2012;31:6–12. 10.1002/jhrm.20090
  • 38. Bagian JP. Health care and patient safety: the failure of traditional approaches—how human factors and ergonomics can and must help. Hum Factors Ergon Manuf 2012;22:1–6. 10.1002/hfm.20261
  • 39. Card AJ, Simsekler MCE, Clark M, et al. Use of the Generating Options for Active Risk Control (GO-ARC) technique can lead to more robust risk control options. Int J Risk Saf Med 2014;26:199–211. 10.3233/JRS-140636
  • 40. Card AJ, Ward JR, Clarkson PJ. Rebalancing risk management—part 2: the Active Risk Control (ARC) toolkit. J Healthc Risk Manag 2015;34:4–17. 10.1002/jhrm.21160
  • 41. Card AJ, Ward JR, Clarkson PJ. Generating Options for Active Risk Control (GO-ARC): introducing a novel technique. J Healthc Qual 2014;36:32–41. 10.1111/jhq.12017
  • 42. Iedema R, Jorm C, Braithwaite J. Managing the scope and impact of root cause analysis recommendations. J Health Organ Manag 2008;22:569–85. 10.1108/14777260810916551
  • 43. Card AJ. A new tool for hazard analysis and force field analysis: the lovebug diagram. Clin Risk 2013;19:87–92. 10.1177/1356262213510855
  • 44. Greenhalgh T, Robert G, Macfarlane F, et al. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004;82:581–629.
  • 45. Oyler J, Vinci L, Johnson JK, et al. Teaching internal medicine residents to sustain their improvement through the quality assessment and improvement curriculum. J Gen Intern Med 2011;26:221–5. 10.1007/s11606-010-1547-y
  • 46. Larson DB, Mickelsen LJ. Project management for quality improvement in radiology. Am J Roentgenol 2015;205:W470–7. 10.2214/AJR.15.14807
  • 47. Card AJ. The Active Risk Control (ARC) toolkit: a new approach to designing risk control interventions. J Healthc Risk Manag 2014;33:5–14. 10.1002/jhrm.21137
