NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Evidence Brief: Implementation of High Reliability Organization Principles

Investigators: , MPH, , MS, and , MPH.

Washington (DC): Department of Veterans Affairs (US).

Preface

The VA Evidence Synthesis Program (ESP) was established in 2007 to provide timely and accurate syntheses of targeted health care topics of importance to clinicians, managers, and policymakers as they work to improve the health and health care of Veterans. These reports help:

  • Develop clinical policies informed by evidence;
  • Implement effective services to improve patient outcomes and to support VA clinical practice guidelines and performance measures; and
  • Set the direction for future research to address gaps in clinical knowledge.

The program is comprised of four ESP Centers across the US and a Coordinating Center located in Portland, Oregon. Center Directors are VA clinicians and recognized leaders in the field of evidence synthesis with close ties to the AHRQ Evidence-based Practice Center Program and Cochrane Collaboration. The Coordinating Center was created to manage program operations, ensure methodological consistency and quality of products, and interface with stakeholders. To ensure responsiveness to the needs of decision-makers, the program is governed by a Steering Committee comprised of health system leadership and researchers. The program solicits nominations for review topics several times a year via the program website.

Comments on this evidence report are welcome and can be sent to Nicole Floyd, Deputy Director, ESP Coordinating Center, at Nicole.Floyd@va.gov.

Executive Summary

Background

The ESP Coordinating Center (ESP CC) is responding to a request from the VA National Center for Patient Safety for a rapid evidence review on implementing High Reliability Organization (HRO) principles into practice. Findings from this review will be used to inform the implementation of the VA’s High Reliability Organization Initiative.

Methods

To identify studies, we searched MEDLINE®, PsycINFO, CINAHL, the Cochrane Central Register of Controlled Trials, and other sources from January 2010 to January 2019. We used prespecified criteria for study selection, data abstraction, and rating internal validity and strength of the evidence. Full methods are available on the PROSPERO register of systematic reviews (CRD42019125602).

Objective: To systematically evaluate literature on frameworks for high reliability organization (HRO) implementation, metrics for evaluating a health system’s progress towards becoming an HRO, and effects of HRO implementation on process and patient safety outcomes.

Key Findings

  • We identified 5 common HRO implementation strategies across 8 frameworks. Based on those, the Joint Commission’s High Reliability Health Care Maturity Model (HRHCM) and the Institute for Healthcare Improvement’s Framework for Safe, Reliable, and Effective Care emerged as the most comprehensive, as they included all 5 strategies, contained sufficient detail to guide implementation, and were the most rigorously developed and widely applicable.
  • The Joint Commission’s HRHCM/Oro™ 2.0 is the most rigorously developed and validated tool available for evaluating health care organizations’ progress towards becoming an HRO; however, it has some conceptual gaps that may be addressed by incorporating metrics from other evaluation tools.
  • Multicomponent HRO interventions sustained over multiple years are associated with improved process outcomes (eg, staff reporting of safety culture) and patient safety outcomes (eg, serious safety events). However, the overall strength of evidence is low, as each HRO intervention was only supported by a single fair-quality study.

High Reliability Organizations (HROs) are organizations that achieve safety, quality, and efficiency goals by employing 5 central principles: (1) sensitivity to operations (ie, heightened awareness of the state of relevant systems and processes); (2) reluctance to simplify (ie, the acceptance that work is complex, with the potential to fail in new and unexpected ways); (3) preoccupation with failure (ie, to view near misses as opportunities to improve, rather than proof of success); (4) deference to expertise (ie, to value insights from staff with the most pertinent safety knowledge over those with greater seniority); and (5) practicing resilience (ie, to prioritize emergency training for many unlikely, but possible, system failures). Nuclear power and aviation are classic examples of industries that have applied HRO principles to achieve minimal errors, despite highly hazardous and unpredictable conditions. As medical errors are estimated to be the third leading cause of death in the US, a growing number of health care systems are taking interest in adopting HRO principles. In 2008, the Agency for Healthcare Research and Quality (AHRQ) published a seminal white paper that described the application of the 5 key HRO principles in health care settings, including the specific challenges that threaten reliability in health care, such as higher workforce mobility and care of patients rather than machines. Adoption of these HRO principles in health care offers promise of increased excellence; however, major barriers to widespread implementation include difficulty in adopting organization-level safety culture principles into practice; competing priorities between HRO and other large-scale organizational transformation initiatives such as electronic health records; and difficulty in creating and implementing process improvement tools and methods to address complex, system-level problems.

In February 2019, the Department of Veterans Affairs (VA) rolled out a new initiative outlining the definitive steps toward becoming an HRO. As literature has emerged to guide health systems in implementing and evaluating their HRO journey, an understanding of the quality and applicability of existing HRO resources is important to developing best practices, identifying barriers and facilitators to implementation, measuring progress, identifying knowledge gaps, and spreading implementation initiatives to other systems. In this review, we evaluate literature on the frameworks for HRO implementation, metrics for evaluating a health system’s progress towards becoming an HRO, and effects of HRO implementation on process and patient safety outcomes.

We identified 20 articles published on HRO frameworks, metrics, and evidence of effects. Eight articles addressed frameworks, and of these, the Joint Commission’s High Reliability Health Care Maturity Model (HRHCM) and the Institute for Healthcare Improvement’s (IHI) Framework for Safe, Reliable, and Effective Care emerged as the most comprehensive, rigorously developed, applicable, and sufficiently detailed to guide implementation. The most commonly reported implementation strategies across the 8 frameworks were: (1) developing leadership, (2) supporting a culture of safety, (3) building and using data systems to track progress, (4) providing training and learning opportunities for providers and staff, and (5) implementing interventions to address specific patient safety issues. Most of these frameworks were developed via a consensus process – typically with a group of health system leaders and experts in patient safety – and were intended to be implemented by a variety of health care providers and staff. Articles varied in the depth of information provided on how to implement these frameworks, with some providing specific guidance on implementation activities such as workshops and time frames for implementation and others providing overarching, conceptual guidance.

Eight articles and 1 online tool described metrics for measuring a health system’s progress towards becoming an HRO. The Oro™ 2.0 tool emerged as the most rigorously designed and validated, as it was developed by a leading group in health care improvement, informed by industry leaders across HROs, and tested in a total of 52 US hospitals both within and outside of the VA. Otherwise, metrics varied in the concepts measured, ranging from surveys on culture of safety to the extent of integration of HRO principles into practice. The process for developing these metrics also varied by tool: many groups relied on a literature review or expert consensus, whereas others underwent rounds of revisions and piloted their tool in multiple hospital settings.

Seven articles evaluated the effects of HRO implementation, primarily in children’s hospitals. The most notable finding is that organizations experienced significant reductions in serious safety events (range, 55% to 100%) following implementation of the 4 most comprehensive, multicomponent HRO initiatives. Moreover, time since initiation and safety improvements appear to have a dose-response relationship. Only one of these studies explicitly discussed using a framework identified in Key Question 1 (ie, the IHI framework). Common implementation activities included basic error prevention training for staff, leadership training for leaders, enhanced root cause analysis processes using an electronic tracking system, provider peer safety coaches who train colleagues in error prevention techniques, routine sharing of good catches and lessons learned, and increased communication through safety huddles. Facilitators of implementation included hiring an outside consultant (eg, Healthcare Performance Improvement), leadership commitment to implementing HRO principles, and enacting policies to facilitate data sharing. Barriers to implementation included competing priorities (eg, widescale implementation of an electronic medical record system) and high costs.

A major limitation of the literature is that none of these studies compared an HRO intervention to a concurrent control group. Therefore, it is difficult to determine whether these effects are due to HRO implementation or a concurrent intervention or secular trend. Studies also lacked information on whether intervention components were delivered with fidelity over time and whether the interventions were associated with unintended effects on provider workload or efficiency. Future HRO implementation research should utilize quasi-experimental designs, such as natural experiments that deliver HRO interventions at a group of sites with other sites serving as a wait list control, to evaluate the effects of specific intervention components and assess the mechanism of change driving outcomes.

Introduction

Purpose

The ESP Coordinating Center (ESP CC) is responding to a request from the Department of Veterans Affairs (VA) National Center for Patient Safety for a rapid evidence review on implementing High Reliability Organization (HRO) principles into practice. The purpose of this review is to evaluate the literature on frameworks, metrics, and evidence of effects of HRO implementation. Findings from this review will be used to inform the implementation of the VA’s HRO Initiative.

Background

In their 2000 report “To Err is Human,” the Institute of Medicine’s (IOM) Committee on Quality of Health Care in America cited deaths due to medical errors as more common than those due to motor vehicle accidents, breast cancer, or AIDS.1 Despite continued widespread, discrete process improvement initiatives – such as handwashing protocols, patient identification to reduce ‘wrong person’ procedures, protocols for clear communication between care teams, and visual indicators for high risks such as fall injury or allergies – a 2016 British Medical Journal report estimated that medical errors continue to be the third leading cause of death in the US.2 Additionally, the IOM Committee identified care fragmentation as a root cause of medical errors.1 In response, they called for a comprehensive, system-level approach to improving patient safety that shifts the focus away from a culture of blame to one of error analysis and process improvement. Therefore, health care organizations have begun to explore system-level approaches to cultivating a culture of safety, with a focus on collaboration, communication, and coordination.

AN HRO IS:

“An organization that experiences fewer than anticipated accidents or events of harm, despite operating in highly complex, high-risk environments.”

HRO is one such organizational approach to achieving safety, quality, and efficiency goals.3,4 At the core of HRO is a culture of “‘collective mindfulness’, in which all workers look for, and report, small problems or unsafe conditions before they pose a substantial risk to the organization and when they are easy to fix.”3,5 Use of HRO is designed to change the thinking about patient safety through the following 5 principles: (1) sensitivity to operations (ie, heightened awareness of the state of relevant systems and processes); (2) reluctance to simplify (ie, the acceptance that work is complex, with the potential to fail in new and unexpected ways); (3) preoccupation with failure (ie, to view near misses as opportunities to improve rather than proof of success); (4) deference to expertise (ie, to value insights from staff with the most pertinent safety knowledge over those with greater seniority); (5) resilience (ie, to prioritize emergency training for many unlikely but possible system failures).4 See Figure 1 below.

Figure 1. Five HRO principles.

HRO was originally pioneered in extremely hazardous industries, such as nuclear power and commercial aviation, where even the smallest of errors can lead to tragic results. These industries have achieved and sustained extraordinary safety levels, thereby generating much interest in how to adapt HRO principles to health care and replicate this success. In their 2007 book “Managing the Unexpected,” Weick and Sutcliffe define the 5 principles of HROs and describe how these principles can be applied to improve reliability across diverse industries.5 In their seminal 2008 Agency for Healthcare Research and Quality (AHRQ) white paper, Hines et al apply these 5 principles to health care settings and describe the specific challenges threatening health care reliability, such as higher workforce mobility and care of patients rather than machines.4 Implementation of HRO initiatives in health care settings is an inherently complex and costly process that involves organizing people, processes, and resources across often large organizations. For example, the Nationwide Children’s Hospital’s HRO journey involved increasing their quality improvement (QI) personnel from 8 in 2007 to 33 in 2012, with a budget increase from $690K to $3.3M.6 External consultants, such as Healthcare Performance Improvement, LLC, can provide support to organizations undertaking an HRO journey. HRO interventions commonly include activities like basic error prevention education; leadership training in reinforcement approaches; enhanced root cause analysis processes using an electronic tracking system; promotion of a ‘just culture’ – one in which providers and staff are not unfairly penalized for mistakes – that supports routine reporting of errors; sharing good catches and lessons learned; and training in error prevention techniques by provider peer safety coaches.

Examples of health systems’ successful adoption of HRO principles are already emerging. Providence St. Joseph Health – a national, not-for-profit Catholic health system comprised of more than 50 hospitals, 800 clinics, and 5 million patients across 7 states – has had success implementing their HRO program, Caring Reliably. Two years after implementing the program – which included partnering with an outside consulting firm to coach them through a leader toolkit focused on culture and an all-staff toolkit focused on error reduction – Providence St. Joseph Health experienced a 5% improvement in the safety climate domain of the Safety Attitudes Questionnaire and a 52% decrease in serious safety events (G. Battey, oral communication, February 2019).7 The VA has also experienced HRO implementation successes. The Harry S. Truman Memorial Veterans Hospital began a 3-year HRO project in March 2015 by partnering with the VA National Center for Patient Safety to deliver Clinical Team Training to every inpatient and outpatient clinical service. This included formal interactive classroom training, application of the principles in a project unique to each clinical area, and refresher classroom and simulation training after one year. In May 2016, Truman VA augmented their HRO program using a 23-module HRO Toolkit provided by VISN 15 as part of its HRO initiative rolled out across all 7 of its medical centers. According to Truman VA Associate Director Robert Ritter (R. Ritter, oral communication, February 2019), their HRO program has already resulted in remarkable improvements in staff attitudes and perceptions and significantly increased participation in morning multidisciplinary huddles.

However, despite the promise of increased excellence described in the Joint Commission’s 2013 HRO report,3 major barriers to widespread HRO implementation readiness at the VA and elsewhere include the complexity of incorporating safety culture principles and practices organization-wide and of prioritizing the adoption of process improvement tools and methods among other competing priorities.

To reaffirm their commitment to high reliability and zero harm (working to “reduce errors and to ensure that any errors that may occur do not reach our patients and cause harm”), in February 2019, the VA rolled out a new initiative outlining the definitive steps for becoming an HRO.8 The first step is for HRO activities to begin at 18 lead facilities selected based on greater readiness as demonstrated by higher levels of safety performance, leadership commitment, and staff engagement. Initial HRO activities include the establishment of work groups, performance readiness assessments, and conducting training events and programs. Following analysis of lessons learned from these lead sites, the VA plans a national roll-out to achieve the goal of a VA-wide HRO transformation. To ensure success of HRO-related activities and consistent outcomes across the enterprise, VA is using resources from the Joint Commission Center for Transforming Healthcare resource library, including the Oro 2.0 High Reliability Assessment tool. Additionally, VA is working on developing a standard set of HRO tools, including training, implementation models, and measures.

Emerging literature can guide health systems in implementing and evaluating their HRO journey.9,10 However, understanding of available frameworks, metrics, and initiatives and their use is currently limited by their complexity and by wide variability in their key characteristics: their target participants (eg, leadership, medical staff), their foundations, their structures, which of the 5 HRO principles they address, and their intended health system settings. Understanding the quality and applicability of existing HRO resources is important to developing best practices, identifying barriers and facilitators to implementation, spreading implementation initiatives to other systems, measuring progress, and identifying knowledge gaps.

Scope

This rapid evidence review will address the following key questions and eligibility criteria:

Key Questions

Key Question 1.

What are the frameworks for guiding HRO implementation?

Key Question 1A.

What are the main implementation strategies of these frameworks?

Key Question 1B.

What were the processes for developing these frameworks (eg, consensus, literature review, etc)?

Key Question 1C.

What are the intended settings of these frameworks?

Key Question 1D.

Who participates in implementing these frameworks?

Key Question 1E.

What are the processes for implementing these frameworks?

Key Question 2.

What are the metrics for measuring a health system’s progress towards becoming an HRO?

Key Question 2A.

What are the main characteristics (ie, domains, scales) of these metrics?

Key Question 2B.

What were the processes for developing these metrics (eg, consensus, literature review, etc)?

Key Question 2C.

To what extent have these metrics been validated or used to inform health system decision-making?

Key Question 3.

What is the evidence on HRO implementation effects?

Key Question 3A.

On patient safety/organizational change goals (eg, number of sites that met goal of 50% reduction in serious safety events)?

Key Question 3B.

On patient safety/organizational change measures (eg, mean change in number of serious safety events)?

Key Question 3C.

On process measures (eg, mean change in inter-departmental communication, provider or patient satisfaction)?

Eligibility Criteria

The ESP included articles published from January 2010 to January 2019 that describe implementation frameworks, metrics for measuring progress towards becoming an HRO, and the effects of HRO implementation. The timeframe of 2010 and onward was selected because it is 2 years after the publication of AHRQ’s 2008 white paper, by which point one could reasonably expect publication of new research on implementing HRO principles in health care settings. To be included, articles needed to be explicitly grounded in HRO theory and specifically seek to advance organizational or cultural change. We operationalized this by only including articles that evaluated HRO principles at the organization level or higher (ie, we excluded articles on HRO implementation in individual departments). Outcomes for KQ3 include any that are linked to the pathway between the 5 principles of HROs (ie, sensitivity to operations, reluctance to simplify, preoccupation with failure, deference to expertise, and resilience) and the ultimate goal of health care organizations: exceptionally safe, consistently high-quality care, as outlined in the AHRQ white paper.4 See Figure 2 below for the logic model linking the 5 HRO principles to the end goal of improved patient safety outcomes, based on the model described in Hines 2008.4

Figure 2. HRO logic model.

We prioritized articles using a best-evidence approach to accommodate the timeline (ie, we considered meeting safety goals [KQ3A] to be a higher priority than intermediate outcomes [KQ3B and KQ3C]). We also prioritized evidence from systematic reviews and multisite comparative studies that adequately controlled for potential patient-, provider-, and system-level confounding factors. We only accepted inferior study designs (eg, single-site, inadequate control for confounding, noncomparative) to fill gaps in higher-level evidence.

Methods

To identify articles relevant to the key questions, our research librarian searched MEDLINE, CINAHL, PsycINFO, and Cochrane Central Register of Controlled Trials (CCRT) using terms for high reliability and health care from January 2010 to January 2019 (see Supplemental Materials Appendix A for complete search strategies). Additional citations were identified by hand-searching reference lists and consultation with content experts. We limited the search to published and indexed articles involving human subjects available in the English language. Study selection was based on the eligibility criteria described above. Titles, abstracts, and full-text articles were reviewed by one investigator and checked by another. All investigators have expertise in conducting systematic reviews of health services research. Any disagreements were resolved by consensus.

No standard tool is currently available to assess the quality of complex interventions. We therefore culled concepts from reporting checklists for complex interventions, QI initiatives, and implementation interventions – including the Standards for Quality Improvement Reporting Excellence (SQUIRE 2.0),11 Standards for Reporting Implementation Studies (StaRI),12 and Template for Intervention Description and Replication (TIDieR)13 – to develop a 7-item quality assessment checklist. Through this checklist, we evaluated whether the study adequately reported on (1) the conceptual link between the intervention and HRO principles, (2) intervention components and delivery, (3) implementation fidelity, (4) evaluation of the intervention, (5) adverse events, (6) confounders, and (7) the use of a concurrent control group. We considered items 1-4 to be basic criteria in determining whether the study was reported well enough to be reproduced. We considered items 5-7 to be advanced criteria that would increase our confidence that bias was minimized in the study results (see Supplemental Materials Appendix C for detailed information on the quality assessment checklist). All quality assessments were completed by one reviewer and then checked by another. We did not quantify inter-rater reliability through a kappa statistic; however, qualitatively, our agreement was generally high. Disagreements were generally limited to interpretation of individual risk of bias domains and not overall risk of bias ratings for a study. We resolved all disagreements by consensus.

We abstracted data from all studies and results for each included outcome. All data abstraction and internal validity ratings were first completed by one reviewer and then checked by another. We resolved all disagreements by consensus. We informally graded the strength of the evidence based on the AHRQ Methods Guide for Comparative Effectiveness Reviews by considering study limitations (includes study design and aggregate quality), consistency, directness, and precision of the evidence.14 Ratings typically range from high to insufficient, reflecting our confidence that the evidence reflects the true effect.

Where studies were appropriately homogenous, we synthesized outcome data quantitatively using StatsDirect statistical software (StatsDirect Ltd. 2013, Altrincham, UK) to conduct random-effects meta-analysis to estimate pooled effects. We assessed heterogeneity using the Q statistic and the I2 statistic. Where meta-analysis was not suitable due to limited data or heterogeneity, we synthesized the evidence qualitatively.
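The random-effects pooling and heterogeneity statistics described above can be sketched as follows. This is an illustrative DerSimonian-Laird calculation with hypothetical study data (the function name and the three example effect estimates are assumptions for illustration, not the report’s actual analysis):

```python
import math

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate with Q and I^2.

    effects: per-study effect estimates (e.g., log rate ratios)
    variances: per-study sampling variances
    """
    k = len(effects)
    w = [1.0 / v for v in variances]  # inverse-variance (fixed-effect) weights
    pooled_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (yi - pooled_fe) ** 2 for wi, yi in zip(w, effects))
    df = k - 1
    # Between-study variance (tau^2), truncated at zero
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # I^2: percentage of total variability attributable to heterogeneity
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # Random-effects weights incorporate tau^2, widening the interval
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled_re = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))
    ci = (pooled_re - 1.96 * se_re, pooled_re + 1.96 * se_re)
    return pooled_re, ci, q, i2

# Hypothetical example: three studies' log rate ratios and variances
est, ci, q, i2 = random_effects_meta([-0.5, -0.8, -0.3], [0.04, 0.09, 0.05])
```

When I² is high (commonly interpreted as above 50-75%), pooling is questionable and a qualitative synthesis, as described above, is the more defensible choice.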

Throughout the report, we use the following terminology to describe different levels of HRO theory and implementation (Table 1).

Table 1. HRO terminology used throughout report.

The complete description of our full methods is available on the PROSPERO international prospective register of systematic reviews (http://www.crd.york.ac.uk/PROSPERO/; registration number CRD42019125602). A draft version of this report was reviewed by peer reviewers as well as clinical leadership. Their comments and our responses are presented in the Supplemental Materials (see Appendix D).

Results

Literature Flow

The literature flow diagram (Figure 3) summarizes the results of search and study selection (see Supplemental Materials Appendix B for a full list of excluded studies). Our search identified 525 unique, potentially relevant articles. Of these, we included 20 articles that addressed one or more of our key questions. Eight articles addressed Key Question 1,3,15–21 8 articles addressed Key Question 2,3,16,22–27 and 7 articles addressed Key Question 3.6,17,28–32

Figure 3. Literature flowchart.

KEY QUESTION 1. What are the frameworks for guiding HRO implementation?

We identified 8 frameworks that guide implementation of HRO principles into a health care system: the Joint Commission’s High Reliability Health Care Maturity Model (HRHCM)3; the Institute for Healthcare Improvement’s (IHI) Framework for Safe, Reliable and Effective Care18; the American College of Healthcare Executives’ (ACHE) Culture of Safety framework16; 2 frameworks developed at Johns Hopkins’ (JH) Armstrong Institute for Patient Safety and Quality including an Operating Management System17 and a Safety and Quality framework15; the Office of the Air Force Surgeon General’s Trusted Care framework19; Advancing Research and Clinical Practice through close Collaboration (ARCC) Model20; and a framework focused on developing high reliability teams.21 The Joint Commission’s HRHCM and IHI Framework for Safe, Reliable and Effective Care emerged as the most comprehensive, as they both covered all 5 strategies commonly reported across frameworks (Figure 4); were the most rigorously developed; were broadly applicable; and were sufficiently detailed to inform implementation.

Figure 4. 5 Common HRO implementation strategies.

Appendix C of the supplementary materials contains full details on these frameworks’ implementation strategies, development process, and intended settings, as well as staff and processes required for implementation. Highlighted findings appear below.

KQ1A. What are the main implementation strategies of these frameworks?

Table 2 summarizes the 5 commonly reported key HRO implementation strategies from these 8 frameworks.

Table 2. Common HRO implementation strategies across 8 identified frameworks.

The first key strategy is developing leadership. The Joint Commission discussed the need for leadership (eg, board members, CEO/management, and lead physicians) to commit to the goal of zero patient harm.3 IHI described the need for leaders to facilitate and mentor teamwork, improvement, respect, and psychological safety.18 ACHE incorporated elements from both of these frameworks, including selecting, developing, and engaging a board; prioritizing safety in the selection and development of leaders; and establishing a compelling vision for safety. The JH Operating Management System framework and the Air Force emphasized the importance of leadership accountability.17,19 The JH Safety and Quality framework encouraged QI leaders to pursue formal degrees to support their work.15 The ARCC and high reliability team models did not explicitly discuss leadership as a key strategy, although the ARCC model did discuss the importance of developing and using mentors to guide evidence-based decision-making.20

The second key strategy is supporting a culture of safety. The Joint Commission described building trust, accountability, identifying unsafe conditions, strengthening systems, and assessment as key activities within this strategy.3 The IHI listed culture, including psychological safety, accountability, teamwork and communication, and negotiation, as one of their 2 major domains.18 The ACHE named their framework “culture of safety” and emphasized the need to both lead and reward a just culture and establish organizational behavior expectations.16 The Air Force described the importance of trust between leaders and staff, respectful communication, and willingness to admit errors within their culture of safety domain.19 The ARCC model incorporated an assessment of culture as a key aspect of implementation,20 and the high reliability team model emphasized that responses to poor outcomes should be based on behavioral choices and not severity of outcome.21 Neither JH framework explicitly discussed culture of safety.

The third key strategy is building and using data systems to measure progress. The Joint Commission discussed the need to track and display quality measures and to involve IT support in the development of solutions to quality problems.3 IHI described the need for open sharing of data and other information concerning safe, respectful, and reliable care and to continually improve work processes and measure progress over time.18 The JH Operating Management System discussed the need to share and synthesize data to gain insights to make new discoveries and improve processes,17 and their Safety and Quality framework included a plan to evaluate processes.15 The Air Force described standardizing processes to gather and share information about patient care episodes, knowledge data, and processes to improve care delivery.19 The ARCC model described data management and outcomes monitoring as one of their implementation workshops.20 The high reliability team model did not include a strategy related to measurement of progress.

The fourth key strategy is providing training and learning opportunities for providers and staff. The Joint Commission discussed the importance of training all staff on robust process improvement (eg, a blended performance improvement model aimed at improving patient safety in health care settings by integrating Lean Six Sigma and formal change management principles) as appropriate to their jobs.3,33 IHI and the Air Force discussed developing learning systems, although the learning has more to do with implementing QI initiatives and learning from results than with learning how to implement HRO principles.18,19 The JH Safety and Quality framework listed examples of training that each type of staff member should receive.15 The ARCC model described a workshop dedicated to evidence-based practice skills-building,20 and the high reliability team model discussed implementation of TeamSTEPPS, a teamwork curriculum for health care staff.21 ACHE and the JH Operating Management System did not specifically discuss training or learning opportunities.

The fifth key strategy is implementing quality improvement interventions to address specific patient safety issues. This strategy is discussed in broad strokes as robust process improvement by the Joint Commission and Air Force,3,19 and as improvement and measurement by the IHI.18 In the ARCC model, participants complete a 12-month evidence-based practice implementation project focused on improving quality of care, safety, and patient outcomes.20 The JH Safety and Quality framework discussed the role of safety and quality experts in designing and directing system improvement efforts and provided examples of potential initiatives.15 The high reliability team framework described simulation training in which teams can practice briefing, huddle, and debriefing strategies.21 Neither the ACHE nor the JH Operating Management System explicitly discussed QI initiatives.

In addition, we identified several complementary practices for strengthening implementation. We identified these by looking across the 8 frameworks to see what complementary practices were commonly recommended. These complementary practices are meant to be applied across implementation strategies to strengthen the overall delivery of HRO.

  • Incorporation of justice, equity, and patient-centeredness: The ACHE described building trust, respect, and inclusion as a key domain of building a culture of safety.16 The framework encourages leaders to value diversity and inclusion when selecting leaders and staff and to work towards evaluating and eliminating disparities in patient care. The Air Force selected patient-centeredness as a key domain of its framework.19 This practice could be integrated into HRO delivery through activities such as hiring a diverse workforce or prioritizing QI initiatives that address safety issues disproportionately affecting patients from racial/ethnic minority groups.
  • Involvement of a variety of stakeholders in health care delivery, including patients and families: The JH Operating Management System described establishing patient and family advisory councils as an implementation activity that could be undertaken to advance one of its key implementation strategies.17 Other possible activities include assessing patient perspectives on the culture of safety or inviting patients to serve on HRO leadership committees.
  • Assembling transdisciplinary teams: Several frameworks – including the JH Operating Management System,17 ARCC model,20 and high reliability team framework21 – discuss forming transdisciplinary teams as an important activity for advancing HRO. This practice could be integrated into HRO delivery through activities like inviting providers from different specialties to attend daily safety huddles, or having nurses, physicians, and staff all attend the same HRO training sessions together.
  • Utilizing change management strategies such as Lean Six Sigma to promote change: Most frameworks recommended that health systems use complementary change management strategies – such as Lean Six Sigma,15–17 IHI’s Model for Improvement,18 or a combination of strategies such as the Joint Commission’s robust process improvement3,19 – to implement HRO principles into practice. This complementary practice could be integrated into several aspects of HRO delivery, such as training staff on Lean Six Sigma, or applying Lean thinking to root cause analysis to identify the factors contributing to patient safety events and then identify and implement solutions.

KQ1B. What were the processes for developing these frameworks (eg, consensus, literature review, etc)?

The Joint Commission’s HRHCM stood out as the most rigorously developed framework, as the process involved a literature review, consensus among subject-matter experts, pilot testing with an expert panel, and pilot testing with leadership at 7 US hospitals. However, the latter pilot testing effort primarily focused on evaluating the tool used to measure a health system’s progress on the framework (KQ2). The Air Force19 and JH Safety and Quality framework15 were developed through both a literature review and consultation with health care leaders and content experts. The IHI18 framework was developed specifically for the IHI Patient Safety Executive Development Program curriculum and was informed by an analysis of high-performing, proactive, and generative work settings. The ACHE framework was developed through a partnership between the ACHE, the IHI, and the National Patient Safety Foundation (NPSF) Lucian Leape Institute (LLI). It involved consensus-building with industry leaders and experts who have had success in transforming their organizations into system-wide cultures of safety.16 The ARCC model was initially developed through a strategic planning process on how to rapidly integrate research findings into clinical processes.20 The 2 remaining articles did not discuss how their frameworks were developed (JH Operating Management System,17 high reliability teams21).

KQ1C. What are the intended settings of these frameworks?

All frameworks were intended to be delivered in any health care delivery setting, except for the Air Force’s framework, which was designed specifically for the Air Force Medical Service.19 IHI’s framework was initially developed for use in acute care settings, although it has since evolved to be applicable to other settings.18

KQ1D. Who participates in implementing these frameworks?

Most frameworks were intended to be implemented by a variety of health care leaders, providers, and staff – including frontline providers, local and middle managers, high-level managers and executives, and safety and quality leaders – across a variety of service areas.3,15,18–20 IHI’s framework also included components to be implemented by patients and families.18 Exceptions are the ACHE16 and JH Operating Management System17 frameworks, which were specifically designed for health care leadership, and the high reliability team framework, which was designed for nursing professionals.21

KQ1E. What are the processes for implementing these frameworks?

Articles varied in the depth of information provided on how to operationalize the implementation of these frameworks, with the ARCC, Joint Commission, and IHI models emerging as the most comprehensive.

  • The ARCC model provided details on providing learning and training opportunities (ie, 6 educational workshops, 8 days of educational and skills-building sessions over 1 year), as well as on implementing an intervention to address a specific patient safety issue (ie, 12-month project focused on improving quality of care, safety, and/or patient outcomes).20
  • The Joint Commission3 and IHI18 provided high-level recommendations for operationalizing HRO implementation, including building and using tools to measure progress (ie, assess the current state of HRO maturity; develop tools to advance maturity), as well as specific examples of activities that could advance these strategies.

Other frameworks provided some guidance on how to operationalize implementation, although they were less comprehensive.

  • ACHE described 2 levels of implementation practices: foundational practices, which focus on laying the groundwork for HRO implementation, and sustaining practices, which focus on spreading and embedding HRO concepts, specifically a culture of safety.16
  • The JH Operating Management System suggested approaches to implementing the core concepts of the model, including developing and using data systems (ie, providing leaders with a standardized reporting format to assist in reporting on department progress), using systems engineering methodology, and convening stakeholder groups.17
  • The JH Safety and Quality initiative provided recommendations based on the role of a specific health care provider or staff member. For example, they have specific suggestions on training and learning opportunities (ie, provide front line providers and staff with basic medical school education on safety and quality; provide managers with patient safety certificate programs and workshops on Lean Six Sigma and other change management processes).15
  • The Air Force’s suggestions for operationalization include standardizing and stabilizing processes, engaging staff in behaviors to continuously improve these processes, mentoring staff, and leadership goal-setting, as well as a description of the desired future state of HRO integration into practice.19
  • The high reliability team framework described specific approaches that touch on several implementation strategies, including learning and training opportunities (ie, simulation training and provision of a structured HRO curriculum) and supporting a culture of safety (ie, development of a just culture system in which responses to patient harm are based on staff behavioral choices rather than on outcomes).21

KEY QUESTION 2. What are the metrics for measuring a health system’s progress towards becoming an HRO?

We identified 8 articles3,16,22–27 on 6 tools for measuring progress toward becoming an HRO (Table 3). The Joint Commission’s HRHCM/Oro™ 2.0 emerged as the most rigorously developed, validated, and applicable tool for VA settings. However, other tools such as the ACHE’s Culture of Safety Organizational Self-Assessment Tool16 may be useful in developing specific items missing from the Oro™ 2.0 framework, such as teamwork culture and system-focused tools for learning and improvement.27 Four additional tools have unclear applicability to the VA, as they were developed in countries outside the US,22–25 did not report measurement items,23,24 or required qualitative expertise to analyze results.22 Full details on these studies appear in Supplementary Materials, Appendix C, and selected findings appear below.

Oro™ 2.0

The tool that most comprehensively addressed all 5 HRO implementation strategies identified in KQ1 was the HRHCM/Oro™ 2.0.3,34 As discussed in KQ1, the HRHCM is the Joint Commission’s framework for implementing HRO principles. This framework includes 4 levels of maturity (beginning, developing, advancing, approaching) for each of its 14 components (56 total) to guide health care leaders in assessing their systems’ progress on becoming an HRO. The Oro™ 2.0 is a web-based application that uses branching logic to guide health care leaders through the HRHCM assessment and produces a visual report that synthesizes data from multiple respondents within a single hospital.34 Of note, the Oro™ 2.0 was designed for use at the individual hospital, rather than system, level and is only available to Joint Commission-accredited organizations. The tool outputs data into reports that could theoretically be shared between hospitals, although this is not an automatic feature.

To develop the metrics used by the HRHCM/Oro™ 2.0, a team at the Joint Commission spent over 2 years engaging with high reliability experts from academia and industry, leading safety scholars outside of health care, and the published literature.3 Iterative testing with hospital leaders – first among 5 individuals in executive leadership positions, then among leadership teams from 7 US hospitals – was conducted to finalize the framework and its included metrics. The resultant tool has since been validated in peer-reviewed research studies, including 1 study that tested the content validity of the tool at 6 VA sites.27 Another study tested its internal reliability and discriminative ability in detecting different levels of HRO maturity in 46 hospitals from the Children’s Hospitals’ Solutions for Patient Safety network.26

The VA study was a secondary analysis of qualitative data from 138 VA employees with patient safety expertise at various levels of leadership (eg, patient safety managers, executive leadership and service chiefs, infection control nurses) from 6 VA sites. The original study validated the AHRQ-developed patient safety indicator tool; the secondary analysis looked at how well responses mapped onto the Joint Commission’s HRHCM model. Researchers found that 12 of the 14 HRHCM components were represented, indicating good content validity. Two additional HRO components were identified through interviews that were not represented in the HRHCM model: teamwork culture and systems-focused tools for learning and improvement. While less applicable to the VA, the study that tested the HRHCM in 46 children’s hospitals found that the HRHCM had good internal reliability (Cronbach’s alpha = 0.72 to 0.87, depending on the domain), good discriminative ability (ie, health system average scores on beginning, developing, advancing, and approaching levels of maturity resembled a bell curve), and was responsive to change (ie, safety culture decreased after major organizational changes), indicating it may perform well at detecting progress on becoming an HRO.

Table 3. Metrics for measuring progress on becoming an HRO.

ACHE Culture of Safety Organizational Self-Assessment Tool

While less comprehensive, less rigorously developed, and less extensively evaluated than the HRHCM/Oro™ 2.0, the ACHE’s Culture of Safety Organizational Self-Assessment Tool is an additional metric for evaluating progress on becoming an HRO. It incorporates additional perspectives (ie, patients, families) and specific items (eg, teamwork culture) that may be informative to the VA.

The ACHE tool addresses 3 (leadership, culture of safety, and data systems) of the 5 key HRO implementation strategies. It consists of 18 items concerning an organization’s capabilities and processes scored on a 5-point Likert scale. Lower (worse) scores prompt a review of foundational tactics towards becoming an HRO, moderate scores prompt a review of both foundational and sustaining tactics, and higher (better) scores prompt a review of sustaining tactics.16

The ACHE tool was developed through a partnership with the IHI/NPSF LLI and others, as described in KQ1.16 The tool has not undergone any formal validation process. While limited in the number of strategies covered and the extent of validity testing, the ACHE tool offers 2 features not covered by the HRHCM/Oro™ 2.0. First, it specifically seeks perspectives beyond leadership, including those of providers and staff as well as patients and families. Of note, however, patients and families may have difficulty completing many of the ACHE tool items, such as the amount of time board members spend discussing patient safety issues in meetings and the extent to which leadership performance assessments and incentives are aligned with patient safety metrics. Second, the ACHE tool includes items related to teamwork and systems, such as: “My organization uses and regularly reviews a formal training program and defined processes for teamwork and communication.”

Other tools

We identified 4 additional tools that covered 2 or fewer of the 5 HRO implementation strategies. They have more limited applicability to the VA due to their narrower focus, lack of reporting on the specific tool items, and/or development outside the US.

The Cultural Assessment Survey (CAS) is a metric for measuring patient safety culture that was designed specifically for use in obstetric units in Canada.23 The CAS had a rigorous development process, including a literature review to develop a list of over 100 values and practices that support a culture of safety, a short list of prioritized values and practices developed after sending 300 surveys to employees at 8 hospitals, a pilot test of the short list at 10 hospitals, and testing of its internal reliability and content validity. However, the article did not include a copy of the tool or the items it contains. The narrow focus on obstetric units also limits the applicability of the tool to the VA’s broad HRO implementation.

The University of Tehran developed 2 metrics. The first is a 55-item survey assessing a health care system’s readiness for HRO implementation; it was developed through a literature review and pilot-testing among 98 senior or middle managers from 15 hospitals.24 The second is a 24-item survey and checklist that assesses knowledge of HRO concepts and integration of HRO principles into practice; it was developed through interviews with managers and staff at 80 medical and nonmedical departments.25 These metrics are notable for being the only ones specifically designed around the 5 HRO principles described by Hines et al 2008.4 However, both metrics were limited in the extent to which they covered HRO implementation strategies – one assessed 2 of the 5 strategies,25 and the other’s coverage is unclear, as it did not report any specific examples of its metric items.24 Both metrics were evaluated for content validity and performed well. However, their applicability to the VA is unclear, as they were developed for a specific health care system in Tehran, Iran.

One additional metric, developed by the Delft University of Technology in the Netherlands, offers a qualitative framework for assessing level of reliability.22 This framework resembles the HRHCM/Oro™ 2.0 in that it has 4 stages of maturity: craft, watchful professional, collective professionalism, and high reliability. It was developed through a literature review to identify the common domains essential to high reliability hospitals and did not undergo any validity testing. This metric also has unclear applicability to the VA, due to significant differences between the US and Dutch health care systems. Delivering the framework in its current state at the VA would also be challenging, as it consists of open-ended items intended to promote thinking about the overall strengths and limitations of a health care system, rather than specific questions to which a provider or health care leader could concretely respond (eg, under organizational culture, a less reliable hospital would be characterized by “learning by doing” while a more reliable hospital would have “a preoccupation with possible failure”).

KEY QUESTION 3. What is the evidence on HRO implementation effects?

We identified articles from 7 health care organizations, primarily children’s hospitals, on the effects of HRO initiative implementation on safety culture, HRO process, and patient safety measures.6,17,28–32 Full details on these articles are available in Supplementary Materials, Appendix C, and selected findings appear below.

The most notable finding is that organizations experienced significant reductions in serious safety events (SSEs) (range, 55% to 100%) following implementation of the 4 most comprehensive, multicomponent HRO initiatives.6,29–31 Moreover, time since initiation and safety improvements appear to have a dose-response relationship, and the improvements were maintained for upwards of 9 years (Table 4).6,29–31 Of note, only one of these studies explicitly discussed using one of the frameworks described in KQ1 (ie, the IHI framework).6 Two years after implementation, SSE reductions were 55% and 83%, respectively, in hospitals with baseline 12-month averages of 0.9 (Ohio Children’s Hospital Association)30 and 1.15 (Nationwide Children’s Hospital)6 SSEs per 10,000 adjusted patient days. At 4 years, Cincinnati Children’s Hospital Medical Center reported a 67% reduction in SSE rates from a baseline 12-month average of 0.9 events per 10,000 adjusted patient days.31 After 9 years, Genesis Health System reported achieving its goal of zero SSEs (100% reduction).29 In these studies, SSEs were typically defined as “the most serious harm events that occur in hospitals and are defined by serious patient harm events that directly results from a deviation in best practice or standard of care.”30 Improvements in safety culture were also reported, including improvement in safety attitudes6 and an increase in safety success story reporting,29 but changes across other safety culture dimensions were mixed.31 At Cincinnati Children’s Hospital Medical Center,31 responses to the AHRQ Hospital Survey on Patient Safety Culture indicated improvements in organizational learning and continuous improvement, feedback and communication about error, and staffing. However, they reported no change in supervisor/manager expectations and actions promoting safety, teamwork within hospital units, or nonpunitive responses to error, and a decline in communication openness.

A commonality across the 4 hospitals that reported SSE reductions is that they implemented their HRO initiatives with the help of the same external consultant, Healthcare Performance Improvement (HPI), LLC.6,29–31 Although the components varied somewhat across these 4 hospitals, they generally aligned with the 5 strategies discussed in KQ1: (1) developing leadership (eg, leadership training); (2) supporting a culture of safety (eg, increased communication through safety huddles; routine sharing of good catches and lessons learned); (3) providing training and learning opportunities for providers and staff (eg, error prevention training for staff; peer safety coaches who coached providers in the use of error prevention techniques); (4) building and using data systems to track progress (eg, enhanced root cause analysis processes using an electronic tracking system); and (5) implementing interventions to address specific patient safety issues (eg, embedding “time outs” and “debriefs” into standard surgical processes, using standardized checklists). Despite these similarities, the initiatives conceptualized their goals of zero patient harm in different ways: one initiative’s board encouraged management to “aspire to eliminate preventable harm” by reducing the preventable harm index to zero6; one aimed to reduce SSEs to zero29; and 2 others aimed to reduce SSEs by 75%–80%.30,31 In addition, the Ohio Children’s Hospital Association was unique in its structure: it is a state-wide collaboration of 8 tertiary pediatric referral centers that specifically refuse to compete on matters related to patient safety.30 To promote transparent sharing of critical safety data within the collaborative and facilitate lessons learned without fear of undue liability, Ohio House Bill 153 was passed in 2010 to provide a legal framework expressly providing peer review protection for the 8 participating hospitals.

In addition to the 4 HPI-assisted initiatives, we also identified a similarly comprehensive initiative independently implemented by JH Hospital and Health System: the Operating Management System.17 Although the study did not report on SSEs, the authors reported improved compliance in Joint Commission process measures and a 79% reduction in potential preventable harms.

Finally, we found that process improvements are possible even with less intensive HRO initiatives that are more focused in scope.28,32 When the Riley Hospital for Children at Indiana University Health implemented a Daily Safety Brief, they found improvement in communication, awareness, and working relationships, but not in comfort with sharing errors.32 The Children’s National Medical Center experienced an increase in Apparent Cause Analysis (ACA) reliability scores following implementation of 13 interventions across education, process, and culture categories. They also reported increased efficiency (4 fewer days to complete an ACA) and increased satisfaction with the process.28

While the results of these studies are promising, the overall strength of this evidence is low. Each initiative was evaluated in only a single study (consistency unknown), and each study was of fair quality (common methodological weaknesses included lack of reporting on implementation fidelity and no concurrent control groups), with generally indirect outcomes and populations (few reported whether they met their goal of zero harm; none were conducted in Veterans). The main strengths of these studies were that they generally provided sufficient detail on how the intervention was conceptually linked to HRO, their main intervention components, and how they evaluated effects. Their main limitation was that a cause-effect relationship could not be established between these HRO initiatives and outcomes, because no study used a concurrent control group that would have ruled out the possibility that the effect was due to concurrent interventions (eg, implementation of an Electronic Medical Record [EMR] or improved specialty-specific disease management).6

Table 4. Key findings from studies assessing effects of HRO implementation.

Summary and Discussion

To our knowledge, this is the first evidence review to systematically evaluate primary research on the effects of HRO implementation in health care settings. Furthermore, although much has been written about the concepts of HRO and individual health care systems’ experience with HRO implementation, few have looked across different systems to describe similarities and differences in frameworks and metrics, and what lessons might be learned based on the successes and challenges encountered using different approaches. Gaining a better sense of how HRO has been successfully delivered is critical to informing the work of the VA and other health systems as each embarks on its HRO journey.

Although a variety of frameworks for implementation of HRO principles are available, the Joint Commission’s HRHCM and the IHI’s Framework for Safe, Reliable, and Effective Care stand out as being the most comprehensive, applicable, and sufficiently descriptive to be used by the VA. Both of these frameworks cover 5 common HRO implementation strategies seen across frameworks: (1) developing leadership, (2) supporting a culture of safety, (3) building and using data systems to track progress, (4) providing training and learning opportunities for providers and staff, and (5) implementing interventions to address specific patient safety issues. Complementary practices to strengthen implementation seen across these frameworks include the need to incorporate an awareness of justice, equity, and patient-centeredness into all elements of HRO implementation; the importance of involving a variety of stakeholders in health care delivery, including patients and families; and the value of integrating change management strategies into HRO delivery. The selection of one of these frameworks – or development of a new framework – should be informed by the staff being targeted for HRO implementation (eg, all providers and staff, only leadership, only nursing professionals); the approach desired (eg, developing a high-level operations management system vs training staff and providers on HRO principles and practices); and the capacity of the system to implement certain components of the HRO framework (eg, a system that does not have strong leaders in evidence-based medicine may not want to implement the ARCC model).

Of the metrics available to evaluate a health system’s progress towards becoming an HRO, the Joint Commission’s HRHCM/Oro™ 2.0 is the most comprehensive, rigorously developed, and applicable to the VA HRO Initiative, given that its content validity has been evaluated at 6 VA hospitals.27 This tool was not designed to facilitate sharing data across hospitals; however, it outputs data into reports that could be shared. Of note, findings from the VA validation study27 indicate that certain concepts (teamwork culture and system-focused tools) are missing from the HRHCM framework and should be added. An example from the ACHE tool that might address these concepts is: “My organization uses and regularly reviews a formal training program and defined processes for teamwork and communication.”16 The VA HRO Initiative may consider adding these or similar concepts to the current tool being used to assess VA sites’ progress on becoming HROs. Additionally, other tools published prior to 2010 may be appropriate for capturing process outcomes on the pathway between the 5 HRO concepts and the end goal of improved safety outcomes, such as the Safety Attitudes Questionnaire35 and the Safety Organizing Scale.36

Multicomponent HRO interventions that incorporate some of the 5 common HRO implementation strategies identified in KQ1 and that are delivered for at least 2 years are associated with improved process outcomes (eg, staff reporting of safety culture) and patient safety outcomes (eg, SSEs). However, the overall strength of evidence is low, as each HRO intervention was evaluated in only a single fair-quality study. Facilitators of successful implementation may include hiring an outside consultant (eg, HPI) to assist in the implementation, enacting policies to facilitate data sharing (eg, passage of a state house bill to enable a collaborative of children’s hospitals to share critical safety data30), and leadership commitment to implementing HRO principles. Barriers to implementation may include competing priorities, such as widescale implementation of an EMR system,30 and costs (eg, one system increased quality improvement staff from 8 to 33, with a budget increase of over $2 million6).

Limitations

Primary study limitations

HRO interventions and other complex interventions are inherently difficult to study, because they involve many components implemented by many different people across multiple time points. Each hospital may also choose to implement different components of an HRO intervention, depending on its individual needs and context. As a result, isolating the specific components of an HRO intervention that cause a specific effect on process and patient safety outcomes is difficult.37 Furthermore, without a control group, we cannot conclude that the HRO intervention, rather than another concurrent intervention or a secular trend, caused the observed change. One study commented that other simultaneously implemented interventions, including EMR implementation and improved specialty-specific disease management, may have contributed to improved outcomes.6 EMR implementation is likely to be a confounder across multiple studies; it could improve patient safety by making it easier to find and use patient health information and to collaborate with colleagues in other departments, and by building checklists and other automated processes into patient appointments. Other plausible confounders include the use of other change management strategies, such as Lean Six Sigma, before or during HRO implementation. Therefore, while promising, evidence of improved outcomes after HRO implementation should be interpreted cautiously.

Many studies commented that HRO was delivered among high-performing hospitals; whether lower-performing hospitals would achieve the same outcomes is unclear. In addition, few studies commented on implementation fidelity or compliance, such as whether providers attended the required number of trainings or continually maintained safety event reporting systems. Therefore, we cannot determine whether health care staff remained invested in HRO implementation over time. Studies that reported some compliance measures noted that staff response rates to culture surveys increased over time and reported the number (but not the percentage) of providers who completed trainings. Only 1 study described a potential unintended consequence of HRO implementation (ie, a decrease in ACA turnaround time).28 Study authors hypothesized that this increased efficiency was due to the availability of a standardized toolkit, clear rubrics to follow, and additional resources that facilitated completion of the process. The effect of HRO implementation on provider and staff workload and efficiency is an important question that should be the subject of future research.

Rapid Review Limitations

First, searching from 2010 forward means that we did not include earlier publications on HRO framework design and implementation. However, our search strategy and consultation with topic experts likely identified the most recent and relevant articles incorporating AHRQ’s conceptualization of the 5 HRO principles in health care settings. Second, our use of a single investigator to review articles, with checking by a second reviewer, may have resulted in missing eligible studies; however, we used objective criteria to minimize potential differences between investigators. Finally, our quality assessment checklist for complex interventions was not designed to comprehensively assess all sources of bias, but rather to ascertain whether study authors reported enough information for the intervention and evaluation to be reproduced and to highlight common issues in reporting and methodology seen across studies. Therefore, while it may not have captured all sources of bias in these studies, use of another, more formal tool would likely not have changed our conclusions.

Gaps and Future Research

The biggest gaps in knowledge on HRO implementation are (1) whether the improvements in process and safety outcomes are truly caused by HRO interventions or are due to concurrent interventions or secular trends; (2) if HRO does indeed lead to improved outcomes, which components of HRO interventions cause the effects; (3) whether certain implementation frameworks lead to better outcomes; and (4) what contextual factors (such as barriers and facilitators) affect successful HRO implementation. Randomized controlled trials are not a practical option for evaluating HRO interventions because of the complexity of both the intervention and its delivery; other study designs, such as quasi-experiments or natural experiments, should be utilized instead. The VA HRO initiative is in a unique position to conduct these types of studies. Implementing HRO principles at a select number of VA sites while other sites serve as a “wait-list” control would create a natural experiment to test whether HRO implementation leads to improved outcomes. If this approach is taken, consideration should be given to the extent to which wait-list control sites have already begun implementing HRO concepts on their own or are implementing similar initiatives such as Lean Six Sigma. In addition, the wide-scale implementation of HRO across different sites likely means that each site will deliver a slightly different intervention based on its individual context. Careful recording of the intervention components, when they were delivered, where they were delivered (eg, medical or surgical service areas), and whether they continued to be delivered may help to elucidate the effects of individual intervention components on outcomes. This can inform where to invest future resources and how to tailor HRO delivery to specific contexts.

In addition, we were unable to determine the mechanism of change between HRO implementation and improvement in outcomes. While HRO delivery is theorized to change how staff think about patient safety, resulting in improved processes and outcomes, this was not empirically examined in any of our included studies. Instead, some studies suggested that the impact of HRO on other process measures, such as safety culture, is mixed,31 indicating that the mechanism of action driving changes in outcomes is more complex. Future studies should evaluate the mechanism of change, such as improved mindfulness or safety culture, to help answer both how and why HRO implementation may lead to improved patient safety outcomes. Future studies may also want to consider the extent to which HRO implementation overlaps (or does not) with system redesign strategies, as these are complementary approaches to improving quality of care.

Conclusions

A variety of frameworks and evaluation tools are available for HRO implementation and evaluation, with the Joint Commission’s High Reliability Health Care Maturity (HRHCM) model/ORO 2.0 among the most rigorously developed and validated. Multicomponent HRO interventions that include several of the 5 common implementation strategies and that are delivered for at least 2 years are associated with improved process outcomes, such as staff perceptions of safety culture, and with important patient safety outcomes, such as reduced serious safety events (SSEs). Future research should incorporate concurrent control groups through quasi-experimental designs to rule out the possibility that observed effects are due to other interventions or secular trends, and should focus on identifying whether certain frameworks, metrics, or components of interventions lead to greater improvements.

Acknowledgments

This topic was developed in response to a nomination by the VA National Center for Patient Safety for the purpose of informing the implementation of the VA’s High Reliability Organization Initiative. The scope was further developed with input from the topic nominators (ie, Operational Partners), the ESP Coordinating Center, and the technical expert panel.

In designing the study questions and methodology at the outset of this report, the ESP consulted several technical and content experts. Broad expertise and perspectives were sought. Divergent and conflicting opinions are common and perceived as healthy scientific discourse that results in a thoughtful, relevant systematic review. Therefore, in the end, study questions, design, methodologic approaches, and/or conclusions do not necessarily represent the views of individual technical and content experts.

The authors gratefully acknowledge Emilie Chen and Julia Haskin for editorial support, Scott Grey for his expertise on HRO research, and the following individuals for their contributions to this project:

Operational Partners

Operational partners are system-level stakeholders who have requested the report to inform decision-making. They recommend Technical Expert Panel participants; assure VA relevance; help develop and approve final project scope and timeframe for completion; provide feedback on draft report; and provide consultation on strategies for dissemination of the report to field and relevant groups.

  • William Gunnar, MD
    Executive Director
    National Center for Patient Safety
  • Amy Kilbourne, PhD, MPH
    Director
    Quality Enhancement Research Initiative

Technical Expert Panel

To ensure robust, scientifically relevant work, the TEP guides topic refinement; provides input on key questions and eligibility criteria, advising on substantive issues or possibly overlooked areas of research; assures VA relevance; and provides feedback on work in progress. TEP members are listed below:

  • Laura Damschroder, MPH, MS
    Center for Clinical Management Research
    Ann Arbor, MI

Key Informants

The ESP sought input from 2 Key Informants with diverse experiences and perspectives in implementing HRO interventions into large, integrated health care systems.

  • Glenda J. L. Battey, PhD
    Providence St Joseph Health
    Renton, WA
  • Robert G. Ritter, FACHE
    Harry S. Truman Memorial Veterans’ Hospital
    Columbia, MO

Peer Reviewers

The Coordinating Center sought input from external peer reviewers to review the draft report and provide feedback on the objectives, scope, methods used, perception of bias, and omitted evidence. Peer reviewers must disclose any relevant financial or non-financial conflicts of interest. Because of their unique clinical or content expertise, individuals with potential conflicts may be retained. The Coordinating Center and the ESP Center work to balance, manage, or mitigate any potential nonfinancial conflicts of interest identified.

References

1.
Committee on Quality of Health Care in America. To Err Is Human: Building a Safer Health System. Washington, DC: Institute of Medicine; 2000.
2.
Makary MA, Daniel M. Medical error-the third leading cause of death in the US. BMJ. 2016;353:i2139. [PubMed: 27143499]
3.
Chassin MR, Loeb JM. High-reliability health care: getting there from here. Milbank Q. 2013;91(3):459–490. [PMC free article: PMC3790522] [PubMed: 24028696]
4.
Hines S, Luna K, Lofthus J, Marquardt M, Stelmokas D. Becoming a high reliability organization: operational advice for hospital leaders. AHRQ Publication No. 08-0022. Rockville, MD: Agency for Healthcare Research and Quality; 2008.
5.
Weick KE, Sutcliffe KM. Managing the Unexpected: Resilient Performance in the Age of Uncertainty. 2nd ed. San Francisco, CA: Jossey-Bass; 2007.
6.
Brilli RJ, McClead RE, Jr., Crandall WV, et al. A comprehensive patient safety program can significantly reduce preventable harm, associated costs, and hospital mortality. J Pediatr. 2013;163(6):1638–1645. [PubMed: 23910978]
7.
Meyer D, Battey G, Mezaraups L, Severs L, Feeney S. High Reliability + Value Improvement = Learning Organization. Paper presented at: IHI National Forum on Quality Improvement in Health Care 2018; Orlando, FL.
8.
Department of Veterans Affairs. Memorandum: Veterans Integrated Service Networks (VISN) Plan for High Reliability and High Reliability Organization (HRO) Lead Facilities (VIEWS #00167710). February 11, 2019.
9.
Zero Harm: How to Achieve Patient and Workforce Safety in Healthcare. Press Ganey Associates, Inc.; 2018.
10.
Weick KE, Sutcliffe KM. Managing the Unexpected: Sustained Performance in a Complex World. Wiley; 2015.
11.
Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. BMJ Qual Saf. 2015;25:986–992. [PMC free article: PMC5256233] [PubMed: 26369893]
12.
Pinnock H, Barwick M, Carpenter CR, et al. Standards for reporting implementation studies (StaRI) statement. BMJ. 2017;356:i6795. [PMC free article: PMC5421438] [PubMed: 28264797]
13.
Hoffmann TC, Glasziou PP, Milne R, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687. [PubMed: 24609605]
14.
Berkman ND, Lohr KN, Ansari M, et al. Grading the Strength of a Body of Evidence When Assessing Health Care Interventions for the Effective Health Care Program of the Agency for Healthcare Research and Quality: An Update Methods Guide for Effectiveness and Comparative Effectiveness Reviews. Rockville MD 2013.
15.
Aboumatar HJ, Weaver SJ, Rees D, Rosen MA, Sawyer MD, Pronovost PJ. Towards high-reliability organising in healthcare: a strategy for building organisational capacity. BMJ Qual Saf. 2017;26(8):663–670. [PubMed: 28546510]
16.
American College of Healthcare Executives. Leading a Culture of Safety: A Blueprint for Success. 2017.
17.
Day RM, Demski RJ, Pronovost PJ, et al. Operating management system for high reliability: Leadership, accountability, learning and innovation in healthcare. J Patient Saf Risk Manag. 2018;23(4):155–166.
18.
Frankel A, Haraden C, Federico F, Lenoci-Edwards J. A Framework for Safe, Reliable, and Effective Care. White Paper. Cambridge, MA: Institute for Healthcare Improvement and Safe & Reliable Healthcare; 2017.
19.
Office of the Air Force Surgeon General. Trusted Care Concept of Operations (CONOPS). 2015.
20.
Melnyk BM. Achieving a high-reliability organization through implementation of the ARCC model for systemwide sustainability of evidence-based practice. Nurs Adm Q. 2012;36(2):127–135. [PubMed: 22407205]
21.
Riley W, Davis SE, Miller KK, McCullough M. A model for developing high-reliability teams. J Nurs Manag. 2010;18:556–563. [PubMed: 20636504]
22.
Ikkersheim DE, Berg M. How reliable is your hospital? A qualitative framework for analysing reliability levels. BMJ Qual Saf. 2011;20(9):785–790. [PubMed: 21441603]
23.
Kenneth MJ, Bendaly N, Bendaly L, Worsley J, FitzGerald J, Nisker J. A measurement tool to assess culture change regarding patient safety in hospital obstetrical units. J Obstet Gynaecol Can. 2010;32(6):590–597. [PubMed: 20569541]
24.
Mousavi SM, Dargahi H, Mohammadi S. A study of the readiness of hospitals for implementation of high reliability organizations model in Tehran University of Medical Sciences. Acta Med Iran. 2016;54(10):667–677. [PubMed: 27888596]
25.
Mousavi SMH, Jabbarvand Behrouz M, Zerati H, et al. Assessment of high reliability organizations model in Farabi Eye Hospital, Tehran, Iran. Iran J Public Health. 2018;47(1):77–85. [PMC free article: PMC5756604] [PubMed: 29318121]
26.
Randall KH, Slovensky D, Weech-Maldonado R, Patrician PA, Sharek PJ. Self-reported adherence to high reliability practices among participants in the children’s hospitals’ solutions for patient safety collaborative. Jt Comm J Qual Patient Saf. 2019;45(3):164–169. [PubMed: 30471989]
27.
Sullivan JL, Rivard PE, Shin MH, Rosen AK. Applying the high reliability health care maturity model to assess hospital performance: a VA case study. Jt Comm J Qual Patient Saf. 2016;42(9):389–411. [PubMed: 27535456]
28.
Crandall KM, Sten MB, Almuhanna A, Fahey L, Shah RK. Improving apparent cause analysis reliability: a quality improvement initiative. Pediatr Qual Saf. 2017;2(3):e025. [PMC free article: PMC6132456] [PubMed: 30229162]
29.
Cropper DP, Harb NH, Said PA, Lemke JH, Shammas NW. Implementation of a patient safety program at a tertiary health system: A longitudinal analysis of interventions and serious safety events. J Healthc Risk Manag. 2018;37(4):17–24. [PubMed: 29604147]
30.
Lyren A, Brilli R, Bird M, Lashutka N, Muething S. Ohio children’s hospitals’ solutions for patient safety: a framework for pediatric patient safety improvement. J Healthc Qual. 2016;38(4):213–222. [PubMed: 26042749]
31.
Muething SE, Goudie A, Schoettker PJ, et al. Quality improvement initiative to reduce serious safety events and improve patient safety culture. Pediatrics. 2012;130(2):e423–e431. [PMC free article: PMC3408689] [PubMed: 22802607]
32.
Saysana M, McCaskey M, Cox E, Thompson R, Tuttle LK, Haut PR. A step toward high reliability: implementation of a daily safety brief in a children’s hospital. J Patient Saf. 2017;13(3):149–152. [PubMed: 25119785]
33.
Joint Commission Center for Transforming Healthcare. High Reliability Training. 2019; https://www.centerfortransforminghealthcare.org/what-we-offer/high-reliability-training#b419bfd8cf4544d6b1c3b3384a2426b8_960713b6fb6347fa9babdbca288edb04. Accessed May 15, 2019.
34.
Joint Commission Center for Transforming Healthcare. Oro® 2.0. 2019; https://www.centerfortransforminghealthcare.org/what-we-offer/oro-2. Accessed April 5, 2019.
35.
Sexton JB, Helmreich RL, Neilands TB, et al. The Safety Attitudes Questionnaire: psychometric properties, benchmarking data, and emerging research. BMC Health Serv Res. 2006;6:44. [PMC free article: PMC1481614] [PubMed: 16584553]
36.
Vogus TJ, Sutcliffe KM. The Safety Organizing Scale: development and validation of a behavioral measure of safety culture in hospital nursing units. Med Care. 2007;45(1):46–54. [PubMed: 17279020]
37.
Guise JM, Chang C, Viswanathan M, et al. Agency for Healthcare Research and Quality Evidence-based Practice Center methods for systematically reviewing complex multicomponent health care interventions. J Clin Epidemiol. 2014;67(11):1181–1191. [PubMed: 25438663]

Supplemental Materials

APPENDIX A. Search Strategies

1. Search for current systematic reviews

Date Searched: 1/31/19

Sources and search strategies:

  • AHRQ. Search: High-reliability
  • CADTH. Search: High-reliability
  • NICE (NHS Evidence). Search: “High-reliability”
  • VA Products (VATAP, PBM, HSR&D publications, VA ART Database). Search: High-reliability
  • Cochrane Database of Systematic Reviews. Database: EBM Reviews - Cochrane Database of Systematic Reviews <2005 to January 30, 2019>. Search strategy: (High-reliability organization* or High-reliability practice* or High-reliability principle* or High-reliability healthcare or High-reliability health care).mp. (0 results)
  • BlueCross BlueShield Foundation Massachusetts. Search: High-reliability
  • Campbell Collaboration. Search: High-reliability
  • CMS Policies. Search: High-reliability
  • Hayes. Search: High-reliability
  • Institute for Clinical Evaluative Sciences. Search: High-reliability
  • The National Academies of Science (formerly IOM). Search: High-reliability
  • McMaster Health Systems Evidence. Search: High-reliability
  • Robert Wood Johnson. Search: High-reliability
  • UBC Centre for Health Services and Policy Research. Search: High-reliability
  • WHO Health Evidence Network. Search: High-reliability

2. Systematic reviews currently under development (forthcoming reviews & protocols)

Date Searched: 1/31/19

Sources and search strategies:

  • PROSPERO (SR registry). Search: High-reliability
  • DoPHER (SR protocols). Search: High-reliability

3. Current primary literature

Date Searched: 1/31/19

Sources and search strategies:

  • MEDLINE. Database: Ovid MEDLINE(R) and Epub Ahead of Print, In-Process & Other Non-Indexed Citations, Daily and Versions(R) <1946 to January 28, 2019>. Search strategy: (High-reliability organization* or High-reliability practice* or High-reliability principle* or High-reliability healthcare or High-reliability health care).mp. (211 results)
  • CINAHL. Database: CINAHL Plus with Full Text. Search strategy: 1. TX (High-reliability organization* or High-reliability practice* or High-reliability principle* or High-reliability healthcare or High-reliability health care) (370 results); 2. Limit Source Type: Academic Journals (217 results)
  • PsycINFO. Database: PsycINFO <1806 to January Week 3 2019>. Search strategy: (High-reliability organization* or High-reliability practice* or High-reliability principle* or High-reliability healthcare or High-reliability health care).mp. (175 results)
  • CCRCT. Database: EBM Reviews - Cochrane Central Register of Controlled Trials <December 2018>. Search strategy: (High-reliability organization* or High-reliability practice* or High-reliability principle* or High-reliability healthcare or High-reliability health care).mp. (1 result)
  • PubMed. Search: “High-reliability organization*”[All Fields] or “High-reliability practice*”[All Fields] or “High-reliability principle*”[All Fields] or “High-reliability healthcare”[All Fields] or “High-reliability health care”[All Fields]

APPENDIX B. List of Excluded Studies

Exclude reasons: 1=Ineligible population, 2=Ineligible intervention, 3=Ineligible comparator, 4=Ineligible outcome, 5=Ineligible timing, 6=Ineligible study design, 7=Ineligible publication type, 8=Outdated or ineligible systematic review

Citations (exclude reason in parentheses):

1. Brass SD, Olney G, Glimp R, Lemaire A, Kingston M. Using the Patient Safety Huddle as a Tool for High Reliability. Joint Commission journal on quality and patient safety. 2018;44(4):219-226. (E4)
2. Carrico R. The Joint Commission aims for high-reliability health care, unveils framework to move hospitals toward zero harm. ED management: the monthly update on emergency department management. 2013;25(12):suppl 3-4, 139. (E7)
3. Clements K. High-reliability and the I-PASS communication tool. Nursing management. 2017;48(3):12-13. (E4)
4. Davenport PB, Carter KF, Echternach JM, Tuck CR. Integrating High-Reliability Principles to Transform Access and Throughput by Creating a Centralized Operations Center. Journal of Nursing Administration. 2018;48(2):93-99. (E2)
5. Deloitte. Transforming into a high reliability organization in health care. 2017. (E2)
6. Eriksson N. Followership for organizational resilience in health care. In: The resilience framework: Organizing for sustained viability. New York, NY: Springer Science + Business Media; 2018:163-179. (E10)
7. Fieldston E, Tsarouhas N. CT hospital slashes door-to-balloon times to reduce patient harm. ED Management. 2014;26(7):80-83. (E7)
8. Gabriel PE, Bergendahl HW, Burke SV, Solberg TD, Maity A, Hahn SM. Incident Learning in Pursuit of High Reliability: Implementing a Comprehensive, Low-Threshold Reporting Program in a Large, Multisite Radiation Oncology Department. Joint Commission Journal on Quality & Patient Safety. 2015;41(4):160-168. (E5)
9. Griffith JR. Understanding High-Reliability Organizations: Are Baldrige Recipients Models? Journal of Healthcare Management. 2015;60(1):44-61. (E2)
10. Hales DN, Chakravorty SS. Creating high reliability organizations using mindfulness. Journal of Business Research. 2016;69(8):2873-2881. (E5)
11. Hendrich A, Haydar Z. Building a High-Reliability Organization: One System’s Patient Safety Journey. Journal of healthcare management / American College of Healthcare Executives. 2017;62(1):13-17. (E2)
12. Hershey K. Culture of safety. Nurs Clin North Am. 2015;50(1):139-152. (E7)
13. Jones WS. Military Graduate Medical Education: Training the Military Health System into a High-Reliability Organization. Military medicine. 2015;180(11):1121-1123. (E7)
14. Knox GE, Simpson KR. Perinatal high reliability. Am J Obstet Gynecol. 2011;204(5):373-377. (E7)
15. Lyren A, Brilli RJ, Zieker K, Marino M, Muething S, Sharek PJ. Children’s Hospitals’ Solutions for Patient Safety Collaborative Impact on Hospital-Acquired Harm. Pediatrics. 2017;140(3). (E2)
16. Magnano P, Platania S, Ramaci T, Santisi G, Di Nuovo S. Validation of the Italian version of the Mindfulness Organizing Scale (MOS) in organizational contexts. TPM-Testing, Psychometrics, Methodology in Applied Psychology. 2017;24(1):45-64. (E5)
17. May EL. The power of zero: steps toward high reliability healthcare. South Carolina Safe Care Commitment. Healthcare executive. 2013;28(2):26. (E7)
18. McCraw B, Crutcher T, Polancich S, Jones P. Preventing Central Line-Associated Bloodstream Infections in the Intensive Care Unit: Application of High-Reliability Principles. Journal for Healthcare Quality. 2018;40(6):392-397. (E2)
19. McFarland DM, Doucette JN. Impact of High-Reliability Education on Adverse Event Reporting by Registered Nurses. Journal of nursing care quality. 2018;33(3):285-290. (E5)
20. Middleton LP, Phipps R, Routbort M, et al. Fifteen-Year Journey to High Reliability in Pathology and Laboratory Medicine. American Journal of Medical Quality. 2018;33(5):530-539. (E2)
21. Mossburg SE, Weaver SJ, Pillari M, Daugherty Biddison E. Manifestations of High-Reliability Principles on Hospital Units With Varying Safety Profiles: A Qualitative Analysis. Journal of nursing care quality. 2018;21:21. (E4)
22. Oster CA, Deakins S. Practical Application of High-Reliability Principles in Healthcare to Optimize Quality and Safety Outcomes. Journal of Nursing Administration. 2018;48(1):50-55. (E7)
23. Prasanna P, Nagy P. Learning from high-reliability organizations. J. 2011;8(10):725-726. (E7)
24. Pronovost PJ, Armstrong CM, Demski R, et al. Creating a high-reliability health care system: improving performance on core processes of care at Johns Hopkins Medicine. Academic medicine: journal of the Association of American Medical Colleges. 2015;90(2):165-172. (E2)
25. Provost SM, Lanham HJ, Leykum LK, McDaniel RR, Jr., Pugh J. Health care huddles: managing complexity to achieve high reliability. Health care management review. 2015;40(1):2-12. (E2)
26. Quigley PA, White SV. Hospital-based fall program measurement and improvement in high reliability organizations. Online journal of issues in nursing. 2013;18(2):5. (E2)
27. Roney L, Sumpio C, Beauvais AM, O’Shea ER. Describing clinical faculty experiences with patient safety and quality care in acute care settings: A mixed methods study. Nurse education today. 2017;49:45-50. (E2)
28. Saunders CL, Brennan JA. Achieving High Reliability with People, Processes, and Technology. Frontiers of health services management. 2017;33(4):16-25. (E2)
29. Shabot MM. New tools for high reliability healthcare. BMJ quality & safety. 2015;24(7):423-424. (E2)
30. Sitterding M. Overview and Summary: Creating a Culture of Safety: The Next Steps. Online journal of issues in nursing. 2011;16(3):1-1. (E7)
31. The Health Foundation. Evidence Scan: High reliability organisations. 2011. (E6)
32. Thomas AD, Pandit C, Krevat SA. Race Differences in Reported Harmful Patient Safety Events in Healthcare System High Reliability Organizations. Journal of patient safety. 2018. (E4)
33. Van Spall H, Kassam A, Tollefson TT. Near-misses are an opportunity to improve patient safety: adapting strategies of high reliability organizations to healthcare. Curr. 2015;23(4):292-296. (E7)
34. Vogus TJ, Singer SJ. Creating Highly Reliable Accountable Care Organizations. Medical Care Research & Review. 2016;73(6):660-672. (E7)
35. Wasden ML. High-Reliability Principles Must Be Tied to Value-Based Outcomes. Frontiers of health services management. 2017;33(4):26-32. (E7)
36. Wentlandt K, Degendorfer N, Clarke C, et al. The Physician Quality Improvement Initiative: Engaging Physicians in Quality Improvement, Patient Safety, Accountability and their Provision of High-Quality Patient Care. Healthcare quarterly (Toronto, Ont). 2016;18(4):36-41. (E7)
37. Woodhouse KD, Volz E, Maity A, et al. Journey Toward High Reliability: A Comprehensive Safety Program to Improve Quality of Care and Safety Culture in a Large, Multisite Radiation Oncology Department. Journal of oncology practice/American Society of Clinical Oncology. 2016;12(5):e603-612. (E2)

APPENDIX D. Peer Review Comments

Comment # | Reviewer # | Comment | Author Response

Are the objectives, scope, and methods for this review clearly described?

1 | 1 | Yes | None.
2 | 2 | Yes | None.
3 | 3 | Yes | None.
4 | 4 | Yes | None.
5 | 5 | Yes | None.
6 | 6 | Yes | None.
7 | 7 | Yes | None.

Is there any indication of bias in our synthesis of the evidence?

8 | 1 | No | None.
9 | 2 | No | None.
10 | 3 | No | None.
11 | 4 | No | None.
12 | 5 | No | None.
13 | 6 | No | None.
14 | 7 | No | None.

Are there any published or unpublished studies that we may have overlooked?

15 | 1 | No | None.
16 | 2 | No | None.
17 | 3 | No | None.
18 | 4 | No | None.
19 | 5 | Yes - Zero Harm: how to achieve patient and workforce safety in health care, 2019, Clapper, Merlino & Stockmeier (editors). Press Ganey Associates Inc. | We added a reference to this book in the “Background” section.
20 | 6 | No | None.
21 | 7 | No - I think that ESP was very thorough in their literature search and found all the relevant articles for this review. There is a book titled Managing the Unexpected: Sustained Performance in a Complex World, by Karl Weick & Kathleen Sutcliffe, 3rd edition, Wiley & Sons, New York, NY, that has a Mindful Organizing Scale (p. 43) that is noteworthy. This scale was originally published in 2007, so it fell outside of the scope of this review. It is one of the few such scales and may be worth mentioning in the review. | Added a description of the Safety Organizing Scale (as it is referred to by Weick & Sutcliffe in 2007) to the discussion.
Additional suggestions or comments can be provided below. If applicable, please indicate the page and line numbers from the draft report.
221This is an excellent and well-written report of a difficult topic (because of it’s “fuzzy” definitions) in a quick timeframe. I would say it’s quite responsive to our partners’ request for state of published knowledge on HRO. It will provide an excellent starting point to inform VA’s push toward more mature HROs throughout the system. My comments below are suggested in the spirit of further strengthening the report.Thank you.
231

1. The authors seem to rely on the AHRQ report on HRO as the “core” or “standard” definition for HRO. This is implied by the timeframe for review starting with 2008 (2 years after AHRQ’s 2008 report). If this is the case, then this should be stated at the beginning and reinforced throughout.

1a. E.g., L40, p1: needs a citation… AHRQ?

Added a sentence describing the Hines 2008 paper as a seminal white paper describing the adaption of HRO principles into healthcare settings.

We do not include citations in the executive summary.

2412. The authors need to more clearly differentiate the domains of HRO (as listed in the AHRQ report) versus the components (or strategies) for ‘implementation*. This language needs to be set forth early in the report. The KQs all relate to information about ‘implementation* (of the AHRQ-defined HRO framework with the 5 domains)…and measurementChanged terminology used in KQ1 from “implementation domains” to “implementation strategies.” The 5 components of the AHRQ HRO model are described as “principles.”
2512a. Starting L10, P2 and L8/P10 and Table 1 and elsewhere: Terminology around “Implementation frameworks” needs clarification. For example, referring to five “domains” across the implementation frameworks and five domains of AHRQ’s HRO. My suggestion is this: refer to implementation frameworks that are comprised of high-level strategies for implementing HRO. (you could cite Nilsen 2015, who would characterize these frameworks as “prescriptive” “which are frameworks that help guide implementations). The five strategies listed all have active verbs except the first one which should be reworded slightly to: “Developing leadership”

We used the Nilsen 2015 article to guide us in developing a table that defines the terminology we use throughout the report. This table appears in the “Methods” section and defines the terms: HRO principles, implementation strategies, implementation cross-cutting themes, and implementation activities.

In the findings section, we changed the terminology to indicate that “implementation frameworks” comprised of “implementation strategies” or just “strategies.” We also changed “leadership development” to “developing leadership.”

261

2b. L47, P2: use the term “strategies” instead of “components”

Nilsen P. Making sense of implementation theories, models and frameworks. Implementation science. 2015 Dec;10(1):53.

Changed “components” to “strategies.”
2713. P1/L51 - (and again later in the report) the authors cite lack of leadership commitment to “zero patient harm.” Is this how the goal is worded/conceptualized in the literature? There is much discussion about how singular focus on zero harm may cause unintended negative consequences. Some refer to this goal as “zero avoidable harm” - or link it to key cultural goals (e.g., just or safety culture). Can something be said about this, or is the literature (the 20 articles) silent on this important point?

Revised to say “leadership commitment to implement HRO principles” and framed as a facilitator rather than a barrier, as it is more often framed this way in the literature.

There is much variation in the literature on how ‘zero harm’ is characterized. For clarification, we added the MA’s definition of “zero harm”- reducing errors and ensuring that errors that do occur do not reach patients and cause harm- to the fifth paragraph in the introduction.

Yes, we agree that we should add something about this variation of ‘zero harm characterization and to illustrate this variation, we also added a sentence about how the 4 most comprehensive HRO initiatives defined their goals of zero harm to the “Findings” section.

2813a. What about leaders’ lack of “managerial patience” – i.e., are leaders lacking commitment to zero harm as an end goal altogether, or do focus at first and then lose interest? I ask this in context of the finding related to dose-response relationship with time. This linkage could be made more clear even in EXEC SUMMARY bullets by acknowledging the 2-year outcomes based on the articles, but that 2-year horizon may be limited by the lack literature; though there may have been good initial effects in focused areas, this timeframe may be too short for lasting, meaningful effects. Is 2 years realistic…are there indications that longer timeframe is needed to achieve more lasting effects - especially related to changes in culture?

Of the studies >2 years long, there continued to be improvements over time in patient safety outcomes (i.e., SSE rates continued to decrease) or improvements were maintained (i.e., SSE rates plateaued at a rate lower than baseline). We have added a sentence in “Findings” to indicate improvements were maintained.

There is no clear pattern in whether HRO interventions resulted in improvements in process outcomes (i.e., safety culture), including results >2 years after initiation of the intervention.

Reviewer 1, Comment 4 (L42/P2, and elsewhere): Authors refer to strategies working in “primarily children’s hospitals.” It’s not that these findings only work in children’s hospitals; rather, these findings come from studies done only in children’s hospitals (a potential limitation). It’s notable that a couple of different systems/networks of children’s hospitals (Nationwide and the CHSPS Network) have led the way with HRO; they are early adopters.
Response: Edited throughout to indicate our identified studies were primarily conducted in children’s hospitals, not that we only found improvements in children’s hospitals.
Reviewer 1, Comment 5 (L9/P3): Authors should acknowledge the impracticality of RCTs to test HRO because of its complexity and complex implementation. Highlight the need for pragmatic, quasi-experimental study designs with full, transparent reporting as a way to more feasibly build the knowledge base needed.
Response: Edited this section as well as “Future research needs” to speak to the impracticality of RCTs and how quasi-experimental designs with detailed reporting of intervention elements should be utilized instead.
Reviewer 1, Comment 6 (L37/P4): Build the history of HRO more clearly. It started within the nuclear and aviation industries, and then the AHRQ report is the seminal report introducing/defining HRO for health care, yes? Did AHRQ describe the same 5 domains as used in the nuclear and aviation industries?
Response: Added a sentence on the 2007 Weick and Sutcliffe book that defined the 5 principles of HROs. The Hines 2008 paper built on this by applying the principles to health care settings.
Reviewer 1, Comment 7 (L5/P5): Lists of “components” (should be strategies) seem to be described differently in different places. Be consistent.
Response: Changed this sentence to indicate these are common HRO intervention activities. Activities are the actual tasks that a health care organization would take to implement the more overarching implementation strategies.
Reviewer 1, Comment 8 (paragraph starting L13/P5): Suggest flipping the order of the Providence St. Joseph case with the VA to better segue into the next paragraph about the VA.
Response: Put the Providence St. Joseph Health example before the VA example.
Reviewer 1, Comment 9 (L10/P13): I’m not sure how KQ1E differs from the overall goal to identify frameworks to guide implementation of HRO. This paragraph muddles concepts: intervention, process, implementing. I think this can be clarified by providing more detailed descriptions of how to operationalize the 5 high-level strategies in the implementation frameworks. For example, educational workshops might be a way to “provide training and learning.”
Response: Revised this paragraph to make it clear we are talking about how to operationalize HRO implementation and linked the 5 implementation strategies to the specific implementation activities described by each model.
Reviewer 1, Comment 10: We have found that it’s impossible to use the Joint Commission’s Oro system for measurement, because participants are told not to share with anyone outside their organization, and the questions seem to shift. Is there any reference to this in the literature? Sullivan’s article seems to have the best open definitions/operationalization of their domains.
Response: This isn’t explicitly discussed in the literature, but the Oro 2.0 website discusses how it’s designed to be used at the individual hospital level rather than the health system level. We added a sentence and reference to this. In this section, we also discuss how the tool uses branching logic, which explains why the questions shift.
Reviewer 1, Comment 10a: Their “RPI” domain relies on a trademarked (proprietary?) program, I think.
Response: Noted.
Reviewer 1, Comment 10b: These are all limitations to using this system for measurement, though in its development and intent it is the best developed.
Response: Noted.
Reviewer 1, Comment 11: Love Table 2!
Response: Thank you.
Reviewer 1, Comment 12 (L60/P16): It would be clearer to refer to the AHRQ HRO report rather than Hines 2008; this is the first mention of Hines other than in the reference list.
Response: In response to an earlier comment, we added a description of Hines 2008 to the introduction, so this sentence now refers back to that description.
Reviewer 1 (Executive Summary): Comments about the bullets:
  • Add a bullet that identifies the AHRQ source as the “seminal” (or core or foundational) definition of HRO, which has 5 (fuzzily defined) domains.
Response: Added a sentence describing the 2008 AHRQ paper to the first paragraph of the ES.
  • Clarify that the current 1st bullet (L8+/P1) refers to “implementation” frameworks.
Response: In response to an earlier comment, we edited this to indicate there were 5 implementation strategies across frameworks.
  • Also, 5 are listed here, but later in the report, 8 were identified.
Response: Edited to indicate we identified 5 common HRO implementation strategies across 8 frameworks.
  • Oro 2.0 may be well defined/developed but may have an issue of not being openly/publicly available (see comment above).
Response: Edited the “Findings” section to indicate this tool is only available to Joint Commission-accredited organizations.
Reviewer 2: Great report!
Response: Thank you.
Reviewer 3: None.
Response: None.
Reviewer 4 (general comment): I thought this Evidence Brief was well written and describes my intuitive understanding of the current state of HRO frameworks, metrics, and effects. I thought the authors did a nice job of simplifying what can sometimes be very complicated concepts.

I’ve provided several questions and clarifying comments below.

Response: None.
Reviewer 4 (page ii): Table 3 title capitalization looks off.
Response: Fixed this.
Reviewer 4 (Executive Summary, page 1, lines 40-52): Might mention how HROs differ for health care (similar to the Background section).
Response: Added the healthcare-related definition of each HRO principle in parentheses and added a description of the unique challenges that threaten reliability in health care to this section.
Reviewer 4 (page 2, lines 3-5): May move “spreading implementation initiatives” to be the last thing mentioned in the sentence.
Response: Moved “spreading implementation initiatives” to the end of the sentence.
Reviewer 4 (page 2, lines 17-20): Although there were 5 common domains, it would be interesting to mention some of the other domains not reported as commonly.
Response: Added a list of additional complementary practices that emerged from the literature.
Reviewer 4 (page 2, line 21): Might say a little more about what “consensus process” means here.
Response: Added language to indicate this consensus process typically involved a group of health system leaders and experts in patient safety.
Reviewer 4 (page 2, line 32): How many does “multiple hospitals” refer to?
Response: Clarified the tool/framework was tested in 52 hospitals.
Reviewer 4 (page 2, lines 33-34): Might give an example of “the variation in concepts measured”; also, I think the phrase “types of measures” is missing from that sentence. I might also define what “levels of practice” refers to.
Response: Revised to indicate the range of concepts measured and removed “levels of” before “practice.”
Reviewer 4 (page 2, lines 55-58): It’s striking that there are so few barriers to implementation in the literature, given all we know about implementation and organizing for quality. This seems like a major limitation.

Response: The identification of barriers and facilitators to HRO implementation was not a key aim of our review. Therefore, we did not do a thorough search or analysis of these outcomes, but instead provide a few examples that were discussed in our included articles. More details on the barriers we found are available in the “Discussion” section.
Reviewer 4 (Introduction, page 4, lines 55-56): Add “in health care organizations” or “hospitals” after the phrase “Implementation of HRO initiatives… is an”
Response: Added “into healthcare settings.”
Reviewer 4 (page 5, lines 3-10): I do not see “provide training in systems redesign” (e.g., Lean Six Sigma, Kaizen events, HFMEA, etc.) or robust process improvement tools listed.
Response: Our findings did not indicate that systems redesign training was a key component of HRO implementation success. However, we agree it is important to discuss these change management strategies in this report, so we added a description of which frameworks recommend which strategies to the “Findings” section.
Reviewer 4 (page 5, lines 29-33): Could the Caring Reliably program assess whether it was the toolkit or the consulting which made the difference, or was it bundled?
Response: It was a bundled initiative.
Reviewer 4 (page 5, lines 36-40): Were the barriers reported in a particular type of service (e.g., focused on medical or surgical) or more general?
Response: These are more general barriers.
Reviewer 4 (page 7, lines 13-15): Might outline the 5 HRO principles again here.
Response: Defined the 5 HRO principles again here.
Reviewer 4 (page 8, lines 9-10): What was the rationale for hand-searching reference lists and consulting with content experts?
Response: These are both steps typically conducted in systematic reviews.
Reviewer 4 (page 8, line 15): Describe the types of expertise the investigators/staff had (experience in health services research, HROs, evidence briefs, etc.).
Response: Added that all investigators have expertise in conducting systematic reviews of health services research.
Reviewer 4 (page 8, lines 34-35): What was the level of disagreement which needed to be resolved by consensus?
Response: Agreement was generally high. We added a qualitative description of the level of agreement to the report.
Reviewer 4 (page 10, lines 37-60): Seeing the table made me think about what other domains were highlighted in the articles but not shown here.
Response: Added a list of additional complementary practices that emerged from the literature under KQ1 in response to an earlier comment.
Reviewer 4 (page 11, lines 50-53): Might define what “robust process improvement” means; it can be a confusing term.
Response: Added definition of robust process improvement.
Reviewer 4 (page 12, lines 58-59): Say more about what “variety of health care leaders, providers and staff” means. What service areas do they cover? What type of managers? Are safety and quality leaders executive-level leaders or middle-level managers?
Response: Added more detail here to indicate the range of leaders, providers, and staff targeted by these frameworks. Also added detail to indicate these frameworks target a variety of service areas.
Reviewer 4 (page 13, lines 24-46): There is a lot of information in this paragraph and it’s easy to get lost in the details. It might be easier to comprehend if it was provided in a bulleted format to allow easier comparison across frameworks.
Response: Added bullet points to this paragraph.
Reviewer 4 (page 14, line 42): “VA sites were interviewed about integration of HRO into their health care systems” is not an accurate depiction of this study. I believe the study assessed patient safety practices aligned with HRO principles. It was a secondary analysis of data collected for a study focused on patient safety indicators.

An important shortcoming of the Oro 2.0 tool is that it is not meant to compare results across multiple hospitals. As it has developed, I’m not sure if the Joint Commission’s opinion has moved on this. I’m not certain if any of the tools presented have tried to compare cross-hospital progress.

Response: Revised this section to better describe the original study and the secondary data analysis. Also included the fact that the Oro 2.0 was designed to be used in a single hospital, in response to an earlier comment.
Reviewer 4 (page 17, line 38): The term SSE hasn’t been used in a while; may want to define it here again.
Response: Added definition here.
Reviewer 4 (Summary and Discussion): A few discussion points come to mind as I read this section. 1) It is critical to think about context.
Response: Added a statement on how health care systems may implement different HRO interventions depending on their individual needs and contexts to our “Limitations” section, in response to another comment.
Reviewer 4: 2) How do these tools allow for cross-hospital comparisons? Is this the goal of the VA’s HRO initiative?
Response: Correct, the VA is looking for tools that allow for cross-hospital comparisons. We added a statement to the “Findings” and “Discussion” sections to describe that although the Oro 2.0 (the most comprehensive HRO evaluation tool) was not designed specifically for cross-hospital comparisons, the data are output in a way that they could be shared and analyzed between VHA hospitals.
Reviewer 4: 3) Training on HRO principles may not be enough to move an organization. I did not see training on system redesign tools and methodologies listed.
Response: Agreed that HRO training may not be enough to move an organization. We added a statement to “Gaps and future research” suggesting that future research may want to explore the extent to which HRO training does, or doesn’t, address/overlap with system redesign.
Reviewer 4: 4) It’s unclear how HRO frameworks deal with differences in HRO practices across different services (e.g., medical, surgical). Should they? Have frameworks focused on this?
Response: We only included studies that assessed HRO implementation at a system level (i.e., included both medical and surgical units as appropriate), so all our frameworks addressed multiple services and none conducted subgroup analyses by service type. We added a statement to “Gaps and future research” suggesting that future research studies note where intervention components were delivered (e.g., medical or surgical service areas) to help tailor HRO delivery to different contexts.
Reviewer 4: 5) Have HRO frameworks been developed and aligned with organizational transformation models or other frameworks for improving quality? There may be other measures or concepts to assess which have not been presented in this evidence brief.
Response: Yes, 6 out of 8 frameworks recommended using other change management strategies in HRO implementation. We added a description of which frameworks recommend utilization of which change management strategies to the “Findings” section in response to an earlier comment.
Reviewer 4 (Limitations, page 23, lines 12-14): I might mention HRO interventions are inherently difficult to study because they can have many different components (potentially with different foci across different hospitals).
Response: Added that each hospital may also choose to implement different components of these interventions, depending on their individual needs and context.
Reviewer 4 (Gaps and Future Research, page 24, lines 7-8): “3) whether certain implementation frameworks or facilitators lead to better outcomes” could be separated into 3) whether implementation frameworks or other frameworks for improving quality are applied and lead to better outcomes, and 4) what the factors affecting HRO implementation are.
Response: Revised to split #3 into 2 parts.
Reviewer 4 (page 24, lines 10-11): The wait-list control point is a good one, BUT many facilities already have high reliability practices in place at baseline, which will need to be assessed. Many sites could also have already participated in initiatives, so they are more prepared for the journey (improvement capacity/adoption of Lean Six Sigma, old clinical teams training, etc.). How do we account for these ongoing or older initiatives?
Response: Added language to indicate consideration should be given to how much wait-list control sites have begun implementing HRO on their own, or are delivering similar interventions such as Lean Six Sigma.
Reviewer 4 (page 24, lines 27-28): Say more about the mechanism for change. Is this organizational transformation? Something else?
Response: Added that the mechanism of change might involve improving mindfulness or safety culture, as this aligns with our conceptual model based on Hines 2008. We see organizational transformation as the end goal, represented through improved patient safety outcomes.
Reviewer 4 (Conclusion): I might mention something about measurement here, as it is a key aim of the brief.
Response: Changed “tools” to “metrics” in the last sentence of the conclusion.
Reviewer 5 (P1, L47): I would be cautious in stating in the affirmative that medical error is the 3rd leading cause of death, and/or saying it continues to be, as the Makary & Daniel article was a commentary based on extrapolated data from the current literature attempting to articulate how big a problem it is. Since medical errors are not listed as the cause of death, this number is difficult to find, and the assessment of death from harm is not as black and white in all cases. I would recommend stating something along the lines of: if we were to document medical error as cause of death, Makary and Daniel have ascertained that it would be the 3rd leading cause of death in the country.
Response: Changed to indicate that deaths due to medical errors are estimated to be the third leading cause of death in the country.
Reviewer 5 (P2, L10): Remove “total”; it reads as if there are only a total of 20 articles published, which is not the case.
Response: Removed “total.”
Reviewer 5 (P5, L33): In review of the additional reviews of measurement, I don’t recall if I mentioned that we also improved on the Safety Climate domain of the Safety Attitudes Questionnaire (SAQ) from 2016 to 2017 (during the time everyone was up and running on training) by 5 percentage points, with a sample size greater than 68,000 respondents, so it was found to be quite significant. In addition, when drilling down to our regions, all showed improvements of 3 to 10 percentage points. Your option to add if you so choose.
Response: Added that Providence St. Joseph Health had a 5% improvement in the safety climate domain of the Safety Attitudes Questionnaire.
Reviewer 5 (P5, L37): “Lack of leadership commitment to zero patient harm”: I would revise to indicate that it is a lack of what it takes to get to zero harm. Most leaders would agree that, yes, we need to get to zero patient harm, and even indicate that they are doing work to do so. What doesn’t happen, from my experience, is that they believe in it but do not provide the resources (people, money, skills) that it takes to get there. This sentence also needs a colon and some commas to separate the ideas of the list.
Response: Revised this section to remove “leadership commitment to zero patient harm” in response to an earlier comment, and included additional detail on the barriers to incorporating safety culture principles and practices and adopting process improvement tools and methods.
Reviewer 5 (P16, L35): Many who assess HRO use some form of safety climate survey as part of the assessment, such as the Safety Attitudes Questionnaire (SAQ), which was created by Sexton and team at the University of Texas and reflects similarities to the Flight Management Attitudes Questionnaire used in aviation to assess some of its HRO components. Something to consider adding as a measurement, perhaps.
Response: Since this tool was published before 2010, we added a discussion of this tool to the “Discussion” section.
Reviewer 6 (general comment): Thank you for providing me the opportunity to review this report. Excellent rapid review on a complex topic. See some suggested revisions below:

For the Key Findings box contained within the Executive Summary, it would have been helpful to have an initial bullet that succinctly listed the goals of the report, such as the aims described in the last sentence of the second paragraph of the Executive Summary. It would have also been helpful to have the 5 domains listed in the first bullet of the Key Findings box.

Response: Added the “objective” of the report to the Key Findings box. Because we want to keep this section brief, and the 5 common implementation strategies appear shortly afterward in the executive summary, we did not add these to the Key Findings box.
Reviewer 6: In the Background section of the Introduction, in the fourth paragraph, the Joint Commission’s 2013 HRO report is noted but should also be cited/referenced.
Response: Added citation.
Reviewer 6: In the Background section of the Introduction, in the sixth paragraph, the second sentence states that an understanding of available frameworks and their use is limited, but what about our understanding of available measures, and the impact of initiatives on those measures? Given the aims of this report, these areas should be noted as well.
Response: Added “metrics and initiatives” to this sentence, as the description of variability actually applies to all 3 key questions.
Reviewer 6: Under Eligibility Criteria, why not extend the search from 2008 to present, instead of 2010? It seems that if AHRQ was publishing a white paper in 2008, others may have also begun publishing on this topic at that time.
Response: 2010 was chosen as a start date in consultation with the operational partner. We expected it would take at least 2 years for research integrating the 5 HRO principles discussed by Hines 2008 to be published.
Reviewer 6: In the Oro 2.0 section, third paragraph, last sentence, did safety culture decrease as described, or is this a mistake, and did it increase?
Response: Safety culture did indeed decrease. Study authors don’t note what these organizational changes were, but it appears they negatively affected safety culture.
Reviewer 6: In Table 3, please include the abbreviation for PHI in the table legend; in the third row of the table, “zero SSE rate achieved in 2017” seems redundant with the statement directly above; in the fourth row of the table, in the last column, please include the month and year for the baseline.

Response: Added definition of PHI to the table key. Deleted “zero SSE rate achieved in 2017” from the Cropper 2018 study. Added dates used for baseline data in the Lyren 2016 study.
Reviewer 6: In the first paragraph of the Summary and Discussion, in the second sentence, please change the order to “frameworks and metrics,” rather than “metrics and frameworks,” to better match the aims.
Response: Reordered to “frameworks and metrics.”
Reviewer 6: In the Limitations section, in the first paragraph, second sentence, please consider citing: J Clin Epidemiol. 2014;67(11):1181-91. PMID: 25438663.
Response: Added this citation.
Reviewer 6: In the Conclusions, please change the order of the first sentence to read “frameworks and evaluation tools.” The second sentence should probably read “reduction in SSEs” rather than simply “SSEs.”

Response: Reordered “frameworks and metrics.” Revised to say, “reduced SSEs.”
Reviewer 7: Overall, I think this evidence brief is excellent. It is thorough, thoughtful, and very well done! The ESP team identified their Key Questions, which were tied to the request from the Office of the National Center for Patient Safety. The methods were clearly laid out and executed. The Key Questions were answered, gaps identified, and plans for future research addressed.
Response: Thank you.
Reviewer 7: I found “Table 1. Common HRO implementation domains across 8 identified frameworks” very useful. This table quickly identified all 8 HRO frameworks and their included components. Only 3 of the 8 contained all 5 HRO components.
Response: None.
Reviewer 7: Table 2 (Metrics for measuring progress on becoming an HRO) was also extremely enlightening. This side-by-side comparison of the 6 methods identified by the ESP group will be helpful for VHA leadership to understand the differences between these methods and then select the best one.
Response: None.
Reviewer 7: Table 3 highlighted the challenge of comparing studies of disparate quality, methods, measures, and results reporting. This is a shortcoming in the HRO literature and was clearly communicated in this table.
Response: None.
Reviewer 7: I agree with the ESP assessment of the gaps in the research. It is theorized that the implementation of HRO principles leads to improved safety outcomes and a culture of safety. This has not been validated by the research, nor has the mechanism by which these changes and improvements occur. The secular trends mentioned on page 24, which cannot be ruled out as contributing to improvements in patient safety outcomes, could be expanded on. What are these secular trends, and how are they impacting patient safety outcomes?
Response: Added a sentence on the role that the EMR could play in improving patient safety outcomes. Also added a sentence that implementation of Lean Six Sigma before or during interventions could plausibly affect outcomes as well.
Reviewer 7: I also agree with the statement about the VA being in a unique position to conduct a natural experiment with the current HRO Initiative. This is an excellent insight on the part of the ESP team. I am not criticizing, only providing additional information. The HRO Initiative is limited to 18 lead sites, but many other sites are clamoring to be part of it. I am not clear on the criteria VISN Directors used to select the lead sites, but it is likely that other sites within their VISNs, and across the VHA, are not experimentally naive. I am aware of 2 other sites within VISN 15 that are on HRO journeys already and were not selected as the lead site for that VISN. I imagine that may be true for other VISNs as well. There is no “perfect” way to conduct this type of research, and all research has limitations of some kind. I personally would love the opportunity to be involved in that kind of research.
Response: Added that consideration should be given to the extent to which “wait-list control” sites are implementing HRO on their own or using other types of change management strategies, in response to an earlier comment.

References

1.
Aboumatar HJ, Weaver SJ, Rees D, Rosen MA, Sawyer MD, Pronovost PJ. Towards high-reliability organising in healthcare: a strategy for building organisational capacity. BMJ Qual Saf. 2017;26(8):663–670. [PubMed: 28546510]
2.
American College of Healthcare Executives. Leading a Culture of Safety: A Blueprint for Success. 2017.
3.
Chassin MR, Loeb JM. High-reliability health care: getting there from here. Milbank Q. 2013;91(3):459–490. [PMC free article: PMC3790522] [PubMed: 24028696]
4.
Day RM, Demski RJ, Pronovost PJ, et al. Operating management system for high reliability: Leadership, accountability, learning and innovation in healthcare. J Patient Saf Risk Manag. 2018;23(4):155–166.
5.
Frankel A, Haraden C, Federico F, Lenoci-Edwards J. A Framework for Safe, Reliable, and Effective Care. White Paper. Cambridge, MA: Institute for Healthcare Improvement and Safe & Reliable Healthcare; 2017.
6.
Melnyk BM. Achieving a high-reliability organization through implementation of the ARCC model for systemwide sustainability of evidence-based practice. Nurs Adm Q. 2012;36(2):127–135. [PubMed: 22407205]
7.
Office of the Air Force Surgeon General. Trusted Care Concept of Operations (CONOPS). 2015.
8.
Riley W, Davis SE, Miller KK, McCullough M. A model for developing high-reliability teams. J Nurs Manag. 2010;18:556–563. [PubMed: 20636504]
9.
Ikkersheim DE, Berg M. How reliable is your hospital? A qualitative framework for analysing reliability levels. BMJ Qual Saf. 2011;20(9):785–790. [PubMed: 21441603]
10.
Kenneth MJ, Bendaly N, Bendaly L, Worsley J, FitzGerald J, Nisker J. A measurement tool to assess culture change regarding patient safety in hospital obstetrical units. J Obstet Gynaecol Can. 2010;32(6):590–597. [PubMed: 20569541]
11.
Mousavi SM, Dargahi H, Mohammadi S. A study of the readiness of hospitals for implementation of high reliability organizations model in Tehran University of Medical Sciences. Acta Med Iran. 2016;54(10):667–677. [PubMed: 27888596]
12.
Mousavi SMH, Jabbarvand Behrouz M, Zerati H, et al. Assessment of high reliability organizations model in Farabi Eye Hospital, Tehran, Iran. Iran J Public Health. 2018;47(1):77–85. [PMC free article: PMC5756604] [PubMed: 29318121]
13.
Randall KH, Slovensky D, Weech-Maldonado R, Patrician PA, Sharek PJ. Self-reported adherence to high reliability practices among participants in the children’s hospitals’ solutions for patient safety collaborative. Jt Comm J Qual Patient Saf. 2019;45(3):164–169. [PubMed: 30471989]
14.
Sullivan JL, Rivard PE, Shin MH, Rosen AK. Applying the high reliability health care maturity model to assess hospital performance: a VA case study. Jt Comm J Qual Patient Saf. 2016;42(9):389–411. [PubMed: 27535456]
15.
Brilli RJ, McClead RE, Jr., Crandall WV, et al. A comprehensive patient safety program can significantly reduce preventable harm, associated costs, and hospital mortality. J Pediatr. 2013;163(6):1638–1645. [PubMed: 23910978]
16.
Crandall KM, Sten MB, Almuhanna A, Fahey L, Shah RK. Improving apparent cause analysis reliability: a quality improvement initiative. Pediatr Qual Saf. 2017;2(3):e025. [PMC free article: PMC6132456] [PubMed: 30229162]
17.
Cropper DP, Harb NH, Said PA, Lemke JH, Shammas NW. Implementation of a patient safety program at a tertiary health system: A longitudinal analysis of interventions and serious safety events. J Healthc Risk Manag. 2018;37(4):17–24. [PubMed: 29604147]
18.
Lyren A, Brilli R, Bird M, Lashutka N, Muething S. Ohio children’s hospitals’ solutions for patient safety: a framework for pediatric patient safety improvement. J Healthc Qual. 2016;38(4):213–222. [PubMed: 26042749]
19.
Saysana M, McCaskey M, Cox E, Thompson R, Tuttle LK, Haut PR. A step toward high reliability: implementation of a daily safety brief in a children’s hospital. J Patient Saf. 2017;13(3):149–152. [PubMed: 25119785]
20.
Muething SE, Goudie A, Schoettker PJ, et al. Quality improvement initiative to reduce serious safety events and improve patient safety culture. Pediatrics. 2012;130(2):e423–e431. [PMC free article: PMC3408689] [PubMed: 22802607]
Prepared for: Department of Veterans Affairs, Veterans Health Administration, Health Services Research & Development Service, Washington, DC 20420

Prepared by: Evidence Synthesis Program (ESP) Coordinating Center, Portland VA Health Care System, Portland, OR, Mark Helfand, MD, MPH, MS, Director

Suggested citation:

Veazie S, Peterson K, Bourne D. Evidence Brief: Implementation of High Reliability Organization Principles. Washington, DC: Evidence Synthesis Program, Health Services Research and Development Service, Office of Research and Development, Department of Veterans Affairs. VA ESP Project #09-199; 2019. Available at: https://www.hsrd.research.va.gov/publications/esp/reports.cfm.

This report is based on research conducted by the Evidence Synthesis Program (ESP) Center located at the Portland VA Health Care System, Portland, OR, funded by the Department of Veterans Affairs, Veterans Health Administration, Health Services Research and Development. The findings and conclusions in this document are those of the author(s) who are responsible for its contents; the findings and conclusions do not necessarily represent the views of the Department of Veterans Affairs or the United States government. Therefore, no statement in this article should be construed as an official position of the Department of Veterans Affairs. No investigators have any affiliations or financial involvement (eg, employment, consultancies, honoraria, stock ownership or options, expert testimony, grants or patents received or pending, or royalties) that conflict with material presented in the report.

Copyright Notice

This publication is in the public domain and is therefore without copyright. All text from this work may be reprinted freely. Use of these materials should be acknowledged.

Bookshelf ID: NBK542883; PMID: 31233295
