NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Henriksen K, Battles JB, Keyes MA, et al., editors. Advances in Patient Safety: New Directions and Alternative Approaches (Vol. 1: Assessment). Rockville (MD): Agency for Healthcare Research and Quality; 2008 Aug.


Common Cause Analysis: Focus on Institutional Change



The Children’s Hospital of Philadelphia has created a mechanism for sharing root cause analysis (RCA) findings with senior leaders through annual common cause analysis (CCA). As each RCA is completed, reports are shared with senior leaders and discussed each month at the Patient Safety Advisory Committee meeting. We have found it helpful to summarize these findings each year by organizing the action items into themes. This practice was initiated 2 years ago, and as a result, several high-scoring items have been included on the organizational operating plan for the upcoming fiscal year. Given that endorsement by senior leadership is key to initiating change, these data have proven beneficial in gaining that endorsement.

In a year, an average of 25 events, including serious events and near misses, are evaluated. Of those 25 events, about 16 undergo a formal RCA, yielding approximately 10 action items each. These are sorted under various headings, such as by department and by National Center for Patient Safety Triage Card™ categories (human factors: communication, training, and fatigue/scheduling; barriers; rules/policies/procedures; and environment/equipment). Once themes are identified, action items are listed under the appropriate theme. Themes are then prioritized by scoring for severity, occurrence, and detectability. Findings presented to senior leaders and other appropriate groups provide objective data for departments. Staff members who conduct RCAs are included in discussions to provide details of the findings and recommendations. They are also included whenever possible in efforts to make changes. This process closes the loop for those conducting the RCA. Once items are added to the organizational operating plan, multiple issues can be addressed through one effort, raising the level of commitment to address the items.


Root cause analysis (RCA) is a widely recognized tool used by high-risk industries to identify underlying causes through retrospective analysis of events. RCA has been systematically introduced into health care as part of the patient safety movement. Typically, RCA in health care is performed in response to a single patient safety event or a cluster of similar events, with the goal of identifying causal factors. These factors necessitate a detailed plan for improvement in response to the event. Organizations in which numerous RCAs are conducted can easily become overwhelmed by the list of improvement actions. In addition, this process may not reveal deeper themes and more common causes of patient safety events. Necessary improvement actions may be broader in scope than a particular case may reveal. In some cases, the findings from a single case may prompt an organization to take action, only to learn that the action was a partial solution, or worse, that it produced unintended negative consequences.

At The Children’s Hospital of Philadelphia, a less well-known approach—common cause analysis (CCA)—is utilized. CCA helps us analyze data from RCAs, which in turn allows us to recognize trends and establish themes in patient safety. CCA ultimately allows us to gain support for leveraging change by prioritizing and incorporating the identified themes into an annual organization-wide operating plan.

Conducted annually, CCA consists of a review of all RCA findings from the previous year. The process includes:

  • Assessing all identified action plans based on identified vulnerabilities.
  • Determining the extent to which action plans were completed.
  • Sorting and analyzing data to identify common themes.
  • Assigning risk priority numbers (RPNs) to themes, in a manner similar to the risk prioritization used in failure modes and effects analysis (FMEA).
  • Discussing and reviewing CCA findings with key stakeholders and ultimately senior administrative and medical leadership.
  • Using the themes, once they are validated, to shape institutional priorities.

Senior leaders at our institution convene annually to create an organizational operating plan, which defines priorities and goals for the coming year. The plan is organized according to a “five-pillar model”:

  1. Quality and patient safety.
  2. Service.
  3. People.
  4. Growth.
  5. Finance.

Each pillar identifies the highest priority projects for the coming year and defines key measures that are carefully tracked and reported regularly across the institution.1

For the past 2 years, vulnerabilities identified through the CCA process have been incorporated into the development of the annual organizational “Quality and Patient Safety” pillar. This course of action assures that appropriate resources and attention are devoted to the most important patient safety vulnerabilities. The outcome of this process also inspires an increased level of commitment and investment for staff members who participate in RCA teams, as their efforts are validated by institutional response to RCA findings. In this way, staff members and physicians are able to appreciate the importance that the organization places on patient safety.


In recent years, improving patient safety has been identified as one of the key challenges in health care. Professional and consumer literature constantly draws attention to human error and the frailties of our health care system. One of the critical transforming concepts of safety science is that a “system,” not an individual acting alone, predominates in establishing safety. The patient safety movement seeks to build a culture in which systems mitigate human error and prevent harm to patients.2

The basic premise of the systems approach is that humans are fallible, and errors are to be expected, even in the best people and organizations. When an error results in harm to a patient, it should be regarded as a consequence rather than a cause, having its origins in upstream systemic factors.3 In other words, systems should be designed to mitigate human error. The system becomes the focus in creating an environment that is safe for patients and employees.

Reason’s “Swiss Cheese Model” (Figure 1) is often used to depict the way systems place barriers designed to prevent harm and create safe processes.4 Yet, every barrier has weaknesses. “Active failures” are unsafe acts committed by people who are in direct contact with the patient or system; they include slips, lapses, and mistakes. “Latent conditions” are the inevitable system failures that relate to design—such as alarms that are not trustworthy, understaffing, or poor product design.

Figure 1. The “Swiss Cheese Model” depicting the way systems place barriers designed to prevent harm and create safe processes. Source: Adapted from Reason 2002 and US Department of Veterans Affairs NCPS Triggering and Triage Cards™.

Individually, one active failure or latent condition may not threaten patients, but they can align to allow a human error to result in a harmful serious event. Left unchanged, a system can only be expected to continue to achieve the same results. To achieve a different level of performance, it is essential to change the system in ways that improve its ability to intercept errors.5

The medical model has traditionally been one of looking at human error and assigning blame to the error. However, failure to recognize and address the system context in which clinicians provide patient care will doom subsequent clinicians to initiate the same chain of events, resulting in an injury to future patients. Identifying system deficiencies and vulnerabilities requires specifically designed analytic tools. RCA is one such tool.

When confronted with many system vulnerabilities, institutional leaders often need further guidance in prioritizing the approach to addressing them. In this report, we describe our experience using CCA to learn about the deeper causes of patient safety events identified through RCA, and the associated methodology that helps prioritize opportunities for improving patient safety at an organizational level. CCA is used in industry as a method of identifying the common causes of errors, but it has not been widely used in health care.6 Like many safety techniques developed, practiced, and accepted in high-reliability industries, such as aviation and nuclear energy, CCA also offers value when applied to health care.

To understand how we have adapted CCA, it is important to first review our RCA methodology. Although RCA—an intense analysis of patient safety events—has been mandated for sentinel events by the Joint Commission since 1997, institutions vary widely in their approaches to this requirement.7 In the Commonwealth of Pennsylvania, additional requirements have been in place as part of the Medical Care Availability and Reduction of Error Act (MCARE), or Act 13 of 2002.8 The RCA program is designed to meet the requirements of our regulatory organizations and, more importantly, to assure that the organization’s standards and commitment to patients are upheld.

At The Children’s Hospital of Philadelphia, the RCA method was developed with a goal of reducing variation in the process, to achieve reliable results and to optimize the investment of time by front-line clinicians. A dedicated team of Clinical Process Managers facilitates all of the RCAs in conjunction with a team leader and team members, who are selected for each case from relevant front-line staff.

To have a reality-based discussion about an event, representatives from front-line staff are critical members of each team. These individuals work daily within the system and provide authoritative information regarding its failure points as well as potential solutions. Immediately following the identification of a near miss or serious event resulting in patient harm, an RCA team is chartered through the Patient Safety Officer. The assigned Clinical Process Manager then recruits a physician or nurse team leader with expertise relevant to the event. Next, five to six additional front-line staff and physician members are chosen to make the team representative of the major roles involved in the event. Individuals actually involved in the event are interviewed to assure that the team is working with accurate information, but they are not included in the RCA team.

Following a substantial amount of pre-work and investigation by the Clinical Process Manager, the team plans three to four 2-hour team meetings. In addition to analyzing the event in question, these meetings serve to teach participating clinicians to view the event from a systems perspective. They learn the language of systems thinking and maintain a focus on systems and processes while avoiding blame and hindsight bias.3, 9, 10, 11, 12, 13, 14 Participants advance their knowledge of patient safety, and most importantly, they identify vulnerabilities and formulate action plans through consensus on the suggested improvements.

This RCA methodology was adopted from that used by the U.S. Department of Veterans Affairs National Center for Patient Safety (NCPS). It starts with a process flow developed for the event. The team carefully reviews each step leading up to the event. Three questions are asked:

  1. What happened?
  2. What usually happens?
  3. What should happen?

The goal is to have a thorough review through repeated questioning. At each juncture, we ask “Why?” five times, to be sure every relevant aspect is revealed.14 For this to occur and to assure that real root causes are identified, it is imperative to provide an environment that supports open, honest discussion.

RCA teams use the NCPS Triage Cards™, a cognitive aid that guides participants in evaluating an event.15, 16 The standard, objective questions on these cards help keep the RCA process thorough and credible and provide a mechanism for determining the real source(s) of the vulnerability. The questions posed in the NCPS Triage Cards™ are divided into categories that reflect potential contributing factors (Table 1):

Table 1. Department of Veterans Affairs National Center for Patient Safety triage categories.

  • Human factors: communication—assesses issues related to communication, flow of information, and availability of information. This is the category with the highest frequency of identified vulnerabilities, both nationally and at our institution.17, 18
  • Human factors: training—assesses issues related to routine job training, special training, and continuing education.
  • Human factors: fatigue/scheduling—examines the influence of stress and fatigue, which may result from change, scheduling, staffing issues, sleep deprivation, or environmental distractions, such as noise.
  • Environment/equipment—evaluates factors related to use and location of equipment, fire protection and disaster drills, codes, specifications, and regulations.
  • Rules/policies/procedures—assesses the existence and accessibility of directives used to inform and implement a consistent approach to various care processes, including technical information for assessing risk, a mechanism for feedback on key processes, and relevant committees or other leaders who inform the development of these rules.
  • Barriers—assesses the effectiveness of the barriers or processes put into place to protect patients from harm and the interaction or relationship to rules/policies/procedures.

Once vulnerabilities are identified in all of the relevant categories, the team is charged with identifying potential improvements. Assignment of this work may begin during the RCA process, and improvement actions are often initiated even before the RCA is complete.

Once an analysis of the event is completed, the team prepares a final report, which includes the identified vulnerabilities by category and priority, as well as recommended improvement actions. The report is made available first to the team, to those involved in the event, to relevant leaders at all levels of the organization, to the committees responsible for patient safety, and to senior leaders. The report is then presented to the Patient Safety Advisory Committee, which comprises unit level medical directors, nursing leadership, pharmacy, information systems, risk management, and other administrative leaders.

At the committee meetings, a triage process identifies the most important actions and potential process owners, with a focus on action items that are appropriately handled locally or are within the scope of existing committees. For example, if it is determined that the hospital formulary does not clearly specify dosing information related to a certain medication, the Therapeutic Standards Committee would be the body with the authority to make these changes. Pharmacy, as an agent of the Therapeutic Standards Committee, would lead this work to edit the formulary in consultation with physician experts.

All vulnerabilities and suggested improvement actions are documented in an RCA database, which is also used to track completion of improvement actions. Not only does this RCA database assist in tracking progress, it is also integral to the conduct of the annual CCA.

Because no single person or department completely owns the responsibility for actions, this improvement process can be complex. For example, if an RCA were conducted and a vulnerability identified in a piece of equipment with an inherent patient safety risk, the departments involved might include nursing (possibly the primary group using the equipment), supply chain (which has the link to the manufacturer), and the physicians or others who write orders related to the equipment.

If safety concerns with a given piece of equipment are identified, sorting this topic by NCPS categories might reveal “Environment & Equipment” issues. The need for a “Rule/Policy/Procedure” may be a factor if it is determined that safety information was not available to the end user. Finally, if the equipment was used incorrectly, “Human Factors Training” may be a concern. Assigning departments and NCPS categories to vulnerabilities can be challenging, but more detail in this part of the process provides direction for later improvements.

An average of 25 to 30 serious events and high-risk near miss situations are identified each year, and RCA is conducted for about half of these. The other half are discussed and reviewed in other forums, including multidisciplinary morbidity and mortality conferences, where systems vulnerabilities may also be identified. Over a 2-year period from June 2005 through June 2007, 35 RCAs yielded 375 vulnerabilities. Some were addressed immediately: changes in order sets, reference information, training, etc. However, many identified vulnerabilities were more complex and required significant resources and goals in order to be addressed fully.


Each RCA identifies on average about 10 improvement actions, and the RCA database contains the action plans related to all the identified vulnerabilities. The overwhelming number of vulnerabilities and action plans rapidly saturated the organization’s ability to make all recommended changes, and it became evident that a systematic method was needed to organize data in a way that would help prioritize improvement efforts. Evaluation of trends and patterns across the organization became the impetus for the development of a CCA.

The Department of Veterans Affairs NCPS has been analyzing aggregate RCA findings to improve patient safety by focusing on one topic at a time, such as patient falls, medication errors, or missing patients.19 Our methodology does not categorize RCA findings by incident type; rather, the aggregate findings obtained through our CCA process provide a powerful tool to focus and prioritize findings and themes for senior leaders.

The first step of the CCA involved sorting data within each of the NCPS categories. Each year, action items in the RCA database that were less than 100 percent complete were sorted so that we could identify themes. Degree of completion was defined using the following possible scores:

  • 0 percent: no action taken.
  • 25 percent: process owner assigned and first meeting has taken place.
  • 50 percent: recommendations for change identified by team.
  • 75 percent: implementation of some changes, or pilot study.
  • 100 percent: improvement action completed.

Of the 375 vulnerabilities identified between June 2005 and June 2007, 165 (44 percent) met the criteria to be included in the CCA. First, they were sorted according to primary department; this entailed identifying the relevant department(s) and isolating those with the authority to make changes. Items were then sorted according to NCPS categories.
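The filter-and-sort steps above can be sketched in code. This is a minimal illustration under simplifying assumptions: the record fields (`completion`, `department`, `ncps_category`) and the sample items are hypothetical, not our actual RCA database schema.

```python
# Hypothetical sketch of the CCA sorting step, assuming the RCA database
# is a simple list of action-item records.
from collections import defaultdict

action_items = [
    {"id": 1, "completion": 100, "department": "Nursing",  "ncps_category": "HF: Communication"},
    {"id": 2, "completion": 50,  "department": "Pharmacy", "ncps_category": "Rules/Policies/Procedures"},
    {"id": 3, "completion": 0,   "department": "Nursing",  "ncps_category": "HF: Communication"},
    {"id": 4, "completion": 75,  "department": "Nursing",  "ncps_category": "Environment/Equipment"},
]

# Step 1: keep only items that are less than 100 percent complete.
open_items = [item for item in action_items if item["completion"] < 100]

# Step 2: group the open items by department and NCPS category so that
# clusters of related vulnerabilities (candidate themes) become visible.
grouped = defaultdict(list)
for item in open_items:
    grouped[(item["department"], item["ncps_category"])].append(item["id"])

for key in sorted(grouped):
    print(key, grouped[key])
```

A bucket containing several item IDs would flag a candidate theme for the scoring step that follows.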

After items were sorted into departmental and NCPS categories, trends emerged, and themes were identified encompassing multiple items. Once themes were identified, the process of scoring began. To score themes objectively, a criticality index was employed, and risk priority numbers (RPNs) were assigned.20 The mechanism for assigning an RPN was the same as that used in failure modes and effects analysis (FMEA). Each theme was scored on three factors: severity, occurrence, and detectability. Each factor was assigned a score on a scale of 1 to 10, and the three scores were multiplied for a total RPN ranging from 1 to 1,000. Institutional experience with FMEA suggested a standard RPN cutoff point of 250; themes scoring higher than that threshold were considered priority action items that merited a plan of action.

  • Severity (S) involves evaluating the potential outcome for the patient. For example, if there were no effect at all, the score would be S = 1; if there were a moderate effect or temporary harm, the score might be S = 5; a catastrophic event or multiple deaths would rate an S score of 10.
  • Occurrence (O) evaluates the risk of the event occurring again. The likelihood that an event might almost never occur, or occur <1/1,000 times, would rate a score of O = 1; if the possibility that something might happen again were slight, e.g., 2 to 3/1,000, the O score might be 5; if the risk of recurrence were almost certain, e.g., >300/1,000, the score would be O = 10.
  • Detectability (D) refers to the clinician’s ability to recognize when an error has occurred and to respond before an adverse outcome affects the patient. For example, an event that was obvious and immediately self-revealing would receive a score of D = 1; if we had to wait for an early warning/symptom of the problem to show up in a test or alarm, then the D score might be 5; if it were impossible to detect the problem in time to react or if it went completely unnoticed, the score would be D = 10. Error detectability can also relate to a process. For example, a medication error at the point of ordering might be detected at one of many steps, whereas an error at the point of administration would be less apparent; hence, the latter would receive a higher detectability score.
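The RPN arithmetic described above can be sketched as follows. The theme names and all scores except the nurse-call example (S = 8, O = 9, D = 8, RPN = 576, taken from the case example later in this report) are hypothetical; the 250 cutoff reflects our institutional FMEA experience.

```python
# Minimal sketch of FMEA-style RPN scoring as described in the text.
RPN_CUTOFF = 250  # themes above this threshold become priority action items

def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Risk priority number: the product of three 1-10 scores, range 1-1,000."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("each factor must be scored 1-10")
    return severity * occurrence * detectability

# Hypothetical themes with (S, O, D) scores; only the first comes from the report.
themes = {
    "Nurse-call system reliability": (8, 9, 8),   # RPN 576, per the case example
    "Night-shift transfer handoffs": (6, 5, 7),
    "Formulary dosing clarity":      (7, 4, 6),
}

# Rank themes in descending RPN order and flag those above the cutoff.
ranked = sorted(themes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
priorities = [name for name, scores in ranked if rpn(*scores) > RPN_CUTOFF]
```

Multiplying the three factors means a theme must score poorly on more than one dimension to cross the threshold, which is what makes the cutoff useful for separating the few organization-level priorities from the long tail of local fixes.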

To assess inter-rater reliability, each RCA Clinical Process Manager independently scored each theme for severity, occurrence, and detectability. Clinical Process Managers then engaged in a discussion to assure that individual scores truly reflected severity, occurrence, and detectability. The results of this discussion led to a consensus and confirmed the RPN.

Once these steps were completed, the annual CCA report was developed to include themes with an RPN of over 250. Common causes or themes were ranked in descending order by RPN, and the report was finalized. Before making the report public, findings were tested with relevant leaders for face validity.

While significant emphasis was placed on developing and recognizing themes or common causes, ultimately some proportion of action items and vulnerabilities could not be reasonably incorporated into any of the top five or six themes. These remained in our RCA database for continued tracking.

In our experience, a subset of action items were thoroughly assessed and found to be “not feasible.” An example of a patient safety vulnerability for which the recommended action item was “not feasible” might have related to the transfer of patients on night shift. For a variety of reasons, transferring a patient from one unit to another during off-hours poses great challenges. Since circumstances can arise that require patient transfer at night, it would not be feasible at this time to create a rule or standard that no patients be transferred on night shift. Instead, we focused on understanding and communicating the vulnerability, reducing the frequency, making the night shift transfer process safe, and creating systems and safeguards around this known vulnerability.

Items determined to be “not feasible” remained listed in the RCA database and thus provided evidence to support relevant efforts to mitigate their impact, even if they could not be completely addressed. Some “not feasible” items related to technology limitations. Since there is always the potential to incorporate solutions as technologies evolve, they remain in the database and are reevaluated regularly.

Case Example

The following is a step-by-step example of how the RCA and CCA processes combined to improve care. The scenario described, although altered somewhat from our actual data, provides a realistic representation of our process.

Below is a sampling of action items taken from one RCA that contributed to a theme in our CCA. The action items arose from a serious event in which a patient was found with the cardiorespiratory monitor alarms disabled, along with additional cases in which monitors linked to a nurse-call system failed.

Sample RCA suggested improvement actions:

  1. Institute a policy and provide education stating that the nurse caring for the patient is the only person who may disable the cardiorespiratory monitor alarms, with the stipulation that the nurse needs to be physically present at the patient’s bedside to take this action.
  2. Reduce variation in practice and guide clinicians by creating a cardiorespiratory order set in the electronic Computer Physician Order Entry system.
  3. Add priority capabilities to the central alarm system on the unit where the event occurred.
  4. Reduce variation in the types of monitors used in medical-surgical units. Replace all monitors throughout the network with a standard monitor.
  5. Develop the practice and an associated policy for the “Code Blue” function to be integrated with the nurse-call system.
  6. Write SOPs (Standard Operating Practices) and incorporate these into the standards of care and patient care policies for the nurse-call system.

The first step, as described above, was to look at the departments involved or impacted. In this case, biomedical engineering played a significant role. Front-end users needed to develop policies and consider education and standards of care, so other departments cited included nursing, respiratory therapy, physical therapy, occupational therapy, speech, child life, and physicians. Existing organizational groups—such as Clinical Decision Support, the Medical Devices Committee and others—provided a mechanism for expediting the work.

After sorting by department, items were sorted by NCPS category. In this case, “Environment and Equipment” and “Rules, Policies and Procedures” both represented categories of vulnerabilities. “Human Factors Communication” and “Human Factors Training” also contained vulnerabilities. Sorting for trends among affected departments and NCPS categories allowed us to identify a theme; each theme involved multiple departments and matching NCPS categories. For example, another theme might relate more specifically to the department of surgery, such as issues of procedures and communication in the operating room.

In evaluating these findings, the following statement captured the essence of these items as they related to the internally published “Common Cause” or theme that was a part of the annual CCA: “Evaluate and implement a safe and effective nurse-call system. Include, but do not limit to, an evaluation of staffing requirements, downtime procedures, and standard operating practices for all users.” The RPN for this item was 576 (S = 8, O = 9, D = 8).

For reporting purposes, all action items and vulnerabilities are bulleted under the theme, along with identifiers relating them to specific RCAs, and tracked in the RCA database. In general, the CCA results in five or six main themes each year. In a review of the process, these themes account for the majority of identified vulnerabilities and improvement actions from RCA. Themes are presented in rank order; organizational prioritization includes such considerations as resource availability and other pillar plan priorities.

In the case described above, capital budget funds had been planned for the purchase of new monitors, so the recommendation of the RCA team—to have the same cardiorespiratory monitors throughout the organization—was quickly implemented. The critical nature of the RPN score provided the impetus for the creation of a team to address all components affecting the monitor: the order set, creation of policy, standards governing which members of the health care team can safely disable alarms, etc. The degree of urgency directly correlated with the severity of the situation, and the work was accomplished in a timely manner.

Nurse-call issues were also assigned to special teams to address the specific issues identified and to evaluate the patient-to-nurse communication system and make improvements. These numerous actions taken by CCA-driven teams have resolved 20 individual items. More importantly, though, they have addressed the deeper underlying theme with respect to the vulnerabilities of patients on cardiorespiratory monitors.

Although improvement work was initiated at the completion of the RCA on the monitoring case, it was not until related findings were collected from several RCAs using the CCA approach that the organization understood the depth and breadth of the vulnerabilities. This new understanding encompassed both the cardiorespiratory monitoring systems and the associated nurse-call system, as the two are used as integrated technology, with monitor alarm notifications sent via the nurse-call system. It was at this point that the item was incorporated into the organization’s annual operating plan. As a result, the work accelerated to the level needed to make the significant changes in practice, changes in equipment, and continued improvement of all associated systems across the institution. Changes of this breadth and scope often require significant allocation of resources, personnel, and time. They would not have been expedited without a credible prioritization process, such as that provided by CCA.

The initial improvement team included biomedical engineering, physician and nursing leaders and front-line staff, respiratory therapy, occupational therapy, physical therapy, speech, child life, a nursing educator, and a team facilitator from the Center for Quality and Patient Safety. The team worked to test and implement standards of care and hard-coded the expected practices in policies. A broad education program was initiated, and a system for measuring performance was put into place. Periodically, the team was asked to report to the Patient Safety Committee to assure that the work was progressing and to identify major barriers that might require leadership intervention. At the same time, additional teams were set up to evaluate the nurse-call system and the integration with the cardiorespiratory monitoring system. Staffing patterns changed, and responsibilities for alarm response were better defined. Education and training of clinical support staff assured that they understood their role in cardiorespiratory monitoring.


CCA has been an extremely effective method for distilling individual patient safety findings to prioritized themes that have a powerful impact when presented to operational and physician leaders, as well as the hospital’s board of trustees. The findings from this process are a guiding foundation in the development of the organization’s Operating Plan for Quality and Patient Safety in each of the past 2 years.21 The five or six annual themes with an RPN score >250 become a prioritized quality and patient safety goal. Executive sponsorship is defined, operational leadership is assigned, and an improvement team is chartered to test, implement and spread the changes needed to bring about improvement. Measures are tracked monthly, and regular progress reports are provided in a number of forums. When the identified area for improvement is limited to a single clinical department, as opposed to being organization-wide, the goal is incorporated into the department’s operating plan, and improvement continues on a smaller scale.

To make this process effective, Clinical Process Managers from the RCA team provide department leaders with detailed summaries of their findings and review cases that were influential in the items encompassed in the theme. Although an established process for sharing findings with leaders assures that they see each RCA report, a synopsis of findings that applies to their area is powerful. A meeting that includes leaders from the appropriate area provides an opportunity for questions to be asked and for clarification of each item. These sessions often lead to brainstorming about how to improve patient safety systems. Whenever possible, RCA Clinical Process Managers, who facilitated the related work, are included in the actual planning in order to provide insight, or in some cases, to facilitate or support working groups. This process helps convey nuances that a high level report cannot, and it allows information to be shared with the front-line staff who participated in teams.

To be recognized as effective, a patient safety program must identify the critical priorities and make improvements that address them. The CCA report is a vital tool in prompting action because it inspires greater confidence that the identified problems represent the most important vulnerabilities to be addressed. Competing priorities at the patient care unit level make it difficult to determine how to incorporate the new work that results from RCAs. CCA findings can help department leaders align their efforts with organizational priorities. Overall, this process provides greater confidence that we are working on the most critical issues and, we hope, improving patient care and making it safer.

Next Steps

For the CCA described here, RCA findings were the primary source of data (Figure 2). Although we worked carefully to contemplate broad, system-level ramifications and considered their severity and potential impact on patient safety, we recognized that many additional sources of information could be systematically incorporated into this process.

Figure 2. Common cause analysis: Years 1 and 2.


Input from RCAs is ultimately limited to the relatively few cases that warrant this type of analysis. In the future, a greater breadth of information will be used to conduct CCA (Figure 3). Although we already have an incident reporting database, a new electronic safety reporting system currently being implemented will improve the quality of patient safety event data. This system will ultimately capture information from families and patients, as well as from employees, and with much more robust data mining capabilities it will be a rich source of data for identifying institutional vulnerabilities. Aggregate data from FMEAs, infection control findings, and a newly developed process within our institution called Mini-RCA (in which we train unit-based staff in the method so that it can be applied to more near-miss events) will also enrich the analysis process in the future. Although patient safety considerations generated by accrediting bodies (e.g., the Joint Commission), regulatory bodies (e.g., the Centers for Medicare & Medicaid Services), national patient safety advocacy groups, and others will continue to influence patient safety priorities, our clinicians will always give higher priority to data from internal experience, as developed through CCA.

Figure 3. Common cause analysis: What’s next?


There are many driving forces in patient safety, but one benefit of an institutional CCA is the unique, specific findings it generates, which relate to actual events and near-miss scenarios. Patient safety work becomes much more compelling when it relates to one’s own experience. The significance placed on patient safety improvement work in our institution affirms that it is the top priority, and it inspires clinicians to participate in RCAs, report their experiences, and mobilize their energy to contribute to improvement initiatives.


References

1. Studer Group LLC. Hardwiring excellence. Gulf Breeze, FL: Fire Starter; 2003.
2. Morath JM, Turnbull JE. To do no harm. San Francisco: Jossey-Bass; 2005.
3. Reason J. Combating omission errors through task analysis and good reminders. Qual Saf Health Care. 2002;11:40–44. [PMC free article: PMC1743575] [PubMed: 12078368]
4. Reason J. Human error: models and management. Br Med J. 2000;320:768–770. [PMC free article: PMC1117770] [PubMed: 10720363]
5. Berwick DM. Improvement, trust, and the health care workforce. Qual Saf Health Care. 2003;12:448–452. [PMC free article: PMC1758027] [PubMed: 14645761]
6. Heising CD, Rasmussen NC, Mak CH. Common cause analysis: a review and extension of existing methods. Cambridge (MA): Massachusetts Institute of Technology Energy Laboratory; Oct 1982. Report No. MIT-EL-82-038; PB-166520.
7. Sentinel event policy and procedures. Oakbrook Terrace, IL: Joint Commission; 2005. [Accessed January 20, 2008]. Available at: www/NR/rdonlyres/F84F9DC6-A5DA-490F-A91F-A9FCE26347C4/0/SE_chapter_july07.pdf.
8. Medical care availability and reduction of error (MCARE) Act. March 20, 2002. [Accessed January 10, 2008]. Available at: www/lib/psa/act_13/act_13.pdf.
9. Leape LL. Error in medicine. JAMA. 1994;272:1851–1857. [PubMed: 7503827]
10. Nolan TW. System changes to improve patient safety. Br Med J. 2000;320:771–773. [PMC free article: PMC1117771] [PubMed: 10720364]
11. Shappell SA. The human factors analysis and classification system – HFACS. U.S. Department of Transportation, Federal Aviation Administration; Feb 2000. Report No. DOT/FAA/AM-00/7.
12. Kaye R, Crowley J. Medical device use-safety: incorporating human factors engineering into risk management. U.S. Department of Health and Human Services, Food and Drug Administration; Jul 2000. p. 1497.
13. Gosbee J, Anderson T. Human factors engineering design demonstrations can enlighten your RCA team. Qual Saf Health Care. 2003;12:119–121. [PMC free article: PMC1743682] [PubMed: 12679508]
14. Scarrow PK, Routon C, White SV. Michael Cohen on medication error reporting and patient safety. J Health Care Qual. 2005;27:29–36. [PubMed: 16190309]
15. Bagian JP, Gosbee J, Lee CZ, et al. The Veterans Affairs root cause analysis system in action. Jt Comm J Qual Patient Saf. 2002;28:531–545. [PubMed: 12369156]
16. NCPS triage cards for root cause analysis. VA National Center for Patient Safety; 2007. [Accessed January 10, 2008]. Available at: www.html.
17. Bagian JP, Lee C, Gosbee J, et al. Developing and deploying a patient safety program in a large health care delivery system: you can’t fix what you don’t know about. Jt Comm J Qual Patient Saf. 2001;27:522–532. [PubMed: 11593886]
18. Root cause analysis statistics. Oakbrook Terrace, IL: Joint Commission. [Accessed January 10, 2008]. Available at: www/NR/rdonlyres/FA465646-5F-5F-4543-AC8F-E8AF6571E372/0/root_cause_se.jpg.
19. Dayton E, Henriksen K. Communication failure: basic components, contributing factors, and the call for structure. Jt Comm J Qual Patient Saf. 2007;33:34–47. [PubMed: 17283940]
20. Neily J, Ogrinc G, Mills P, et al. Using aggregate root cause analysis to improve patient safety. Jt Comm J Qual Patient Saf. 2003;29:434–439. [PubMed: 12953608]
21. Failure mode and effect analysis: a practical guide. Plymouth Meeting, PA: Emergency Care Research Institute; 2006. Partnership for patient care. [Accessed January 11, 2008]. Available at: www.

