BMJ. Mar 18, 2000; 320(7237): 759–763.
PMCID: PMC1117768

Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems

Paul Barach, clinical fellow and Stephen D Small, assistant anaesthetist

Reducing mishaps from medical management is central to efforts to improve quality and lower costs in health care. Nearly 100 000 patients are estimated to die preventable deaths annually in hospitals in the United States, with many more incurring injuries at an annual cost of $9 billion. Underreporting of adverse events is estimated to range from 50% to 96% annually.1-3 This annual toll exceeds the combined number of deaths and injuries from motor vehicle and air crashes, suicides, falls, poisonings, and drownings.4 Many stakeholders in health care have begun to work together to resolve the moral, scientific, legal, and practical dilemmas of medical mishaps. To achieve this goal, an environment fostering a rich reporting culture must be created to capture accurate and detailed data about nuances of care.

Outcomes in complex work depend on the integration of individual, team, technical, and organisational factors.5,6 A continuum of cascade effects exists from apparently trivial incidents to near misses and full blown adverse events.7,8 Consequently, the same patterns of causes of failure and their relations precede both adverse events and near misses. Only the presence or absence of recovery mechanisms determines the actual outcome.9 The National Research Council defines a safety “incident” as an event that, under slightly different circumstances, could have been an accident.10 Focusing on data for near misses may add noticeably more value to quality improvement than a sole focus on adverse events.

Schemes for reporting near misses, “close calls,” or sentinel (“warning”) events have been institutionalised in aviation,w1 w2 nuclear power technology,w3 w4 petrochemical processing,w5 steel production,w6 military operations, and air transportation.w7-w11 In health care, efforts are now being made to create incident reporting systems for medical near misses8,11-15 to supplement the limited data available from mandatory reporting systems focused on preventable deaths and serious injuries.

There are, however, powerful disincentives to reporting.16-18 Management attitudes and institutional climate can greatly influence the success or failure of reporting efforts.19 Reason identifies four critical elements of an effective safety culture—that is, a reporting, just, flexible, and learning culture.20 Can this model be validated in health care? Given the lack of a review that addresses these questions, we report our preliminary findings of a study of incident reporting systems for near misses in non-medical domains.

Summary points

  • Research studies have validated an epidemic of grossly underreported, preventable injuries due to medical management
  • Recent policy documents have placed high priority on improving incident reporting as the first step in addressing patient injuries, and have called for translation of lessons from other industries
  • Complex non-medical industries have evolved incident reporting systems that focus on near misses, provide incentives for voluntary reporting, ensure confidentiality while bolstering accountability, and emphasise perspectives of systems in data collection, analysis, and improvement
  • Reporting of near misses offers numerous benefits over adverse events: greater frequency allowing quantitative analysis; fewer barriers to data collection; limited liability; and recovery patterns that can be captured, studied, and used for improvement
  • Education and engagement of all stakeholders of health care and negotiation of their conflicting goals will be necessary to change the balance of barriers and incentives in favour of implementing reporting systems

Methods

Our analysis comes from three main sources: a literature search to identify incident reporting systems and related research; a compilation of nomenclature and classification of key features of select incident reporting systems; and interviews with directors of reporting systems and experts to explore the design of systems, output, and operational aspects.

Firstly, we searched computerised bibliographic databases for 1966-99, including Medline, ABI Inform, Psychlit, Social Science Citation Index, and the internet, for citations by keywords: incidents, accidents, human errors, near miss, risk, safety, quality assurance, and medical audit. Secondly, we hand searched the most relevant journals, studies in abstract form, dissertations, theses, and book chapters. We reviewed the references of each citation to identify additional descriptions of incident reporting systems in three non-medical domains. Thirdly, experts helped identify reports and issues missing from public citation lists. Definitions of key terms were extracted from reports of incident reporting systems.

The research built on interviews guided by a semistructured standardised questionnaire (see appendix 1 on website) with system directors and designers. The experts were identified from the literature search and interviews with other experts and included consultants concerned with safety monitoring systems in academia, industry, government, and the military.

Results

The box lists 12 of the 25 non-medical incident reporting systems we reviewed. Definitions of the commonest terms used to describe adverse events were assembled from the literature. With few exceptions, the existing studies each report data from different populations, and they often differ in how they define, count, and track adverse events. We found large variation in nomenclature, with no fixed and universally accepted definitions (see table A on website). Experts commented on the importance of accepted definitions in focusing priorities, data collection, research, and the impact of changes in the systems.

Reporting systems for non-medical events

Aviation

  • Aviation safety reporting system (ASRS)
  • Aviation safety action program (ASAP)
  • Air Altitude Awareness Program
  • Canadian aviation safety reporting system (CASRS)
  • British Airways safety information system (BASIS)
  • Air safety report (ASR)
  • Confidential human factors reporting program (CHFRP)
  • Special event search and master analysis (SESMA)
  • Human factors failure analysis classification system (HFACS)

NASA

  • Safety reporting system

Petrochemical processing, steel production

  • Prevention and recovery information system for monitoring and analysis (PRISMA)

Nuclear (nuclear power and radiopharmaceutical industries)

  • Licensing event reports (LER)
  • Human performance information systems (HPIS)
  • Human factors information system (HFIS)
  • Nuclear Regulatory Commission allegations systems process (NRCAS)
  • Diagnostic misadministration reports—regulatory information distribution system (RIDS)

We collected numerous structural characteristics of incident reporting systems for non-medical events (table 1). Seven of the 12 systems were mandated and implemented by the federal government, with voluntary participation. Ten systems were confidential, the other two anonymous. All stimulated elaboration by narrative. (The aviation safety reporting system has saved all of its 500 000 reports in their entirety.) Most offered feedback to their respective communities. Some offered legal immunity to reporters as long as data were submitted promptly (up to 10 days after the event for the aviation safety reporting system; see appendix 2 on website).

Table 1
Non-medical incident reporting systems

We reduced these elements to several common threads characterising near miss reporting (box). Finally, we analysed the mix of barriers and incentives that ultimately govern the success of incident reporting systems (table 2).

Common conflicts in near miss reporting systems, with examples

  • Sacrificing accountability for information—Negotiating moral hazards in choosing between good of society compared with needs of individuals
  • Near miss data compared with accident data—Near miss data plentiful, minimises hindsight bias, proactive, less costly, no indemnity
  • A change in focus from errors and adverse events to recovery processes—Recovery equals resilience; emphasis on successful recovery, which offers learning opportunity
  • Trade offs between large aggregate national databases and regional systems—National offers larger denominators, capture of rare events; regional offers potentially more specific feedback and local effectiveness
  • Finding right mix of barriers and incentives—Supporting needs of all stakeholders; ecological model
  • Safety has up front, direct costs; payback is indirect—Spending “hard” money to save larger sums and reduce quality waste
  • Safety and respect for reporters as well as patients—A just culture that acknowledges pervasiveness of hindsight bias and balances accountability needs of society
  • The need for continuous timely feedback that reporters find relevant; changing bureaucratic culture—Critical to sustain effort of ongoing reporting

Table 2
Barriers and incentives to reporting

Comparison of near misses with adverse outcomes offers advantages: (a) near misses occur 3-300 times more often than adverse events, enabling quantitative analysis7,14,21; (b) fewer barriers to data collection exist, allowing analysis of interrelations of small failures22; (c) recovery strategies can be studied to enhance proactive interventions and to de-emphasise the culture of blame5,20,23; and (d) hindsight bias is more effectively reduced.24,25

The sum of barriers and incentives can be considered in terms of their impact on individuals, organisations, and society. Powerful disincentives to reporting depend on the organisational culture26 and include extra work, scepticism, lack of trust, fear of reprisals, and the ineffectiveness of present reporting systems. Incentives to report included, beyond confidentiality, systems that are prophylactic (providing some degree of immunity), philanthropic (reporters identify with injured patients and other healthcare providers who could benefit from the data), and therapeutic (reporters learn from reporting adverse events).24 Incentives for society included accountability, transparency, enhanced community relations, and sustaining trust and confidence in the healthcare system.

Examination of successful non-medical domains indicates that the following factors are important in determining the quality of incident reports and the success of incident reporting systems: immunity (as far as practical); confidentiality or data de-identification (making data untraceable to caregivers, patients, institutions, time); independent outsourcing of report collection and analysis by peer experts; rapid meaningful feedback to reporters and all interested parties; ease of reporting; and sustained leadership support.
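As a concrete illustration of the de-identification criterion above, the sketch below strips a report record of anything traceable to a caregiver, patient, institution, or time, while keeping the narrative that analysts rely on. The field names and the `scrub_report` helper are invented for illustration only; they are not taken from any of the systems reviewed.

```python
# Fields that could trace a report back to a caregiver, patient,
# institution, or time -- the four de-identification targets named above.
# (Hypothetical schema for illustration, not from any reviewed system.)
IDENTIFYING_FIELDS = {"reporter_name", "patient_id", "institution", "timestamp"}

def scrub_report(report: dict) -> dict:
    """Return a copy of the report with identifying fields removed.

    The free-text narrative is kept intact, because narrative detail is
    what near miss analysts use to reconstruct causes and recoveries.
    """
    return {k: v for k, v in report.items() if k not in IDENTIFYING_FIELDS}

report = {
    "reporter_name": "A. Clinician",
    "patient_id": "12345",
    "institution": "General Hospital",
    "timestamp": "2000-03-18T09:30",
    "event_type": "near miss",
    "narrative": "Wrong vial selected; error caught before administration.",
}

scrubbed = scrub_report(report)
print(sorted(scrubbed))  # ['event_type', 'narrative']
```

A production system would go further, for example scrubbing names and dates out of the narrative text itself, but the principle is the same: separate identifying metadata from the analytically valuable narrative before reports leave the reporter's hands.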

Discussion

We aimed to provide an educational resource about incident reporting systems of near misses and related lessons on safety that are transferable from other industries. An organisation's interpretation of near misses influences how it collects information related to safety, and thus its capacity to prevent the recurrence of undesirable events.7 Tamuz emphasises that the use of broad ambiguous definitions of potential dangers aids discovery of risks that escape existing definitions18 (see table on website). Concessions to reporters ultimately lead to discoveries, which enable focused improvements in training, organisation, management of work, and the design of systems.

In medicine there is a long tradition of examining past practice to understand how things might have been done differently.27 However, conferences on morbidity and mortality, grand rounds, and peer review all currently share the same shortcomings: a lack of human factors and systems thinking; a narrow focus on individual performance to the exclusion of contributory team and larger social issues; hindsight bias; a tendency to search for errors as opposed to the myriad causes of error induction; and a lack of multidisciplinary integration into an organisation wide safety culture. The situation is akin to that of the field of injury control, where until there was focused public attention and demand for action on injuries and their prevention, injury remained a neglected health problem.28 Only recently have the quality and patient safety movements brought this mindset to bear on all healthcare services.3

Near miss reporting

We defined a near miss as any event that could have had adverse consequences but did not and was indistinguishable from fully fledged adverse events in all but outcome.23 Reporting systems are thought to have contributed importantly to low accident rates in industries with huge catastrophic potential by enabling managers to take a proactive, preventive approach.13,19 Finally, near misses offer powerful reminders of system hazards and retard the process of forgetting to be afraid.7

Aviation reporting systems

Investigation into public accidents and confidential near miss analyses have been complementary in the successful effort to improve air safety.24 After three decades, over 500 000 confidential near miss reports (currently over 30 000 reports yearly) have been logged by the aviation safety reporting system. Eligibility for limited immunity for non-criminal offences is a powerful incentive to report. Cracks in the framework of trust among stakeholders in aviation have been associated with notable decreases in reporting.18

Risk management in aviation illustrates how organisations cooperate by capturing near miss information to augment the sparse history of crashes and injuries.20 The decades long aviation effort to improve safety through system monitoring and feedback holds many important lessons for health care. Data from incident reporting systems on near misses have been effectively used to redesign aircraft, air traffic control systems, airports, and pilot training, and to reduce human error.18 An overarching lesson from 25 years of aviation experience is that methods for data collection and structures evolved to simultaneously maximise confidentiality, bidirectional information flow, and improvement in local processes.29

Nuclear power reporting systems

In the highly charged political, financially accountable, and legal environment of the nuclear power industry, no penalties are associated with reporting non-consequential events, or “close calls,” to the human performance enhancement system. The Three Mile Island disaster led to the emergence of norms throughout the industry. The dread of even a single potential catastrophe and its implications for all industry members outweighed any objection to a reporting system for near misses. Backed by communal pressure, local proactive safety methods became institutionalised and effective across the industry. The intensified approach to process improvement through a focus on safety led to financial gains through more efficient power production (fewer outages, shutdowns, and reduction of capacity).30 As in aviation, there is a trend to capture the most nuanced information using a nested systems approach, with confidentiality and other protections increasing in proportion to the sensitivity, value, and difficulty of obtaining the desired information.

Reporting participation: mandatory versus voluntary

The analysis and evolution of reporting systems for non-medical near misses supports the contention that all reporting, to an extent, is voluntary. Clearly, both voluntary and mandatory approaches are required, each with its own benefits and limitations. Mature safety cultures are driven by forces external and internal to industries, and over time these forces nourish voluntarism and reporting of near misses. Furthermore, rapidly improving technology and information systems enable wider monitoring and public awareness of adverse outcomes in open systems.31 These developments diminish distinctions between mandatory and voluntary behaviour.32

Anonymous versus confidential provisions

The most obvious way of ensuring confidentiality of the data and reporter is to have the reports filed anonymously. For example, excerpts from reports to the aviation safety reporting system are published anonymously in a weekly newsletter, Callback, with candid accounts of actions contributing to dangerous situations20 (see appendix 3 on website). Reports in numerous medical incident reporting systems travel only one way, anonymously.11,32-34

O'Leary and Chappell point out, however, that anonymity is not always possible or desirable.35 Analysts cannot contact reporters for more information; anonymous reports may be unreliable; and, in some situations, it is difficult to guarantee anonymity. Anonymity may also be criticised for its threat to accountability and transparency, both at variance with the ethics of professionalism.36 It may, however, be important to provide anonymity early in the evolution of an incident reporting system, at least until trust is built and reporters see practical results.

Medical reporting systems

Health care has lagged behind other industries in implementing reporting systems and other initiatives related to safety.1,3,20 In the past five years, however, there has been a concerted effort in this direction. Studies in anaesthesia,11 w5 w24 w25 emergency care,12 intensive care,32 w26 w27 transfusion medicine,15 cytology,w30 occupational and industrial medicine,w31 w32 cardiac surgery,w33 pharmacy,w34 and nursingw35; the Veterans Administration near miss incident reporting systemw36; and research into human factors in medicine6,20 w9 w10 together represent a critical mass of safety research.

A recent report from the Institute of Medicine, To Err is Human, strongly recommends complementary mandatory incident reporting systems and voluntary near miss reporting systems in health care.3 Experts in non-medical domains are quick to share anecdotes of dangers controlled by information from incident reporting systems. Many directors of reporting systems whom we interviewed believe that the debriefing process involved in confidential reporting of an incident brings closure, adds to long term recall, and supports behavioural change. The benefits of incident reporting systems in health care will be defined by a combination of longitudinal observational studies of liability and injuries, ethnographic case studies, complex economic analyses, and strong face validity.

The barrier analysis

How can we transform the current culture of blame and resistance into one of learning and increasing safety? Understanding the balance of barriers and incentives to reporting is the first step (table 2). It will be essential to introduce norms that inculcate a learning, non-punitive safety reporting culture in professional schools and graduate training programmes, with support from consumers, patient advocacy groups, regulators, and accreditors. Some trial and error learning will be necessary. Legal protection for reporters will need to be reinforced, as it has been in Australia and New Zealand, where incident reporting systems have been successful in gaining acceptance and credibility.37

Cost benefit analysis

Many high risk fields such as nuclear power technology, aviation, and petrochemical processing have shown that implementing incident reporting systems for near misses is essential because the systems benefit their organisations more than they cost.7,30,38 w23 The system developed for petrochemical processing uses seven quality indicators to assess the effectiveness of reporting systems, with particular emphasis on fairness and cost effectiveness. Directors of systems we interviewed believe that these systems not only reduce quality waste but are cost effective.39 This is similar to the worker safety climate, where companies that have had to embrace the safety rules of the Occupational Safety and Health Administration have discovered the profit of a healthy workforce.40

Evidence based medicine and improvement in outcomes are accelerating the translation of lessons learned in other domains over the past decades.41 Studies of incident reporting systems for non-medical near misses hold promise for extending this trend and catalysing a shift in the healthcare culture from a punitive to a collaborative mindset that seeks to identify the underlying system failures.42

Conclusions

Non-punitive, protected, voluntary incident reporting systems in high risk non-medical domains have grown to produce large amounts of essential process information unobtainable by other means. Non-medical incident reporting systems have evolved over the past three decades to emphasise near misses, in addition to adverse events, to encourage confidentiality over anonymity, and to move beyond traditional linear thinking about human error, to analyses of multiple causation at the level of systems.

For healthcare reporting systems there must be incentives to promote voluntary reporting—completely, confidentially, and objectively. Reporting should be the right, easy, and safe policy for healthcare professionals. To maximise the usefulness of incident reporting systems there will be a need to balance accountability, system transparency, and protections for reporters. To ease the implementation of incident reporting systems, the community must be involved in system oversight, support, and advocacy. The top priority must be to design systems geared to preventing, detecting, and minimising effects of undesirable combinations of design, performance, and circumstance. Experience with non-medical incident reporting systems in aviation, nuclear power technology, and petrochemical processing offers lessons applicable to the design of safety reporting systems in health care.

Supplementary Material

[extra: Definitions, extra references and appendices]

Acknowledgments

We thank Hal Kaplan, John Carroll, Emily Roth, Elihu Richter, and Jeff Cooper for advice and helpful suggestions.

Footnotes

Competing interests: None declared.

website extra: Definitions, extra references, and appendices appear on the BMJ's website www.bmj.com

References

1. Leape LL. Error in medicine. JAMA. 1994;272:1151–1157.
2. Cullen D, Bates W, Small S, Cooper JB, Nemeskal AR, Leape LL, et al. The incident reporting system does not detect adverse drug events: a problem in quality assurance. Joint Commission Journal on Quality Improvement. 1995;21:541–548.
3. Institute of Medicine. To err is human: building a safer health system. Washington, DC: National Academy Press; 1999.
4. Baker SP, O'Neill B, Ginsburg M, Guohua L. The injury fact book. 2nd ed. New York: Oxford University Press; 1992.
5. Vincent C, Ennis M, Audley RJ. Medical accidents. Oxford: Oxford University Press; 1993.
6. Bogner MS. Human error in medicine. Hillsdale, NJ: Erlbaum; 1994.
7. March JG, Sproull LS, Tamuz M. Learning from samples of one or fewer. Organ Sci. 1991;2:1–3.
8. Gambino R, Mallon O. Near misses—an untapped database to find root causes. Lab Report. 1991;13:41–44.
9. Van der Schaff TW. Development of a near miss management system at a chemical process plant. In: Van der Schaff TW, Hale AR, Lucas DA, editors. Near miss reporting as a safety tool. Oxford: Butterworth-Heinemann; 1991.
10. National Research Council, Assembly of Engineering, Committee on Flight Airworthiness Certification Procedures. Improving aircraft safety: FAA certification of commercial passenger aircraft. Washington, DC: National Academy of Sciences; 1980.
11. Runciman WB, Sellen A, Webb RK, Barker L. Errors, incidents and accidents in anesthetic practice. Anesth Intensive Care. 1993;21:506–519.
12. Shea CE. The organization of work in a complex and dynamic environment: the accident and emergency department [dissertation]. Manchester: University of Manchester; 1996.
13. Van der Schaff TW. Hospital-wide versus nationwide event reporting: an empirical framework based on single-department studies in hospitals. In: Proceedings of enhancing patient safety and reducing errors in health care. Rancho Mirage, CA: Annenberg Center; 1998. pp. 190–192.
14. Battles JB, Kaplan HS, Van der Schaff TW, Shea CE. The attributes of medical event reporting systems. Arch Pathol Lab Med. 1998;122:132–138.
15. Kaplan HS, Battles JB, Van der Schaff TW, Shea CE, Mercer SQ. Identification and classification of the causes of events in transfusion medicine. Transfusion. 1998;38:1071–1081.
16. Vincent C. Reasons for not reporting adverse events: an empirical study. J Eval Clin Pract. 1999;5:1–9.
17. Wu A, Folkman S, McPhee S, Lo B. Do house officers learn from their mistakes? JAMA. 1991;265:2089–2094.
18. Tamuz M. Developing organizational safety information systems for monitoring potential dangers. In: Apostolokis GE, Wu JS, editors. Proceedings of physical sciences annual meeting II. San Diego, CA: Galen Press; 1994.
19. Roberts K. Research in nearly failure-free, high reliability organizations: having the bubble. IEEE Trans Eng Manage. 1989;36:132–139.
20. Reason J. Managing the risks of organisational accidents. Aldershot: Ashgate; 1997.
21. Petersen LA, Orav JA, Teich JM, O'Neil AC, Brennan TA. Using a computerized sign-out program to improve continuity of inpatient care and prevent adverse events. Joint Commission J Qual Improvement. 1998;24:77–87.
22. Kletz T. Learning from accidents. 2nd ed. Oxford: Butterworth-Heinemann; 1994.
23. Barach P, Small SD, Kaplan H. Designing a confidential safety reporting system: in depth review of thirty major medical incident reporting systems, and near-miss safety reporting systems in the nuclear, aviation, and petrochemical industries. Anesthesiology. 1999;91:A1209.
24. Billings CE. Some hopes and concerns regarding medical event reporting systems: lessons from the NASA aviation safety reporting system (ASRS). Arch Pathol Lab Med. 1998;122:214–215.
25. Fischhoff B. Hindsight does not equal foresight: the effect of outcome knowledge on judgement under uncertainty. J Exp Psych: Human Perception Performance. 1975;1:288–299.
26. Westrum R. Cultures with requisite imagination. In: Wise J, Hopkin D, Stager P, editors. Verification and validation of complex systems: human factors issues. Berlin: Springer-Verlag; 1992. pp. 401–416.
27. Bosk C. Forgive and remember, managing medical failure. Chicago: University of Chicago Press; 1979.
28. Haddon W. Advances in the epidemiology of injuries as a basis for public policy. Public Health Rep. 1980;95:411–421.
29. Pidgeon NF. Safety culture and risk management in organizations. J Cross-Cult Psychol. 1996;22:129–140.
30. Lucas DA. Human performance data collection in industrial systems. In: Human reliability in nuclear power. London: IBC Technical Services; 1987.
31. Evans RS, Pestonik SL, Classen DC. A computer-assisted management program for antibiotics and other anti-infective agents. N Engl J Med. 1998;338:232–238.
32. Beckman U, Baldwin I, Hart GK, Runciman WB, et al. The Australian incident monitoring study in intensive care: AIMS-ICU. The development and evaluation of a voluntary anonymous incident reporting system. Anaesth Intensive Care. 1996;24:315–326.
33. Geiduschek JM. Registry offers insight on preventing cardiac arrests in children. ASA Newsletter. 1998;62(6):16–18.
34. Staender S. Human recoveries and the management of critical incidents in anesthesiology. Annenberg Center, Rancho Mirage, CA, Nov 8-10. www.medana.unibas.cirs; accessed 25 November 1999.
35. O'Leary M, Chappell SL. Confidential incident reporting systems create vital awareness of safety problems. ICAO J. 1996;51:11–13.
36. Emanuel L. In reply. JAMA. 1997;278:21–22. (Reply to editorial by L Emanuel (JAMA 1997;278:21) and article by JH McArthur and FD Moore (JAMA 1997;277:985-9).)
37. Runciman W. Iatrogenic injury in Australia. Adelaide: Australian Patient Safety Foundation; 2000.
38. Corcoran WR. The phoenix handbook: the ultimate event evaluation manual for finding profit improvement in adverse events. Windsor, CT: Nuclear Safety Review Concepts; 1998.
39. Langley G, Nolan K, Nolan T, Norman C, Provost L, editors. The improvement guide. San Francisco: Josey-Bass; 1996.
40. Robertson L. Injury epidemiology, research and control strategies. 2nd ed. Oxford: Oxford University Press; 1998.
41. Berwick DM. Continuous improvement as an ideal in health care. N Engl J Med. 1989;320:53–56.
42. Millenson M. Demanding medical excellence, doctors and accountability in the information age. Chicago: University of Chicago; 1997.
