The traditional approach to medical errors has been to blame the provider who delivers care directly to the patient, acting at what is sometimes called the “sharp end” of care: the doctor performing the transplant operation or diagnosing the patient's chest pain, the nurse hanging the intravenous medication bag, or the pharmacist preparing the chemotherapy. Over the last decade, we have recognized that this approach overlooks the fact that most errors are committed by hardworking, well-trained individuals, and such errors are unlikely to be prevented by admonishing people to be more careful, or by shaming, firing, or suing them.
The modern patient safety movement replaces “the blame and shame game” with an approach known as systems thinking. This paradigm acknowledges the human condition—namely, that humans err—and concludes that safety depends on creating systems that anticipate errors and either prevent or catch them before they cause harm. Such an approach has been the cornerstone of safety improvements in other high-risk industries but has been ignored in medicine until the past decade.
British psychologist James Reason's “Swiss cheese model” of organizational accidents has been widely embraced as a mental model for system safety1,2 (Figure 2-1). This model, drawn from innumerable accident investigations in fields such as commercial aviation and nuclear power, emphasizes that in complex organizations, a single error at the “sharp end” (the person in the control booth of the nuclear plant, the surgeon making the incision) is rarely enough to cause harm. Instead, such errors must penetrate multiple incomplete layers of protection (“layers of Swiss cheese”) to produce a devastating result. Reason's model highlights the need to focus less on the (futile) goal of perfecting human behavior and more on shrinking the holes in the Swiss cheese (sometimes referred to as latent errors) and creating multiple overlapping layers of protection, so as to decrease the probability that the holes will ever align and let an error slip through.
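The arithmetic behind the layered-defenses idea can be sketched numerically: if each layer's “hole” is treated as an independent chance of missing the error, then stacking layers multiplies those chances together. The sketch below is purely illustrative; the layer names and probabilities are hypothetical, and the independence assumption is an idealization (in real organizations, latent failures are often correlated, which is exactly why shrinking the holes matters as much as adding layers).

```python
def p_harm(hole_probabilities):
    """Probability that an error penetrates every defensive layer,
    assuming each layer fails independently of the others."""
    p = 1.0
    for hole in hole_probabilities:
        p *= hole
    return p

# One imperfect check that misses 10% of errors, versus four such
# checks stacked in series (hypothetical numbers for illustration):
single_layer = p_harm([0.10])        # 1 in 10 errors reaches the patient
four_layers = p_harm([0.10] * 4)     # roughly 1 in 10,000
```

The point of the sketch is qualitative, not quantitative: even quite leaky barriers, layered in series, sharply reduce the chance that any single slip reaches the patient, provided their holes do not line up.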
James Reason's Swiss cheese model of organizational accidents. The analysis is of “The Wrong Patient” case in Chapter 15. (Reproduced with permission from Reason JT. Human Error. New York, NY: Cambridge University Press, 1990. Copyright © 1990 Cambridge University Press.)
The Swiss cheese model emphasizes that analyses of medical errors need to focus on their “root causes”—not just the smoking-gun, sharp-end error, but all the underlying conditions that made an error possible (or, in some situations, inevitable) (Chapter 14). A number of investigators have developed schemas for categorizing the root causes of errors; the most widely used, by Charles Vincent, is shown in Table 2-1.3,4 The schema explicitly forces the error reviewer to ask whether there should have been a checklist or read-back, whether the resident was too fatigued to think clearly, or whether the young nurse was too ...