In late 1999, the Institute of Medicine published To Err is Human: Building a Safer Health System.1 Although the IOM has published more than 600 reports since To Err, none have been nearly as influential. The reason: extrapolating from the findings of the Harvard Medical Practice Study,2,3 performed a decade earlier, the authors estimated that 44,000 to 98,000 Americans die each year from medical errors. More shockingly, they translated these numbers into the now-famous “jumbo jet units,” pointing out that this death toll would be the equivalent of a jumbo jet crashing each and every day in the United States.
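A rough back-of-the-envelope calculation shows how the “jumbo jet units” translation works; the assumption that a fully loaded jumbo jet carries on the order of 250 to 400 passengers is mine, not a figure from the report.

```latex
% Rough arithmetic behind the "jumbo jet units" framing.
% Assumption (not from the report): a loaded jumbo jet carries roughly 250-400 passengers.
\[
  \frac{44{,}000\ \text{deaths/year}}{365\ \text{days/year}} \approx 120\ \text{deaths/day},
  \qquad
  \frac{98{,}000\ \text{deaths/year}}{365\ \text{days/year}} \approx 268\ \text{deaths/day}
\]
```

At the upper estimate, the daily toll approaches the passenger load of a fully loaded wide-body jet.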

Although some critiqued the jumbo jet analogy as hyperbolic, I like it for several reasons. First, it provides a vivid and tangible icon for the magnitude of the problem (obviously, if extended to the rest of the world, the toll would be many times higher). Second, if in fact a jumbo jet were to crash every day, who among us would even consider flying electively? Third, and most importantly, consider for a moment what our society would do—and spend—to fix the problem if there were an aviation disaster every day. The answer, of course, is that there would be no limit to what we would do to fix that problem. Yet prior to the IOM Report, we were doing next to nothing to make patients safer.

This is not to imply that the millions of committed, hardworking, and well-trained doctors, nurses, pharmacists, therapists, and healthcare administrators wanted to harm patients through medical mistakes. They did not—indeed, providers who commit an error that causes terrible harm suffer so deeply that Albert Wu has labeled them “second victims.”4 Yet we now understand that the problem of medical errors is not fundamentally one of “bad apples” (though there are some), but rather one of competent providers working in a chaotic system that has not prioritized safety. As Kaveh Shojania and I wrote in our book, Internal Bleeding:

Decades of research, mostly from outside healthcare, has confirmed our own medical experience: Most errors are made by good but fallible people working in dysfunctional systems, which means that making care safer depends on buttressing the system to prevent or catch the inevitable lapses of mortals. This logical approach is common in other complex, high-tech industries, but it has been woefully ignored in medicine. Instead, we have steadfastly clung to the view that an error is a moral failure by an individual, a posture that has left patients feeling angry and ready to blame, and providers feeling guilty and demoralized. Most importantly, it hasn't done a damn thing to make healthcare safer.5
Try for a moment to think of systems in healthcare that were truly “hardwired” for safety prior to 1999. Can you come up with any? I can think of just one: the double-checking done by nurses before releasing a unit of blood to prevent ABO transfusion errors. Now think about other error-prone areas: preventing harmful drug interactions or giving patients medicines to which they are allergic; ensuring that patients' preferences regarding resuscitation are respected; guaranteeing that the correct limbs are operated on; making sure primary care doctors have the necessary information after a hospitalization; diagnosing patients with chest pain in the emergency department correctly—none of these were organized in ways that ensured safety.

Interestingly, many of the answers were there for the taking—from industries as diverse as take-out restaurants and nuclear power plants, commercial aviation and automobile manufacturing—and there are now dozens of successful examples of applying techniques drawn from other fields to healthcare safety and quality (Table P–1).6 Why does healthcare depend so much on the experiences of other industries to guide its improvement efforts? In part, it is because other industries have long recognized the diverse expertise that must be tapped to produce the best possible product at the lowest cost. In healthcare, the absence of any incentive (until recently) to focus on quality and safety, our burgeoning biomedical knowledge base, our siloed approach to training, and, frankly, professional hubris have caused us to look inward, not outward, for answers. The fact that we are now routinely seeking insights from aviation, manufacturing, education, and other industries, and embracing paradigms from engineering, sociology, psychology, and management, may prove to be the most enduring benefit of the patient safety movement.

TABLE P–1 EXAMPLES OF PATIENT SAFETY PRACTICES DRAWN AT LEAST IN PART FROM NON-HEALTHCARE INDUSTRIES
| Strategy (Described in Chapter X) | Nonhealthcare Example | Study Demonstrating Value in Healthcare | Impetus for Wider Implementation in Healthcare |
|---|---|---|---|
| Improved ratios of providers to “customers” (Chapter 16) | Teacher-to-student ratios (such as in class-size initiatives) | Needleman et al. (2011) | Legislation in many states mandating minimum nurse-to-patient ratios; other pressure |
| Decrease provider fatigue (Chapter 16) | Consecutive work-hour limitations for pilots, truck drivers | Landrigan et al. (2004) | Accreditation Council for Graduate Medical Education (ACGME) regulations limiting resident duty hours |
| Improve teamwork and communication (Chapter 15) | Crew resource management (CRM) in aviation | Neily et al. (2010) | Some hospitals now requiring team training for individuals who work in risky areas such as labor and delivery or surgery |
| Use of simulators (Chapter 17) | Simulator use in aviation and the military | Bruppacher et al. (2010) | Medical simulation now required for credentialing for certain procedures; technology improving and costs falling |
| Executive Walk Rounds (Chapter 22) | “Management by Walking Around” in business | Thomas et al. (2005) | Executive Walk Rounds not required, but remain a popular practice |
| Bar coding (Chapter 13) | Use of bar coding in manufacturing, retail, and food sales | Poon et al. (2010) | U.S. Food and Drug Administration now requires bar codes on most prescription medications; bar coding or its equivalent may ultimately be required in many identification processes |
Reproduced and updated with permission from Wachter RM. Playing well with others: “translocational research” in patient safety. AHRQ WebM&M (serial online); September 2005. Available at: http://webmm.ahrq.gov/perspective.aspx?perspectiveID=9.
Bruppacher HR, Alam SK, LeBlanc VR, et al. Simulation-based training improves physicians' performance in patient care in high-stakes clinical setting of cardiac surgery. Anesthesiology 2010;112:985–992.
Landrigan CP, Rothschild JM, Cronin JW, et al. Effect of reducing interns' work hours on serious medical errors in intensive care units. N Engl J Med 2004;351:1838–1848.
Needleman J, Buerhaus P, Pankratz VS, et al. Nurse staffing and inpatient hospital mortality. N Engl J Med 2011;364:1037–1045.
Neily J, Mills PD, Young-Xu Y, et al. Association between implementation of a medical team training program and surgical mortality. JAMA 2010;304:1693–1700.
Poon EG, Keohane CA, Yoon CS, et al. Effect of bar-code technology on the safety of medication administration. N Engl J Med 2010;362:1698–1707.
Thomas EJ, Sexton JB, Neilands TB, et al. The effect of executive walk rounds on nurse safety climate attitudes: a randomized trial of clinical units. BMC Health Serv Res 2005;5:28.

All of this makes the field of patient safety at once vexing and exciting. To keep patients safe will take a uniquely interdisciplinary effort, one in which doctors, nurses, pharmacists, and administrators forge new types of relationships. It will demand that we look to other industries for good ideas, while recognizing that caring for patients is different enough from other human endeavors that thoughtful adaptation is critical. It will require that we tamp down our traditionally rigid hierarchies, without forgetting the importance of leadership or compromising crucial lines of authority. It will take additional resources, although investments in safety may well pay off in new efficiencies, lower provider turnover, and fewer expensive complications. It will require a thoughtful embrace of this new notion of systems thinking, while recognizing the absolute importance of the well-trained and committed caregiver. Again, from Internal Bleeding:

Although there is much we can learn from industries that have long embraced the systems approach,… medical care is much more complex and customized than flying an Airbus: At 3 A.M., the critically ill patient needs superb and compassionate doctors and nurses more than she needs a better checklist. We take seriously the awesome privileges and responsibilities that society grants us as physicians, and don't believe for a second that individual excellence and professional passion will become expendable even after our trapeze swings over netting called a “safer system.” In the end, medical errors are a hard enough nut to crack that we need excellent doctors and safer systems.5
I wrote the first edition of Understanding Patient Safety in 2007. In preparing this new volume four years later, I was astounded by the deepening understanding of some very fundamental issues in safety, and by how remarkably dynamic this field has proven to be. Some of the recent epiphanies and trends, all of which will be discussed in detail, include:

Information technology (Chapter 13): In the early days of the safety movement, many people saw information technology (IT) as the holy grail. Our naiveté—about the value of IT and its ease of implementation—has been replaced by a much more realistic appreciation of the challenges of implementing healthcare IT systems and leveraging them to prevent harm. Several installations of massive and expensive IT systems have failed (including one at my own hospital), and the adoption curve for IT has remained sluggish. The U.S. federal government is providing more than $20 billion to support the diffusion of computerized systems that meet certain standards (“meaningful use”), which is finally leading to a significant uptick in implementations.7 With more systems going online, we are beginning to gain a better appreciation of the true value of IT in patient safety, as well as how to mitigate some of the unanticipated consequences and potential harms.8
Measurement of safety, errors, and harm (Chapters 1 and 14): In the early years of the safety field, the target was errors, and we focused on measuring, and decreasing, error rates. This paradigm has largely given way to a new focus on measuring and attacking “harm” or “adverse events.” The Global Trigger Tool9—an instrument that supports a focused chart review looking for harm—has become increasingly popular, particularly as the limitations of other methods (incident reports, the AHRQ Patient Safety Indicators) have become clearer.10 One influential and disheartening study found no significant improvement in harm measures in North Carolina hospitals between 2003 and 2008, driving additional pressure for improvement.11
The checklist (Chapter 15): The remarkable success of checklist-based interventions in preventing central line–associated bloodstream infections12 and surgical complications,13,14 coupled with articles and books by respected safety leaders,15–17 has given the “lowly checklist” a newly exalted status in the patient safety field. The same leaders, however, caution that checklists are not a magic bullet, and that they can fail when introduced without sufficient attention to questions of culture and leadership.18,19
Safety targets: The safety field's embrace of healthcare-associated infections as a key target was driven by the fact that such infections are more easily measured and, in some cases, prevented than many other kinds of harm. This prioritization is natural but risks paying inadequate attention to other crucial targets that are less easily measured and fixed. One of my pet peeves is the short shrift we've given to diagnostic errors (Chapter 6), a state of affairs that has begun to change only in recent years.20
Policy issues in patient safety: In the early years of the safety field, much of the pressure to improve came from accreditors such as the Joint Commission and from the media, local and regional collaborations, and nongovernmental organizations such as the Institute for Healthcare Improvement.21 We are finally witnessing the emergence of a true business case for safety, driven by public and governmental reporting systems,22 along with fines for serious cases of harm and “no pay for errors” policies.23 Increasingly, concerns about the cost of healthcare are being coupled with concerns about patient safety—leading to payment penalties tied to substandard performance in areas such as readmissions, healthcare-associated infections, and others.24 In other words, we have entered an era in which the business case for patient safety has become sufficiently robust that many boards and CEOs now consider it a mission-critical endeavor.
Balancing “no blame” and accountability: As I mentioned earlier, the focus of the early years of the safety field was on improving systems of care and creating a “no blame” culture. This focus was not only scientifically correct (based on what we know about errors in other industries) but also politically astute. Particularly for U.S. physicians—long conditioned to hearing the term “error” and, in a kind of Rorschach test, thinking “medical malpractice”—the systems approach generated goodwill and buy-in.
But perhaps the greatest change in my own thinking between writing the first and second editions of this book is an increased appreciation of the need to balance a “no blame” approach (for the innocent slips and mistakes for which it is appropriate) with an accountability approach (including blame and penalties as needed) for caregivers who are habitually careless, disruptive, or unmotivated, or who fail to heed reasonable quality and safety rules.25 Getting this balance right will be one of the central challenges in patient safety over the next decade.
This is just a short list designed to hint at some of the major changes that have influenced, even rocked, the still-young field of patient safety in the past few years. Another measure of the field's evolution is the fact that this second edition is about 30% longer than the first and has more than twice as many references. In other words, if you're looking for a stable, settled field, look elsewhere.

This book aims to teach the key principles of patient safety to a diverse audience: physicians, nurses, pharmacists, other healthcare providers, quality and safety professionals, risk managers, hospital administrators, and others. It is suitable for all levels of readers: from the senior physician trying to learn this new way of approaching his or her work, to the medical or nursing student, to the risk manager or hospital board member seeking to get more involved in institutional safety efforts. The fact that the same book can speak to all of these groups (whereas few clinical textbooks could) is another mark of the interdisciplinary nature of this field. Although many of the examples and references are from the United States (mostly because they are more familiar to me), my travels and studies (including the time I spent in England as a Fulbright Scholar in 2011) have convinced me that most of the issues are the same internationally, and that all countries can learn much from each other. I have made every effort, therefore, to make the book relevant to a geographically diverse audience, and have included key references and tools from outside the United States.

The book is divided into three main sections. In Section I, the introduction, I'll describe the epidemiology of error, distinguish safety from quality, discuss the key mental models that inform our modern understanding of the safety field, and summarize the policy environment for patient safety. In Section II, I'll review different error types, taking advantage of real cases to describe various kinds of mistakes and safety hazards, introduce new terminology, and discuss what we know about how errors happen and how they can be prevented. Although many prevention strategies will be touched on in Section II, more general issues regarding various strategies (from the perspectives of both individual institutions and broader policy) will be reviewed in Section III. After a concluding chapter, the Appendix includes a wide array of resources, from helpful Web sites to a patient safety glossary. To keep the book a manageable size, my goal is to be more useful and engaging than comprehensive—readers wishing to dig deeper will find relevant references throughout the text.

Some of the material for this book is derived or adapted from other works that I have edited or written. Specifically, some of the case presentations are drawn from Internal Bleeding: The Truth Behind America's Terrifying Epidemic of Medical Mistakes,5 the “Quality Grand Rounds” series in the Annals of Internal Medicine (Appendix I),26 and AHRQ WebM&M.27 Many of the case presentations came from the Quality Grand Rounds series, and I am grateful to the patients, families, and caregivers who allowed us to use their stories (often agreeing to be interviewed). Of course, all patient and provider names have been changed to protect privacy.
I am also indebted to my partner in many of these efforts, Dr. Kaveh Shojania, now of the University of Toronto, for his remarkable contributions to the safety field and for reviewing an earlier draft of this book and authoring the glossary. Thanks too to my other partners on Quality Grand Rounds (Dr. Sanjay Saint and Amy Markowitz), AHRQ WebM&M and AHRQ Patient Safety Network28 (Drs. Brad Sharpe, Niraj Sehgal, Russ Cucina, John Young, and Sumant Ranji [a special tip of the hat to Sumant, who is the primary author of the superb AHRQ PSNet Patient Safety Primers, which proved to be a rich source of information for this edition]; Professors Mary Blegen, Brian Alldredge, and Joe Guglielmo; and Lorri Zipperer and Erin Hartman), and to the sponsoring organizations (Rugged Land, publisher of Internal Bleeding; the California HealthCare Foundation and the Annals of Internal Medicine for Quality Grand Rounds; and the U.S. Agency for Healthcare Research and Quality for AHRQ WebM&M and PSNet).

I wrote this second edition during my sabbatical at Imperial College London, and owe a special thanks to my British colleagues, particularly Professor Charles Vincent, to the US–UK Fulbright Commission for sponsoring my time in the United Kingdom, and to Brad Sharpe and Maria Novelero and the rest of the UCSF Division of Hospital Medicine for holding down the proverbial fort during my absence. Additional thanks to Bryan Haughom, who coauthored the original version of Chapter 7, to my colleagues on the American Board of Internal Medicine, to my administrative assistant Mary Whitney, and to Jim Shanahan of McGraw-Hill, who conceived of this book and has nurtured it every step of the way. This book would not have been possible without the contributions of all these extraordinary people and organizations. Katie Hafner, with whom I share my life, is a joy, an inspiration, and one hell of a great writer and editor. Katie, I dedicate this book to you, and us.

Finally, although this is not primarily a book written for patients, it is a book written about patients. As patient safety becomes professionalized (with “patient safety officers”), it will inevitably become jargon-heavy—“We need a root cause analysis!” “What did the Failure Mode and Effects Analysis show?”—and this evolution will make it easy to take our eyes off the ball. We now know that tens of thousands of people in the United States and many times that number around the world die each year because of preventable medical errors. Moreover, every day millions of people check into hospitals or clinics worried that they'll be killed in the process of receiving chemotherapy, undergoing surgery, or delivering a baby. Our efforts must be focused on preventing these errors, and the associated anxiety that patients feel when they receive medical care in an unsafe, chaotic environment.

Some have argued that medical errors are the dark side of medical progress, an inevitable consequence of the ever-increasing complexity of modern medicine. Perhaps a few errors fit this description, but most do not. I can easily envision a system in which patients benefit from all the modern miracles available to us, and do so in reliable organizations that take advantage of all the necessary tools and systems to “get it right” the vast majority of the time. Looking back at the remarkable progress that has been made in the 12 years since the publication of the Institute of Medicine report on medical errors, I am confident that we can create such a system. My hope is that this book makes a small contribution toward achieving that goal.

REFERENCES

1. Kohn L, Corrigan J, Donaldson M, eds. To Err is Human: Building a Safer Health System. Committee on Quality of Health Care in America, Institute of Medicine. Washington, DC: National Academy Press; 2000.
2. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients. Results of the Harvard Medical Practice Study I. N Engl J Med 1991;324:370–376.
3. Leape LL, Brennan TA, Laird N, et al. The nature of adverse events and negligence in hospitalized patients. Results of the Harvard Medical Practice Study II. N Engl J Med 1991;324:377–384.
4. Wu AW. Medical error: the second victim. West J Med 2000;172:358–359.
5. Wachter RM, Shojania KG. Internal Bleeding: The Truth Behind America's Terrifying Epidemic of Medical Mistakes. New York, NY: Rugged Land; 2004.
6. Wachter RM. Playing well with others: “translocational research” in patient safety. AHRQ WebM&M (serial online); September 2005. Available at: http://webmm.ahrq.gov/perspective.aspx?perspectiveID=9.
7. Blumenthal D. Launching HITECH. N Engl J Med 2010;362:382–385.
8. Sittig DF, Singh H. Defining health information technology–related errors. New developments since To Err is Human. Arch Intern Med 2011;171:1281–1284.
9. IHI Global Trigger Tool for Measuring Adverse Events. Available at: http://www.ihi.org/knowledge/Pages/Tools/IHIGlobalTriggerToolforMeasuringAEs.aspx.
10. Shojania KG. The elephant of patient safety: what you see depends on how you look. Jt Comm J Qual Patient Saf 2010;36:399–401.
11. Landrigan CP, Parry GJ, Bones CB, et al. Temporal trends in rates of patient harm resulting from medical care. N Engl J Med 2010;363:2124–2134.
12. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med 2006;355:2725–2732.
13. Haynes AB, Weiser TG, Berry WR, et al.; for the Safe Surgery Saves Lives Study Group. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med 2009;360:491–499.
14. de Vries EN, Prins HA, Crolla RM, et al.; SURPASS Collaborative Group. Effect of a comprehensive surgical safety system on patient outcomes. N Engl J Med 2010;363:1928–1937.
15. Gawande A. The checklist. The New Yorker. December 10, 2007;83:86–95.
16. Gawande A. The Checklist Manifesto: How to Get Things Right. New York, NY: Metropolitan Books; 2009.
17. Pronovost P, Vohr E. Safe Patients, Smart Hospitals: How One Doctor's Checklist can Help Us Change Health Care from the Inside Out. New York, NY: Hudson Street Press; 2010.
18. Bosk CL, Dixon-Woods M, Goeschel CA, et al. Reality check for checklists. Lancet 2009;374:444–445.
19. Dixon-Woods M, Bosk CL, Aveling EL, et al. Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q 2011;89:167–205.
20. Wachter RM. Why diagnostic errors don't get any respect—and what can be done about them. Health Aff (Millwood) 2010;29:1605–1610.
21. Wachter RM. Patient safety at ten: unmistakable progress, troubling gaps. Health Aff (Millwood) 2010;29:165–173.
22. Rosenthal J. Advancing patient safety through state reporting systems. AHRQ WebM&M (serial online); June 2007. Available at: http://webmm.ahrq.gov/perspective.aspx?perspectiveID=43.
23. Wachter RM, Foster NE, Dudley RA. Medicare's decision to withhold payment for hospital errors: the devil is in the details. Jt Comm J Qual Patient Saf 2008;34:116–123.
24. Nelson B. Value-based purchasing raises the stakes. The Hospitalist. May 2011. Available at: http://www.the-hospitalist.org/details/article/1056049/Value-Based_Purchasing_Raises_the_Stakes.html.
25. Wachter RM, Pronovost PJ. Balancing “no blame” with accountability in patient safety. N Engl J Med 2009;361:1401–1406.
26. Wachter RM, Shojania KG, Saint S, et al. Learning from our mistakes: quality grand rounds, a new case-based series on medical errors and patient safety. Ann Intern Med 2002;136:850–852.
27. AHRQ WebM&M. Available at: http://webmm.ahrq.gov.
28. AHRQ Patient Safety Network (PSNet). Available at: http://psnet.ahrq.gov.
