
INTRODUCTION

Cuiusvis hominis est errare, nullius nisi insipientis in errore perseverare. [To err is the nature of any man, to persist in error is of no one except a fool.]

Marcus Tullius Cicero, 106 BCE–43 BCE, Philippica XII, ii, 5

Errare humanum est, perseverare diabolicum… [To err is human, to persist in error is diabolical…]

Attributed to Lucius Annaeus Seneca (“the Younger”), c.4 BCE–65 CE

Human beings make mistakes—and have done so for millennia—and these quotes caution us to learn from our mistakes so as not to repeat them. In 1999, the Institute of Medicine (now the National Academy of Medicine) included a similar phrase in the title of its call to arms regarding patient safety—To Err Is Human: Building a Safer Health System. We discussed aspects of this report in Chapter 3. In this report, the authors take the idea a step further and outline the need to develop healthcare systems that plan for human error by “building safety into the processes of care”;1 that is, systems designed to support healthcare workers in providing high-quality, safe patient care while reducing opportunities for error. In fact, they specifically state that “safety does not reside in a person, device, or department, but emerges from the interactions of components of a system.”1 Since the publication of this report, this philosophy has permeated the medical community, which now understands that errors result from a multifaceted and interconnected framework that includes humans and the system features (e.g., organization, environment, tools, and teams) with which they interact. Put more strongly, it is not inherent human fallibility that causes errors, but the interactions between the elements of the system, of which humans are a part.

Key Point

It is not inherent human fallibility that causes errors, but the interactions between the elements of the system, of which humans are a part.

This line of thought also suggests that humans need to learn not just from their mistakes, but also about the overall system within which they work. Again, the Institute of Medicine’s report highlights the need for learning systems: “Characteristics of highly reliable industries include an organizational commitment to safety, high levels of redundancy in personnel and safety measures, and a strong organizational culture for continuous learning and willingness to change.”1 While this was a new idea for healthcare, it was not a new idea in general; other high-reliability industries, such as commercial aviation, had already incorporated these principles into their operations and culture.

In the almost two decades following the publication of To Err Is Human,1 many initiatives have been launched to create new models for providing healthcare, such as the Patient-Centered Medical Home (PCMH)2 and the Perioperative Surgical Home (PSH)3—frameworks for providing more coordinated and focused care between members of ...
