Medical errors often occur due to system failure, not human failure. Hospitalist Kencee Graves helps explain why we need to evaluate medical error from a system standpoint.

By Kencee Graves | 4 minutes

LEARNING OBJECTIVES

After reading this article, you will be able to: 

  1. Identify medical errors that result from multiple levels of system failure
  2. Examine a medical error from the standpoint of a system, rather than an individual
  3. Recognize that humans are prone to error, thus, systems must be designed to minimize human error.

CASE STUDY

A middle-aged man with sepsis from a diabetic foot wound was admitted to a rural Utah hospital. It was clear the patient needed to be transferred to University of Utah Health for specialty care. Though the patient was accepted to U of U Health's medicine service at 4:00 PM, he didn't arrive until 2:00 AM, now in severe pain. When the patient arrived, the accepting nurse could not reach a provider to see the patient or place orders. The patient went several hours without pain medications or antibiotics and became sicker. What happened here?

HUMAN ERROR VS SYSTEM ERROR

To answer that, we must first understand what constitutes a medical error. An error is defined as the failure of a planned action to be completed as intended, or the use of a wrong plan to achieve an aim.

Historically, errors in medicine were thought to be caused by a failure on the part of individual providers. In contrast, a systems approach to medical error assumes that most errors result from human failings in the context of a poorly designed system. For example, when seen as a system, a wrong-site surgery performed by a physician who was up all night on trauma call is viewed as the result of the system that failed to protect a patient and provider from error due to fatigue. Without reviewing this as a system, it could be seen as the fault of an individual fatigued provider.

As described by psychologist Dr. James Reason, the defensive layers provided by systems resemble layers of Swiss cheese, except that the holes in the cheese are continually changing (Figure 1). The presence of holes in one slice does not cause a bad outcome; rather, when the holes in multiple layers of protection line up, harm can come to victims.

Figure 1. Dr. Reason's "Swiss Cheese Model" for error. Errors occur when holes exist in many layers of system defenses.
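The arithmetic behind the Swiss cheese model can be made concrete with a short simulation. This is an illustrative sketch, not part of the original article: each defensive layer is assumed to independently "have a hole" with some probability, and harm occurs only when a hazard passes through a hole in every layer.

```python
import random

def harm_occurs(hole_probs, rng):
    """Return True only if the hazard finds a hole in every layer."""
    return all(rng.random() < p for p in hole_probs)

def simulate(hole_probs, trials=100_000, seed=0):
    """Estimate the fraction of hazards that cause harm."""
    rng = random.Random(seed)
    harms = sum(harm_occurs(hole_probs, rng) for _ in range(trials))
    return harms / trials

# A single fallible defense lets roughly 10% of hazards through...
single = simulate([0.10])

# ...but three imperfect layers together stop nearly all of them,
# because all three holes must line up (about 0.1%).
layered = simulate([0.10, 0.10, 0.10])
```

The point of the sketch is the model's core claim: no single layer needs to be perfect, because independent layers multiply their protection. The 10% hole probability is an arbitrary illustration, not a clinical estimate.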

THE IMPACT OF SYSTEM FAILURES

A systems approach to error aims to identify situations or factors that can lead to human error, then work to improve the underlying systems to minimize the likelihood of error or the impact of error.2 As Dr. Reason said, "We cannot change the human condition, but we can change the conditions under which humans work."

WHAT WENT WRONG IN THIS CASE?

Returning to our case, this patient arrived at University of Utah Health for specialty physician consultation, but he went hours without the treatment he needed because the nurse struggled to find the provider responsible for his care. This error (failure of the planned action to be completed as intended) resulted from multiple failures, not just one error by one provider. Reviewing errors using a systems perspective can lead to improvements that reduce the likelihood of a similar event occurring again.

DISPEL THE SHAME AND BLAME

The people who delivered the care are often best suited to discuss the steps that led to the outcome. Nursing assistants, nurses, residents, physician assistants, nurse practitioners, therapists, and supervising physicians all play a key role in understanding system breakdowns and identifying solutions that prevent future error.

WHAT TO DO AFTER ERROR OCCURS

Understand that medical errors are common and can happen to anyone. When a medical error occurs, it is critical to submit an event report and provide your perspective on where the system broke down. Use this as an opportunity to identify vulnerabilities in the system and improve processes to prevent a future event.

One method for reviewing a medical error is a modified fishbone diagram (Figure 2):

Figure 2. Modified fishbone diagram. Courtesy of the University of Colorado Morbidity and Mortality Steering Committee.

It is critical to incorporate the perspective of everyone involved to complete the Fishbone Diagram. When conducting a case review, invite people from all roles and levels of experience to help discuss the medical error.
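To make the structure of such a review concrete, here is a minimal, hypothetical sketch of a fishbone diagram as data. The cause categories and contributing factors below are illustrative assumptions drawn loosely from the case, not the actual review described later in the article.

```python
# Hypothetical fishbone review of the transfer case; the category
# names and factors are illustrative, not taken from the real review.
fishbone = {
    "problem": "Transferred patient waited hours without orders or pain control",
    "causes": {
        "People": ["No provider reachable when the patient arrived"],
        "Process": ["No workflow to notify the accepting team of arrival"],
        "Communication": ["Accepting service unclear after the 4:00 PM handoff"],
        "Environment": ["2:00 AM arrival during minimal overnight staffing"],
    },
}

def summarize(fb):
    """Flatten the diagram into (category, factor) pairs, e.g. for an event report."""
    return [
        (category, factor)
        for category, factors in fb["causes"].items()
        for factor in factors
    ]
```

Structuring the review this way makes it easy to see whether any "bone" of the diagram is empty, which is often a sign that a perspective (nursing, pharmacy, transfer center) is missing from the discussion.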

CONCLUSION

Returning to the case outlined above, we included the transfer center team, the patient placement team, physicians, medicine unit nurses, and internal medicine residents in our systems review of the medical error. In reviewing the error (a patient arriving at our hospital as a transfer, then experiencing a delay in care), we realized that a consistent workflow didn't exist for knowing which medical teams accepted patients into University Hospital and for communicating the arrival of patients to the correct providers.

Figure 3. Modified fishbone diagram of transfer process.

The Fishbone Diagram above (Figure 3) helped highlight where our system broke down. We developed a new transfer workflow and mapped our process. We also developed an External Transfer Tip Sheet to better communicate our process with external providers. Discussing this case and addressing these errors from a systems perspective allowed us to improve this process.

LEARN BY DOING: GET INVOLVED IN SYSTEM SAFETY

University of Utah Health's Safety Learning System (SLS) is a review team that examines every fatality that occurs in our hospital using a systems-based approach, and we can always use help. If you'd like to get involved, physicians, APCs, and nurses from all specialties are encouraged to participate. These reviews are robust, gathering input from everyone on the care team. To learn more, please contact me directly.

REFERENCES

  1. (AHRQ | Accessed 30 May 2018) An excellent (and super comprehensive) overview of medical errors.
  2. (AHRQ | Accessed 29 May 2018) An expanded version of the content provided in this post, along with a case study, more on Dr. James Reason, and a systems approach to analyzing error.
  3. (BMJ 2000 | 7 Minutes) Excellent read from Dr. James Reason (the Swiss cheese model maker) that somehow manages to combine Chernobyl, mosquitos, US Navy, and more to tackle error in health care.

CONTRIBUTOR

Kencee Graves

Hospitalist, Associate Chief Medical Quality Officer, University of Utah Health