
Patient Safety Scenario #21: Black Box Thinking

[Image: Captain Rick Saber at the controls.]

This essay is the 21st installment of the monthly Patient Safety essays, produced by the Patient Safety Subcommittee of the Ethics and Professionalism Committee. The essays are written in the spirit of the aviation industry’s concept that every miss or near miss is a valuable lesson. To read earlier essays and learn how to contribute, please click here.

The airline industry is very much like surgery: high technology, high stress, large teams, and mistakes that kill people. However, our two industries differ dramatically in how we regard errors. The airline industry investigates errors, analyzes them to identify how flawed organizational structures contributed to the error, and then widely disseminates the lessons learned. The medical industry investigates errors, but sometimes, rather than analyzing the organizational structures that contributed to the error, there is a tendency to “blame, shame, and bury.” In some organizations, there is a search for a “bad” person who is then cast as the scapegoat, and the focus is on how they should be punished. The lessons learned are then buried so that no attorney can get their hands on them. This essay is about how the airline industry has learned that errors are lessons and that near errors are free lessons. We need to learn all we can from each.

This scenario is a bit different from previous essays, in that the mistake was made by an anesthesiologist, not a hand surgeon. However, the analysis of the errors and the lessons learned are just as applicable to our practices.

David Nelson, MD Chair, Patient Safety Subcommittee
Scott Lifchez, MD Co-Chair, Ethics and Professionalism Committee
Julie Adams, MD Co-Chair, Ethics and Professionalism Committee

Black Box Thinking

An experienced anesthesiologist with more than 20 years of providing high-quality care was assigned to room 5. The first case went well and the patient was taken to the recovery room. The anesthesiologist got ready for the second case, in which the patient needed to be paralyzed for the procedure. The chart was reviewed, and the patient was interviewed and examined; the patient appeared healthy enough for the procedure. The surgeon and the anesthesiologist discussed the case. The patient was brought to the room, the time out was done properly, and the patient was anesthetized. After the patient was asleep and intubated, the anesthesiologist gave the initial dose of cisatracurium and then started the infusion at the normal rate.

When the case was close to ending, the infusion was stopped. In preparation for the conclusion of the case, the anesthesiologist used his muscle stimulator to assess the return of motor function. There was no twitch at first, but the anesthesiologist knew that the paralytic needs some time to be metabolized and was not concerned. At 15 minutes after stopping the infusion, there was still no twitch; he realized there was going to be some delay in getting out of the room, but he was not terribly concerned, as there was still a lot to do to finish the case. At 30 minutes without a twitch, he started to become a little more anxious. At 100 minutes after the infusion was stopped, the anesthesiologist became concerned that something was wrong, and the chart was reviewed to see if there might be some explanation in the patient’s history. At 120 minutes without a twitch, the anesthesiologist started to review his drugs, searching for an explanation. The cisatracurium bottle was a bit different in appearance from the one he usually got: it was a branded drug called Nimbex, not the generic the pharmacy usually sent. The labeling was in small print, but the vial was clearly marked as Nimbex (cisatracurium). Squinting at the minute print, the anesthesiologist realized with a shock that this cisatracurium was 10 mg/cc, not the usual 2 mg/cc. Not only did the patient get 5 times the normal starting dose, but for the entire case the patient received the infusion at 5 times the normal rate. It was a dose so large that no one had ever given this much before, and the anesthesiologist had no idea what to expect. How could this happen?

It happened the way that almost all errors in complex systems happen: many contributions from many people and multiple organizational factors along a long path.

As has been mentioned in this column previously, James Reason is an industrial psychologist at the University of Manchester who studies human error in complex systems, such as the Union Carbide chemical disaster in Bhopal, India, in 1984, the Tenerife airport disaster in 1977, and the Fukushima nuclear meltdown in 2011. His fundamental analysis is that every human is going to make mistakes. Therefore, any complex system that is designed to have humans involved but never have a mistake is inevitably doomed to failure. Rather, complex systems have to be designed with the expectation that errors will be generated throughout the system, and therefore to have multiple defenses in depth that catch the errors before they affect the mission of the organization. His analogy for this is the Swiss cheese model of errors: each slice of Swiss cheese is a part of the system or a person involved in the system, and each hole is an error. Every slice has holes, but the mission of the organization is affected only if all the holes line up.
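For readers who like to see the arithmetic of layered defenses, the short simulation below (not part of the scenario) makes the point concrete: an error reaches the patient only when every layer misses it, so independent layers multiply their miss rates. The layer names and probabilities are invented purely for illustration.

```python
import random

# Minimal sketch of the Swiss cheese model: an error reaches the patient only if it
# slips through a "hole" in every defensive layer. The layers and their miss
# probabilities are invented purely for illustration.
random.seed(0)

layers = {
    "pharmacy pick":        0.02,   # chance this layer fails to catch the error
    "tech check-in":        0.05,
    "clinician label read": 0.10,
}

def error_reaches_patient(layer_miss_probs):
    """True only if every layer misses the error (all the holes line up)."""
    return all(random.random() < p for p in layer_miss_probs.values())

trials = 1_000_000
reached = sum(error_reaches_patient(layers) for _ in range(trials))
print(f"Errors reaching the patient: {reached} of {trials:,} trials "
      f"(expected rate {0.02 * 0.05 * 0.10:.4%})")
```

Remove any one layer and, in this toy example, the expected rate of errors reaching the patient jumps by a factor of 10 to 50.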

Dr. Reason has emphasized the point that no matter how well trained you are, no matter how professional you are, no matter how experienced you are, no matter how often you pass recertification, you are still going to make mistakes. Let us analyze the mistakes in the scenario that opened this essay.

The pharmacy has a foolproof system for drug distribution called Pyxis. Every drug has a unique barcode, and its barcode is scanned as it is picked out of the supply bin by the pharmacy technician, and every Pyxis slot that the drug goes into has its barcode scanned, so that only the correct drug is put into the correct slot. The nurses can access the drugs only through Pyxis, and the computer has an order barcode such that only a drug that has been ordered by a doctor can be withdrawn by the nurse. Foolproof, right?
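For readers who think in code, the sketch below illustrates the three-way match such a system is meant to enforce: drug against slot at stocking, and drug against both slot and physician order at withdrawal. The classes and functions are hypothetical, not the vendor’s actual data model or API.

```python
from dataclasses import dataclass

# Hypothetical sketch of the barcode checks described above; not the actual Pyxis API.

@dataclass(frozen=True)
class Order:
    drug_barcode: str          # barcode on the physician's order

@dataclass(frozen=True)
class Slot:
    assigned_barcode: str      # barcode the slot was configured to accept

def stock(drug_barcode: str, slot: Slot) -> None:
    """Pharmacy side: the drug may only go into the slot assigned to its barcode."""
    if drug_barcode != slot.assigned_barcode:
        raise ValueError("Wrong slot: drug barcode does not match slot assignment")

def withdraw(drug_barcode: str, slot: Slot, order: Order) -> None:
    """Nursing side: the drug must match both its slot and an existing order."""
    if drug_barcode != slot.assigned_barcode:
        raise ValueError("Drug is in the wrong slot")
    if drug_barcode != order.drug_barcode:
        raise ValueError("No physician order matches this drug")
```

The refrigerated operating room pathway described next bypassed every one of these checks.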

However, at this hospital, drugs that go to the operating room and need refrigeration do not go through the Pyxis system. The designers of the Pyxis system for this hospital did not feel that the barcode system would be needed for the operating room, as all the drugs there are administered by doctors. Instead of a Pyxis distribution system, all the operating room drugs are placed on a special cart; the head anesthesia technician picks them up, brings them to the anesthesia drug room in the OR, and places them in the anesthesia drug refrigerator. On this night, the pharmacy technician, who was new, did not recognize that the operating room gets the 2 mg/cc Nimbex, not the 10 mg/cc, which goes only to the ICU to be mixed as an infusion for patients on a respirator. The pharmacy tech, simply doing his job, received a computer command that Nimbex had been ordered for the operating room. The 2 mg/cc and the 10 mg/cc Nimbex vials were on the same shelf and not clearly delineated as different and dangerous drugs, so he grabbed the Nimbex and put it on the cart for the operating room. The 10 mg/cc Nimbex. There was no barcode scanning; there was no Pyxis used. Foolproof?

The head anesthesia tech was called into the operating room in the middle of the night to assist with an emergency heart case. During a lull in the case, she decided to get the pharmacy cart and bring it down to the operating room. There was a shortage of anesthesia techs, they all felt a bit overworked, and the OR schedule for the next day was quite full. She knew she would not have time the next day to get the drugs, so she decided to do it now. She picked up the cart and brought it down to the operating room. At that moment, problems developed in the heart case, and she was called back to the room. She did not have a chance to check the drugs; she simply put them in the refrigerator so they would stay cold and planned to check them later. The heart case, however, developed further problems and went longer than expected, and before she knew it the night had flown by. The next day was upon her and the rooms needed to be prepped. Although she had worked all night, there was a shortage of techs, and she was the one assigned to the first shift. She scrambled to get the rooms ready and never got back to the refrigerator.

The anesthesiologist in our scenario had used up the two vials of cisatracurium on his cart during his first case, so he went to get a new supply. The vials were a slightly different shape and the labels a slightly different color, because this time the pharmacy had bought a brand of cisatracurium called Nimbex. He was used to getting the generic cisatracurium, so the different shape and labels, he reasoned, were due to the fact that this was a brand-name drug. The print on the vial was a little small and hard to read with his trifocals, but the bottle was clearly marked cisatracurium. Throughout his 20+ year career, the cisatracurium he had used had come only as 2 mg/cc, so there was no reason for him to try to read the small print on the bottle. He had never worked in the ICU and had never heard of 10 mg/cc cisatracurium. He picked out the Nimbex vials and took them to his anesthesia cart. When the time came to give the drug during the case, he looked at the bottle, verified that it was cisatracurium, and withdrew the amount he needed. Unfortunately, it was 5 times more than he intended.
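The arithmetic of that last step is worth making explicit. The sketch below is purely illustrative (the intended dose is a made-up number, not a dosing recommendation); the point is simply that a volume drawn up for an assumed 2 mg/cc delivers 5 times the intended dose when the vial actually holds 10 mg/cc.

```python
# Illustrative arithmetic only: how a familiar volume becomes a 5x overdose when the
# vial concentration is 10 mg/cc instead of the expected 2 mg/cc. The intended dose
# below is a made-up example, not a dosing recommendation.

intended_dose_mg = 12.0              # hypothetical intended bolus
assumed_conc_mg_per_cc = 2.0         # what the anesthesiologist believed was in the vial
actual_conc_mg_per_cc = 10.0         # what the vial actually contained

volume_drawn_cc = intended_dose_mg / assumed_conc_mg_per_cc     # 6.0 cc drawn up
delivered_dose_mg = volume_drawn_cc * actual_conc_mg_per_cc     # 60.0 mg actually given

print(f"Intended {intended_dose_mg} mg, delivered {delivered_dose_mg} mg "
      f"({delivered_dose_mg / intended_dose_mg:.0f}x) in {volume_drawn_cc} cc")
```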

Analysis

Almost every error in a complex system is caused by multiple levels of mistakes. Some of the mistakes are organizational problems and some are human errors, but events that affect the mission of the organization rarely have a single cause. Rarely is the cause a human being who intended to make an error, was a bad person, or was simply careless. Most often it is highly trained, highly motivated people working in a system that is designed in a way that generates errors. That is the case in this scenario.

The Pyxis system was designed to prevent medication errors, but an exception was made for the operating room. This is a design flaw of that system: all it took was one new or poorly trained pharmacy technician to make the entire system fail. Pharmacy technicians need to be trained to recognize how easily any two similar medications can be confused, whether because of similar names, different concentrations, or other look-alike features. In addition, in this scenario someone had stocked the pharmacy shelves with two very similar drugs right next to each other, which contributed to the mix-up. There was no indication on the shelving that one drug went only to the operating room and the other only to the ICU.

The pharmacy also had not instituted a HIPPOST policy. This acronym stands for heparin, insulin, paralytics, potassium, opioids, (hypertonic) saline, and thrombolytics. All of these drugs carry a higher-than-normal danger of injury due to human error, and special handling protocols are necessary for them throughout their pathway in the hospital. One protocol that is used is very similar to the protocol for identifying blood before it is given: one person reads the order, the other person handles the drug, and each checks the other.
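A minimal sketch of that two-person check, with hypothetical names and values rather than any sanctioned hospital protocol, might look like this:

```python
# Hypothetical sketch of a two-person independent check for HIPPOST drugs: one person
# reads the order, a second person reads the vial in hand, and the drug is released
# only if both readings agree on the name AND the concentration.

def independent_double_check(ordered, in_hand, reader_of_order, handler_of_drug):
    """ordered and in_hand are (drug_name, concentration) pairs recorded separately
    by two different people; any disagreement stops the drug from being released."""
    if reader_of_order == handler_of_drug:
        raise ValueError("The two checks must be performed by different people")
    if ordered != in_hand:
        raise ValueError(f"STOP: ordered {ordered}, but the vial in hand is {in_hand}")
    return True

# In this scenario the check would have failed on the concentration:
# independent_double_check(("cisatracurium", "2 mg/cc"),
#                          ("cisatracurium", "10 mg/cc"),
#                          "pharmacy tech", "anesthesia tech")
# -> ValueError: STOP: ordered ('cisatracurium', '2 mg/cc'), but the vial in hand is ...
```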

The head anesthesia tech is assigned the job of verifying the drugs that are brought down to the operating room. Normally this is done during daylight hours, as part of the regular day, with time to devote full attention to the task. However, because of a shortage of anesthesia techs, in this case the head anesthesia tech tried to do two jobs at once, attending to the emergency heart case as well as bringing the drugs down. Unfortunately, the fact that she was overworked and tried to do two jobs at once contributed to the failure to catch the error generated in the pharmacy and allowed it to continue. There was also both a systems error and a human error in that she did not remember to go back to the refrigerator and check the drugs, in part because of her pressing schedule as the head anesthesia tech.

The final error obviously occurred in the operating room. The anesthesiologist had checked the vial for the correct medication before he gave it, but he did not check the concentration. There is a systems error here: it should be anticipated that the anesthesiologist might not check for something he did not know existed. Since every vial of cisatracurium the anesthesiologist had ever handled had been 2 mg/cc, the only reasonable expectation is that he would treat every vial of cisatracurium as 2 mg/cc. Higher-concentration preparations need labeling that makes the higher concentration clear and obvious. In addition, there was a human error: the anesthesiologist is responsible for verifying not only the name of the drug being given but also its concentration.
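One way to express that systems-plus-human fix is to verify the vial against both the drug name and the concentration the location is expected to stock. The sketch below is hypothetical; the formulary entry is an illustrative example, not this hospital’s actual stock list.

```python
# Hypothetical sketch: verify concentration as well as drug name against what this
# location's formulary expects. The formulary entry is an illustrative example only.

OR_FORMULARY = {"cisatracurium": "2 mg/cc"}   # concentration the OR is expected to stock

def verify_vial(formulary, drug_name, vial_concentration):
    expected = formulary.get(drug_name)
    if expected is None:
        raise ValueError(f"{drug_name} is not stocked at this location")
    if vial_concentration != expected:
        raise ValueError(f"{drug_name} is {vial_concentration}, "
                         f"but this location stocks only {expected}")

try:
    verify_vial(OR_FORMULARY, "cisatracurium", "10 mg/cc")
except ValueError as err:
    print(err)   # cisatracurium is 10 mg/cc, but this location stocks only 2 mg/cc
```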

The airline industry obviously examines every crash to see what went wrong. In addition, it has learned to mine each adverse event that did not result in a crash as a “free lesson.” The purpose of the “black box” (actually a misnomer, because the boxes are painted bright orange to make them easier to find) is to gather as much data as possible, so that every near miss can be analyzed for whatever improvement might be made in the system. No event is looked on as a “one-off,” a circumstance that will never occur again. Eastern Airlines flight 401 was coming in for a landing at Miami. When the landing gear indicator light did not turn green, the cockpit crew turned their attention to the lightbulb and lost situational awareness. Someone carelessly bumped the autopilot into the off position, and the plane crashed. A one-off, never to happen again? Only 6 years later, United flight 173 was trying to land in Portland when the landing gear green light did not come on, and the flight crew turned their attention to the landing gear. They lost situational awareness, ran out of fuel, and crashed in a wooded suburb of Portland. Every miss or near miss needs to be analyzed for the lessons it can teach, because the situation will happen again.

In some ways the scenario above about the Nimbex was a “free lesson,” because the patient, while intubated longer than was necessary, was not permanently harmed. It is very important not to brush this aside as merely a near miss or of no importance because no one was hurt. Life is giving you a “free lesson.” You better learn the lesson. The next one might not be free.

References


1. James Reason’s 12 Principles of Error Management. http://aerossurance.com/helicopters/james-reasons-12-principles-error-management/. Accessed 2/12/20.

2. Swiss cheese model. https://en.wikipedia.org/wiki/Swiss_cheese_model. Accessed 2/12/20.

3. https://www.sun-sentinel.com/news/fl-xpm-1992-12-29-9203090325-story.html

4. https://en.wikipedia.org/wiki/United_Airlines_Flight_173
