Patient Safety Scenario #18: To Err Is Human (but Surgeons Are Not Allowed to Be Human)
This essay is the 18th installment of the monthly Patient Safety essays, produced by the Patient Safety Subcommittee of the Ethics and Professionalism Committee. The essays are written in the spirit of the aviation industry’s “Black Box Thinking”: in order to inform and improve our medical safety record, we need to analyze our errors. To read earlier essays and learn how to contribute, please click here.
Pilots are highly trained, have thousands of hours of experience, and are regularly subjected to recertification. Sound like surgeons? As good as they are, analysis of aviation accidents and near misses has demonstrated that no matter how qualified or experienced the pilot, mistakes can still happen. In fact, the worst aviation accident in history was caused by the most qualified pilot in the airline, Captain Jacob Veldhuyzen van Zanten. He was not only the chief safety officer, he was also the chief flight instructor, the pilot who gave all the other pilots their annual check rides. He may have been the most qualified person in the cockpit, but he made a mistake the team did not catch. The plane crashed and burned, and 583 people died, because of a failure of teamwork in the cockpit. This essay is about the reality that surgeons also make mistakes, and how “black box thinking” views mistakes.
David Nelson, MD Chair, Patient Safety Subcommittee
Scott Lifchez, MD Co-Chair, Ethics and Professionalism Committee
Julie Adams, MD Co-Chair, Ethics and Professionalism Committee
Simple case: ORIF of a radius
The surgeon was looking forward to his day in the OR. He had two radius fractures to start off the day, and he loved radius fractures; he had done hundreds, if not thousands. The first patient was a 62-year-old woman who was vacationing in the area, had tripped over a curb, and had a fairly simple transverse right distal radius fracture. It was not at all close to the joint, which meant the plate did not need to crowd the watershed line, and these cases always went a bit quicker than average. The second case, a 70-year-old man, had a left distal radius fracture, with the transverse fracture closer to the joint. More attention would have to be paid to the watershed line.
The assistant for the case would be a scrub tech who was a traveler, and the surgeon had not worked with him before. The surgeon greeted him before the case and discussed how he expected the sequence of the reduction and procedure to go, and the scrub tech assured him that he was familiar with the procedure, having done many of these cases before. The case went quite smoothly. The fracture reduced easily, and the plate fit nicely; it did not need to be placed close to the watershed line at all but sat comfortably in the pronator fossa. The patient was taken to the recovery room, and shortly thereafter the team started the second case. The fracture reduced quite easily, and the surgeon asked for a left short plate. The surgeon had been involved in the design of the plate system, and he was proud of the fact that the right plates were all one color and the left plates all a different color. But he was dismayed to see that the left distal radius plate he was handed was the same color as the plate they had put into the previous case, a right radius fracture. Had he placed a left plate into a right radius fracture? Impossible! The surgeon was sure that he always checked the plate when it was handed to him and read out loud whether it was a right or a left plate. And the plate in the last case fit well! Impossible! But the undeniable fact was that the plate in his hands was obviously a left plate, and it was the same color as the one in the previous case. It should have been the opposite color. His stomach sank as the fact became irrefutable: in the previous case, a left plate had been placed on a right fracture. How was that possible? He was sure he always read out the side. How was it possible to fit a left plate onto a right fracture? Trying to keep his composure, he asked the circulator to call upstairs to the recovery room and ask them not to discharge his first patient until he could come up and speak to her. He also wanted to check the x-rays.
The current case finished without any problems, and as soon as the patient was safe in the recovery room, he looked at the x-rays from the previous case. Sure enough, a left plate had been placed onto a right radius. The fracture was so far proximal that the angle of the screws did not make any difference, and the PA and lateral angles were anatomic. Although the plate was a bit more proud than he would like, it was far from the watershed line and was well covered by the pronator quadratus reconstruction. It was an embarrassing, bonehead mistake, but luckily it would have no effect on the patient’s recovery. The only negative outcome would be to the surgeon’s pride.
After talking to the first patient and showing her and her husband the x-rays and a sample plate, he assured her that there would be no negative effect on her recovery or her function, and that there was no need for the plate to be removed or for revision surgery. She accepted the explanation, and the surgeon returned to the operating room, grilling himself on how he could have made such a stupid error.
How could this happen? He was sure, absolutely sure, that in every single case, he checked the laser etch on the plate for “right” or “left.” Indeed, he went so far as to read the laser etch out loud and state that he was placing a right or a left plate, and to get concurrence from the team that it was the correct plate. Absolutely sure. In each and every case.
But obviously, the evidence was just as absolute: he failed to do it in this case. He had asked for a right plate, the new scrub tech, who was unfamiliar with the instrument set, had simply handed him the wrong plate, and he failed to catch the error. It was simply his own fault.
How could this happen? It happened because every surgeon is human, and people make mistakes. James Reason, an industrial psychologist at the University of Manchester who studies human error, has emphasized the point that no matter how trained you are, no matter how professional you are, no matter how experienced you are, no matter how often you pass recertification, you are still going to make mistakes. Dr. Reason’s specialty is analyzing industrial mistakes that kill hundreds, such as the Union Carbide gas leak in Bhopal, India, in 1984, the Tenerife airline accident in 1977, and the Fukushima nuclear meltdown in 2011. But the lessons he teaches apply just as well to the individual accidents that happen quietly in the operating room. The baseline fact is that humans make mistakes, and surgeons have to develop a system that can bring the error rate as close to zero as possible. John Nance, a military pilot who became an attorney and wrote the book Why Hospitals Should Fly, calls this process Getting to Zero. How do we get to zero patient harm?
We get to zero patient harm in part by first admitting that we are fallible, and then designing a system or process to catch every error before it affects the patient. Dr. Reason describes the concept as multiple layers of defense, and his model for this is the “Swiss cheese model” of errors, which has been covered in a previous Perspectives column (http://asshperspectives.org/2019/09/patient-safety-scenario-15-defense-in-depth-checks-and-backup-checks/).
Each person on the team should be on the watch for errors, and an error affects the mission of the team only if it gets by every person. In the model, each person is a slice of Swiss cheese, and the error only gets through if all of the holes line up.
A surgeon who is cocky and sure that they cannot make an error is the surgeon who is sure to screw up. Talk to any airline captain, and they will tell you they would rather have an inexperienced first officer than a cocky one. A cocky first officer is a danger to the entire airplane. Have you ever gone to the grocery store because you were out of butter, and while you were there picked up some beans, some bagels, and some beer, only to get home and realize you forgot the butter? Have you ever put down your cell phone and not remembered where you put it? Even worse, have you ever promised your spouse that you would bring some milk home from the grocery store, and then showed up on the front step empty-handed?
We get to zero patient harm by first admitting that we make mistakes.
The second step is to develop a system or process to minimize errors. This column has reviewed some of the techniques for getting to zero patient harm in previous essays, such as teamwork, with the emphasis that each person on the team is charged with catching any error and empowered to challenge the surgeon if they think an error is being made. Another technique is designing your own checklist, one for the office and another for the operating room, listing whatever you feel is essential. Ignore the hospital’s checklist: that one is designed to protect the hospital. Design your own checklist to protect you, and use it consistently. Yet another technique is minimization of variables, so that everything is done exactly the same way each time, with the same equipment and the same staff to the extent possible, and distractions and interruptions eliminated.
We get to zero patient harm by first admitting that we make mistakes.