
Quality for the Rest of Us
The Perfect Heist in Patient Safety (15 mins)
What can a con-man teach us about patient safety? Sometimes normal human reactions are all it takes to breach protocol and cause harm. Today’s episode explores the relationship between the perfect heist and safety culture.
Key Points:
- Reaction Videos
- Dual Auditing
- Decoys and False Alarms
References:
Diaz, Naomi (Nov. 17th, 2022). Why healthcare mergers and acquisitions are a cybersecurity risk. Becker’s Health IT. https://www.beckershospitalreview.com/cybersecurity/why-healthcare-mergers-and-acquisitions-are-a-cybersecurity-risk.html.
For more information, visit PorterQI.com, or email Q4Us@porterqi.com.
Reaction Videos and Safety Culture
Have you noticed that our culture has become increasingly reactionary? I mean, there is an entire genre on YouTube for Reaction Videos. People sit and watch other people react to videos, news clips, and unboxings, and these videos are incredibly popular. In life, there seem to be a lot of people sitting and waiting for things to happen just so they can react to them. They fall into doomscrolling as they search for something to react to, and I’m left wondering where all the proactive people are hiding.
One place where a reactive culture does not work is in patient safety. If the safety culture of an organization is reactionary, then you’re always putting out fires (sometimes literally!) and the same mistakes happen over and over again. Patient safety, by contrast, acknowledges that human lives, and the processes that support them, are fragile. Our patients’ lives are precious, and if we were blind to our potential for error, we would be even more vulnerable.
That’s why patient safety often approaches basic clinical practices with a Murphy’s Law mentality. If everything that can go wrong does go wrong, how can the healthcare team prevent the worst outcomes, identify problems quickly, and recover promptly? In fact, patient safety professionals hope to be bored by their case reviews, but they also have the imagination to foresee potential problems and identify near misses.
What does imagination have to do with patient safety exactly?
I love the foretelling that happens in the movie Ocean’s 11, where the crew walks through all of the security measures and limitations that make the casino heist seem impossible. The casino owner, Terry Benedict, knows that he might get robbed the night of the fight. He is not unaware. In fact, he has made every possible effort to protect the vault. It seems impenetrable.
There are many situations in healthcare that are like that heist. For example, the Operating Room is like Terry Benedict’s vault. We place guards and checkpoints at every opportunity. We have checklists and protocols and RFID badges. We scour every inch of the room with disinfectants. We sterilize tools. We bathe, disinfect, and even shave the patient. We know the date and time of highest risk. We are not unaware.
Yet, somehow, opportunistic microbes breach that security barrier every day all across the nation. This is why imagination matters. We have to study and review and imagine new access points for infection if we hope to solve the mystery of why patients continue to contract surgical site infections, along with every other type of safety event. Imagination matters in this work because we are not dealing with a stagnant enemy, but with a continuously evolving organism. And it’s not just viruses and bacteria; our patients aren’t static either. We spend so much money trying to keep patients safe, but they keep falling, getting surgery on the wrong body parts, and running into all kinds of other errors and shenanigans.
Spoiler alert: The casino owner couldn’t protect the vault from his office.
Confidence man Danny Ocean not only robs him, but he robs him right under his nose. And what did Ocean have that the rich guy didn’t? For one thing, he had a loyal, unified team. It’s not enough to get buy-in at the bedside. We need buy-in from the bed itself. If we’re going to build a team that can adequately face the crafty bandits looking to take away our patients’ safety, we need patients on the team, and they need to know the plan, the risks, the possible failures, and how to speak up if they see a problem.
Once we have the confidence side down and trust is established, we can move on to thwarting the heist. Much of the success of the Ocean’s 11 team had to do with exploiting normal human reactions. In the movie How to Steal a Million, with Peter O’Toole and Audrey Hepburn, O’Toole’s character actually explains that normal human reactions are precisely how he intends to rob the museum.
Likewise, Danny Ocean used things like distraction, overconfidence, and system downtimes to stress the system. It was like getting a Joint Commission survey during a disaster. In fact, a fake auditor was one of the distractions in the movie. As a member of the quality team who was often asked to audit the units on safety and quality, I can tell you that we tend to avoid the busiest times of the day for an audit. Those audits can be really distracting! Everyone is stressed already, and we would hate to accidentally hurt someone because of an unnecessary exercise.
But there are other times that I wondered if we shouldn’t plan to push the system a bit. What if it all happens at once, and we need to manage anyway? I’m not necessarily recommending it, but perhaps there are planned “stressors” that could be run simultaneously with another audit. Like asking whether it’s OK to get your colleague’s patient up to the bathroom during a fake system downtime. Is there any way to tell whether it’s safe outside the computer? Can they walk, or are they bedbound? Barriers to communication and new dependencies may come to light in this sort of combination exercise, and I think it could be very informative.
So an audit could be distracting, but I don’t typically see anyone pointing to audits as a concern for safety. There are probably a lot of distractions that we consider to be good and useful overall but that may have something of a dual effect on patient safety in practice. The point is that we have certain expectations for an audit, like how it should happen one at a time, when I have time to get all my answers right. But the goal of safety audits shouldn’t be to catch a perfect moment. An audit is supposed to be a chance to catch the wrong answers and push the system to see where it breaks. It has nothing to do with individuals and everything to do with system stressors. Just switching up the audit could bring out some important discoveries.
Overconfidence is another problem we are familiar with. Mr. Benedict was dripping with disdain when he told Danny Ocean he would go straight to jail if he tried anything. He was very confident that his plan was strong, that his staff were reliable, and that everything was ready in his impenetrable fortress. Overconfidence happens in healthcare, too. One example is when we are confident in our experience and stop reading the labels of medications, supplements, and supplies. If it looks familiar, we gloss over it and move along with our day. “I’ve given this a thousand times. I know what I’m doing.” We end up sounding like Terry Benedict gloating about his vault, and we know that doesn’t turn out well. Overconfidence is a reaction to a victory not yet won.
In the casino, there were also several false alarms. I’m glad that alarm safety and alarm fatigue are getting attention in healthcare, because it is really tough to listen to alarms all day and filter through the baseline noise to hear the targeted safety events. The museum heist in How to Steal a Million succeeds because an important official complains about the noisy response to several false alarms. One complaint, one risk to someone’s professional reputation, and suddenly all the alarms protecting millions of dollars in treasures are shut off. That’s pretty reactionary, but it’s also pretty predictable.
Another test for the healthcare system is a crowd. The night of the heist was the night of the highest casino holdings (the most cash on hand), which also meant that there were a ton of people showing up at once. If you have ever walked through the ruins of an ancient civilization, you can see just how much a crowd of people and some time can wear things down: huge stones sag into the curvature of human footsteps, and thousands of hands over hundreds of years wear down the corners of ancient cornerstones. It’s incredible what a crowd will do to a system. In a previous episode I compared the generational rush on primary care clinics to a denial-of-service attack, where a swarm of bots tries to access a webpage at the same time. Networks are not built to support that kind of volume all at once, and the website crashes. Healthcare handles a lot of shifting crowd levels, including mass casualties and pandemics, and even though the system flexes and handles the situation, there are a lot of things that must change in a crowd event. Additional staff are called in. Protocols are streamlined, and paper charting is stacked in a corner for someone to enter sometime in the future when things calm down. Crowds push the limits of an organization. People are less likely to pursue a detailed question or trace a problem to its origin when they are overwhelmed by a crowd. In a crowd, everything is reactionary and temporary.
This type of situation presents one of the other techniques employed by the Ocean’s 11 team: helpful workarounds. If someone comes to you as a buddy and asks for an exception, how likely are you to grant it? Do you even know this person, or do you just have a good feeling about them? Unfortunately, a lot of people are incredibly helpful about granting exceptions and workarounds to nefarious sources. I recently spoke with a source at a network and cybersecurity company who explained that hacker groups are actually providing customer service lines at this point: helpful employees who will assist you in paying the ransom after your cyberattack. So if we are willing to bend the rules for a friendly face, that is a reactionary response rather than the proactive action of investigating first.
Becker’s Health IT also reported on how healthcare organizations are especially at risk during mergers and acquisitions.[1] Why? One problem is that there is a ton of protected data moving from one place to the next. That actually goes pretty well most of the time. But in the meantime, the everyday audits, routines, updates, and daily monitoring get pushed aside to focus on the very important data movement. This is like Saul walking into the casino with a locked suitcase full of fake jewels. It seems like an important task, but it’s really a decoy. The very important data is critical, but it doesn’t negate the value of the everyday routine audits. If we wanted to improve safety in this area, we might bring in extra staff from a cybersecurity firm to handle the giant data transfer and allow our full-time teams to continue their work as usual. That way we wouldn’t be duped out of watching our everyday work because of a decoy.
Some of the other tools used by the Ocean’s team were confusion, unplanned changes, faulty equipment, emotionally charged incidents, fear, and people fighting. Have you ever been to a healthcare organization that lacked confusion, unplanned changes, faulty equipment, emotionally charged incidents, fear, and fighting people? It’s like a guide to an average day at work! Yet each factor can cause employees to become reactionary rather than proactive. In the movie, two of the thieves pose as security guards and pretend that their badge won’t open the door to the secure area. They start yelling, and the guard at the door gets uncomfortable and lets them in even though it’s against protocol. We all want to solve problems quickly and quietly. We don’t have a plan for faulty badges in a high-stakes environment. We aren’t ready to manage people yelling in the lobby, but we need them to calm down fast. So we react, and the vault is breached.
One of my favorite games in safety lessons is called “Find the Exception.” I ask participants to think of some scenario where the rule would not hold up. If I don’t have a story of how that exception ended badly, we walk through the possibilities and see whether the exception holds up.
Some events, like the urgent need to open the door with a badge that doesn’t work, come up often. Our weaknesses are well known but often poorly explained. The answer is no, all the time. Shut the door on them. But it goes against every social norm to do that, so it helps to hear the stories of what happened when someone else did the same thing.
I haven’t heard a good exception yet for the badge situation, but I’ve heard some great points on other issues that led to revised policies for things like behavioral health sitters. Sometimes workarounds are necessary, and the policy needs to change rather than the workaround. But if not, it helps to run through the scenarios that we generally think should do the trick, the perfect exceptions, and then reinforce why they fail. In many ways, patient safety is like working backwards through some of the most successful safety heists in healthcare history and making every effort to prepare for them. You wouldn’t be able to get Terry Benedict’s vault open a second time, right? Or could you?
Danny Ocean predicted human behavior. We need to recognize all of the normal responses that are hazardous and work them backwards as if it’s worth a million dollars to keep them from happening.
Because in many cases, it really is.
[1] Diaz, Naomi (Nov. 17th, 2022). Why healthcare mergers and acquisitions are a cybersecurity risk. Becker’s Health IT. https://www.beckershospitalreview.com/cybersecurity/why-healthcare-mergers-and-acquisitions-are-a-cybersecurity-risk.html.