

February 05, 2018
How to Build a Culture of Safety in Health Care

In health care, we don’t refer to once-in-a-while events or not-very-often events. The phrase that describes wrong-site surgeries, medication errors, retained objects, surgical fires and a host of other potentially tragic occurrences is never events.

And that’s as it should be, says consultant and safety expert Spence Byrum. “How can it be anything else?” he asks. “These are patients’ lives and well-being.” Accepting anything else “is like a pilot saying, ‘I hope I get 9 out of 10 of these landings right.’”

The aviation comparison is apt. Mr. Byrum is an expert in high-reliability organizations — those united by the common goal of striving to minimize adverse events despite carrying out intrinsically complex and hazardous work. Airlines embraced that goal in the late 20th century and have managed to reduce fatalities from about one in every 200,000 flights to about one in every 2 million.

Top down

The startling rate of death from medical errors clearly indicates that we’re still struggling to match that level of safety. The challenges are great, but there’s much we can learn from the concepts of high reliability and from the experiences of our peers.

One tenet of high reliability is that an effective culture of safety must start at the top. But, says Tejal Gandhi, M.D., president and CEO of the National Patient Safety Foundation, board members tend to be more focused on financial decisions than on overseeing quality and safety. “Safety is everyone’s job,” she says, “but leadership sets the tone.”

Tragic errors at two of the leading U.S. health systems — Memorial Hermann and Virginia Mason — inspired the leaders of those organizations to reevaluate and involve senior leadership in safety. “If you don’t put the stake in the ground, you will certainly never get to zero [errors],” says Charles D. Stokes, RN, president and CEO of Memorial Hermann, which oversees more than 50 facilities in Texas. Leadership commitment to improving safety is “foundational and fundamental,” adds Gary Kaplan, M.D., chairman and CEO of Virginia Mason, which operates a network of regional medical centers in Washington state.

Just culture

The commitment starts with organization leaders, but in the hierarchy of health care, everyone plays a crucial role in building a culture of safety, and everyone needs to accept and embrace his or her role.

Along with acknowledging the high-risk nature of the work involved, high-reliability organizations encourage collaboration across ranks and disciplines. And effective collaboration relies on a blame-free environment — one in which people can report errors and near-misses without fear of reprisal.

But the no-blame concept, too, can be fraught with challenges. After all, risky and reckless behavior should be responded to punitively. That’s why the concept of a “just culture” is appropriate in complex and hazardous endeavors. The goal of a just culture is to address the systems issues that lead to unsafe behaviors while still holding people accountable for reckless behavior. Under this model, the appropriate response to any given error depends on the behavior that led to it, not on the severity of the event.

Speaking up

Beyond nurturing a system in which blame is administered only when appropriate, a culture of safety must encourage a free flow of communication up and down the hierarchy. The vastly improved safety of aviation is attributable in part to the recognition that encouraging crew members to speak up, to be willing, in effect, to challenge the captain-first hierarchy, can prevent tragic mistakes.

That’s obviously important in health care, too. After making a deeply disturbing error in the operating room, David Ring, M.D., confronted the issue head-on with an essay in the New England Journal of Medicine. He wanted to understand the systemic errors that led to his performing a carpal tunnel surgery on a patient who’d been admitted for a trigger finger release. And he wanted others to learn from his experience. Now, he says, he insists that all co-providers voice any and all concerns and questions, no matter how trivial they might seem. “In the operating room, people speak up and say, ‘Hey, what about this?’ And 99 times out of 100 you meant to do it that way and it’s fine.” But gratitude, not impatience, is the right response, he says: “I’m glad you spoke up,” he tells nurses, techs, residents and anesthesia providers, “because if I weren’t supposed to do it this way, you would have really saved me from making a mistake.”

Scribes promote safety

Of course, by the time the patient gets to the OR, the wobbling wheels that can lead to a mistake may already be in motion. “A disproportionate number of wrong sites have their origins in the physician’s office,” says Mr. Byrum. “It could be an incomplete handoff, it could be H&Ps or consents that are not signed. Any time you’re dealing with labs, MRIs, CTs, X-rays — if you’re trying to make everything come together just before the surgery, it’s very difficult.”

Medical scribes can help reduce errors and bolster a culture of safety in several ways. By allowing physicians to keep their eyes and ears on their patients, instead of on keyboards and screens, scribes reduce distractions and the need for multitasking, which helps ensure that physicians get the whole story from their patients. By reliably delivering messages among providers, and to patients and their families, scribes help manage the flow of critical and sometimes fluid information. And by freeing physicians to focus on the job they’re trained for, instead of spending countless hours documenting information, scribes reduce the potential for physician burnout — a phenomenon that is both on the rise and has been shown to lead to potentially disastrous errors.