ASQ Healthcare Update
Published in collaboration with the ASQ Healthcare Division
March 2013 | www.asq.org

Make No Mistake
Protect patients by harnessing near-miss events
Interview by Megan Schmidt, editor
In aviation, a near miss occurs when two planes in flight nearly collide.
In healthcare, a near miss is an error that is detected or corrected before producing harm.
Near misses aren’t good catches or close calls. Whether in healthcare or in aviation, a near miss signals a
system failure. Patient safety experts Doug Dotan, Marlene Beggelman and Anngail Smith say near
misses are valuable learning opportunities that can transform patient safety. They believe healthcare
professionals should monitor processes and behaviors, not just outcomes. By documenting and
analyzing the factors that are in place when things almost go wrong, healthcare practitioners can better
identify when, where and how adverse events occur and protect patients from being harmed.
The Institute of Medicine (IOM) has identified medication errors as the most common type of
healthcare error, harming 1.5 million people every year and resulting in an estimated $3.5
billion in costs. For every person harmed, there are many more occurrences that could have
caused harm but did not, due to chance or an intervention.
CRG Medical, based in Houston, is the developer of KBCore, a knowledge-building application for
healthcare that includes a linguistic analysis module to identify practices and processes with the
potential to hurt people. CRG Medical's leaders—Dotan, CEO, and Smith, operations director—along
with Beggelman, a physician, CRG Medical partner and CEO of Enhanced Medical Decisions, a natural
language processing company that extracts information from medical records and clinician notes,
spoke to ASQ Healthcare Update about medication safety and the importance of leveraging near-miss
events.
Healthcare Update: Why is it important to study near misses?
Dotan: Every time something goes wrong, there are about 200 to 300 times that the same scenario occurred,
but someone or something prevented the bad outcome. Even though all but one of the contributing factors
were in place, a single barrier helped prevent a bad outcome. We're not learning
from those potentially harmful events. We're not sharing, documenting or analyzing them. For every
harmful event, we have 300 chances to prevent it. Safety is about culture, it's about management and
it's about being driven to have zero accidents. You're not going to have zero accidents, but if you try to
have zero accidents, you'll have fewer and fewer. You have to try each day.
Smith: To achieve patient safety and quality, you need to learn from the past to prevent harm in the
future. Medication errors are a good example of this. In some states, if a nurse has a certain number of
medication errors reported to the board, they can be disciplined. It's a real catch-22 for nurses to want
to report a medication error, a medication variance, or a near miss – when something almost happened
but was stopped from happening. You'd think you'd want to brag about a near miss and get credit.
But there's a real tension for someone involved in an event. It's an issue for people.
We need to learn about all of the times that something almost happens or does happen so we can
learn about the contributing factors. Sometimes staff don't recognize when an adverse event is
happening. The natural language processor reviews notes for keywords that can identify an event as
it unfolds, so there's no need to rely on someone recognizing the event and filling out a formal report.
The computer works for us instead of relying on staff.
Dotan: Systems and processes that are not integrated will create gaps, chasms and silos between
different processes and disciplines. When things fall through gaps, you miss them, and you don’t get
things right. That’s how you hurt people and increase costs considerably. Integrating processes, systems
and thinking is critically important to learning. Learning means you’re creating knowledge, and thereby
creating learning organizations.
Eighty percent of the knowledge in any organization is in the heads of the people working there. That's
called tacit knowledge. Everyone has a computer in their heads. Humans need to download what they
know and share it with others, learn from each other and create a better, safer society. Only about 20%
of knowledge is explicit—written down where we can read it in books. The rest, 80%, is in our heads.
We are the only ones who really know what we do and why we do it.
Beggelman: A human chart reviewer can go through a chart, compile information and then reach
conclusions about problems that happened or almost happened. But that human effort takes too long.
Natural language processors are trained to read and understand data like a human would. It's modeled
like an expert system: it goes through data, pulls language that implies a certain event, then
automatically extracts it and creates a report with that information. A chart review might take a
half-hour to 45 minutes. The computer can do a review in less than a second.
Most of the information contained in electronic medical records is in narrative form. Over the last few
years, there's been a push to get healthcare providers to conform to the same language and say things
in a uniform way. It's an appropriate goal for certain data, but there's only so much data entry a clinician
can do. When caregivers spend all of their time structuring information, they spend less time
capturing rich detail about what happened. When you're untangling a safety event, if the rich
language is not there, you've lost the granularity of the information and there's no way to deconstruct
what happened. The natural language processor looks at details, extracts information that identifies a
safety risk and then populates the report automatically to save people time.
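The screening Beggelman describes can be illustrated with a minimal sketch. KBCore's linguistic module and Enhanced Medical Decisions' engine are proprietary, so this is only a toy illustration of the general idea: scan free-text notes for trigger language that implies a safety event and auto-populate a draft report. The trigger phrases, categories and function names below are hypothetical; a production clinical NLP system would add trained language models, negation handling and context disambiguation.

```python
import re

# Hypothetical trigger phrases that may imply a safety event in a clinical note.
# A real NLP engine would use far richer linguistic analysis than keyword matching.
TRIGGER_PHRASES = {
    "wrong medication": "medication error",
    "wrong dose": "medication error",
    "near miss": "near miss",
    "caught before administration": "near miss",
    "adverse reaction": "adverse drug event",
}

def screen_note(note: str) -> list[dict]:
    """Scan one free-text note and return candidate safety-event findings."""
    findings = []
    lowered = note.lower()
    for phrase, category in TRIGGER_PHRASES.items():
        for match in re.finditer(re.escape(phrase), lowered):
            findings.append({
                "category": category,
                "phrase": phrase,
                # Keep the surrounding narrative text: it preserves the rich
                # detail needed later to deconstruct what happened.
                "context": note[max(0, match.start() - 40): match.end() + 40].strip(),
            })
    return findings

def build_report(notes: list[str]) -> list[dict]:
    """Auto-populate a draft event report from a batch of notes, so no one
    has to recognize the event and fill out a formal report by hand."""
    report = []
    for note_id, note in enumerate(notes):
        for finding in screen_note(note):
            report.append({"note_id": note_id, **finding})
    return report
```

Because the matching is a simple linear scan, screening a batch of notes takes only milliseconds, in line with the sub-second reviews described above; real clinical NLP is considerably heavier but follows the same extract-and-report pattern.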
Healthcare Update: How big of a problem are medication errors?
Beggelman: The data that the IOM published in their major report is still pertinent. IOM attributed up to
98,000 inpatient deaths annually to medication error. It's suspected that medication error is responsible
for many, many more deaths. It's the third or fourth leading cause of death in the United States and an
overwhelming problem. As more people take multidrug regimens, medication error is only going to
increase.
Dotan: These are staggering numbers; it's actually like an epidemic. A smaller hospital might have a
major event once every three years, but there are thousands of hospitals. A larger hospital might have
three major events a year—maybe more, maybe less. These are rule-of-thumb averages. Put them
together, and you get a frightening number. No one comes to the hospital with the expectation that
they might die. Yet you sign a statement that it's possible. You'll hear surgeons say that the surgery was
a success, but the patient died. The fact of the matter is, if you don't know how something happens, you
can't prevent it from happening.
Smith: No practitioner goes to work and says: “I think I’m going to hurt someone today.” That’s not
what they’re planning to do. But they do hurt someone or they almost hurt someone. Many of us have
spent many hours on calls listening to staff say, “I’m so upset this happened and I don’t know how I’m
going to go to work tomorrow. I don’t know why it happened.” Staff wants to leave work happy, jingling
their keys and content with their work. You can't stop everything from going wrong. But if something
starts to go wrong, you can have things in place to stop it from going further. You can stop it from
getting to the patient, or stop the patient from getting worse. That's called an intervention. You want
interventions to be planned. That's the most important thing about analyzing a near miss: you can find
out whether it was a planned process or whether it was stopped by chance. Was it that the person has
been there, seen it, done it and recognized it? If you can get people to report something that almost
went wrong, harvesting that information is amazingly wonderful.
Healthcare Update: What role does culture play in medication safety?
Dotan: Anngail [Smith] used the word report. Unfortunately, report is synonymous with error. It's
negative. You don't report something that went right, so when we say reporting system, that's
usually synonymous with failure. It's not about success. You don't write reports on what you did right;
you write about what you did wrong. In order to change culture, you have to change the language.
Sister Mary Jean Ryan, who led SSM Health Care in St. Louis to be the first healthcare organization to
receive the Baldrige award, actually changed the language in the hospital. When referring to PowerPoint
slides, you weren’t allowed to say “bullet points” because the words signify violence. You had to call
them dots. When you change the words you use it changes your thinking, attitude, culture and behavior,
too. It’s very hard to change behavior without changing language.
Instead of “reporting system,” call it a “patient safety evaluation system.” We’re not reporting. We’re
evaluating. We do it for learning, not for punishment. We do it for safety, not for disciplinary action. My
colleagues and I have found that when staff engage with a system that allows them to voice their
opinion, not just the facts, learning actually takes place.
Healthcare needs to take away the fear of reprisal, reputation loss and disciplinary action. There will be
different degrees of uptake of healthcare's new systems and laws aimed toward this goal, but it's
certainly better than what we have now.
Here's a medication error everyone knows about, involving the actor Dennis Quaid. His twins needed a
procedure, and they needed Hep-Lock flush in their IVs. Instead of flushing with Hep-Lock, staff flushed
with heparin. The two medications have similar-sounding names, came in vials of the same size and had
labels similar in color. Heparin is 1,000 times stronger than Hep-Lock, and the babies were given two
doses of the adult medication, an overdose that can cause fatal bleeding. They didn't die, and it doesn't
appear that they will suffer any long-term consequences. Quaid was very upset, and he launched a
patient safety campaign that promoted electronic health records and computerized physician order
entry. But the question no one really asked was why the heparin was inside the cart with the Hep-Lock.
Protocol is to keep the different units separate. How many people saw it and could have put it back?
Did anyone know it was missing? Did anyone document it? In life, as long as we do the right thing,
we're okay. But we don't think about what happens if someone else does the wrong thing. Should we
speak up? Should we do something to remedy the situation?
Quaid sued Baxter for not differentiating the design of the vials. A few years earlier, six children in
Indiana were given an overdose and three of them died. But people forget. The lesson was not
learned nationwide. We need to ask what we in healthcare do to continually learn, improve and
change the culture to be proactive and preventive rather than punitive and corrective.
Healthcare Update: What makes the healthcare improvement journey different from those that other
industries have undergone?
Dotan: Accidents in aviation peaked 30 years ago. Now, aviation accidents are rare. There was a change
in culture. Aviation started investigating for the sake of safety, not punitive reasons. Safety improved
across the board, all over the world. It was continual improvement of the entire industry worldwide.
Failure wasn’t acceptable. In healthcare, it’s also not acceptable, but the learning is not there. We don’t
have the mechanisms in place to continually learn.
I was in the Air Force. Every single morning, we had to read and sign a document that listed all of the
hazards that could impact the safety of our mission. I know healthcare is different and has its own
complexities, but we can do something. We don’t have to copy models from other industries, but we
have to learn about the thinking that created change. Surgeons will say, “I’m operating on people and
it’s not the same as flying a jet.” But there are certain principles that are the same. We’re humans and
we make mistakes. From janitors to board members, everyone has to contribute things that will make
the organization incrementally better. Over time, that amounts to massive change.
Many hospitals are beginning to learn how to stop the line, which comes from the Toyota lean
philosophy. It means if something is wrong, you speak up. There should be a debriefing every time an
adverse event occurs or there was a near miss. They should document the contributing factors, what
they think should have happened and what did happen. If we capture that knowledge, we will have a
changed culture and improved patient outcomes. Management also has to have better evaluation
systems and get everyone engaged.
Healthcare Update: You mention a transformation in aviation—when do you predict healthcare will
achieve the same?
Dotan: Change will take a long time because medicine has a guild mentality. Whatever happens there
stays there, and mistakes are not shared. You won't find risk management people in the operating room.
For a physician, opening up to his peers about a mistake is devastating. The chief of surgery doesn't
come around and say, "Hey guys, I almost made a mistake. Has that ever happened to you?"
Culture is driven from the top. It's sustainable only from the top. Management can't delegate
accountability; it needs to own it. The reason we see it delegated is that many hospitals are still
run by administrators and not physicians, so they can't take accountability for medical procedures
they don't know anything about. So it's delegated down to the chief medical officer and so on
down the staff line. That's one of the reasons I think it'll take a long time to change.
Healthcare Update: Have you found that there are particular disciplines or cases where the chances of
medication error are greater?
Beggelman: If you don’t mind, I am going to reframe your question a bit. The most common medication
errors are those that are never detected. I believe that undetected errors are more prevalent than we
realize. These are medication reactions that are idiosyncratic, meaning you can't anticipate them. A
person is on the right medication at the right dose, but the patient develops a reaction. Often, it’s
attributed to the patient’s underlying illness. The adverse medication event is never diagnosed and even
though there might be terrible consequences, no one realizes what really happened. In many cases
where a patient’s health is going downhill, it’s not clear to the caregivers why it’s happening. What’s
missing is early detection that can result in early intervention. Approaches that try to address this
problem through interaction alerts are missing an important point. You can’t prevent the type of
reaction that is unpredictable.
Smith: Many times when physicians obtain new lab data, they assimilate it into what they know or
where they were already headed. But if they were seeing the data for the first time, they might have
gone a different way. That's what I try to teach residents: if it doesn't fit, or even if it does, pretend
you're starting from scratch. What's nice about computers is that they have no attachment to their prior
hypotheses or theories, and they spark a different way of thinking about what might be going on with a
patient.
With drugs, I don't know why, but there seems to be an enormous resistance to recognizing drug
reactions and problems. I know this first-hand. I was put on a drug called an ACE inhibitor, and I
developed a horrible cough that lasted six months. It was so bad that at times I had to leave meetings,
and I couldn't talk. A pediatrician I was in a meeting with diagnosed it as an ACE-inhibitor cough. I went
back to my doctor with this information, and she sent me over to the pulmonary department; then she
thought I had asthma, and I was given asthma drugs. I mean, it's an ACE-inhibitor cough. The cough is
named after the drug class. I can't tell you the suffering I went through until a pediatrician who
specialized in kidney disease picked up on it. It's almost as if they work up the diagnosis without
considering the possibility of a known or idiosyncratic drug reaction.
Beggelman: It’s not even a rare occurrence. Around 10% of patients on that medication develop the
cough. It’s not that you shouldn’t have been prescribed an ACE-inhibitor, the problem is that once you
developed a cough, detection of the cause was delayed. Early detection would have saved you difficulty,
disability, unnecessary procedures, medications and tests. The impact on you and unnecessary costs to
the system could have been averted.
Healthcare Update: What role should technology play in patient safety?
Beggelman: Technology has not come anywhere near its full potential in healthcare. One of the most
important roles technology should play is in outcomes assessment. There are relatively few outcome
analyses because they're so expensive to perform without the assistance of computerized data. If all
records were digitized within an electronic medical record and we used systems to analyze that
information, we would have a much better understanding of the kind of care that really works for
different situations.
Smith: If people are practicing less efficiently, and you don’t know it because you’re not tracking it, you
don’t have any way to give feedback and to have a dialogue.
Dotan: If the culture doesn't welcome technology, it doesn't matter what the technology is; it will be a
waste of time and money. Technology is critically important, especially with staff shortages. The more
we can do through machines, the more we can improve the quality of care.
Healthcare does not implement quality management systems. And that’s a major downfall of
healthcare. I’m not saying we can always standardize treatment, because every patient is different and
needs to be treated as a whole. But there are protocols that you can standardize. A human brain can
only do so much. Each person behaves differently each day. We try to standardize our behavior, but we
have circadian rhythms and bio-rhythms. We are different on different days. We are not machines. The
computer is the same every day. It will repeat what you teach it over and over again. Humans are not
like that. Technology is critically important.
Healthcare needs a quality management system to guide how it operates, how it thinks, how it
measures and how it improves. ISO 9001 provides guidelines for measuring quality, with measurements
and audits built in. I'm seeing hundreds of hospitals leave the Joint Commission and go to DNV, which is
stricter, more rigid and more consistent. The Quaid family situation probably never would have
happened if the hospital had operated using a QMS. Its people would have been trained to remove the
misplaced heparin. Or the heparin wouldn't have been there in the first place.