A Doctor Confronts Medical Errors — And Flaws In The System That Create Mistakes

Dr. Danielle Ofri, author of <em>When We Do Harm: A Doctor Confronts Medical Error,</em> says medical mistakes are likely to increase as resource-strapped hospitals treat a rapid influx of COVID-19 patients.

For more than two decades as an internist at New York City's Bellevue Hospital, Dr. Danielle Ofri has seen her share of medical errors. She warns that they are far more common than many people realize — especially as hospitals treat a rapid influx of COVID-19 patients.

"I don't think we'll ever know what number, in terms of cause of death, is [due to] medical error — but it's not small," she says.

Ofri's new book, <em>When We Do Harm</em>, explores health care system flaws that foster mistakes — many of which are committed by caring, conscientious medical providers. She notes that many errors go unreported, especially "near misses," in which a mistake was made, but the patient didn't suffer an adverse response.

"Near misses are the huge iceberg below the surface where all the future errors are occurring," she says. "But we don't know where they are ... so we don't know where to send our resources to fix them or make it less likely to happen."

Ofri says the reporting of errors — including the "near misses" — is key to improving the system, but she says that shame and guilt prevent medical personnel from admitting their mistakes. "If we don't talk about the emotions that keep doctors and nurses from speaking up, we'll never solve this problem," she says.

Interview Highlights

<a href=""><em>When We Do Harm</em></a>, by Danielle Ofri, MD
/ Penguin Random House

On Ofri's experience of making a "near-miss" medical error when she was a new doctor

I had a patient admitted for so-called "altered mental status." It was an elderly patient from a nursing home, and they were sent in because someone there thought they looked a little more demented that day than they had the day before. And of course, we were really busy. ... And the labs were fine. The radiology was fine. And so I just basically thought, "Let me get this patient back to the nursing home. It's all fine."

So I sent the patient to kind of an intermediate holding area to just wait until their bed opened up back at the nursing home. Well, it turns out that the patient was actually bleeding into their brain, but I missed it because I hadn't looked at the CAT scan myself. Somebody said to me, "radiology, fine." And so I took them at their word and didn't look at the scan myself, as I should have.

Now, luckily, someone else saw the scan. The patient was whisked straight to the [operating room], had the blood drained and the patient did fine. So in fact, this was a near-miss error because the patient didn't get harmed. Their medical care went just as it should have. But, of course, it was still an error. It was an error because I didn't do what I should have done. And had the patient gone home, they could have died. But, of course, this error never got reported, because the patient did OK. So we don't know. It never got studied or tallied. So it was missed, kind of, in the greater scheme of how we improve things.

On the effect of having made that 'near-miss error' on Ofri's subsequent judgment

In the short run, I think I was actually much worse, because my mind was in a fog. My soul was in a fog. I'm sure that many errors were committed by me in the weeks that followed because I wasn't really all there. I'm sure I missed the subtle signs of a wound infection. Maybe I missed a lab value that was amiss because my brain really wasn't fully focused and my emotions were just a wreck [after that serious near miss]. I was ready to quit. And so I'm sure I harmed more patients because of that.

Now that it's been some time, it's given me some perspective. I have some empathy for my younger self. And I recognize that the emotional part of medicine is so critical because it wasn't science that kept me [from reporting that near miss]. It was shame. It was guilt. It was all the emotions.

On the source of medical errors in COVID-19 treatment early on in New York and lessons learned

We did pull a lot of people out of their range of specialties and it was urgent. But now that we have some advance warning on that, I think we could take the time to train people better. Another example is we got many donated ventilators. Many hospitals got that, and we needed them. ... But it's like having 10 different remote controls for 10 different TVs. It takes some time to figure that out. And we definitely saw things go wrong as people struggled to figure out how this remote control works from that one. And so trying to coordinate donations to be the same type in the same unit would be one way of minimizing patient harm.

The other area was the patients who don't have COVID: a lot of their medical illnesses suffered because ... we didn't have a way to take care of them. But now we might want to think ahead. What do we do for the things that are maybe not emergencies, but urgent — cancer surgeries, heart valve surgeries that maybe can wait a week or two, but probably can't wait three months?

On how patient mix-ups were more common during those peak COVID-19 crisis months in NYC

Dr. Danielle Ofri is a clinical professor of medicine at the New York University Medical School. Her previous books include <em>What Doctors Feel</em>.
Rogelio Esparza. / Beacon

We had many patients being transferred from overloaded hospitals. And when patients come in a batch of 10 or 20, 30, 40, it is really a setup for things going wrong. So you have to be extremely careful in keeping the patients distinguished. We have to have a system set up to accept the transfers ... [and] take the time to carefully sort patients out. Especially when every patient comes with the same diagnosis, it is easy to mix patients up. And so we should be thinking ahead to what it takes to have enough time and space and resources to make sure that nobody gets mixed up.

On how the checklist system used in medicine was adapted from aviation

In the aviation industry, there was a whole development of the process called "the checklist." And some people date this back to 1935 when a very complex [Boeing] B-17 [Flying] Fortress was being tested with the head of the military aviation division. And it exploded, and the pilot unfortunately died. And when they analyzed what happened, they realized that the high-tech airplane was so complex that a human being could not keep track of everything. And that even if he was the smartest, most experienced pilot, it was just too much and you were bound to have an error. And so they developed the idea of making a checklist to make sure that every single thing you have to check is done. And so it put more of the onus on a system, of checking up on the system, rather than the pilot to keep track of everything. And the checklist quickly decreased the adverse events and bad outcomes in the aviation industry.


And that's been adapted to medicine, and most famously, Peter Pronovost at Johns Hopkins developed a checklist to decrease the rate of infection when putting in catheters, large IVs, in patients. And the checklist is very simple: Make sure the site is clean. Put on a clean dressing. Make sure you're wearing the right PPE. Nothing unusual; it's kind of like checklisting how to brush your teeth. Yet the rate of infections came right down and it seemed to be a miracle. Once you start paying attention to the steps of a process, it's much easier to minimize the errors that can happen with it.

On how the checklist system did not result in improved safety outcomes when implemented in Canadian operating rooms

The problem is, once you have a million checklists, how do you get your work done as an average nurse or doctor? ... They just get in the way of getting through your day. And so we just check all the boxes to get rid of them. And that's what happened with this pre-op checklist in Canada. And, again, the preoperative checklist was making sure you have the right patient, the right procedure, the right blood type. Very simple. And [the checklist] showed impressive improvements in complication rates in hospitals — both academic, high-end hospitals and even hospitals in developing countries. So, in 2010 the minister of health in Ontario mandated that every hospital would use it, planning to show an improvement in patient safety on a grand scale. And ... the data did not budge at all, despite an almost 100% compliance rate. And that lets you know that at some point, people just check the boxes to make them go away. And they're not really gaming the system, per se, but it lets you know that the system wasn't implemented in a way that's useful for how health care workers actually work.

On why electronic medical records are flawed and can lead to errors

[Electronic medical records] really started as a method for billing, for interfacing with insurance companies and medical billing with diagnosis codes. And that's the origin. And then it kind of retroactively was expanded to include the patient care. And so you see that difference now.

For example, ... [with] a patient with diabetes ... it won't let me just put "diabetes." It has to pick out one of the 50 possible variations of on- or off-insulin — with kidney problems, with neurologic problems and to what degree, in what stage — which are important, but I know that it's there for billing. And each time I'm about to write about it, these 25 different things pop up and I have to address them right now. But of course, I'm not thinking about the billing diagnosis. I want to think about the diabetes. But this gets in the way of my train of thought. And it distracts me. And so I lose what I'm doing if I have to attend to these many things. And that's really kind of the theme of electronic medical records: they're made to be simple for billing, and they don't think in the same logical way that clinicians do. And it's very fragmented. Things are in different places. Whereas in the chart — in the old paper chart — everything was in one spot. And now they're in many spots.

On her advice for how to stay vigilant when you're a patient

Be as aware as you can. Now, of course, you're busy being sick. You don't necessarily have the bandwidth to be on top of everything. But to the best that you can, have someone with you, keep a notebook, ask what every medication is for and why you're getting it. What are the side effects? And if people are too busy to give you an answer, remind them that that's their job and it's your right to know and your responsibility to know. And if you can't get the information you want, there's almost always a patient advocate office or some kind of ombudsman, either at the hospital or of your insurance company. You should feel free to take advantage of that.

The information in the chart is yours. You own it. And so if someone's not giving you the time of day or the explanation, it's your right to demand it. Now, of course, we recognize that people are busy and most people are trying their best. And you could certainly acknowledge how hard everyone's working. But don't be afraid to speak up and say, "I need to know what's going on."

Sam Briger and Thea Chaloner produced and edited the audio of this interview. Bridget Bentz, Molly Seavy-Nesper and Deborah Franklin adapted it for the Web.

Copyright 2020 Fresh Air. To see more, visit Fresh Air.
