
PROF LEAPE, expert on reducing medical error

“Systems thinking says that, being human, we’re going to have decision-making problems”

Prof Lucian Leape is world famous for his contribution to the field of patient safety.

Prof Dr Lucian Leape is a key thinker on how to improve the medical system to reduce medical error. He left a career as a pediatric surgeon to devote himself to patient safety. In 1994 his article “Error in Medicine” was published in JAMA, and in 2000 he testified before a subcommittee of the US Senate with his recommendations for improving medical safety. In 2007 he founded the Lucian Leape Institute at the National Patient Safety Foundation, and in 2015 he retired as a professor at the Harvard School of Public Health. His policy prescription: “Errors must be accepted as evidence of system flaws, not character flaws.”

Madrid/Boston - June 18, 2019. What made you leave clinical practice for a career focused on patient safety and medical error?

A number of reasons, of course, but primarily I was getting frustrated with the political aspects of academic medicine, which in the USA are very intense. At the same time, I was very concerned that economics was taking over our health policy. Not much of a problem in Spain, I suppose, but in a health system that is run like a business the economists have a lot to say, and I thought they did not really understand medicine and were not interested in bringing the physician’s voice into policy.

How was this transition in terms of preexisting infrastructure and support?

I was fortunate that I was able to get a fellowship at the RAND Corporation, which is a think tank in Southern California, and I spent a year there studying health policy, economics, and that sort of thing. After that, I was pretty much on my own in terms of support, but I got involved in research projects and obtained research grants.

In 1994, were you aware of the potential of your article “Error in Medicine”? Were you at all afraid of how the conclusions you were about to present would be received?

I knew from the time I began, when I did my research and discovered the importance of human factors and cognitive psychology, that this was a big discovery, and I was very enthusiastic about it. From the very beginning I thought this would cause a paradigm shift in the way we look at medical practice, and it really did, so I knew it was big. I wasn’t really afraid; I thought it would take a while for people to embrace these ideas. People in medicine are skeptical, and I don’t think new ideas catch on right away. On the other hand, I wanted to do the right thing. I thought the substance was there, that this was clearly a better way to go, that this would win out in the end, and the thing is, I think it has.

Talking about error, what’s the impact of clinical reasoning, attention, concentration, metacognition and thinking skills in this scenario?

These, of course, are the reasons people make mistakes, and that is what systems thinking is all about. Systems thinking says that, being human, we’re going to have decision-making problems, so the question is: can you reduce those? Can you prevent them by designing systems that, in effect, guide thinking and interactions? Well, the evidence is pretty clear: the best way to deal with these issues isn’t education but rather rethinking the systems so that it is difficult or impossible for people to make mistakes.

Numerous studies show that approximately 10% of hospitalized patients worldwide suffer some type of harm caused by an error in care. In the US, it is estimated that each year more than one million patients are injured and at least 100,000 die due to care errors. If these figures are applied to Spain, that would mean at least 150,000 preventable injuries and more than 12,000 deaths per year. We make many mistakes, but we have to stop punishing people and instead change the systems that lead people to make mistakes. “Errors must be accepted as evidence of system flaws, not character flaws.”

What have we learned from those days until now?

The first thing is that it works. Other industries have applied these techniques with dramatic success. We’ve had a lot of successes: we have hospitals in the state of Michigan that totally eliminated central line bloodstream infections, we have had a big impact on hospital-acquired infections, and we’ve greatly reduced medication errors, so I don’t think there’s any question that this concept is sound and that when you actually apply these principles the problem is properly addressed.

The problem is getting everybody to do it, and that is of course the natural problem with any change. People resist change, but I think we’ve had a significant amount of improvement, and we are continuing to see growth in patient safety everywhere. It is difficult to change the systems. Research always reveals multiple flaws in the systems behind each error. Even simple changes in processes require a team effort involving doctors, nurses and other professionals. However, we have come a long way.

Can you give us some figures?

Well, the numbers are always a bit suspect because they rely on voluntary reporting. No country has a good system for monitoring, but the Agency for Healthcare Research and Quality reports a 40% decline in hospital-acquired infections, and several studies in the US show an 80% decline in central line infections. I don’t have numbers on medication errors, but there’s not much question that we’ve made huge progress.

Are we still struggling with a certain resistance? Is fallibility a taboo in medicine nowadays?

I don’t think so. I think what we are doing is recognizing that fallibility is part of human existence. The idea that the system needs to compensate for and prevent people’s mistakes is broadly accepted. Resistance is not so much to the idea as to putting it into practice. That’s when people have to change what they are doing, when everybody has to participate. Everybody thinks a surgical checklist is a good idea, but telling a surgeon what to do in the operating room is another matter. The main barrier to improving our processes is a dysfunctional culture. The ethical dimension may be the big issue to be addressed.

Why isn’t the notion of uncertainty (and complexity) introduced into undergraduate education? Many schools continue to separate the basic sciences cycle from clinical practice…

That’s a very good question. We called for it ten years ago. We wrote a monograph, a white paper, about what needed to be done in medical education, and I think it resulted in some changes. Harvard Medical School, the one I know best, has turned the curriculum upside down. Instead of lectures and going to class to study biochemistry or anatomy, the students start dealing with clinical issues on day one. They also work in teams, which is what we are trying to get doctors to do. From day one they’re thinking in terms of real issues; they use case studies right from the beginning. I think education is beginning to change, although there is still room for improvement.

What is the human factor? How do systems and people intervene?

This is a common misunderstanding. It’s not that we have systems and we have people. Our systems are people: people’s interactions. Everything we’re talking about is human factors. 
We’re trying to educate, motivate and help people to change the way they think, to work in teams and to be more open and collaborative and less autocratic and dogmatic. I think we’re making progress on that. 

We’ve learned that teamwork is really the essence of that. We’ve also learned that accountability is important: it’s one thing when a person fails to perform a duty because they don’t know or don’t understand, but it’s another thing when they deliberately refuse to do it. If you have a physician who refuses to disinfect his hands, there should be consequences. They should understand that’s not acceptable. It’s two sides of the same coin: we want people to work together, to collaborate, to understand and to progress; on the other hand, we will not tolerate misconduct.

How can both clinical simulation and clinical reasoning training contribute to reducing the medical error rate?

We have a long history with both. Here in Boston and on the West Coast, people have been working on medical simulation for 20 years. We have well-developed systems for training, and they’re now part of the medical curriculum at a lot of medical schools.

Simulation training, either on mannequins for things like resuscitation, anesthesia and surgery, or with human actors for interviewing techniques and disclosure and that sort of thing, is very well established, and I think that’s where we may have made the most progress in terms of improving medical education.

One week from today is the sixth anniversary of the Boston Marathon terrorist bombing that killed three people and injured 264. Those three people were killed when the bombs exploded. None of the 264 injured died after that. The reason they didn’t die is that medical response teams had practiced through simulations of disaster medicine. They knew just what to do, and they did it. I’ve never been so proud of my profession.
