Harvard Health Policy Review
Fall 2000; Volume 1, Number 1
In Focus

Improving Patient Safety
Donald Berwick, MD, MPP

For well over a decade, I have participated in an effort to develop and encourage a new level of investment in quality improvement in American health care. This has been in many ways a very gratifying process, but it has also been frustrating. Despite many decades of research and literally thousands of journal papers documenting how much improvement is scientifically possible in health care, neither the professionals nor the public has become truly energized to tackle the challenge. Quality of care has remained largely a background issue; few people have regarded it as a central problem.

On November 29, 1999, that changed. That day, perhaps the most dramatic single event in the recent history of the American health care quality movement occurred: the Institute of Medicine of the National Academy of Sciences released a report on problems in patient safety, titled To Err Is Human. The day it appeared, it became headline news on every major American television network, in every major American newspaper, and in thousands of public and professional gatherings. The wave of interest that began on November 29 has not yet ended. Meetings on patient safety continue nearly daily throughout the U.S., and hundreds, if not thousands, of safety improvement projects are now underway in hospitals and clinics across America.

The IOM's Committee on Quality of Health Care in America, on which I serve, and which wrote the report To Err Is Human, issued six major findings, as follows:

  • First, safety problems in health care and preventable patient injuries are common and serious. Based on a thorough review of dozens of research papers, the Committee found, for example, that almost seven percent of patients in American hospitals are exposed to a serious or potentially serious medication error; almost four percent suffer an "adverse event," defined as an injury from the health care that was supposed to help them; and, most alarming of all, between 44,000 and 98,000 Americans die each year from adverse events in hospitals. Thousands more probably die from errors in care in nursing homes, home health care, and office-based surgery, but these are areas about which we have little scientific information. Deaths from adverse events in hospitals are so common that they rank as at least the eighth leading cause of death in the U.S., more common than deaths from breast cancer, motor vehicle accidents, or AIDS.
  • Second, the Committee found that safety problems and injuries in health care are not generally due to bad doctors or bad nurses, or to carelessness or incompetence in individuals. They are due to systemic flaws: hazards that are built into work processes, job descriptions, and equipment designs, for example. To put it simply, if we fired every single health care worker who was involved in an error today, we would have exactly the same frequency of errors tomorrow. It is not bad people who injure patients; it is bad systems. If the systems do not change, the injury rates will remain the same.
  • Third, the Committee found that safety could be vastly improved in health care if we were to incorporate basic principles of human factors engineering, industrial engineering, and other safety sciences into health care designs. As of now, many health care designs violate basic safety principles, and thus the patients and the clinicians get trapped in unsafe, accident-prone circumstances.
  • Fourth, the Committee found that incorporating safe designs into health care will require major cultural change, not just technical changes. We need to adopt a "culture of safety" throughout health care. One of the most important aspects of such a culture is that issues of error, hazard, and safety can be discussed openly, and that people can talk freely about their own errors and the hazards they encounter, without fear of blame or punishment. The Committee found that the culture of American health care today is nearly the opposite: conversations about safety and error provoke fear, resistance, punishment, and secrecy. People hide their errors instead of revealing them.
  • Fifth, the Committee found that the public ought to be much better informed than it is today about hazards in health care. This openness would both help patients and families participate more effectively in making the system safer and improve trust, by showing people that health care is honest and serious about improving its own safety.
  • Sixth, and finally, the Committee recommended a strong investment in new research on patient safety, both to assure incorporation of existing scientific knowledge about safety from other industries, and to develop new designs and theories directly relevant to health care. For the U.S., we recommended a patient safety research budget of $30 million per year, rising quickly to $100 million.

Patient Safety: A Dimension of Quality
My area of work is not mainly in the field of human error and safety, but rather in "improvement," or "quality of care." I want to make care better. My personal definition of the word "quality" is very broad. It incorporates all of the dimensions of performance of a system that the people who depend on that system care about. Actually, the concept is better expressed by the plural "qualities" than by the singular "quality." The qualities of care that we care about include, of course, the obvious main objectives of care: to save life, restore health, prevent disease, and ease pain, for example. But they also include factors in the total experience of care, such as dignity, responsiveness, timeliness, and emotional support, and the determinants of efficiency, especially the capacity to avoid waste and thereby to reduce costs.

According to the theory and practice of quality improvement in health care as in other industries, improvements in many qualities of performance are achievable simultaneously in a complex system if one is bold enough, committed enough, and creative enough to design and redesign that system continually, sometimes involving even the first principles—the basic, original design of the system at its core.

One of the unfortunate, and potentially divisive, misperceptions in the snowballing effort to reduce errors and improve safety in health care (the effort that the IOM Report has launched) is that this campaign is somehow different from, or even in competition with, the rising tide of will in the U.S. to improve the quality of health care. In my opinion, nothing could be farther from the truth.

I have heard more than one leader in the field of patient safety lament that the energy going into quality improvement in health care was distracting people from the important agenda of improving patient safety. This concern reflects a basic misunderstanding. According to the proper, modern notion of quality, the search for safety and the reduction of errors can be seen as absolutely central: an ideal starting place for the quest for improved care, not a distraction, but a threshold issue and a perfect test case for improvement of all we do. If we cannot improve safety, then what, after all, can we possibly mean by the term "quality improvement"?

Patient Safety and Systems Thinking
The specific search for safety and the more global search for improvement in all of the effects of our work are united by the concept of a system. The fundamental theoretical foundation for both improvement of safety and improvement in general lies in the notion that performance—something like an error or an error rate—is a property of a system. If the system is stable, the performance is predictable.

Let me give you an example. I drive a Ford Windstar van. If I floored the accelerator on an open highway, its speed would climb to some maximum, say, 94 miles per hour. That is all. Depending on the road, the weather, the wind, and the gas, that top speed might vary a little, between, say, 91 and 98 miles per hour, but, in general, we can make a pretty good prediction of the top speed, give or take a few miles per hour. My Windstar van is a system, and its top speed is a predictable property of that system. Suppose I would like my Windstar to go 130 miles per hour. I could scream at it very loudly, or put an incident report in its file when it failed, but that would be stupid. Screaming at a system is a very interesting comment on the screamer, but it tells us nothing at all about the system.

I could go faster, but I would need a new system. I could buy a Ferrari, for example. There would still be a top speed—maybe 194 miles per hour—but it is a different top speed, because the system is different. New system, new speed.

One example of a system is illustrated by a photograph of two vials of medication from an article in The New England Journal of Medicine some two decades ago. One vial is a bottle of racemic epinephrine, which was put down the nasotracheal tubes of some premature infants to help them breathe better. The other is a bottle of Vitamin E, which was put down the nasogastric tubes of premature infants who were Vitamin E deficient. The two bottles look nearly identical, with similar size and very similar labels. The article was about an outbreak of deaths in a neonatal intensive care unit. Babies were dying. You can guess why. The racemic epinephrine was being put into nasogastric tubes, causing gastric hemorrhage. This is a system perfectly designed to kill newborn babies. Not all babies. Just a few. Predictably.

Properties like errors are system properties. And rates of error are predictable properties of the systems in which those errors occur. The fundamental law of improvement is this: "Every system is perfectly designed to achieve exactly the results it gets."
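To make this law concrete, here is a minimal simulation sketch. It is mine, not the Committee's, and the hazard rates in it are purely illustrative (loosely echoing the adverse-event figures above). It shows that a stable system produces a predictable error rate no matter who staffs it, and that only a change in the design changes the rate.

    import random

    random.seed(1)  # fixed seed so the illustration is reproducible

    def monthly_errors(per_order_hazard, n_orders=10_000):
        # Each medication order independently trips the system's built-in
        # hazard (look-alike vials, ambiguous labels, and so on).
        return sum(random.random() < per_order_hazard for _ in range(n_orders))

    # Same design, three different months with entirely different staff:
    # because the hazard is a property of the design, the count barely moves.
    CURRENT_DESIGN = 0.04   # illustrative ~4% adverse-event hazard per order
    for month in (1, 2, 3):
        print(f"current design, month {month}: "
              f"{monthly_errors(CURRENT_DESIGN)} errors per 10,000 orders")

    # A redesigned system (distinct packaging, forcing functions, double
    # checks) has a different built-in hazard, and therefore a new rate.
    REDESIGNED = 0.004
    print(f"redesigned system: {monthly_errors(REDESIGNED)} errors per 10,000 orders")

Swapping the people who run the loop changes nothing in this sketch; changing the hazard parameter, that is, the design, changes everything.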

If we want a new level of performance, we must get a new system. This applies equally to all forms of performance—the functional status outcomes of care, hemoglobin A1C levels in diabetics, immunization rates in children, waiting times, pain relief, answering questions, maintaining privacy, easing death, closing the racial gaps in health status, and—in the case at hand—improving patient safety. What a wonderful place to start! If we can figure out how to change to new systems, so as to reduce errors and mitigate their effects, then we are bound to learn generalizable lessons about change itself. Error-reducers and quality-improvers are in exactly the same boat. In fact, they are pulling the same oar.

Now, if we understand that performance features—features like safety—are system properties, then we get curious about systems. What is a "system" after all? Formally, a system is a set of interdependent elements interacting to achieve a common aim; a set of things that work together to get to a goal. Now, it gets a little more complicated. Systems thinking is not easy. In fact, for many people it is an unnatural act.

Worse, if we really want to think in systems terms, it is important to realize that we have to worry not about one kind of system, but about two.

First, we have to worry about the system of work—the thing that gets the job done, and whose performance—top speed or safety level—is the property we care about. What's the difference between a Windstar and a Ferrari? The answer is complicated, and, if we really wanted to turn a Windstar into a Ferrari, we would have to know a lot about cars as a system. We would be talking about carburetors, metal alloys, fuel injection designs, and such. The same thing applies to medication errors and patient safety. If we want Ferrari safety levels instead of Windstar safety levels, then we would have to be talking about designs. For example, what does a safe medication system look like? What are the differences between a safe medication system and the one we have now?

But, that is not enough. If our current system has a performance level that we do not like, then we have to get a new one. And, that means that we have to change. Maybe one rich person can switch, all by himself or herself, from a Windstar to a Ferrari—just go and buy one. But, it is hard to buy a safety system. Probably, it is impossible. Hospitals or clinics are just too complicated. The elements and interdependencies in the system of care whose property is a specific level of safety involve people, departments, habits and traditions, rules, equipment, hierarchy and sociology, patients with varying needs, constantly evolving technologies and medications, and much more. This system makes a Ferrari look very simple.
