Harvard Health Policy Review
Fall 2000; Volume 1, Number 1
In Focus

Improving Patient Safety

Guy's session with me changed my life and my career. It is extremely difficult to convey the richness of what he showed me. In one sense, that is the first point. It took him five hours, and he hadn't even really begun. In response to the question, "How do you get good enough to get to the moon?" Guy Cohen had no one-liners to offer me. He didn't say "Report Cards," or "Market Forces," or "Incentive Pay," or even "Accountability." In fact, as I recall it, not one of those words came up in the time we spent together. His views of human nature, organizations, systems, and change would not permit one-line answers.

I do remember the first thing that caught my attention. It was the organizational structure within which he worked. "I report to the NASA Administrator," Guy said, "and he reports to the President of the United States. I'm two levels below the President." That, he told me, was the importance NASA attached to his assignment: "Quality, Safety, and Reliability." I would later learn, in different words, the same lesson from experts in improvement: improvement is never an accident. It requires strong, clear, specific, and visible attention, best of all from the top of an organization.

Of the many lessons from Guy Cohen, I will describe only the very strongest, which came in the form of a story—a true story—from the days of NASA's Titan rocket program, whose rockets would later launch the Gemini astronauts into space.

The Titan rocket was a liquid-fueled rocket whose two enormous tanks, one of fuel and the other of liquid oxygen, supplied propellant through pumps at the bottom of the tanks to a rocket motor chamber where a controlled explosion occurred. On the day in question, with a Titan launch scheduled for the next day, NASA was worried because a problem had developed with the liquid-fuel system, and a prior mission had been lost. NASA knew why. Here was the problem.

The design of the rocket required great precision in the use of propellant—every drop of fuel and liquid oxygen had to be consumed before engine shutdown, not a bit less or more. Like the water in your bathtub, the liquid in the huge oxygen tank had a tendency to swirl as the level got low, which led to a problem called "cavitation," as the swirling oxygen formed a funnel with a hole in the middle. The high-pressure pump would suck on the center of that funnel, sense in error that there was no more liquid oxygen, and shut the engine down too early, leaving swirling oxygen unconsumed in the tank.

The solution to the cavitation problem was to place four small vertical metal baffles at the bottom of the liquid oxygen tank, to keep the liquid from swirling. Unfortunately, the baffles were a little too big, and NASA knew that a little liquid oxygen was being kept away from the pump. NASA had substituted damming for cavitation, and a rocket exploded as a result.

Please notice, first, how much NASA knew and could find out about its problems. This is at the Ferrari level—the design level—and it was simply amazing to me how thoroughly NASA could examine a problem—the loss of a rocket—and trace its cause back to such a simple, although powerful, design flaw. That is a characteristic of a great safety system—to find the roots of a defect, beginning, sometimes, with a disaster—but it is not the lesson I want to focus on right now.

NASA had a way to fix the problem. It was expensive, but necessary. For the next day's launch, empty the tank of liquid oxygen, lower a man on a harness in a diving suit into the highly toxic gaseous environment of the empty tank, have him trim some metal from the four baffles, and haul him back up. He would carry metal shears, plus a small cloth bag into which he would put the trimmed metal and the four bolts, one removed from each baffle, that would become redundant when he trimmed the metal. This little cloth bag was important, because, if any metal fragments were left behind at the bottom of the tank, they would undoubtedly be sucked into the high-pressure pump along with the fuel, and the rocket would explode in flames.

They did it. The man lowered into the tank was a front-line worker. The NASA officials, including Guy Cohen, were there to watch as the man carried out his assignment. It went like clockwork. He finished, and, while he unsuited, they began closing the tank hatch for refueling.

But then, a problem developed. The employee emptied the contents of the cloth bag onto a table, and out fell the metal fragments, and three bolts, not four. The worker thought hard. He tried to remember removing the four bolts, but couldn't. "There must have been only three," he said. But that wasn't good enough. Risking more delay and expense, the team decided to unbolt the hatch and look for the missing bolt. One after another, the officials took a flashlight, leaned far into the tank (one passed out from the toxic fumes), and peered to the bottom, looking for the missing bolt. It wasn't there. "There must have been only three bolts," said Jerry Gonsalves, a front-line NASA employee who had searched carefully for the bolt. And they all agreed.

That night, Guy Cohen was sleeping peacefully at home—the man two levels below the President of the United States—when his phone rang. It was Jerry Gonsalves. He could not sleep, he said. On his own initiative, he had dressed at midnight and driven an hour to the storage facility where another Titan liquid oxygen tank was resting on its side, empty, awaiting a future assembly. Jerry had taken a bolt and a flashlight, crawled in through the hatch down to the bottom of the tank, and placed the bolt in different locations, trying to see if there was anywhere at the bottom of the tank where the bolt could have hidden from the probing light of the flashlight peering down from the open hatch.

"I found two places," Jerry told Guy Cohen. "We could have missed it."

That was enough. Guy Cohen's next call was to the flight director, and, at 5:00 a.m., they were all assembled at the launch pad, not preparing for the launch, but emptying the fuel tank, at extremely high cost, suiting up a man in a diving suit, and lowering him back into the tank. He went to the first of the two hiding places that Jerry Gonsalves had found, reached his hand down, and picked up the missing bolt.

Guy Cohen asked me a question at that stage in his story. "Suppose the worker who removed the bolts, and Jerry Gonsalves, the employee who said they were not in the tank, had been nurses," he asked, "and we were talking about a serious drug error. What would happen in one of your hospitals?"

I knew the answer very well. "There would be an incident report," I said. "And the nurses would have some sort of warning put in their files. If the patient had died, they probably would be fired, or worse."

"Then you'll never be safe," he said. "That's not what we did. We saved that bolt, and we had it gold plated and mounted on a plaque. And we had the NASA Administrator come to the launch of that rocket a couple of days later. And in full view of everyone there, we gave that plaque to Jerry Gonsalves and his colleagues, and we dedicated the launch to them."

I think the point is clear. You have to be very smart to design a rocket right, and even smarter to figure out what happened and correct it when something goes wrong. But you have to be smarter still to design and lead the supraordinate system—not the system of work, but the system of leadership and management in which the work system will thrive or wither. That is what Guy Cohen taught me, and it took him five hours to show me even the outlines of world-class management.

Eight Guidelines for Managing and Improving Safety Systems in Health Care
Compared with NASA in its heyday, and with respect to safety as a goal, health care is now barely on the track. We are beginners. That is good news and bad. The bad is that we have a long way to go. The good is that it almost does not matter where we start now—fine-tuning isn't the issue; starting is. Let me suggest the outlines of the steps we should be taking now on the meta-system—the management system in which the Jerry Gonsalveses of our future will either thrive or be silenced.

First, improved safety must be our specific, declared, and serious aim, beginning at the top of our organizations. Boards of Directors must show that they are committed to this aim by regular, close oversight of the safety of the institutions they shepherd, and their reviews of progress in results and system design should be frequent, detailed, quantitative, and demanding. Safety improvement aims should be quantitative and annual. Perhaps the organization's Annual Report could contain a regular section on safety improvement, highlighting the involvement of staff;

Second, executives, both clinical and non-clinical, should make regular review of safety systems part of their work and schedule. Such reviews would include monthly audits of the safety system, "walk-throughs" to evaluate hazardous areas and designs, incorporation of safety improvement goals into annual business plans, with clear reporting and assessment of progress, and intolerance of serious procedural violations by people, including doctors, in high-risk settings. In a walk-through, safety-conscious executives might, for example, notice and direct attention to disorder in medication areas, illegible charts, unlabeled medications, or confusing signs, asking that work plans be established to reduce specific hazardous conditions. All senior leaders should be especially watchful for complexity in work systems, and offer support for sensible forms of simplification;

Third, a non-punitive hazard and error reporting system should be in place, with all personnel expected and encouraged to report errors and hazards, including "near misses." This stimulated reporting system should be supplemented and calibrated by independent audits of hazardous circumstances, such as through chart reviews and "walk-throughs";

Fourth, processes should be in place for the thorough investigation, review, and analysis of errors and near misses, so as to identify patterns of hazard and vulnerable designs. Discussions of near misses, reported errors, hazardous conditions, and potential system changes might occur as a regular part of all nursing and pharmacy staff meetings. I think that all modern safety systems must have the ability to study cases in detail and learn wise lessons from that study;

Fifth, responsibility for oversight of hazardous systems as a whole should be clearly located in an individual with the time to discharge this duty. For example, oversight of a hospital's medication system as a whole, including its safety and its improvement, may be placed under a single physician, with 50% or more of his time devoted to that role;

Sixth, the organization should maintain an ongoing process for the discovery, clarification, and incorporation of basic principles and innovations for safe design, searching the health care industry, other industries, and research on human factors engineering, organizational and social psychology, and cognitive psychology for potentially fruitful concepts. Organizations need sound, scientifically grounded theories about errors and safety;

Seventh, cultural supports to safety and its improvement should be continually reinforced, such as through recognition systems for individuals and departments who contribute to safety improvement, and through repeated public recognition—a "relentless drumbeat"—of the importance of improving safety as an organizational imperative. Communication should be repeated and multi-channel: thank-you notes from executives, reports in newsletters of specific improvements in system safety, bulletin board displays recognizing safety innovations each week, and more;

Eighth, and finally, as organizational knowledge about safety grows, so also should grow a more and more specific, organization-wide conceptual base for the principles of safe design. Some of these principles, such as the value of wise simplification and standardization, the importance of complete information exchange and communication, the value of teamwork, the unwillingness to accept any hazard as inevitable, the avoidance of blame in the search for hazard, and the commitment to strong and immediate recovery procedures when errors do occur, become, in the organization truly committed to safety, matters of day-to-day values in work. They become "the way we do things around here," and not topics for repeated struggles for control, battle by battle. Safety, and its associated design principles, are in the very marrow of the organization, and, rather than treating each error and hazard as a unique, surprising, separate, and sometimes-tragic event, people view the entire organization as a system, and the search for improved safety as a life-long, shared journey.

I learned a lot from Guy Cohen, but his approach is not a secret. It is grounded in strong and interesting sciences, tested by long and well-documented experience, published, and motivated by the very best forms of faith in the nature of humans and in the unending possibility of doing things better. As we pursue safety in health care, we are on a well-worn path, and, if we have the wisdom and good will to follow it, our improvements will know no bounds.


Donald Berwick, MD, MPP is President and C.E.O. of the Institute for Health Care Improvement and Clinical Professor of Pediatrics and Health Care Policy at Harvard Medical School. This article is adapted from the text of the Plenary Address which Dr. Berwick delivered at the 79th Symposium of the Japanese Society for Quality Control on September 23, 2000.
