Issue Date: July 14, 2008
Dealing With Disaster
Shortly after midnight on Dec. 3, 1984, a deadly cloud of methyl isocyanate gas spewed from Union Carbide’s pesticide plant at Bhopal, in the central Indian state of Madhya Pradesh. The accident killed upward of 4,000 sleeping residents of the poor, densely crowded community surrounding the facility and seriously sickened thousands more.
In the early hours of April 26, 1986, at a Soviet nuclear power plant at Chernobyl, in what is now northern Ukraine, an ill-conceived test of emergency equipment went out of control. Coolant water overheated, blowing off the top of the reactor. Tons of highly radioactive material escaped into the atmosphere; 31 plant workers and emergency responders died of radiation poisoning shortly after. People living in the nearby town of Pripyat were badly contaminated. Fallout from the accident, drifting far over the Soviet Union and the rest of Europe, caused an untold number of cases of radiation sickness and death.
A common thread runs through these and many other calamities examined by Mark D. Abkowitz in his book “Operational Risk Management: A Case Study Approach to Effective Planning and Response.” Their dire consequences could have been avoided—or, at least, considerably mitigated—by better planning, preparedness, and response, according to Abkowitz, who teaches civil and environmental engineering at Vanderbilt University and has served on a passel of national and international advisory committees and boards. His book is intended for leaders in government and business.
Abkowitz classifies his case studies as man-made accidents, acts of terror, or acts of God. He includes the Exxon Valdez oil spill on March 24, 1989; the bombing of the Alfred P. Murrah Federal Building in Oklahoma City on April 19, 1995; and the Dec. 26, 2004, earthquake and tsunami in the Indian Ocean that left more than 300,000 people dead.
Looking at each of these tragedies in detail, Abkowitz describes what went wrong, how it might have been prevented, and what lessons can be drawn to prevent or mitigate similar catastrophes in the future. He spells out 10 major “risk factors” that he blames for most calamities. They range from flaws in design or construction of facilities to communication systems that break down under stress. He says they include arrogance, overconfidence, and negligence as well as lack of forethought or preparedness on the part of employees and others.
Abkowitz finds that several of these risk factors played a key role in each of the disasters he covers. He writes that economic pressures, poor planning, and failed communication, in fact, were fatal flaws in nearly all. And once one risk factor comes into play, he says, destruction often spirals out of control.
The unprofitable pesticide plant at Bhopal, for example, was run on a shoestring budget, Abkowitz writes. It was plagued by flawed design, leaky valves, poorly functioning gauges and safety equipment, and other failures, as well as inadequate maintenance. Workers were not well trained. Accidents had been common at the facility ever since it came onstream in 1969. Planning for emergencies was minimal. Supervisors were slow both to react when the disastrous leak was initially observed and to spread a warning to the nearly 900,000 people living near the plant, he continues. Seemingly, just about everything that could have gone wrong that fatal night did go wrong.
The focus of Abkowitz’ book clearly is on megadisasters. More mundane, workaday risks common to plant managers—a burst pipeline spewing flammable gases or liquids, a runaway reactor, or a derailed tank car—do not get his attention. Certainly, though, his 10 risk factors apply to lower-priority accidents as well as to full-fledged calamities.
So what lessons can be drawn from Abkowitz’ case studies? In two brief final chapters, he sums them up with a few tried-and-true clichés: the old Boy Scout motto “Be Prepared,” for example, or “plan ahead” and “safety first.” Don’t shortchange the system or shortcut prudent operating procedures, he urges, and make sure that everyone involved in an incident can communicate with one another.
Human flaws are a common thread through most of Abkowitz’ risk factors: arrogance, overconfidence, imprudent behavior, carelessness, negligence, cutting corners, and lack of foresight. But he by no means makes it clear how such frailties can be weeded out of the workforce in order to promote safety and preparedness.
Abkowitz does not overlook the importance of a little bit of luck in averting catastrophe. He rounds out his disaster accounts with two “success stories” in which good training and preparedness averted, or at least lessened, potential disaster. One involved United Airlines Flight 232, en route from Denver to Chicago on July 19, 1989. When one of the airliner’s jet engines exploded over Iowa, all three of the plane’s hydraulic control systems were knocked out. The craft, with 296 crew members and passengers aboard, appeared doomed. Valiant efforts by the crew, however, brought the plane down at the Sioux City, Iowa, airport 45 minutes after the explosion. Although 111 people died in the rough, barely controlled emergency landing, that anyone survived was extraordinary.
A reader can learn much from Abkowitz’ compilation of cautionary tales of disaster. But, as he admits, no management practices in the world can ensure a risk-free life. “The bottom line,” he concludes, “is that we can and should do much better at being a master rather than a victim of risk.” And we should keep our fingers crossed.
- Chemical & Engineering News
- ISSN 0009-2347
- Copyright © American Chemical Society