NORMAL ACCIDENTS

by William H. Benson

April 1, 2010

     Disasters surround us. They inundate our news-saturated minds, pierce our sense of safety, and disturb our sleep. The awe-inspiring technologies that fill our world make our lives vastly more comfortable and entertaining, but at the cost of heightened and frightening risks. Certain “technological systems” are prone to error and subject to accidents and catastrophes that can and will kill the innocent, despite the presence of back-up plans and alternate routes.

     Most people recognize the danger inherent in certain systems: aircraft and airways, nuclear power plants, petrochemical plants, space travel, bridges, dams, supertankers, nuclear weapons, and biotechnology. Yet we feel helpless to do anything; we rely upon the technology, but we are powerless to control any phase of it.

     For example, late last week a South Korean naval ship exploded, for reasons still unknown, severing the vessel in two and drowning nearly half of the 104 seamen aboard. Truly tragic it was, but not that uncommon on the high seas, where marine disasters have plagued men and women throughout humanity’s history.

     Homer said it best: “humans tempt the gods when they plow these green and undulating fields” called the seas.

     Also, out on the open sea, in established ocean lanes, “non-collision course collisions” actually do occur. “One would not think that ships could pile up as if they were on the Long Island Expressway, but they do.”

     There was the Titanic disaster on the night of April 14-15, 1912, when 1,503 people perished “partly as a result of an overconfident captain imperturbably sailing into a field of icebergs at night, thinking he had an unsinkable ship.” The iceberg that the ship struck sliced open five watertight compartments: “[T]he designers had assumed that no more than three could ever be damaged at once.” And they were wrong.

     The worst commercial nuclear accident in the U.S. occurred on March 28, 1979, 31 years ago, at Three Mile Island, near Middletown, Pennsylvania, when operators made a crucial error at the same time that the equipment failed: a valve was closed when it should have been open. Despite Jane Fonda’s frightening portrayal in the movie The China Syndrome of what such a meltdown might look like, no lives were lost at Three Mile Island, fortunately.

     History’s worst nuclear accident, one that resulted in an explosion and a fire, happened on April 26, 1986, at Chernobyl, near Kiev in the USSR, now in Ukraine, in which at least 31 people died in the immediate aftermath, and radioactive winds swept across Europe.

     But the worst such disaster in terms of lives lost, 3,849, was the toxic gas leak at a pesticide plant in Bhopal, India, on the night of December 2-3, 1984.

     The truly terrifying accident, and perhaps the next big one, will involve a supertanker loaded with chemicals. “A fair bit of the world’s sulfuric acid, vinylidene chloride, acetaldehyde, and trichloroethylene, etc. move by tankers that are unregulated, and often old, in poor condition, and traveling through winter storms.”

     Most experts attribute the cause of explosions and accidents to “operator error” or to “engineering and design flaws,” and that is mainly true. However, the sociologist Charles Perrow has looked deeper into these accidents, and in his book Normal Accidents, first published in 1984 and reissued in 1999, he identified at least two common factors among them: 1) the tight coupling and 2) the sheer complexity of the pieces that have evolved into a system.

     Perrow defines a “normal accident,” or “system accident,” as one in which, “given the system characteristics, multiple and unexpected interactions of failures are inevitable.” This definition goes well beyond simple pilot error, or an engineer seated at a drawing table miscalculating the length of a line on a blueprint.

     The system itself can create the accident through its hidden interconnectedness, its fair chance of multiple failures, and its utterly incomprehensible design, for no single person can know all the moving pieces and how they will interact when put into service.

     Another thinker, Dr. Andrew W. Lo, has added a third condition to Perrow’s two: “the absence of normal feedback over an extended period of time.” When incidents rarely, if ever, become accidents, the notion spreads that the system has grown less risky. When people, especially those with Type A personalities, believe they are insulated from that diminished risk, they will take more risks. Accidents will happen.

     This is the world we live in: one fraught with real dangers born of our technological successes, in spite of our innocence and our noble intentions.