normal accident
n. An accident that is the nearly inevitable result of technological interactions so complex that they cannot be fully predicted or controlled.
Examples
2003
Natural disasters — earthquakes, floods — could affect anybody, and most companies think about that. Second are normal accidents that come about when complexity builds in the potential for an accident. In IT, complexity can lead to a glitch like Y2K. That's a normal crisis — an accident that's almost inevitable, but not intentional. Then you have abnormal accidents: Someone deliberately causing the accident. 9/11 was abnormal. Enron was an abnormal economic crisis because it was caused by shenanigans.
—Ian I. Mitroff, “Facing the Unthinkable,” Computerworld, April 21, 2003
1985
An alternative vision for a peaceful and productive world requires the emergence of the political will to insist that a future of unlimited technological growth, self-anointed managers and "normal" accidents is unworthy of the best in human potential and may well be unendurable.
—Robert Engler, “Technology out of control,” The Nation, April 27, 1985
1984 (earliest)
If interactive complexity and tight coupling — system characteristics — inevitably will produce an accident, I believe we are justified in calling it a normal accident, or a system accident. The odd term normal accident is meant to signal that, given the system characteristics, multiple and unexpected interactions of failures are inevitable.
—Charles Perrow, Normal Accidents: Living with High-Risk Technologies, Basic Books, March 1, 1984
Notes
We live in a world that is increasingly run by complex systems, from nuclear power stations and chemical production plants to the computer and aviation industries. These are systems in which multiple technologies must not only interact, but each one works properly only if all the others do. (Such a system is said to be tightly coupled.) In other words, if one technology fails, the system itself fails. Thankfully, most complex systems have built-in redundancies and fail-safe mechanisms that prevent such a system failure. However, the interactions between technologies in a complex system are so, well, complex that it isn't possible to predict all the ways that any one failure will affect the system. Therefore, accidents in these systems are more or less inevitable. This isn't strictly Murphy's Law: "If something can go wrong, it will." Instead, it's a variation on the theme: "If something can go wrong, it usually won't, but eventually it will."
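A back-of-the-envelope calculation makes that last theme precise (a simplified sketch, not Perrow's own analysis, which stresses that failures interact rather than occur independently). Assuming each of $n$ operations fails independently with some small probability $p$, the chance of at least one failure is

$P(\text{at least one failure}) = 1 - (1 - p)^n \longrightarrow 1 \quad \text{as } n \to \infty.$

Even with $p = 0.001$, a thousand operations put the odds of at least one failure near 63 percent: on any single try something "usually won't" go wrong, but run the system long enough and eventually it will.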