Normal Accidents

Normal Accidents: Living with High-Risk Technologies is a 1984 book by Yale sociologist Charles Perrow, which provides a detailed analysis of complex systems from a sociological perspective. It was the first to "propose a framework for characterizing complex technological systems such as air traffic, marine traffic, chemical plants, dams, and especially nuclear power plants according to their riskiness". Perrow argues that multiple and unexpected failures are built into society's complex and tightly coupled systems. Such accidents are unavoidable and cannot be designed around.[1]

Normal Accidents
Author: Charles Perrow
Publisher: Basic Books
Publication date: 1984
ISBN: 978-0-691-00412-9

Perrow's argument, based on systemic features and human error, is that big accidents tend to escalate from small failures, and that the underlying problem is not the technology but the organizations that run it. Each of these principles remains relevant today.[1][2]

System accidents

"Normal" accidents, or system accidents, are so-called by Perrow because such accidents are inevitable in extremely complex systems. Given the characteristic of the system involved, multiple failures that interact with each other will occur, despite efforts to avoid them. Perrow said that, while operator error is a very common problem, many failures relate to organizations rather than technology, and big accidents almost always have very small beginnings.[3] Such events appear trivial to begin with before unpredictably cascading through the system to create a large event with severe consequences.[1]

Normal Accidents contributed key concepts to a set of intellectual developments in the 1980s that revolutionized the conception of safety and risk. It made the case for examining technological failures as the product of highly interacting systems, and highlighted organizational and management factors as the main causes of failures. Technological disasters could no longer be ascribed to isolated equipment malfunction, operator error, or acts of God.[4]

Perrow identifies three conditions that make a system susceptible to normal accidents (the interplay of the first two is sketched in the toy model after the list):

  • The system is complex
  • The system is tightly coupled
  • The system has catastrophic potential
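
How the first two conditions interact can be shown with a toy simulation; this is a minimal sketch assuming a random interaction graph, not anything from Perrow's book, and the parameters are invented for illustration. "Complexity" is modelled as the number of interconnections between components and "tight coupling" as the probability that a failure propagates across a connection:

    import random

    # Toy cascade model (an illustrative sketch, not from Perrow's book):
    # components are wired together at random, and a failure spreads to
    # each connected component with probability equal to the coupling.

    def cascade_size(n_components: int, n_links: int, coupling: float,
                     rng: random.Random) -> int:
        """Return how many components fail after one initial random failure."""
        # Build a random undirected interaction graph.
        neighbours = {i: set() for i in range(n_components)}
        for _ in range(n_links):
            a, b = rng.sample(range(n_components), 2)
            neighbours[a].add(b)
            neighbours[b].add(a)
        failed = {rng.randrange(n_components)}   # one small initial failure
        frontier = list(failed)
        while frontier:                          # propagate through couplings
            current = frontier.pop()
            for nxt in neighbours[current]:
                if nxt not in failed and rng.random() < coupling:
                    failed.add(nxt)
                    frontier.append(nxt)
        return len(failed)

    def mean_cascade(n_links: int, coupling: float, trials: int = 2000) -> float:
        rng = random.Random(42)
        return sum(cascade_size(100, n_links, coupling, rng)
                   for _ in range(trials)) / trials

    if __name__ == "__main__":
        for n_links, coupling in [(50, 0.1), (50, 0.9), (300, 0.1), (300, 0.9)]:
            print(f"links={n_links:3d} coupling={coupling:.1f} -> "
                  f"mean failures {mean_cascade(n_links, coupling):.1f}")

With loose coupling or few links, the single initial failure stays local. Only when the system is both densely interconnected and tightly coupled does the same small failure routinely cascade through most of the system, which is the combination Perrow singles out.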

Three Mile Island

The inspiration for Perrow's book was the 1979 Three Mile Island accident, where a nuclear accident resulted from an unanticipated interaction of multiple failures in a complex system.[2] The event was an example of a normal accident because it was "unexpected, incomprehensible, uncontrollable and unavoidable".[5]

Perrow concluded that the failure at Three Mile Island was a consequence of the system's immense complexity. Such modern high-risk systems, he realized, were prone to failures however well they were managed. It was inevitable that they would eventually suffer what he termed a "normal accident". Therefore, he suggested, we might do better to contemplate a radical redesign or, if that was not possible, to abandon such technology entirely.[4]

New reactor designs

One disadvantage of any new nuclear reactor technology is that safety risks may be greater initially as reactor operators have little experience with the new design. Nuclear engineer David Lochbaum has explained that almost all serious nuclear accidents have occurred with what was at the time the most recent technology. He argues that "the problem with new reactors and accidents is twofold: scenarios arise that are impossible to plan for in simulations; and humans make mistakes".[6] As one director of a U.S. research laboratory put it, "fabrication, construction, operation, and maintenance of new reactors will face a steep learning curve: advanced technologies will have a heightened risk of accidents and mistakes. The technology may be proven, but people are not".[6]

Sometimes engineering redundancies, which are put in place to help ensure safety, backfire and produce less, not more, reliability. This can happen in three ways: first, redundant safety devices make the system more complex, and hence more prone to errors and accidents; second, redundancy can lead workers to shirk responsibility; third, redundancy can lead to increased production pressures, resulting in a system that operates at higher speed but less safely.[7]
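
The first mechanism, that each added device also adds new interaction failure modes, can be illustrated with a toy probability model. This is a minimal sketch, not from the book, and the failure probabilities below are invented for illustration. Each extra device makes it less likely that all devices fail independently, but every pair of devices adds a small chance of an interaction failure that defeats the whole safety function:

    # Toy model (illustrative assumptions, not empirical values): shows how
    # adding redundant safety devices can eventually reduce overall
    # reliability once interaction failures outweigh the redundancy benefit.

    P_DEVICE_FAIL = 0.05   # chance an individual safety device fails on demand
    P_INTERACTION = 0.002  # chance each pair of devices interacts in a way
                           # that defeats the whole safety function

    def system_failure_probability(n_devices: int) -> float:
        """Approximate failure probability with n redundant devices.

        The system fails if every device fails independently, or if any
        pairwise interaction failure occurs (treated as independent events).
        """
        independent_failure = P_DEVICE_FAIL ** n_devices
        n_pairs = n_devices * (n_devices - 1) // 2
        no_interaction_failure = (1 - P_INTERACTION) ** n_pairs
        return 1 - (1 - independent_failure) * no_interaction_failure

    if __name__ == "__main__":
        for n in range(1, 9):
            print(f"{n} device(s): P(failure) = {system_failure_probability(n):.5f}")

With these illustrative numbers, the overall failure probability is lowest at two devices and rises again as more are added: past some point, each additional "safety" device hurts more than it helps.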

Readership

Normal Accidents is a very widely cited book, with more than 1,000 citations in the Social Science Citation Index and Science Citation Index as of 2003.[7] A German translation of the book was published in 1987, with a second edition in 1992.[8]


References

  1. Whitney, Daniel E. (2003). "Normal Accidents by Charles Perrow" (PDF). Massachusetts Institute of Technology.
  2. Clearfield, Chris; Tilcsik, András (2018). Meltdown: Why Our Systems Fail and What We Can Do About It. New York: Penguin Press. ISBN 9780735222632.
  3. Perrow, Charles (1984). Normal Accidents: Living with High-Risk Technologies. New York: Basic Books. p. 5.
  4. Pidgeon, Nick (22 September 2011). "In retrospect: Normal Accidents". Nature. Retrieved 4 September 2019.
  5. Perrow, C. (1982). "The President's Commission and the Normal Accident". In Sills, D.; Wolf, C.; Shelanski, V. (eds.), Accident at Three Mile Island: The Human Dimensions. Boulder: Westview. pp. 173–184.
  6. Sovacool, Benjamin K. (August 2010). "A Critical Evaluation of Nuclear Power and Renewable Electricity in Asia". Journal of Contemporary Asia. 40 (3): 381.
  7. Sagan, Scott D. (March 2004). "Learning from Normal Accidents" (PDF). Organization & Environment. Archived from the original (PDF) on 2004-07-14.
  8. See the book's entry in the German National Library: http://d-nb.info/920805043