Risk Perception

Charles Perrow’s award-winning book Normal Accidents focuses on the inevitability of failures in high-risk technologies. Perrow is a respected scholar of large organizations and high-risk systems (Perrow 2011b, 52). While his work ranges from the origins of capitalism to terrorist attacks and other disasters, Perrow has much to say about nuclear systems in particular, namely that “nothing we build […] has as much catastrophic potential as nuclear power plants” (Perrow 2011a).

In the last 50 years, the nature of industrial disasters has changed. Our industrial systems have grown so complex that we can no longer anticipate how multiple failures will interact (Perrow 1999, 11). Nuclear power plants are extremely complex and intricate systems. They contain many automatic safety devices (ASDs), with cooling systems among the most critical (Perrow 1999, 20). Each ASD is a failsafe that protects against some component malfunctioning or failing. The problem is this: writing about one particular ASD, Perrow says “It is expected to fail once in every fifty usages, but on the other hand, it is seldom needed” (Perrow 1999, 20). Each layered component has its own failure rate, and if there are 20 such layers, each expected to fail only once every few decades, the time will eventually come when several failures align. This is the “interaction of multiple failures” that Perrow theorizes (Perrow 1999, 23), and the rough calculation below suggests why.
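As a back-of-the-envelope illustration (not Perrow’s own calculation), take his 1-in-50 figure as the chance that a single safety device fails when it is actually needed, assume 20 such layered devices, and assume their failures are independent; every number and assumption beyond the 1-in-50 rate is mine. Even under these generous assumptions, the chance that two or more devices fail on the same demand is far from negligible.

```python
# Illustrative sketch only: Perrow cites a 1-in-50 failure rate for one device;
# the 20-layer count and the independence assumption are added for illustration.
p = 1 / 50   # chance a single safety device fails when called upon
n = 20       # assumed number of layered safety devices

# Probability that at least two devices fail on the same demand,
# assuming failures are independent (real failures often share causes).
p_none = (1 - p) ** n
p_exactly_one = n * p * (1 - p) ** (n - 1)
p_two_or_more = 1 - p_none - p_exactly_one
print(f"P(two or more of {n} layers fail together) ~ {p_two_or_more:.3f}")
# Prints roughly 0.060: about a 6% chance per demand, before counting
# common-cause failures that make simultaneous breakdowns even more likely.
```

The point is not the exact number but the shape of the argument: individually rare failures become collectively routine once enough of them are stacked together.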

Probabilistic vs. Possibilistic

I have always found probability theory to be deeply counterintuitive, and its connection to high-risk systems deserves attention. Our current attitude towards risk is probabilistic, an approach that de-emphasizes preparation for improbable events (Perrow 2011a). A pertinent example is the common misconception that, because the probability of flipping heads on a fair coin is 1/2, every other toss must come up heads. In reality, you could easily flip five heads in a row, since probabilities only describe what happens over a very large number of flips, yet we might call such a streak “defying the odds”. This intuition isn’t entirely wrong; certain scenarios are indeed more likely than others. But in high-risk situations, it is dangerous to assume that an event with probability 1/n will occur only once every n trials. There is no way to ensure that failures in multiple connected or unconnected systems will not line up and create disasters like Fukushima. Sociologist Lee Clarke proposes a “possibilistic” approach to risk instead, recognizing that supposedly impossible things happen frequently (Perrow 2011a).
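To make the coin-flip intuition concrete, here is a small simulation of my own (not drawn from Perrow or Clarke). Any particular run of five tosses has only a 1/32 chance of coming up all heads, yet across a few thousand tosses such “improbable” streaks appear again and again.

```python
# Illustration only: a fair-coin simulation showing that "improbable" streaks
# occur routinely once there are enough opportunities for them to happen.
import random

random.seed(1)  # fixed seed so the sketch is reproducible
tosses = [random.random() < 0.5 for _ in range(2000)]  # True = heads

streaks = 0
run = 0
for heads in tosses:
    run = run + 1 if heads else 0
    if run == 5:   # count each non-overlapping run of five heads
        streaks += 1
        run = 0
print(f"Runs of five heads in 2000 tosses: {streaks}")
# Typically prints a count on the order of thirty: the 1/32 event is "rare"
# per attempt, but thousands of attempts make it an everyday occurrence.
```

The same logic scales up to safety systems: given enough components and enough operating hours, the unlikely combination eventually gets its chance.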

Incomprehensibility

In addition to the intricate coupling of high-risk technologies, another common component of the normal accident is incomprehensibility. Perrow uses this idea to argue that the operators at Three Mile Island (TMI) could not possibly have been at fault because they were unaware of what was occurring (Perrow 1999, 23). In describing the events at TMI, with their numerous warning signs and failed systems, Perrow repeatedly writes that the operators simply could not have known (Perrow 1999, 19-30). We could always say “well, they should have known”, but what does that statement really mean? How could they have known? Pressing the question leads down a rabbit hole, into a never-ending cycle of impossibility. As Nancy Leveson aptly puts it, it is easy to see in retrospect what could have been done, but “‘impossible to go back and understand how the world looked to somebody not having knowledge of outcome’” (Chernov and Sornette 2016, 131).

Nuclear systems are incomprehensibly complex, and components fail frequently. Usually such blips go unnoticed until the next incident, when they align with a host of other failures. After an accident, investigations can reveal conditions that no one would have considered a problem until a large-scale disaster occurred (Perrow 1999, 16). Furthermore, technological fixes usually give us a false sense of security that only increases risk (Perrow 1999, 11). For example, warning lights and indicators add another layer meant to reveal a failure when it happens (Perrow 1999, 21). However, if a sensor itself malfunctions, a problem that no one knew about or thought to check can become fatal.

Fukushima as Normal

Under Perrow’s categorization, accidents like Fukushima are to be expected; they are “normal” accidents for many reasons (innate human disorganization, the increasing complexity of systems, and our reliance on technology to fix problems and reduce risk). Organizational disabilities are inherent and to be expected in human systems; when coupled with high-risk technologies, however, disastrous outcomes can occur (Perrow 1999, 10). “Normal accidents” are precisely the accidents that occur within systems of “interactive complexity and tight coupling” (Perrow 1999, 5). At Fukushima, the intricate web of multiple failures comprised the earthquake, the tsunami, and every individual failure that made the nuclear meltdown what it was. “‘There is no emergency plan to protect the public when there is both an earthquake and a nuclear accident’, said Green Action head Aileen Mioko Smith” in the early days of the disaster (Johnston 2011). The disaster was truly an extraordinary combination of events: the earthquake was the largest ever recorded in Japan, and the accompanying tsunami was just as unexpected (Chernov and Sornette 2016, 132). That so many things went wrong at once is perfectly in line with Perrow’s analysis. Perrow certainly acknowledges mistakes, such as “official denial and secrecy, refusal to accept outside help”, among others (Perrow 2011b, 50); however, accepting that normal accidents are inevitable can improve our response when disaster hits.

Comparing Perrow’s assessment that failures like the Fukushima disaster are inevitable with sources like The Japan Times, which generally argue that the disaster should have been preventable, illuminates the lack of clarity surrounding nuclear power. Other sections of this website reference sources that dismiss the disaster and the subsequent response, saying that we should have known better. Hence there is no consensus on nuclear power; there is only the uncertainty that fuels fear and misunderstanding.