MY colleague Marlen Ronquillo's column on Wednesday, June 8 ("Geology-challenged pols: Curb your BNPP enthusiasm"), sketched out the main geological arguments against resurrecting the zombie Bataan Nuclear Power Plant (BNPP). These are that it is located on the slope of an at least hypothetically active volcano (Mount Natib), lies within potential striking distance of one definitely active volcano (Mount Pinatubo) and is in close proximity to a significant earthquake fault (Lubao Fault).

As I said in last week's column on this topic, I do not think the "geological risk" argument against the BNPP is the strongest one there is, though it is not entirely invalid. There are credible arguments in both directions; i.e., that the risk is significant, or that it is not significant. Thus, the only conclusion that can be drawn with certainty at this point is that the risk is not zero, but that it is also undefined. Geologic events — earthquakes, volcanic eruptions — are notoriously unpredictable, and so "uncertain but not zero" might be the best anyone can do.

So why am I bringing it up here? It serves a useful purpose, that's why. Marlen's argument, and more particularly the negative reactions to it, highlight a huge shortcoming in nuclear policy that researchers have only recently identified. To properly understand this, we need to back up a little and discuss the theory that inadvertently illustrates it best.

'Normal' accidents

The idea of "normal" accidents, or "system" accidents, was first presented in a 1984 book by Yale University sociologist Charles Perrow, whose work was inspired by the 1979 Three Mile Island nuclear accident. Perrow's theory, which has in the years since been demonstrated numerous times by real-world events, is that some technological systems are so complicated that multiple, unexpected failures are inherent to them, and cannot be designed out of them.


Perrow explained that there are three criteria that identify systems that will inevitably fail in some way: One, the system is complex; two, the system is tightly interconnected with one or more other complex systems; and three, the system has the potential for catastrophic failure. The only way to minimize the impact of inevitable significant failure in these systems is to subject them to substantial redesign, in most cases on a continuous basis. The only way to avoid or prevent the significant failure is to abandon the technology entirely. Examples provided by Perrow of systems that will experience "normal accidents" include air traffic, marine traffic, chemical plants and refineries, hydroelectric dams, and especially, nuclear power plants.

The Chernobyl nuclear disaster was a ‘normal’ accident. UN PHOTO

One of the key points stressed by Perrow is that the human management structure of complex systems is an inseparable part of those systems' complexity. Even systems that operate relatively autonomously ultimately do so for human purposes, thus some level of human interaction is always present. What Perrow found is that the human factor in normal accidents is always either the point of failure, or the catalyst that turns one or more small failures into a catastrophic failure.

The theory of normal accidents has been an important tool for anti-nuclear activists since it was presented almost 40 years ago, but of course, the same knowledge is available to nuclear advocates as well, and so they have had just as long to polish counter-assertions to it. Thus, while people like me argue that every incident or accident in nuclear power systems has been historically unique because the systems are inherently prone to unpredictable failures, the advocates, drawing on the same knowledge, argue that the same incident doesn't occur twice because "lessons are learned."

Socio-technical systems

It is only relatively recently that researchers have been able to flesh out why that sort of conflict persists, in two papers published in 2015 and 2019. The first, by Stanford University professor François Diaz-Maurin and Professor Zora Kovacic of the Universitat Oberta de Catalunya, published in the journal Global Environmental Change in March 2015, analyzed the differences between nuclear power narratives and outcomes.

In other words, assertions about economic factors, safety, reliability, and so on, whether they are presented by industry parties, governments or advocates at various levels prior to the construction and operation of nuclear plants are not borne out in practice. The difference is consistent in that it always exists, though it varies in details from case to case.

In their words, what the researchers found is "...a systemic inconsistency between the way in which the story about nuclear energy is told and the experience gained after implementing nuclear energy according to the story. This inconsistency is due to the incompatible levels of observation used by different social actors endorsing different perspectives. The implementation of nuclear power has been based on the engineering view, focusing on the functioning of the nuclear power plant considered in abstraction from the wider implications of the adoption of this technology on the environment, on the economy and on society." This leads them to conclude that "the controversy over nuclear power may be treated as a problem of contrasting beliefs and normative values in clear disjunction from experience."

The second paper, authored by a dozen scientists from the Institute of Nuclear Energy Safety Technology at the Chinese Academy of Sciences in Hefei, was published in the journal PNAS in March 2019, and took a look at nuclear safety factors in the surge of interest in nuclear power in developing countries, primarily in Asia.

Using a different approach, they end up in the same place as the first research team: Advocacy for nuclear power, whether individual or institutional, is significantly (but of course, not entirely) based on beliefs and values, and this is even more pronounced in developing countries, largely because they do not have experience with the technology and are reliant on the prospective suppliers for the supporting narrative. Bear in mind that this is coming from a group of scientific experts who are by nature and profession advocates for nuclear energy: "Nuclear power plants are complex socio-technical systems, and their safety has never been fully defined. We argue that social aspects, rather than just technical measures, must be involved to ensure nuclear safety," they write.

And that brings us back to the geological risk factor for the BNPP, where the arguments on both sides are revealed as matters of belief based on slightly different things. Those who see a significant risk believe there is one because of the fault line and the nearby volcanoes; those who see no risk believe there is none because those features have posed no demonstrated threat.

If the BNPP were a warehouse, or a shopping mall, or a candy factory, then the geological risk perhaps could be dismissed out of hand. But the BNPP is a nuclear power plant, a system that will, if it is operated, at some point experience a normal accident. We can hope that accident would not be a very serious one, but it is inevitable. Failure is the only option, and the only uncertainty is its scale and scope. That being the case, the geological risk becomes relevant and strengthens the case against the BNPP, even if only to a modest degree.

[email protected]

Twitter: @benkritz