The safety of the nuclear power industry: is there an alternative?

The safety of the nuclear power industry is its number one problem. After the Kyshtym accident back in 1957, the Three Mile Island disaster in 1979, Chernobyl in 1986, and Fukushima in 2011, it became clear that what is needed today is truly absolute, total safety with respect to the possibility of major radiation accidents.

Their consequences are so grave that it is a crime to keep debating how "unlikely" such events are. Accidents at nuclear power plants must be ruled out entirely: not in the sense of "acceptable, permissible risks", whose probabilities scientists have estimated one way or another at different times, but excluded completely, regardless of the severity of the possible consequences.

The debate in the scientific community about accident risks at nuclear facilities in general, and at nuclear power plants in particular, has been going on for a long time, and the point is not how the risks or strategies are formulated, but the substance. It would seem that if a calculation put the probability of a major accident at a given reactor at, say, once per million years, there would be nothing to worry about.

But this is not so. Probabilistic calculations rest on input assumptions that do not hold in practice, which is why their results meet with justified distrust among specialists and doubt among much of the public. For example, when estimating the probability of failure of any element of a reactor's highly complex technical systems, each such "technology failure" is usually treated as independent. In practice, however, failures quite often come in whole chains: a single common cause, such as a shared power supply or the same maintenance error, can defeat several "independent" barriers at once.
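The gap between assumed independence and real, correlated failure chains can be shown with a toy calculation. The probabilities below are purely illustrative and are not taken from any actual reactor data:

```python
# Toy numbers (illustrative only): two safety channels, each failing
# in a given year with probability p.
p = 1e-3

# If the failures are treated as independent, the joint failure
# probability is simply the product of the individual ones:
# "once per million years".
independent = p * p  # about 1e-06

# If a common cause (a shared power supply, the same maintenance
# error) makes the second failure likely once the first has occurred,
# the joint probability is p times that conditional probability.
conditional = 0.5            # assumed conditional failure probability
correlated = p * conditional # about 5e-04

# The "independent" estimate understates this risk 500-fold.
print(correlated / independent)
```

The exact numbers do not matter; the point is that the reassuring once-per-million-years figure is an artifact of the independence assumption, not a property of the plant.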

Therefore, the small probability of a major accident built into the calculations proves little; worse, it creates an illusory sense of well-being that the industry has never actually earned. The probability of equipment failure can be reduced by adding redundant components and complicating the logic of the control circuits with new technical elements. But even then the failure probability drops only on paper, and a side effect appears: the possibility of "false commands" issued by the control system itself. Hence there is no reason to trust a small calculated probability of major accidents.
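The trade-off described above can be sketched numerically, again with hypothetical probabilities chosen only for illustration: each added redundant channel lowers the chance that all channels fail together, but raises the chance that at least one of them issues a spurious command.

```python
# Hypothetical per-channel probabilities, for illustration only.
p_fail = 1e-3      # a channel fails to act when demanded
p_spurious = 1e-4  # a channel issues a false command on its own

for k in (1, 2, 3):  # number of redundant channels
    # Protection is lost only if all k channels fail together.
    all_fail = p_fail ** k
    # But any single channel can issue a false command.
    any_spurious = 1 - (1 - p_spurious) ** k
    print(f"k={k}: all fail ~{all_fail:.1e}, false command ~{any_spurious:.1e}")
```

Redundancy drives one number down while pushing the other up, which is exactly why the formal improvement in reliability can be misleading.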

That is exactly what M. Herzenstein, doctor of physical and mathematical sciences, and V. Klavdiev, candidate of physical and mathematical sciences, argued in 1990. From this scientific position they subjected to serious scrutiny the categorical claims of the country's leadership that a repeat of the Chernobyl scenario at the nuclear power plants of the USSR was completely excluded. The essence of their expert conclusion was that, at the time, reactor power could be controlled only by rods automatically inserted into the core.

Moreover, it is important to emphasize that the reactor was kept in working condition almost all the time on the verge of an explosion: the fuel was held at the critical mass at which the chain reaction is in equilibrium. But could one, back in 1990, rely entirely on automation? Paradoxical as it may seem, the Pygmalion effect sometimes operates in complex systems: in practice, the line between mental processes and real circumstances becomes blurred.

This means that in a critical situation the system can behave unpredictably, not as its creators intended. There is always a risk that sophisticated equipment, in the hands of personnel firmly convinced of some unproven, technically unconfirmed belief, will involuntarily be operated in a way that makes that belief come true. At first glance this looks abnormal, but practice sometimes shows that a person's doubts are not in vain. What absence of risk can one speak of in such a situation?

The scientists had other considerations at the time as well. Hypothetically, these came down to possible competition for the market between thermal power plants and nuclear power plants. As requirements on combustion products tightened, thermal plants were largely losing out to nuclear ones, and this did not rule out attempts by one side to discredit the other in the struggle for the sales market. In such a struggle, attention to safety becomes a thankless task, although it is precisely this factor that should come first.

Vulnerabilities of Nuclear Power Plants

The most vulnerable element of a nuclear power plant is its automation. In the simplest terms, it is a control system in which a signal about the neutron flux level in the reactor drives the movement of the rods. When the level rises, the rods go down, neutron absorption increases, and the process returns to its original state. This feedback is called negative. But if the sign of the feedback is flipped, making it positive, the rods will go up, neutron absorption will drop, and the reactor will run away. This is quite achievable through computer sabotage prepared from the outside.
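A minimal sketch shows why the sign of the feedback is everything. This is a one-variable toy loop, not a reactor model: with negative feedback the flux settles back to the setpoint, while the flipped sign amplifies any deviation without bound.

```python
def simulate(negative_feedback, steps=30, gain=0.5):
    """Toy control loop: 'flux' is nudged each step by a rod
    correction proportional to its deviation from the setpoint."""
    setpoint = 1.0
    flux = 1.1  # start slightly above the setpoint
    for _ in range(steps):
        error = flux - setpoint
        if negative_feedback:
            flux -= gain * error  # rods absorb more when flux is high
        else:
            flux += gain * error  # flipped sign: deviation grows
    return flux

print(round(simulate(True), 6))   # settles at 1.0
print(simulate(False) > 100.0)    # True: the deviation has run away
```

The same controller, with only the sign of one term changed, turns a self-stabilizing system into a self-destroying one, which is what makes this kind of sabotage so dangerous.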

To that end, one could easily install a microprocessor in the control system and use it to affect the operation of the rods remotely. Technical means of protection must be developed in good time, and with a margin, to prevent such penetration into the equipment of nuclear power plants. The more autonomous the control mode, the more demanding the protection system must be.

Consequently, in the early 1990s scientists actively argued this position in order to prove that the most stringent possible requirements must be imposed on the safety systems of nuclear power plants. It was after this expert conclusion that the whole of domestic science, including programmers and the nuclear engineers themselves, joined the effort to introduce the most advanced technologies for nuclear and energy safety at the plants. The problem was then addressed in the early 1990s, at first even with the standard equipment that existed at most nuclear power plants of the former Soviet Union.

Time moved on, and many began to conclude that humanity had drawn the necessary conclusions and learned to keep the atom under control. But the fire aboard the AS-31 Losharik deep-sea nuclear-powered station and the recent incident at Nenoksa have shattered this theory once again. On July 1, 2019, 14 submariners were killed as a result of a fire aboard the AS-31 at the Northern Fleet training range in the Kola Bay of the Barents Sea.

Another emergency occurred in early August 2019, during a test of new equipment at the Navy missile range in the Arkhangelsk region: five people died on the spot, two died of their injuries in hospital, and four more victims received high doses of radiation. Severodvinsk was exposed to a short-term spike in background radiation. The explosion in the White Sea occurred on an offshore platform and was recorded by the Norwegian seismological center.

As we can see, we have not learned to put human life first, ahead of the perceived need for further experiments with the atom. How many more graves will have to be dug because of such experiments? How many broken lives will be added to the list of those whom experiments with the atom have left disabled or orphaned?

And it makes no difference whether the dead and injured were at a nuclear power plant, at a nuclear missile test site, or aboard a deep-sea nuclear station.

The causes may differ, but the root of the evil is the same for everyone: indifference to human life against the background of an all-out struggle for leadership in the development of the nuclear industry. How can one not recall the events of 33 years ago at the Chernobyl nuclear power plant, when "the experiment at any cost" became the driving force in the chain of catastrophic consequences of the Chernobyl accident. There can be no excuses for those who have not drawn the necessary conclusions.