Taming the Bomb: Inexorably Rising Costs
The history of technology shows that nuclear power for energy supply is a hopelessly outdated technology / Episode 2
The first episode of this 12-part series dealt with the turbulent development of the natural sciences at the beginning of the last century, up to the discovery of nuclear fission. In this review of the history of science and technology, motivated by questions of energy policy, I continue with the beginnings of the peaceful use of atomic energy, the fundamental problems that arose, and the declining importance of this energy source over the last thirty years.
With the Second World War, scientific development shifted from Europe to America. Many European scientists fled the Nazis and emigrated to the USA. They played a decisive role in the development of the American atomic programme.
The atomic bomb and nuclear energy
The Hungarian physicist Leó Szilárd persuaded Albert Einstein to write a letter to US President Roosevelt in August 1939, warning that the Germans might build an atomic bomb. After tentative beginnings, the "Manhattan Project" for constructing atomic bombs was launched in 1942; more than 150,000 people worked on it under the strictest secrecy until 1945. On 2 December 1942 in Chicago, a nuclear reactor went "critical" for the first time, i.e. the nuclear fission chain reaction was able to sustain itself (neither dying away nor running out of control). However, the research reactor was not designed for energy production; its maximum heat output was a mere 200 watts. The goal was to produce weapons-grade material. For this purpose, complex facilities were also developed for enriching particular uranium isotopes and for separating plutonium.
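What "critical" means can be shown in a few lines of arithmetic. The sketch below is my own simplified illustration of the effective multiplication factor (conventionally called k), not a model of the Chicago pile: each generation of fissions produces k times as many neutrons as the one before.

```python
# Toy illustration of the effective multiplication factor k (my own
# simplified model, not a description of the Chicago pile): each
# generation of fissions produces k times the neutrons of the last.

def neutron_population(k, n0=1000.0, generations=10):
    """Neutron count per generation: n_(i+1) = k * n_i."""
    history = [n0]
    for _ in range(generations):
        history.append(history[-1] * k)
    return history

for k in (0.9, 1.0, 1.1):  # subcritical, critical, supercritical
    print(k, [round(n) for n in neutron_population(k, generations=5)])
# k < 1: the chain reaction dies away; k = 1 ("critical"): it sustains
# itself; k > 1: the neutron population grows exponentially.
```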
After the first successful test of an atomic bomb in New Mexico on 16 July 1945, the first nuclear weapons were used against Hiroshima and Nagasaki on 6 and 9 August 1945, killing a total of some 200,000 people, half of them within the first four months after the bombs were dropped. The Soviet Union detonated its first atomic bomb in 1949, Great Britain in 1952 and France in 1960 - all in above-ground tests. China conducted its first nuclear weapons test in 1964 using Soviet technology. While the first nuclear weapons were based on the fission of heavy atomic nuclei (uranium or plutonium), the nuclear powers soon sought to achieve even greater energies by fusing light atomic nuclei. The USA detonated its first hydrogen bomb in 1952, the Soviet Union followed in 1953, Great Britain in 1957, China in 1967 and France in 1968.
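A rough comparison with standard textbook values (my own aside, not figures from the article) shows why fusion promised "even greater energies": per unit of mass, fusing light nuclei releases several times more energy than splitting heavy ones.

```python
# Energy released per nucleon, using standard textbook values:
# one U-235 fission releases about 200 MeV spread over 235 nucleons,
# one deuterium-tritium fusion about 17.6 MeV over 5 nucleons.
fission_u235 = 200.0 / 235  # ~0.85 MeV per nucleon
fusion_dt = 17.6 / 5        # ~3.5 MeV per nucleon
print(f"fission: {fission_u235:.2f} MeV/nucleon, "
      f"fusion: {fusion_dt:.2f} MeV/nucleon")
```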
It was not until December 1951 that electricity was generated for the first time with the help of a nuclear reactor: the EBR-1 research reactor supplied heat to a conventional steam turbine, which drove a generator that covered little more than the plant's own needs. The first nuclear power plant for commercial electricity generation went into operation in 1954 in Obninsk near Moscow, with an electrical output of 5 MW.
Two years after the successful first electricity production with the EBR-1, US President Eisenhower announced the international "Atoms for Peace" programme on 8 December 1953. This opened up the nuclear programme, until then kept under the strictest secrecy, to international civilian cooperation. Eisenhower's initiative was an effective propaganda measure in the Cold War: the accelerated nuclear armament of the USA, meant to compensate for the Soviet numerical superiority in conventional weapons, was combined with an export offensive for the civilian benefit of humanity. In the years that followed, there were intense struggles in various countries over which reactor technology should prevail, and military-strategic considerations played a role that should not be underestimated.
In the enthusiasm over the groundbreaking findings of nuclear physics, a rosy picture of a coming "atomic age" was painted, especially in the aftermath of the international Geneva Atomic Energy Conference of 1955. Nuclear-powered ships, cars and aeroplanes, mini-reactors for heating buildings and industrial plants, a chemical industry revolutionised by radiation chemistry, the irrigation of deserts, the opening up of the Arctic and the nuclear blasting of new shipping canals were supposed to ensure almost inexhaustible prosperity. Electricity would soon become "too cheap to meter", Admiral Lewis L. Strauss, chairman of the US Atomic Energy Commission - which coordinated the development of both military and civilian nuclear technology - declared in 1954.
I can well remember this spirit of optimism in the sixties. My father moved from depressing post-war Germany to Italy in 1960 as a young physicist with his wife and four children to contribute to the construction of a united Europe with the development of nuclear energy at the EURATOM research centre. However, after a few years, the European reactor project was caught between the millstones of national interests, in which military ambitions played a not insignificant role – but that is another story.
The novel force brings two novel problems
By the end of the 1960s, only large nuclear power plants for electricity generation remained of these expectations. Among various competing reactor design principles, "light water" technology had prevailed, using low-enriched uranium as fuel and ordinary water as coolant and moderator. (In natural uranium, the share of the fissile isotope U-235 is 0.7%; for low-enriched uranium it must be raised to 3 to 5%, while nuclear weapons require highly enriched uranium of over 20%, typically around 90%.) Other reactor types continued to be developed for decades as "future technologies" but were gradually abandoned. To this day, light water reactors have remained the preferred design for nuclear power plants. The reactor heats water, which is then used to generate electricity with conventional steam turbines and connected generators - multi-stage steam turbines had been perfected from the end of the 19th century onwards, based on a principle known since antiquity. Around two-thirds of the heat released in the reactor core is lost to the cooling system.
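The enrichment figures above translate into a simple mass balance. The following is my own sketch with assumed, typical concentrations (the 4% product and the 0.25% left in the depleted tails are illustrative values, not figures from the article):

```python
# Two-component mass balance for uranium enrichment (my own sketch;
# the concentrations are typical assumed values, not from the article).

def feed_per_kg_product(x_feed=0.007, x_product=0.04, x_tails=0.0025):
    """Kilograms of natural uranium feed per kilogram of enriched product.

    Mass balance: F * x_feed = P * x_product + (F - P) * x_tails
    => F / P = (x_product - x_tails) / (x_feed - x_tails)
    """
    return (x_product - x_tails) / (x_feed - x_tails)

# With 0.7% U-235 in natural uranium, 4% in the product and 0.25% left
# in the depleted tails, one kilogram of reactor fuel requires about
# 8.3 kilograms of natural uranium.
print(f"{feed_per_kg_product():.1f} kg feed per kg product")
```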
In contrast to military use, civilian use of the enormous energies released during the fission (or fusion) of atomic nuclei had to come to grips with two novel problems:
First, the permanent shielding of intense radiation and the handling of radioactive material on a large scale.
Second, the stable maintenance, and simultaneous limitation, of a potentially explosive chain reaction whose damage potential exceeded that of all previous technologies.
Only gradually did it become clear how these two sets of problems reinforce each other. As for nuclear fusion, it has to this day not been possible to sustain the reaction in a controlled manner.
Nobel laureate Marie Curie discovered several naturally occurring radioactive elements around the turn of the twentieth century and eventually died from the effects of radiation. Only over the decades did the extent of the danger posed by radiation, and the need for extensive protective measures in handling radioactive substances, become clear. Marie's daughter Irène Joliot-Curie, for her part, received a Nobel Prize for creating new, often intensely radioactive isotopes that do not occur in nature. With the industrial use of nuclear energy, large quantities of low-level radioactive material began to be extracted from the ground for uranium production. Above all, however, the reactors produced and continue to produce considerable quantities of radioactive isotopes that do not occur naturally - and not only in the actual "nuclear fuel". Under the radiation inside the reactor, conventional components also become radioactive through nuclear transformation and can change their properties as a result. This not only creates additional radioactive waste; it also necessitates more frequent and far more costly inspections and repairs of a kind unknown in conventional plant construction. The initial hope that the highly radioactive material produced during operation could be rendered harmless by further nuclear transformations could not be realised at reasonable cost. Radioactive waste must be stored for hundreds of thousands of years in such a way that neither radiation nor radioactive substances can escape and that it is also protected from unauthorised human access - to date, there is no solution for this.
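The timescales involved follow directly from the exponential decay law. As a back-of-the-envelope illustration (my own, with the half-life of plutonium-239, roughly 24,100 years, as the only outside input):

```python
import math

# Back-of-the-envelope decay arithmetic (my own illustration; the
# half-life of Pu-239, roughly 24,100 years, is the only outside input).

def years_to_decay(half_life_years, reduction_factor):
    """Years until activity falls by the given factor: N(t) = N0 * 2**(-t/T)."""
    return half_life_years * math.log(reduction_factor) / math.log(2)

# A thousandfold reduction takes about ten half-lives - for Pu-239
# roughly 240,000 years, the order of magnitude behind the storage
# periods mentioned above.
print(f"{years_to_decay(24_100, 1000):,.0f} years")
```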
Unstoppably rising costs
The more that became known about the harmfulness of radiation, the greater the expense for shielding, successive cooling circuits and monitoring. At the same time, awareness grew of the danger posed by an uncontrollable chain reaction, with massive heat generation and the release of the radioactive inventory. While the devices for controlling, shutting down and emergency cooling were still rudimentary in the first small experimental reactors, more and more weak points were discovered over time - and ever more elaborate safety mechanisms were developed: multiple systems to slow down the chain reaction, to cool the reactor core long after shutdown, to retain radioactive gases and liquids escaping from the reactor itself, to prevent accidental or malicious external interference, to monitor the facilities and their employees... The possible chain reactions of failures soon occupied nuclear physicists more than the chain reactions of nuclear fission. Thousands of highly qualified researchers spent their entire professional lives on these safety issues - and with the decline of nuclear energy, it became increasingly difficult to find highly motivated, capable specialists to manage the remaining risks. (As a physicist myself involved in the seventies in unravelling far too optimistic assumptions, I soon found it more important to look at alternatives.)
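Why redundancy multiplied the number of systems can be seen in a toy calculation. The sketch below is my own idealised illustration with made-up probabilities; it is precisely the breakdown of its independence assumption that produced the "chain reactions of failures" mentioned above.

```python
# Idealised sketch: n redundant safety systems, each failing
# independently with probability p, all fail together with p**n.
# (p and n are made-up illustrative values.)

def common_failure_probability(p, n):
    """Probability that all n independent redundant systems fail at once."""
    return p ** n

for n in (1, 2, 3):
    print(f"{n} system(s): {common_failure_probability(1e-3, n):.0e}")
# In practice, common causes (fire, flooding, shared design faults)
# break the independence assumption - one reason the "chain reactions
# of failures" kept safety analysts busy.
```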
To spread the cost of these increasingly elaborate safety systems, ever larger reactors were built, which in turn increased the scale of the possible damage and the demands on the materials. The size of the power plants also made it impossible to put the waste heat to commercial use on any significant scale. The more that was known, the more expensive the nuclear power plants became; in most countries, construction costs rose significantly over time. Only South Korea succeeded in slightly reducing costs in the last years before the Fukushima accident, through standardisation and a "steady regulatory environment". There is little experience with decommissioning because of the long decay times involved, but in all completed projects the dismantling costs exceeded the construction costs, as far as is known. Because, despite all efforts, the risks of a reactor accident remain almost incalculable - extreme damage at a low probability of occurrence - governments promoting nuclear energy have strictly limited the liability of operators through special international conventions and national laws. (In Germany, for example, nuclear power plant operators are insured only for damages of up to €2.5 billion. In the US, as of 2011, operators had to cover a liability limited to $375 million, supplemented by $112 million from an industry fund. By comparison, the Japanese government estimates the costs of the Fukushima accident at more than €75 billion.)
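The mismatch between such liability caps and potential damages can be made concrete with expected-value arithmetic. In the sketch below, the accident probability is my own assumption chosen purely for illustration; only the €75 billion damage figure comes from the text above.

```python
# Expected-loss arithmetic (the probability is an assumed, illustrative
# value; only the damage figure is taken from the article).

def expected_annual_loss(p_per_reactor_year, damage_eur):
    """Expected loss per reactor-year = probability x damage."""
    return p_per_reactor_year * damage_eur

# Assuming a severe-accident probability of 1 in 100,000 reactor-years
# and the cited Fukushima damage of 75 billion euros:
loss = expected_annual_loss(1e-5, 75e9)
print(f"expected loss: EUR {loss:,.0f} per reactor-year")
# ~EUR 750,000 per reactor-year on average - but it is the enormous
# spread around this average that makes the risk effectively
# uninsurable.
```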
A major cost factor, the handling of radioactive waste, has not even been priced in yet. For a long time, hopes rested on the chemical "reprocessing" of spent fuel elements: in the 1960s it was still assumed that the costs of reprocessing and final storage could be covered by recovering both unused nuclear fuel and new fuel "bred" by nuclear transformation. The first reprocessing plants had been developed to extract plutonium for bomb production, but the processing of highly radioactive substances proved treacherous and expensive. Today, apart from Japan, only countries that develop nuclear weapons still operate such plants (France, Great Britain, the USA, Russia, India, North Korea). And although reprocessing can significantly reduce the amount of highly radioactive nuclear waste, the already larger amount of low-level radioactive waste increases many times over. How high the costs of permanent storage will be is difficult to estimate, because the problem remains unsolved. In most countries, the state has therefore assumed responsibility and financial liability for the handling of radioactive waste.
Doubts about the economic viability of nuclear energy began to arise in the 1970s. Since the meltdown at the Three Mile Island power plant in 1979, projects have been abandoned in droves. In 1984, at the end of a study on the economics of nuclear power, I concluded: "The nuclear power project has failed today", quoting S. David Freeman, a director of TVA, the largest electric utility in the US, who said in 1982 after abandoning most of his nuclear power projects: "The costs of nuclear power are not simply high, they are unpredictable. No capitalist in his right mind will build something for which he can't make a cost-benefit calculation because the costs are unknown."
Increasing risk awareness in free societies
Although in this series I want to highlight the often neglected scientific and technical contexts in my quick ride through the history of technology, it should not go unmentioned that the considerable - and costly - improvements in the safety of nuclear technology would hardly have been taken so far without a strong anti-nuclear movement and a broad socio-political debate in the more liberal Western countries about the consequences of using this technology. Alvin Weinberg, the inventor of the pressurised water reactor, high-ranking science manager and influential US government advisor, wrote in 1972 that "we nuclear people have made a Faustian bargain with society" and proposed that radioactive waste be guarded for thousands of years by a "military priesthood". What Weinberg still propagated as a positive vision - in line with the technological solutions to societal problems he advocated - was a horror vision for many, such as Robert Jungk, who had considerable influence in the international peace and environmental movement, not least with his book "The Nuclear State" (1977). The sociologist Ulrich Beck hit the nerve of the time with his concept of the "risk society" - the Chernobyl reactor catastrophe supplied the topical keyword for the preface of his much-cited social analysis, published in 1986. Thus a fundamental debate developed, reaching beyond nuclear energy, about centralised and decentralised concepts of technology, about economic interests, social models, and ideological and psychological motives.
Against this background, it may be surprising in retrospect how little the development of nuclear energy was rationally planned, and how incompletely political, economic or military interests explain the decisions of politics and business. That many of these were highly complex decision-making processes with dynamics of their own, forgotten today, can be read in detail in three exciting books on the history of technology: Irvin C. Bupp and Jean-Claude Derian, two high-ranking players in nuclear policy in the USA and France, presented their disillusioned view as early as 1978 in "Light Water - How the Nuclear Dream Dissolved". With the support of reactor safety expert Lothar Hahn, the historian Joachim Radkau presented in 2013 an updated version of his highly regarded 1983 dissertation on the "Rise and Fall of the German Nuclear Industry". And in 1998, the historian of technology Gabrielle Hecht, who now teaches at Stanford, described the history of nuclear power in France in "The Radiance of France: Nuclear Power and National Identity after World War II", based on extensive interviews with contemporary witnesses.
Next episode 3/12: Nuclear energy in a dead end