Climate science reveals: collective threat requires disruptive overhaul of the energy system
Episode 7 of the series: History of technology - Nuclear power for energy supply is a hopelessly outdated technology

Translation from the German original by Dr Wolfgang Hager
This series started by tracing the scientific revolution at the beginning of the 20th century, leading to the discovery of new worlds in the nanometre range. Among these, very early on, was nuclear fission and its military and civilian uses. After the early heyday and subsequent crisis of nuclear energy, the nano sciences have spawned disruptive technologies in many fields, not least microelectronics and digitalisation.
This episode is about the history of the new climate sciences and the growing realisation that we are endangering the Earth system with old fossil energy technologies amplified by new tools, and how we are faced with the daunting challenge of changing course in a short space of time.
In retrospect, the year 2021 could prove to be a watershed. It was the year in which climate change became a tangible reality for many people in politically powerful countries. That year, climate impacts, which are only slowly emerging from behind the fluctuations of weather statistics, increasingly touched even the industrialised countries that had thought of themselves as safe: the sudden onset of winter in Texas, Hurricane Ida on the US East Coast, the fires in California, the severe floods in Germany and Belgium, Cyclone Yaas in India and Bangladesh, the extreme rainfall in Henan, water shortages in Brazil. There had always been natural disasters, but in 2021 there were so many extreme events that more and more people grew worried. Something scientists had been warning against for decades became tangible: significant disruptions to the Earth's climate system.
Catastrophes caused by weather were once the sole responsibility of the gods. Man's contribution was limited to wars, genocide, occasional famine and, lately, poisoned water and air and the wholesale destruction of forests. Although the number of victims increased - World War II cost over 50 million lives - these were still spatially limited catastrophes. Threats to humanity as a whole were more likely to come from natural calamities such as pandemics (plague), volcanic eruptions (Little Ice Age, Pinatubo), asteroids (extinction of the dinosaurs) or (from the second half of the 19th century) extraterrestrial life. Even after the atomic bombs dropped at the end of the Second World War, man-made catastrophes still seemed regionally limited.
And immediate sensory experience still played a paramount role. If you could not see or feel the damage, it was probably harmless. As late as 1952, reporters were still observing atomic bomb tests near Las Vegas from only ten miles away, completely unprotected, and a lively atomic tourism developed around watching the roughly three-weekly bomb tests. Only years later did cancer statistics show unequivocally the harm done by radiation. Thus, in 1963, the US, the Soviet Union and the UK signed a treaty banning above-ground nuclear weapons testing. Slowly it became clearer that some dangers cannot be registered immediately with our senses and become apparent only after long delays.
Many see the beginning of the modern environmental movement in the 1962 book "Silent Spring", in which Rachel Carson warned of the consequences of the massive use of pesticides. It made clear to a wider audience that, beyond immediate poisoning, large ecosystems can be upset by human actions in ways that are initially imperceptible and delayed, for instance through the development of resistance.
Modern research methods revealed in growing detail that the Earth is a highly complex system that not only provides humans with a limited amount of resources, but can also be severely disturbed by increasingly intensive human activity. It was only through research into the laws governing atoms and molecules at the nano level that we began to understand interrelationships at the mega level of the Earth system - relationships that we, with the help of our hugely expanded technical possibilities, created in particular by the same nanosciences, are threatening to disrupt. This is particularly evident in the history of climate science.
The atmosphere at risk? Accurate measurements of radioactivity first made simple climate models possible
Around 1860, John Tyndall used the first laboratory radiation experiments to estimate the greenhouse effect of carbon dioxide, which, as we know today, made up around 280 ppm (parts per million), or 0.28 per mille, of the air before industrialisation. A large part of the spectrum of incoming solar radiation passes through the air layer relatively unhindered and warms the Earth's surface. However, the re-radiation of the longer-wave thermal radiation into space is slowed down by the carbon dioxide in the air, which warms up the Earth's atmosphere - much like a south-facing window, where the glass lets the sun in but retains the heat, so the room warms up. As early as 1896, the Swedish physicist and chemist Svante Arrhenius pointed out that the burning of coal and oil, i.e. the release of (fossil) carbon stored in the Earth's crust in the form of carbon dioxide, would lead to a warming of the atmosphere. But he assumed that this would only become relevant in a few centuries.
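The quantitative relationship is often illustrated today with a simple logarithmic approximation for the extra radiative forcing from CO2. The following back-of-the-envelope sketch is my illustration, not from the original article; it assumes the commonly cited fit of 5.35 ln(C/C0) W/m² and a climate sensitivity of roughly 0.8 K per W/m², and only shows the order of magnitude involved:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate extra radiative forcing (W/m^2) from a CO2 increase,
    using the widely cited logarithmic fit dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity_k_per_wm2=0.8):
    """Rough equilibrium warming in K for an assumed climate sensitivity
    of about 0.8 K per W/m^2 (an illustrative mid-range value)."""
    return sensitivity_k_per_wm2 * co2_forcing(c_ppm)

# Pre-industrial 280 ppm versus roughly 420 ppm today:
print(f"forcing: {co2_forcing(420):.2f} W/m^2")      # ~2.2 W/m^2
print(f"warming: {equilibrium_warming(420):.1f} K")  # ~1.7 K at equilibrium
```

The equilibrium warming this crude estimate yields for today's roughly 420 ppm is higher than the approximately 1.1 degrees observed so far, which is consistent with the inertia of the climate system discussed further below.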
More precise investigations only became possible with the help of nuclear physics. In 1946, Willard F. Libby, who had previously worked on uranium enrichment in the Manhattan Project, invented radiocarbon dating, for which he was awarded the Nobel Prize in 1960. It is not only valued in archaeology to this day, but above all became a central instrument in the study of climate systems. In the upper atmosphere, cosmic radiation leads to the formation of small amounts of the radioactive carbon isotope C-14, which decays with a half-life of 5730 years. As a result, the carbon dioxide in the atmosphere contains not only the stable carbon isotope C-12, but also a small, essentially constant proportion of C-14 - the same everywhere. The relatively rapid mixing of the Earth's atmosphere was demonstrated by the fact that the radioactive emissions from the atomic bomb tests could be measured across the globe with little delay. When carbon dioxide from the air is trapped anywhere, e.g. in wood or in tiny air bubbles in a glacier, the proportion of radioactive C-14 slowly decreases through decay. With his highly sensitive instruments, Libby succeeded in determining the ratio of C-14 to C-12 so precisely that the age of a sample could be determined from it. He calibrated the method on the tree rings of millennia-old sequoia trees.
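The underlying arithmetic is simply the law of radioactive decay. A minimal sketch of the principle (my illustration, not Libby's actual laboratory procedure):

```python
import math

T_HALF = 5730.0  # half-life of C-14 in years, as given above

def radiocarbon_age(c14_ratio_sample, c14_ratio_atmosphere=1.0):
    """Estimate a sample's age in years from its measured C-14/C-12 ratio,
    relative to the (assumed constant) atmospheric ratio, using
    exponential decay: N(t) = N0 * (1/2)**(t / T_HALF)."""
    fraction_remaining = c14_ratio_sample / c14_ratio_atmosphere
    return -T_HALF * math.log(fraction_remaining) / math.log(2.0)

print(radiocarbon_age(0.5))   # ~5730 years: one half-life has passed
print(radiocarbon_age(0.25))  # ~11460 years: two half-lives
```

In practice the atmospheric ratio has not been perfectly constant over the millennia, which is why Libby's calibration against tree rings of known age was essential.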
In 1957, Roger Revelle and Hans E. Suess demonstrated that the proportion of C-14 in the atmosphere was falling and explained this by the input of fossil carbon from the burning of coal and oil. Revelle assumed that this could lead to serious problems in about fifty years. Then, from 1958, Charles Keeling showed that the CO2 concentration in the atmosphere was growing steadily. Measurements continue to this day on remote Mauna Loa in Hawaii.

With the development of more sophisticated measurement methods and better drilling techniques in the Greenland ice, Hans Oeschger was then able to study climate history more precisely in the 1960s and 1970s and to develop the first climate models in an international network. During my studies in Bern, I prepared samples for his measurements for years. At the same time, the possibility of calculating more complex models on computers slowly developed.
Growing concern about the Earth system
The 1973 OPEC oil crisis put an abrupt end to the strong growth in oil consumption that had been underway since 1940. Car-free Sundays and price increases ensured that environmental and resource problems received greater attention among the general public - not only among us students, who the year before had closely followed and discussed the first UN Conference on the Environment in Stockholm and the Club of Rome report "Limits to Growth". It was all new at the time - fifty years ago. Above all, a lack of resources seemed set to curtail economic growth. The still very simple computer model in the report to the Club of Rome predicted a drastic decline in industrial production and food per capita after the turn of the millennium under a business-as-usual scenario.

Nuclear energy was touted after 1973 as the solution to resource problems and the threat of climate change. But if the growth model had to be greatly altered anyway to avoid collapse - or so it seemed to many - then climate change, while serious, lay much further in the future than the immediate dangers of nuclear energy...
Inspired by the first results of climate models and climate history, which showed that the biosphere apparently possesses self-regulating mechanisms that have kept various parameters (oxygen content, temperature, the salinity of the oceans...) amazingly constant since life arrived on land, the microbiologist Lynn Margulis and the chemist and physician James Lovelock developed the "Gaia hypothesis" in 1974, which is still hotly debated today. Named after the Greek earth goddess, this theory postulates that the Earth as a whole can be understood as a kind of living organism. The discussion about it has sharpened the view that the Earth with its biosphere is obviously a highly complex system with self-regulating mechanisms that ensure dynamic states of equilibrium, which are essential for life on Earth. But these equilibria can be destabilised by excessive disturbances beyond tipping points and suddenly shift into completely different states. The symbol of this new way of looking at things was the famous photograph of the Earth taken from lunar orbit by the Apollo 8 mission in 1968. Many people were deeply impressed by this first picture of the Earth as a whole.

Another chapter of climate research that has contributed significantly to our perception of the Earth as a system is the story of the ozone hole. In 1970, the meteorologist and chemist Paul Crutzen had discovered that nitrous oxide (N2O), produced mainly by soil bacteria in heavily fertilised soils, rises upwards and depletes the ozone layer in the stratosphere. While ozone - a molecule with three instead of two oxygen atoms - is a poison to living things at ground level, the ozone layer discovered in 1913 at an altitude of 15 to 35 km protects life on Earth from excessively strong ultraviolet solar radiation. In 1974, Rowland and Molina discovered that CFCs (chlorofluorocarbons), which were industrially produced in ever greater quantities and used primarily in refrigeration systems and spray cans, deplete the ozone layer much more rapidly in very similar catalytic processes. After initial fierce resistance from the chemical industry, a gradual international ban on CFCs was agreed as early as 1987 in the Montreal Protocol. The hole in the ozone layer, which was at times dangerously large, has since shrunk considerably. This, the greatest success of a global environmental agreement to date, was possible because - in contrast to the greenhouse gas problem - the serious damaging effects were immediately visible and the costs to industry clearly manageable.
In 1982, forty years ago and thirty years after the first electricity was generated with nuclear power, millions of people demonstrated in Europe and the USA against the threat of nuclear war. Unlike the anti-Vietnam War movement fifteen years earlier, this was about humanity and the Earth as a whole. The book "The Fate of the Earth" by Jonathan Schell, which saw the main danger of nuclear war in the destruction of the ozone layer, had a significant influence. In the same year, Paul Crutzen once again caused an international sensation when he warned of a dramatic cooling of the Earth if, in the event of a nuclear war, the ensuing fires sent huge amounts of smoke and dust into the stratosphere (a scenario later called "nuclear winter"). This turned out to be much more dangerous than the threat to the ozone layer.
Finally, in 2000, Crutzen proposed calling the current Earth Age the "Anthropocene" in view of the decisive influence of humans - a term that has since become a central concept in the discussion about the common future of humankind far beyond the specialist geological debate.
Detailed research into the climate system shows the extent of the threat from fossil fuels
The rapid developments in computer technology, nanoscience-based metrology and global Earth observation with satellites and probes of all kinds made it possible to develop ever more elaborate climate models from the end of the 1970s onwards. The record built up by climate-history research gradually dispelled remaining doubts about warming.
1979 brought a breakthrough in the recognition of the climate problem with the first World Climate Conference in Geneva. There, the foundation was laid for the United Nations Framework Convention on Climate Change (UNFCCC) adopted in 1992, as well as for the establishment in 1988 of the IPCC (Intergovernmental Panel on Climate Change), which to this day regularly compiles the latest findings of climate research in an unprecedented scientific endeavour.
Over time, climate models became more and more precise, using different approaches to try to reproduce the Earth's climate system and its reaction to man-made influences - especially to the release of large amounts of CO2 through the burning of fossil fuels. To this end, the dynamics of a multitude of climate-relevant subsystems (oceans, polar caps, tropical forests, agriculture, ocean currents, mountains, water systems...) not only had to be, and still have to be, researched in greater detail and their interplay investigated; they also have to be modelled so precisely that dangerous tipping points of individual systems can be identified and their approximate risks determined: the loss of the Gulf Stream, the disappearance of the Greenland ice, the thawing of the methane-containing permafrost soils, etc. This requires not only extensive calculations but also the collection of huge amounts of data, which are continuously acquired with new types of probes, sensors, satellite systems and methods of analysis.
In the 1980s, the question arose as to whether, and if so how, one could define a still-tolerable level of climate change. With the data available at the time, however, only very qualitative answers and a rather politically arbitrary setting of limits were feasible. Discussions began about an average global warming that could still be tolerated. In 1997, the UNFCCC Kyoto Protocol was adopted. It provided, in a first phase, for emission reductions by the industrialised countries until 2012. These were actually implemented: -5.2% compared to 1990. However, the USA refused to ratify the Protocol and Canada pulled out again. Against the backdrop of increasingly ominous results of climate research, the Paris Climate Agreement was ultimately adopted in 2015, after heated discussions. Without binding commitments, it aims to limit warming to well below 2 degrees, if possible 1.5 degrees. This is the cornerstone of today's climate policy.
The emissions budget - a revolutionary concept that sheds new light on our role in nature
The idea of calculating a still acceptable residual amount of climate-relevant emissions - already considered in 1989 - was initially dismissed as too unscientific. It was not until 2009, against the backdrop of an emerging political consensus for a two-degree target, that scientists began to look more closely at calculating a maximum still-tolerable amount of emissions (the carbon budget) with the help of much better data and models. It had become clear that the impact of any additional emissions on average warming can be calculated directly, largely independently of the time of emission. The IPCC reports of 2014, 2018 and 2021 dealt intensively with this issue. They showed that a CO2 emissions budget of only 284 gigatonnes (Gt) remains available today if the temperature increase is to stay below 1.5 degrees with a 66% probability. If emissions remain at their current level, this budget would be used up in 6 years and 9 months (by 2029). According to the latest findings on impending tipping points, it seems urgent to limit warming to 1.5 rather than 2 degrees. A two-degree target would give us 24.5 years at current emission levels.
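The arithmetic behind these figures is simply the remaining budget divided by the annual emission rate. The sketch below is my illustration; the annual rate of roughly 42 Gt CO2 is inferred from the figures quoted above rather than stated in the article:

```python
# Remaining CO2 budgets and an assumed constant emission rate (Gt CO2).
BUDGET_1P5_GT = 284.0        # 1.5-degree budget at 66% probability, as quoted above
ANNUAL_EMISSIONS_GT = 42.0   # assumed annual global CO2 emissions (inferred, not quoted)

years_left = BUDGET_1P5_GT / ANNUAL_EMISSIONS_GT
print(f"1.5-degree budget lasts about {years_left:.1f} years")  # ~6.8 years

# The stated 24.5 years for a two-degree target imply a budget of roughly:
print(f"implied 2-degree budget: about {24.5 * ANNUAL_EMISSIONS_GT:.0f} Gt CO2")  # ~1030 Gt
```

If emissions keep rising instead of staying constant, the budgets are exhausted correspondingly earlier, which is the basis of the dates discussed further below.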

The chart above starkly illustrates the dramatic challenge we are facing. Incremental policies are no longer enough. In the wake of the Paris conference, many governments have formulated their own targets, but these are still far from sufficient to achieve the 1.5-degree target. And the concrete policies for implementation lag far behind even these targets. Since the first international climate protocol was adopted in Kyoto in 1997, emissions have risen by more than half. With gradual progress as presently envisaged, we are not moving fast enough.
The carbon budget concept has more profound implications for the perception of our role in the world than might first appear. It has only recently found its way into public discussion. A previously underestimated milestone was the ruling of the German Federal Constitutional Court of 24 March 2021 on climate policy. The highest German court found that elementary fundamental rights of the young plaintiffs were violated because the Climate Protection Act provided for only small emission reductions until 2030. The Court argued that if much of the available CO2 budget were already spent by 2030, only very harsh technical and social measures could still turn the situation around in the limited time then remaining, and that this could only be achieved by sharply curtailing freedoms. This would place an unfair burden on today's younger generation.
Within a month the German Climate Protection Act was amended - but the targets and measures are still not sufficient. This gives time itself, and future generations, a different role: wait-and-see policies, declarations of intent and good efforts are no longer enough. In another court ruling, in the Netherlands, Shell, the largest European oil company, was obliged on similar reasoning to reduce its own and its customers' emissions by 45 per cent by 2030.
In the meantime, science may have achieved another Copernican revolution - a hundred years after quantum theory changed the scientific world view and assigned man a much more modest role in nature. This requires reining in the unbridled material expansionism of the last hundred years. Whether the necessary turnaround can be implemented given socio-political realities is another question...
How much risk is acceptable?
The dependence on fossil energies is high and deeply rooted in the origins of the industrial system. Today, they cover 77% of global primary energy consumption. A hundred years ago (1920), they already supplied 60% of what was then an almost ten times lower consumption. In the industrialised countries, the dependency goes back much further: in Great Britain, around 1800, coal already covered about 77% of primary energy demand (even including muscle power), while in France and Germany the figure reached only 50% in the early 1860s. Reducing this dependence to zero in about twenty years (see chart above) is no longer realistic, according to the IPCC.
Therefore, calculations are now being made as to when the 1.5-degree target will be exceeded. In the scenarios considered by the IPCC, this will already occur in the early 2030s. Because warming would continue thereafter, it is assumed that past emissions will later be recaptured from the atmosphere ("negative emissions") with technologies that are not yet available today, so that the 1.5-degree target would only be exceeded temporarily. Whether this can succeed is anything but certain in view of the enormous effort required.

The International Energy Agency (IEA), which was founded by the rich OECD countries in response to the OPEC oil cartel after the 1973 oil crisis, has calculated revealing scenarios: if the policy measures already adopted are implemented, emissions would remain at about the present level. Under the most stringent scenario proposed by the IEA, NZE (Net Zero Emissions by 2050), which includes specific measures, emissions would fall to zero by 2050 - with "negative emissions" also coming into play. This exhausts the carbon budget that the IPCC calculates is sufficient to meet the 1.5-degree limit with a 50% (not 66%) probability. A growing number of young people consider it quite irresponsible to accept a 50% probability that this limit will not be met.
Dangers imperceptible to the senses and long time horizons threaten to overwhelm us
In any case, overcoming the climate problem requires a global effort of a kind that human societies have so far made only in geographically limited and exceptional situations. This requires above all a strong conviction, both rational and emotional, that the impending dangers are so great, and the chances of overcoming them so convincing, that it is worth challenging old habits, powerful vested interests, old conflicts and ingrained convictions with exceptional determination.
Both - the dangers and the chances of overcoming them - are more tightly linked to the scientific and technological development of the last hundred years than the current discussion suggests. The enormous increase in knowledge and power through the new, nanoscience-based technologies is difficult for us to comprehend in its consequences. Its speed and scope overwhelm our acquired capacity for dealing with longer horizons. This has led to serious deficits in recognising systemic dangers and in seizing new opportunities. Future episodes of this series will focus on the novel, hitherto underestimated technological opportunities for nature-compatible prosperity. But here I would first like to take a closer look at the difficulties in even recognising the situation in which we find ourselves.
We are evolutionarily and socially attuned to valuing the present, that which is close to us and that which can be directly experienced far more highly than the results of model calculations obtained with highly complicated methods from measurements beyond the reach of our senses. Since the nanosciences became increasingly effective after the Second World War, our perception of the abstract consequences of human action for the Earth system has lagged ever more dangerously behind our appreciation of the immediately perceptible fruits of these technologies.
As far as the consequences of climate change are concerned, we can broadly distinguish two levels. On the one hand, there are the already observable, ever more severe extreme weather events, geographic climate shifts, changes in flora and fauna, changes in the water balance, the drying out of whole swathes of land, etc., which cause huge damage and, ultimately, massive migration.
These are all changes that can be traced back to past emissions - after all, the Earth's temperature has already risen by more than 1.1 degrees since pre-industrial times. The climate system is inert: several decades can pass between an emission and the warming it causes - the time lag depends on the amount emitted and is related to various feedback processes and secondary effects. The causes of the damage of the next decades therefore already lie in the past. If we look at the emissions curve and see how sharply emissions have risen in recent decades, we can guess what we are facing while no longer being able to do anything at all about it.
But what we are now beginning to experience are still gradual, regional changes, which people are increasingly trying to counter with adaptation. They will bring ever more massive problems in agriculture, in coastal cities and for health, and will drive migration by the millions. But they do not yet pose an existential threat to humanity and the Earth.
However, this will change if the emissions over the next few years cause temperatures to rise - with the usual lag of a few decades - to where they trigger tipping effects that lead to existentially threatening disruptive global changes. These would profoundly affect the habitability of large parts of the Earth's surface, and thus call into question the very existence of our civilisation. We tend to assume relatively continuous developments. Climate researchers, however, are pointing us more and more forcefully to a series of increasingly likely, sudden and irreversible changes, e.g.:
With a temperature increase of between 0.8 and 3 degrees, it is likely that the Greenland ice will melt completely. This could take more than a thousand years, but the consequences would be an existential threat: sea levels would rise by seven metres and the ocean currents that determine the climate in many regions of the world would be thrown into disarray. Other ice sheets, such as the West Antarctic, face a similar threat.
Above a threshold of about 1.5 degrees of warming (estimated at between 1 and 2.3 degrees), the boreal permafrost soils will thaw within 200 years, possibly triggering a more abrupt collapse, which would add a further 0.2 to 0.4 degrees of global warming through the release of methane.
With a temperature increase of between 1.4 and 8 degrees, the Atlantic circulation (the Gulf Stream) can collapse within 50 years, which would lead to a warming of the southern hemisphere and dramatic climate shifts, giving Northern Europe the climate of Nova Scotia on the same latitude.
Unlike the already observable gradual climate changes, the existential consequences of these tipping effects will probably only affect future generations. But whether they occur is already being decided today. It is important to note that the threshold values cannot yet be precisely determined, and that the collapse of one subsystem can trigger the collapse of others. For a whole series of tipping effects, the triggering temperature threshold seems to be between 1.5 and 2 degrees warming. If the still rising trend of the last five years continues in a linear fashion, the carbon budget for two degrees of warming will also be exhausted as early as 2044.
It is to be hoped that the climate damage becoming visible today, whose causes already lie in the past, will lead humanity to stop the further burning of fossil fuels quickly, so that our civilisation does not risk perishing with a time delay of perhaps one or two centuries. Before humanity-threatening tipping effects become reality, gradual deterioration will continue and smaller tipping effects will occur. The faster we change course, the smaller the problems that will arise in the coming decades. We simply must bring our perceptive capacity up to the level of our technical possibilities. If we do not want dictatorships, this will also require enormous educational efforts.
Common strategies for a heterogeneous humanity require an unprecedented level of cooperation
Now, it is not as if nothing had been done about climate change since the nineties. For the past few years, it has been one of the main topics at all international meetings and even in the financial world. And yet climate-damaging emissions continue to rise. A detailed analysis of the global figures reveals additional challenges for our handling of long time horizons.

The development of emissions broken down by world region shows a huge increase in China and other less rich countries. The differences in CO2 emissions per capita are still huge: while the world average in 2021 is 4.5 tonnes, the USA emits more than three times as much per inhabitant, namely 15 tonnes; Germany 8.1 tonnes, China 7.3 tonnes, Italy 5.6 tonnes, Switzerland, at 4.0 tonnes, less than the world average, and India 1.7 tonnes. If one additionally takes into account the historical totals emitted by each country per capita, then China is in a much better position and the USA in an even worse one. This makes it very difficult to agree on target values for each country.
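For readers who want to check the ratios, a minimal sketch using only the per-capita figures quoted above:

```python
# Per-capita CO2 emissions for 2021 as quoted in the text (tonnes per person).
per_capita = {
    "USA": 15.0, "Germany": 8.1, "China": 7.3,
    "Italy": 5.6, "Switzerland": 4.0, "India": 1.7,
}
WORLD_AVERAGE = 4.5

for country, tonnes in per_capita.items():
    print(f"{country:12s} {tonnes:5.1f} t  ({tonnes / WORLD_AVERAGE:.1f}x world average)")
```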
An analysis of the drivers of change since 1990 shows that the regional differences in population and income growth result in very different challenges in combating climate change. Even a substantial reduction in energy intensity (energy use per unit of economic output), i.e. improved energy efficiency, could not compensate for the effect of population growth worldwide (though it could in China, where it had more than five times the impact), and still less for the significant increases in private incomes. In low-growth, rich countries, the main issue is to replace fossil fuels in existing structures through savings or new energy sources. In view of the now short deadlines and the long lifespan of energy installations, this can mean that still functioning, capital-intensive plants have to be shut down - the resistance and reluctance of the industries and investors concerned is correspondingly great. In poor countries, on the other hand, where incomes or populations or both are growing rapidly, the first priority must be to meet the additional energy demand efficiently and with renewables and to avoid building up the "stranded assets" of the future. If new, climate-neutral plants also produce more cheaply, this results in a long-term competitive advantage.
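The kind of driver decomposition described here resembles what is often called a Kaya-identity analysis, in which emissions are written as the product of population, income per capita, energy intensity and the carbon intensity of energy. A minimal sketch, with purely illustrative numbers that are not taken from the article:

```python
def co2_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
    """CO2 emissions as the product of the four Kaya factors
    (units are arbitrary but must be used consistently)."""
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# Hypothetical values: population and incomes grow, efficiency improves,
# but the energy mix (carbon intensity) barely changes.
emissions_1990 = co2_emissions(5.3e9, 9_000, 8.0e-6, 0.070)
emissions_2020 = co2_emissions(7.9e9, 17_000, 5.0e-6, 0.065)

print(f"change in emissions: {100 * (emissions_2020 / emissions_1990 - 1):+.0f}%")
```

In this toy example, the efficiency gain and the slight shift in the energy mix are swamped by population and income growth - the pattern the text describes for the world as a whole.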

It is also worth noting that in the period since 1990, the shift in the energy mix - i.e. the replacement of fossil fuels with other energy sources - contributed four times less worldwide than the reduction in energy intensity (ten times less in China). However, this has changed somewhat in the past decade due to the expansion of photovoltaics and wind. Energy efficiency is and remains of central importance for the mitigation of climate change.
The importance of population and income growth is illustrated by the following graphs.
In Asia, the population will soon stop growing, while in Africa, where incomes are much lower, it is likely to continue to rise sharply. As far as the potential growth of greenhouse gas emissions is concerned, care must therefore be taken, especially in Africa, to ensure that a climate-neutral energy supply is established from the outset with the latest technologies and the corresponding professional skills.
Dramatic challenges do not allow delays in seizing new opportunities
The climate and energy crises are closely intertwined with other pressing crises and have multiple political, sociological and psychological aspects. But here, these can only be peripheral to our main point, which is to show how the history of science and technology has contributed to today's predicament, and to work out what options this leaves us.
Climate science and the protests of the younger generations show us that the necessary transformation must take place at a speed unprecedented in industrial history - even the transformation of the telecommunications industry in the wake of microelectronics and the internet was slow by comparison. The capital investments to be mobilised are huge. Before Russia's invasion of Ukraine, only 5% of the world's fossil energy consumption came from Russia. The dislocations that the attempt to do without these quantities has triggered on world markets give a foretaste of the challenges ahead.
In view of the urgency of the climate problem, the time horizons alone make it clear that new nuclear power plants cannot make any meaningful contribution here. By the time the first of them could come online, emissions will already have to be close to zero. Photovoltaic power plants, wind turbines and storage batteries can be built much faster. Every million that is put into nuclear power instead of photovoltaics is lost for the solution of mankind's biggest problem.
Increasing local disasters will lead to an increased awareness of climate issues. However, attention varies widely: this summer's flooding in Pakistan, which covered an area the size of Italy and affected 33 million people, was barely noticed in Europe against the backdrop of the Ukraine war. The very different ability and willingness to perceive the impending dangers and to take or accept appropriate measures will exacerbate social tensions everywhere.
As we will see in more detail in the next instalments of this series, new, climate-friendly technologies are not only necessary but also more cost-effective. Those who adopt them more quickly gain an advantage; conversely, countries and continents that keep outdated structures alive by subsidising outdated technologies fall behind. China is now a pioneer in renewable energies. In Europe, once the technological leader, old industries have been holding back the conversion to renewables for decades.
The history of the use of coal, oil and gas in the next episode of this series will illustrate the difficulty of overcoming the inertia of existing systems, communities, habits, beliefs and industries, but will also show the new technical approaches that the new sciences have opened up in these areas.
Do we see peak oil in this graph?