Atomic bombs and climate models


How we learned to map our climate destiny

Today, we take for granted the ability to understand the most complex dynamics of the atmosphere. And yet, 70 years ago, our atmospheric knowledge had barely moved beyond “how the weather works.” The multidimensional complexity of the Earth’s atmosphere remained elusive until the middle of the 20th century, because it turns out the human brain is hopeless at computing numbers at scale without help.

That all began to change in World War II. Military planners soon recognized that the ability to crunch ever larger volumes of numbers was as important to defeating the Nazis as guns and ammunition. The secret Oppenheimer effort to build a super bomb would require intense computational power, so much so that by the war’s end the U.S. Army had built the world’s first programmable digital computer: the Electronic Numerical Integrator and Computer, or ENIAC. (The British claim Bletchley Park’s code-breaking “Colossus” was the first computer.)

The ENIAC was a beast. At 30 tons, it was housed in a 1,500-square-foot room and required 40 nine-foot cabinets containing more than 18,000 vacuum tubes and 1,500 relays. Two 20-horsepower blowers could barely cool the machine, where temperatures regularly reached 120 degrees Fahrenheit. All that hardware allowed the ENIAC to crunch numbers 2,400 times faster than a human.


The birth of atmospheric “maps”

As the war raged, the Army dispatched Princeton mathematician John von Neumann on a top-secret mission to ENIAC’s home at the University of Pennsylvania. Considered among the brightest mathematicians of his era, von Neumann was tasked with harnessing the computing power of the beast to figure out the impact of an atomic bomb detonation on people and the atmosphere.

In his research, he discovered something with even greater potential: a rudimentary means of mapping and predicting the dynamic natural forces of the planet’s atmosphere. Building on that early work, scientists now use three-dimensional general circulation models (GCMs), powered by supercomputers, that can model almost any atmospheric factor. It is one of the quieter but most remarkable scientific achievements of the 20th century.

The global warming problem is discovered

Learning to understand a complex element of nature is impressive. Predicting how it will evolve is miraculous. But that is what a new generation of computer-powered climate modelers was able to do by the 1960s. Two in particular, Syukuro (Suki) Manabe and Joseph Smagorinsky, made a prediction on which the fate of the Earth now rests. They developed a table that will go down in history as the first robust estimate of how much the world would warm if carbon dioxide concentrations doubled.

The results, however, did not go over well with many, particularly the fossil fuel industry, which over the next decade campaigned to undermine any data linking its business to rising atmospheric temperatures. The debate became so heated that then-President Jimmy Carter asked the National Academy of Sciences to review the existing climate data. After much study, the Academy issued the seminal Charney Report in 1979, the first comprehensive assessment of global climate change due to carbon dioxide. Consistent with the earlier modeling work, it concluded that a doubling of atmospheric CO2 would bring a warming of between 1.5℃ and 4.5℃.
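For readers who want a feel for where such a number comes from, the back-of-the-envelope version of the calculation fits in a few lines of Python. The sketch below is only an illustrative zero-dimensional energy-balance estimate, not the model Manabe and Smagorinsky actually ran; the 5.35 W/m² logarithmic forcing fit and the 0.8 K per W/m² feedback parameter are standard textbook approximations chosen here for demonstration.

import math

def co2_forcing(c_new_ppm: float, c_ref_ppm: float = 280.0) -> float:
    """Approximate radiative forcing (W/m^2) from a change in CO2,
    using the common logarithmic fit dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_new_ppm / c_ref_ppm)

def equilibrium_warming(forcing_w_m2: float, lam_k_per_w_m2: float = 0.8) -> float:
    """Equilibrium temperature change dT = lambda * dF, where lambda bundles
    the Planck response with water-vapor, lapse-rate, albedo, and cloud feedbacks."""
    return lam_k_per_w_m2 * forcing_w_m2

if __name__ == "__main__":
    d_f = co2_forcing(560.0)        # doubled pre-industrial CO2: 280 ppm -> 560 ppm
    d_t = equilibrium_warming(d_f)  # ~3.7 W/m^2 x 0.8 K per W/m^2, about 3 C
    print(f"Forcing from doubled CO2: {d_f:.2f} W/m^2")
    print(f"Implied equilibrium warming: {d_t:.1f} C")

Run as written, the sketch lands on roughly 3℃ of warming for a doubling of CO2, squarely inside the range the Charney Report would later endorse.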

Remember Mount Pinatubo?

Despite furious pushback from the oil and gas industry, the Charney Report’s predictions, with a few modest adjustments, today sit unmoved, like the Rock of Gibraltar. In 1991, climate models were put to their greatest test when Mount Pinatubo erupted in the Philippines, pumping a cloud the size of Iowa into the stratosphere. Scientific models predicted cooling for the next few years. And that is precisely what happened.

The history of climate modeling 

  • 1979: The Charney Report published

  • 1988: James Hansen testifies to Congress on “three scenarios” of climate change

  • 1990: The first Intergovernmental Panel on Climate Change (IPCC) report published

  • 1995: The Coupled Model Intercomparison Project (CMIP) launches

  • 1990-present: IPCC publishes six comprehensive assessment reports reviewing the latest climate science and 14 special reports on particular topics

Springtime in New York

Fast-forward to the spring of 2024. Here in New York, temperatures last week were 20 degrees above the historic average. We are now personally experiencing just what models predicted we would if we continued our “business as usual” ways. It’s all mapped out in pretty graphs and charts.

Unfortunately, much of the world’s business, finance, and government establishments can’t seem to do much about it. Sure, there are big initiatives like President Biden’s climate legislation, the Inflation Reduction Act (IRA), the EU’s Green Taxonomy, and China’s ambitious 10-year climate plans. But the models don’t lie, and the status quo prevails. As concerns grow, computer modelers, like navigators on the Titanic, are using the best technology of the moment to track rapidly worsening conditions. We are very good at that. Yet all the computing power in the world will not change the behavior of molecules and atoms being warmed by fossil fuel emissions.

If there is a lesson from the past 70 years of heroic climate modeling, it is that we worked hard to research and model the problem but have not had the courage to solve it.

Written by

Peter McKillop

Peter McKillop is the founder of Climate & Capital Media, a mission-driven information platform exploring the business and finance of climate change.