1955: Why the US Chose Nuclear Energy Over Solar

This excerpt from the book Let It Shine: The 6000-Year Story of Solar Energy provides fascinating context for the energy choices the US made in the 1950s. It was a pivotal moment for the advent of solar energy, but the US backed nuclear instead.

What’s most interesting is the all-out backing the US government gave the nuclear energy industry to get it off the ground. Similar histories could likely be written about government support for oil and gas when those industries first emerged. Renewable energy industries have received no such support; by comparison, their backing has been infinitesimal. It’s a testament to pioneers in the solar and wind industries, and to a handful of supportive governments, that they are nearing grid parity today.

Prelude to the Embargo

For almost three decades after the end of WWII, the US had few problems with its energy supply. Its industry, commerce, and homes all had ready access to oil and gas from both domestic and foreign sources. Most of the oil was close to the surface, easy to tap, and economical to extract. Foreign governments sold their oil to American companies at extremely low prices, and US government subsidies also helped to keep prices low and profits high. Natural gas prices were also low and enjoyed the same tax advantages as oil.

Corporate spokespeople assured the public that this rosy situation would continue almost indefinitely. With fuel apparently so abundant and cheap, electric companies expanded to meet demand. Liberal government policies made it easy to procure capital to build larger and more efficient power plants. Utilities encouraged greater consumption because the costs of building new plants and installing electric lines could be recovered more easily if customers used more energy.

"Once you had the lines in, you hoped people would use as much electricity as possible," a utility executive remarked. "You wanted to get as much return on your investment as you could." Gas companies took a similar approach – "if you sell more you make more."

They promoted consumption through advertising campaigns and preferential rate structures, and it worked: families rushed to buy electric and gas-powered appliances. Growing affluence and the postwar baby boom pushed electricity generation up more than 500% between 1945 and 1968, while gas production nearly tripled, from 6 to 16 trillion cubic feet. Overall, US fuel consumption more than doubled.

Enter Solar

The frenetic pace at which America was gobbling up its energy resources alarmed only a few farsighted individuals. Eric Hodgins, editor of Fortune, called the careless burning of coal, oil and gas a terrible state of affairs, enough to "horrify even the most complaisant in the world of finance."

Writing in 1953, he warned that "we live on a capital dissipation basis. We can keep this up for another 25 years before we begin to find ourselves in deepening trouble." But such warnings were treated with derision or ignored because too much money was being made on energy sales.

A few scientists and engineers took the same dim view and sought an alternative to a fuel crisis they saw as inevitable. In 1955, they founded the Association for Applied Solar Energy and held a World Symposium in Phoenix, Arizona. Delegates from around the world attended, presenting research and exhibiting solar devices.

Israel displayed its commercial solar water heaters, and representatives from Australia and Japan discussed their nations’ increasing use of the sun. To many, the symposium represented the dawn of a new solar age, but the careless confidence of energy-rich America squelched that hope at home.

Solar energy received virtually no support in the ensuing years, and by 1963 the association found itself bankrupt.

The governments of Israel, Australia and Japan deliberately aided the solar industry, but the US Congress and White House sat on the sidelines. True, as early as 1952 the President’s Materials Policy Commission, appointed by Harry Truman, came out with a report, Resources for Freedom, predicting that America and its allies would be short on fossil fuels by 1975. It urged that solar energy be developed as a replacement.

"Efforts made to date to harness solar energy are infinitesimal," the commission chided, despite the fact that the "US could make an immense contribution to the welfare of the free world" by exploiting this inexhaustible supply. They predicted that, given the will to go solar, there could be 13 million solar-heated homes by the mid-1970s.

Atoms For Peace

The commission advocated a 50-50 split between nuclear and solar contributions to America’s energy future, but the US government lavished billions on atomic power research while spending a pittance on solar. Cold-war politics, more than technological advantages, accounted for the difference.

The Soviet Union’s growing military might and the possibility of nuclear warfare dominated the era. Rather than scare Americans, President Eisenhower decided to give nuclear weapons a happy face by introducing the peaceful atom.

At the United Nations in 1953, Eisenhower assured the world body of US determination to help solve the fearful atomic dilemma – "to find the way the miraculous inventiveness of man shall not be dedicated to his death, but consecrated to life." When he proposed the peaceful use of the atom "to apply atomic energy to the needs of agriculture and medicine … and to provide abundant electrical energy in power-starved areas of the world" – everyone sprang up and applauded and kept on cheering.

Someone called Eisenhower’s plan "Atoms for Peace" and the phrase stuck. Selling the peaceful atom as the world’s future energy source suddenly became America’s number one priority.

Nuclear Energy

Congress passed the Atomic Energy Act of 1954, making available at no cost to the industry "the knowledge acquired by 14 years and $10 billion worth of government research." In this act, the government pledged to undertake for the private sector "a program of conducting, assisting, and fostering research and development to encourage maximum scientific and industrial progress."

In other words, the government paid all the expenses and took all the risks for the nascent nuclear energy industry. There was no parallel "Solar Energy Act."

People of every nationality and political inclination heralded the arrival of the atomic age, the "third great epoch in human history." A few, though, had second thoughts.

Nobel prize-winning chemist Dr. Glenn Seaborg, who later headed the Atomic Energy Commission, argued that the difficulty of finding sites for disposal of dangerous radioactive waste would severely hamper development. Worse, experts agreed that the owners of atomic power plants could quickly convert their fissionable material to build bombs. Even members of the Eisenhower administration admitted having "some unhappy second thoughts – that ‘atoms for peace’ could turn into ‘atom bombs for all.’" The specter of nations in the underdeveloped world arming themselves atomically was "terrifying."

What About Solar?

Dr. James Conant, the American scientist who oversaw the making of America’s first nuclear weapons, argued that nuclear power was too dangerous and expensive. He urged the nation instead to create a program like the Manhattan Project for the development of solar energy.

The New York Times also suggested the government should "transfer some of its interest in nuclear to solar." But the attitude of Washington and the private sector mirrored that of a nation hypnotized by seemingly limitless supplies of cheap fossil fuel and by the almost magical aura surrounding nuclear energy.

Life magazine put it aptly in an article titled "The Sun: Prophets Study Rays for Far-Off Needs": a few farsighted scientists were dreaming of ways to save the US when coal, oil, gas, and uranium ran out. That, the article said, might be 200 to 1,000 years away.

George Russler, chief staff engineer at the Minneapolis-Honeywell Research Center, suggested that solar energy could better tackle the growing need to replace oil by providing heat for houses and office buildings. He pointed out that the low-temperature heat required "ideally matches the low-grade heat from the simplest and most efficient solar energy collectors."

This was the perfect way to begin putting solar to widespread use and to address an ominous trend: the number of new oil discoveries in the US had fallen every year after 1953, while reliance on imported oil kept growing. In fact, in 1967, for the first time in the nation’s history, crude oil reserves declined.

And the renowned geophysicist Marion King Hubbert predicted in 1956 that American petroleum production would peak between the late 1960s and early 1970s. Most in the oil industry ridiculed his work, but in 1970 the laughter ceased. His prediction had come to pass.

++++

This is an excerpt from an article in the May/June issue of Solar Today.

John Perlin, author of Let It Shine: The 6000-Year Story of Solar Energy (2013), is an analyst in the Department of Physics and director of solar and energy efficiency implementation at the University of California, Santa Barbara. He writes and lectures widely on the history of energy, solar energy in particular. Check out his website: http://john-perlin.com/

