Many Americans, including me, had our views of nuclear power strongly shaped by the 1979 film The China Syndrome. This polemic’s alarming message was amplified when the Three Mile Island nuclear accident occurred just days after its release.
Splitting uranium atoms to generate electricity had once been touted as a miraculous solution to the world’s energy needs, but by the late 1970s, most Americans had come to see nuclear power as a radioactive Trojan horse, carrying the sinister potential for core meltdowns and our own destruction. This, of course, was before we realized that burning coal, oil, and natural gas was creating a different, slow-motion disaster, intensifying the greenhouse effect that is now visibly altering the global climate.
The dire need to cut those emissions dramatically has led some scientists — and some environmentalists — to a surprising change of heart: If there is to be any realistic hope of limiting climate change, we may need to build new nuclear power plants to supplement solar and wind.
The real world often compels us to choose the lesser evil.
Wind and solar, which now supply just 9 percent of U.S. electricity, can be scaled up over time, but few scientists think renewables will meet most or all of the energy needs of the planet’s 7.6 billion people by 2030 or 2040.
“Nuclear power paves the only viable path forward on climate change,” says former NASA scientist James Hansen, who first sounded the alarm on global warming 30 years ago.
Nuclear is expensive and not without risks, but fourth-generation nuclear plants are engineered to prevent meltdowns and to burn much of their own waste. Pollution from burning fossil fuels, meanwhile, quietly kills millions of people worldwide every year. Unchecked climate change will be even more catastrophic. If we want to power our devices, cars, and homes, we will have to accept trade-offs. Lunch, alas, is not free.
Source: The Week Magazine