The New Nuclear Energy Revolution

Escalating costs for nuclear power plant construction in the U.S. were not inevitable, according to a new study in Energy Policy. They were largely the result of the increased regulation that followed the Three Mile Island reactor meltdown in 1979. Had the nuclear transformation once anticipated actually taken place, the entire U.S. 1,000-gigawatt electricity generation sector could notionally have been replaced by 1,000 nuclear power plants, each with a generating capacity of 1,000 megawatts. Among other things, this would have made the country’s greenhouse gas emissions more than 30 percent lower than they currently are.
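
A quick check of the scale implied by that counterfactual (the figures are the study's; the arithmetic is simply a restatement of them):

$1{,}000 \text{ plants} \times 1{,}000 \text{ MW per plant} = 1{,}000{,}000 \text{ MW} = 1{,}000 \text{ GW},$

which matches the roughly 1,000 gigawatts of U.S. generating capacity cited above.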

By the mid-1960s, capital costs for nuclear construction had been falling steeply, reaching around $600 to $900 per kilowatt of generating capacity in current dollars. But that proved to be the nadir for U.S. nuclear construction costs. The late ’60s and early ’70s saw a plethora of new environmental regulations. For example, in 1971 the U.S. Court of Appeals for the District of Columbia Circuit ruled, in a case brought by the Sierra Club and the National Wildlife Federation, that the Atomic Energy Commission must change its rules to conform to the new National Environmental Policy Act’s requirement to consider the environmental impact of each new power plant. In addition, accidents led to tightened, and more expensive, safety requirements and equipment backfits. As a result, construction costs more than doubled during the 1970s, to a range of $1,800 to $2,500 per kilowatt of generating capacity.
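
Taking the midpoints of those two ranges gives a rough consistency check (an approximation based only on the figures quoted above, not a calculation from the study):

$\frac{(1{,}800 + 2{,}500)/2}{(600 + 900)/2} = \frac{2{,}150}{750} \approx 2.9,$

i.e., roughly a tripling at the midpoints, consistent with the claim that costs more than doubled over the decade.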

Things proceeded differently abroad. In Japan, nuclear plant construction costs doubled in the 1970s and then remained flat for the next 30 years. In India, nuclear construction costs rose 150 percent between 1976 and 1990 but have since fallen by 10 percent. And in South Korea, the cost of building nuclear plants has fallen by 50 percent since 1972. America’s steeply rising cost trend for nuclear construction is clearly the outlier.
