Plausibility and Implications of a Nuclear Energy Glut?

Delta Force

Banned
Prior to 1973, energy consumption had historically and consistently increased by about 7% per year. With the 1973 and 1979 energy crises, consumption shifted onto a new curve as higher energy prices led to a greater emphasis on conservation and thus reduced consumption. This led to the cancellation of hundreds of power plants in the United States, and many utilities carried capacity surpluses (reserve margins) of 20% to 30% into the late 1980s and early 1990s. Nuclear power was hit particularly hard by the economic climate of the time: nuclear plants take years to plan and build, and inflation was very high. The overnight cost (the cost of a plant excluding the time value of money and financing charges) kept rising too, due to constantly shifting safety and environmental regulations. The result was a feedback loop of spiking costs, as firms had to take out financing at high interest rates while facing slipping schedules and escalating regulatory costs. The economic climate was so bad that nuclear plants started being cancelled well before Three Mile Island; the last American reactor order of the first era of nuclear energy was placed in 1978.

Assuming there are no 1970s energy crises and more nuclear reactors are built, there are some interesting implications. Historically, nuclear energy has had the second-lowest variable cost of power generation, behind only hydropower. Nuclear plants would thus be a favored source in a utility's portfolio whenever there is more generating capacity than demand. In fact, capital costs make up such a large share of a reactor's total costs relative to fuel and maintenance that during a glut a utility loses less money by continuing to sell nuclear power at any price above its low variable cost than by idling the plant. Since reactors are licensed for 40 years (extendable to 60, with many expecting 80 years will eventually be allowed), this could lead to an energy glut lasting decades, depending on how extensive nuclearization is and how high energy consumption is when an energy crisis finally occurs.
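The dispatch logic above can be sketched with a toy merit-order model (all of the $/MWh figures below are hypothetical, chosen only to illustrate the ordering, not sourced costs):

```python
# Illustrative merit-order dispatch during a glut: a plant keeps running
# whenever the market price covers its variable (fuel + O&M) cost, since
# capital costs are sunk either way. All $/MWh values are hypothetical.
plants = {
    "hydro":   5.0,   # variable cost in $/MWh
    "nuclear": 10.0,
    "coal":    25.0,
    "gas":     35.0,
}

def plants_that_run(market_price):
    """Return (sorted) plants whose variable cost is covered at this price."""
    return sorted(name for name, vc in plants.items() if vc <= market_price)

# As the clearing price falls in a glut, high-variable-cost plants drop out
# first, while hydro and nuclear are the last to stop selling.
print(plants_that_run(30.0))
print(plants_that_run(12.0))
```

The point of the sketch is just that nuclear's position near the bottom of the variable-cost stack means it keeps earning something toward its (sunk) capital costs even at very low prices.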

For this scenario to occur, avoiding the energy crises would seem most critical, along with avoiding the record-high inflation of the 1970s. Having someone in a position of power push for a more predictable approach to nuclear regulation would probably help as well. I'm wondering how plausible these events would be, and whether any others would be required.

What would it mean for the nuclear industry, energy industry, business, society, etc. for this to occur? It seems that electricity prices would likely be more stable (possibly even lower if a glut occurs after a crisis), and the energy sector would be less carbon intensive. Would there be as much of an emphasis on conservation, or would lower electricity prices reduce it? Also, if there is a large surplus of nuclear energy, could electric, hydrogen, or ammonia become popular as alternative sources of energy for transportation?
 
With an oversupply of electricity at nearly fixed cost, I'd expect electric heating to look good. IOTL, British governments encouraged the installation of electric storage heaters to provide overnight base load for nuclear power plants.

Railway electrification is a strong possibility, and much more practical than electric power for other forms of transport - in fact, the economic advantages are huge even without a glut of electric power. Again IOTL, British Rail conducted a study into electrification, and found that the more widespread the program and the faster it was conducted, the greater the benefits.

Electric cars might get a boost when oil eventually does spike; the availability of electrical power might actually mean that fuel cells for on-road use are strangled in the cradle. There are some interesting tricks that can be used to balance the grid with electric cars, but you need fairly advanced battery and controller technology to do it.

You might even see people looking at more desalination plants for places like California, which would be rather useful. I wonder, too, how much mining equipment could be electrically powered - the advantages should be similar to those for rail applications. Things like bucketwheel excavators and draglines often are, anyway; electric power shovels and haul trucks might be worth a look.
 

Delta Force

Banned
The glut might increase over time too. The capacity factor of nuclear plants wasn't very high for much of the industry's history; I've seen 75% cited as something of an optimistic figure in some early nuclear energy documents. However, starting in the late 1990s, utilities quickly and dramatically improved reactor operations, and they can now achieve capacity factors of 90%, making nuclear one of the most reliable means of generating power. Only geothermal is more reliable, at around a 92% capacity factor. Utilities have also been able to squeeze more power out of existing designs over time, and several plants have been authorized to operate at higher power levels after modifying procedures or upgrading turbines and other components.
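For reference, capacity factor is just actual output divided by the maximum possible output over a period. A quick check with a hypothetical 1,000 MW reactor (the annual output figure is assumed for illustration):

```python
# Capacity factor = energy actually generated / (nameplate capacity * hours).
# Hypothetical example: a 1,000 MW reactor over one non-leap year.
capacity_mw = 1000
hours_per_year = 8760
generated_mwh = 7_884_000  # assumed annual output, chosen for illustration

capacity_factor = generated_mwh / (capacity_mw * hours_per_year)
print(f"{capacity_factor:.0%}")  # prints "90%"
```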

Here's a graph of the capacity factor of nuclear energy over time.

[Attached image: capacity_factors.png]
 

Delta Force

Banned
This is a double post, but I remembered some calculations I did on nuclear power economics. Using a scheme such as Offshore Power Systems, SNUPPS, or a small modular nuclear reactor, the cost per kilowatt could be significantly reduced. Nuclear reactors cost so much to build partially because each unit is custom built. Even at the same site reactors are typically built to different design specifications.

This post is originally from here.

For the first assumption, I'm using the estimate of $5,530 per kilowatt construction cost from here, and the 10% reduction assumption. I've seen the 10% reduction mentioned in an academic paper for large power reactors, and elsewhere I've seen it mentioned as typical of ship construction.

Single Unit Plant:
5530 * 0.9 ^ (log 1 / log 2) = $5530 per kilowatt

Twin Unit Plant:
5530 * 0.9 ^ (log 2 / log 2) = $4977 per kilowatt

Triple Unit Plant:
5530 * 0.9 ^ (log 3 / log 2) = $4680 per kilowatt

Quadruple Unit Plant:
5530 * 0.9 ^ (log 4 / log 2) = $4479 per kilowatt

Quintuple Unit Plant:
5530 * 0.9 ^ (log 5 / log 2) = $4330 per kilowatt

Sextuple Unit Plant:
5530 * 0.9 ^ (log 6 / log 2) = $4212 per kilowatt

Using a 15% reduction figure, which one source I've found mentions as the cost reduction typically seen for aircraft, gives the following prices. The final figure for a six unit run would bring the plant's capital costs down to those of a coal fired plant with advanced filters.

Single Unit Plant:
5530 * 0.85 ^ (log 1 / log 2) = $5530 per kilowatt

Twin Unit Plant:
5530 * 0.85 ^ (log 2 / log 2) = $4701 per kilowatt

Triple Unit Plant:
5530 * 0.85 ^ (log 3 / log 2) = $4274 per kilowatt

Quadruple Unit Plant:
5530 * 0.85 ^ (log 4 / log 2) = $3995 per kilowatt

Quintuple Unit Plant:
5530 * 0.85 ^ (log 5 / log 2) = $3792 per kilowatt

Sextuple Unit Plant:
5530 * 0.85 ^ (log 6 / log 2) = $3633 per kilowatt

A 20% reduction, seen for some aircraft and mentioned for wind turbines in an academic paper, would give the following prices. It seems rather low, but some Asian nations have managed to build nuclear power plants at around those prices.

Single Unit Plant:
5530 * 0.8 ^ (log 1 / log 2) = $5530 per kilowatt

Twin Unit Plant:
5530 * 0.8 ^ (log 2 / log 2) = $4424 per kilowatt

Triple Unit Plant:
5530 * 0.8 ^ (log 3 / log 2) = $3883 per kilowatt

Quadruple Unit Plant:
5530 * 0.8 ^ (log 4 / log 2) = $3539 per kilowatt

Quintuple Unit Plant:
5530 * 0.8 ^ (log 5 / log 2) = $3294 per kilowatt

Sextuple Unit Plant:
5530 * 0.8 ^ (log 6 / log 2) = $3106 per kilowatt
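The tables above all follow the standard Wright learning-curve formula, cost(n) = C1 * p^(log2 n), where p is the progress ratio (0.9, 0.85, or 0.8) and each doubling of units built cuts the unit cost by (1 - p). A short sketch that reproduces the figures:

```python
import math

def unit_cost(base_cost, progress_ratio, n):
    """Wright learning curve: unit cost falls by (1 - progress_ratio)
    with every doubling of the cumulative number of units built."""
    return base_cost * progress_ratio ** math.log2(n)

base = 5530  # $/kW overnight cost for the first unit, per the source above

for p in (0.9, 0.85, 0.8):
    costs = [round(unit_cost(base, p, n)) for n in range(1, 7)]
    print(f"{1 - p:.0%} reduction per doubling: {costs}")
```

Running this matches the hand-calculated tables to within a dollar of rounding.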
 
Fusion power actually has this potential (one of the key scientists in that field said it could probably be done within 10-20 years with something as big as a 'Manhattan' project -- and no Mr. Fusion on your car, but via large power plants).

Suddenly the inefficiencies of electric options and alternate fuels (like liquefied nitrogen for cars or hydrogen for planes) and mass desalination are no longer significant. Entire classes of materials that were prohibited by energy costs become viable.

And remember that wars usually start over resources, so massively cheap energy would shift the likelihood of that.
 

Delta Force

Banned
Glut and the name of a resource do not go together outside some 1950s fantasy.

It depends on the time scale you're looking at. Energy gluts have happened many times, and there's even a glut of petroleum going on right now globally, as well as a glut of natural gas in North America.

It's possible for there to have been a glut of electricity if the energy crises of the 1970s had been delayed. Those shifted energy consumption onto an entirely new curve. In fact, the crises were probably aggravated by the low petroleum prices of the time discouraging investment in efficiency, and even leading to a bandwagon market for petroleum fired power plants. The 1980s petroleum glut may have been partially caused by energy companies investing too much in petroleum infrastructure, as utilities rapidly converted their petroleum fired units to coal or natural gas, and efficiency was increased for other uses.

Given that the world went through a scenario in which the energy crises came early, we know how much energy can be saved through efficiency improvements and conservation. If the 7% annual capacity growth that prevailed prior to 1973 had continued, the world could be looking at a lot of surplus energy capacity even if an energy crisis occurred only a few years later. At 7%, it only takes a decade for consumption to double.
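The doubling claim checks out under compound growth (the rule-of-70 shortcut gives 70/7 = 10 years):

```python
# Compound growth at 7% per year: consumption after t years is (1.07)**t
# times the starting level, so it roughly doubles over a decade.
growth = 1.07
after_ten_years = growth ** 10
print(f"{after_ten_years:.2f}x")  # prints "1.97x", i.e. nearly double
```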
 

Delta Force

Banned
It turns out that there actually is a real life case of this. France uses nuclear energy for 76.9% of its electricity needs (link), and due to technical requirements it can't use it for much more. In fact, even with its reactors modified to run in load following mode (varying electrical output to ensure grid stability by not producing too much energy) and the largest quantity of power exports in the world (link), French nuclear reactors have a capacity factor of only 73.9%, below the world average of 80% and the American standard of over 90% (link). Some nuclear power stations actually close on the weekends for lack of demand.

France doesn't seem to have any water issues, already encourages electric heating (and presumably electric cooking), has a good electrified mass transportation system, and has some of the lowest power prices in Europe. Ignoring the fact that France probably shouldn't have built so much in the first place, what kinds of things could they or another country in such circumstances do to better utilize their plants? Could electric vehicles or electrolysis for hydrogen/ammonia fuel and fertilizers be a feasible option? Apparently ammonia is competitive with gasoline at $3 per gallon, and European fuel prices tend to be much higher (more information on hydrogen and ammonia in general here).
 