AHC: Change how the USA gets Electrified!!!

Can Electrical Storage be made a legal requirement early on in the USA?


Your challenge, should you choose to accept it, is to change how electrification spreads across the USA, right from the get-go. The system we have today is a national power grid that relies on centralized generation and distribution, meaning that if part of the grid goes down, somebody is going without power. In cases where there is widespread damage, the number of folks affected can be quite large, and it can take some time to get them all back up and running.

That's the system we have today.

The challenge is to have laws put in place that make it illegal to run electricity to an end user more than a mile from its generating source, and instead require the transmission lines to be hooked to a local electrical storage facility/device.

Back in the early days, not too much could be run off electricity, after all: mainly lights and heavy equipment, with little in the way of household devices. As electric power becomes more the way things work, have the laws require ever more duration from the local storage facility/device.

What effect would such laws have in the following areas?
Urban/City areas.
Industrial/commercial.
Suburban/Big towns.
Small towns.
Rural areas/individual homes in the countryside.

Would the use of batteries on such a massive scale make our battery technology much more advanced and affordable after ~100 years of this requirement being enforced? Or would we see totally different and better means of storing electricity developed instead? Liquid metal batteries, or rotational (flywheel) storage, perhaps? Here.

What would make a good standard for folks living way out in the countryside? An hour's storage for their house/farm on their property, owned by them and usable only by them? Perhaps additional storage that they could sell back to the grid, but separate from their own personal-use facility? Would it make sense to require just a single hour for folks that far from the generators? Maybe 24 hours would make more sense? I personally would like to see far more duration of average electrical use required in such locations, at the very least weeks if not months, and would be thrilled to see it in terms of 'years', but that probably would never fly.
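
For a rough sense of scale, here is a minimal sketch of what those durations imply; the 1 kW average draw is my own illustrative assumption, not a figure from the thread:

```python
# Illustrative storage sizes for a rural home, assuming a 1 kW average draw.
avg_draw_kw = 1.0

for label, hours in [("1 hour", 1), ("24 hours", 24),
                     ("1 week", 24 * 7), ("1 month", 24 * 30)]:
    print(f"{label:>8}: {avg_draw_kw * hours:>6,.0f} kWh")

# 1 hour:      1 kWh
# 24 hours:    24 kWh
# 1 week:     168 kWh
# 1 month:    720 kWh  -- months of autonomy means a very large battery barn.
```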

So, what are your thoughts?
 
Congrats, you just fucked over every city in the country. Do you know how much space this is going to take up in transformers and electrical storage?
 
No clue. But then again, let's look at the way-back days, when electricity was not yet widely used. As I understand it, at first, outside of heavy machinery, the only thing using electricity is going to be old-style arc lighting, no?
 
Its first widespread use was by industry. With laws like this in place, much of that industry will remain on steam engines for much longer. You may well have killed the US’s industrial advantage.
 
A couple things, though.

Only industry beyond a mile from the generating facility would even be affected by this law, and this gives rise to industrial sectors having a power station at their heart.

As technologies improve, perhaps the law's distance limit goes up? And/or perhaps the laws could be different for different end users.

My main interest is individual homes in the modern era, where folks would have gotten used to having battery backups for their homes for decades. What effects would we see from this by today? Would we see everyone with some solar panels on their roofs, or having an Ambri liquid metal battery?
 
Re-watching those videos brings back the excitement I felt watching them for the first time, back when. Here is a new link to an article published the 6th of July, 2020. Very exciting stuff. In the beginning of the research into liquid metal batteries, they had to be kept at 700 °C, or about 1,300 °F. The newest reports are of a liquid metal battery that needs just 20 °C, or about 68 °F.
 
This is my industry. The way I pay for this internet connection. I have actually designed such systems. Though my work has ended up being more in Building Automation I still get to design some renewable systems once in a while.

There are three basic types of distributed electrical production systems:

1. Grid-tied - These are systems that simply dump renewably produced power into the grid, there to be balanced by the provider. In the context of home systems, this would be one where your system causes your meter to spin backwards while producing. It is the cheapest and simplest way to do distributed production.

2. Grid-interactive - These systems use storage but are still tied to the grid. Basically you determine the level of storage that you want to maintain locally and the relative priority of charging your own systems or returning power to the grid (the grid being the larger part of your storage). In the context of home systems this would be having enough battery storage to run essentials for a day or two if the power goes out, while still gaining the benefits of running your meter backwards the rest of the time. This is also popular with time-of-day rated buildings as a method of load shifting to the cheaper times.

3. Off-grid - This is your cabin-in-the-woods scenario. It is, by far, the most expensive and difficult variation, because in this scenario the combination of your storage and on-site generation needs to account for not just your normal loads but your peak loads. It requires an order of magnitude more storage (see the rough sizing sketch below).
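
A minimal back-of-envelope sketch of the jump from type 2 to type 3; all load figures and the autonomy period are my own illustrative assumptions, not industry standards:

```python
# Back-of-envelope storage sizing for the three system types.
AVG_LOAD_KW = 1.5        # assumed average household draw
ESSENTIAL_LOAD_KW = 0.5  # assumed "essentials only" draw
HOURS = 24

# 1. Grid-tied: the grid does all the balancing; no local storage.
grid_tied_kwh = 0.0

# 2. Grid-interactive: essentials for ~2 days, per the description above.
grid_interactive_kwh = ESSENTIAL_LOAD_KW * HOURS * 2          # 24 kWh

# 3. Off-grid: full average load through several low-generation days,
#    and the system must also be able to meet peak load on its own.
AUTONOMY_DAYS = 4
off_grid_kwh = AVG_LOAD_KW * HOURS * AUTONOMY_DAYS            # 144 kWh

print(grid_tied_kwh, grid_interactive_kwh, off_grid_kwh)
# 0.0 24.0 144.0 -- plus peak-capable inverters and generation on top.
```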

Your suggestion is effectively to turn each one-mile-radius area around a generating station into an off-grid system. This requires that the generator and the storage be sized for the load that exists in that area.

On the generation side this requires that instead of fewer, more efficient generating units you have thousands of smaller, less efficient generators. These will often need to be running outside of their rated load, reducing efficiency still further. They will also need to be replaced more often as loads increase and decrease with the movement of businesses and homes and the relative increase or decrease of electricity use.

And to this you have added a battery bank. Now, to actually get this done it should probably be started prior to 1900, but if we assume that this happens around 1900 your options for secondary (rechargeable) batteries are, in order of likelihood, lead-acid, nickel-iron and nickel-cadmium batteries. None are great options for mass storage. All have limits as to their rate of charge, rate of discharge, required minimum level of charge and charge efficiency. So your already inefficient production system will need to charge and discharge from a battery system, losing energy to heat both ways (nickel-based batteries' charge/discharge efficiency is about 80%, modern lead-acid is 90%; this applies both going in and coming out). If you are lucky enough to have the right physical conditions you could maybe do pumped storage (75%-85% now, likely less back then) or gravity storage (modern systems around 90%, again less in 1900).
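
To make the double penalty concrete, using the round figures above (the percentages are from the post; the arithmetic is just compounding):

```latex
% Round-trip efficiency compounds the charge and discharge losses:
% \eta_{rt} = \eta_{charge} \cdot \eta_{discharge}
\eta_{rt}^{\text{nickel}} = 0.80 \times 0.80 = 0.64 \quad (\sim 36\%\ \text{lost})
\eta_{rt}^{\text{lead-acid}} = 0.90 \times 0.90 = 0.81 \quad (\sim 19\%\ \text{lost})
```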

With this level of extra cost and complexity added to the process of electrification, you can expect it to be much slower and steam systems to remain competitive longer. This increases the relative cost of production, eating away at the advantage US business was gaining in this era. It also, ironically, increases the environmental impact of that production, since it basically all still needs to be produced with fossil fuels and you have made that process much less efficient. Large areas will not be able to afford a generation system within a mile of them and will remain unelectrified, or will rely on relatively small, inefficient gasoline, diesel, oil, natural gas or coal generators.

In general this is a bad idea, and an inefficient method of encouraging battery development. Many of the more promising technologies in use today had to wait on breakthroughs in material science. The Molten Salts of liquid metal batteries are understood because of nuclear energy research, for example. Lithium Ion batteries required 3 interrelated breakthroughs in material science in the 80's.

If you want to increase the speed and uptake of renewables and battery development (which I am guessing is driving your question) the time to do it is in the 1970's. The Oil crisis drove a lot of research into energy efficient and alternative energy technologies that was largely underfunded when the crisis passed. If the drive for these systems remains through the 80's and 90's, many of them could have been developed and reached financial viability earlier, during the 90's and 2000's.
 

marathag

Up through the onset of rural electrification, Edison nickel-iron cells, first patented in 1899, and 32-volt wind turbines were the popular farm power system.
Those cells wouldn't wear out, even though they had low efficiency.
Most turbines were in the 1000-2500 watt range.
 

Perhaps DC power systems retain their popularity (largely due to the relative simplicity of DC battery backup systems with early 1900s technology) and polyphase AC systems are novelties that are only used in specific circumstances.
 
There is also the problem of power conversion. Are you going to use a DC system? (Batteries are DC; modern grids are AC.) If so, you will need lots of generators. You will not be able to use one of the greatest inventions of all time, the AC induction motor. You will have to use DC motors and their associated maintenance problem, brushes. DC generators are also very maintenance-intensive, with lots of brushes, commutators, and atmospheric conditions to maintain the proper film on the commutator. If you go with AC, then you need to convert AC to DC and back. Diodes were invented around 1906 so you can use rectifiers for AC to DC. Until modern inverters you will need motor generators to go from DC to AC (they can also go AC to DC). That is more maintenance you will need to do. Lead-acid batteries also need lots of maintenance: a proper temperature environment, proper air flow to remove hydrogen generated during charging, maintaining the electrolyte in each cell, etc. Every conversion also introduces losses into the system, making it less efficient. It would be so much easier and cost-effective to just go with large AC plants and transformers.
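
To illustrate how those conversions stack up, here is a minimal sketch; the per-stage efficiencies are assumptions chosen to be roughly plausible for early motor-generator sets and lead-acid cells, not measured figures:

```python
# AC plant -> motor-generator (AC to DC) -> battery charge -> battery
# discharge -> motor-generator (DC to AC) -> end user. Losses multiply.
MG_EFFICIENCY = 0.85   # assumed motor-generator efficiency, each direction
CHARGE_EFF = 0.85      # assumed era lead-acid charge efficiency
DISCHARGE_EFF = 0.85   # assumed discharge efficiency

delivered = MG_EFFICIENCY * CHARGE_EFF * DISCHARGE_EFF * MG_EFFICIENCY
print(f"{delivered:.0%} of generated energy reaches the load")  # ~52%
```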
 

marathag

Diodes were invented around 1906 so you can use rectifiers for AC to DC. Until modern
That was with the diode vacuum tube.
However, there was a missed opportunity: the sorta-semiconductor rectifying properties of selenium-iron had been discovered in the 1870s, but it wasn't till 1933 that selenium rectifiers were commercially available. They had low efficiency compared to vacuum diodes or the later germanium and silicon true semiconductors, but were easier and cheaper to manufacture.
 
Ironically enough, after I posted this thread, I had the power go out just long enough to make me restart my computer/clocks etc...

And then, about 7:00am, it went out and stayed out for like 8-8.5 hours!!!

Makes me really wish there had been something that made folks have to have a battery backup system of some kind, but from the sounds of things, we just didn't have the materials science to pull this off till much later.
This is my industry. The way I pay for this internet connection. I have actually designed such systems. Though my work has ended up being more in Building Automation I still get to design some renewable systems once in a while.
That is great! It is doubly ironic, because your post happened while I was sitting here in the dark, lol.

In general this is a bad idea, and an inefficient method of encouraging battery development. Many of the more promising technologies in use today had to wait on breakthroughs in material science. The Molten Salts of liquid metal batteries are understood because of nuclear energy research, for example. Lithium Ion batteries required 3 interrelated breakthroughs in material science in the 80's.
Well, that kind of puts the proverbial kibosh on getting them early on, or at least soon enough to make this thread's premise worth investigating, in the context of OTL's electrification process.
If you want to increase the speed and uptake of renewables and battery development (which I am guessing is driving your question) the time to do it is in the 1970's. The Oil crisis drove a lot of research into energy efficient and alternative energy technologies that was largely underfunded when the crisis passed. If the drive for these systems remains through the 80's and 90's, many of them could have been developed and reached financial viability earlier, during the 90's and 2000's.
Yep, you guessed it, that is indeed my focus, but I goofed and thought that it would have had to be something imposed right at the start, when we just were not there yet.

I have been kinda sorta following Donald Sadoway with his liquid metal battery thing, and I seem to vaguely remember some video where they were saying that a single 53-foot tractor-trailer could carry enough of his LMBs to power 240 homes (although I have no clue for how long), and I thought to myself, wouldn't that be something, if every neighborhood had one of those things today? Looking at the Ambri site before the power went out, it seems they are not actually at the field-testing stage yet?
 
So, with the information provided by @ArtosStark:
Could we then see some major effort back in the 1970's, bringing us LMBs sooner rather than later, and, given an earlier start, when might we have seen them coming to market?
 
The Molten Salts of liquid metal batteries are understood because of nuclear energy research, for example.
Today, with this thread, is the first I have ever heard of liquid metal batteries, and so the idea and its limits and liabilities have not had much time to settle in my head. But having seen the idea proposed, I have to wonder now: how much of the waiting time until it could plausibly be invented in a workable format is a "Columbus's Egg" sort of thing ("it is easy and simple once one knows how"), versus ideas that simply require materials and/or methods that cannot be transposed back in time very far?

So the molten salt electrolyte is a crucial component, one of three core items (plus auxiliary stuff like vessels that can contain the hot liquids without either deteriorating rapidly or contaminating the core elements) without which the concept does not work at all. And in fact it would seem no one OTL had any expertise or experimental data to guide either selecting an appropriate electrolyte or handling it practically--that was a spinoff of post-WWII "atomic energy," as they liked to call it in early days, when a really wide variety of reactor types were considered conceptually, and a few (but still numerous compared to the number of basic design types getting a lot of work done on them today) were tested out.

That's fine, but how outlandish would it be for some chemist, engineer or industrialist visionary with a fair degree of technical knowledge to get the idea of a molten metal storage battery directly, and then, reasoning from chemical first principles, infer they need to develop a molten salt electrolyte, and thus turn to experimenting with candidate mixtures at high temperatures way back in, say, 1900, or even earlier? If we go too far back into the 19th century, basic atomic-theory chemistry is in a wilderness of empirical weird stuff, lacking the sorts of insights into valences and so forth that would point the way toward the basic concept. I would think by 1900 we'd be on fairly safe ground there: all but a few of the more exotic metals, most of them "rare earth" types, would be identified positively, with relatively few gaps left in the major part of the periodic table, a thing that is empirically well justified even though the modern atomic theory that lets us understand why both nuclei and atomic shells are organized the way they are is decades in the future. Much chemistry we understand well today "because of quantum" would then be tentative vaporware theory accounting for limited empirical data, but the fundamentals of many candidates for molten metal cathodes and anodes are fairly well known, at least for the solid state--undertaking to work with molten material is going off on an experimental adventure, of course.

I would think one price to be paid in mucking around with molten metal batteries in the 19-Oughts would be that one would be unlikely to stumble upon the low-temperature options any time soon, as they'd be playing experimental blind man's bluff with lots of permutations and without sophisticated principles to point the way to the low-temperature options. Therefore they would have to pay the price of seeking out and developing extremely high-temperature refractory materials in a non-reactive liner of some kind.

Some of the OTL progress of the past decade or so will not happen fast in the 1900s, such as finding the lower-temperature combinations, though that would progress empirically, I suppose, by brute-force trial and error, particularly if something useful can be brought to market early on and then later replaced by something cheaper to operate. But other aspects I think can emerge directly from the experimental work. The discovery, for instance, that a lead-antimony mix can be mostly relatively cheap lead with only a moderate amount of antimony was an experimental surprise that in retrospect had a clear and obvious explanation, and this sort of discovery, assuming that it is practically possible to work with vessels of metals and molten salts that are sitting at kilo-K, could lead toward economical and useful operation.

Now I read and took to heart your analysis of why the OP proposal is counterproductive even if we had distributed storage media that were cheap to obtain and operate. Whereas a dieselpunk-era molten metal storage battery is sort of the opposite; like the power plants, it seems plain that, pragmatically speaking, one would want rather few of these cells in a gigantic installation (minimizing the surface-to-volume ratio, though I gather electric power storage/release rates are also proportional to area and not volume).

So one application that comes to mind, given the OTL realities of power generation as a commodity, is that power plants tend to operate most efficiently at a fixed high level of demand, whereas consumption has a strong tendency to be cyclic and, on top of that, somewhat chaotic in its demand. I gather for other related reasons the drive to operate industrial machinery 24/7 actually developed way back in the steam age, in part indeed because of steam engines themselves, and more deeply from the general principle that capital machinery accomplishes no productive purpose while it sits idle. So industry at any rate is not too difficult to leverage over to continual production with a constant power draw, pretty near, but other aspects of electric power consumption remain stubbornly tied to the human community's need to be fairly well in phase with the solar day and night.

But what if we have, on the site of some central power generation station, a great big liquid metal battery? We can run the plant at peak output for peak efficiency at all times and, whenever demand is slack, store up power usefully if not with perfect efficiency in the battery (perhaps leveraging power plant exhaust waste heat to help maintain a suitable layered thermal blanket around the battery); then, when demand surges up beyond the plant's installed capacity, up to a point we can supplement output with power drawn from the battery.

As a practical setup, the AC dynamo rotor can be mechanically coupled to a DC rotor, with a system to detect rising or falling power demand; if demand falls, excitation of the rotors is shifted from the AC output to DC, delivering current to be stored in the battery. There ought to be a range at which the primary power source, I am assuming here a steam turbine, can raise or lower its output torque at a target RPM, so that when demand drops over time it is possible to throttle back the steam plant output, maintaining RPM while we slack off the torque generating net current--that is, we throttle back the rate of DC power flow to avoid oversaturating the storage battery and to save on parasite losses due to Ohm resistance and so forth, but the steam plant is buffered from having to respond very rapidly. Then vice versa if demand for power surges up, the DC generator runs in reverse as a torque-generating motor to maintain fixed RPM and thus AC cycle constant while higher excitation on the AC side produces more current output, again hopefully if high demand persists we can ramp up the primary plant generation within a certain range to relieve the DC anti-surge system.

This setup permits sizing the primary generator closer to average demand, which will be smaller than any system that can provide the same range of output wattage without a battery buffer. Of course the actual dynamo part of the whole set-up is increased in mass, doubled seems like a good estimate (which makes it more effective as a passive flywheel to be sure, but demands heavier plant and more energy to spin it up initially). Another way to use stored power in the batteries would be to have a surge demand generator to add to total output capacity, and use a portion of the stored battery power to accelerate general spin-up of that system to come on line.
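
As a toy model of the buffering idea (a minimal sketch; the hourly demand curve and all numbers are invented for illustration, and round-trip losses are ignored):

```python
# Toy peak-shaving model: the plant runs flat at the average demand while
# the battery absorbs the off-peak surplus and covers the on-peak deficit.
hourly_demand_mw = [30, 28, 27, 27, 30, 38, 50, 60, 62, 58, 55, 52,
                    50, 50, 52, 55, 62, 70, 68, 60, 50, 42, 36, 32]
plant_mw = sum(hourly_demand_mw) / len(hourly_demand_mw)  # ~48 MW, flat

soc = 0.0                        # battery state of charge, MWh (relative)
lo = hi = 0.0
for demand in hourly_demand_mw:
    soc += plant_mw - demand     # positive: charging, negative: discharging
    lo, hi = min(lo, soc), max(hi, soc)

print(f"plant: {plant_mw:.0f} MW flat vs {max(hourly_demand_mw)} MW peak")
print(f"battery swing needed: about {hi - lo:.0f} MWh")
```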

In this context, a version of the OP concept of storage units outside the main generation plants would be to have substations where there is only a battery plus double dynamo but no prime thermal (or whatever) power generation. If surges up or down tend to be localized within the grid, these surgy sections can be isolated by one of these substations; a generally constant AC flow, transmitted at high voltage long distance, connects to a triple dynamo as a Tesla-style induction motor driving the integral rotor, with the DC dynamo/motor either tapping off excess power into storage or feeding stored power out while the output AC dynamo delivers local AC in widely surging wattage.

If the primary AC feed from the central power station is cut off completely, the battery/DC system goes into high power output to maintain the output AC for a time. This emergency can trigger a signal that I suppose can travel along the power lines themselves to sound an alarm warning of possible imminent power shutdown, and with some sophistication in the system, compliant power users can either manually or via automatic machinery gradually but firmly curtail their power demand. Say we have a hospital on the same circuit as a block of residential housing--the householders get an alarm warning them their power could go out pretty soon now, and a sophisticated communication system could give them an estimated time of cut-off. They know that if everyone cooperates in cutting out unnecessary power they can stretch that ETA, whereas certain priority users (say, people in homes using electrically powered medical equipment) are exempted by switching arrangements at the power substation.

The hospital gets top priority (rather, their facility is wired with essential circuits in parallel with non-essential, the latter subject to cut-off but not the former) until its own installed emergency generator is up and running and the facility switches over, signaling the substation they too can be cut off if need be. Once the battery is drained to low levels, the output is dropped to any remaining priority circuits that a trickle can maintain; this buys time for authorities to evacuate any persons who absolutely require ongoing power if main power cannot be restored in time.
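
A sketch of that tiered-priority shedding logic (the tiers and state-of-charge thresholds are hypothetical, invented for illustration):

```python
# Shed circuit classes in priority order as the substation battery drains.
TIERS = [  # (shed below this state-of-charge fraction, circuit class)
    (0.60, "non-essential residential"),
    (0.40, "compliant commercial"),
    (0.15, "ordinary residential"),
]   # priority circuits (hospitals, powered medical gear) are never shed

def circuits_to_shed(soc: float) -> list[str]:
    """Circuit classes the substation should cut at this state of charge."""
    return [name for threshold, name in TIERS if soc < threshold]

print(circuits_to_shed(0.50))  # ['non-essential residential']
print(circuits_to_shed(0.10))  # all three sheddable tiers
```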

Thus, the central power generation station can handle a major surge in demand by interrupting the supply to such substations intermittently, so that the on and off surges of power rationed out to the substations are deemed adequate to sustain low levels of demand, with the communication elements of the system advising high demand potential users to refrain (noncompliance can cause their feed lines to be cut out of circuit). Of course any customers who agree to be placed on the back burner like that will expect to pay lower rates, whereas any who demand high priority will be asked to pay higher rates (generally; governmental regulation might well mandate exceptions for high priority public service entities such as hospitals, fire stations, police stations, etc).

Another field where I would think a precocious focus on molten metal batteries might find both markets and leverage for public support for more intensive research and development leading to incremental improvements would be naval submarines.

In the real world, in theory one might have submarines for other than military purposes, and certainly oceanic science is a thing independent in principle from these--but in practice submerged vehicles don't turn out to have many practical applications in transport. In theory, a submarine below the depth of the wavelength of surface waves can lower its drag considerably, if suitably streamlined, and thus require less energy to traverse a kilometer of distance--especially if it can go slow, since hydrodynamic drag (having avoided wave drag) goes as the square of flow speed, so halving the speed cuts power demand by a factor of eight (assuming high-efficiency propellers).

But generally speaking, cargoes bulk a lot more than one cubic meter per tonne (that is, are less dense than water), so that to carry miscellaneous cargo a submarine must also haul a lot of dense ballast, and the overall mass to shove through the water rises a lot. Only a few specialty niche cargoes look good even on paper (crude oil for instance, but even that is less dense than water). And prior to the development of nuclear power, practical propulsion requires oxygen in some form or other, or straight battery power--hauling the oxidant along with fuel to drive the submarine is a losing proposition economically, whereas putting up some kind of snorkel to breathe in air offsets the drag reduction advantage.

Overall, the only context in which submarines sensibly haul cargo is again military, when it is necessary to evade hostile enemy navies or commerce raiders who would shut down one's straightforward and much more economical shipment of goods on surface vessels. This is why, while a certain number of submersible vehicles are indeed science vessels, much oceanic science piggy-backs on military naval submarine operations, and broadly speaking, only navies operate subs; their virtue as a "stealth" weapons platform is what pays for their considerable cost vices!
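
The factor-of-eight figure follows directly from standard drag scaling (nothing submarine-specific in the math):

```latex
% Drag grows with the square of speed; propulsive power is drag times speed:
D \propto v^2, \qquad P = D\,v \propto v^3
% so halving the speed:
\frac{P(v/2)}{P(v)} = \left(\frac{1}{2}\right)^3 = \frac{1}{8}
```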

Around 1900 in particular was a period of time in which development of submarine technology was running hot, in all major world navies (even Spain was involved in some advanced submarine projects in this pre-Great War time frame). On the side of primary power supply, the diesel engine was the crucial breakthrough (one which, despite Holland's precocious development of the general art, Americans lagged far behind the Germans in, so that despite the greater risks and other liabilities Great War era US subs were still gasoline fueled) though the more radical idea of hydrogen-peroxide fed engines (unless I am mistaken, I infer that the Walter style peroxide engines were basically turbines, though I can imagine how a piston-type engine might be driven) had potential--again, a matter of setting liabilities and limits against military utility and the limitations of the competing established tech.

Improving batteries, however, would definitely be a help even to a navy that had mediocre solutions to the prime mover problem. Short-range shore defense subs could definitely operate on pure battery power, particularly if rapid recharge is an option--this could enable small short-range subs to range around surface fleets, returning to moor to mother ships for recharges. As for the basic IC engine/battery setup, clearly even if some advanced battery design were in no way superior to the OTL go-to standard of lead-acid batteries in the matter of power storage density, if it at any rate did not emit hydrogen gas when being charged, and could be both rapidly charged and rapidly discharged without shortening its working life, then submarines at any pre-nuclear state of the art would gain capability--surfaced or "snorting" time could be reduced with the engines running on high power to rapidly recharge; maximum submerged speeds, particularly with improved streamlining, could be much higher (and perhaps not too noisy even so; the motors themselves are probably relatively easy to silence while the improved hull streamlining makes them quieter too).

Two plain objections to high temperature molten metal batteries installed in submarines emerge at once of course--for one thing, the problem of making a high temperature crucible without baking the crew from its leaking waste heat, or alternatively creating a heat signature for enemy ASW forces to detect, is pretty severe--one again needs an inner vessel that endures the high temperature continually and indefinitely, contained in a structure that both holds the hot inner vessel securely and insulates it to minimize heat loss--thermodynamically speaking zero heat loss is Utopian, so some heat signature has to be output, it is a question of whether it can be kept down low enough to be well masked when submerged.

And the other is that it seems plain to me that any kind of molten metal battery has to be held level, so as to neither spill nor slosh! A very low rate of sloshing is presumably tolerable but beyond a certain threshold, it will disrupt the separation of anode and cathode material by the electrolyte, which probably has to maintain a critical thickness. I am not sure just what happens if a highly charged molten metal anode and cathode material splash together--possibly not much worse than degeneration of power that can be extracted in the short term, but also possibly it goes bang as large short circuit currents are generated superheating the metals above their boiling points! The batteries would have to be kept level despite a submarine pitching up or down in combat maneuvers, and despite the possibility of both desired and unexpected rolling. I am envisioning the need to install the things in gimbaling suspension against these accelerations and torques, and even if electromechanical feedback mechanisms and robust hard to jam motors work reliably enough to prevent lateral surging within the crucibles, such mounts must multiply the total mass of the storage battery as a complete system considerably--in this consideration, we had better have superior energy storage density for the option to be feasible!

If however we can get the temperatures down somewhat, provide the efficient insulated containment, and the reliable gimbal suspension, despite the garish risk of a horrible death by being splashed by released hot metal (and I suppose a shock that could burst the crucibles would also tend to wreck the sub instantly anyway) world navies would at some point be very glad to switch these systems in versus lead-acid batteries. Note that even with development of nuclear fission power plants, the nature of fission power generation tends to favor steady ongoing bus power supply--I believe it is feasible to devise a reactor/generator system that can surge within certain limits, say a factor of two, between red-line maximum power output and "idling" low power, but one cannot throttle fission reactor rates down too low because of fission daughter product isotopes, some of which are "neutron gobblers" which tend to poison the reactor unless it is kept fissioning above a critical power level that can overcome the levels of these isotopes produced. These isotopes do decay into less troublesome ones to be sure. But as with a land based central power plant installation one ideally wants the nuke plant to be churning out wattage pretty steadily. So once again a bank of storage batteries can come in handy, to absorb surpluses while we ramp a fission core down gradually and provide surge power while we are carefully ramping it back up to peak power. And such a large bank of batteries can buy considerable operational margin in case circumstances require the fission cores all be scrammed completely, including a power reserve that might be necessary to restart the things.

So, are molten metal batteries in fact something that could not be built at all in say 1910, if only by an uptimer (SI, time traveler, whatever ASB thing) armed with Foresight knowledge? Is it categorically impossible for the state of the art of 1910 engineering to make a practical MMB? And if we find that yes, a suitably funded enterprise in the early 20th century could make practical MMBs at moderate fabrication and operational prices, is it really ASB to postulate some people conceiving of the abstract concept and then setting about empirically to discover suitable substances for the three conceptual elements?

It is definitely not something I know of anyone having any concept of over a century ago, but in principle it seems like the kind of thing Jules Verne could describe lucidly in operational theory without excessive handwaving, making the practical execution a matter of a search for the three materials plus a fourth structure to contain the stuff.
 