Alternate Computing

What are possible alternative ways of computing besides Babbage engines, optical circuits, DNA computers, or molecular or quantum computing? I'm wondering if anyone else has heard of the possibility of using mechanical circuits manipulated by water, from the Clive-less World timeline.
 

Darkest

Banned
What about light-based computing? Photovoltaic cells are invented early, and the computer is run by mirrors and laser light. Perhaps some kind of evaporation effect: light is concentrated onto small water orbs, which turn into vapor and enter another chamber, turning a gear that moves mirrors, alternating between light and solar cells.

Essentially, it would be a very large machine, though you could make it the size of a truck if you engineered it right. If the water gets cold, the computer doesn't run as fast, so these machines would primarily be used near the equator. It probably also wouldn't be as advanced.

Other ideas: sand-based computers, maybe with a system of weights and balances? I have no idea how that would work, though.

Potato-based computers? Potatoes become the primary source of electrical energy after someone discovers how they work. If so, potatoes would become a very important crop in the future.
 
A trinary system could also work, with computers based on magnets' attract, repel, and neutral states rather than the binary on/off system.
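As an illustrative sketch (not from the thread): the natural number system for such a machine would be balanced ternary, where each "trit" is -1, 0, or +1, mapping directly onto repel, neutral, and attract. The names and functions below are hypothetical.

```python
# Hypothetical sketch: balanced ternary arithmetic for a three-state
# (repel / neutral / attract) magnetic computer.

def to_balanced_ternary(n):
    """Encode an integer as a list of trits (-1, 0, +1), least significant first."""
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:            # a digit of 2 becomes -1 with a carry into the next trit
            trits.append(-1)
            n = n // 3 + 1
        else:
            trits.append(r)
            n //= 3
    return trits or [0]

def from_balanced_ternary(trits):
    """Decode trits (least significant first) back to an integer."""
    return sum(t * 3**i for i, t in enumerate(trits))
```

One nice property for hardware: negation is just flipping every trit (swap attract and repel), so no separate sign bit or two's-complement convention is needed.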
 

Grey Wolf

Donor
Dick Francis

Dick Francis wrote a novel in the early 1980s set about 20 years in the future. He extrapolated the state of play with home computers as it was at that time and simply continued the trend. This is surely ONE ROUTE that computers could have taken in an ATL: non-compatibility except in the ability to talk to one another. Not just IBM-compatibles and Apples, but a dozen or so major makes, all with their own coding and all with dedicated users.

I'm going to have to find that book again; I'm sure I have it around here somewhere.

Grey Wolf
 
There are many different types of computing circuits possible.
I invented a kilohertz one using paintable relay logic, which I may patent, because sometimes you need a computer that's cheap and you don't care if it's slow. It's 1840 technology, pretty much. Static-electric relay logic might be possible around 1650, at hertz rates, as a calculator; it wouldn't be usable for anything else that required any kind of speed. That's no better than cogwheels, but cheaper. Electron tubes weren't ready until amplifiers came out around 1905. Semiconductor logic needs 1950 or so, for semiconductors to be understood, though you could have built them by 1930.
 
There were also different kinds of analog computers. They were especially good for graphics in the era when memory was extremely expensive, which kept the resolution of digital computers down. If the IC had been developed later, they might still be dominant.
 
I'd second the analog computer idea. There is, I think, still some untapped potential there, with op-amp-based machines instead of transistor-based ones.
You covered most of the alternatives people tend to come up with in sci-fi... hmmm, let's see...

How about organic computers? Not DNA or molecular, but organisms that function as computers. Perhaps a currently existing organism is altered to function as a computer, complete with input and output devices. Or perhaps computers are genetically inserted into one's genes, so that a computer grows as part of a person.
 
I think that all the different hardware designs mentioned here (especially the more realistic ones) would result only in differences in computing performance (i.e. speed). To a user they would all do the same things as our current digital computers: add, subtract, and perform input and output to peripheral devices.
 

Grey Wolf

Donor
hexicus said:
I think that all the different hardware designs mentioned here (especially the more realistic ones) would result only in differences in computing performance (i.e. speed). To a user they would all do the same things as our current digital computers: add, subtract, and perform input and output to peripheral devices.

Well, true, but the Dick Francis scenario would result in a very different society in so many ways that it's worth considering. Microsoft, Bill Gates, Windows: none of these would be all-pervading. Names such as Atari, Sinclair, Jupiter Ace, and Commodore would still be referencing not only computers to a standard design but different animals entirely. The internet would also be vastly different.

Grey Wolf
 

Diamond

Banned
Grey Wolf said:
The internet also would be vastly different
If it even came to be at all. Think about trying to interface all those different system types! Instead, what about dozens of 'internets', each keyed to one type of computer system?
 
Diamond said:
If it even came to be at all. Think about trying to interface all those different system types! Instead, what about dozens of 'internets', each keyed to one type of computer system?
I don't think it would be all that hard to interface more different system types. What would likely happen is that either the computers themselves would become more alike than different, or a common interface would be developed and manufacturers would figure out how to get it to talk to their computers, just like we have LAN cards that are PCI, USB, PCMCIA, ISA, etc. Of course, each computer manufacturer's operating system must also learn to interface with whatever TCP/IP equivalent there is in such an ATL, but that's not hard; everything from Linux to OS X to Windows to BeOS does that in OTL.
Now of course you need the infrastructure as well: routers, ISPs, fiber-optic backbones, network access points... but that all relies on a single communication standard, which is certainly possible even given multiple computer manufacturers.
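The "single communication standard" point can be sketched in a few lines: machines with wildly incompatible native formats can still interoperate, as long as everyone agrees on one wire format. A hypothetical illustration (function names are mine), assuming a big-endian wire order like OTL's network byte order:

```python
# Hypothetical sketch: incompatible machines agree only on the wire format.
# Each vendor converts between its native representation and this standard.
import struct

def to_wire(value):
    """Serialize a 32-bit unsigned integer in the agreed big-endian wire order."""
    return struct.pack(">I", value)

def from_wire(data):
    """Deserialize from the wire, regardless of the receiver's native byte order."""
    return struct.unpack(">I", data)[0]
```

This is essentially how OTL's internet works: hosts differ internally, but `htonl`-style conversions at the boundary make the network itself vendor-neutral.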
 
Compatibility between brands, or even between unrelated users, is not a natural development. It hinges on a bizarre decision made by IBM in the very early 80s.

During the 70s and 80s, business computing was fundamentally run on the service model. You called up the computer fellow, explained what sort of information you needed to track, store and manipulate and how, and he designed, built and programmed your system from the ground up. There was no effort to make it compatible necessarily even with other computers from the same designer, because that was not part of the service contract.

Home computers came along. Apple came to dominate them, because Apple allowed third parties to create compatible hardware, and released enough of its BIOS that third party OS's were humanly possible. This was pennies and nickels compared to business computing's many dollars, but IBM felt the need to compete anyway. And since Apple was the industry leader thus far, IBM decided to use the Apple strategy and simply do it more thoroughly and efficiently. Third party hardware, OS and software could be created for IBM machines fairly easily.

Then IBM sat and did nothing for 6 years. It made no attempt to interfere with the free market, partly because it quickly acquired a plurality of market share. It didn't even prosecute flagrant violations of its copyrights and patents. Thus, when it started losing market share to the clones, it attempted to go proprietary again, but it was too late--PC users had become accustomed to compatibility for 10 years, and refused to buy anything which did not adhere to the open standard created by IBM. And when PCs became powerful enough that they could be used for business functions in the mid to late 80s, they destroyed the one-off business computer industry.

All you have to do to eliminate this is make IBM stay proprietary at all points.
 
How about organic computers? Engineered neuron nets in organic fluid and nutrient baths. Or human brains in jars, MUWAHAHAHAHAH!
 