How do computers develop with earlier transistors?

Let's say a point-contact transistor is invented about 25 years before OTL (1922 or thereabouts). What I'm interested in -- for this thread -- is how computers would develop ITTL.

From what I can piece together on Wikipedia, it looks like the biggest advance needed for computers at this time is the very theory and mathematics on which even the most rudimentary software is based -- primarily in the form of the Church-Turing Thesis, which OTL didn't arise until the mid-30s. At the time of this POD, the decision problem hadn't even been posed in its modern form.
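For concreteness, the kind of formal object that theory eventually supplied is tiny. Here's a minimal Turing machine sketch in Python; the rule table (binary increment) is made up for illustration, not any historical machine:

```python
# Minimal Turing machine: a rule table maps (state, symbol) to
# (symbol to write, head move, next state). Blank cells read as "_".
def run_tm(tape, rules, state="scan"):
    cells, head = dict(enumerate(tape)), 0
    while state != "halt":
        sym = cells.get(head, "_")
        write, move, state = rules[(state, sym)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Rules for binary increment: run right to the end of the number,
# then add one with the carry rippling back to the left.
rules = {
    ("scan", "0"): ("0", "R", "scan"),
    ("scan", "1"): ("1", "R", "scan"),
    ("scan", "_"): ("_", "L", "add"),
    ("add",  "0"): ("1", "L", "done"),
    ("add",  "1"): ("0", "L", "add"),
    ("add",  "_"): ("1", "L", "done"),
    ("done", "0"): ("0", "L", "done"),
    ("done", "1"): ("1", "L", "done"),
    ("done", "_"): ("_", "R", "halt"),
}
print(run_tm("1011", rules))  # -> "1100" (11 + 1 = 12)
```

The value isn't this toy program; it's that the model gives a precise definition of "computable" to reason with, and that's the piece missing at this POD.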

First, is this about right, or am I missing something? Second, given these limitations, what would the evolution of the computer look like where the electronics (to a greater extent) precedes the theory?
 
Babbage.

Although Babbage's Engine was mechanical, the Lady Ada worked out a 'command language' to suit...

http://en.wikipedia.org/wiki/Ada_Lovelace

"In 1953, over one hundred years after her death, Lady Lovelace's notes on Babbage's Analytical Engine were republished. The engine has now been recognised as an early model for a computer and Lady Lovelace's notes as a description of a computer and software.[27]

Her notes were labelled alphabetically from A to G. In note G, the Countess describes an algorithm for the analytical engine to compute Bernoulli numbers. It is generally considered the first algorithm ever specifically tailored for implementation on a computer, and for this reason she is considered by many to be the first computer programmer."
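For anyone curious what Note G amounts to in modern terms, here's a short Python sketch of the standard Bernoulli-number recurrence. It's the same computation, not Lovelace's exact table of operations, which was laid out for the Engine's mill and store:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n via the classical recurrence
    sum_{j<m} C(m+1, j) * B_j = -(m+1) * B_m, with B_0 = 1
    (using the B_1 = -1/2 sign convention)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B[m] = -s / (m + 1)
    return B

print(bernoulli(8))
# -> [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30]
```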

http://en.wikipedia.org/wiki/Strowger_switch

This electro-mechanical widget could have been useful for a Babbage Engine, and was patented ~1891...
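As a rough illustration of the fit -- a toy model invented for this post, not a real selector circuit -- a Strowger-style stepper is essentially a ten-position wiper, which is exactly what one decimal digit of a counting engine needs:

```python
# Toy model: one Strowger-style selector as one decimal digit.
# A real two-motion selector also steps across ten levels (100 outlets);
# this sketch keeps only the ten rotary positions.
class Selector:
    def __init__(self):
        self.pos = 0                      # wiper position, 0-9

    def step(self):
        """Advance one position; return True on wrap (carry out)."""
        self.pos = (self.pos + 1) % 10
        return self.pos == 0

# Chain three selectors into a three-digit decimal counter:
digits = [Selector() for _ in range(3)]   # units, tens, hundreds
for _ in range(275):
    i = 0
    while i < len(digits) and digits[i].step():
        i += 1                            # ripple the carry upward
print([d.pos for d in reversed(digits)])  # -> [2, 7, 5]
```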

An ATL could have seen the Wright Brothers hiring time to crunch numbers for their Flyer's design...
 
Well, the Wright Bros making use of this is beyond what I was thinking about -- but are you saying the math and theory were actually in Babbage's work, ready to be used?
 
Okay, computer scientist here.

Developing a point-contact transistor earlier probably wouldn't have had a huge effect. Perhaps it'd speed up development slightly, but it wouldn't be revolutionary. Two things to note. First, the point-contact transistor was not the first type of transistor conceived: Lilienfeld patented the field-effect transistor in the 1920s (filing in Canada in 1925), right about when you'd have your point-contact transistor, and it didn't make a splash at all, for many reasons. Second, even after the point-contact transistor was developed OTL, it was years before transistors were actually used in computers. Again, there were multiple reasons for this.

Now, the point-contact transistor is far easier to build than the field-effect transistor, so producing the transistors is not going to be as much of an issue as OTL. However, the math and theory simply aren't there -- part of the reason there wasn't a huge push into implementing field-effect transistors. Again, having transistors be more common is going to push the date a couple of years earlier, but we're not going to see massive computer advancements. You're going to see more changes in simpler electronics like radios than in computers.

Nik, you're confusing the issue. This doesn't have much to do with software issues. It's a matter of underlying computer architecture, how they were designed, how they should be built, how all of the incredibly intricate and complex electronics work together. And (electronic) computer engineering simply wasn't very sophisticated at the time.

The Wright brothers wouldn't get much of a chance to use it. Computers simply weren't mass produced enough at the time. They were massive, expensive undertakings used in major endeavors with lots of funding (read: government, or VERY big businesses). Having good transistor use helps a little bit, but the Wrights wouldn't be able to afford the high cost of even temporary use.
 
Helpful.

I contribute to a TL where we've just about reached the stage where 'computers' will need to be invented to cope with the advent of the Space Age, so I'll refer to the above info again in future.

Does anyone out there have any suggestions for what alt-computers ought to be called, especially in a TL where the Roman Empire survives?

Thanks!
 
Analytic machine is a good one (quickly shortened to just analytic). Apparently the Latin word for computer (i.e., "one who computes") is "computator." That sounds a bit too close to OTL if you're going for a really different feel, but analytic has a Greek origin. That's about all I can personally offer; hopefully someone who knows some Latin can think up an alternate word. Perhaps it could be named after one of its creators?
 
Sentiapparati?
 
IIRC, it wasn't until Bardeen and Brattain actually built a working transistor that its potential was understood. So, the most practical PoD may be to have someone try to build Lilienfeld's FET in the late 1920s and notice the gain effect. Then, when large germanium crystals start to become feasible in the 1930s, someone puts two and two together and builds a point-contact transistor before WWII. The explosion in need for computational power during the war could then mean a wider use of transistors by the end of the war. I don't know that it would change much, but there it is...
 
Developing a point-contact transistor earlier probably wouldn't have had a huge effect. Perhaps it'd speed up development slightly, but it wouldn't be revolutionary.

The bigger effect might be on solid-state physics--remember Bardeen? Well, he wasn't quite done with the Nobel-level work after just that...(look up "BCS theory"). But obviously besides what one guy might do, there might be other effects on understanding semiconductor physics from building an early transistor. I'll admit I don't know what they are (but hopefully after this semester will have at least an idea!)

Apparently the Latin word for computer (i.e., "one who computes") is "computator."

"Computer," I think, came from the term "Computer" used to denote young women who did calculations before the advent of, well, the computer (particularly in astronomy, where some achieved quite some notoriety and made some major breakthroughs). So I'm not so sure that the name will be butterflied...
 
The explosion in need for computational power during the war could then mean a wider use of transistors by the end of the war. I don't know that it would change much, but there it is...

Right. By the time that a) mass producing transistors became viable and b) we have reasonably modern computer architecture, we've already reached the 40s. So, at best, we've only moved things a couple of years earlier. And then engineers will hit another brick wall pretty quickly before someone thinks up ICs, which makes OTL and TTL that much closer in computer development.

Point-contact transistors showed up at pretty much the perfect point in time to have the biggest effect. Getting transistor-based computers in mid-WW2 would be neat, but there simply weren't enough computers during the war, nor did scientists realize their full potential.


I actually don't know. :eek:

I'm a computer guy, not a physics guy. I know what's necessary for some basic computer engineering, and whatever I remember from my two college physics classes, but that's it.

"Computer," I think, came from the term "Computer" used to denote young women who did calculations before the advent of, well, the computer (particularly in astronomy, where some achieved quite some notoriety and made some major breakthroughs). So I'm not so sure that the name will be butterflied...

You're right on the etymology. It's fully possible the Latin "computator" would be used. I think the name is really ugly, though. :p

It depends on what Megas is going for. If he wants a realistic Latin version, computator is fine. If he wants something reasonably different to add some flavor and originality to his timeline, computator is kind of close.
 
IIRC the Germans built a large tube-type computer in the early '30s; however, it was too expensive to operate [electric bill and tube replacement costs].
Given a machine that's cheaper to operate, with transistors producing less heat and not burning out, we may see Germany becoming the 1930s Silicon Valley.
 
I think the biggest effect of earlier transistors would be on radios rather than computers. Suddenly they're cheaper and much more rugged.

Think of a WWI where you're not dependent on field telephones.
 
IIRC the Germans built a large tube-type computer in the early '30s; however, it was too expensive to operate.

I don't know anything about that. The only early German computing work I know of was Zuse's. Taking a quick look at the wiki to refresh my memory: his first computer was built in 1938 and was a failure because it didn't have enough precision, not because of tubes.

In any case, Germany wouldn't have been the computing mecca by any means. The US for the most part was the nation with all the big computer innovators. Zuse was a very bright man, but one early computer wouldn't have been enough to overcome the massive advantage the US (and, to a somewhat lesser extent, the UK) had in brainpower.

I think the biggest effect of earlier transistors would be on radios rather than computers. Suddenly they're cheaper and much more rugged.

That's what I thought. Transistors were originally used in small-scale electronics, not large-scale. The paradigm of the time -- computers as giant, massive machines -- plus simplistic architectures meant that transistors couldn't be fully exploited.
 
OTL, Konrad Zuse had trouble finding the right hardware: he had to build his Z1 computer not with radio tubes but with mechanical counters, and his later Z3 and Z4 computers had to be built out of electrical relays.

Luckily for us, the Nazi government didn't understand the opportunities of this technology and suppressed most research and development on computers during WW2.
 

Sorry, I transposed the words when I read it; I thought you were talking about the Wright Brothers using this…
:p
 
I work in the semiconductor industry, but not in manufacturing.

To get reliable large-scale manufacturing, you need bipolar transistors, and the best way to produce these is with diffusion. These two advances allow really large-scale production.

Computing can be done with discrete transistors (individual transistors in one package), but for more advances you need integrated circuits.

Incidentally, the idea of photolithography may not have been around until the 1950s, but the precursor, the printed circuit board, started in the 1930s. If there are early diffusion transistors, then all the building blocks are there.

I tend to agree that the earliest applications will be radio (low frequency at first, then moving up as manufacturing techniques improve). Possibly radar sets can be shrunk and made lighter.

The early applications of computers were cryptanalysis and gunnery/projectile calculation.

Another advance needed is the von Neumann machine -- the stored-program architecture.
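To make the stored-program idea concrete, here's a toy sketch in Python; the three-instruction opcode set and memory layout are invented for illustration, not any historical machine. The defining trick is that code and data sit in the same memory:

```python
# Toy von Neumann machine: one memory holds both instructions and data,
# and a fetch-decode-execute loop walks through it.
def run(mem):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, arg = mem[pc]             # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = mem[arg]
        elif op == "ADD":
            acc += mem[arg]
        elif op == "STORE":
            mem[arg] = acc
        elif op == "HALT":
            return mem

# Cells 0-3 hold the program, cells 4-6 hold data -- one address space.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(memory)[6])                 # -> 5
```

Because the program is just memory contents, such a machine can load or even modify its own code -- the step that separates it from fixed-purpose calculators.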

Regards

R
 
Darlington transistors...

"And then engineers will hit another brick wall pretty quickly before someone thinks up ICs"

http://en.wikipedia.org/wiki/Darlington_transistor

Mr Darlington had the brilliant idea of connecting two or three transistors on the same die. He used it to compound the modest gains of a pair or prial of transistor stages while keeping the temperature tracking manageable. Urban legend holds that he wanted to patent 'or more', too, but his lawyer talked him down...
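Back-of-envelope on why the compounding pays off: in a Darlington connection each stage's emitter current feeds the next stage's base, so the current gains effectively multiply. A quick Python check, with illustrative beta values rather than anything from a datasheet:

```python
def darlington_beta(*betas):
    """Compound current gain of cascaded Darlington stages:
    (1 + b1)(1 + b2)...(1 + bn) - 1."""
    total = 1
    for b in betas:
        total *= 1 + b
    return total - 1

print(darlington_beta(30, 30))      # a pair of modest transistors -> 960
print(darlington_beta(30, 30, 30))  # a prial of three -> 29790
```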

Had that extra phrase been in, he would have collected on *every* IC produced for the following decades...
 