Successful Fifth Generation Computer?

The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see History of computing hardware) which was supposed to perform much calculation using massively parallel processing. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and to provide a platform for future developments in artificial intelligence.[1]
The term fifth generation was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.[2] The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new Sixth Generation project began. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.


So what do you think: a failure, or ahead of its time?
 
Well, considering that all current supercomputers use the same basic architecture the Japanese pioneered in their fifth-generation computer project, it was a blazing success. The computer itself was a bit lame at the time; I read some articles about it in graduate classes, and it would not outperform "standard" supercomputers because the software techniques needed for massively parallel computing had not been developed yet. There was a lot of work in both software and hardware going on all over the world in the mid-to-late 1980s on how to handle multiple processors, which showed up in supercomputers in the 1990s and 2000s and is showing up in desktops now. My computer science department had a "massive" 32-core supercomputer in 1987 that we did experiments on to figure out how to break up standard programs so they ran with as little loss as possible. But I digress...
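The "loss" Tom describes when breaking up a standard program across processors is what's now usually quantified with Amdahl's law: whatever fraction of the program stays serial caps the speedup, no matter how many CPUs you add. A minimal modern sketch (Python, not anything from the era; the fractions are purely illustrative):

```python
# Amdahl's law: ideal speedup on n processors when a fraction p of the
# program's work can be parallelized. The (1 - p) serial remainder is
# the "loss" that can't be recovered by adding more CPUs.

def amdahl_speedup(p, n):
    """Ideal speedup with parallel fraction p (0..1) on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# On a 32-core machine like the one mentioned above, a program that is
# only 90% parallelizable tops out well below 32x:
print(round(amdahl_speedup(0.90, 32), 2))  # ~7.8x
print(round(amdahl_speedup(0.99, 32), 2))  # ~24.43x
```

That gap between core count and achievable speedup is roughly why naive ports of sequential programs to early parallel hardware tended to disappoint.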

So the ATL of this would be to have someone in Japan have the brainstorm of "we need to spend more time/effort on software than we do on hardware" or have some of the late 1980s breakthroughs in parallel programming happen in the early '80s or even the late '70s as thought experiments.

Tom.
 
Well, considering that functional/threaded languages like Haskell, Scheme, and even Erlang have their genesis in that era, a better question is: what if someone like, say, Texas Instruments decided to make one last try at the desktop using the TMS320x0 series of DSPs as CPUs and used Haskell or Scheme as the built-in language?
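The reason those functional languages keep coming up alongside parallel hardware is that a pure function has no side effects, so its applications over a data set can be evaluated in any order, or all at once, with the same result. A rough modern sketch of the idea (Python standing in for Haskell/Scheme here; `f` is just an illustrative pure function):

```python
# A pure function's results depend only on its inputs, so a runtime is
# free to farm the calls out across processors - the core promise that
# made functional languages attractive for parallel machines.
from multiprocessing import Pool

def f(x):
    # pure: no global state touched, output determined by input alone
    return x * x

if __name__ == "__main__":
    data = list(range(8))
    with Pool(4) as pool:
        parallel = pool.map(f, data)          # evaluation order is free
    sequential = [f(x) for x in data]         # ordinary one-CPU version
    assert parallel == sequential             # same answer either way
    print(parallel)
```

Imperative code with shared mutable state gives no such guarantee, which is why "break up a standard program" was the hard research problem rather than a compiler switch.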
 
Well, considering that functional/threaded languages like Haskell, Scheme, and even Erlang have their genesis in that era, a better question is: what if someone like, say, Texas Instruments decided to make one last try at the desktop using the TMS320x0 series of DSPs as CPUs and used Haskell or Scheme as the built-in language?

Well, I doubt it would be TI; they really just didn't get the PC market. I had the misfortune to own one of their attempts at a home PC, the 99/4A - it made the Radio Shack stuff look state of the art in 1980. Basically, someone in TI's upper management and/or marketing would need a brain transplant. There were lots of very smart engineers working on cool stuff behind the scenes, but management never let them develop it, and what they did develop, marketing did a miserable job of promoting.

I think it would be better to have some young TI engineers (darn, I can't think of any names off the top of my head) who know what the TMS320x0 series can do pull a Wozniak/Jobs (or a Hewlett/Packard if you would rather) and build a parallel desktop in the '83-'84 time frame (i.e., within a year or so after the TMS32010 was released).

That would probably be a better POD to give parallel computing a jump start - it would also get a cool new desktop out about the time I was seriously studying computer science and electronics! I like that!

Tom.
 
I'm planning a timeline in Alien Space Bats in which I (and a couple of other people) time travel back to the late '60s in order to bring about a better computing landscape by using knowledge of what works and what doesn't. The idea is to make computer hardware that succeeds because it works, rather than becomes ubiquitous by being ubiquitous.

By all means check out the link and give me your thoughts either here, there, or by PM.

Oh, and by the way, there was this French company called Excelvision that made computers based on TI hardware (8- and 16-bit computers using the TMS 7010 and 9900 series CPUs for home, TMS320x0 DSPs for business and workstations) and turned them into networked smart terminals. They went under when, after the general election of 1988, the French government reneged on several contracts, and their stock was then massively short-sold for four weeks straight, preventing any way to raise new capital. Their most lasting legacy was the architecture of Midway's "Letter Unit" series of arcade hardware.

Thing is, if they are using the TMS320x0 series as computer CPUs in the timeline I'm making, they'll be less likely to sell them to others at reasonable prices, even if they won't be used in directly compatible machines. Hmm, maybe someone could sign a second-source contract for them, and before the ink is dry on that, the "Jobs and Woz" guys show them the prototype for the TI 320 16/Z?
 
So the ATL of this would be to have someone in Japan have the brainstorm of "we need to spend more time/effort on software than we do on hardware" or have some of the late 1980s breakthroughs in parallel programming happen in the early '80s or even the late '70s as thought experiments.

That's… not going to happen. From the research I've done for my (abandoned) timeline that had a heavy focus on computers and Japan, it wasn't plausible.

Simply put, the Japanese didn't have the same software focus as the Americans. They were too invested in hardware ideas. Now, a few companies did try to break this.

The most interesting were, of course, Apple-derived. Canon - easily one of the most innovative firms in Japan; there's a great Wired article about them from the '90s - partnered with NeXT, which wound up a massive failure, but there are PODs to be mined there.

Likewise, General Magic had a lot of Japanese interest; something could have been done there, especially with Japanese investment in wireless technology. Some kind of General Magic + Sony + IDO[1] combination could create the smartphone revolution in '90s Japan based on pen input, trumping Symbian, BlackBerry, Palm, and Newton all at once. (Interestingly, for all the amazing pre-smartphone features Japanese mobiles had, the UI sucked, and General Magic had tons of ex-Apple designers.) In fact, there's a great tech timeline to be mined from that, where Newton (with a smaller model) and Palm are running a BlackBerry-like compressed data service because General Magic shows the way forward.

But Japanese software development for supercomputers in the 1980s? Almost certainly not. Certainly there are a number of PODs to bring Japan into the modern software world circa the 1980s, but none that would affect the government, and since the government pays for the supercomputers as an industrial program via MITI….


[1] IDO being owned by Toyota, another reasonably innovative company. IDO merged with DDI and KDD to form the modern Japanese mobile operator KDDI, which, until Softbank came around, was the much more innovative/radical competitor to NTT DoCoMo.
 