What countries could sustain a native computer industry?

So I've been reading a little about the history of personal computing and computing in general from the late 70s up to the early 90s. Before the current market stabilized around 'Wintel' machines, there were many competing technologies and standards, and many different systems for home computers, operating systems, game consoles and computing in general. What I've noticed is that the history of computing seems confined to the US, Japan, the UK, and some Soviet and European experiments.

What other countries have the potential to sustain a complete native computer industry? That is, they produce most or all of the hardware and software, and they develop new technologies. France or Germany? Argentina or Chile, if they avoided the military coups? Other Asian nations like Taiwan or South Korea (they have huge computing industries, but I'm not sure whether they actually build native systems)? Maybe Australia and/or New Zealand? South Africa?

I presume the 80s situation of many competing standards would not endure long, and sooner or later one of them would dominate the industry. Could it come from outside the US?
 
For those who can read Spanish, here's an interesting paper on the history of home computers in Argentina: http://publicaciones.dc.uba.ar/Publications/2010/DC10/monografiaArg_SHIALC.pdf. Among other things, it mentions a clone of the oldest videogame console (the Odyssey) called the Telematch, and several local electronics companies that built licensed computers. Also, while it glosses over it, it says that the 1976-1983 dictatorship was a blow to the national industry (not surprising, given the poor relationship between the military and the universities that were the cradle of much of the country's computing, and the regime's insistence on importing rather than producing locally).
 
A lot of countries could... if Acorn had been the solution of choice for the government / civil service / NHS as well as education.
 
Japan, Germany, France, the Netherlands, and the UK were the only countries besides the US with the expertise to develop the necessary semiconductors. The Japanese did quite well in microcontrollers and memory chips, while the French, Dutch, and British still have semiconductor companies in various segments. The bigger challenge is software, which inevitably becomes the dominant force in computing. Once a few key standards are established, software tends to develop best in decentralized environments, which disproportionately benefits the US given the size of its educated population. Further, IBM was THE dominant computing company in the 1980s, so whoever wanted to supplant Microsoft would have had to gain inroads with IBM. Geography alone makes this difficult, although SAP might have had a chance had it developed applications beyond supply chain management.

TL;DR, Americans possessed overwhelming early advantages.
 
Any country with an economy advanced enough to allow capital investment in computing by its government.

Seconded

Any country in the world could have its own computer industry with enough government, private, or mixed investment.
 
IMHO, there was a window of opportunity with the 'transputer'. An asynchronous 'RISC' device designed to be used in multiples, with occam language support, it had the rare ability to let different versions play nicely together. The CPU was simple enough for a university campus to 'home brew' with modest yield at a large process node, moving up a staircase of chip foundries toward smaller nodes and commercial yields. And, yes, if they tested okay, they'd play together...

Didn't happen. Gov.uk sold INMOS, killing the transputer, just as the tech began to mature...
{Spit !!}

Had the tech been licensed out like Arduinos, Raspberry Pis, etc., you'd have a situation where local manufacture was eased. Hey, so your local foundry's sweet spot was 120nm? Compared to the size of the die, that's still trivial...
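
For the curious: the transputer's occam model boiled down to small independent processes that communicate only over channels, with those channels mapped onto hardware links between chips. Here's a minimal sketch of that style in Go, whose channels descend from the same CSP lineage. This is just an analogy to the programming model I'm describing, not actual occam or transputer code, and the little squaring pipeline is an invented example.

// Sketch of the occam/CSP style: processes that share nothing and talk only
// over channels. On a transputer network each process could sit on its own
// chip, with the channels carried by the hardware links.
package main

import "fmt"

// worker squares each value it receives and sends the result downstream.
func worker(in <-chan int, out chan<- int) {
	for v := range in {
		out <- v * v
	}
	close(out)
}

func main() {
	in := make(chan int)
	out := make(chan int)

	// Run the worker as a concurrent process.
	go worker(in, out)

	// Feed the pipeline, then close the input channel.
	go func() {
		for i := 1; i <= 5; i++ {
			in <- i
		}
		close(in)
	}()

	// Collect results until the worker closes its output.
	for r := range out {
		fmt.Println(r)
	}
}

The point of the analogy: because each process only sees its channels, you can scale from one chip to many just by remapping where the processes run, which is exactly the kind of "they'd play together" property I mean.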
 