Japan, Germany, France, the Netherlands, and the UK were the only countries with the expertise to develop the necessary semiconductors. The Japanese did quite well in microcontrollers and memory chips, while the French, Dutch, and British still have semiconductor companies in various segments. The biggest challenge is software, which would inevitably become the dominant force in computing. Once a few key standards are established, software tends to develop best in decentralized environments, which disproportionately benefits the US given the size of its educated population. Further, IBM was THE dominant computing company in the 1980s, so whoever wanted to supplant Microsoft would have had to make inroads with IBM. Geography alone makes this difficult, although SAP might have had a chance had they developed applications beyond supply chain management.
TL;DR: Americans possessed overwhelming early advantages.
To be fair, the Japanese did not do much computer-wise, since the computer and IT revolution was born in the United States. For the Europeans to compete effectively with the Americans, you would need a European IBM and, at first, loads of public money spent on pure research into integrated circuits and transistors. Then you would need the right management to capitalise on this and make the right decisions.
It would seem that the US was destined to become the center of computing through IBM, so was there any way for a country other than the US to do so? American techwankery is getting boring...
Bonus challenge: a country that isn't the US and isn't in Europe