Potential Delays to Computer Proliferation?

I was watching a stop-motion video and this thought occurred to me: what events could delay the proliferation of computers, with the goal of determining how that, in turn, could affect various media? I mainly just want to try to cover all my bases for any and all of my AH projects.
 
More socially interventionist governments with higher personal taxes would reduce the number of "early adopters" with the surplus capital needed to buy early desktops, and governments themselves would be the main customers, mainly for big mainframes with dumb terminals. So a more left-wing/progressive 1980s in the US and UK?

WWII and post-war work on decryption was a big driver, so with no WWII or Cold War, and "gentlemen don't read each other's mail" governments, development is probably slowed by at least a decade.
 
Computers didn't proliferate very fast when software had to be, essentially, written for specific models. While CP/M was a nice standard, it wasn't very standard in practice. Apple II machines were probably the biggest single market for (non-CP/M, obviously) software.

The IBM PC, despite its numerous flaws, opened the market to (legal) clones and to business buyers thanks to its open BIOS, published architecture, and the IBM name. The old line in business settings was "no one ever got fired for buying IBM."

IBM didn't believe the PC was going to take off and be a major product - or they'd never have given the Boca Raton people who developed it the freedom to make it so open.

Without that (if IBM goes its usual proprietary route), PCs stay expensive, the software market stays terribly fragmented, and PCs don't explode like they did.

At a wild guess, some Unix variant wins out in the end, but not until memory prices come down and processor performance goes up perceptibly. Which will, in turn, take longer because the market just isn't as big.

That could easily slow things down massively. And lead to what's effectively an oligopoly of proprietary systems (Apple, IBM, probably HP and DEC, say, plus a few micros that expand out of the CP/M field.)
 

The big reason for IBM to make it so open was the DOJ antitrust investigation into its monopoly position. IBM worried that near-monopolizing the PC market as well would add fuel to the fire. So rather than do that, it kept the PC market open to try to prevent the government from breaking up its mini and mainframe business, which was more valuable at the time.
 