What if Intel and AMD keep increasing processor speeds after the 2000s

Saphroneth

Banned
At some point you run into fundamental physical limits, so this is actually very hard to do. It's to do with the sizes at which features can be fabricated on silicon wafers.
 

Saphroneth

Banned
its called overclocking
Overclocking involves pushing a CPU past the speed it's been tested at. It can yield some extra speed, but not a dramatic boost - and it's error-prone.

By definition, an overclocked processor is running outside the manufacturer's specification, as well.



Also, your attitude here is a bit sarcastic and confrontational. Speaking purely as a peer, I'd say keep an eye on that.
 

Saphroneth

Banned
look at amd zen on wikipedia theyre getting back to being faster
Raw clockspeed is no longer the main limiting factor.
In any case, simple speed-of-light issues are going to mean processors of a given size have an upper limit on clockspeed, and fabrication technique limits mean a lower limit on size for a given processor, while heat dissipation becomes more of a concern at smaller sizes.
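The speed-of-light point can be made concrete with some back-of-envelope arithmetic (illustrative numbers only; real on-chip signals travel well below vacuum light speed, so the actual limit is tighter):

```python
# How far can a signal travel in one clock cycle, at best?
# Assumes vacuum light speed; on-chip wire propagation is much slower.
C_LIGHT = 3.0e8  # metres per second

def distance_per_cycle_cm(clock_hz):
    """Maximum distance (cm) a light-speed signal covers in one cycle."""
    return C_LIGHT / clock_hz * 100.0

for ghz in (1, 4, 10):
    print(f"{ghz} GHz: {distance_per_cycle_cm(ghz * 1e9):.1f} cm per cycle")
# 1 GHz -> 30.0 cm, 4 GHz -> 7.5 cm, 10 GHz -> 3.0 cm
```

So at 10 GHz a signal can cross at most about 3 cm per tick even in the ideal case, which is why a chip of a given physical size has an upper bound on clock speed.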

Basically clock speeds reached the limit of technology and now it takes very expensive research to push a bit further.
 
What if Intel and AMD speed up the processor speeds after 2000s

If they could, they would have. Believe me. There are fundamental problems with increasing clock speed beyond where we are (smaller transistors mean higher voltage gradients for a given voltage, so you need to drop the voltage, which means dropping speed). Also, keeping the same switching rate with smaller geometries means higher heat production per unit area.
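The heat argument follows from the textbook CMOS dynamic-power model, P ≈ C·V²·f. A quick sketch with made-up numbers (not taken from any real processor datasheet) shows why pushing frequency alone is so costly:

```python
# Classic CMOS switching-power model: P = C * V^2 * f
# (switched capacitance, supply voltage, clock frequency).
# All figures below are illustrative, not from any real chip.

def dynamic_power_watts(cap_farads, volts, freq_hz):
    """Switching power of a CMOS circuit under the C*V^2*f model."""
    return cap_farads * volts ** 2 * freq_hz

base = dynamic_power_watts(1e-9, 1.2, 3e9)       # toy baseline
doubled_f = dynamic_power_watts(1e-9, 1.2, 6e9)  # double the clock
print(doubled_f / base)  # -> 2.0: power scales linearly with frequency
```

Doubling the clock doubles the switching power, and because power rises with the *square* of voltage, you can't just raise the voltage to make the faster clock stable either - which is the bind described above.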

Basically, they've hit a really steep curve on clock speed. There's not much they can do about it with any realistic known technology.

SO....
to get them to keep increasing clock speed, you have to get significant new technologies in place. In which case, the clock speeds may be the least of the differences.


Edit: you might be able to double top clockspeeds by immersing the processors in liquid nitrogen - but then you have to worry about all the other components that don't want to be that cold. And running a nitrogen liquefaction plant just for your computer isn't going to be popular....
 
what if they have no problems for intel and amd back in the 2000s when they speed the processors up

Considering all the other work needed to increase the speeds, i.e. a whole new generation of manufacturing fabs, I don't see how you could massively increase the speeds back in the '00s.
 

CalBear

Moderator
Donor
Monthly Donor
There is, as has been noted, a limit on the speed available with current tech, at least within practical limits.

Quantum computing will be the next breakthrough.
 
Graphene is discovered shortly after the publication of buckyballs in Nature in 1985 by Kroto, Curl, and Smalley, spurring a bunch of research into producibility twenty years earlier. Either IBM, AMD, or Taiwan Semiconductor starts marketing a cheap graphene process ca. 2000.
 
Liquid cooling becomes standard on desktop computers to handle the thermal load from the processors. Faced with the cost and weight, desktops are quickly displaced by more moderately specified notebook computers.
 