AHC: No Personal Computer

Also no personal computer ...

The Amiga was quite powerful back in the day

Amiga might win out :)
Newcastle Hardcore Techno will have to be manually articulated, with human beings singing the television or rap samples and with complex-as-fuck drummers drumming continuous-loop breakbeats and cut-up breaks. DJs force dubplates.

Without the 500, 600 and 1200 you have destroyed art. Some bastard will still make the 303, but nobody, without the Amiga, will make a song like "Igloo Terror" (4 × 64 8-BIT SHIT).
 
Yeah, the basic mouse concept goes back to the first trackballs developed for radar in 1947, and it was reinvented independently many times after that. So that's not something you can just wish away either. It sorta screams that there is a gaping hole in interfacing that needs to be filled with some very obvious solutions.
 
The personal computer needs the optically-size-reduced IC chip developed for missiles in the sixties. If nobody thinks of it, parts of technology freeze. The best example I can compare it to is CPR, cardiopulmonary resuscitation, the first aid procedure. It was put in place in 1958. There is no reason it couldn't have been implemented in the 19th century, as it requires no instruments or medication. Nobody thought of it, plain and simple.

In an earlier post, I mentioned a brief war in 1960 that ends the Cold War, takes away the need for ICBMs and stops the race to the moon. Some space flights will still happen. Weather satellites, when initiated, used transistors. In the late sixties, televisions with "works in a drawer" used solid-state electronics for every component except the picture tube. That was progress that exceeded the expectations of WW2 veterans who had endured the Depression and wartime rationing.

In the mainframe years, learning about computers meant learning FORTRAN or COBOL, languages that took more training than the average person wanted to pursue.
 
He means zero computers, not even a Sinclair or BBC Micro.
Well, that POD will need to be pre-microprocessor... but it's almost inevitable in many ways.

Keep complexity up and the learning curve steep for computer usage.

I.e. the realm of mainframe and midrange iron.

Keep terminals up to date... if they work on transfer speeds, then say companies could install terminals at employees' houses or whatnot for work-from-home scenarios.

You could keep that going long enough that the PC basically gets introduced as tablets or some form of smartphone, etc.

PC success was not inevitable... it really wasn't till Win95 and 98 that it started picking up steam toward the market penetration we see today.
 
It wasn't gatekeeping for gatekeeping's sake. It was because the only system the Education Department had for the compiler (don't ask me why the programming course was taught by the Education Department and not Comp Sci) was one with terminal access only. Vi was the best editor on that system. (I told her she should have ripped a program out using ed just to impress the instructor.)
Scientific computing is often weirdly retro.

It's a world where GUIs don't exist because you're doing all your work in batch jobs, where you're lucky if your target machine runs a real Linux kernel as opposed to the barebones custom AIX-Linux bastard LLNL was using in production as late as 2020, whose "business requirements" are the laws of physics, and where people still use Fortran 77.

We've upgraded to Vim though.
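
Just to illustrate what "all your work in batch jobs" means in practice, here's a minimal sketch of a SLURM-style submission script. The job name, node counts, module names and the solver binary are all made up for the example:

#!/bin/bash
# Minimal SLURM-style batch script (illustrative; the resource numbers,
# module names and executable below are invented for this sketch)
#SBATCH --job-name=demo_sim
#SBATCH --nodes=4
#SBATCH --ntasks-per-node=32
#SBATCH --time=08:00:00
#SBATCH --output=demo_sim_%j.out   # %j expands to the job ID

module load gcc openmpi            # hypothetical module names
srun ./solver input.nml            # hypothetical MPI executable and input file

You hand that to the scheduler with sbatch, poke at the queue with squeue, and read the output file whenever the job finally runs; there's no GUI anywhere in the loop.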
 
Without the internet, there are no trolls, bullies can't bully people in their own bedrooms, antivax and other stupid conspiracies find it a lot harder to spread, people can't buy illegal drugs and weapons online, and if, in democratic nations, the government does something that really annoys a lot of people, they protest about it in the streets instead of just muttering about it online or posting stuff that gets them arrested.
 
where you're lucky if your target machine runs a real Linux kernel as opposed to the barebones custom AIX-Linux bastard LLNL was using in production as late as 2020
News to me--when I was running jobs at LLNL in 2018-2019, it sure acted like real Linux.

where people still use Fortran 77.
I guess, but at least in the field I was in, all the interesting stuff happens in C++; Fortran is mostly for legacy code that's long since been debugged and verified to be correct, so why change it?
 

News to me--when I was running jobs at LLNL in 2018-2019, it sure acted like real Linux.


I guess, but at least in the field I was in, all the interesting stuff happens in C++; Fortran is mostly for legacy code that's long since been debugged and verified to be correct, so why change it?
The last time I dealt with Fortran was around 1980, replaced by C.
It was just 'C' at that point; C++ came later.
So Fortran was mostly legacy even then, but of course some diehards stuck with ALGOL or Ada for all the typical reasons: too new, too expensive, additional training and conversion, etc.
 
PC success was not inevitable... it really wasn't till Win95 and 98 that it started picking up steam toward the market penetration we see today.
In the late eighties, Macintosh and Windows made computers user-friendly. I'd say it was more like 1995 when they became an office necessity. If hard drives, processor speeds and memory had frozen at early-nineties levels, personal computers would not have become common appliances.
 
News to me--when I was running jobs at LLNL in 2018-2019, it sure acted like real Linux.
Sequoia or Sierra? The "AIX-Linux bastard" remark was strictly referring to the former.

I guess, but at least in the field I was in, all the interesting stuff happens in C++; Fortran is mostly for legacy code that's long since been debugged and verified to be correct, so why change it?
Ok, F77 is admittedly just legacy code that's being kept around. That being said, in my experience people do still write new code in F90.
 
You can SLOW the development down a bit but you can't slow it down THAT much. Most of the things that turned into today's PCs and the Internet are just updated versions of concepts that existed in the 70s. The government was working on the technology to create a distributed network that could survive nuclear war a LONG time ago. Universities had connections between their computer networks for a long time. Go look at Xerox PARC and see how long ago it was playing around with these concepts.

Like most modern technology, you can't easily stop it or even slow it too much. Modern tech is often developed by multiple groups at give or take the same time, so there is no one person with a eureka moment you can delay to hold things back. That is why you see similar tech come out, and then eventually the two or three different versions combine the best features until they are all replaced by something else. The various attempts to build a higher-storage floppy disk are one example of this, or the number of companies that produced PCs before IBM, or the various car companies, etc.
I could give you a time machine and a sniper rifle and all the bullets in the world, and you would die of old age before you took out enough people to delay cars by more than a decade.
So a long-term delay in PCs will take something HUGE, like a nuclear war, a supervolcano or a major asteroid.
Once the basic technology hits a given level you are going to see its bigger uses happen. Once you get the technology to develop Steam Engines you are going to get steam engines and once you get steam engines you will get steam locomotives and the sailing ship is ultimately doomed.

Technology has a life of its own and it evolves. It takes huge changes to have a significant effect, either in slowing it down or speeding it up. Look at the cost put into the Manhattan Project and the Moon Shot, and they produced limited increases in tech. On the other hand, look at WW1 or WW2 or the Spanish flu: these massive events had a somewhat limited effect on slowing technology.
So if you want a major slowdown, you need something GIGANTIC.
 
I don't think it was either of those, actually. Some minor cluster or other.
Ah, ok. Not sure if that's how LLNL works but at my shop the minor clusters got the "just install CentOS 7 plus a bunch of F90/C/C++ compilers and MPI implementations, organize the latter two into modules and call it a day" treatment. So, they'd indeed run a real Linux if nothing else.

(For the non-SciComp people: A "module", in this context, is a glorified .bashrc file.)
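
To make that concrete, here's roughly what a single "module load" boils down to behind the scenes; the compiler version and install paths are invented for the example:

# Roughly what "module load gcc/9.3.0" amounts to under the hood
# (version number and install prefix are made up for illustration)
export PATH=/opt/apps/gcc/9.3.0/bin:$PATH
export LD_LIBRARY_PATH=/opt/apps/gcc/9.3.0/lib64:$LD_LIBRARY_PATH
export MANPATH=/opt/apps/gcc/9.3.0/share/man:$MANPATH

The one trick a plain .bashrc can't do cleanly is "module unload", which reverses those path edits so you can swap compilers or MPI stacks mid-session.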
 