Chapter 111: “Livin’ Thing” - The Beginning of the Information Age
Above: Steve Wozniak, Steve Jobs, Paul Allen, and Bill Gates; collectively these visionaries would be called the “Big Four” of the burgeoning revolution in personal computing.
“Be nice to nerds. Chances are, you’ll end up working for one.” - Bill Gates
It began decades earlier with the advent of computer terminals, clunky interfaces designed to provide time-sharing access to more powerful, central computers. Prior to the invention of the microprocessor in the early 70’s, computers were almost entirely large, costly systems owned by big corporations, universities, government agencies, and similar-sized institutions. These gargantuan systems were often never even directly used by end users, who more often used auxiliary, off-line equipment such as punch cards to prepare tasks for the computer to then go and perform on its own. These operations could be simple or complex, but ultimately amounted to the computer, no matter how powerful, being little more than an oversized, overpriced calculator. After the computer completed its operations, users could collect the results, but they might wait hours, even days, for even powerful machines to finish the job. Over time, however, small developments would begin to change this model of digital interaction.
The Kennedy years of the 1960’s brought more widespread terminal networks, and with them, the ability for multiple users to share a single mainframe processor at the same time through a time-sharing system. First developed commercially, this breakthrough would become one of several necessary for completing JFK’s dream of putting a man (or in this case, a man and a woman) on the Moon. Scientists and engineers immediately saw the benefits of these systems and sought to maximize their efficiency and capabilities. But these developments soon hit a brick wall of their own. Because multiple users were forced to share a single mainframe processor, the amount of processing power each user could access remained minimal. Ironically, it was in other, early computer research at MIT and other institutions that the future of so-called “personal computing” would be foreshadowed.
At MIT, Carnegie Mellon University, and other places where pre-commercial, early computers were still being developed and experimented with, a new model was emerging - one in which each user would have a computer with access to its own, dedicated processor. In the mid-60’s, such an idea was radical and virtually impossible on grounds of economic feasibility alone. The arithmetic, logic, and control functions of computers lived, at the time, on separate, costly circuit boards, and required large amounts of equally costly magnetic core memory. Despite these hurdles, “baby” steps in modern computing were taken along President Kennedy’s proverbial “journey of a thousand miles”. T-square, developed in 1961 at MIT, was an early example of what would later become computer-aided drafting, or CAD, software. Video games as we know them today were arguably invented with 1962’s Spacewar! While these early computers were the size of refrigerators, and could not have been built without massive grants and endowments, progress was still being made. Everything changed with the invention of the microprocessor in the early 1970’s, developed concurrently by Texas Instruments, Intel, and Garrett AiResearch.
By integrating the arithmetic, logic, and control functions of an entire circuit board onto a single chip of metal-oxide-semiconductor field-effect transistors, high-density circuits could be constructed cheaply and easily, drastically reducing the size required for computers. It was the beginning of a revolution and the Information Age as we know it. Even in these early days of personal computing, researchers at SRI, a non-profit scientific research institute founded by Stanford University, and Xerox PARC, a for-profit computer research center in Palo Alto, California, were experimenting with the idea of computers that a single person could use and that could be connected by fast, versatile networks; not home computers, but personal ones.
Among those working on these early experiments for Xerox were a couple of starry-eyed visionaries from Los Altos and San Jose, California - Steve Jobs and Steve Wozniak. A “nerd” in the classical sense, Wozniak, or “the Wizard of Woz” as millions of his fans would come to call him, was born and raised in San Jose to Margaret Louise Wozniak and her husband, Jerry, an engineer for Lockheed. Throughout his teen years, Wozniak was a tremendous fan of Star Trek, and would later credit his fandom of the series (and attendance at several of its famous conventions) with his decision to pursue a career in science. Wozniak was expelled from the University of Colorado Boulder during his freshman year for hacking into the university’s computer system and sending prank messages over it. Shortly thereafter, he successfully enrolled at UC Berkeley, where he built his first computer in 1971 with the help of his friend, Bill Fernandez. Wozniak would later drop out of Berkeley and take a job with Hewlett-Packard designing calculators, before befriending another computer geek, Steve Jobs, who managed to get him a job at Xerox after showing the company Wozniak’s homemade copy of the board from Atari’s popular video game Pong.
Jobs, meanwhile, had a long and complex road toward becoming the early computing visionary he is seen as today. Born to a Syrian Muslim teaching assistant father and a Swiss-German American Catholic mother, Jobs would eventually be raised, along with his biological sister, Patricia, by Paul and Clara Jobs - a repo-man turned Coast Guard mechanic and the daughter of Armenian immigrants. Raised in a generally happy, supportive home, Jobs took to mechanical pursuits at a young age as well. By 10, he was befriending older engineers and “fix-it” types in his neighborhood, but struggled to make friends his own age. In school, he was picked on for his introverted personality. With the help of encouraging teachers and his parents, as well as a high school with strong ties to the burgeoning Silicon Valley, Jobs developed interests in snowshoeing, nature, taking psychedelic drugs (particularly LSD), and reading Shakespeare and Plato. Eschewing college to focus on his own “pet projects”, Jobs eventually found a job as a “technician” at Xerox’s research facility in Palo Alto. There, he discovered work already being done on the so-called “Xerox Alto”, the first personal computer designed from its inception to support an operating system based on a graphical user interface (GUI), which later adopted the desktop metaphor. Though Xerox was unaware of the magnitude and potential of its discovery with the Alto, Jobs was enraptured by the machine, and worked at every opportunity to convince his superiors that the device had the potential to fundamentally change the way people used computers. The company was reluctant to hear Jobs out, “not just because he was a hippie”, Jobs would later admit, but because they were uneasy about the idea of reentering the commercial computer market.
It seemed to them at the time like such an industry was going nowhere fast, and that they would better serve their shareholders by focusing on their time-tested products: copying, printing, things like that. It wouldn’t be until after the company hired whiz-kid Steve Wozniak, and he co-opted the Alto’s innovations to design what would later become the X-1 computer, with its advanced bitmap display and mouse-centered interface, that the company came around to the idea of selling these designs to the general public. With Jobs becoming their “idea man” and Wozniak the technical genius behind their plans, Xerox was poised to become one of the biggest names in the burgeoning computer industry. But in exchange for their labor and vision, Wozniak and Jobs demanded near-total freedom to design and build their own creations with the help of the rest of the Xerox PARC team. Intrigued by the energy these young upstarts brought to the table, PARC head Jerome Elkind and his lead researcher, Bob Taylor, agreed to give them a chance. By the spring of 1981, the team of Elkind, Taylor, Wozniak, and Jobs would see the Alto’s successor, the X-1, ready for public release, and the era of the personal computer would officially begin.
While Silicon Valley, California and the Genesee River Valley, New York became major areas of technology investment and development in the late 1970’s thanks to Xerox, another city - Seattle, Washington - would play host to the other big name that would come to dominate the personal computing market: Microsoft. The story of this latter, scrappy company began several years earlier, in 1972, when childhood friends Paul Allen and Bill Gates shared a common desire to start a business around their computer design and programming abilities. Though they struggled to settle on a concept at first, spending much of the Bush years dropping out of Harvard (Gates) and earning degrees at Washington State University (Allen), 1975 marked a turning point, when the pair worked together on a BASIC interpreter for the Altair 8800 microcomputer from MITS (Micro Instrumentation and Telemetry Systems). MITS was so impressed with the pair’s work that it sold the resulting product internationally under the name “Altair BASIC.” A few years later, following Mo Udall’s election to the Presidency, Gates and Allen relocated to Seattle, headquartering their company in nearby Bellevue. Gates agreed to serve as the company’s initial CEO, while Allen would primarily be in charge of product testing and development. Though it would take a while for their business to turn a meaningful profit, the pair persevered and began work on operating systems, namely their own form of Unix, called “Xenix”. These experiments would eventually lead to MS-DOS in 1980, and later, its far more famous successor, Windows, in 1983. With the development of DOS and Windows, Microsoft was, without realizing it at the time, creating what would one day become the most widely used OS for PCs around the world. As their software applications became more widespread and popular, the company also diversified, adding a publishing division, Microsoft Press, in 1983.
It was this same year that co-founder Paul Allen would leave the company, after being diagnosed with Hodgkin’s disease and following a rather acrimonious split with his former friend. Gates, now in sole control of Microsoft, would come to develop a domineering, arrogant reputation, though this did little to stem his overwhelming success. By the end of the millennium, he would be one of the richest men in the world...
These leaps and bounds in technology were impactful in another area as well - video games! While digital, electronic games had been around in some rudimentary form for decades, few were readily available to the public. The onset of the “Information Age” would see this change, rapidly. The release of Pong by Atari in 1972 marked the first commercially successful “arcade game”. Other beloved titles followed, including Breakout (1976); Heavyweight Champ (1976); and of course, Space Invaders (1979). Joining these independent releases were some of the first ever licensed video game companions to films and comic books, beginning with 1978’s Superman, made to promote the Steven Spielberg-directed picture. In addition to arcades, the very first games for home consoles and PCs were being released as well. 1975 brought D&D - the first text-based computer role-playing game (CRPG), based on Gary Gygax and Dave Arneson’s tabletop creation from the year before. Two years after that came Star Trek, an early 8-bit graphic strategy game, which allowed players to take command of the U.S.S. Enterprise and lead it on a journey through the Alpha Quadrant to pursue and capture several escaping Klingon ships. While these early games were as simple as they come by today’s standards, at the time they were nothing short of revolutionary, and marked the beginning of an industry which would one day come to rival even music, television, and film for dominance in popular culture. Of course, early gaming would not reach its first “golden age” until the mass marketing appeal of the following decade, when a booming American economy and a group of game developers in Japan, including a young man at Nintendo named Shigeru Miyamoto, would create new names and titles which would change the face of gaming forever.
Next Time on Blue Skies in Camelot: Hairline fractures in the Udall Coalition; A New Generation Comes of Age