Earthquake Weather: Pop Culture & Tech Goes Weirder

A Node Magazine Special Report
  • Earthquake Weather
    [ May / 1986 ]
    [ A Node Magazine Special Report ]
    [ Written by Martin Campbell ]


    Shinjuku Streets, Tokyo 1985 by CanadaGood, on Flickr


    THE HALLUCINOGENIC NEON LIGHTS of the future were obscured by dust and dirt and death.

    Nobody said anything, not the few originally wandering the sidewalk nor people like me coming out of late-night bars or love hotels or clubs—all the joys of early morning Tokyo. Even the hyper-efficient street repair crew a block south was idle, their water-cooled saws winding down to zero revolutions; I could feel them through my feet between the shocks.

    The Big One went off at 2:18 am, 10 April, 1986, with the epicentre just three miles north of downtown Tokyo. Godzilla. The first rumbles were picked up at the University of Tokyo, a pair of students conducting a monitoring test as part of their final project. It measured 1.1 on the Richter scale, nothing to worry about. Normal, even. It escalated rapidly after that.

    I was in a bar a few miles away, paying for obscenely overpriced single malt scotch for the salaryman I was using as a source and listening to a particularly excellent jazz band, working on an article about a certain Japanese bank that has clearly moved down the list of importance. As we stumbled out I don't think we could have done anything even if we had been capable of it, and we were not. We all just watched as Tokyo fell apart.

    Tokyo is one of the world's better earthquake-proofed cities. The city leads the way on, for instance, active seismic vibration control using mass dampers, a technology which has started to proliferate throughout modern high-rise buildings. With a long history of devastating quakes, the Japanese have placed a premium on modern technology to counter potential outcomes. What Godzilla revealed were the underlying problems of the political system.

    I've been living in Japan for six years now and, as regular readers of the magazine know, I cover pretty much the whole East Asian desk for Node Magazine. As a technology magazine first and a socio-political magazine second, we at Node get much wider latitude than American and European newspaper or television reporters: technology advertising pays the bills for our privately owned company, and our non-technological features are there to broaden our demographics. Node has always been entirely upfront about wanting to be the Rolling Stone of technology. So when I get thrown out of a Minister's office for asking perhaps the first real question he's ever heard from a news company, I don't have problems with my bosses.

    The first skyscraper toppled thirty-eight seconds into the East Asian Big One. Seventeen stories, built by a construction company that happily paid its dues to the local yakuza and its bribes to the local construction officials. Almost no one saw it, the streets empty of all but that cleaning crew I was watching and, within a few minutes, the drunken salarymen that marked Tokyo so neatly between image and reality. The second skyscraper fell, as best as anyone could tell, less than a minute later. I, and thousands of others, watched that one tumble and skitter, and then vanish: leaving a sudden gap in the city that was quite literally impossible to comprehend in the moment.

    I'd like to cover two things in this piece: why Japan, as you've heard from more frequently published sources than a monthly magazine like Node, is in political chaos, and the technology that led up to certain underreported events that were wrongly sensationalized. The Western press is incredibly bad at covering Japan for all kinds of reasons, and for different reasons many of them don't grasp technology either.

    A French surveillance satellite, leasing two minutes of commercial time to a Tokyo urban planning firm, was the first to get photos. An office tower, twisted, the moment before it collapses; a visible ripple in hardened concrete arcing towards a cluster of low-rise apartment buildings; a helicopter caught lifting off, skittering sideways towards the edge of the roof. They are instantly iconic, emblematic of a ravening destruction that no one could imagine a modern city facing.

    It was 2:22 in the morning according to my painstakingly obtained watch (see "The Underground World of Watch Collectors", October/84) when the building three blocks away followed suit with the distant skyscrapers and toppled across a pair more that were, thankfully, under construction and thus uninhabited. This was something impossible to reconcile, a presented reality that argued with every memory and every bit of knowledge we had. Something no one had seen in over three decades: a modern functioning city torn apart by forces outside its control. Something, for those of us who were there, that was a fundamental turning point in history.

    The first official response comes from some poor bureaucrat, whoever happened to have the most seniority near a working phone in the middle of the night: "The Japanese Government is working hard to deal with the crisis". In terms of government-speak it is, as befits the Japanese, the most bland statement imaginable. It was also false, as Tokyo emergency crews worked with each other via radio independently of higher authority throughout the night—not a single phone line into or out of Tokyo was operational. The lines of official communication rested in the ships dotting the harbour, their powerful shortwaves and finicky satellite uplinks telling the world in snatches of disaster what happened.

    The opening lines of this story are jotted in my notebook in shorthand, done blind, while I stare frantically at what's happening. "2:22, street crew is helping evacuation. 2:23, street crew has mounted their saws on a truck, moving towards apartment building to slice it open. 2:26, sirens, somewhere. 2:27, payphone is dead."

    The standard fall-back system for telecommunications is the satellite, the signal bouncing towards the heavens as undersea cable systems, microwave transmitters, and copper phone lines collapse. Satellites are slow, much slower than a phone line and slower still than a cable running thousands of miles across the Pacific Ocean to the United States. Furthermore, satellites exist to replace those undersea cables when something goes wrong—they are not meant for the BBS population to hijack. Hijacking, hacking, commandeering: this is what happened that night.

    I find a young Japanese man holding a bulky radio, clearly not something standard issue for even those that love the consumer electronics of this country. It is too ugly. He is sitting on the sidewalk, not dressed properly (and this is astounding, for this city) and words whisper out into the air. I sit next to him and together we listen to a picture of disaster.

    It started with an underground New York BBS and hacking group, the Incomparables, who seized an AT&T-leased satellite over the Pacific sometime between 3 and 4 am Tokyo time (or 1 pm to 2 pm New York time, 9 April). At this point much of the Western world is still in shock and there are no reliable reports out of the city. The compliant Japanese media has been muffled by the government and Western organizations—if they are reporting—are covering useless things.

    The shortwave radio my Japanese acquaintance holds gradually paints the scene: chaos, and a police, fire, and emergency response working from conflicting, contradictory, and infrequent commands from higher up. Centralization and hierarchy do not hold up well under a massive blow.

    Once Incomparables had control of the satellite they locked it to a Tokyo-based satellite dish that was owned (in a non-legal sense) by one of the few prominent Japanese hackers. This gave the Incomparables a first look at the chaos inside the city, as well as bringing more useful reports of people watching what happened to wider attention. The Incomparables BBS crashed just after 6am, Tokyo time, as they reached both bandwidth and connection limits.

    The Japanese people have made an implicit agreement with their government and corporations: economic prosperity in exchange for not noticing corruption, collusion, and the facade of democracy. At some point of course this must break down, but we can certainly say that Godzilla moved this event forward considerably. It's dawn, and I already feel the inchoate rage that will soon be harnessed.

    The Incomparables go wide with (very slowly) transmitted fax documents from a Japanese source about the shoddy construction of one of the buildings that vanished. They reveal the skimming by the yakuza, the convoluted bank loans that cost twice what they should, and the government bureaucrats who signed off on everything, obviously angling for later employment at the bank and for the usefulness of extra signatures in making everyone look good to their bosses. This is not shocking for those of us on the ground who make it our beat, but the baldness of it is something new to the Japanese people.

    It's noon and the earthquake is over and we are all in a daze. So many buildings are gone that the skyline looks radically different. Partial restoration of phone service means information is flooding in and out of the city now, and through whispered rumours throughout the day it becomes clear that the denizens of Tokyo are beginning to hear how and why this supposedly earthquake-proofed city fell apart.

    The local AT&T New York City switching subsystem was hacked at 7:27 am (Tokyo time), presumably by the Incomparables, and their BBS went back up—backed by the full resources of AT&T, who at this point in the narrative have failed to respond to the loss of their satellite, a loss which disrupts Japan-America phone calls and takes out roughly forty per cent of the entire Greater New York region's international phone capability.

    The aftershocks are surprisingly mild, and so after a night's sleep in the emergency shelter, just in case, it is without a flicker of surprise that I find shops open the next day. I grab coffee and chat with customers. The narrative has begun to form already, relatives in undamaged cities passing on secondhand knowledge of the information the Incomparables are leaking. The leaks are spreading to other Western media (CNN is, of course, first), and the shameful glee at Super Japan being cut through by metaphorical fault lines that the real fault lines have exposed is terribly hard to watch when I review the footage later. On the surface they are entirely in the right, but for a country that has been hyped as the "Next Big Thing" for half a decade you can see the sober talking heads picking at it.

    By 8:30 am (Tokyo) the Incomparables had lost control of the satellite and the switching station, and one arrest had been made. It seemed that the brief flower of citizen reporting was over. At least some information had escaped into the wider world from the black hole of Japanese government-corporate control. The attack on the Incomparables, however, would lead to something more.

    I wonder if the gold-flecked coffee glinting in the back of my head is now a thing of the past, because how could anyone ever serve that again? It was a meeting with a banker a few months ago as I worked on that story I mentioned before, about non-performing loans and their effect on the Japanese technology sector. That story is still coming, and it's a lot bigger now.


    In the aftermath of the AT&T crash it became clear that the best hackers and the various BBSes they inhabited had often collected information for information's sake, tucking it away as quiet proof of how good they were. This became abundantly clear from 8 am to 10 am Tokyo time as a collection of hackers took it upon themselves to support the Incomparables.

    Although AT&T has publicly refused to comment on the issue, a series of interviews with, yes, those ubiquitous "unnamed sources" reveals that AT&T cut all of NYC out of the American phone grid at approximately 9 am Tokyo time under orders from the US government. It turns out that this did not go as planned. By that time the Incomparables, in affiliation with what participants in the event call "every fucking hacker and wannabe in the metro area", had gained access to both the US-UK undersea cable and the New Jersey switching subsystem through social engineering, and held physical possession of the NYC station for seventeen minutes before cops arrived and arrested three people.

    The Incomparables release a short statement to the local New York media at precisely 10:10:10 am (Tokyo time), reading as follows: "The Incomparables possess the only non-government look into what is happening in Tokyo. We are also the only ones providing the truth. However we figure we're just about done here, so have a nice day."

    Despite official responses it was soon clear that the frenzied hacking earlier that day had presented a clear and coherent on-the-ground picture of what was happening inside Tokyo, where news organizations had generally failed, and that the leak of internal Japanese documents had radically changed the narrative.


    We should return to the beginning.

    The first skyscraper toppled thirty-eight seconds into the East Asian Big One. Seventeen stories, built by a construction company that happily paid its dues to the local yakuza and its bribes to the local construction officials. The second skyscraper went down just past a minute into Godzilla and I, along with thousands of others, watched it fall and put an end to the Japan we knew.

    That building brought down a government, and although it is early the barbarians are at the gates and I can feel the Japanese equivalent of the French Revolution building. Nobody knows what shape this new thing will take, but I do know that the blinders of the citizens have been pried open in the most forceful way possible. The proposed reforms by the new coalition government are sweeping in scope and perhaps even in effectiveness.

    For now all that I can conclude is that the hallucinogenic neon lights of this city that has prided itself on being fifteen minutes in the future are dimmed, but I think we've made another leap down the timeline, just as Commodore Perry, the imperial/industrial revolution, nuclear weapons, and the American occupation did to this country.

    One more step into the future.



    —Martin Campbell is the Japanese and East Asian correspondent for Node Magazine. It is a point of pride for him that he has been thrown out of four Japanese Ministries and that seventeen major Japanese companies have standing orders for their employees to not talk to him. He is also no longer welcome in South Korea. His last piece appeared in Node Magazine February/86 about the yakuza and their associations with the government. He is a contributing editor for The Economist and occasionally files articles for the BBC.


    More Shinjuku Streets at Night, Tokyo 1985 by CanadaGood, on Flickr
     
    Directory & Introduction
  • Directory:


    Edited: A small note for new readers. Osakadave and I spend a fairly decent chunk of time talking about earthquakes. After I incorporate his advice I'll edit the above opening post because (alas :)) he certainly knows more about earthquakes than I do. I bring this up so any new readers don't get derailed by that conversation. Like I'll say below, the focus of this timeline is on culture and technology.



    I really don't have time for this, but it won't let me go.

    Given the boom of pop culture timelines, my interest in technology, the potential for Japanese reform before the bubble pops… well. I've worked on various aspects of those things before, but the pop culture thing sparked my interest in it again.

    Bear with me for a second.

    Sony bought CBS Records in 1987 for 2 billion dollars[1]. Sony bought Columbia Pictures Entertainment in 1989 for 4.9 billion dollars. The content protection demands of those two sections of the company would critically wound a vast number of future Sony products. The latter purchase was financed by 5 different Japanese banks: Mitsui, Tokyo, Fuji, Mitsubishi and Industrial Bank of Japan. Finally Canon invested 200 million dollars in NeXT in 1989.

    If for some reason the banking industry of Japan is in a massive shake-up there's no particular way for Sony to massively overpay to buy into music & movies, but they could certainly stumble across NeXT. Windows couldn't display Japanese characters properly on a computer screen but NeXTSTEP could. Sony's key problems are that their content side has critically wounded the company's digital efforts, and that they never had much software expertise (especially in UI) which is an admittedly common problem in Japanese companies.

    A radically different Sony would change the face of consumer electronics in the 1990s, with alternate sales of CBS Records and Columbia Pictures Entertainment changing pop culture in the USA. (Let alone, say, Nintendo which has just launched the NES in America.)

    Given the state of the Japanese system in the 1980s one would require massive effort to change it (to deny, as a side effect, Sony the money they needed). Last time I did something like this I went with an interesting domino effect rolling out of a Gerald Ford victory in the 1976 election.

    I've decided to go with a blunter instrument this time around. There are a few reasons for it: I feel I underestimated the entrenchment of the system, I feel (a few years older and potentially wiser) that there were both more and less things wrong with Japan than my previous thoughts, and because this current iteration is focusing on different things I needed… well, a big push. Things Are Going To Be Different™ and this creates both conditions for it and a massive blast across popular culture.

    Japan, unfortunately, has a number of earthquakes. The recent tragedy there is simply the latest example. I sincerely hope no one takes offence at my using an earthquake as a POD, but this is of course AH, where I figure we all kill or save a million people in a million timelines before breakfast.

    If you look at a list of earthquakes in Japan you'll notice that from 1978 until 1993 there were none. Which is strange. There were two dozen (including aftershocks) from 1993 to the present, and another half dozen in roughly the same time frame going back the other direction, before 1978.

    So why the fifteen year gap?

    Therefore, as you've seen already, the East Asian "Big One" strikes near Tokyo at 2:18 am on 10 April, 1986. Technically the POD revolves around Node Magazine existing, with butterflies from that leading to an earthquake (perhaps coverage influenced a building, and that building accidentally triggered the earthquake).

    This timeline is about technology and pop culture. Underpinning it is a number of political and economic changes, but they'll be lightly mentioned. The above post is probably the most serious I'll ever be.

    Node Magazine is a British Wired/Rolling Stone hybrid populated by New Journalism wannabes. Postings will be a combination of Node Magazine and an omnipresent narrator (The Futurist Manifesto has taught me how exhausting making every item in the timeline a book is) similar to, say, That Wacky Redhead or The Power and the Glitter!; indeed those two timelines got me thinking about pop culture, and being asked to talk about technology for The Power and the Glitter! got me back onto this.

    [1] Before that it was CBS/Sony Records, founded March 1968, CBS Sony Inc. in August 1973 and CBS/Sony Group Inc. in August 1983.
     
    1986 Videogame Rodeo Computer Roundup Event
  • The 1986 Videogame Rodeo Computer Roundup Event

    Now you're playing with power.

    (Nintendo Entertainment System slogan upon launch in North America in 1985.)



    The impact of the Big One on the videogame and computing industries was instant and widespread. The popular Nintendo Entertainment System (NES) saw several months of factory production either pile up or simply go unbuilt, as shipping had been disrupted and a great deal of it shifted to disaster relief supplies. The growing cries of "where's my Nintendo, mom!" would not only place a certain stress on the psyche of American parents but would also herald a dramatic shake-up in the computing industry. Indeed Nintendo's advertising through 1985 and 1986 and their careful positioning had paid off so well that when they couldn't meet demand… things began to change.

    Nintendo Entertainment System



    Sega also faced problems. Their North American launch of the Sega Master System had been planned for June 1986, and that was clearly no longer viable. However they rapidly came up with a new strategy: a Western European "soft launch" in smaller countries (serving as an indirect marketing campaign to the larger European countries) along with several other smaller markets like Australia and Brazil. Those much smaller markets could be supported from Japanese production in time for a fall launch, and related costs such as marketing would also be much lower than for a (supply-constrained) North American launch. Sega was also forced to shelve their already planned advertising push in North America, but that proved to be something of a blessing in disguise.

    Sega Master System

    NEC and Hudson Soft, which had long been eyeing the lucrative console market, were confronted with problems as finances dried up with Japanese banks coming under investigation. Their Christmas 1987 plans were put on hold, and instead they were forced to begin looking for an additional partner. This also forced them to contemplate a complete redesign of their console, as a delay past 1987 might leave them facing a new Sega or Nintendo console within only a year or two of launch. In the meantime Hudson Soft focused on third-party game development, such as their highly successful Bomberman on the Nintendo Famicom (NES in America), and began planning potential first-party titles with the experience they were gaining.


    In America, Commodore International and Atari Corporation had spent the last two years locked in a brutal struggle, with the Commodore 64 & Amiga 1000 on one side and the Atari 8-bit & ST computers on the other. Jack Tramiel had founded Commodore and been forced out; he then bought up the consumer electronics side of Atari (taking most of the good Commodore engineers with him) and mounted a vengeful attack on Commodore. This had taken a major financial toll on both companies, and their joint struggle over Amiga had only deepened Tramiel's understandable grudge against Commodore.

    However, Commodore thought there was opportunity there. The Atari 7800, although not doing great, had plenty of supply at hand. Yet the 7800 was something Tramiel clearly had no interest in pushing: a huge missed opportunity against a briefly weakened NES, and hard to understand given how key games had been to the C64's success. For the second time (partially spurred by Time's February 1986 "Adios Amiga" article and the loss of 53.2 million dollars in the fourth quarter of 1985) Commodore reached out to Apple executive Jean-Louis Gassée about taking the CEO position at Commodore, and this time Gassée accepted, tired of internal fighting at Apple and wanting something different[1].

    Atari 7800

    A challenge was what he found, but he also discovered a way out of it: buy out Atari, consolidate the industry, and push the Atari 7800 (with perhaps a moderate redesign) against the NES. Of course Jack Tramiel would never sell, but this was the era of the leveraged buyout, and Commodore International had a backer: Sun Microsystems. Sun had recently mounted an Initial Public Offering (IPO) on highly successful sales of their workstations, and was interested in expanding their footprint down from workstations[2] (and in gaining access to both Commodore's and Atari's excellent distribution networks). In the wild and woolly market of 1986 personal computers, Commodore International was cheap but potentially about to get more expensive once Jean-Louis Gassée began righting the ship with the planned launch of the Commodore 74 (backwards compatible with the C64, unlike most of their line), a cheaper Amiga 500 model, and the cancellation of all other computers aside from the Amiga 1000 and C64.


    Jack Tramiel fought a good fight, but his mercurial personality worked against him in his public statements, and his penchant for lowering computer prices when not required was making stockholders uneasy (as had happened at Commodore). Much of Commodore's top-flight talent had happily followed Tramiel to Atari, but with Sun Microsystems and Jean-Louis Gassée in charge (both highly respected in Silicon Valley, and Gassée loved by Apple engineers) and, perhaps most importantly, the Commodore board paid off and out of the picture, they had a new and safer option. The loss of much internal support finally convinced Tramiel that he was done, and although forced out of a second company, his payday might have helped make up for it.

    [3]

    So in late 1986 the complicated step-dance ended with Sun Microsystems and Commodore International merging to form Sun Commodore, as well as mounting a takeover of Atari Corporation and the cheap acquisition of Atari Games from Namco to form a new first-party games studio[4]. Sun Commodore was therefore structured like so: Atari the console company (Atari 7800), Atari Games the development studio, the C64 & C74 low-end computers, the Amiga 500 & 1000 high-end computers, and the Sun-number (e.g. Sun-1) workstations. Jean-Louis Gassée's main goals would be simplification, stronger competition in the computing market, and taking advantage of the market opportunity opened by the NES's lack of supply.

    The Christmas 1986 season in the United States was dominated by a barrage of Sun Commodore advertising for the Atari 7800. With turmoil at home, Nintendo had been desperately trying to get NES consoles into America but was essentially filling the pent-up demand, not the new Christmas demand, and so the Atari 7800 had a strong second-place showing, capturing some 35% of all sales in the fourth quarter. Sun Commodore's idea of pushing the 7800 harder than Atari had was strongly validated, and they moved forward with plans for a new console.[5]

    The theoretical consolidation of the low-to-mid computer market (theoretical in that virtually all the computers under Sun Commodore's belt were, in 1986, incompatible with each other) married to a workstation company made plenty of executives at Apple Computer and in the IBM-clone field sit up and take notice. The C64 was usually the best-selling computer on the market in any given quarter, and the potential of Sun Commodore as a company was clear.

    Nintendo also paid attention, recognizing the global networks of the new Sun Commodore and the success of the Atari 7800 once it had some marketing muscle behind it. In many ways the Atari 7800 (originally released in 1984, after all) was unable to compete with the NES, but the low price of the 7800, its ability to run ported C64 games, and the monetary inducement to developers[6] to get those ports out saw a number released in time for Christmas, including Ghosts 'n Goblins, Lode Runner, and perhaps most importantly: Elite.


    The low install base of the 7800 pre-Christmas '86 actually worked to Sun Commodore's advantage: they had to sell a new controller for C64 games (packed in with 7800s once Sun Commodore owned Atari), and fragmenting the market was acceptable when so few consoles were out there before the holiday push.

    Updated Atari 7800 Model[7]

    Nintendo's weakness in computer games (as opposed to console games) had been considered a problem by some outside observers looking forward to their European launch, now sometime in 1987 instead of 1986 as planned, and as they watched the sales numbers of those early C64 ports Nintendo seemed to agree, resolving to find a way around the problem. Combined with the search for potential partners for their next console, Nintendo's hunt (much like NEC's) would eventually settle on a rather logical choice.

    Sega too watched with wariness. The European/Brazil/Australia strategy was paying off, as every Sega Master System they shipped, they sold, and in Europe there was soon a hefty premium on Master Systems outside the official launch countries. As such they were very confident about their European strategy, but, like virtually everyone, they had also believed—known, really—that the Atari 7800 was a dead system without even a modern (i.e. NES-style) controller. Sun Commodore had changed everything, and the computer models under their broad roof were very popular in Europe. As with Nintendo, Sega was soon on the hunt for potential partners.

    By the beginning of 1987 the shake-up of the videogame and computing industries was only beginning….


    |||||


    [1] He was offered the job IOTL and turned it down. ITTL the earthquake disrupts Apple CEO John Sculley's plan to remove Steve Jobs from a position of power (as the Japanese market was very important to Apple) and install Gassée. As Gassée is easily the most political of Apple executives he promptly makes a contingency plan, which seems validated when Sculley reorganizes Apple and Del Yocam (a notorious hardass, who would have reined in the freewheeling Gassée—and began to, IOTL, until Sculley folded like a cheap suit and fired Yocam) became COO.

    Gassée's flaws (a love of high profit margins, lots of expensive research programs, a willingness to let his engineers fly past deadlines) are things that the 1986-7 Sun Microsystems and Commodore/Atari/Amiga culture can potentially rein in.

    [2] Sun Microsystems has always been fairly active in buying companies, starting in 1987 OTL. ITTL the greater turmoil caused by the earthquake is heavily affecting the "Workstation Wars", since Japan was a major customer, and Jean-Louis Gassée is willing to be demoted in order to have the resources he needs to take on Apple, the console market, and the MS-DOS & Windows vendors. Sun figures a broadening of their market is a good thing for the future. For a handful of people (hi Nicole) you've read a similar Sun Commodore merger story before, but heck, I still love the idea.

    [3] Sorry about the bad picture edit job.

    [4] Atari Games IOTL was bought by some of the employees and became infamous as Tengen.

    [5] The Atari 7800 consistently made quite a bit of money for Atari even on low sales, which makes it even stranger that ITTL & IOTL Tramiel never pushed it harder.

    [6] This seems like a requirement to link.

    [7] The "updated controller" ITTL is basically OTL's European Joypad. However (since you can't see them) ITTL it adds two shoulder buttons to give you four buttons to use with C64 games.

    -----

    I assume anything people don't know they can look up on the wild world of the internet, but if there's anything in particular that's confusing let me know and I'll rewrite a little to clarify. In-text links, at least in the above post, are all to YouTube and their wonderful collection of advertising videos.
     
    James Bond
  • A 25th Anniversary Spectacular!

    The new James Bond… living on the edge.

    (A tagline for The Living Daylights, the fifteenth entry in the James Bond franchise.)





    "Bond, James Bond." Those iconic words had been on the silver screen since Dr. No in 1962 and Eon Productions was planning for it to be up there once again in 1987. However long-running James Bond actor, Roger Moore, had decided he was getting too old to play Bond and this began the search for a replacement.

    The NBC cancellation of Remington Steele in the spring of 1986 saw actor Pierce Brosnan named as the newest James Bond, replacing Roger Moore, as the producers' first choice—Timothy Dalton—was busy with other commitments. Despite an upswing in news about the event, coverage of the Japanese earthquake dominated the headlines. The positive publicity made NBC reconsider the cancellation; indeed, NBC flirted with the idea of renewing Remington Steele for the 1986-87 season for some time. When the network suddenly decided to renew it after all, several people had to break the news to a certain NBC executive that the hold on Brosnan's contract had run out the day before.

    As such Pierce Brosnan, who would have been contractually obligated to return had NBC made up its mind earlier, was now free to play James Bond in The Living Daylights reboot.



    Deciding that the very 1980s settings of the previous Roger Moore films would feel dated with a new Bond coming in and a full reboot being the plan, the producers set about creating a more timeless, classic James Bond film: one that shied away from hijinks and reoriented the series toward what could be considered an alternate 1980s bearing a striking resemblance to the more stylish 1960s-era Sean Connery films. The Japanese earthquake also influenced the underlying plot; with the Soviet Union considered too old an adversary, they chose to go with corporate terrorists instead. Despite the complete lack of resemblance to the Ian Fleming short story, they decided to keep the name.[1]

    Production began right away, aiming for a 1987 release, with location shooting in Tokyo[2], New York, and London as part of a story about a global network of sinister corporations manipulating governments. The intelligence agencies were portrayed as hopelessly compromised, with Bond losing his licence to kill and being stuck on a beach in the Pacific. However, the new M supports his unofficial quest, as she believes in his ideas if not the man himself, which is clear when she memorably says "the Soviet Union is an outdated dinosaur… much like yourself, Mr Bond, if one added sexism and misogyny to the list". For the first time M would be played by a woman, Judi Dench[3], as another part of Eon's plan to thoroughly update Bond.

    The pre-title sequence would show Bond earning his 007 licence to kill, then flash forward to his present as a junior 00 agent whose behaviour has led to his being ignored. When he persists in his inquiries, political pressure forces M to revoke his 00 status, sending him off to the Pacific, where he makes his way to Tokyo to look for answers.

    Sam Neill, considered for the part of James Bond, was instead cast as the Wall Street villain (coinciding with the 1987 film Wall Street, in one of those strange film industry coincidences), whose political influence is unmatched because of his vast wealth. This results in James Bond being hunted by elements of the CIA, with only his new friend, CIA agent Felix Leiter, on his side. Reluctantly borrowing from the non-Eon production Never Say Never Again, the producers continued their quest to be as different as possible, auditioning a variety of non-white actors for the role of Felix.



    In the end it came down to LeVar Burton of Roots fame, Denzel Washington, who was taking a starring turn on the television hospital drama St. Elsewhere, and the mostly unknown Chinese American actor John Lone, who had made an impression during the screen test. The first two men had potential commitments: LeVar Burton had been approached for a new Star Trek TV series, and St. Elsewhere was an NBC show, which presented problems after the Remington Steele issue.

    Torn by their choices, the producers took a suggestion from one of the writers and wound up casting two people as Felix Leiter, which meant a moderate reworking of the central section of the script: Bond is told the only CIA agent he can trust is Felix Leiter, who once saved M's life, but of course M is cut off before she can describe him. John Lone starred as the "good" Felix and Denzel Washington, in a memorable turn, as the "bad" Felix. (What NBC thought of Eon's successful poaching and vaulting to stardom of a pair of their talent is one of those "I heard it from a friend of a friend" rumours in Hollywood.)





    Maryam d'Abo would star as the only major Bond Girl in the film, cast as a retired assassin who used to work for Sam Neill's character. Neill brings her back for one last mission: to gain the trust of James Bond, find out what he knows, and then kill him. The producers deliberately conceived a much more active role for the "Bond Girl" as part of their reboot although, naturally, she does sleep with Bond. Taking advantage of the PG-13 rating in the United States, which at the time allowed bare breasts, the American version of The Living Daylights had both more violence and—of course—a topless Maryam d'Abo. They also arranged for her to appear in the September issue of Playboy, helping to promote the film. Most versions, including the British release, would have various cuts, and overseas an American VHS copy of The Living Daylights was highly prized.[4]



    With a new look came a new director and writer (although veteran Bond writers Richard Maibaum & Michael G. Wilson would do much of the work as well). Fresh off the success of Top Gun, newcomer Tony Scott found himself on the A-list of Hollywood directors. He was approached for multiple movies, including Beverly Hills Cop II, but in the end the pitch made by Eon Productions won out and he signed on to direct The Living Daylights. Famous British playwright Tom Stoppard was approached for the script and, despite initially being unwilling to write an action film, changed his mind when told they wanted him primarily for his skills at dialogue and structure.[5]

    Eon Productions was determined to make this new Bond film not just a financial success (the Roger Moore films had usually done fine on that front) but also a critical one, to overhaul the series for the upcoming 1990s and portray a new kind of spy thriller. By the time filming began almost everything had been revamped in their attempt to do so.

    The reported $45 million budget was vastly higher than the previous film's $30 million, and the speculation among Hollywood insiders was that Eon Productions had done an exceptionally good job of hiding the rest of the budget, meaning The Living Daylights might have cost twice as much as its predecessor. It is perhaps not surprising that the longtime pioneers of product placement in the James Bond films stepped up to the plate once again. Placement had been paid for on virtually every item with a visible brand name, ranging from the fridge in an apartment to the watch Brosnan wore. Naturally, having a particular fridge brand on screen wasn't worth that much to General Electric, but Eon proved to be a master of eking out product placement dollars by having even that fridge bid on by several companies. Every dollar helped.

    Every dollar, and the cars. The Living Daylights marked the return of Aston Martin to the franchise. Their first appearance since 1969's On Her Majesty's Secret Service saw the V8 Vantage Volante convertible feature prominently in the film, as one more part of the producers' (mostly) steadfast classic-style reboot.


    The Living Daylights was released on 1 November 1987 in the United States, with the world premiere at the Odeon Leicester Square Cinema in London a little over a month earlier. The film was heavily marketed as the 25th anniversary of James Bond, although that was counting from the British release of Dr. No, and it proved a major success, grossing over $16 million in its first weekend in the USA for an eventual domestic total of $76 million and a worldwide gross of some $251 million. On raw numbers it was the most successful James Bond film of all time, but once adjusted for inflation and ticket prices it fell to 4th place: beating out Moonraker in 5th, while trailing Thunderball, Goldfinger, and You Only Live Twice.[6]

    The critical reception was also excellent with reviewers praising the new look of the film, the quality of the script, the toned down James Bond that still managed some humour, and the rest of the cast. The primary complaint levelled at the movie was that it was too realistic (indeed, a hard-fought battle with the British Board of Film Classification took place in the spring of 1987) and that the mostly retro stylings were at odds with the realism.


    A sad note capped off the otherwise remarkable The Living Daylights. Pierce Brosnan's wife, Cassandra Harris, was diagnosed with cancer in 1987, and Brosnan requested to be let out of his contract to be with her. His wife herself protested, as she had wanted him to play James Bond for years. In turn Eon Productions offered a deal: they would delay the next Bond film by a year (audience fatigue had been a growing concern throughout the 1980s, so it was not entirely altruistic on Eon's part) while reducing Brosnan's salary for the next film, and if he still wanted out when it was time to shoot Bond 16, then he would be released from his contract.

    Eon also hoped that having an additional year would allow for a script and director as good as they had for The Living Daylights and perhaps even reduce the massive budget, as well as giving them time to find a backup James Bond if need be. After several weeks of thought and consultation with his wife, Brosnan agreed to the terms.

    The grand reboot had been a massive success… but would they keep their Bond?



    |||||


    [1] They did in fact consider making The Living Daylights as a reboot/prequel, much like the 2006 Casino Royale film. ITTL, with Pierce Brosnan cast and the earthquake seeming to mark a major event in the 1980s, they choose to do so. Corporate terrorists are, of course, my invention, but the USSR would again seem too dated for a reboot. Although OTL's The Living Daylights copied the short story for a section, the rest had little to do with it.

    [2] The high profile of The Living Daylights in Japan was considered a major publicity coup (for both Japan & Eon), and indeed, with its resonant plot, The Living Daylights did the best of any Bond film to that point at the Japanese box office.

    [3] Yes I'm reaching forward. However she was free from commitments at the time (84 Charing Cross Road might conflict) and was a highly respected British actor (she won a BAFTA in '86). It seems reasonable that in a reboot they might go with her anyway, although her inspiration would have to be someone besides Stella Rimington. Also, I love her as M :).

    [4] Yes, this is a change from OTL's James Bond. However they are dedicated to rebooting it, and talks with Playboy (who did, IOTL, put her in an issue) have led them to push the still loosely defined boundaries of PG-13. Obviously the BBFC and most overseas ratings boards with categories similar to PG-13 object, which actually helps The Living Daylights, because they can cut the bare breasts and then weasel their way into cutting less of the violence.

    [5] Again I'm reaching forward. Honestly, the 1980s James Bond films were utterly dominated by the same writers and the same directors. So I'm trying to find solid people who at one point or another talked about being involved with Bond, hoping that their interest is a long-standing one.

    Tony Scott expressed interest in a James Bond film written by Quentin Tarantino IOTL. With Top Gun making him an A-list director, Eon Productions talks with him and he decides to do something more serious instead of Beverly Hills Cop II, to avoid being typecast as that fun action-movie guy. Tom Stoppard was mentioned as a possibility for Quantum of Solace, and has worked on a number of spy and action scripts (including The Bourne Ultimatum) in recent years. His reputation in the UK was well established by the 1980s, particularly for dialogue.

    [6] Butterflies from the radically different nature of the film and a longer film shoot have meant a somewhat later release date and an increase in budget from OTL's $40 million. Of course these are just reported budgets, once you account for the hidden stuff the alternate The Living Daylights probably cost about 50% more than our movie which tempers the net profit you might expect from such a hit.

    -----

    I hope no one minds a pure James Bond update. Pierce Brosnan being offered the part at about the same time as the earthquake meant I simply couldn't resist making an all-out James Bond spectacle post. Obviously the changes are massive, but at the same time GoldenEye or Casino Royale show that Eon has been willing to go all out. Once they think over their reboot concept a little more, they decide that if they reboot the franchise it'll have to be a major effort. They were also stung by the critical failure of the last Moore movie and IOTL made a much more realistic, dour Bond as a reaction (I'm sure Dalton influenced that with his presence). With Brosnan as their actor he seems like a natural fit for a return to the more 1960s-era Bonds, which is a better fit for the reboot.
     
    Star Trek
  • "You treat her like a lady, and she'll always bring you home."

    Space… the Final Frontier. These are the voyages of the starship Enterprise. Her continuing mission: to explore strange new worlds, to seek out new life and new civilizations, to boldly go where no one has gone before.

    (Opening narration for Star Trek: The New Frontier.)[1]




    Gulf+Western had been considering a new Star Trek TV series ever since they had planned to launch a fourth television network headlined by Star Trek: Phase II a decade earlier. The original Star Trek had become the most valuable show in syndication over the years and by 1986 was the crown jewel of Paramount Pictures' programming. In addition, the movies were becoming too expensive, with an aging cast demanding high salaries; despite the incredible success of Star Trek IV: The Voyage Home, Paramount Pictures naturally concluded they had only one or two more movies left. This meant that their thoughts and plans shifted to a potential new Star Trek TV series. The problems arose when they began to shop this new Star Trek show to the networks.

    NBC & ABC both asked for pilot scripts, not willing to commit to anything more than that. CBS offered a miniseries deal, with a potential pick-up after that. Fox, the brand new network desperate to have a major program to start with, offered 13 episodes but wanted them in March of 1987. Paramount Pictures was offended by the NBC and ABC deals, and didn't feel much better about CBS or Fox.

    Throughout the first half of 1986 Paramount Pictures looked for a way around this by potentially cobbling together a syndication-based "network". However, Fox sweetened the potential deal in the wake of the Japanese earthquake's economic effects (and LeVar Burton's new high profile, after reports of The Living Daylights had leaked), which led Paramount to reconsider, and they gradually warmed to Fox. The 13-episode deal for March 1987 was unacceptable, but Fox was open to possibilities and had heard rumours of the CBS offer, which they mistakenly thought was bigger than it was. After some additional back and forth, Fox agreed to pick up a whole first season of 24 episodes for September 1987, down slightly from the 26 Paramount was looking for. As Paramount wanted a wider audience they made an offer to Fox: any network affiliate could play Star Trek once one week of the first-run airing had passed, and in return Fox would receive a percentage of those profits. Paramount managed to find a number of ABC/CBS/NBC affiliates that would commit to the deal and, based on the ratings, would theoretically be willing to preempt their own network's prime-time shows. Fox was willing to take the risk, especially on anything that would disrupt the Big Three, as they well realized they were the underdog.[2]

    Some observers in Hollywood noted that Paramount Pictures going to Fox was darkly funny as it had been at Paramount where Barry Diller had proposed a new fourth television network and when they passed on it he went to 20th Century Fox and… started a new fourth network. Gulf+Western executives were not nearly as amused by this undercurrent of Hollywood thought as the rest of the town was.

    With the unique partial-syndication deal in place, and once they committed to a major advertising campaign, Fox managed to get Star Trek for a low price of just half the cost of an episode, meaning Fox only had to cover $700,000 an episode, plus $3.5 million of the $7 million two-hour pilot[3]. Fox demanded and received some creative control as well, and this led to turmoil among the staff of the proposed new Star Trek. Gene Roddenberry had, in his older age, become increasingly fixated on particular elements of the Star Trek universe that Fox absolutely refused to go along with. With the lucrative Fox/early-syndication deal on the table, Roddenberry found himself increasingly sidelined, especially as his entire writing staff—led by veteran Star Trek writers David Gerrold & D. C. Fontana—actually sided with Fox on that issue, particularly since Fox had made it clear that Standards & Practices would be sidelined on most items.

    For the moment, at least, Gene Roddenberry would remain showrunner, but many of his weirder ideas, such as no interpersonal conflict between the crew, were junked by a combination of Fox and Paramount pressure. The entire writing room would work on the two-hour pilot, Where No One Has Gone Before, and it was an expensive undertaking on a particularly unhappy set (although the actors got along well with each other).


    Original plans to set the new Star Trek series in the 25th century with a registry number of NCC-1701-7 quickly changed when Star Trek IV: The Voyage Home was released with the Enterprise NCC-1701-A: first to NCC-1701-G, and then to a new time period roughly eighty years after the end of the original series, with the second-to-last registry change to NCC-1701-D.


    The design of the new Enterprise was contentious, both inside and out. The interiors had been designed by Andrew Probert based on Roddenberry's thoughts and were mostly ready to go, as was the first idea of the Enterprise's exterior. The ship design was based on a sketch Probert had done for his wall at the office; story editor David Gerrold had seen it and liked it, as had Roddenberry. However, when Fox saw these sketches they once again refused, citing the "hotel lobby" nature of the bridge colours and carpeting and the weirdly wide saucer of the ship. Paramount was also starting to remember how difficult Roddenberry was to work with after he pitched a fit over Fox's notes (Star Trek: The Motion Picture flashbacks were filtering in, especially as they talked to former Paramount executives).


    Paramount asked Andrew Probert to take another crack at it, stating they still liked his early sketches. They also brought up the fact that fans had liked the Excelsior; perhaps he could work elements of her into the design? Probert went back to his original sketch and also studied all the previous concept art that had been generated for the movies and for Phase II.


    The final design of the Enterprise was a cross between Probert's sketch and the Excelsior, switching from lower-than-saucer nacelles to flat-out horizontal nacelles, inspired by one of the Phase II Enterprise concepts and taken to a logical extreme. This final design made everyone happy, even Gene Roddenberry, and it was sent off to the model-makers. The new registry was NCC-1701-C, as the time period had changed yet again to allow the use of the Reliant and Excelsior models as standard parts of the Star Trek fleet, with everyone still undecided about the use of the Enterprise-A model.[4]


    Next up was Fox's objection to the interior style of the ship. Probert once again went back to the drawing board, designing a brand new bridge based on the spacious layout of the Excelsior in Star Trek III: The Search for Spock—at least from the only angle shown, since it was a partial set in the movie—and on a piece of concept art for that film that envisioned the front half of the bridge as a massive wraparound screen. Paramount managed to get additional money from Fox to build the set, as it was projected to be quite a bit more expensive than the first version would have been.




    Minor changes in interior design were also made, moving closer to the whiter palette of Star Trek IV and eliminating all the carpets from the concepts (over Roddenberry's strong objections). Many of the sets needed quite a lot of work, but that was still cheaper than building new ones, and the movie quality of the older sets helped set the tone—to Paramount's late and unfortunate realization of how much money it would cost—for brand new sets like the bridge. Accounting justifications transferred much of the expense forward onto the as-yet-untitled Star Trek V.


    Engineering was not in the pilot script, so naturally Paramount refused to build the set. Reminiscent of the Original Series not getting a shuttle until one was specifically put in the script, Engineering was promptly written into the pilot… and Paramount responded in kind, by promptly handing over the Engineering set used in the movies. It required a fair bit of work to look different, so an upper level was added and the lower one redone (naturally much of this was also charged to Star Trek V).




    To save some money somewhere, and to capitalize both on the popularity of Star Trek IV and on the idea of this crew being a fairly close successor, modified versions of the movie uniforms were used, the main difference being the reintroduction of blue and gold as main top colours with red being used for command (to avoid red-shirt syndrome). Early trials of spandex uniforms had been rather uncomfortable, a little too revealing, and Fox—continuing their streak of wanting to just put the movies on television—hated them as well. So, at least for the first season, it was the movie uniforms with two more colours.


    Casting went on through the fall of 1986 and the spring of 1987:

    Captain Jean-Luc Picard saw a number of actors considered, including Patrick Stewart, Mitch Ryan, Roy Thinnes, Yaphet Kotto, and Patrick Bauchau. Stewart and Bauchau were the early frontrunners, but concerns were raised about their "toughness" in following Kirk as Captain, despite the plan being not to have the Captain constantly going on away missions in the new show. In a surprise choice the producers went with Yaphet Kotto, marking a major milestone in casting: a black man at the helm of a starship for the first time in Star Trek history. His casting did result in a name change for the character, however.



    Executive Officer William Ryker needed a youthful, Kirk-like persona, as head of the away teams and second-in-command of the Enterprise. Among those in the running for Ryker were Michael O’Gorman (an early stand-out), Gregg Marx, Jonathan Frakes, Ben Murphy, and Jeffrey Combs. The unknown Jonathan Frakes impressed everybody, but in the end Jeffrey Combs was the final choice, demonstrating more of an edge that the producers felt would help balance out an otherwise potentially bland role. On an interesting note, Jonathan Frakes found himself on the crew of the new Star Trek anyway: he would go on to play a number of minor roles as well as working to help put the show on the screen, directing his first episode in the second season and developing a career as a prolific and well-regarded television director.



    Counsellor Deanna Troi saw Denise Crosby as the early choice, but Gene Roddenberry took a shine to Marina Sirtis and, in one of his increasingly rare victories against the network, managed to get her cast in the role. A strange kerfuffle over eye colour promptly ensued, with a number of producers claiming her green eyes were incongruous with her dark hair, but in the end the exotic nature of it won out, as she was technically playing an alien.



    Chief Engineer Geordi La Forge[5] had perhaps the largest potential list of names, but the heightened profile of one of the actors they had already talked to about it, LeVar Burton, virtually guaranteed him the part, as he had also been an early favourite. Among the stranger names considered was Reggie Jackson, the former Major League Baseball player, and it seems certain that, if cast, Wesley Snipes would have missed out on much of his movie career.



    Chief of Security Tasha Yar was a difficult role to cast, with Lianne Langland, Julia Nickson, Rosalind Chao, Leah Ayers, Marina Sirtis (also considered for Troi), Bunty Bailey, and Denise Crosby (also considered for Troi) all in the running. The role was based on the character of Vasquez in Aliens, but Dorothy Fontana pointed out that Jenette Goldstein was in fact blue-eyed and blonde, so they moved away from their idea of picking a Latina. Rosalind Chao, an early favourite, was beaten out by another Asian American actor: Julia Nickson.



    Science Officer Data saw a relative newcomer, Brent Spiner, become the late favourite, beating out Mark Lindsay Chapman, Eric Menyuk, Kevin Peter Hall, and Kelvin Han Yee. Spiner had been guest-starring on Night Court, and that show soon created an episode centred around a fight between a fan of the original series and a fan of the new one.



    Finally, the key role of Doctor Beverly Crusher was a tough one to cast, as she was originally planned as a love interest for the Captain. With Ryker and Troi filling that particular slot, and with the favourite—Cheryl McFadden—preferring to continue her stage career, the runner-up choice, Jenny Agutter, was cast. Her youthfulness did, however, make the idea of her having a son rather unlikely. Therefore Wesley Crusher was cut from the already large cast, although the producers' liking of Wil Wheaton would see him become a recurring character as an agent working for Starfleet headquarters who invariably managed to interfere in an annoying manner.



    The last main character wouldn't join the show until after the tenth episode (owing to Paramount's handling of the sixth and seventh episodes). The Worf character had been considered from the beginning, but vicious fights between Roddenberry (who wanted a never-before-seen alien) and Paramount (who wanted an alien from the original series) had sidelined him, despite the incredible audition Michael Dorn had given. With Roddenberry on the way out as showrunner, Dorn was brought back—of course it had nothing to do with him being black—as the Klingon second-in-command of security.



    The members of the cast that hadn't yet leaked to the media, along with the name of the new Star Trek television series, were revealed in April of 1987, and soon fans and non-fans alike were talking about the show. The popularity of syndicated Star Trek episodes and the success of Star Trek IV, which had reached beyond the traditional fanbase, combined to form a major launching pad for Star Trek: The New Frontier. That, along with a joint Fox-Paramount marketing push throughout the summer, made it seem clear that at least the first episode would be seen by a wide audience.

    However, a bitter dispute over Yaphet Kotto broke out between the various sides soon after filming the first few episodes, and he parted ways with Star Trek: The New Frontier. The writing room and Roddenberry managed to convince the producers to keep Kotto on for two more episodes than Paramount had planned, the sixth and seventh, as a two-part event to close out his character. That let him both introduce the new captain—Patrick Stewart, using the previously settled-upon name Jean-Luc Picard—and memorably sacrifice his life to save the Enterprise. Critics would view Kotto's performance highly favourably (unlike Paramount), and his departing pair of episodes was a major turning point in the series after the not-so-well-received second through fifth episodes.

    This was kept under wraps until Paramount kicked off a major promotional event for the sixth and seventh episodes… coincidentally the first two of November "sweeps", that key month whose ratings help determine advertising rates. Indeed, the two-part wrap-up of Kotto's character produced the second and third highest-rated episodes of the entire first season (as they aired on separate nights, unlike the pilot). As for Yaphet Kotto, he had nothing but good things to say about the actors he worked with, and he took both his payday and his newly raised profile and vaulted almost right away into the soon-to-be highly successful Midnight Run.




    The two-hour pilot, Where No One Has Gone Before, had a high bar to clear. Some of Gene Roddenberry's ideas from the novelization of Star Trek: The Motion Picture provided the central frame of the episode: in what case is force justified? The pilot was a tense yet philosophical thriller as the Enterprise raced towards a potential conflict and confronted internal issues over the nature of Starfleet, with the strong divide between humans and New Humans over the pending confrontation leaving the crew torn apart. Coupled with conflicting orders from different factions in Starfleet, this left the Captain isolated in his choices as he attempted to reconcile the two sides and save the universe from plunging into war.[6]

    The only thing left was to see the reception.

    Star Trek: The New Frontier was a major media event, with some 28 million people tuning in to the first episode, airing Sunday 27 September 1987 on Fox, and a further 12 million watching it at some point during the week of 4 October through syndication on both Fox and non-Fox affiliates[7]. Although those ratings would decrease, it was easily Fox's most popular show for the 1987-1988 television season, and much like the original series the demographics of the people watching were excellent. Fox, Paramount, and the affiliates were incredibly happy with the show. Indeed, the Big Three networks saw a number of their shows pushed out of primetime in both major and minor markets to make room for Star Trek on Sundays.

    Critically, the pilot was highly praised, as were the sixth and seventh episodes. The ones in between were poorly reviewed, and reviews for the latter seventeen episodes varied. Audience response, however, was excellent throughout, as viewers latched on to the hopeful-future themes inherent in the show.


    Star Trek: The New Frontier was off to a solid start; the only question in Paramount's mind was whether or not the film franchise could hold up its end of the bargain.



    |||||


    [1] They adopt the "her" referring to the Enterprise from Star Trek II instead of the "its" used in the original series and OTL's TNG, but also use the "no one" instead of "no man" phrasing (ships are girls, damn it… except Russian ones). I don't know if they considered any name except Star Trek: The Next Generation and I couldn't find evidence but it seems possible, and the name was always a little silly.

    [2] With the economic ripple effects Fox wants Star Trek even more than OTL, and Paramount is less sure about the first-run syndication plan as that had never been done before for a drama in the 1980s. The syndication deal is unique but it seems like an interesting idea to me. 1987 Fox didn't have nearly the coverage of the Big Three and Fox is traditionally the network willing to take the most risks. The affiliates are getting a slightly better deal than OTL: the show for free, six minutes of commercials to them, six minutes to Paramount (of which a third of that money heads back to Fox), and a massive amount of free-to-them advertising.

    [3] At that time the networks usually paid about $800,000 for an hour-long drama, and Paramount probably could have squeezed them for a million dollars. ITTL the syndication deal lets Fox get the show cheaper from Paramount. If my math is reasonably okay, the syndication money heading to Fox drops the cost to them by a third in the first season. OTL's budget for an episode was $1.3 million at the beginning and $1.5 million at the end of the first season; ITTL it's $1.4 million increasing to $1.6 million. The two-hour Encounter at Farpoint IOTL cost $5 million.
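    For anyone who wants to sanity-check the back-of-the-envelope numbers in [2] and [3], here's a rough sketch. The $800,000 fee and the "drops the cost by a third" claim come from the footnotes above; the per-episode ad take on Paramount's six minutes is my own guess, chosen only to make the rebate line up with that claim:

    ```python
    # Rough check of the footnote [2]/[3] licensing math (not hard data).
    network_fee = 800_000  # typical late-'80s network fee for an hour-long drama

    # Footnote [2]: in syndication, six minutes of ad time go to Paramount,
    # and a third of that money heads back to Fox.
    paramount_syndication_ads = 800_000   # GUESS: Paramount's per-episode ad take
    fox_rebate = paramount_syndication_ads / 3

    effective_cost = network_fee - fox_rebate
    print(f"Fox's effective cost per episode: ${effective_cost:,.0f}")

    # Footnote [3]'s claim: the rebate cuts Fox's cost by roughly a third.
    assert abs(fox_rebate / network_fee - 1 / 3) < 0.01
    ```

    Under those assumed figures Fox would be paying roughly $533,000 per episode instead of $800,000, which is at least in the right ballpark for the footnote's claim.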

    [4] Think mostly the first sketch, with horizontal nacelles similar to where the ones on the Phase II Enterprise concept seem to be heading at the beginning of the thread (but not the same refit-Enterprise style, of course), and detailing/lights more like the Excelsior. In other words it's the next flagship class after the Excelsior, a more direct continuity, on Fox's insistence and for budgetary reasons. Or something like a cross between the Renaissance class and the Excelsior.

    [5] Minor butterflies have resulted in Geordi being the Chief Engineer from the beginning, instead of the collection of people who played that role in OTL's first season. Incidentally, all the mentioned alternatives were considered IOTL, and the fragile nature of casting is inherently subject to butterflies.

    [6] Thanks to Jello_Biafra for reminding me of the New Humans. And don't worry, the level of militarization in Star Trek: The New Frontier is only a little higher, more reminiscent of TOS and certainly not DS9.

    [7] IOTL around 20 million watched Encounter at Farpoint in the first week. The combination of a broadcast network and much greater promotion has increased this. The second-week syndication number is pure speculation, but of course there is no TiVo or streaming internet video yet, and Fox only airs two nights a week in the 1987-1988 television season so they don't do many reruns; the only chance to see an episode again (or catch up, having missed it) is through syndication.

    -----

    And… Star Trek, easily the longest section of this timeline so far (in other words: don't expect this length going forward :)). I hope everything seems reasonable and that Brainbin feels a tiny bit better about this new series. A positive case of network interference! The Writers' strike will be covered in a little bit, along with tech '87, but I haven't decided on the next post yet.

    I was a little hesitant about reversing the Picard casting, but I really loved Stewart in the role, and with Kotto's example (and other factors) to play off, his Picard is probably a little different in good ways—it was also a fun twist, and cast shake-ups are rare in Star Trek but reasonably common in other shows. Star Trek: The New Frontier gets a major two-part event early in the show's run, and Kotto gets a nice chunk of change, a great ending, and a little more publicity heading into Midnight Run.

    As always, comments are very welcome :).
     
    Max Headroom
  • Broadcast Signal Intrusion Incident

    "Have you any idea how successful censorship is on TV? Don't know the answer? Hmm… successful, isn't it?"



    Max Headroom: 20 Minutes into the Future was a popular British cyberpunk movie based on The Max Headroom Show, a music video programme in which the digitally generated Max Headroom played clips from music videos, talked to guests (with the topic always turning to golf), and, in the second season, presided over a studio audience and a quiz… with rarely awarded prizes, often because an overlong explanation of the rules (which changed every week) ended in either no quiz or a cut-off quiz.

    For the time it was revolutionary: a sharp satire of the networks of the future in the still-small cyberpunk world, helping to define what cyberpunk means on screen. It was also one of those British shows that was about to make the leap across the pond….


    "He's the toast of the town (lightly buttered). He's the non-fattening sugar substitute in your tea. He's a bon vivant, a gaucho amigo, a goomba, a mensch, and the fifth musketeer. He's the apple of your eye and aren't you glad he's here. Direct from a wax and shine at the carwash around the corner, it's the man of the hour, or at least for a good thirty minutes, Max Headroom."


    The Max Headroom Show spawned a movie, Max Headroom: 20 Minutes into the Future, in which Network 23 reporter Edison Carter discovers that the Network has designed new blipverts to impart advertising in just a couple of seconds… except that they sometimes kill. Before he can reveal this, Edison suffers a head wound, has a poor copy of his brain digitally uploaded (Max Headroom), and has his body slated to be sold for organ parts. Luckily he escapes being chopped up for spares and goes on to defeat Network 23.



    20 Minutes into the Future, a popular cyberpunk television series: an ABC/Channel 4 joint venture, British-produced by Chrysalis Visual Programming Ltd. & Lakeside Productions, with a predominantly American cast. It's based on Max Headroom: 20 Minutes into the Future, which is based on The Max Headroom Show, which led to 20 Minutes into the Future, which led in turn to The Original Max Talking Headroom Show on Cinemax, that same Cinemax which had broadcast The Max Headroom Show and Max Headroom: 20 Minutes into the Future, thus creating enough interest for ABC to ask for its own Max Headroom, which wound up as 20 Minutes into the Future. Max Headroom was also a big fan of New Coke ("Don't say the 'P' word"), several times over throughout 1986, and even took the Coke-Pepsi Taste Test.



    20 Minutes into the Future was the chosen title, as it was considered to have broader appeal than Max Headroom alone, although of course Max was heavily involved in the commercials. From a two-part appearance on NBC's Late Night with David Letterman to the cover of Newsweek, where he proclaimed, "I'm an image whose time has come," Max was everywhere.



    It was a complicated project to get going, but once it did, ABC found themselves with a contender on their hands. Americans had been primed to believe the worst of the news media by the media themselves ("Boy finds pet dog in tree, news at 11"), and the revelations over in Japan simply put another round of ammunition in the gun. Of course most Americans didn't know or care about the details, and it was already becoming the recent past, but they grasped the fact that the news had been… not news, a compliant organ of the government, and it had sunk in deep. It was into this environment that the first twelve episodes of season 1 (sized so that Channel 4 could split them into two six-episode series) premiered in the spring of 1987 on Tuesdays.[1]



    20 Minutes into the Future was, if not a hit or an instant success, certainly a solid performer. It had good demographics, an above-average audience for ABC, and received mountains of free publicity. Plus the multiple parties involved had put together a new deal with Coca-Cola, this time for Coca-Cola Classic (Max Headroom having previously promoted New Coke), which helped defray a surprising amount of the cost of the show. Max Headroom's apologies for New Coke became some of the stand-out advertising for Coca-Cola of that time period.



    The second season of 12 episodes premiered in the spring of 1988: ABC had decided to keep it to a fairly short run of shows, and even with its decent ratings couldn't find a good spot for it on the fall 1987 schedule. Instead Coca-Cola and the premiere of The Original Max Talking Headroom Show on Cinemax served as essentially free advertising, helping keep ABC's own advertising costs down and ensuring that Max Headroom would remain in the public consciousness. Just as with the first season, the overall ratings were good, not great, but the demographics in the Adults 18-49 bracket remained better than the average ABC show. However, long-standing Hollywood gossip centred on the fact that ABC was airing a show dedicated to attacking, essentially, a future ABC, and so as 20 Minutes into the Future closed out the second season widespread speculation considered it cancelled, destined to be only a brief pop culture icon.[2]


    The Writers Guild of America's 1988 strike proved to be a godsend for 20 Minutes into the Future. Produced over in the United Kingdom, it was immune from the strike, and indeed that's widely considered the reason it received a third season. It was the third season that became not just a critique of the news media but a blistering satirical take on government and corporations as well, depicting in its storylines how each side uses the other for exactly what it wants. It's also notable for finally understanding a little bit about computer networks (the competing rumours: a British writer on vacation in France saw the Minitel network; someone on staff finally read William Gibson's Neuromancer), and so the show managed to get a jump on the future internet.

    The third season of 18 episodes, beginning in September 1988, faced little competition at the start, and indeed the first several episodes landed in the Top 10 of the week, providing a major lift to ABC's overall brand as a series of new Max Headroom shorts began promoting other ABC shows (in Max Headroom's own particular way). Although competition soon picked up, 20 Minutes into the Future had established itself as a major fixture in the television landscape. Nevertheless ABC remained unhappy with what the show was doing, and soon the creators and Channel 4 found themselves locked in a struggle with the network over the future direction of the show.


    Max Headroom and his 20 Minutes into the Future TV show were, quite simply, the future. Yet as it came off a successful third season, how long could such a show last? At the very least it had kicked off a minor cyberpunk boom and brought certain ideas about science fiction to the public. Indeed, Star Trek: The New Frontier and 20 Minutes into the Future presented the most compelling and most mainstream potential visions of the future, and everybody else in Hollywood was taking notice.



    |||||


    [1] IOTL, of course, Max Headroom got only decent ratings in its first season of just six episodes. ITTL they've greenlit more episodes to start, and the show has done somewhat better.

    [2] IOTL the second season of Max Headroom moved to the Friday Night Death Slot in September 1987 and of course tanked. With better ratings it keeps the Tuesday slot, but has to wait until spring 1988. Indeed, OTL speculation held that one of the reasons for the Friday slot and the cancellation was that ABC executives were uneasy with the show.

    -----

    Just a short update, covering a show that could have been major being somewhat bigger ITTL. Its influence, of course, will start to affect things.

    Around a week ago a friend and I were out drinking and although I don't remember the exact question the conversation wound up about technology and he goes (to paraphrase) "what was that digital guy in the '80s with coke and TV" and naturally I go straight to "Max Headroom". Hence, this.
     
  • The New Year's Eve 1987 Technology Bash Video Game Cotillion (It's a Joint Party)

    Hot hits today! More hits on the way!

    (Sega Master System slogan upon launch in North America in 1987.)



    Sun Commodore (Is A Busy Bee)

    Coming off a successful 1986 Christmas season, the Atari 7800 essentially validated some of the time and expense involved in putting three major technology companies—Sun Microsystems, Commodore International, and Atari Corporation—into one new one, Sun Commodore, as well as acquiring Atari Games. However, the main problems lay ahead. Sun Commodore was running some five (or seven, depending on how you look at it) different operating systems across its various brands and needed to begin consolidation, as well as keep momentum on the Atari 7800 going.

    The Commodore 64 game ports had helped the 7800 considerably, so they doubled down on that, working hard to bring more developers on board and providing development tools to make creating new C64/7800 games easier. In addition they began wooing the major American publishers, such as Electronic Arts, to bring their computer-only game libraries to the 7800. For technical reasons much of that catalogue couldn't be brought over, but at the least they made inroads in mindshare.

    Throughout 1987 the Atari 7800 saw a reasonably steady stream of games head onto the console, validating their developer-centric approach (they charged lower royalties than Nintendo, for instance), and their internal Atari Games studio released several titles as well. Although they never again hit their peak 35% market share of the fourth quarter of '86, in 1987 they were able to retain a roughly 20% standing against Nintendo.

    With that ship steadied, Sun Commodore began planning for the future. The Atari 7800 dated from 1984, at least originally, and even against the marginally newer NES it was out of date in the fast-paced world of CPUs. So the Atari Panther project began. With the Amiga being quite popular for games, it was obvious that the Atari Panther would have to capitalize on the previous computer-console success, this time by making it easy to create Amiga and Panther games at the same time. For that they'd need a new CPU for both platforms, but luckily Sun Microsystems had been working on one.

    The only other major factor was Japan. The best-selling games of all time were from Japanese developers, and Sun Commodore had no foothold in that market, nor could they break Nintendo's grip on third-party Japanese developers by themselves. Something would have to be done. The search for partners began in earnest by mid-1987, and by the end of the year they had found what could be an ideal one.


    Sun Commodore also needed to get out of the Atari ST & XE business as fast as possible to cut costs and simplify their line-up, but the ST in particular was doing quite well. The Amiga 1000 ST was their first step: it possessed all the hardware features that made the Atari ST so compelling, particularly in the niche music field, while improving on the specifications of the Amiga 1000. It also contained the beta version of the application compatibility layer for the Atari ST and the beta version of the emulation layer for the Atari XE, Commodore 64, and 74. Being in beta, performance for Atari ST programs wasn't great (the other machines were slow enough to emulate fine, just as the Atari 7800 could run ported C64 games), but it was a major first step toward getting their ducks in a row. This would allow them to begin the wind-down of the Atari XE & ST lines.


    1987 saw the start of development on the SunAmiga operating system as well. Based on the existing SunOS (itself based on BSD Unix) it was the Grand Unifying Operating System which would bring in support (via emulation or a compatibility layer) for nearly everything Sun Commodore was selling and add user interface design improvements. This would, at least in theory and three to four years down the road, provide an upgrade path for everyone buying the current computers and save a great deal of resources as the many product lines could finally be trimmed down.


    The other major item was the CPU. The revolutionary chipset that powered the Amiga and enabled it to do video and audio processing well in advance of any consumer computer was also complicated and expensive, and the next generation of it certainly wasn't going to fit in the Atari Panther at a reasonable cost. With advancing CPU technology it was obvious that much of the work could be put on a single chip, and it just so happened that, before all the mergers & acquisitions, Sun Microsystems had been working on their own Reduced Instruction Set Computer (RISC) workstation-class CPU, although they lacked internal foundries. Perhaps if a partner could be found to help fund and maybe design it, a cut-down version could be used in a home console.

    Indeed, a partner could be found.


    Fujitsu in Japan was a major manufacturer of semiconductors but had seen virtually no non-Japanese success with the licensed designs they had tried (such as TRON) and had been thinking about either making their own probably-RISC design or finding another licensed design. With Sun Commodore making overtures to various companies, Fujitsu was by far the most interested. So in early 1987 the two companies joined forces on the SPARC design and (at the behest of the personal computer and console side of the business) began working on SparcLite, a lighter-weight version suitable for use in those consumer electronic items.

    As part of the deal Fujitsu obtained both a SunOS and an AmigaOS license and began work on their AmigaOS personal computer for the Japanese market, the FM Towns, aimed squarely at the multimedia and gaming markets there. They had previously been considering a modified version of DOS as the operating system, but AmigaOS was much better. NEC's PC-9801 wouldn't know what had hit it. They also asked for another compatibility layer, this time for their FMR50 computers, to be added to AmigaOS, which wouldn't be a (huge) problem. Of course the FM Towns wouldn't be out for another couple of years, but it didn't seem like NEC was planning a new computer.


    RISC CPUs Will Take Over The World (Maybe)

    In fact 1987 was a major year on the CPU front, as a variety of companies were working on RISC CPU designs. In particular three of them, at a fourth company's instigation, would join forces. Advanced Micro Devices (AMD) was designing a RISC CPU, the AMD 29000 (29k), for a 1987 launch; Motorola was working on the 88000 (m88k) for a 1988 launch; and IBM was in the preliminary stages of their own POWER design, based on their earlier pioneering work on RISC itself (the 801 project).

    The fourth company, naturally, was Apple. Reliant at the time on Motorola's 68k chips, they were well aware of Motorola's work. In addition they had been contemplating their own future chip plans, as they were unsure whether the 68k could compete with Intel's x86 or new i960 RISC chip. Obviously the day when they'd move to a new CPU architecture was some ways off (in fact software at the time couldn't even begin to emulate all the existing 68k programs on a new architecture), but it was coming. It would be in Apple's best interests to have the strongest possible alternative, if Intel wasn't chosen.

    Thus in early 1987 Apple began discussions with Motorola over the m88k and also brought Motorola around to Apple's way of thinking (once the promise of a Mac OS license was put on the table)[1]. Motorola began talks with a variety of companies and soon interested people at both AMD and IBM in the possibilities. In particular, if they could sell a lot of CPUs they could put together enough resources to compete with the Intel behemoth in the long run.

    Once talks began it became clear that although all sides had conflicting goals, it would be possible to eventually reconcile them. The AMD 29k was almost out the door, but it wasn't the workstation-class chip that the other partners needed, so that was out. However, AMD was interested in the workstation market and, once the 29k shipped, would have a team free. The m88k was a little further from release, and it seemed clear that SPARC, the AMD 29k, the Intel i960, and MIPS would have enough of an early-mover advantage that the m88k's improvements wouldn't matter, so it was folded into the new project. Finally, IBM was some time away from a launch, which would allow them to incorporate the other two CPU design teams into a larger effort. Once everything was signed… Power was a go, with a planned 1989 launch for workstations and servers, intended to be followed by PowerClear in 1991 for personal computers.[6]

    Apple in turn signed the documents for the Mac OS licenses, and Motorola promptly got to work on clones. AMD and Apple also started consulting on who AMD would sell their license to (as AMD had no intention of making Macs), and IBM began planning their own clone. This was an easy way to partially satisfy the two main factions at Apple regarding cloning, by limiting it to strategic partners: any clone sold would still make money for Apple, but only a few other companies would be making them.




    Sony Collects Ducks To Put In Order & NeXT Is Thinking About The Next Thing


    Sony had been planning to move into the content business. Music, movies, television, the entire collection of entertainment & Hollywood baubles that seem so very tempting when you can't look at the books. Now, well, now those plans were off the table. It's a little hard to secure financial backing when the banks themselves are under (if not exactly investigation, that doesn't fly in Japan's cosy world) a much closer eye.

    However, Sony is a major company making lots of things across the consumer electronics space… but not videogame consoles or computers. Yet they do have interesting technologies in the planning stages. The earthquake has essentially resulted in the cancellation not only of the content move but also of various other items, such as Digital Audio Tape (DAT), which was suffering major technical issues; it has also caused problems with the debates over the standards for CD-WO technology[2] and, in the competition's world, the Digital Compact Cassette (DCC).

    In fact this has left Sony in a disturbingly good position for a small team's side project… the MiniDisc[3]. Importantly, and (mostly) unlike either DAT or DCC, it can easily be used to store data, as the MiniDisc can be rewritten like a floppy disk, entirely unlike a CD. It can also hold around 160 MB of data (or 74 minutes of compressed audio), with projections that the second generation would match CDs at 650 MB and that formatting & software improvements would take older MiniDiscs up to 320 MB. The only real problem is that it can't be out the door until 1989… except maybe that isn't a problem.
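    As a quick plausibility check on those figures: 160 MB holding 74 minutes of audio implies a compression bitrate in the high 200s of kbps. The 160 MB and 74 minutes are from the text above; the arithmetic (and the assumption of decimal megabytes, ignoring format overhead) is mine:

    ```python
    # Implied audio bitrate if 160 MB stores 74 minutes of compressed audio.
    capacity_bits = 160 * 1_000_000 * 8  # 160 MB in bits (decimal MB assumed)
    seconds = 74 * 60                    # 74 minutes of playback
    bitrate_kbps = capacity_bits / seconds / 1000
    print(f"Implied bitrate: ~{bitrate_kbps:.0f} kbps")
    ```

    That works out to roughly 288 kbps, which is a sane figure for late-'80s perceptual audio compression at near-CD quality.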


    Steve Jobs left (was forced out of) Apple in 1986 and promptly founded a brand new computer company, NeXT. However, as part of his agreement with Apple he couldn't directly compete in Apple's markets. That made Jobs, as a particular subset of teenagers would say, kirk out[4]. Of course, that contract didn't say anything about getting some other company to do that work for him if the workstation market was looking a little crowded with Sun Commodore around.

    Sony had long thought about getting involved in the computing world, but there weren't any (good) homegrown Japanese operating systems, Windows handled the characters used in Japanese incredibly poorly, and neither AmigaOS nor the Mac OS was up for licensing. However, Steve Jobs and his new NeXT needed money, lots of it. Well. Lots for them, but not exactly lots for Sony.

    By the time Sony and NeXT started talking, the original NeXT plan of an Apple-like hardware/software combination was in full play, but by contract it could only operate in limited markets. Sony didn't have that problem. Steve Jobs didn't like divorcing hardware from software, but if he could at least control it, it wasn't so bad. In return for a great deal of cash, which gave them a major stake in the company, Sony gained an optional-exclusive licence on NeXT software: an additional party could be added only if Sony and NeXT both agreed. Of course NeXT had a fair amount of input on the internals of future Sony computers.

    NeXTStep, the object-oriented, multitasking operating system under development, was vastly more powerful than consumer OSs of the time such as AmigaOS or Mac OS, as it (like Sun Commodore's SunOS) was based on BSD Unix. Furthermore, Jobs had poached much of the best Apple staff to build it, so it would be at least as up to date on modern interface design as Mac OS. Sony understood that opportunity, at least if they were going into computers, and so they threw their muscle (and MiniDiscs) behind it.

    This allowed NeXT to focus more on the software side of things, although they were also designing NeXT workstations, and progress on NeXTStep moved ahead fairly rapidly, to the extent that Sony began designing their own computers for a 1989 release, matching up nicely with the planned release of the MiniDisc.


    Nin-ten-do Is A Catchy Marketing Jingle (And Much More)

    By 1987 Nintendo had fully recovered from 1986, and although their entry into Europe had been tough—against entrenched opposition from the Sega Master System, the Atari 7800, the C64/74, and the Amiga—they were making headway slowly but surely, helped out by their unbeatable collection of exclusive games. They had also found their footing after the potentially disastrous Christmas 1986, and new marketing centred on showing the gameplay differences between the NES and Atari 7800 had done wonders.

    Yet, like just about everybody else thinking about videogame consoles, it was also time for them to begin planning a new one. The key question was whether or not to go for a CD add-on, as that would enable vastly better graphics and sound and was becoming popular on the computer side. There were problems with CDs though, including loading times, the expense of the CD player (although not of the CDs themselves), and the question of how many people would buy the add-on. On that front they had encouraging news: the Famicom Disk System had sold reasonably well in Japan and Europe, although they hadn't offered it in North America, and despite developer grumblings the delay of larger cartridges had meant that the Disk System was required for a number of games, which had in turn helped its market penetration. Perhaps that strategy could be duplicated.



    Sega (Doesn't) Like A Wide Release

    1987 also saw Sega's fortunes improve. With increased supply and a launch throughout all of Western Europe before Nintendo got there (and already deeply established in smaller countries from their 1986 soft launch), their strategy had been vindicated, especially if one looked as well to Brazil, Australia, and New Zealand, where the Sega Master System was also the market leader. It was time for North America.

    …It wasn't time for North America. Maybe it had never been time for North America. The launch of the Sega Master System in the United States and Canada was an exercise in futility, with Nintendo and Atari having more or less locked up the market. Most Japanese third-party developers and publishers were barred by Nintendo from supporting the Sega Master System in America, and most Western developers and publishers were going with computers or the Atari 7800. Sega, with only a handful of exclusive internally developed games, was simply a non-factor. At least the Japanese launch of the Sega Master System had been smooth, as Sega finally replaced the Sega Mark III.

    That, naturally, would have to change for their new console, and so planning began right away. The hardware side was the easier one, and they made the same choice Nintendo was planning to make: a 16-bit cartridge system with a CD add-on, as Sega had noticed how well the Disk System had done for Nintendo. The software was tougher, but Sega strengthened their internal studios and started (multiple years in advance) the wooing process for both Japanese and Western developers and publishers.



    That Other Console Company (That Didn't Make A Console)


    NEC & Hudson Soft had seen their plans for a console interrupted, and then had seen that they would be too late to launch against Nintendo and Sega. Furthermore, they had no real presence outside Japan, and Sega was proving handily in North America that that was a major problem if you weren't first to market.

    The solution was obvious in retrospect, of course. In the fall of 1987, NEC, Hudson Soft, and Sun Commodore entered into talks. In return for providing games, both from Hudson Soft and through deals with Japanese third parties, NEC & Hudson Soft could simply rebrand the Atari Panther (and the Atari 7800, if they wanted) and make it themselves in Japan. NEC/Hudson Soft would collect the videogame royalties in Japan and remit a portion back to Sun Commodore.

    NEC & Hudson Soft saved a ton of money by not redoing all their development work for a console and gained exclusive translation/publishing rights to all Atari Games titles (but not vice versa); Sun Commodore gained access to the Japanese publishing and development community and a little extra money from Japan that they wouldn't have received otherwise. It was more or less a win-win.



    In Other News…

    Microsoft Windows 2.0 makes it out the door just before the new year ticks over, and in an amazing piece of technology (one the Mac OS has had since its launch) windows can overlap each other. Apple sues them.


    The state of the computing market remained in flux as the Commodore 64/74, Apple II, and IBM clones running MS-DOS duked it out on the lower end of the market while the Amiga, Mac OS, and Windows fought it out at the upper end. With no clear victor yet established in either market, it seems clear that the war will continue and widen.

    Go Corporation is the most well-funded start-up in Silicon Valley history.

    And in games a collection of amazing titles are released, including: Nintendo's Zelda II: The Adventure of Link (and The Legend of Zelda in North America/Europe); Konami's Castlevania, Metal Gear, and Contra games; Capcom's Street Fighter[5]; Square's Final Fantasy; Sega's Phantasy Star; and LucasFilm Games' Maniac Mansion.



    |||||


    [1] Jean-Louis Gassée was perhaps the strongest force preventing Apple from licensing Mac OS. With him gone that faction is weakened although not to the point where just anyone can get a license.

    [2] CD-R, in other words. The original name was CD-WO, for Compact Disc—Write Once. As it was a multi-company standard pre-release (unlike MiniDisc, which Sony widely licensed post-release), it is an easy thing to disrupt.

    [3] The MiniDisc was created in reaction to the failure of DAT. However it seems reasonable that it was a pre-existing project pushed onto the back burner by DAT, there simply isn't a lot of data about the internal workings of Japanese companies.

    [4] To really freak out or go crazy about something, especially while talking in short, clipped sentences. I'm sure one can guess the origin.

    [5] Yes, Mega Man fans, it has been delayed (but not cancelled). Sonic the Hedgehog fans, if Thande happens to read this and I feel like Yorkshire Rage, are out of luck.

    [6] Let's call this footnote the catch-all CPU section. IOTL Sun relied on Texas Instruments to fab the SPARC and eventually opened it up as a standard that all kinds of companies used, the most successful being Fujitsu, but because Sun didn't have their own fab they never quite got the top tier of chip designers going forward: ITTL Fujitsu functions as an in-house fab so Sun will have better people. In addition the pressure of needing a CPU for their PCs and consoles means they make a cut-down version (see PowerPC, OTL) for them which I've named the SparcLite for obvious reasons. That and the fabs is why they partner with Fujitsu.

    The AMD 29k was a fairly big success but AMD was mostly focused on making x86 CPUs and let it fall by the wayside to get run over by the Intel i960 which also fell by the wayside for different reasons. That more or less happens ITTL, but AMD invests a lot more effort in RISC design since they have partners. The Motorola m88k was a very nice CPU that came out too late. IBM's POWER, of course, led to PowerPC and ITTL it's a similar thing with PowerClear. AMD and Motorola are pretty much at the height of their RISC expertise and so work with IBM on much better terms than the OTL Motorola-IBM PowerPC partnership because by then Motorola was fading as a powerhouse with the m88k failure.

    ITTL Apple has a carrot (a Mac OS license) and a stick (Intel x86), and the existence of Sun Commodore, plus the potential of the SPARC with Fujitsu on board early, acts as a turning point in getting the various companies above together. Even more importantly, volume = money = the only way to compete with Intel in the long run. They all more or less knew this IOTL, but persisted in thinking that one more great RISC CPU would give them victory. ITTL the good-enough RISC (SPARC) has potential volume sales in Sun Commodore's line-up… so, they hustle.

    And if you love this CPU stuff (who are you? :)), there are a few more major players that haven't been introduced yet, but they're all orbiting one particular thing…. (Neither ARM nor MIPS is the only clue.)

    -----

    Would you believe I've never owned a MiniDisc player or even touched a MiniDisc? Just stared longingly at them on screen, where they are the de facto standard (or were) for cool future technology. More on them later.

    Technically it's wrong of me to refer to System x as Mac OS, but I didn't figure anyone would mind. The same goes for Workbench (the AmigaOS).

    A little dry, I admit, but I needed to do some groundwork, and I'm sure I screwed something up in that morass of CPUs. Also, the Sega Master System slogan followed by Zelda II: The Adventure of Link amused me.
     
  • The 1987 & 1988 Pop Culture NEWS OF THE WORLD Report

    Yippee-ki-yay, motherfucker.




    Star Trek: Boldly Going Into Ratings

    Fox has found their breakout show as Star Trek: The New Frontier continues to perform well on their network. The popularity of the syndicated episodes has brought a number of new affiliates to the network to take advantage of the "air the previous week's episode, switch to the Fox feed for the new episode" deal and Fox is planning to go to a full schedule for the 1988-1989 season. Among the many shows being considered are, of course, a number of science fiction programs.



    Star Trek Movie Watch: Shatner's Hairpiece Vs. The Studio

    The fifth Star Trek movie, to be directed by William Shatner as per his contract, is running into all kinds of problems with Paramount unhappy about the script. At the moment Paramount is attempting to get around Shatner's contract to bring in a new production team but with Shatner's ego on the line this could go in a number of directions. At least there's a major videogame coming for the classic Star Trek franchise.



    Moonlighting Thrown Into The Sunshine


    Rumours are swirling around the once-popular television show that the cast is ready to revolt. With the massive success of Die Hard turning Bruce Willis into a star and with Cybill Shepherd's children taking up her time, it looks like the show might not be coming back, and the decline in ratings (probably because neither star happens to be on screen much) isn't helping the situation. The idea of a two-hour movie event has been mooted, along with a potentially shortened final season, but nothing has been decided.

    Die Hard's blend of terrorism fake-outs, corporate malfeasance, and incredible action sequences has led it to the top of the box office and has given Bruce Willis more than enough clout to escape from Moonlighting. It's more or less assumed that he'll take the opportunity, but nobody knows what his contract requires.[1]



    The Corporations Are Coming For You

    It looks like multiple competing film projects are in the works in Hollywood, spurred by the popularity of cyberpunk, perhaps best known from 20 Minutes into the Future, the Max Headroom television show. Not much information yet, but Metropolis is the title of one of them, another is a sequel, and one or more of them is based on a book or short story. Interestingly, two of them share a writer.


    Star Trek: The New Frontier Faces Last Frontier?

    The upcoming Writers Guild of America strike is causing major problems for multiple television shows, the popular Star Trek being one of them. Apparently the producers are considering multiple British writers and consulting with Patrick Stewart about them, given his long-time work in British theatre. If they are unable to secure a writing deal, it seems probable that the second season will be the last.

    [2]


    Batman Can't Save Gotham


    With no final script turned in, it looks like Tim Burton's Batman has gone into turnaround, with the studio's problems with the cast and with Burton's vision contributing to its woes. The only thing known for certain is that production will not happen in time for a 1989 release. Michael Keaton has possibly already left the film, although Jack Nicholson remains committed to the project.



    Ghostbusters II Isn't Funny?

    Pre-production has halted on Ghostbusters II as Ivan Reitman and Bill Murray have demanded script rewrites despite the filming schedule. Columbia chief David Puttnam is apparently furious, but it appears the director and cast have the upper hand on this one, with Bill Murray willing to walk away without a new script.



    Strike Off?

    Potential compromise reached! It seems possible that the Writers Guild of America has reached a deal with the counterparties, and that means we have the Fall season of television after all. Hollywood has been widely affected, but if television shows can still get on the air for September then perhaps the networks will come out of this unscathed.



    Your License To Kill Has Been Revoked

    The planned 1990 James Bond film is apparently jeopardized by legal trouble. With the sale of MGM/UA being planned, Danjaq, the Swiss-based parent company of Eon Productions, has deep concerns about probable bidder Qintex and their relationship with Pathé. Nevertheless, pre-production has begun, but with the legal struggle adding to the woes of Pierce Brosnan's potential withdrawal, everything seems to be in flux for the next time we hear "Bond, James Bond" on screen.



    The Japanese Are Invading… With Telepathic Mutants?

    Massive Japanese animated hit Akira is coming to America for a fall 1988 release. The combination of near-future Tokyo and superpowers seems to have struck a chord with some executives at MGM/UA, and they are planning a major campaign for it. There hasn't been a mainstream animated movie aimed at adults in recent memory, and we're hearing that Disney is considering one as well, if Akira does decently at the box office.



    Strike Continues, All Bored

    The WGA strike has pushed back most television launches to at least October, leaving the networks scrambling for additional programming. This seems to have secured 20 Minutes into the Future a prime spot on ABC, as it's a British-produced show. However, that's one of the few scripted shows that will be making an appearance on time. Star Trek: The New Frontier closed out its second season with episodes written by a collection of British playwrights (although somewhat jarring, reviews were favourable), but a late third-season launch is causing problems at Fox.



    Animation Domination!

    The performance of Akira at the US box office seems to be leading to a boom in adult animation. Fox has announced The Simpsons, based on shorts from The Tracey Ullman Show, to premiere in the spring of 1990. Disney rumours continue about a new film aimed at an older audience. Finally, Hollywood money seems to be moving into a number of Japanese studios as they look for the next Akira.



    What Will James Cameron Blow Up Next?

    There aren't any details, but it appears that James Cameron has decided not to film his screenplay The Abyss because of an intriguing offer from a Hollywood studio for another project. After Terminator and Aliens the man is a hot commodity as a director, and it appears he'll be doing another science fiction movie. Aliens III and Terminator 2 both seem like strong contenders, but there's no confirmation as of yet, and there may be an entirely different movie in contention.



    Planes, Spies, And… Science Fiction?

    Fresh off the back-to-back hits of Top Gun and The Living Daylights, Tony Scott is one of perhaps a dozen directors considered the best "gets" in Hollywood. However, it seems like he might be working with his brother, Ridley Scott, on his next movie. The two brothers have very different styles, but if one wants to combine the thrilling action of The Living Daylights with the brilliance of Blade Runner… there won't be any complaints. Nothing is certain yet except that both Scotts are deeply involved with major projects.



    |||||


    [1] Die Hard has a somewhat different plot ITTL for obvious Japan-related reasons and does somewhat better at the box office. It also has a longer shoot than OTL, which makes the Moonlighting ratings problems worse.

    [2] The nacelles should not have that weird bend in the struts and should look more like the Excelsior nacelles but otherwise that's more or less how the alternate Enterprise-C looks. I apologize for not having any 3D model rendering skills.


    -----

    A short update to get my ducks in a row. After this we're moving on to cover technology and videogames in 1988-1989, and after that… well, those are surprises.
     
  • The Raging 1988-1989 Continuous Blowout of Tech & Videogames

    Welcome To The Next Level.

    (Sega Mars tagline in the United States.)



    Sun Commodore Stays The Course

    Sun Commodore opened the new year of 1988 with a pair of new computers, the Amiga 1500 and the Amiga 750, respectively the high- and low-end successors to the A1000 and A500. They also released the Atari 7800 Expansion Card, which added a high-score saving system, additional RAM, and a new sound chip. As part of this stop-gap solution until their Atari Panther console, they also revised the main system as the Atari 7800/Enhanced, which built the Expansion Card capabilities into the motherboard.

    Market share throughout the year remained in the 15-25% range in North America. However, NEC's launch of the PC-Engine (the Atari 7800/Enhanced model) in Japan, with a collection of translated Western games and several from Hudson Soft and the second tier of Japanese developers, was a niche success. NEC's loose restrictions on content, combined with the Western games, meant that it saw fairly quick uptake among several different groups of Japanese gamers. The low price might also have helped. As a brand exercise it was quite successful, bringing NEC & Hudson Soft into the console conversation.

    Development of the SparcLite CPU and the SunAmiga operating system continued apace with a 1991 launch planned for both and so Sun Commodore began to orient its product line towards that with new Amiga computers and the Atari Panther planned for that year. Meanwhile SPARC itself was powering the workstation side of the business into the global lead as performance was simply unmatched for the price.

    Sales of the C64/74 remained fairly strong, but signs of weakness were widespread. However, the lowered price of the Amiga 500 (and the reasonably low price of the Amiga 750), as well as full C64/74 support, began to transition C64/74 customers to the Amiga, with Sun Commodore's marketing explicitly pushing it as the obvious next computer to get. To help this transition they announced a trade-in program: an Amiga computer would be reduced in price if you brought in a C64 or C74. As part of a marketing campaign those trade-ins were refurbished and sold very cheaply in third-world countries.


    Throughout 1989 Sun Commodore continued its string of reasonable performance as its workstations, computers, and console all did very well. In Japan the PC-Engine had built a surprisingly strong foothold in the market, at least partially on novelty, and a number of Japanese games began to make their way over to the Atari 7800 in the United States and Europe. Several of the more enterprising Japanese developers also made C64/74 ports of their titles, introducing Japanese-style console games to a Western computer audience for the first time. To help that along, Sun Commodore released a cheap adaptor that let one use the 7800 controller on both the C64 and C74.

    In Japan, Fujitsu launched the FM Towns, an AmigaOS clone, touting its powerful graphics capability at a cheap price, and NEC found itself in serious trouble with no computer of its own able to compete.



    Sega Global

    Having been mostly shut out of the North American market with the Sega Master System, Sega would launch the new console there second, after Japan. In an innovative plan, Sega was working towards a global 1989 launch, rolling out in Japan, then North America, and then Europe and smaller markets. By all reports this 1989 launch would put them at least a year ahead of Nintendo and fully two years ahead of the Atari Panther. The obvious trade-off was that the games would potentially not look as good, but Sega was confident that being first would help them build a strong position against the Nintendo juggernaut.

    Having chosen not to go in-house, Sega turned to Matsushita Electric Industrial Co., Ltd (Panasonic, outside Japan) for development of a Compact Disc drive add-on for the console, after seeing the fair amount of success Nintendo had with the Famicom Disk System. This was scheduled for a 1991 launch opposite the Atari Panther, and Sega was strongly considering making it a powerful upgrade as well, perhaps with extra RAM or even an extra processor.

    Sega, in a major coup, also signed Electronic Arts to their new console, having promised to match Atari's royalties while naturally being much freer than Nintendo with restrictions. Perhaps most importantly, the deal ensured that John Madden Football would be a timed-exclusive launch title.[1]

    The Sega Mars launched in the spring of 1989 in Japan, and fantastic graphics paved the way for a good start in that market, especially since top-tier Japanese developers such as Capcom, Namco, and Konami all released titles for it. In 1989 those were mostly upgraded versions of NES games, but the show of support was important nevertheless. A summer 1989 launch in North America went much more smoothly than the failed Sega Master System launch, as Sega successfully advertised the graphics in commercials and suddenly the NES and Atari 7800 looked incredibly out of date. Finally, the joint fall-winter launch in Western Europe, Brazil, Australia, and New Zealand brought the Sega Mars into its strongholds, where reception was fantastic. With only the Amiga computers able to compete on graphics (and those cost rather more money), the Sega Mars was off to a roaring start. However, the games in 1989, with the notable exception of John Madden Football in the USA, were not particularly good, and Sega turned its efforts towards creating a new mascot….



    Nintendo & Apple Sitting In A Tree, K I S S I N G

    Nintendo had been looking for a potential partner strong in Western computer games to combat the Atari 7800's strong support there. For obvious reasons Sun Commodore, the strongest player in that market, was out, but the Apple II had been a longtime player in that space. The Apple IIGS CPU had been deliberately downclocked so as not to compete with the Macintosh; by adopting that CPU (although using different graphics chips) and speeding it up from 2.8 MHz to 4 MHz, Nintendo would allow fairly easy porting of Apple IIGS videogames.[2]

    Nintendo also began talks with various companies over the design of the freshly named Super NES, and by a complicated process Sony would make the sound chip.[3] This also gave Sony a leg up in bidding for the SNES CD drive, but bids were solicited from several other companies, including Philips. In the end Nintendo simply postponed the decision, with internal developers not entirely happy about the CD's slow loading times and the inability to save games on it.

    The NES continued to be a very successful console in Japan and North America and began making headway in Europe. Although not a major contender there, Nintendo did do better in Europe (a market about half the size of North America at the time) than Sega was doing in North America, and at the least, if you wanted to buy a NES you could find one.

    By 1989 Nintendo was once again in talks over a potential CD drive, and this time Sony suggested their brand new MiniDisc platform. Rewritable media and faster load times than CD (if less storage) strongly appealed to Nintendo, and both companies began work on the SNES MiniDisc Drive for a launch sometime in 1993.[4]


    Meanwhile Apple cut prices sharply on the Macintosh, trading margins for sales, as Motorola launched the Motorola Macintosh Mach I (the triple "m"), the first clone of the Apple Macintosh. It was followed in 1989 by the IBM Macintosh THINK Box. Meanwhile AMD entered negotiations to sell their Macintosh license to a third party.

    The Apple Newton project, started in 1987, continued with a variety of conflicting goals. It's notable that Del Yocam, Apple's COO, let the project continue despite its expense given his usually tough stance on such things. Indeed Apple's R&D had been heavily cut down from its lavish ways in an effort to streamline the company.

    The AMD-IBM-Motorola (AIM) POWER project taped out in the spring of 1989, and by the fall IBM had several servers for sale using it. More important to the personal computing world was the continued development of PowerClear, as Apple was hard at work adapting the Mac OS to run on it.


    Sony's NeXTStep

    Sony didn't have anything particularly interesting happen in 1988.

    On the other hand, 1989 was a huge year for Sony, with the release of NeXTStep 1.0, the MiniDisc, and of course the brand new computers running it.

    As Sony lacked an internal computer case design team, they turned to the Audio group of the company, which came up with a computer that, rather than being a beige box, looked like a high-end piece of audio equipment. The higher-ups at Sony were quite impressed and also had them design the MiniDisc drive.

    The Sony CyberDeck[5] launched with major fanfare in Japan. Besides running NeXTStep, these were gorgeous computers, equipped with an optional MiniDisc accessory and reasonably powerful, as they used the MIPS processor from MIPS Computer Systems. Naturally they were also rather expensive, so much of their sales in Japan were as workstations, a market with weak competitors in Japan at the time. However, they were also used as home computers by more upscale clients, and indeed their very existence brought major pressure to bear on the state of personal computing in Japan.


    The MiniDisc standard was released for license and Sony launched a variety of hardware:
    • Walkman MD.
    • Dual CD/MD Deck, designed to transfer CD tracks to an MD; it can also connect to a computer so the computer can use both CDs and MDs (this caused headaches in CD rental stores in Japan until rental-specific CDs began to be released).
    • MiniDisc Deck, for computers that already have a CD drive and don't need the dual CD/MD Deck.
    Sony also signed up a number of companies to put out albums in the MiniDisc format for people not buying the rather expensive at-launch Decks. Finally, the two MD Decks came with drivers for all major operating systems (as well as for several Japan-specific models) on an included MiniDisc, and Sony set to work getting the drivers included as part of standard OS releases.



    |||||


    [1] IOTL Sega backed down on cartridge fees to get John Madden Football. ITTL they drop that fee earlier and have done a good job courting developers, so they'll actually make more money, and earlier, as the game will be released in 1989 instead of 1990.

    [2] IOTL the SNES's Ricoh 5A22 was based on the same CPU. ITTL they just use the actual CPU, slightly faster than OTL, because of a deal with Apple to help them with cross-platform porting.

    [3] Similar to how it worked IOTL, despite butterflies. Ken Kutaragi likely still would have bought his daughter a Famicom and it's quite reasonable to assume he did the same things after that.

    [4] Feel free to get your pitchforks out, but I think you'd be wrong.

    [5] The audio group at the company has parlayed their case design into calling them "decks", with the "cyber" part capitalizing on the global rise of cyberpunk and "cyber" being seen as a cool, futuristic word.

    -----

    Note, of course, that pictures going forward are obviously going to be a little wrong sometimes. I hope you can overlook this, because I enjoy having all the pictures in this timeline and I hope y'all do as well.
     