The Doctor is Out

    Doctor Who had been on the air since 1963 – over two decades, an interminable run for any dramatic program. In that time, it had survived four changes in lead actor, but even as the series celebrated its 20th anniversary on the air, it did not seem nearly as invincible as it once had. The added competition from ITV-2 after 1981 ate into the show’s audience share, for one thing – younger viewers and families were more than happy to watch ever more lavish presentations of live football, game shows, and elaborate high-concept light entertainment rather than an aged, hackneyed science-fiction program with an ever-declining budget – not to mention its declining quality, another key factor in the show’s fall from grace. Doctor Who fandom was as diverse and tribal as Star Trek fandom, with consensus no easier to broker amongst its many factions. However, virtually all of them were agreed that, despite a promising enough start, the tenure of the Fifth Doctor represented a decline from the golden years of the Fourth Doctor (though opinions varied regarding its severity). Naturally, a minority contested this conventional wisdom, though many would not fully assert their positions until some years later.

    Fan criticisms of the Fifth Doctor era went out of their way to acquit Richard Griffiths, who played the Fifth Doctor. Griffiths was well-regarded in the role by most observers, both within and without the fandom, despite having the difficult task of succeeding the popular Jim Dale as the Fourth Doctor. Then again, Dale had succeeded Jon Pertwee, the man who was still seen as the Doctor, and pulled it off with aplomb. On the other hand, Dale had the benefit of audience goodwill due to being an unambiguously British Doctor with a British companion, whose adventures were scripted by British writers, with effects done in-house by the British technicians at the BBC, with no need to pander or kowtow to American interests. As a result, the fandom was far more willing to forgive the show’s many faults during that difficult transitional period because at least it was being true to itself. When Dale finally departed for greener pastures, it was at least in part because the bloom was off the rose.

    Novelty no longer playing much of a part in the show’s popularity, the writers and producers increasingly had to make do with nostalgia instead. Thus, all five Doctors were reunited for the 20th anniversary serial in 1983, entitled The Five Doctors. Sadly, not all five of the actors who had played them would be reunited: William Hartnell, who had played the First Doctor, had passed away in the years since his appearance in the previous milestone serial, The Three Doctors, in 1973. The role was recast with a reasonable facsimile of Hartnell. However, Pertwee, Dale, and Patrick Troughton all returned in their original roles – it marked Dale’s first and only appearance on the program following the end of his tenure. In a shocking twist, Roger Delgado also returned for the serial, reprising his role as the Doctor’s most iconic individual adversary, the Master. The plot involved him teaming with the Doctor’s most iconic collective adversary, the Daleks. Somehow, in spite of this awesome collusion of the universe’s greatest supervillains, all five Doctors, working in tandem, were able to defeat them.

    The Five Doctors was a smash-hit, easily the highest-rated serial of the Fifth Doctor’s tenure, pulling in Doctor Who’s best viewership figures of the 1980s. However, that serial’s success was sadly anomalous, a mere reprieve which only served to delay the inevitable, and in some ways accelerate it. With the welcome exception of the spike provided by The Five Doctors, the audience for Doctor Who seemed to erode with each passing serial. It seemed that every core demographic for the program sought other diversions: football or game shows elsewhere on the telly, or increasing amounts of time spent on the home microcomputer as the 1980s wore on and prices for new models continued to drop, making them affordable even for working-class households. British society, it seemed, was passing Doctor Who by. It was increasingly seen as a relic, a product of a different time entirely.

    Perhaps if the quality of Doctor Who had been maintained, this inexorable decline might not have taken place, but the peril of any show having been on the air for so long was that the young turks brought on to shake things up might well have been fans in their youth – and such was the case here. The new batch of writers were far more concerned with correcting – or, in the parlance of comic book writers, retconning – past inconsistencies or even “transgressions” against their own internal logic with regard to how the Doctor Who canon operated. They were also not always particularly subtle about making their thoughts known, often using characters as their personal mouthpieces. (The controversial “Yank Years” unsurprisingly served as a lightning rod for their revisionism.) Many serials were thus direct continuations of previous, 1970s-era serials, often with a twist that somehow invalidated the original.

    Despite having once been part of the Doctor Who fandom, many of the show’s newest batch of writers and producers seemed to feel that their elevation to their current positions set them apart from their former compatriots in some meaningful way. This was especially true of those fans whose conclusions about Doctor Who lore differed from their own. For example, late into the Fifth Doctor’s tenure, an adolescent male companion named William was introduced. Companions were usually intended as audience surrogates, and William (or “Billy”, as he was commonly known) was no exception; he was a smug, obnoxious, know-it-all brat who constantly argued with the Doctor, only to always be proven wrong. That many of his arguments were lifted verbatim from popular fan screeds drove home the point that William was a petulant satire of hardcore fans – or, rather, of those hardcore fans whose own beliefs differed from those of the writers, with the Doctor serving as author avatar. Deliberately and blatantly alienating the program’s core audience was perhaps not the best move the producers could have made to quell the exodus of viewers.

    Unsurprisingly, those in charge of Doctor Who not only disdained much of their show’s own history and its fanbase, but also their own superiors. The BBC, after all, had a chequered reputation as the overseers of Doctor Who. The Yank Years in particular stood as perhaps the most sordid chapter of the program’s very long run on BBC-1, with the fanbase growing increasingly irate at what they had deemed to be “misplaced priorities” on the part of the BBC, as if the program were nothing more than a commercial enterprise. Still, the long series of Controllers who presided over the fate of Doctor Who were surprisingly indulgent and tolerant with regards to its production. Or, at least, so they had been… all good things must come to an end, as too must all strategic management decisions.

    Enter Mark Lewin, who became the Controller of BBC-1 in 1983, in the run-up to The Five Doctors. Prior to becoming Controller, he’d heard the horror stories of raging egos running rampant on the set of Doctor Who (between the leaky nature of fandom contacts and the notoriously snoopy entertainment press in the UK, such stories were common knowledge), and his investigations into the matter confirmed his worst fears. He dreaded the disaster that the 20th anniversary serial seemed destined to become, but it surprisingly went off without a hitch – the presence of the “old guard” no doubt playing a role in tempering the young turks somewhat. In the long run, though, this might have been the worst possible outcome, as it raised Lewin’s expectations far beyond the point where they could ever be met. He made very clear that he expected the smooth production and successful result of The Five Doctors to be the “new normal”. However, those expectations would never be fulfilled.

    Indeed, the character of William was introduced in the serial that immediately followed The Five Doctors, quite effectively obliterating any lingering goodwill from it. Though William lasted less than a year, the damage was done. Richard Griffiths hated the character even more than the fans did, seeing him as a personification of all that was wrong with the production staff. Tired of being the public face of an increasingly maligned program, and of the positively noxious atmosphere behind the scenes, Griffiths announced his intention to leave Doctor Who at the end of the 1985 season. The press and the fandom naturally went into overdrive with speculation as to the casting of the Sixth Doctor… until they were faced with a rather rude awakening.

    Lewin announced in a press conference mere days after word of Griffiths’s departure was leaked that Doctor Who would be ending its 22-year run upon conclusion of the current (and thus final) season. The Fifth Doctor would therefore be the last – it was never specified just how many regenerations each Time Lord was allowed, although the implication had always been that it was a finite number. [1] No one had expected that number to be as low as four, but so it went. Despite the widespread loathing within the Doctor Who fandom toward the creative team, many fans still reacted with outrage at the show’s cancellation, which made national headlines (if occasionally in derisive tones). Many eulogized Doctor Who as well past its prime, coasting by on nostalgia, and worthy of this chance at a definitive conclusion rather than a slower, more lingering death by a thousand cuts. After all, the BBC would still own the rights to Doctor Who, and might possibly bring it back in some other form in future, as had been the case for Star Trek. Indeed, in many ways Star Trek served as both a rallying point and an inspiration for disheartened Doctor Who fans. After all, Star Trek had ended its original UK run over a decade earlier, and yet it remained a mainstay of British television. Doctor Who had been running for over four times as long as Star Trek had, and had accordingly accrued an episode count fourfold that of Star Trek. Doctor Who would no doubt enjoy a very long and fruitful life in syndication; it would also provide an opportunity for many younger viewers to watch 1960s-era Who for the first time.

    And so it was that the Doctor died once and for all, naturally in the course of preventing the destruction of the universe and all its inhabitants, but not before revealing his name – which turned out to be a Gallifreyan word meaning “doctor”: “you see, my dear, it’s an occupational surname, not unlike your own ‘Baker’ or ‘Smith’”, as the Doctor himself explained. This reveal was considered distinctly underwhelming, though most rational observers agreed that very likely any name would have done. Despite this obvious disappointment, responses to Requiem for a Time Lord, the final serial, were generally positive; viewership figures were the highest since those of The Five Doctors two years earlier, allowing the Doctor to go out with a proper sendoff. The Doctor’s final words comforted his companions – and the audience – by reminding them that there were many other Time Lords, and that perhaps the nobility of his sacrifice might inspire one or more of them to carry on his legacy.

    Meanwhile, Mark Lewin was almost immediately promoted to BBC Director-General following the finale of Doctor Who, and though his successor was urged to reverse his decision to cancel the program, he never did (albeit possibly for fear of being overruled by his superior, who had made a name for himself crusading against behind-the-scenes excesses). Lewin would enjoy a successful career in television broadcasting, with the curious exception of being the first BBC-1 Controller not to be knighted by Her Majesty the Queen, for reasons which remain, not unlike the Doctor himself, somewhat enigmatic…

    ---

    [1] The serial establishing that Time Lords are allowed no more than twelve regenerations IOTL was “The Deadly Assassin”, which aired in 1976. ITTL, no such limit is ever put into place, with the writers wisely deciding not to saddle themselves – or rather, their successors – with having to write around that limit once it was reached… as happened IOTL with “The Time of the Doctor” in 2013, thirty-seven years after the limit was established, in which (for lack of a better analogy) the Doctor was effectively given a “continue” after having lost all his original lives.

    ---

    Thanks to e of pi for assisting with the editing, as usual.

    Thus concludes the history of Doctor Who ITTL. It will not be returning before September 20, 1986, so it will not be covered in any further detail. Feel free to speculate as to its future, but don’t expect me to confirm or deny any of your suppositions. Your only hope will be for me to write a sequel TL! (Which probably isn’t going to happen, by the way, so don’t get your hopes up waiting for one.)
     
    One Last Night at the Movies

    1984 was a year which had been immortalized in popular culture some decades before it had actually come to pass, courtesy of the seminal dystopian novel of the same name, written by George Orwell. Published in 1949, Orwell’s novel was reflective of the times in which it was written. The Attlee government was rapidly and thoroughly nationalizing each and every sector of British industry, and wartime rationing remained in place. [1] The Soviet Union had forcibly installed Communist governments in the states which had fallen under its sphere of influence, including in East Germany. And the spectre of fascism, which had so desperately sought to maintain control over its people through the mastery of propaganda techniques, continued to loom large over the European consciousness. However, times – and trends – had changed a great deal since 1949, and the year 1984 that would actually come to pass was very different from the future Orwell had envisioned. Instead, it much more resembled what Aldous Huxley had described in Brave New World: people did not need to be deceived, only distracted. The traditional paradigm of panem et circenses had re-asserted itself in these more complacent times.

    However, even the very nature of panem et circenses itself had changed since Orwell’s time. In 1949, the American motion picture industry still operated under the Hays Code, promoting restrictive values of decency and propriety over freedom of expression. Although the breakup of the Golden Age movie studios had already been set in motion with the Paramount Decision the year before, it would take some time before its effects would trickle down into the nitty-gritty of how movies were made in Hollywood – even as their artistic supremacy and mass popularity were increasingly challenged by foreign imports, particularly from the United Kingdom. By 1984, in the wake of decades-old Supreme Court decisions validating the artistic legitimacy of film as a medium, and movements such as France’s nouvelle vague and the American New Hollywood generation it helped to inspire, all of these constraints had been eliminated, but at the cost of creating new challenges. Every filmmaker in the modern age, not to mention every executive at every studio, found themselves increasingly forced to walk the fine line between art and entertainment.

    One director who seemed to consistently manage to hit the sweet spot was Stanley Kubrick, though it was obvious that he did so with great care and deliberation. His reputation as a perfectionist preceded him (best demonstrated by his legendary filing system, which involved thousands of boxes and was of his own design), and horror stories direct from his sets were a constant fixture in the trade papers. However, though he was widely known and feared as a tyrant, his films were always financially successful – even taking into account the inevitable time delays and cost overruns, which the savvy studio chief at MGM, Edgar Bronfman, was shrewdly beginning to factor into the budgeting of each new Kubrick production. Still, following up on Napoleon – the biggest hit of 1971, winner of seven Oscars, and eventual contender for greatest film ever made – would be no mean feat for Kubrick, and several of his big ideas went nowhere.

    The early-1970s were the height of porno chic and the peak of its commercial viability (with two pornographic films among the Top 10 at the year-end box-office for 1972), which led Kubrick to briefly toy with the idea of revisiting his former collaborator Terry Southern’s proposal, Blue Movie, a high-budget pornographic film with big-name stars and genuine artistic merit. When Kubrick had initially rejected the idea, Southern had decided to publish Blue Movie in book form; much like Dr. Strangelove, the story evolved considerably during development, and the book as published was a satire of Hollywood filmmaking. Kubrick expressed some interest in adapting the published Blue Movie as-is; instead of making a porno, he would be making a self-referential critique of the motion picture industry in the vein of Sunset Boulevard. He even offered the role of “Boris Adrian”, the character thinly based on himself, to Peter Sellers. Ultimately, however, plans to adapt Blue Movie faltered. Napoleon had become known for its explicit nudity and sexual activity, which had overshadowed the blood, gore, and violence depicted in the battle scenes, despite the sexual scenes being far fewer in number and much shorter in overall duration. Kubrick felt that he would have to go even further with Blue Movie, filming graphic and unsimulated sexual acts and showing them uncensored – albeit shot and edited in such a way as to de-glamourize them, to emphasize the satirical nature of the film. However, as had been the case earlier, Kubrick continued to doubt whether he was up to the daunting task ahead of him; even he had his limits. Bronfman, extremely wary of the prospects of such a film, not to mention the backlash he would face in distributing it and the possible boardroom coup that might result once it was greenlit, advised Kubrick to find a different project. Blue Movie was never optioned by MGM, and it was never adapted into a major motion picture.

    Kubrick did like the challenge of presenting previously “taboo” material in such a way as to demystify it for audiences, which inspired his next choice of project. He had been considering directing a film about the Holocaust for some time, and when he made Bronfman aware of this while the two were discussing potential alternatives to Blue Movie, the archivist was dispatched to comb through the studio’s records for any such scripts on file. [2] Soon enough, one was found: a film treatment called To the Last Hour, written in 1964 by none other than Howard Koch – who had won an Oscar for writing the screenplay to Casablanca. To the Last Hour was, improbably enough, an inspirational Holocaust story, concerning a repentant Nazi profiteer named Oskar Schindler, who requisitioned Jewish workers for his enamelware factory during World War II, but eventually – at great personal risk and the cost of his entire fortune – decided to rescue them from their ultimate fate in the concentration camps. The lives of over a thousand Jews were saved as the direct result of his intervention, and he had been named Righteous Among the Nations by the government of Israel in 1963. However, he was unable to match his tremendous success as a human being in any of his business ventures, and had lived out a meagre existence following the war largely on the remittances of the many Jews whose lives he had saved – the Schindlerjuden, as they were known in German. Schindler had received $20,000 for the film treatment in 1964 – not an insignificant sum of money in those days – but it did not last long. Something about the squalor of Schindler’s existence spoke to Kubrick, the great irony of the situation appealing to his sense of narrative as a writer. So, too, did the insignificance of his actions against the backdrop of the wider Holocaust – six million Jews had died, and he had saved only 1,200 of them, or just one out of every five thousand. [3]

    Bronfman, being of Jewish heritage himself, was supportive of Kubrick’s intention to direct a film about the Holocaust. Howard Koch was still alive when pre-production officially began in 1972, and flew out to England to discuss his treatment and his own original plans for how the script would have been drafted at that time. Kubrick commissioned Koch to write the screenplay, though he would heavily revise and edit the material to suit his own vision. Koch, a veteran of the Golden Age of Hollywood, and well aware of the need for screenwriters to bow to the demands of domineering taskmasters – be they directors, producers, or studio heads – acquiesced, and the two worked together amiably. Ultimately, they would share screenplay credit in the finished film.

    The two agreed that for the film to work, there had to be two central themes which would give it a suitably (and ironically) “epic” feel: Schindler’s selfless act of heroism contrasted against the majority of Germans (and Poles and Czechs and others) who did nothing; and the rescue of those precious few Jews in contrast to the great many who were murdered, often in horrifying (and gruesomely explicit) ways. Therefore, Schindler and his story would merely be the focal point of a broader narrative. This would anchor Kubrick’s ambitions: realistically, he could not tell a story about the totality of the Holocaust, because he reasoned such scope to be outside the capability of the cinematic art form. However, neither could he focus too closely on Schindler, because he felt that would cheapen the contrasting situation which made him stand out in the first place. Schindler’s redemption and rescue would be a single ray of light breaking through a vast sky of unrelenting gloom. Kubrick also liked the idea of ending the film with an epilogue which focused on Schindler’s present-day life of poverty. He had done the right thing, but he had not received his rightful rewards, a cruel subversion of the expectations of American filmgoers in particular. [4] Since Oskar Schindler himself was still alive and living in Frankfurt, he was extensively interviewed by Kubrick and allowed a role in the production. Bronfman was aware of the potential bad press which would emerge were they perceived as taking advantage of Schindler, so he arranged for a small stipend to be sent to him for the duration of the film’s production, and for Schindler to receive a token share of the film’s grosses upon release. Between them, this steady stream of income would cover his modest living expenses. Bronfman referred to it privately as his “hero’s pension”, a term which (at Schindler’s own insistence) was never used for promotional purposes.

    Schindler served as an informal consultant during the making of the film, though he was never formally credited as such (instead receiving a “special thanks” credit). Consultant credit was awarded to one of the Schindlerjuden, Leopold Pfefferberg, who had been instrumental in arranging the commission of the original To the Last Hour treatment in the 1960s, having long been a passionate advocate of making Schindler’s story known. [5] He and Kubrick did not get along, and he was swiftly banned from the set. Ironically, though Kubrick did his best to play down Schindler’s achievements as part of his vision for the film, Schindler himself fully supported the thesis it presented (“I could have done more, as we all could have done more”), and he was shown the rough cut at a screening in London (alongside Bronfman, Pfefferberg, and several others) some weeks before the film’s premiere in Frankfurt in late 1975.

    By this time, Schindler was in failing health, and he died in early 1976 – having lived just long enough to finally receive the recognition he deserved. [6] He was buried on Mount Zion in Israel, the only member of the Nazi Party to be so honoured. Schindler’s share of the grosses was left to his widow, Emilie, from whom he had separated and who now lived in Argentina, allowing her to eke out a comfortable existence for the rest of her days; despite the graphic violence in the film, and the extremely harrowing subject matter, Oskar Schindler was a financial success, grossing over $50 million worldwide, against a budget of less than $5 million. Many politicians and other celebrities urged the public to view the film, which would eventually become required viewing in classrooms the world over. The audience for the premiere in Frankfurt included such luminaries as the Chancellor of Germany, Helmut Schmidt. At the Academy Awards, Oskar Schindler was nominated for several Oscars (fittingly enough), including Best Picture, Best Director for Kubrick, and Best Original Screenplay for Kubrick and Koch (the first time a Kubrick film had received a nomination in that category). However, it won only for Original Screenplay, accepted by Koch alone (as Kubrick’s fear of flying once again prevented him from attending the ceremony in person). Koch dedicated his win, on March 29, 1976, to “the late” Oskar Schindler; word of his death had reached the United States just days before.

    Kubrick certainly wasn’t the only director who had learned to walk the line between art and entertainment. One of his great contemporaries in this category, David Lean, also made movies for MGM, and also faced the challenge of where to go from his last project for the studio – in Lean’s case, it had been the romantic historical drama, Ryan’s Daughter. This, in turn, had followed a string of action-adventure epics: The Bridge on the River Kwai, Lawrence of Arabia, and Doctor Zhivago, all of which had enjoyed financial success on the same scale as their epic productions. Ryan’s Daughter, a more muted production, had also met with more muted success; Edgar Bronfman made clear that he expected the next David Lean film to be a smash-hit on the order of all those previous. Doctor Zhivago, after all, was one of the highest-grossing films of all time, an achievement obscured by its having been released in the same year as The Sound of Music. The making of Ryan’s Daughter had been a headache for Lean, even by the standards of his notoriously troubled productions. At least in returning to epics, he would be treading familiar ground – or rather, sailing through friendly waters.

    Lean and his writing partner since Lawrence of Arabia, Robert Bolt, sought to tackle a film adaptation of The Mutiny on the Bounty, which already had a long and storied history on the silver screen. It had twice been adapted by Hollywood: first in 1935, starring Charles Laughton and Clark Gable, and winning the Academy Award for Best Picture; then again in 1962, a notorious flop starring Marlon Brando which capsized his career. One problem Edgar Bronfman had with David Lean making a Mutiny on the Bounty movie under the auspices of MGM was that the studio had already been responsible not only for the 1935 original, but also for the 1962 remake – an attempt to make lightning strike twice after their 1925 silent epic, Ben-Hur, had been successfully remade in 1959, smashing box-office records and winning 11 Academy Awards, an all-time high. But David Lean was nothing if not doggedly determined, and he ultimately departed from MGM in the late-1970s when it became clear that they would not revisit the subject matter for a third time. Lean and Bolt shopped their concept around on both sides of the Pond, and found an interested buyer in the Baron Grade of Elstree, a television impresario who had only recently moved into the motion picture industry. [7] Lean and Bolt saw in Grade another Sam Spiegel or Carlo Ponti – all men with vast coffers and a willingness to prove themselves as entertainers. Lord Grade took advantage of the newly-instituted arrangements between the various Commonwealth Realms under the Commonwealth Trade Agreement, and the film was ultimately a United Kingdom-Australia co-production, with filming done in Australia – one of the few parts of the world Lean had not shot footage in up to that point.

    In the follow-up to his star-making role as Clark Kent aka Superman, Kirk Allen starred as Fletcher Christian, the leader of the mutiny. An obscure British stage and television actor, Anthony Hopkins, was personally chosen by Lean to play William Bligh, captain of the Bounty. Lean’s old collaborator, Sir Alec Guinness (who had not appeared in Ryan’s Daughter because the role Lean had wanted him for, Father Collins, conflicted strongly with his Catholic views), appeared as Admiral Viscount Hood, a member (in the film, the presiding officer) of the court-martial which acquitted Bligh of blame for the loss of his ship. Although the film was called Mutiny on the Bounty (as the previous two versions well-known to American and British audiences had been), it was not a direct adaptation of the 1932 novel, being sourced instead from Bligh’s diaries and other primary sources. Lean filmed Bounty in his typical lavish style, with panoramic vistas of the open ocean highlighting the isolation of the ship amidst the high seas. These also served to emphasize, by contrast, the confined space aboard the Bounty, with Lean taking inspiration from WWII-era submarine movies in this regard. The film, unlike previous versions, did its best to avoid taking sides in the power struggle between Bligh’s loyalists and Christian’s mutineers, and portrayed both leaders sympathetically.

    The Bounty herself was played by a modified sixth-rate frigate replica, the Rose, constructed in 1970 in Lunenburg, Nova Scotia, Canada. [8] Naval enthusiasts complained that the Rose was twice as massive as the historical Bounty and half again as long, and that an actual replica Bounty had been built at Lunenburg in 1960, for the express purpose of starring as the Bounty in the 1962 adaptation. However, perhaps the only group of people more superstitious than sailors were film executives, and all parties involved agreed that recasting the 1960 Bounty would be a jinx on this new production. Bronfman, during preliminary discussions on remaking the film at MGM, had been insistent that the studio would not be paying to build a new Bounty yet again, a directive which Lean kept close to his chest. His scouts had already found the Rose when he presented his pitch to Grade, who loved the idea of saving money by casting her as the merchant vessel despite her… over-qualifications for the part.

    Mutiny on the Bounty was a massive success, along the lines of Kwai, Lawrence, and Zhivago, restoring Lean’s reputation as perhaps the foremost maker of epic films. Critics praised both Allen and Hopkins, with Allen confirmed as “not just a one-trick pony” (British critics even praised his English accent) and Hopkins celebrated as a “revelation”; both were nominated for the Academy Award for Best Actor. However, Bounty itself won none of the marquee awards for which it was nominated (including Best Picture, Best Director for Lean, and Screenplay for Bolt), though it did win several technical awards, including Best Cinematography, Best Original Score, and Best Art Direction-Set Decoration. Audiences the world over flocked to see the film, with the debate over whether Bligh or Christian was “right” raging across coffee tables, water coolers, and talk shows in 1980, and the film itself becoming the highest-grossing picture of that year.

    And then there was Steven Spielberg, who despite being a charter member of the New Hollywood generation was also seen as the father of the so-called “Blockbuster Age” which had succeeded it; the key logistical factor was the transition, pioneered by Jaws, from traditional market-to-market “roadshow”-style releases to simultaneous “wide” releases, wherein films would open on as many screens as possible in their first weekend of release. Though Spielberg had built this reputation on Jaws, he personally considered that film a purely mercenary, commercial enterprise – especially when he stacked it up against Oskar Schindler, a fellow nominee for Best Picture at the Academy Awards recognising the best in film for 1975. Kubrick was a professional inspiration to him; in interviews on the subject, he would often cite Napoleon as the greatest film ever made. Kubrick’s great critical and solid financial successes as an auteur – even Oskar Schindler had been profitable – inspired Spielberg more directly, including on his next choice of project. Like so many young people in the 1960s and 1970s, Spielberg was fascinated by the seemingly endless possibilities of an infinite universe. Aliens, in particular, fascinated him. The tiresome clichés of little green men and inscrutable invaders were not nearly so appealing to him, however, and he decided to direct a film about aliens who came in peace. First, however, James Bond had come calling, and he directed both Live and Let Die (in 1976) and The Man with the Golden Gun (in 1978). Eon Productions wanted him to return to direct The Spy Who Loved Me (ultimately released without him in 1980), but he decided that it was time to move on.

    He had long nurtured an outline for a film about humans making contact with extraterrestrials. As had been the case with Jaws, his concept was intimate in scope, focusing on the plight of a small group of people. He planned to make this movie through a development deal with Columbia, though it was put on hold after Jaws so that Spielberg could work on the James Bond films for United Artists. But by 1978, he was willing to revisit the concept, especially after Journey of the Force, written and directed by his close personal friend George Lucas, became such a smash – even outgrossing Jaws in the process. But Spielberg repeatedly ran into difficulties in getting his film green-lit, particularly as a result of clashes with studio higher-ups. Executives wanted an action-packed adventure story in the vein of Spielberg’s Bond films or Journey, and screenwriters were suggested to him with that directive in mind. Spielberg himself wanted to write a deliberately-paced, meditative piece on the nature of humanity and our place in the cosmos – his own answer to 2001, a film whose shadow still loomed large over science-fiction fandom, even over a decade after its initial release. Columbia was hesitant, but Spielberg used his clout as a proven hitmaker – his last three movies had all been smash successes – to get the studio to see things his way. It was still the late-1970s, after all, and people still at least paid lip service to the auteur theory, even if only to bemoan it in private.

    The film was given the title Close Encounters, a reference to a term used in ufology to describe contact with extraterrestrials. The title implied the setting: present-day Earth. Though it was a science-fiction film, all of the futuristic technology was in the hands of the alien visitors. Lawrence Kasdan, who had written the smash-hit interracial romance film The Bodyguard, starring Diana Ross and Steve McQueen, was ultimately commissioned to flesh Steven Spielberg’s treatment out into a script. [9] Although Spielberg himself essentially co-wrote the script (especially later drafts) alongside Kasdan, the byzantine rules of the WGA assigned credit for the screenplay solely to Kasdan; Spielberg was given story credit. [10] Spielberg cast Jon Voight, who had played Hooper in Jaws, in the lead role of a scientist who was responsible for making initial contact with the aliens. [11] The film also focused on his family, particularly his relationship with his children. His youngest son, played by Raymond Read, won critical plaudits for his convincing performance, despite being the tender age of seven during filming. He became the youngest-ever actor to be nominated for an Oscar when he received a nod for Best Supporting Actor, though he ultimately did not win. [12] In one of the film’s more curious casting choices, and in yet another nod by Spielberg to one of his filmmaking inspirations, the French New Wave director Francois Truffaut played the ufologist, despite his uncertain grasp of the English language. Close Encounters finished at #2 in the worldwide box-office for 1980, behind only Mutiny on the Bounty. In addition to Read’s nomination for Best Supporting Actor, the film also received nods for Best Picture, Best Director for Spielberg, and Best Original Screenplay for Kasdan.

    Spielberg considered himself among august company at the Oscars that year, for in addition to David Lean, he was also sharing space with Stanley Kubrick, who had decided to follow up Schindler with a sunnier, lighter project – at least, by his standards. Inspired by several films of the early-1970s, particularly the works of Peter Bogdanovich (director of The Exorcist and Chinatown), Kubrick sought to direct a modern take on the venerable film noir genre, one which could be as blunt and explicit in its depiction of violence and gore as the classic noirs had been stylized and obfuscatory. Kubrick had directed the classic film noir Killer’s Kiss and wished to revisit the genre. He was inspired by the rash of “serial killers” in the late-1970s… as were many authors whose works he now sought to adapt.

    The story goes that Kubrick was in his office, searching for the right crime novel to adapt – with a large number of paperback novels in a rather intimidating pile. One by one, he would pick a book from the pile, leaf through the opening pages, and then throw it against the wall in frustration once he decided that the book was not what he was looking for. [13] The book would always make a loud thud against the wall, startling his secretary on the other side… until, after a while, she realized she had been bracing herself for a thud which never came. Intrigued, she peeked into his office and noticed him engrossed in the recently-published Thomas Harris novel The Lion of Judah, a psychological thriller and horror story about a fanatical serial killer committing murders according to his interpretation of the Book of Revelation. [14] When she brought this to his attention, he paused for a moment to consider what his captivation might mean… before curtly informing her to “buy the rights”. Without another word, he returned to his reading as she headed off to comply.

    The Lion of Judah had intrigued Kubrick for several reasons. The religious imagery, which might have otherwise alienated potential audiences (along with himself), functioned primarily as window-dressing to a book which focused on the FBI agent tasked with discovering the Lion’s identity, and then capturing him. The central “twist” of the book relative to other serial killer fiction popular at the time was that the agent was able to track down the Lion with the help of another serial killer who was in FBI custody, the notorious Hannibal the Cannibal. [15] As it happened, Kubrick had his own ideas about the story and characters which were… considerably different from those of Harris, and he was not afraid to bring them to life. Kubrick had never been much of a collaborator – he and Kirk Douglas constantly clashed on the set of Spartacus – and he never had much respect for writers of the works he adapted for the screen. It didn’t help that Harris himself was extremely reclusive – years later, Kubrick would remark that he had never met the man in person; Harris did not even attend the film’s premiere in Washington, D.C. (Kubrick himself attended only after chartering transatlantic passage by ship.)

    Despite the American premiere and setting, the film itself was mostly shot at various locations around England, rather awkwardly standing in for the Midwestern United States. The cast, however, was indeed mostly American – only Hannibal the Cannibal himself was British, played by Peter Sellers, in his final collaboration with Kubrick and his final acting role, full stop – he died shortly after the end of filming. He received a posthumous nomination – and win – for Best Supporting Actor in 1981 for his performance as Hannibal the Cannibal. As was to be expected in any film where Sellers was part of the cast, he overshadowed most of the other players; in particular, the lead actor, playing the FBI agent who tracks down the Lion of Judah with Hannibal’s help, was savaged by critics as particularly bland and unmemorable. On the whole, the film was a lighter experience than Oskar Schindler – although almost anything would have been. At Edgar Bronfman’s insistence, Kubrick cut the film so that it would receive an “R” rating in the United States, also ensuring that it would receive a “15” certificate in the United Kingdom. He thus focused on what was left unseen to create the requisite atmosphere of gloom and dread. The more accessible rating paid off, resulting in another hit for Kubrick and for MGM – in fact The Lion of Judah was even more successful at the box-office than Oskar Schindler had been.

    Kubrick’s continued emphasis on darker, more brooding works might have had an influence on Spielberg, because the younger director decided to follow Close Encounters with another science-fiction film about alien visitors, the proverbial “evil twin” to his earlier picture. This was done at the behest of Columbia, whose executives naturally wanted a sequel. However, it evolved into a standalone project shortly thereafter: Spielberg felt that Close Encounters told a complete story, and he was leery of directing sequels in general, the sequel to his Jaws having been a critical and commercial disappointment (and despite that, a third film – which promised to be an even bigger disaster – was still in development). As a result, Spielberg’s idea evolved into Watch the Skies, reusing the same working title as Close Encounters but with a very different plot. Watch the Skies was based on the infamous Kelly-Hopkinsville encounter, which Spielberg had learned about during the production of Close Encounters; this incident had cemented in the popular imagination the idea of malevolent aliens visiting remote farmhouses and mutilating livestock. Spielberg wrote a treatment loosely based on the original encounter, but told from the perspective of the farming family encountering the aliens. [16] For more proof that Spielberg intended Watch the Skies as an “evil twin” to Close Encounters, Lawrence Kasdan was once again brought on to write the script, although Spielberg considered other writers, such as John Sayles, before ultimately re-teaming with Kasdan. Executives at Columbia, much to their own astonishment, loved Kasdan’s first draft and green-lit the film, which immediately went into pre-production.

    Spielberg and Kasdan agreed to rename the film Night Skies, deeming it more evocative than Watch the Skies (“it sounds like an Army Air Forces recruitment film”, as George Lucas had remarked of the original title). It would feature a small group of aliens – eleven in Spielberg’s original treatment, though this number was continually cut down throughout the pre-production phase. In Kasdan’s first draft, there were eight – seven evil, each representing one of the deadly sins, and an eighth, the token good alien, who befriended an autistic child, Emmett (who would be played by Raymond Read from Close Encounters, in yet another production link to the earlier film). However, it was decided that a few of the sins would not translate well to a film intended for general audiences. Lust was eliminated because Spielberg and Kasdan felt it would be best to avoid raising the question of… interspecies compatibility. Sloth was eliminated because it did not fit the concept of intrepid alien scientists travelling to a distant planet to study and dissect its native life. Avarice and gluttony were consolidated into the personality of a single alien, leaving just four “sinful” extraterrestrials, and five altogether. The animatronic characters were designed by Rick Baker, who had worked on the makeup for Journey of the Force, in what became the breakthrough film of his career. His prototype design for “E.T.” (short for extra-terrestrial), the good alien (given that nickname by Emmett), cost nearly $100,000, but would form the basis for some of the most iconic alien creature designs in history.

    Night Skies was also the most gory and visceral film yet directed by Spielberg, very nearly earning the film an R-rating from the MPAA. On appeal, it was reduced to a PG, though it soon became clear to all involved that there was a gap in the MPAA ratings system that needed to be filled. Those involved with the making of the film, alongside critics and other commentators within the entertainment industry, and particularly those outside of it, for once seemed united in their assessment of Night Skies – it was too violent and bloody for a PG-rating, but too tame and family-friendly for an R-rating. The MPAA, at Spielberg’s urging, immediately got to work seeking out a happy medium.

    All the same, Night Skies was a smash, becoming the highest-grossing film of 1982, and receiving several Oscar nominations, including, once again, Best Picture, Best Director for Spielberg, and Best Original Screenplay for Lawrence Kasdan. Raymond Read once again received an Oscar nomination for Best Supporting Actor, becoming the youngest person to be nominated for two Oscars and the first person to receive a second Oscar nomination under the age of 10. Though he won neither award, this legacy would come back to haunt him in later years. In less foreboding news, the inaugural Academy Award for Best Makeup was awarded to Rick Baker for his accomplishments on the film, the first of many wins he would ultimately score in that category.

    Night Skies was very much like Close Encounters (and Spielberg’s earlier film, Jaws) in that all three shared a relatively limited narrative scope. This was an especially marked contrast to the globe-trotting and international intrigue which characterized the James Bond films, including the two which he had directed. Spielberg very much wished to return to a “big” movie for his next project after Night Skies, and he had long wished to make a film about the Pacific Theatre in World War II. Considering that the Pacific Ocean covered a third of the Earth’s surface – larger than all of the planet’s landmasses combined – it seemed that an epic scope would be necessary to do justice to depicting the war on screen. Columbia was leery; Pacific War pictures had consistently underperformed at the box-office, including the critically-acclaimed American-Japanese co-production Tora! Tora! Tora!, released in the mid-1970s. Audiences weren’t interested in a WWII picture, forcing Spielberg to put the idea on ice for the time being.

    In the meantime, both he and Kubrick seemed to be having an impact on the Hollywood scene of the 1980s. The thriller, science-fiction, and horror genres, all thrown together into a blender and mixed well with the economic uncertainty of the era (before the Invest in America program finally began paying dividends mid-decade), created a new “fusion” genre, perhaps best embodied by another film released in 1982, the first to make extensive use of visual effects produced with computer technology. In many ways, this film was a neo-noir, though with a unique setting: the inside of a computer. Hardwired depicted the computer’s central processing unit as a gritty inner city, named Circuit City. In the noir tradition, it starred a private investigator whose loved one was gunned down by a newly-arrived crime lord – the delegate of a “global network” – whose “territory” now included Circuit City. He then partnered with a rogue vigilante who had pursued this crime lord from some other “city”, said to have been left in ruins, and who had sworn revenge. Circuit City thus represented a single microcomputer; the specifics of the “network” were unclear, but the writers used the historical telegraph and telephone lines for inspiration, envisioning the day when microcomputers could all be interconnected worldwide in much the same fashion. This cemented Hardwired as science-fiction, despite many of the trappings of Circuit City being contemporary, or even throwback homages in some respects.

    The art direction and set design of Hardwired were carefully handled, with a focus on stylistic artifice. In this respect, Hardwired would prove tremendously influential in helping to create the “look” of the 1980s: bright, flashy colours contrasting with a background of darkness and shadow, which eventually came to be known as the “neon” style (from how it strongly resembled neon lighting at night). This style would eventually lend itself to the tremendously popular Neon City Vice later in the decade, and indeed that series owed about as much to Hardwired as it did to MTV in its visual aesthetic, only without computer effects. No, computer-generated or assisted visual effects (which became known in shorthand as digital effects, to contrast with the physical effects which included scale models, prosthetics, matte paintings and compositing, stop-motion, and trick photography, alongside other decades-old techniques) were very much the exclusive province of Hardwired in the early going. [17] That said, even Hardwired was relatively sparing in its use of digital effects, which remained far more expensive, time-consuming, and labour-intensive than physical effects for the time being, and were far less convincing to the human eye. [18] However, digital effects were heavily advertised in marketing for the film. This rankled many old-school members of the Academy, some of whom were vocal in considering the digital effects of Hardwired to be “cheating”, and thus the film was not nominated for the Oscar for Best Visual Effects (which instead went to Night Skies). [19] Complementing the film’s synthetic visual style was the score, performed entirely by synthesizers. This did receive an Academy Award nomination for its composer, the Greek musician who called himself Vangelis.

    The film was a commercial success, though many critics complained of the pedestrian, clichéd plot (despite the novel setting and impressive visuals). Hardwired being such a throwback was an obvious attempt by the film’s producers to have the Journey of the Force lightning strike twice – especially since it seemed that an actual sequel to Journey was years away in the early-1980s. In one respect, it did live up to its ambitions to be a spiritual successor to Journey, and that was with regards to merchandising: tie-in toys and games (including the arcade game, and the home console and computer ports which soon followed) were all smash hits. Hardwired would come to accrue more critical esteem in later years, being credited (much like Star Trek before it) for “inventing” the future in many ways. It also became a hot seller on home video, with more units sold for CED than any other film released in 1982, including Night Skies.

    Hardwired borrowed its dark, surprisingly nihilistic tone from the darker films that had preceded it, but it was in many ways both mainstream and highly approachable and consumable by audiences. The same could not be said of many other cult films released in the early-1980s. Many were the products of United Artists, newly part of the CanWest conglomerate. Given its inexperienced ownership, a period of experimentation was inevitable, and this was allowed by studio executives as long as those doing the experimenting were Canadian – federal tax credits were a necessity. Thus, a number of Canadian filmmakers found themselves with the resources of a major Hollywood studio at their disposal, and surprisingly, many of them were able to make something of this once-in-a-lifetime opportunity.

    David Cronenberg, from Toronto, took up the horror mantle and ran with it; the more recent works of Stanley Kubrick – Oskar Schindler and The Lion of Judah – were clear influences. His early films, stylistically, were a combination of the devastation and depravity of Schindler with the psychological elements of Judah – albeit with a less detached narrative drive than Kubrick’s; Cronenberg’s films were character pieces, focused on how the horror wrought upon his protagonists affected them personally, usually demonstrated for cinematic purposes through grotesque and irreversible physical transformations.

    James Cameron, born in Northern Ontario but raised in Niagara Falls, was much more a nuts-and-bolts filmmaker than Cronenberg, far more interested in spectacle than the visceral. Inspired to go into filmmaking by Journey of the Force, he started his career with Roger Corman in a production design capacity. Corman gave him his first shot at directing, with the result being a “haunted house in space” film called Star Beast. [20] Though it was a cheaply made, efficiently-shot exploitation action-horror film, Cameron’s flourishes as a director and a production designer nonetheless brought him to the attention of studio executives at United Artists, where he worked in visual effects, second unit photography, and as first assistant director on several genre pictures – not making his return to the director’s chair until the newly-reorganized Lucasfilm approached him in 1984; George Lucas was particularly impressed with his ability to sell his ideas.

    Nevin Richard of Montreal was also a throwback director, of sorts, though his inspirations were not 1940s genres but 1960s genres. A devotee of the French nouvelle vague, Richard was also an admirer of an obscure cult writing style known as gonzo journalism, most popular in the early-1970s, along with stream-of-consciousness narratives in literature and pop art in still images. [21] His gift was in bringing these diverse styles together and fashioning an eclectic and identifiable whole from them, and this impressed executives at United Artists, who saw him as the kind of director who would earn them mountains of prestige, ideally playing to the art-houses with the same success that Woody Allen had managed for the studio in the 1970s.

    And finally, among francophone directors, there was Denys Arcand, born in a small village in Quebec. Already an established filmmaker by the time UA came calling, and fully bilingual to boot, he was nonetheless reluctant to direct a film in the English language. When he finally did so, that film (an adaptation of a popular Mordecai Richler novel) underperformed, and UA agreed to distribute his French-language films to the worldwide Francophone audience, an arrangement which both sides found far more agreeable.

    By 1984, it seemed that “dark” and “bizarre” were dominant artistic trends in the filmmaking industry, which is why it came as such a surprise when Steven Spielberg decided to buck this trend – and with his planned WWII film, to boot. He’d decided that a comedic angle might be just what the picture needed to breathe life into the Pacific War, not to mention that he was tired of dark, brooding movies after having directed five in a row – even his two James Bond movies had been decidedly more dour and straight-faced than the norm. Spielberg encountered resistance to the notion from all corners; it had been almost four decades since the war had ended, but treating World War II with irreverence still seemed almost sacrilegious to a great many Americans. Eventually he decided to focus on a far more recent – and much briefer – conflict in which Americans had fought: the Argentine War.

    Here Spielberg took inspiration from Kubrick once again, but this time from Dr. Strangelove. The Argentine War, more than most conflicts, was predicated on several absurd elements – the casus belli was an archipelago of windswept islands, the Pope had been unable to defuse tensions between the two primary belligerents, and the UN task force sent to pacify the region so overpowered the Argentine armed forces that the former had obliterated the latter in a single battle. However, memories from the Argentine War were fresh in everyone’s minds, and most Americans approved of it as an absolute victory for democratic ideals against totalitarianism – and the first decisive American military victory since World War II; it was easy to see how Spielberg was able to conflate the two conflicts. Spielberg decided to fictionalize the story, which he did with the help of screenwriter John Sayles, from whom he commissioned the script; as had been the case with Night Skies, Spielberg retained story credit. [22]

    The film was given the title Prepare for War, from the famous adage “if you want peace, prepare for war” (si vis pacem, para bellum, in the original Latin.) The story involved two countries: the People’s Democratic Republic of Platinea (Argentina) and the Republic of Andea (Chile), who were in dispute over a contested territorial claim in Antarctica, a situation which the corrupt and inefficient fascists in Platinea hoped to exploit to distract their restive populace from their dismal domestic situation. Among the members of the Platinean cabinet was a mysterious older gentleman (“El Señor Doktor”) with a German accent – a reference not only to Dr. Strangelove, but also specifically to Josef Mengele, believed to still be alive and in hiding in Argentina, having fled there after the war. [23]

    Their opposite numbers were ersatz representatives of the Western Democracies: President Blaster of Freedonia (the US), a newsreader before he got into politics, was an intriguing mirror of the Platinean cabinet, in that he also wanted to start a war to distract from domestic issues which threatened his chances at re-election. Prime Minister Whiffle of Albion (the UK) – whose appearance vaguely resembled that of President Muffley in Dr. Strangelove, also to emphasize his weakness and impotence [24] – continually ignored or denied the threat from Platinea, even though they also had territory in Antarctica neighbouring the disputed claim, and the Platineans had often attacked them over it in the past – always without facing reprisals. President Gloverrain of Gaul (France) was a snob, going along with everyone else only after making clear how deeply the whole affair inconvenienced the people and republic of Gaul, who cared little about the situation, or any situation that did not involve Gaulish food, Gaulish wine, or Gaulish politics. Finally, Prime Minister Stamford of Borealia (Canada) was eager to prove his small country’s mettle, despite the relative paucity of its own military might. (Australia, the fifth major participant in the allied intervention, was not directly represented at all in the film – although in some ways Borealia was a composite of both Canada and Australia, as implied by its name. Indeed, even Canada would likely have been left out if there hadn’t been room for an overcompensating, eager bootlicker among the allies.)

    The central action set-piece of the film, the Battle Over the Antarctic Ocean, was depicted farcically, with extensive pyrotechnics used to represent the incredibly overpowered allied task force completely demolishing the tiny Platinean contingent – made to look far more pathetic in the film than even the Argentines had been in the actual Battle of the Argentine Sea. The “carrier” depicted in the film, the Feliz Navidad, was a hastily jury-rigged monstrosity of interwar vintage, flying WWI-era biplanes off its deck (including a Fokker decked out in Red Baron livery); other ships in the Platinean fleet included honest-to-goodness sailing ships. Given the Amerocentric spin of the film, the infamous scandal in which the British fighter planes remained on deck due to their lack of a decisive advantage over the Argentine ground-based fighters was not directly referenced, though the Albionish commanding officer (identified as Commodore Lord Fauntleroy of Chesterfield – true to his name, he spent most of his appearances sitting on a couch) did (comically) express hesitation at engaging “such a formidable fleet” in reference to this. The Freedonian admiral in charge of the task force was played by Ernest Borgnine (in an obvious reference to McHale’s Navy) – part of the joke being that his advanced age (he was already over 65 during filming) made him unsuitable for direct leadership of a critical task force, but that he had been chosen anyway because of his experience in a similar campaign during the Pacific War; Borgnine himself had been a gunner’s mate, but the character was referred to as having been a commanding officer on a torpedo boat, again in reference to McHale’s Navy. The CAG, Colonel Popcorn, was played by none other than Kirk Allen, in an opportunity for the actor to demonstrate his comedic talents – he was the central viewpoint character of the battle sequence. As with Borgnine, Allen was chosen in part to reference his past roles: he was a flyer (like Superman) stationed aboard a ship (like Fletcher Christian). In contrast to Borgnine, Allen, born in the early-1950s, was a bit too young to play a Colonel (or rather, a Captain, as his character should have been a naval aviator if not for the pun potential in his name), and was depicted as green (too young to have served in the overseas quagmire) and gung-ho, eager to prove himself.

    The denouement that followed played out very similarly to that of the actual Argentine War; the neighbouring country of Pindorama (Brazil) immediately declared war on Platinea following the defeat of most of its military (with the crossing of the “Platinean River” scene being a brief if obvious homage to The Bridge on the River Kwai), while the task force, headed for the capital, bombarded the Platinean coastline along the way (an act clearly depicted as unnecessary and gratuitous, echoing the common criticism of its execution – sometimes called the “Rape of the Argentine Coast” – in real life). Upon the task force’s arrival at the capital, bedlam erupted in the Presidential Palace, with multiple deaths (some by suicide, some by assassination) taking place in a single scene – Sayles and Spielberg wrote and shot the scene as an insane parody of the closing scene of Hamlet, complete with the “bad guy” (from their perspective) showing up randomly at the end.

    Spielberg couldn’t resist the opportunity to punish Mengele vicariously, having the mysterious elder statesman with the German accent be captured alive by the allies, who immediately recognized him (“it’s Doctor Von Mangler!”) and arranged for him to be sent to “the Hag” for his war crimes tribunal. In one of the more off-the-wall moments of the film, Von Mangler was then handed over to an old, ugly woman (who had been a silent background character in several earlier briefing scenes) who immediately attempted to have her way with him. Naturally, the only remotely sane Platinean cabinet minister (believed by many to have been based on El Suertudo, Juan Manuel Lombardi, though there was no evidence for this) was the last man left standing at the end of this chaos, and it was he who signed the instrument of surrender (aboard the American carrier, as Spielberg considered that “cooler” than how the war actually formally ended – and it made for yet another homage to World War II), ending the war, and the film; immediately before the cut to credits, a montage of still images with captions detailed the fates of all the major players, lingering on Freedonian President Blaster being defeated by “the first Freedonian in space, and the first man on the Moon” in the subsequent Presidential election.

    A massive departure for Spielberg, it was also his best-reviewed film to date, profusely praised by Roger Ebert and Gene Siskel in Coming Attractions as “incisive satire, freewheeling farce, and superbly-executed slapstick in a perfectly-balanced comedic concoction”. As a pure parody film, it received more acclaim than even Catastrophe! had a few years before. Not since Dr. Strangelove had a satirical film been so well-regarded, although Spielberg’s film was considerably lighter in tone than Kubrick’s black comedy. The humour was also less sexual or prone to double-entendre, Spielberg preferring farcical elements. He had even attempted to film his own take on the famous “pie fight” ending to Dr. Strangelove in his movie (with Kubrick’s blessing), but ultimately could not, because (as had been the case for Kubrick) his actors could not maintain straight faces for complete takes; the abortive pie fight was eventually included as bonus footage on the CED release of the film.

    Kubrick himself was very fond of Prepare for War; while acknowledging its debt to Dr. Strangelove, he admired its commentary on the Argentine War and on the potential of conflicts being manufactured as a diversion from more pressing issues on the home front. The film itself was released the same year that Peronistas, a musical with similar themes as regards war and the military, made it to Broadway, and between the two of them, a new wave of historiography on the Argentine War began to emerge. Many of the real-life political figures parodied in the film, some of whom were still in office, were naturally less enthusiastic – though former President Ronald Reagan, who, along with UK Prime Minister Whitelaw, was among the most viciously parodied of the Allied leaders, publicly spoke out about how much he had enjoyed the film, taking his portrayal as “President Blaster” in stride and joking that he would even have played the part himself, had it been offered to him. (Gregory Peck, a Democrat, played the part instead.) Whitelaw, for his part, did not deign to comment on “Whiffle”, though his Parliamentary opponents took delight in needling him with the character. [25] In Canada, Prime Minister Stanfield was delighted by his portrayal in the film, as was much of the Canadian media, who proved that portrayal accurate by expressing delight at having rated a mention at all.

    Former French President Mitterrand [26] never commented directly on his portrayal as the snooty, aloof Gloverrain, but many within the French intelligentsia rose to his defence, and to the defence of the Republic. It was film critic Arthur de Boutiny, writing for Le Monde (the French newspaper of record, and the only one widely available in the Anglosphere), whose observations were most widely noted:

    Arthur de Boutiny said:
    The government and the people of France took an active interest in the conflict from the beginning. Mr. Mitterrand was vocal in his condemnation of Argentine aggression, and enthusiastically supported UN sanctions, and ultimately intervention. The Clemenceau was eagerly volunteered as a critical member of the task force, and our flyers defeated the Argentines with great skill and discipline, even as the British remained grounded on deck. Our troops and ships remained in the region after the war ended, and played a critical role in the humanitarian effort that followed. Further, Mr. Mitterrand, as a left-wing head of state, has always shown tremendous compassion for the poor and the unfortunate, including those who were oppressed, imprisoned, tortured, and killed by the totalitarian regime in Argentina. Messrs. Spielberg and Sayles, despite depicting all other parties involved in a fashion resembling their actual behaviour prior to and throughout the war, chose instead to resort to the same old clichés to portray the French. It was beneath both of these men’s talents to do so. But it could have been worse – at least in this film the Gaulish forces did not eagerly surrender at the first sight of the Platinean biplanes and galleons.

    De Boutiny’s critique so moved both Spielberg and Sayles (albeit perhaps with a nudge from the studio, worried about the scandal hurting their grosses at the French box-office) that they publicly apologized both to De Boutiny, and to the French people as a whole. Ironically, despite his scathing critique, De Boutiny gave the film a good review overall, and it performed quite well in France - some Frenchmen were miffed at their country’s inaccurate depiction in the film, but this was blunted by the overall parodic and farcical tone of the picture. Among the film’s defenders in France was none other than Francois Truffaut, whom Spielberg had even invited to appear as Gloverrain, though he declined, believing that the part called for a stronger actor, given the tone of the film. (Gloverrain was ultimately played by Franco-American actor Rene Auberjonois, best known at the time for his role as Father Mulcahy in the flop anti-war film M*A*S*H.) [27]

    Prepare for War was released in 1984, a fitting year for a film whose central theme involved a manufactured war as a distraction from the home front. However, Prepare for War could not possibly carry the mantle of “a modern-day 1984 for the year 1984” because that was the year an actual adaptation of Orwell’s classic dystopia hit the big screen, helmed by none other than Stanley Kubrick. Kubrick agreed to direct 1984 at the behest of Edgar Bronfman shortly after The Lion of Judah was released. Bronfman, aware that someone would want to release an adaptation of 1984 in 1984, decided to beat everyone else to the punch, securing the film rights in 1980. He was also aware that Kubrick worked very slowly, and he figured that four years would be enough time for him to complete production. Kubrick was enticed by the idea, and suitably flattered by Bronfman’s proclamation that only he could properly bring Oceania to the screen.

    Kubrick adapted 1984 for the screen himself, though he briefly considered hiring Terry Southern to write it with him. Much of the location scouting and photographic research he had done in developing Oskar Schindler was put to good use in pre-production, as the art design and set decoration required a distinctly totalitarian aesthetic. One of Kubrick’s more particular bugbears, accuracy in the use of period typefaces, reared its ugly head here as well. Relying too heavily on Soviet art and architecture was a double-edged sword: the dominant school of design in the Second World, brutalism, placed an extreme emphasis on utilitarianism, or function over form. Brutalist structures thus tended to be simple, drab, and monotonous, which would not be visually appealing for audiences. [28] Although this may have complemented 1984 as a narrative, it went against Kubrick’s instincts as a filmmaker; his films had always been known for their bold, stylistic visual flair. Thus, eventually, he somehow worked out a compromise between the grotesque excesses of fascism and the extreme utilitarianism of communism, devising a style which would produce some of the most memorable images in film.

    1984 was otherwise a relatively faithful adaptation of Orwell’s novel. Kubrick did underscore the novel’s lack of objective truth by repeatedly suggesting that even the basic facts which were presented to the audience were just as likely to be fabrications as the propaganda which was known to be false. Kubrick made skilled use of third-person narration to emphasize the ambiguity and muddle the “truth” – several narrators were used, each telling a different story from all of the others. The film also openly suggested a commonly-held theory about the world in which 1984 was set: that Eurasia and Eastasia, the two perpetual rival states to Oceania, did not exist – that even whole countries existed merely as tools of the Ministry of Truth. Kubrick also made extensive use of Newspeak, the constructed language of Orwell’s IngSoc regime, even coining additional words not originally featured in Orwell’s novel.

    Ironically, the most common criticism of 1984 was that it played out in exactly the way everyone had expected from a Kubrick adaptation of the novel; there were no surprises to be had. The ideal for any artist was to surprise and satisfy in the same breath; though Kubrick had succeeded at the latter, he had failed at the former. It seemed that mantle would have to be taken up by a newer generation of artists, including the Canadian Cohort of Cronenberg, Cameron, Richard, and Arcand.

    However, despite his artistic crisis of confidence, the film performed very well at the box office, capturing the zeitgeist and becoming Kubrick’s biggest hit since Napoleon. Like many of his other films, it was nominated for a slew of awards, including Best Picture, Best Director, and Best Adapted Screenplay at the 1985 Oscars. It did not win any of these, but did take home a handful of technical awards, including Best Art Direction-Set Decoration. The big awards, much to the surprise of many, went to Prepare for War, which became the first comedy film to win Best Picture since Tom Jones over two decades earlier. [29] Spielberg won his second Best Director Oscar, and Sayles took home his first, for Best Original Screenplay. Although Gregory Peck and Ernest Borgnine had each been nominated for Best Supporting Actor, neither veteran took home another Oscar at that night’s ceremony; nominal “lead” Kirk Allen (who was top-billed despite making his first appearance well into the second act of the film) hadn’t even been nominated. Spielberg, in his acceptance speech, dedicated his Oscar win to two men: Kubrick and Truffaut, the latter of whom died several weeks later.

    With 1984 behind them, the two directors – Stanley Kubrick and Steven Spielberg – were at a crossroads. Kubrick ultimately decided to try his hand at adapting Umberto Eco for the screen. [30] Meanwhile, by the time of the Academy Awards ceremony in early 1985, Spielberg was about to commence filming on the sequel to Journey of the Force…

    ---

    [1] Attlee apparently wanted rationing to continue indefinitely, which again helps to inform 1984, and demonstrates the totality of state control of the British economy in the 1940s. Rations were not fully lifted until 1954, under the Conservatives. They survived George VI (died 1952), Queen Mary (died 1953), and the coronation of Elizabeth II (also 1953), mooting any prospect of the return of a lavish coronation banquet (last thrown for George IV in 1821, although one was planned for Edward VII in 1902 before his appendicitis mooted the affair).

    [2] Kubrick planned to direct an adaptation of the novel Wartime Lies during this timeframe IOTL, attempting to commission Isaac Bashevis Singer to write the screenplay in 1976; Singer declined, and the film was not optioned until the early-1990s, at which time Schindler’s List was already in development; ultimately Kubrick abandoned the project. ITTL, since he remains at MGM with the success of Napoleon, he is able to avail himself of the studio archivist’s services to find a story idea instead.

    [3] Based on Kubrick’s OTL critique of Schindler’s List, which was suggested to him as a film about the Holocaust: “Think that’s about the Holocaust? That was about success, wasn’t it? The Holocaust is about six million people who get killed. Schindler’s List is about 600 who don’t.” This quote suggests how he would approach the film if he had made it (and indeed, how he does make it ITTL).

    [4] The plot of Oskar Schindler resembles that of Schindler’s List only in broad strokes - List was directly adapted from the Thomas Keneally novel Schindler’s Ark (never written ITTL) which (being a novel) was a fictionalized account of true events. One major change is that the emphasis on the list itself (or rather, the lists, as there were several drafts) is greatly downplayed.

    [5] Pfefferberg was tireless and incessant in his attempts to disseminate Schindler’s story to the masses. IOTL, he convinced Thomas Keneally to write Schindler’s Ark (as the story goes, he cornered Keneally in his shop upon learning that he was a writer and hard-sold him into writing about Schindler). He then served, as ITTL, as a consultant on the screen adaptation of that novel, Schindler’s List.

    [6] Schindler died in 1974 IOTL; the financial security brought on by the “hero’s pension”, along with being constantly scrutinized by people with an active interest in his continued good health, serve to prolong his life somewhat.

    [7] Baron Grade of Elstree is better known IOTL as Lew Grade, the man who offered to broadcast The Muppet Show on his channel, ATV, after every American broadcaster turned them down. He pursued a career as a movie producer IOTL as well, and just as ITTL he chose to adapt a novel with a nautical theme: Clive Cussler’s 1976 Dirk Pitt adventure, Raise the Titanic!. It was a huge bomb, due in part to overinflated production costs (as Lord Grade himself famously put it: “it would have been cheaper to lower the Atlantic”).

    [8] The Rose, completed in 1970 (shortly after the POD) and modeled after the sixth-rate frigate HMS Rose from over two centuries before, has starred in several films IOTL, most famously the 2003 film adaptation of the Aubrey-Maturin novels, Master and Commander: The Far Side of the World, for which she was officially renamed HMS Surprise (a designation to which she is technically not entitled, as she does not hold a royal warrant).

    [9] Kasdan’s screenplay for The Bodyguard was indeed written in the 1970s, and it was indeed considered as a Diana Ross vehicle, and thus intended as an interracial romance from the outset; various candidates for the male lead included both McQueen (who declined IOTL because he refused to be billed after Ross – he accepts the part ITTL under the compromise of diagonal billing, invented for him IOTL as well as ITTL) and Ryan O’Neal. This is what ultimately brought him to the attention of George Lucas IOTL, when he needed a scriptwriter for The Empire Strikes Back. Meanwhile, The Bodyguard was, of course, finally made into a film starring Whitney Houston and Kevin Costner in 1992 IOTL. That film was notoriously silent on the subject of race; ITTL, this version of The Bodyguard makes much greater hay of the issue. (And yes, Diana Ross sings some songs in the movie and for the soundtrack, but no, an R&B-styled cover of Dolly Parton’s “I Will Always Love You” is not among them.)

    [10] With thanks to Electric Monk for taking the time to find the official WGA Rules as regards script and story credit for an original screenplay:

    The term "story" means all writing covered by the provisions of the Minimum Basic Agreement representing a contribution "distinct from screenplay and consisting of basic narrative, idea, theme or outline indicating character development and action."

    Any writer whose work represents a contribution of more than 33% of a screenplay shall be entitled to screenplay credit, except where the screenplay is an original screenplay. In the case of an original screenplay, any subsequent writer or writing team must contribute 50% to the final screenplay.

    In the case of Night Skies, Spielberg would have developed all of the elements comprising a story credit by himself; he then would have worked on perhaps a quarter to a third of the script alongside Kasdan. Had Night Skies been an adaptation, this might have been enough to get him a co-writer credit for the screenplay, but since this is an original work, Kasdan alone receives the screenplay credit. This is rendered in the opening credits of the film as “Story by STEVEN SPIELBERG; Screenplay by LAWRENCE KASDAN”.

    [11] As previously mentioned, Richard Dreyfuss did not play Hooper in Jaws due to his commitment to Those Were the Days. Although Billy Crystal took his breakthrough role in American Graffiti, I don’t think I could get away with casting him as Hooper ITTL. Therefore, and as previously mentioned, the part goes to Jon Voight, which then gets him the role in Close Encounters as a knock-on effect.

    [12] The youngest nominee in any acting category at the time of the POD (at which time the most recent ceremony had been the 38th Academy Awards, held on April 18, 1966) was Jackie Cooper for Best Actor in Skippy, at the age of 9 (and 20 days). The youngest-ever nominee up to that point in Read’s specific category was Brandon deWilde, for Shane, at the age of 11 (going on 12). In the years since then and through the end of this timeline, IOTL, only Justin Henry for Kramer vs. Kramer came along to dethrone them – he remains to this day the youngest-ever nominee of any competitive Academy Award of Merit. ITTL, on the other hand, he was never born, having an OTL birth year of 1971.

    [13] Yes, Kubrick was looking for his next project (after Barry Lyndon) in exactly the same way IOTL, and it was in this way that he stumbled across The Shining (the first of a great many things about his approach to the novel which mystifies author Stephen King, as by his own admission the book starts off rather slowly and having little to do with the eventual thrust of the plot).

    [14] The equivalent ITTL of Red Dragon, the first book in the Silence of the Lambs/Hannibal series, which introduced (or rather unleashed) the character of Dr. Hannibal Lecter to (upon) the world. Harris’s earlier novel, Black Sunday, is not written ITTL (although the Munich Massacre, which inspired it, still happened).

    [15] Yes, this is Hannibal Lecter ITTL. The character was inspired by intrepid 23-year-old reporter Thomas Harris’s trip to Monterrey, Mexico, in 1963, where he met with Dr. Alfredo Ballí Treviño (aka Dr. Salazar), who had been convicted of murdering and mutilating a close friend. Since his encounter with Dr. Salazar predates the POD, Hannibal’s creation is therefore resistant to butterflies, even after all these years. Hannibal’s surname is not Lecter ITTL, and indeed, his surname is never given in the film – he is identified primarily by his nom de guerre, as is the case with so many serial killers.

    [16] Watch the Skies, later (as ITTL, though for different reasons) renamed Night Skies, was an OTL story idea by Spielberg which he conceived at the behest of Columbia, hungry for a sequel to Close Encounters. IOTL he went off to make 1941 and then Raiders with George Lucas (who is, of course, unemployable in the early-1980s ITTL) before returning to the project; he first wanted Kasdan to write it, but he was already writing The Empire Strikes Back. He then turned to John Sayles, who delivered his first (and, as it turned out, the only) draft of the script in 1980. Rick Baker spent somewhere between $70,000 and $100,000 (reports vary) on a prototype design for the lead alien before Spielberg, leery of making another blood-and-guts film immediately after Raiders, scrapped the project. In essence, Night Skies evolved into two different films: the alien encountering a family (and particularly the “befriending a child” subplot) became E.T., and the horror elements were transmuted into Poltergeist; both films were produced by Spielberg, though Tobe Hooper directed the latter. Amusingly, neither film was produced by Columbia; E.T. was made at Universal, and Poltergeist at MGM-UA.

    [17] IOTL, the term computer-generated imagery, or CGI, eventually became the predominant term to refer to what ITTL are known as “digital effects” (although the more generic term “computer graphics” was also very popular for a time). On the other hand, IOTL, the term “practical effects” is used instead of “physical effects”.

    [18] As was the case IOTL with many films remembered as “breakthroughs” in CGI technology, including not only OTL’s Tron but also Jurassic Park and Terminator 2: Judgment Day. It wasn’t until the mid-1990s that films began to rely primarily on CGI for visual effects, and most of these have aged horribly. (It was the Star Wars prequels – particularly Attack of the Clones – and then Avatar, in the 2000s, which really were breakthroughs in CGI with (relatively) little use of practical effects.)

    [19] As per IOTL. Tron was nominated for two Oscars (winning neither), neither of which was Best Visual Effects. The Visual Effects Oscar went to E.T. The Extra-Terrestrial.

    [20] Star Beast was the working title for the script that became Alien IOTL, which was very nearly sold to Roger Corman. ITTL, the retitling never happened, but the sale to Corman did. The original script was much schlockier than the finished film IOTL, and it (naturally) remains so ITTL as a Corman production; not even James Cameron can elevate it to high art. (Remember, his first movie IOTL was Piranha II.)

    [21] In terms of culture shock sensibility, think Quentin Tarantino, about a decade early. Also think Talking Heads, if they had made movies instead of music (and yes, I know they made movies IOTL).

    [22] In this case, Spielberg probably wrote as much of the script as Sayles, maybe slightly less, but Sayles still gets sole screenwriting credit because the WGA judges would reckon that Sayles wrote over 50% of the film. (Not that it ever came to that: Spielberg never took script attribution to arbitration, because although he probably deserved joint credit, he was generous enough to allow Sayles full credit, since he was already the director and producer. Remember, this is Steven Spielberg, not some haughty auteur.)

    [23] Mengele did indeed flee to Argentina after the war, remaining there until 1959, when he fled to Brazil via Paraguay, where he evaded capture for the rest of his life. IOTL, he drowned while swimming off the coast of Brazil in 1979; for yet more proof that I am not writing a utopia, he is still alive and at large ITTL, and will remain so at least through September 20, 1986.

    [24] Whitelaw did not actually resemble Muffley in real life; Sellers was much more gaunt in the face, and Whitelaw – though balding – maintained a comb-over as was the fashion until baldness became “sexy” in the 1990s. He also had caterpillar eyebrows, which would be ill-served by being hidden behind a pair of horn-rimmed glasses. In fact, it is the Canadian Prime Minister, Robert Stanfield, who much more closely resembles Muffley – compare photographs of the two and see for yourself. I’d go so far as to argue that Stanfield might look more like Muffley than the character’s actual inspiration, Adlai Stevenson. Indeed, ITTL, with his 1972 election, many American and British wags, getting a good look at the newly-elected Stanfield, promptly derided him as “Prime Minister Muffley”. In Prepare for War, Stanfield’s alter-ego Stamford is portrayed by none other than Leslie Nielsen, in a fun bit of meta casting; his brother, Erik Nielsen, is a Canadian cabinet minister (ITTL and IOTL).

    [25] This film is credited for the widespread adoption of the term “whiffle” to mean “to prevaricate; to vacillate; to be fickle” in the United Kingdom ITTL.

    [26] Mitterrand won narrowly in 1974 ITTL, when he lost IOTL; and he lost narrowly in 1981, when he won IOTL. By 1981, the world economy had not sufficiently recovered to save him, although it would have by 1982 (which saved Whitelaw in the UK and Stanfield in Canada).

    [27] At the time IOTL, he had been playing the stuffy, pompous bureaucrat Clayton Endicott III on Benson (which doesn’t exist ITTL, as you can’t have Benson without Mona – I mean, Jessica) for several seasons, which alone makes him perfect as Gloverrain. In addition, a few years down the line IOTL he would voice the outrageous French chef caricature in The Little Mermaid (“hee hee hee, hon hon hon”), which shows he isn’t above poking fun at his heritage.

    [28] There actually are some distinctive (“striking” is far too strong a word) examples of brutalist architecture in England, where Kubrick filmed 1984. A principal filming location is the University of East Anglia, which – well, see it for yourself. IOTL, the campus complex was even highlighted recently by the National Trust as some of the best brutalist architecture in Britain, which granted is rather like saying that Three Mile Island was one of the best-handled nuclear disasters in American history. Speaking of American history, several prominent American brutalist structures are also featured in the film, albeit through the use of second unit photography and aerial shots, including the truly hideous Boston City Hall.

    [29] IOTL, of course, Annie Hall won Best Picture (over Star Wars) in 1978, ending a fourteen-year drought for comedy winners. Another would not win for two decades, when Shakespeare in Love controversially beat out Saving Private Ryan (a WWII film directed by none other than Steven Spielberg) in 1999. Worth noting is that even many of the comedies that have won Best Picture in the last several decades usually come with a qualifying twist or subgenre, and Prepare for War (in part a political satire) is no exception.

    [30] One of Kubrick’s many unrealized projects was an adaptation of Eco’s impenetrably dense novel, Foucault’s Pendulum. He was unable to make the film IOTL because Eco had been deeply dissatisfied with the film adaptation of The Name of the Rose, and refused to grant him (or anyone) the rights. In later years, Eco has admitted that he regrets this decision.

    ---

    Thus finally concludes the 1984-85 cycle! Only one cycle now remains before the end of the timeline. Thank you all for reading what eventually emerged as the longest update in the history of this thread, at 12,657 words, shattering the record previously held by my update on the Argentine War. (Fitting, considering that I reference the events of the update in this one.)

    Immense thanks are in order to both e of pi and Electric Monk (yes, he’s back!) for assisting in the editing of this monster update, and for letting me bounce my ideas off them. Thanks also to Thande and MaskedPickle for clarifying certain issues with regard to Britain and France, respectively. Although it wasn’t intentional, in the making of this update, I consulted with one person from each of the four member nations of the UN task force to Argentina ITTL: a Canadian, an American, a Briton, and a Frenchman, and this delights me :D

    Anyway. The next (and last!) cycle beckons. More to Come soon-ish. Then the Overview Update, fortunately those are much easier (and faster!) for me to write. Until then!
     
    1985-86: The Best-Laid Plans
  • The Best-Laid Plans (1985-86)


    Tuesday, September 23, 1985, 11:50 PM EDT

    Coming back from commercial, Johnny Carson, the host of The Tonight Show since 1962, was comfortably seated at his desk, cue card in hand, as he prepared to introduce the first guest of three for that night’s episode. He waited for Doc Severinsen’s band to wrap up the number they were playing before he began speaking, addressing the viewing audience.


    “My first guest this evening is a television icon, though she’s also worked on the silver screen, in old-time radio, and on the Broadway stage. Audiences around the world have fallen in love with the scatterbrained housewife she played on I Love Lucy ever since the show first started running in 1951, and it hasn’t gone off the air since. Today she is the President and CEO of Desilu, the studio that produced I Love Lucy, along with My Three Sons, The Untouchables, Star Trek, Mission: Impossible, Rock Around the Clock, Three’s Company, Hill Avenue Beat, The Ropers, Neon City Vice…” He chuckled and turned to his sidekick and announcer, Ed McMahon. “Ed, they’ve had so many hit shows, I’ve run out of room on my cue card.”

    McMahon guffawed at this. “Ho ho, you are correct, sir!”, he said, in his notoriously superfluous manner.

    “Anyway, my point is, chances are you’ve seen and enjoyed a great many shows brought to you by her studio, and she’s here tonight to talk with us about some of their latest projects. Ladies and gentlemen, would you welcome please the lovely Miss Lucille Ball!”

    And in walked Ball, through the curtains stage left of Carson’s desk, as the band obligingly played a rendition of the theme from I Love Lucy, which over the years had been bootstrapped into her personal leitmotif, much as “Hooray for Captain Spaulding” had been for Groucho Marx. She kissed first Carson and then McMahon, both of whom had risen to greet her as she entered, before they took their seats.

    “You’re looking lovely tonight, Lucy,” Carson said to his guest, after the applause died down. “Still not a grey hair on that pretty little head of yours.”

    Ball burst into a shriek of laughter at this - she always had a fresh batch of henna dye applied to her hair whenever she would be making a talk-show appearance. [1] “Not for lack of trying,” she said.

    “Well, I’d say you’ve certainly had better luck than me, wouldn’t you?”

    “Hey-oh!”, bellowed McMahon, right on cue. Carson’s hair, of course, was mostly white, and had been for years.

    “Maybe it has something to do with the fact that I used to have brown hair,” Carson said, setting Ball up for the punchline.

    She came through. “Hey, me too!”, she exclaimed, setting everyone off into gales of laughter, though none louder or longer than McMahon, who was still laughing when Carson resumed his questioning.

    “Well, Lucy, if that laundry list I was just reading is any indication, it seems your studio has been putting out hit TV shows for just about as long as there have been hit TV shows. What’s your secret?”

    “No secret,” Ball said. “We just make the shows we want to make, and so far we’ve been very lucky in having them be the kind of shows people want to see.” This typically self-effacing statement was met with appreciative applause. “Thank you, thank you. That reminds me, we always use a live studio audience for all our sitcoms.” As she said this, she gestured to Carson’s own audience. “You can’t get that kind of reaction out of a can.” This time the audience responded more enthusiastically, albeit at the prompting of the flashing “APPLAUSE” sign, and the melodramatic gestures from the show’s director.

    Ball saw this and couldn’t help but chuckle. “You see what I mean?”

    Carson nodded. “Yes, yes, they’ve always been very good to me.”

    “And really,” Ball continued, as if the digression regarding the nature of audience responses hadn’t even happened, “you ask me what my secret is, but what’s your secret? You’ve got longevity, haven’t you been doing this twice as long as Allen and Paar put together by now?”

    Carson chuckled at this. “I never keep track of that sort of thing – my agent does it for me.” McMahon guffawed loudly at his not-really-a-joke, although the studio audience – willfully ignorant of the rounds of contract negotiations which had defined network politics at NBC for over a decade – was more muted in its response.

    Ball, however, joined in McMahon’s laughter. She read the trade papers, after all.

    “So, the new Mission: Impossible show. Why are you making a new one, anyway?”

    “That show is our third-best all-time performer in syndication. Now that may not seem so impressive, but number one is I Love Lucy and number two is Star Trek.” More applause, this time spontaneous, from the audience at the mention of those beloved series. She smiled at this; she didn’t need their reaction to confirm what cold, hard data had already proven, but it was always nice to have.

    “And of course you brought Star Trek back last year as a Saturday morning cartoon. So why not bring Mission: Impossible back the same way, then?”

    “Well I know last time I was here, I talked a bit about how Star Trek was a show about characters and ideas, and how the planets and the aliens and the space battles were just window dressing, right?”

    Carson nodded, recalling the most recent of her annual appearances on the show, the previous September.

    “With Mission: Impossible, people watched for the action, the stunts, the capers. The thrill is in seeing real people pull all that off. It wouldn’t work as a cartoon the same way Star Trek does.”

    “So what’s different about this new show from the original?”

    “A lot has changed in the world in the time since the original went off the air. The global political situation is different, the technology is different, even how men and women relate to one another, and among themselves, and work together as a team is different. This new Mission: Impossible show will reflect all that.”

    “And are any of the original cast coming back? Hard to imagine the show without Rollin or Cinnamon, say.”

    Ball knew a dig when she heard one – and Carson knew as well as anyone in the industry how the contract disputes between Desilu and the husband-and-wife team who played Rollin and Cinnamon, Martin Landau and Barbara Bain, had been fodder for the supermarket tabloids and the trade papers alike in the early-1970s. Indeed, Carson wore an impish grin and lazily perched the side of his face against the palm of his left hand, his elbow propped up on his desk, as he awaited her reply.

    Ball didn’t bite. “I can say Peter Graves is coming back as Jim Phelps,” she said, and waited for the applause that came in reaction to this announcement to subside. “As for anyone else from the original IMF team… you’ll all just have to wait and see.”

    “And while we’re waiting, of course, we can always watch some of your other shows, like Neon City Vice.” The mere mention of the show drew catcalls and shrieks from the audience.

    Ball smiled. “I can see we have some fans in the audience,” she said.

    Carson chuckled at this. “Well, yes, and as you can probably tell, it’s very hot stuff. Now, even though you’ve obviously been responsible for a lot of hit shows over the years, all the hype around Neon City Vice is still remarkable.”

    “Yes, it is, and we’re thrilled that audiences have embraced it so much. The cast and crew are dedicated to making the best possible shows they can, and it’s been amazing to see their hard work pay off.”

    “And I understand that Ricardo Montalban, who plays the gentleman gangster Gutierrez, will be coming back as a regular this year?” More excitement from the audience at the mention of Montalban, riding the high of a seemingly unlikely late-career comeback to become one of the hottest names in Hollywood, despite being retirement age.

    “Oh yes, we simply had to bring him back. You know, it’s funny, I’ve known Ricardo since we were both in pictures back in the forties. I was a Ziegfeld girl, he was the ‘Latin lover’, you know the type.”

    “And look how far you’ve both come since.”

    “Exactly, this is exactly what I was saying to him the other day at the Emmys after-party. I was thrilled when he won, nobody was cheering louder than me.”

    “We do have a clip from Neon City Vice featuring the Emmy-winning Ricardo Montalban, now is this a clip from this week’s episode?”

    “Yes, this is from the season premiere, which airs this Friday night.”

    “All right, so here it is, Ricardo Montalban on Neon City Vice.”

    ---

    The return of Neon City Vice – a much “hotter” show than any of the studio’s other, more established hits – at the end of the 1984-85 season was the talk of the town. Desilu hadn’t had as big a hit with younger audiences since Star Trek; the 18-29 demographic adored the show, as did adolescent audiences. This helped to inform the development of their single new offering for the 1985-86 season, the Mission: Impossible revival. The question of who would return to participate in the revival was central to the show’s early development. Carrying on with the last IMF lineup from the show’s final season was right out: all but one of the six regulars from the 1972-73 season were over 50, outside the bedrock 18-49 demographic, and even older in comparison to the young adult and adolescent audiences that advertisers craved. Although younger audiences watched The Ropers, a show with an exclusively geriatric cast, in quantity (which would seem to refute the claim that younger audiences could not relate to older characters), that show was a lighthearted family sitcom. The levels of strenuous physical activity required of the characters in an action-adventure show like Mission: Impossible were simply prohibitive for older actors; Herb Solow remarked that “everyone watching is just going to be waiting for the moment when somebody breaks a hip”. In addition, several actors made clear that they would not return, most notably the duo of Martin Landau and Barbara Bain, who played master-of-disguise Rollin Hand and femme fatale Cinnamon Carter respectively. Landau had moved on to a reasonably successful film career; Bain, sadly, suffered the handicap of being an actress over 50 in Hollywood – the show’s producers probably would not have wanted her to return even if she were interested. Sam Elliott, as Dr. Doug Robert, was hugely unpopular with the show’s fans, and Elliott’s predecessor Peter Lupus, who played the beloved strongman Willy Armitage, was not invited to reprise his role due to the bridges that had been burned in dismissing him in the first place. That only left a select few who might be reasonable prospects for returning.

    Ultimately, of the original cast, only Peter Graves (as team leader Jim Phelps) returned as a regular, serving roughly the same role he had held on the original series. However, he seldom went out into the field himself, leaving that to the younger agents whom he had assembled. The plan for the first season was for these agents to be drawn from a large list of rotating, recurring cast members, and for the list to be whittled down after seeing how the various actors gelled with their counterparts, and how producers, executives, and – most importantly – audiences would react to them. Essentially, this meant that Graves was in fact the only regular for season 1. The show’s opening titles credited him alone among the cast; other IMF agents for each given episode were listed as “Also Starring” at the beginning of the first act. Other original cast members who returned on a recurring basis were Greg Morris as tech genius Barney Collier, Mark Lenard as The Great Paris [2], and, most controversially, Lynda Day as Dana Lambert, originally the ingenue but now, in turn, serving as a mentor figure to ingenues herself. Day’s character was introduced in the sixth and penultimate season of Mission: Impossible primarily due to fears by executives that the show’s unchallenged sex symbol up to that point, Barbara Bain, was on the wrong side of 40 and would inevitably lose her lustre with audiences. To differentiate Dana from Cinnamon, the former was written as inexperienced and prone to missteps, and often in need of rescuing, whether by Cinnamon or by the other members of the IMF. What tickled creator and showrunner Bruce Geller (who had left the original series by the time her character was introduced) enough to include her in the new series, despite initial misgivings, was her transformation into a mature and hypercompetent senior IMF agent, the product of ample character development in the decade-long interim since she had last been seen. Her role as a mentor figure to younger female IMF agents was very important in that it gave her a golden opportunity to interact with several of the rotating recurring characters, one of whom quickly proved a rising star.

    Juliet Landau was cast to fill a very particular need: to pay homage to Martin Landau and Barbara Bain, and to maintain their presence on the show in spirit despite the lack of a physical appearance by either of them. After all, she was their daughter. [3] Indeed, her character, Casey, combined their respective roles of master of disguise and honey trap. However, she was not explicitly identified in-series as the daughter of either character, nor of both, as Rollin and Cinnamon had not been an item during the original series, and not enough time had passed since then for them to have had a daughter now in her early twenties. [4] Instead, she was said to have a “family history” with the IMF; notably, her last name was never revealed, and since both Rollin and Cinnamon were conspicuously absent from the show, Casey’s most important relationship was with Dana. Landau, who had primarily acted in stage productions before being cast in a recurring role in this new series (with the blessing of her parents), was a revelation; only 20 years old when she made her first appearance, she portrayed her difficult character (an 18-year-old “teen genius” type) with surprising vulnerability and confidence. There was no doubt that Landau would be invited to return as a regular for the show’s second season, and indeed she was. She wasn’t the only child of an original series regular to turn the trick, either. Philip Morris, who was the son of Greg Morris, and who had grown up on the Desilu lot [5], also played a character on the new series. However, his character, Grant Collier, was explicitly the son of Greg’s character Barney. Grant had followed in the footsteps of his father and was a technical whiz, and this vital skill – coupled with his race in a show which (much like the original series) had relatively few non-white characters – made him nearly as indispensable as Casey. He, too, was earmarked for a return as a regular in the 1986-87 season.

    The question of what to call this Mission: Impossible spinoff was a hot topic for discussion amongst studio executives and the show’s producers prior to the series premiere – although, ultimately, no subtitle was used (the rationale being that “if the new Twilight Zone revival doesn’t need a subtitle, then neither do we”), several were considered. Perhaps the most notorious was Mission: Impossible: The Next Generation, given that, as noted, multiple recurring cast members were in fact children of the original cast, but several key people, studio head Lucille Ball and creator-showrunner Bruce Geller included, found the subtitle ridiculous, insisting that no show with such a subtitle could ever be taken seriously – nor could one with multiple colons in its title. A counter-suggestion was made to formally drop the colon between Mission and Impossible (as was often done colloquially anyway), but this, too, was flatly rejected (“just because a lot of people do it doesn’t make it right”). The Next Generation subtitle was useful in one respect: it provided the fandom an easy way to distinguish between the revival series (which became TNG, for The Next Generation) and the original series, which retronymously became known as TOS. Among the general public, however, the new show remained in the shadow of the 1960s-70s parent series, despite solid ratings.

    However, the new Mission: Impossible’s status as an in-house production on the Desilu lot(s) was increasingly proving to be the exception, not the rule. Even before The Wall had come down, Desilu had owned more studio space than any other organization in the Greater Los Angeles area. With the former Paramount studio space being added to Desilu Gower’s existing capacity, it came to the attention of many commentators within the industry that Ball was not so much a studio head as she was a feudal landlady; the majority of studio space at all three of the company’s hubs – Cahuenga, Gower-Melrose, and the Forty Acres backlot in Culver City – was being rented out to other production companies to produce programming of their own. The Hollywood Reporter archly noted this when they published an article titled “Lucy the Real Estate Mogul” in early 1985.

    In many ways, ironically enough, this resembled the situation some two decades earlier, before the studio’s renaissance began with the “House that Paladin Built” era. Before 1966, Desilu’s only in-house production had been the star vehicle The Lucy Show; all other studio space was rented out to independent productions such as The Andy Griffith Show. Even though Desilu had a much more active production schedule in 1985-86, it also had much more studio space available – far too much for Desilu itself to use effectively on its own. Fortunately, and just as had been the case in the 1960s, high-profile prospective tenants came immediately, and Desilu’s canny marketing department decided to turn an apparent weakness – excess production capacity – into a strength.

    The Desilu “brand” – a singular marque of quality and prestige in the field of television production for 35 years – would extend to cover not just the shows they produced, but also the studio space in which their shows were produced. As part of new rental agreements – and, effective for the 1985-86 season, renewal agreements – Desilu demanded that the filming location for each production be prominently displayed in the show’s end credits, to the point of even being given its own “card” in slideshow-style credits, where appropriate. Desilu produced special logos for each studio space to use for identification. Brandon Tartikoff (who, as VP Production, had no control over the rental of excess studio space, this being the purview of the VP Property Management) saw these logos at a conference meeting and loved them so much he decided to institute them for Desilu’s own productions as well. As the filming locations for a series typically appear toward the end of a credits listing, this had the amusing effect of the famous cursive Desilu logo appearing twice in close succession for its in-house productions. The cards read as follows:

    Filmed At
    DESILU
    Gower
    Studios
    Hollywood, California

    Filmed At
    DESILU
    Cahuenga
    Studios
    Hollywood, California

    Filmed At
    DESILU
    Forty Acres
    Studios
    Culver City, California

    This modernization of studio branding could not have been more fortuitously timed, given the burgeoning number of non-Desilu productions filmed at Desilu’s studios, including a couple of very big names indeed…

    One of the big stories of the 1985-86 season was the return of two beloved sitcom mainstays to the genre after lengthy absences. Mary Tyler Moore and Valerie Harper, who had played best friends in The Mary Tyler Moore Show in the 1970s, both sought new vehicles for themselves in their middle age, hoping that their success would define the 1980s much as Mary Tyler Moore (and its spinoff, Rhoda, which starred Harper) had done for the previous decade. Each of them took a different tack, however, and each had to escape the shadow cast by her own previous successes.

    Mary Tyler Moore sought to return to the sitcom genre in which she had made her name (after a number of abortive attempts to headline variety shows, of all things, some years before) as the female editor-in-chief of a newspaper. The show was called Mary, a name previously used for a short-lived variety show which starred Moore in the late-1970s; Moore was far from the first performer to reuse the name of a previous star vehicle for a new one. [6] Although quite similar superficially to Mary Tyler Moore (which had been set in a television studio), the big change was in Moore’s character: she played a brittle, crusty, and saucy “boss lady” type named Mary Brenner – very different from the sweet and demure Mary Richards (and from Laura Petrie on The Dick Van Dyke Show, for that matter) – and did so convincingly. She applied this persona to great effect as the editor-in-chief of a major Chicago daily, the Chicago Eagle. Her characterization was played up in early promotional material for the show: “America’s sweetheart as you’ve never seen her before”, as one frequently-used tagline put it. Another was more direct and more hokey at the same time: “She can turn the world off with her snarl.”

    Mary was created by the writing partnership of Ken Levine and David Isaacs, who had cut their teeth on Captain Miller and Taxi Drivers before following the Charles Brothers from the latter series over to The Patriot. The central relationship of Mary – and the conflict that propelled it – was between Mary and her ex-husband, Frank DeMarco, who also worked at the paper; in fact, she was his boss. [7] The pilot episode entailed her having to hire him at the insistence of the publisher; he had walked from the rival Chicago Post over a pay dispute. He had been the Post’s Pulitzer Prize-winning star reporter and acted accordingly, alienating his bosses, but the publisher of the Eagle desperately wanted to attract a reporter of his calibre and reputation, and was willing to meet his steep salary demands.

    Levine and Isaacs rebuffed attempts at stunt-casting the ex-husband – Dick Van Dyke and Ed Asner had both been (only half-jokingly) floated for the part – and endeavoured to undertake a search for Mary’s perfect sparring partner with the same care and attention that had resulted in Dave and Rebecca on The Patriot. Ultimately, John Astin, who had played Gomez Addams on The Addams Family in the 1960s, was cast in the role – their antagonistic chemistry was too appealing to pass over, even though their romantic chemistry left much to be desired – test audiences found it hard to believe the two had ever been in love in the first place, to which Levine mused that “they’ve obviously never been divorced”. Although initial plans were for a “will-they-or-won’t-they” attitude toward a potential reconciliation between them, much in the vein of several classic films from the Golden Age of Hollywood [8], these were ultimately abandoned; Levine and Isaacs had worked on several shows whose writers had wanted to break up their characters but were forced not to by higher-ups due to their popularity with audiences. The example of Rhoda had been stuck in the craw of many a writer long before The Patriot had confirmed that what audiences wanted was more important than what creators wanted, at least as far as the networks and studios were concerned. As a result, Moore and Astin played a divorced couple who would be forced to work together, but could never rekindle their romance – and that was that.

    Given that both Moore and Astin were over 50, most of the rest of the cast was younger, and given the two leads’ star power, it consisted primarily of unknowns cast for their talent and attractiveness over any name recognition. The show’s biggest discovery among its younger players was a former backup singer named Katey Sagal (also the daughter of director Boris Sagal), playing Mary’s sassy, chain-smoking secretary. Anyone older – including Burgess Meredith as the publisher of the Eagle, who had a memorable cameo in the pilot – was kept to a recurring role.

    Mary was critically acclaimed, the show praised for subverting Moore’s (and, to a lesser extent, Astin’s) image, and for depicting an antagonistic relationship between a man and a woman which did not involve any unresolved sexual tension. Audiences enjoyed the show as well, and it finished just within the Top 30 for the season. However, much as had been the case with Mary Tyler Moore in the 1970s, the show enjoyed far greater critical acclaim than popular appeal. It was a “smart” show, and it made people feel good to watch it – or to say they watched it.

    Just as her one-time co-star had done, Valerie Harper decided to play off her previous project (in her case, Rhoda, which had ended with her character married with a daughter named Mary) to star in a family sitcom quite unlike it in many respects. Both Harper and Moore (each of whom served as an executive producer on her own show) claimed that it was coincidence that they both just happened to be coming out with new projects in the same season, and were filming at the same studio. (“In our defence, it’s very hard not to film at Desilu,” Mary Tyler Moore joked in an interview for the Hollywood Reporter. “I think by now they must have bought up all the studio space here in the Southland.”) Because Mary Tyler Moore and Rhoda had ended up very different from each other (despite starting out with an identical template – a single woman trying to make it in a new city [9]), their new shows, which were plays on their previous projects, were also very different from each other.

    In a transparent attempt to be topical in the proud tradition of socially-aware 1970s sitcoms, Valerie (whose lead actress played a character named Valerie, both in an attempt to ape Mary Tyler Moore and to follow the Vivian Vance paradigm of using her own name after over a decade of incessantly being addressed in public by the name of her famous character) cast Harper as the breadwinner of her household, juggling her working life with her family life. Her husband (or “househusband”, as the show called him) was a struggling writer who worked from home. Harper’s character was an advertising executive, allowing for much of the comedy to come from her attempts to tailor and sell ad campaigns to a revolving door of guest characters. A career in advertising was nothing new for sitcom protagonists; Darrin on Bewitched had been an advertising executive, and Valerie would occasionally pay homage to this nostalgic connection.

    The supporting players at Valerie’s workplace were few and far between, as the nature of her job allowed her to interact with clients one-on-one. They included her boss, who was friendly and supportive, in a deliberate contrast to the irascible Lou Grant character on Mary Tyler Moore – indeed, if anything, the character could have been said to be overly milquetoast; this allowed Valerie’s (seldom-seen) co-workers to get away with taking advantage of him, though not Valerie, as she was the firm’s top salesperson.

    The only other female regular on the show was the boss’s secretary – who also happened to live down the street from Valerie; the two carpooled to and from work together each day. (Valerie drove.) A busybody neighbour and secretary in the tradition of both types, she was played by character actress Edie McClurg, who spoke with a distinctive Upper Midwestern accent, donchaknow. [10] Valerie and her husband had three sons, to run the gamut of kid-oriented storylines between them: a boy in his early teens (intended as a Tiger Beat heartthrob in the making, albeit in a non-threatening way so as to avoid alienating young male viewers) [11], a boy just entering grade school (and the middle child), and a toddler (played by twins, so as to share the workload between them). [12] The pilot focused on the main thrust of the series: Valerie’s struggle to balance her work life and her family life. Her maternity leave had ended on her 40th birthday [13], and upon returning to work, Valerie discovered that the place had fallen into a mess without her. As she struggled to get a handle on her new situation and to re-assert control over her hectic surroundings, it soon became clear that the main problem she faced was that access to the boss was barred by the secretary. Although she first tried the “soft touch”, trusting the boss to take care of things himself, the mounting chaos at the workplace soon became too much for her to bear, and she finally decided to take charge and brute-force her way into the boss’s office. All was well that ended well, fortunately, with the boss even thanking her and praising her “initiative”. However, one final bit of comedy was saved for the end of the workplace sequence: as Valerie sat silently in her car in the office garage, in a moment of exhausted reflection, her reverie was shaken by a firm and insistent rapping upon the passenger door window.

    “Yoo-hoo! Valerie!” It was none other than the secretary, Patty Poole.

    Valerie pushed the button to unlock the power door, letting her in. “Come on in, Patty.”

    “Why thank you, Valerie,” said Patty, in her chipper Midwestern accent. “Oh, and I hope you don’t mind what I did earlier. I was under very strict orders, donchaknow.”

    “Well, I forgive you, as long as you forgive me shoving you aside like that,” Valerie replied, allowing a wry smile to cross her face.

    Patty just snickered. “Oh, don’t mention it. After all…”

    And together, as Valerie turned the ignition and pulled out of the parking garage, the two said in unison, Patty in her chipper tone and Valerie in a resigned one: “It was just business.”

    On the home front, all three kids were played by age-appropriate actors: the eldest son, David, by Scott Morton, born in late 1970 (he turned 14 while shooting the pilot); middle son William by Ryan Hickson, born in 1977; and the youngest son Mark by Brian and Brandon Valentine, born on January 1, 1984 (a Sunday), and already famous for being the first “New Year’s Twins” in the history of their hometown. The age-appropriate casting was done for two reasons. Authenticity was one of them, of course, but the other seemed counter-intuitive at first blush: because all the children were played by child actors, the hours they could spend on-set were strictly limited, and this meant that their on-camera time was limited as well. Only Scott Morton was old enough to carry the A-plot of a given episode, though more often than not, even his storylines were relegated to subplots in the early going. Valerie Harper was the show’s star, and that fact was made plain on set on a daily basis. The husband was played by Lowell Wolfe, formerly a soap opera actor (who had enjoyed a long run on Another World in the 1970s) whose reputation as hunky beefcake eye-candy was fading as he aged.

    Critics were lukewarm – at best – toward Valerie. Though most of them praised the cast, they considered the jokes hoary and saccharine, and the storylines sophomoric. Many disliked the marked lack of chemistry between Valerie and her onscreen husband, a striking contrast to Rhoda, whose title character was passionately in love with her husband, Joe. This lack of chemistry was probably explained by the poor working relationship between Harper and Wolfe, who grew to resent “wearing a goofy print apron and carrying a plate of cookies, asking everyone about their problems”. [14]

    For obvious reasons, the show with which Valerie was most frequently compared was Mary, against which it was found lacking by virtually every conceivable metric. Despite this, most critics praised the pilot of Valerie even over that of Mary, with more than one describing it as “the best pilot of the season”. It was the transition to series which hobbled its quality – but not its popularity: Valerie was a smash-hit, finishing in the Top 5 for the 1985-86 season, the highest-rated sitcom that year. Harper had always been beloved by audiences, who had tuned into Rhoda in droves, and were thrilled to see her back on the tube. And just as producers had predicted, Scott Morton became a major teen sensation, with pinups of his smouldering gaze adorning the pages of Tiger Beat, Seventeen, and all the other teen magazines of the day. This did little to endear him to his TV “father”, a one-time regular on the pin-up circuit himself, which further added to the antipathy between the various cast members. As surely as Morton featured in nearly every new issue of a teen magazine, backstage stories about tensions on the set of Valerie would feature in one of the supermarket tabloids.

    The 38th Primetime Emmy Awards were scheduled to be broadcast on September 21, 1986 (a Sunday). [15] Most of the buzz leading into the awards ceremony focused on the “head-to-head” between Mary Tyler Moore and her show Mary (which had scored the most nominations of any comedy series at that year’s ceremony) and Valerie Harper and her show Valerie: the two shows were nominated for Outstanding Comedy Series (alongside Desilu mainstays The Patriot, The Ropers, and the life-after-divorce sitcom Starting Over), and the two actresses were nominated for Outstanding Lead Actress in a Comedy Series against each other. Both actresses refused to play into media speculation of a “feud” between them, pointing out that they had been nominated against each other before, as recently as 1977, and were all smiles and hugs on the talk-show circuit leading up to the ceremony. Besides, as it turned out, their “feud”, and the event in general, were completely overshadowed by a shocking revelation from one of the television industry’s most beloved and iconic figures, which would culminate in a revealing tell-all interview special to be aired the night before the Emmy ceremony…

    ---

    [1] A key difference from OTL, where she did let her hair return to its natural brown and allowed herself to be seen in public as a brunette, as in this 1975 interview with Dinah Shore (also, many believe, the source of the infamous “Vivian Vance was contractually obligated to put on 20 pounds” rumour, from the “contract” gag gift featured in the clip). Ironically, in her role as chief ambassador of Desilu to the media, she has to be “on” a lot more than she would as an actress.

    [2] ITTL, Mark Lenard was originally floated as a replacement for Martin Landau in 1969, as contract negotiations wore on (Leonard Nimoy, who replaced him IOTL, was obviously unavailable). Although ultimately Landau did renew his contract, he sought (and received) a concession to be granted leave from several episodes in order to focus on other projects. Therefore, Lenard’s character found use as a substitute for Rollin whenever Landau was absent, leading to Lenard making multiple appearances in each of the show’s last four seasons. (Herb Solow, whose favouritism for Star Trek was well-known on the Desilu lot, would always “pull rank” whenever there was a conflict between Lenard’s role as Sarek on Star Trek and his role as Paris on Mission: Impossible.)

    [3] Juliet Landau, born in 1965 and thus before the POD, is the younger of Landau’s and Bain’s two children, and IOTL followed them into acting (after having been a ballerina in her youth).

    [4] During the Rollin/Cinnamon years IOTL, there were occasional winks at the audience that yes, these two are married in real life, and isn’t that funny?, but nothing beyond that. This continued throughout the show’s run ITTL, but all parties involved decided that there should be nothing more serious between Rollin and Cinnamon than the occasional light flirtation (borrowing from the James Bond films, in which Bond and Moneypenny are always flirting with each other but never actually sleep together).

    [5] Philip (or Phil) Morris, who also played Barney’s son in the OTL Mission: Impossible revival, did indeed grow up on the Desilu/Paramount lot. As a boy, he had played one of the “onlies” in the first-season Star Trek episode “Miri” (IOTL and ITTL), alongside many of the children of Star Trek cast members. IOTL, Morris would also have additional roles in subsequent Star Trek productions, but would become best-known for his recurring role as the Johnny Cochran parody character, Jackie Chiles, on Seinfeld.

    [6] As per OTL: Mary was the title of both a 1970s variety series and a 1980s sitcom, neither of which lasted for more than one season. Among those who have starred in multiple star vehicles with the exact same name, ITTL and IOTL, was Bob Newhart, who appeared in two different series called The Bob Newhart Show: a 1960s variety show and a 1970s sitcom.

    [7] This was the original plan for Mary IOTL, but CBS executives rejected this premise (just as they had rejected the idea of Mary Richards being a divorcee – they were apparently very leery of making changes to her public image, which is especially surprising after Ordinary People). Ironically, ITTL, there is no Ordinary People, and thus Mary Tyler Moore proving her mettle as a brittle, cold-hearted shrew (she plays Mary similarly to how she played Beth Jarrett, albeit softer, since this is a weekly series) is a huge revelation. Co-creator Ken Levine has spoken at length about his involvement with Mary on his blog, recounting the change in premise among other anecdotes. Astin was ultimately involved with the series IOTL as well, though he played what Wikipedia described as “a condescending theatre critic” named Ed LaSalle; having seen snippets from the pilot, this editor can report that he’s basically the show’s take on a Ted Baxter type.

    [8] IOTL, there is a term to describe the types of films being mentioned here – the comedy of remarriage – though it was only coined by philosopher Stanley Cavell in his 1981 book, Pursuits of Happiness: The Hollywood Comedy of Remarriage.

    [9] Yes, yes, technically, Rhoda Morgenstern was returning to New York City, but you shouldn’t let that needle you.

    [10] Oh, you betcha, ya!

    [11] Maclean’s entertainment editor Jamie Weinman, in his delightful article on the history of this show’s OTL equivalent, Valerie/Valerie’s Family: The Hogans/The Hogan Family (which I urge you all to read if you’re at all interested in the politics of 1980s television), describes the casting of that show’s oldest son (played by Jason Bateman) to fit these parameters as “the Michael J. Fox” template. So it’s someone like Michael J. Fox (but not him; he’s too old and too short) or Jason Bateman (but not him; he was born after the POD).

    [12] Twins (or triplets!) playing babies was and is a common tactic in television and film, so as to avoid running afoul of child labour laws. IOTL, perhaps the most famous example was Full House, which cast twins Mary-Kate and Ashley Olsen (billed for the first seven seasons as “Mary-Kate Ashley Olsen”, as if they were one person) as youngest daughter Michelle, a baby at the time of the series premiere. The Olsen Twins, as they became known, were able to leverage the show into a career as the pre-eminent child stars of their era, always playing twins (or at least two people) in all their subsequent projects.

    [13] Valerie Harper was born in 1939, and thus would have been 45 when shooting the pilot (and 46 when it went to series). Even at that age it would be possible (if not likely) for her to have had an infant son, but all involved decided to lower her character’s age to 40, for a multitude of reasons. It does not escape anyone’s notice that Harper – playing a character five years younger – is by far the most divergent of the cast from the ages of the characters they play. Harper’s co-star Lowell Wolfe was 39 when shooting the pilot, just a year younger than his character and six years younger than Harper.

    [14] A paraphrase from a delightful OTL quote by Bess Armstrong, who played the mother on the beloved and acclaimed cult series My So-Called Life, in 1994: “If I end up standing in the doorway with a plate of cookies saying, ‘Honey, do you want to talk about this?’ I'm going to open my veins.” (For the record, she never did – her husband did all the baking.)

    [15] Which means you’re never going to find out who won those Emmys, since the ceremony takes place one day after the Baba Wawa interview with That Wacky Redhead. If only, if only…

    ---

    So ends the final overview update of That Wacky Redhead! I hope you’ve all enjoyed reading about two solid decades of television production at a pace of less than one-quarter of real time! ;) Thanks once again are due to Space Oddity for his thoughts and suggestions with regard to the Mission: Impossible revival, and particularly the casting. And, as always, thanks to e of pi for assisting with the editing, and for egging me on to write in general. (For those of you wondering where the customary summary of ratings-by-network is, you’ll find it in the next update this time around.)

    The Power of Networking
  • The Power of Networking

    The paradigm of three – and only three – commercial broadcast networks may have seemed eternal and unchanging by the mid-1980s, but this had not always been the case. Intermittent discussions and even abortive attempts to launch a “fourth network” would have, in fact, marked a return to the way things had been during the Golden Age of Television – when, in addition to ABC, NBC, and CBS, there was also the DuMont Television Network, which endured (in one form or another) for a decade from 1946 to 1956, straddling the line between the Experimental and Classic Eras of television history. The network represented an attempt by the television equipment manufacturer DuMont Laboratories (founded by Allen B. DuMont) to provide programming that would make use of the new technologies it was developing, not unlike what Thomas Edison had done in the 1890s. The network enjoyed its greatest success in the early-1950s, having one of the young medium’s biggest stars, Jackie Gleason, on their roster; it was on DuMont that the first Honeymooners sketches debuted in 1951. However, just as it had done with NBC’s radio talent a few years before, CBS poached Gleason in 1952, marking the beginning of the end for the network. His star power was perhaps the only thing that might have countered the myriad financial and economic challenges facing the DuMont Network going forward.

    From the outset, DuMont was faced with considerable institutional hurdles which none of its three rivals had been forced to clear, placing it at a distinct disadvantage against them. Unlike ABC, NBC, or CBS, DuMont had no established radio infrastructure from which to draw talent or adaptable material, nor against which it could balance the inevitable losses from capital investment during its formative years in the new medium of television. In fact, of the original four television networks, DuMont was the only one which had been explicitly created for the new medium, rather than simply one which expanded operations from an established radio network. [1] As a result, DuMont needed a partner who could bankroll the network’s expansion and fund its programming. They would ultimately find one in Paramount Pictures, which since 1939 had held a 40% stake in DuMont Laboratories.

    However, Paramount, being one of the major Hollywood film studios, naturally saw television as an existential threat, and one which should be thwarted, not embraced. Although most of the film studios did eventually come to embrace the new medium and started new divisions explicitly for the purpose of producing television programming, Paramount was notoriously slow to follow suit. Paramount Television would not come into existence until 1968, shortly after the Gulf+Western conglomerate had purchased the studio (and only after they had, in turn, failed to complete the acquisition of an existing television studio, Desilu Productions, to absorb into Paramount). This stubborn refusal to change with the times very much informed the tenor of Paramount’s relationship with DuMont.

    As it happened, DuMont’s leadership had just about the exact opposite attitude to their backers at Paramount; being the only network formed for the explicit purpose of television broadcasting, and being owned by a firm which had pioneered the development of technology for the nascent medium, it was the only one of the four networks whose brass consisted of true believers (as opposed to opportunists), and thus what it lacked in most everything else, it made up for in entrepreneurship. The DuMont Network thus attracted talent who made up for their inexperience with their innovative potential. Jackie Gleason had been the network’s biggest star, and their failure to hold onto him was likely fatal, but by no means was he the only bright light at DuMont. The first true situation comedy on television (before the term “sitcom” itself even came into vogue), Mary Kay and Johnny, began airing on DuMont in 1947, beating I Love Lucy to the punch by almost four years. [2] The first science-fiction series on television, Captain Video, began airing on DuMont in 1949, ultimately airing on that network for six years, one of the network’s longest-running programs. Network television’s first game show, Cash and Carry, was also a DuMont original, dating all the way back to their (and television’s) first season of operations, 1946-47 (at which time the “network” consisted of just two stations). The Reverend Fulton Sheen stunned observers when his religious program Life is Worth Living was able to hold its own against early television’s biggest star, Milton Berle. One of DuMont’s biggest ratings bonanzas during their heyday, however (aside from Gleason’s Cavalcade of Stars), was their live coverage of boxing and professional wrestling events.

    DuMont, despite their early start, was unable to keep pace with the rapid expansion of the other networks starting in the late-1940s. Television stations had to apply for licences from the Federal Communications Commission, or FCC, before they could commence broadcasting, and most licences were, naturally, granted to owners who already operated radio stations in the same market, for obvious reasons: existing relations and contacts with the FCC; experience and expertise with broadcasting technology; the physical assets necessary to transmit broadcast signals to a wide audience; and, most importantly, a pre-existing affiliation with one of the four radio networks. As a result, most of those stations which came into existence before 1948 chose to affiliate with NBC or CBS, the two major radio networks which had expanded into television. DuMont, and to a lesser extent ABC, was left by the wayside. Perhaps this obstacle might have been surmountable in the long run, as additional stations came into operation; however, applications for new licences came in faster than the FCC could process them [3], and the agency ultimately decided to put a temporary freeze on granting any new ones. This freeze, which was originally to last for only a few months, instead endured for almost four years, well into the 1950s and long enough for the backbone of television infrastructure to well and truly ossify.

    At the height of the antitrust hysteria of the era (which had resulted in the Paramount Decision of 1948, ending vertical integration in the motion picture industry), the FCC had ruled that no firm or individual could own more than five television stations nationwide. Paramount’s 40% stake in DuMont would prove another stumbling block when the FCC ruled that the two television stations owned by the Paramount Television Network – a short-lived parallel venture by Paramount to establish their own broadcast network independent of DuMont’s own efforts, born of the same fleeting mentality which had resulted in investment into DuMont in the first place – nevertheless counted towards DuMont’s tally of five owned-and-operated, or O&O, stations, even though the Paramount stations aired no DuMont programming. This left the three DuMont O&Os which formed the core of their network: WABD in New York City, hub of the early television industry; WTTG in Washington, D.C., which also served nearby Baltimore; and WDTV in Pittsburgh, which emerged as the crown jewel of the network, as no other VHF stations would serve what was then the sixth-largest market in the United States for the duration of DuMont’s operations as a network. It alone kept DuMont afloat during the lean years of the “freeze”, though ironically the network, desperate for a cash infusion, was forced to sell this most valuable asset to Westinghouse in 1954. This short-term gain was in all likelihood the death knell for DuMont as a television network.

    In the intervening years, though many other stations would affiliate with DuMont in some capacity, they were free to pick and choose which DuMont programming to carry; in addition, after the FCC freeze was finally lifted in 1952, it became nearly impossible to receive a new licence in the very-high frequency, or VHF, band of channels; instead, the ultra-high frequency, or UHF, band of channels was opened up for exploitation. However, UHF stations usually gave off weak signals which were poorly received by viewer antennae – if they could be received at all, as tuners capable of picking up UHF signals were not mandatory, and most television manufacturers only included the VHF dial (channels 2-13) on their sets, leaving off the UHF dial (channels 14-83) entirely. This state of affairs would not change until the 1960s, long after DuMont ceased broadcasting.

    The final DuMont Network broadcast took place on August 6, 1956, transmitted across only five stations when the other networks all had over 100. From that point forward, for the next three decades, viewing audiences in the United States had only three commercial broadcast networks available to watch over-the-air. The common knowledge that there were “only three channels” was always a misconception, however. From 1964 onward, television sets were required to be manufactured with a UHF tuner and dial, granting viewers access to UHF stations in their market – and many markets had at least one, given the low licensing and operating fees in comparison to VHF stations (albeit at the cost of generally poorer over-the-air reception). After 1971, the public broadcaster PBS was available nationwide, and it operated a cooperative of stations in similar fashion to the commercial broadcast networks, offering high-quality and educational programming without advertising, splitting the costs amongst all the member stations. And finally, from the late-1970s onward, advances in telecommunications technology enabled Pay-TV channels to flourish, available to viewers by cable or satellite transmission. By the mid-1980s, MTV and CNN were household names, and were coming to define the culture of the Post-Boomer generation, increasingly known as the “Echo Boom”. [4]

    Although the DuMont network had ceased operations in 1956, the two remaining DuMont O&O stations did not. DuMont Laboratories spun off their broadcasting operations in 1957 as the DuMont Broadcasting Corporation, though it was renamed Metropolitan Broadcasting shortly thereafter, so as to dissociate itself from the former network. Paramount sold Metropolitan in 1958 to John Kluge, who fancied himself a media mogul, serving in his own way as an inspiration to many who would come after him. He would aggressively expand his new company (renamed once again, to Metromedia) and its media holdings throughout the 1960s, picking up new television stations, radio stations, and other entertainment properties, including (most curiously) the Harlem Globetrotters. Its portfolio grew larger and larger as the years wore on, culminating in the record nine-figure acquisition of a VHF station in Boston, WCVB (channel 5), in the early-1980s. All of these purchases left Metromedia with a truly impressive roster of stations, the vast majority of which had no affiliation with any of the three major networks – invaluable infrastructure for anyone who might be interested in launching a fourth network, though ultimately anyone who was interested would take a different tack to doing so.

    Indeed, ever since DuMont went off the air in 1956, there had been intermittent attempts by various entities to launch a replacement fourth network. Even before DuMont officially ceased operations, there had been several attempts in the mid-1950s to launch new networks, none of which were successful. Most ad hoc “networks” which did launch functioned more along the lines of first-run syndication, selling “packages” of programming to stations which may or may not have been affiliated to an actual broadcast network; each station, as was already the case in the more traditional rerun syndication market, could choose to broadcast the shows they had licenced at their own discretion, dramatically reducing the potential effectiveness of nationwide marketing campaigns – exhorting viewers to “check your local listings” was not nearly as effective as giving them the specific date, time, and station on which they could expect to find their programming. Another problem – the very same problem which helped to bring down DuMont – was that each market had a severely limited number of VHF stations, and almost all of these were affiliated to a major network. Any network with the same national reach as the Big Three would have to consist largely of UHF affiliate stations. Many would-be entrepreneurs found the very notion daunting, comparing it to herding cats. Others were intrigued by the challenge.

    Foremost among those who felt they might just be able to transcend the lofty barriers to entry facing anyone who sought to develop a fourth network was Barry Diller, a television executive who had worked for ABC in the 1960s, becoming the Vice President of Development before relocating to United Artists Television in 1974. [5] He gradually became convinced that there was room for another network to compete with the Big Three, one with him (naturally) holding the reins as the chief creative force, but unsurprisingly, he could find precious few backers, and hardly any at United Artists Television. The 1970s were not the 1950s; the barriers to entry appeared nigh-insurmountable unless the necessary investments had already been made, or the necessary capacity already existed. After United Artists had been sold by Transamerica to CanWest, Diller would finally find his willing benefactor in the person of Israel Asper, who was not about to let a pesky little thing like an international border get in the way of his dream of a transnational network. Asper already had experience in transnational telecommunications dealings, having bought out a North Dakota station which he then turned into the Winnipeg flagship of his network. This network, with the help of hit programming such as SCTV and Life After Death, was able to compete with its rivals (CBC/Radio-Canada and the CTV/TVA tandem) in most of Canada’s major cities, particularly Montreal, Toronto, and Vancouver, even after many analysts had written the Global Television Network off as unlikely to find a niche against the two titans of Canadian broadcasting. Global’s success convinced Asper that if four Anglophone networks (CBC-1, CBC-2, CTV, and Global) could work in Canada, then they could also work in the United States. Asper’s other key strength was an established base of operations in the United States. His acquisition of United Artists – and his subsequent political alliance with Canadian Prime Minister Stanfield, resulting in the loosening of CRTC “CanCon” restrictions in the early-1980s (much to the chagrin of cultural protectionists) – provided him with the assets needed to churn out international co-productions on an assembly line basis. Given Canada’s relatively small number of population centres, his Global network reached almost complete market saturation by the mid-1980s, with a particular coup for Global being the launch of the Halifax station, CIHF, in 1984, just in time for the inaugural CFL game played by the Atlantic Schooners, to which the new station had naturally secured the exclusive rights. [6] The last holdout south of the 60th parallel was Saskatchewan, though plans to rectify this had already been set in motion. Eager to fend off salvos from the Canadian cultural and intellectual elite, Asper heavily endowed his alma mater, the University of Manitoba, with the funds needed to greatly expand their schools of business, law, and media studies.

    It had been a tradition since the very beginnings of the American mass media industry for Canadians to seek their greatest fame and fortune across the border, and this was no less true for Izzy Asper than it had been for Louis B. Mayer, the Warner Brothers, and Mary Pickford before him. He was clearly entrenched in the Canadian broadcast industry, and his expansion southward into the American motion picture industry, with the acquisition first of United Artists, and then of the trademarks, logos, and insignias pertaining to the former Paramount Pictures Corporation, was in the end a mere prelude to his plans to expand his broadcast operations stateside; United Artists provided his Global Television Network with the content he needed for a competitive edge over his rivals, but in many other respects it functioned as a loss-leader. This was because Asper, despite his deep pockets and his big dreams, was not a creator as Louis B. Mayer or Jack Warner had been, and any plans to create an American network needed a visionary. Luckily for him, Barry Diller was just the right man for the job, and he was available at just the right time.

    CanWest, through their previous acquisitions, already owned several television stations in the United States, including a permit to build a station in Houston, and an already-operating station which served the large and lucrative Cleveland market – Asper, in his more ambitious moments, envisioned it functioning as a bulwark to sustain any emerging network against adversity, much as Pittsburgh had been for DuMont in the early-1950s. However, a few scattered stations did not a national network make, and Asper knew he had to enlist additional stations into his scheme. Aware that image means everything in any enterprise, Asper, on the advice of Diller, re-branded CanWest Global Communications, shortly after the acquisition of the Paramount trademarks, as CanWest Paramount Communications. However, the planned network itself was to be named the Paramount Global Television Network, or PGTV. Global Television in Canada began using this new identity on September 1, 1984 (a Saturday).

    The television landscape of the 1980s was very different from the one which DuMont had faced in the 1950s. UHF stations were far more accessible over-the-air than they once were; UHF signals were stronger, and just about every television set in a given household could receive them with ease… more to the point, many households in the 1980s did not use traditional “rabbit ears” to receive broadcast signals, instead relying on a cable hookup or satellite connection; cable and satellite providers tended to be local, and thus provided market-specific lists of channels to their customers. These lists would inevitably include every VHF and UHF station serving each particular market. Cable and satellite providing such a large number of viewing options to customers was a double-edged sword: on the one hand, it weakened the monopoly the broadcast networks held over audiences; on the other, it dramatically lowered the barriers to entry for competition, including, at least theoretically, any claimants to the banner of the fabled fourth network. An advantage shared by Asper and Diller was their contacts with the FCC, whose stringent anti-trust regulations over ownership and affiliation had been loosened during the Reagan administration (which had also seen the termination of the Family Viewing Hour and the end of the Fairness Doctrine), and which had been left alone by the Glenn administration. The FCC might have had it in for DuMont, but they were far less antagonistic towards CanWest Paramount. Asper’s good working relationship with both the CRTC and the FCC was the most valuable tool in his arsenal on the quest to secure the broadest possible coverage for his nascent network.

    CanWest Paramount made waves when it bought out the long-troubled WOR-TV, an independent station serving the largest media market in North America, New York City; most insiders assumed it was intended to serve as the flagship station of the ad hoc “network” which most of them still considered a pipe-dream. But it turned out that WOR-TV, redesignated WWOR-TV in order to fit the now-standard paradigm for call letters, was merely the appetizer to an altogether more ambitious main course.

    In 1985, John Kluge, who had taken Metromedia private some years before, announced that he was selling 51% of its stock to CanWest Paramount, giving the corporation de facto ownership of its portfolio of television and radio stations, and making CanWest Paramount the owner-operator of by far the largest number of television stations in the United States outside of the three broadcast networks. However, even though the old FCC restrictions had been relaxed since the 1950s, they had not been entirely eliminated; the Metromedia acquisitions found CanWest Paramount bumping its head against the hard ceiling on the number of television stations any firm could own; any and all new network stations would have to be traditional affiliates. In fact, CanWest Paramount even sold a small number of their newly-acquired stations, most prominently the former Metromedia station WNEW (itself the former DuMont flagship WABD), which served New York City, as PGTV already had an NYC station in WWOR, and the custom (which was a hard rule in Canada) was that each company should have only one station per market [7], a custom which Asper and Diller were inclined to follow. As a result, WNEW was sold to an independent buyer, leaving WWOR as the PGTV East Coast flagship station. There were other curious realignments as a result of the merger: the new PGTV station in Boston, previously an ABC affiliate, saw that affiliation transfer to a station in New Hampshire; the Houston station still under permit for construction, already given the call letters KUAB, was effectively abandoned, with those letters transferred to the former KRIV, whose acquisition was deemed by the FCC a fulfilment of that permit.

    Asper and Diller then took their show on the road, attempting to woo potential affiliate stations all over the United States, and in this regard they were remarkably successful, securing affiliation agreements in over 150 markets in time for the planned launch of PGTV American broadcast operations on August 6, 1986 (a Wednesday), exactly three decades to the day after the final DuMont Network broadcast, and shortly before commencement of the 1986-87 season. Indeed, in securing an affiliate for the San Diego market, PGTV found itself a tri-national network: none of the stations (not even the UHF stations) physically located in San Diego would agree to affiliate with PGTV – but XETV, a station located just across the border in Tijuana, Mexico, agreed to join the new network. XETV was a VHF station (channel 6), one of the few VHF stations to join PGTV, and one which existed specifically because the late-1940s freeze on new VHF stations imposed by the FCC did not affect Mexico. Given the extreme proximity of San Diego to the Mexican border, a VHF transmitter built in Tijuana could be received by television sets across the region; the Azcárraga family, who were prime movers and shakers in the Mexican media industry, took advantage of the opportunity this presented with the launch of XETV in 1953. It affiliated with ABC in 1956, remaining with the Alphabet Network until 1972, when the owners of a local UHF station (KCST, channel 39) were able to persuade ABC to drop its affiliation with a station located in a foreign country and owned by foreign interests. From then until PGTV came calling, XETV was an independent station. Asper, now flush with contacts in a third country, was sufficiently intrigued by the prospect of his PGTV becoming a truly global network that another VHF station licenced to a Mexican city (XHRIO, channel 2, serving Matamoros, Tamaulipas) but serving audiences across the Rio Grande (in the Harlingen-McAllen-Brownsville market in the extreme south of Texas) would also become an affiliate. These would be valuable in future, should Global ever wish to expand into Spanish-language broadcasting, but for the time being most Mexican viewers of PGTV would be the affluent elite of the country, who encouraged the “Americanization” of their children. Many Mexican Pay-TV providers offered XETV and XHRIO to their customers for this very purpose. As for the PGTV base of operations in Canada, their stations in Saskatchewan commenced broadcasting on August 6, 1986, giving them coverage in every population centre in the Great White North.

    The first programming to be broadcast (inter-)nationwide on PGTV, late into the evening of August 6, 1986 – exactly thirty years after the last DuMont Network broadcast [8] – was a late-night talk show, a competitor to the predominant Tonight Show starring Johnny Carson. However, it was very much a product of the 1980s, much more “modern” and “hip” than Carson’s staid, traditional format. The program was called The Late Show with David Letterman; Letterman, who had started his television career as a weatherman for the Indianapolis station WLWI, eventually moved to Los Angeles to seek his fortune as a comedy writer, and there he met with some success. Inevitably, his on-air experience resulted in a performing career as well, and it was in this capacity that he was invited as a guest on The Tonight Show in 1978. He immediately struck up a rapport with Johnny Carson, who took Letterman under his wing. Letterman eventually became a writer, and then a “guest host”, for the show; he was so successful that he became the show’s first “permanent guest host” (a designation which Letterman himself called “the greatest oxymoron in show business”, a nickname which he eventually extended to himself in typical self-deprecating fashion) in the early-1980s. [9] However, he soon tired of filling in for Carson and chafed under his mentor’s somewhat ossified format, always rebuffed by the staunchly traditionalist showrunner Fred de Cordova whenever he suggested more innovative or avant-garde sketches. NBC brass also made it clear that they considered Carson far more valuable than Letterman, taking a hard line in their annual contract renegotiations with the latter. Letterman stayed on partly out of a belief that Carson might soon retire; upon his twentieth anniversary with the program in 1982, Carson had already hosted The Tonight Show for twice as long as all of his predecessors combined. Industry insiders believed that Carson would soon tire of his disputes and compromises with NBC, and choose to retire on a high note on his twenty-fifth anniversary in 1987. When Carson signed a multi-year contract renewal in 1985, however, it became clear that this would not be the case. Letterman also signed on for one more year in 1985, but he knew that at the next stop, he would have to be getting off.

    By 1985, Letterman had become one of the notoriously antisocial Carson’s few close friends, and he knew that he would have to receive his mentor’s blessing if his decision to depart for sunnier pastures was to be perceived as anything other than a personal betrayal. To this end, the two spoke at length on the subject, including shortly before Letterman signed a contract with PGTV, who had heard through the industry grapevine about his troubles with The Tonight Show and were willing to pay him handsomely to become the anchor of their fledgling network’s late-night lineup, offering him complete creative control without a pesky de Cordova to interfere with his comedic vision, such as it was. Carson encouraged his protégé to seek his own fortune at PGTV, wishing him luck, but suspecting that he would do no better than his perennial also-ran rivals, Merv Griffin and Dick Cavett; Cavett had been cancelled by ABC (which had given up on late-night talk shows altogether, instead proffering a late-night news program, Nightline), and Griffin, who remained on CBS, would ultimately choose to announce his retirement in 1987, which was also the 25th anniversary of his talk show. [10]

    Letterman had wanted to host The Late Show in New York City, but the only studio space available to PGTV in that market in 1986 was the WWOR-TV facility in suburban Secaucus, New Jersey. Thus, Letterman reluctantly agreed to remain in Los Angeles, where his show would broadcast from a studio in the former Metromedia Square (renamed the Paramount Global Television Center with much fanfare) on Sunset Boulevard in Hollywood, as opposed to Carson’s Tonight Show, which famously broadcast from “Beautiful Downtown Burbank”, in the San Fernando Valley. True to form, Letterman’s talk show was noted for its unusual sketches, which often lacked traditional punchlines, focusing more on “anti-humour” or experimentation to see what reaction certain stunts would provoke from unsuspecting patsies. Letterman’s joke-telling style, muted and muzzled back when he was forced to recite monologues mostly written by Carson’s staff (though audiences had grown accustomed to his bizarre non-sequiturs and ad libs whenever a joke would bomb), flourished on The Late Show. Though Letterman was nearly 40 in 1986 (about the same age as Carson, who had been 37 when he began hosting The Tonight Show in 1962), he appealed to younger audiences, disaffected with the more conventional humour often featured on the Tonight Show. Letterman was also a far less congenial interviewer than Carson, often mocking or belittling his guests (albeit usually with veiled asides and double entendres rather than direct insults), even if they were celebrities plugging their latest projects. His show lacked the Ed McMahon-style “sidekick”, with his bandleader adopting aspects of that role (primarily in bantering with the host). Letterman’s show came out of the gate strong, with the series premiere beating The Tonight Show in the ratings. It would ultimately settle below Tonight as the weeks wore on (as always seemed to be the case), but it remained well above Merv Griffin, and performed almost neck-and-neck with Tonight among younger audiences and other key viewer demographics. Thus, PGTV would commence the 1986-87 season with a proven hit already on their schedule.

    Nevertheless, the climate which the Paramount Global Television Network faced upon formal expansion into the United States for the 1986-87 season was one in which the Big Three networks remained incredibly dominant, as they always had been. In the preceding 1985-86 season, the ABC series Neon City Vice finished at #1, the first Desilu production to top the ratings charts since Rock Around the Clock in 1977-78, knocking the previous champion Wasps to #2 – another primetime soap and one-time ratings king, Texas, also finished in the Top 5. On the whole, however, it appeared that the genre (which had so dominated television in the early-1980s) was finally beginning to decline; notably, PGTV would include no primetime soaps in its inaugural primetime lineup, though, granted, this may have been more out of concern over their great expense. Desilu dominated the Top 10, with not only Neon City Vice but also The Patriot and The Ropers making the cut; however, both Hill Avenue Beat and Eunice fell out of the Top 30, and were likely to face cancellation after the 1986-87 season, especially once PGTV began to cannibalize their potential viewer base. Until then, ABC, with 14 shows in the Top 30, continued to dominate. NBC, with ten shows, remained in second, well ahead of CBS, which maintained the previous season’s standing of six entries in the Top 30.

    In many ways, it seemed fitting for Lucille Ball to retire from Desilu in this climate, an era when the medium was undergoing a fundamental realignment the likes of which had not been seen since the Golden Age in which she had made her start. Increasing experimentation within the medium, and new opportunities presenting themselves regularly, had left the old order seeming increasingly fragile and dated. With the establishment of PGTV, the first true commercial network to join the ranks of the “Big Three” since DuMont went off the air three decades before, the United States of America finally had its fourth network… again. The greatest irony of all was that it all seemed something of an anticlimax. In the 1950s, there had only been the four networks. In the 1980s, there were plenty of other channels available to viewers, and even other uses for the physical television set beyond receiving broadcast, cable, or satellite signals, what with home video and video games. Nevertheless, this new development provided a curious bookend to the television career of the most influential woman ever to grace the medium…

    ---

    [1] The four old-time radio networks were (in order of creation): NBC (1926), CBS (1928), Mutual (1934), and ABC (1943, though originally formed as the “NBC Blue” network in 1927, before an FCC ruling forced RCA to sell one of their two networks during the War). Although Mutual explored the possibility of expanding into television as the other three networks did in the late-1940s, they ultimately never would, as their structure as a cooperative of network affiliate stations (as opposed to independently-owned and operated affiliate stations sharing a common identity, branding, and programming, as was the case for the other three networks) left them with less ready capital for rapid growth. Although “old-time radio” as we understand the term today (where scripted, dramatic programming was the dominant means of entertaining listeners) was essentially over by about 1960 (to be replaced with music, which remains dominant to this day, with a few prominent exceptions), all four old-time radio networks continued to exist through the end of this timeline, IOTL and ITTL.

    [2] Mary Kay and Johnny premiered on November 18, 1947. Like many early television programs, the show ran for 15 minutes. Being so old and on such a cutting-edge medium (“network television” only existed in a handful of markets in the Northeast at this time, and even by 1949 less than ten percent of households owned a television set), the show did not conform to several classic sitcom tropes: the stars, real-life married couple Mary Kay and Johnny Stearns, who played themselves (a common conceit in old-time radio sitcoms), shared a bed onscreen; Mary Kay’s character also became the subject of television’s first pregnancy when she herself became pregnant, delivering her son Christopher in December, 1948, again beating That Wacky Redhead and Desi IV by a number of years. However, IOTL, no episodes survive for public dissemination, though a handful of the original kinescopes have been archived.

    [3] In essence, the main problem was that the VHF band of frequencies needed to be re-defined so as to avoid broadcast signals from different stations interfering with each other. There were (and are), in theory, twelve channels available in the VHF band: channels 2-13. Channel 1, famously missing from the dials of American television sets, was a casualty of the constant shifting in frequency designations during the 1940s. After all, commercial television broadcast signals had to compete for frequencies with commercial radio broadcast signals, alongside a whole host of other broadcast applications for public and private use. As a result, most markets would eventually host stations on only three of the twelve allotted VHF channels. Affiliates of NBC and CBS would invariably occupy two of those three slots, leaving affiliates of ABC and DuMont, alongside independent stations, to compete for their one and only chance to be seen on a VHF station in a given market.

    [4] Recall that, initially, the Echo Boom referred only to the spike in birth rates during the 1970-74 period, though it was later conflated with the entire post-Baby Boom generation, the one we refer to IOTL as “Generation X”, lasting from the early-1960s to the early-1980s.

    [5] IOTL, of course, Barry Diller left ABC to join Paramount Pictures as Chairman and CEO; however, there’s no room for him there ITTL, so he goes to United Artists instead.

    [6] Among those in attendance at the game is Prime Minister Stanfield himself, and the cameras dwell heavily on his presence. Many, particularly those working for the CBC and CTV/TVA, accuse Stanfield’s government of leaning on the CFL to award broadcast rights for Schooners games to CIHF-TV, and they are correct, though naturally this won’t be proven for some time.

    [7] In Canada, the rule is actually one station per language per market, although this rule is only applied de facto to bilingual areas (primarily Montreal). In the United States, especially in an era long before the rise of the Spanish-language networks, a language clause is effectively meaningless.

    [8] The Fox Broadcasting Company, which was (of course) the fourth network IOTL (similarly headed by Diller, who left Paramount to join forces with 20th Century Fox when that studio’s new owner, Rupert Murdoch, evinced a similar willingness to throw money at him to the one which Asper demonstrates ITTL), premiered IOTL with a talk show as well, that being The Late Show Starring Joan Rivers, which premiered on October 9, 1986 (a Thursday). Rivers, who left her berth as permanent guest host on The Tonight Show to accept the gig (as Letterman did ITTL, albeit without consulting Carson, which resulted in a lifelong estrangement between them), did not last long – she was fired in May, 1987, though the show continued with a rotating lineup of hosts (including one Arsenio Hall) until it was cancelled in 1988. FOX, as a network, has duly retconned their broadcast history into beginning on April 5, 1987 (a Sunday), with the debut of their primetime lineup, starting with the far more fondly remembered Married… with Children, which ran for 11 seasons and is still considered one of the network’s most iconic shows.

    [9] Letterman left Tonight IOTL to host a morning show for NBC in 1980. It was swiftly cancelled, but the network was eager to hold onto Letterman and cancelled Tom Snyder’s Tomorrow to give him the post-Carson timeslot in a bid to keep him onboard (note that both ABC and CBS had vacant late-night timeslots at this point IOTL, Cavett having retreated to PBS and Griffin having moved to first-run syndication). ITTL, on the other hand, different management at NBC (Fred Silverman is still at ABC) does not approve of the notion of a comedic morning talk show, and thus Letterman remains at Tonight, the “permanent guest host” position being created slightly earlier than IOTL (Joan Rivers having been formally appointed as such in 1983). Even IOTL, Letterman guest hosted over 50 times, mostly between 1980 and 1981, before his own late-night show began taping in New York City in 1982.

    [10] The Merv Griffin Show had run since 1962, though intermittently; it was off the air entirely for over two years, from March, 1963 to May, 1965. IOTL, CBS cancelled the show in 1972 and it moved to first-run syndication, where it was produced by, intriguingly enough, Metromedia; ITTL, Merv Griffin does just well enough to remain on CBS, where he perennially ranks second to Carson. As Metromedia was sold to 20th Century Fox in 1986 IOTL, The Merv Griffin Show was cancelled to make way for, yes, an in-house talk show, which did not last. ITTL, Griffin soldiers on but decides to retire after 25 years – but CBS can’t poach Letterman because, unlike Carson’s, Griffin’s contract is renewed on a year-to-year basis, so Letterman was already signed, sealed, and delivered to PGTV. (No doubt Letterman’s emergence as a new competitor played some role in gently encouraging Griffin, 60 years old in 1985 and already plenty busy with his game shows and other endeavours, to retire.)

    ---

    This update was co-written with Dan1988, special thanks to him for his contributions, some of which go back for years! :eek: Thanks also to e of pi for assisting with the editing, as usual, and to Electric Monk for his helpful advice. If there’s one important thing I’ve learned in the writing of this update, it’s that you can’t form a television network all by yourself…
     
    Castles in the Sky
  • Castles in the Sky

    When John Glenn was elected 39th President of the United States, space enthusiasts – including veteran, long-dormant Moonie Loonies – were thrilled at the very thought of what his incoming administration might mean for the space program. NASA had lost a great deal of the lustre (and the funding) that the organization had enjoyed in its heyday, and proponents naturally assumed that the prestige of an astronaut president, coupled with the funding blitz of his Invest in America program, would inevitably result in a grand return to NASA’s glory days. This was despite the fact that Glenn, belying his background, had mentioned the space program surprisingly little on the campaign trail in 1980, and whenever he did it was almost always in response to a direct question from reporters or concerned citizens. It could honestly be said that Glenn’s opponents mentioned the space program far more often than he did, although the Republicans largely ceased to do so after their attack ads on the subject backfired. Glenn did briefly mention the space program in his first inaugural speech, and it figured into his plans for the Invest in America initiative, though very much as a longer-term, back-burner project, well behind his other key objectives such as transportation infrastructure and revitalizing the manufacturing sector.

    As it happened, President Glenn had very different plans for the future of space exploration than what had been the paradigm of the 1960s. Early into his administration, he appointed fellow astronaut James McDivitt (a retired USAF Brigadier General and veteran of both the Gemini and Apollo programs) as NASA Administrator, and the two shared a common vision for the future of space exploration. The Space Race of the 1960s had taken place against the backdrop of the height of the Cold War, making it yet another proxy conflict between the two superpowers – albeit technological rather than martial in nature. However, relations with both Soviet Russia and Red China had thawed considerably by the early years of the Glenn Administration. Furthermore, NASA and the Soviet space program no longer held an effective duopoly on space exploration; the European Space Agency, the Commonwealth Space Agency, and the National Space Development Agency of Japan were all capable of launching substantial payloads into orbit at the dawn of the 1980s, with the Indian and Chinese space programs not too far behind this key developmental milestone. This informed Glenn and McDivitt’s decision to define NASA’s relationship with other space agencies not as one of competition, but of cooperation.

    The reasons for this paradigm shift were at least as much financial as they were ideological. At its peak in the mid-1960s, NASA had commanded over 4% of the annual federal budget expenditure, a rate which had dwindled to less than half that figure by the time the Apollo Program had ended in the mid-1970s. Under President Reagan, the number had declined further still until it was barely over 1% by the time Glenn took office in 1981. [1] Although Glenn could (and did) bolster that figure somewhat upon taking office, the return to 1960s-era funding levels for NASA was simply untenable. Thus, his initial grand plan – for a return to the Moon and for a permanent orbital space station – became an either-or proposition. Like most either-or propositions, it swiftly divided the minds at NASA (along with the agency’s very vocal base of fans and supporters) into two camps.

    Amongst the general public, there was no question of which option was more popular. Moonshot Lunacy had defined a generation – and many within that generation, then children, were now old enough to vote, with their wallets as well as their ballots. Lobbying organizations demanding a return to the Moon were well-funded and well-organized cogs in the Washington political machine, and their antics always got press. Many scientists and researchers pushed for a return to the Moon as well – the discovery of water ice by Apollo 20 had opened up a whole host of new possibilities for the lunar environment, as well as new technological applications. Some scientists, as they had been doing ever since 1974, decried NASA for turning its back on lunar exploration right on the cusp of a major breakthrough – comparing it to Newton watching the apple fall from the tree and then deciding to go back inside and mint more coinage.

    However, the Moon was still considered by a surprisingly large proportion of the scientific community – perhaps even the majority – to be something of a dead end. Even though there was water ice on the Moon, that still almost certainly did not suggest life (and implying that it did, though a popular notion with lay enthusiasts, was considered shoddy pseudoscience by the scientific community). [2] Water ice was considered far more useful for its technical applications (namely, being harvested to sustain a long-term facility, or to provide fuel for launch vehicles), which were probably decades away even if an immediate return to the Moon was in the cards. The Moon was also unpopular with a handful of space enthusiasts who preferred the exploration of new frontiers, leading them to adopt a “been there, done that” attitude to further lunar missions – some were particularly scathing in their remarks, calling lunar exploration a relic of a very different, far more tumultuous time, much like overseas combat deployments, campus unrest, race riots, and “those filthy hippies”. [3] Those who opposed increased funding for space exploration in general were particularly opposed to lunar missions, regarding them as pointless, chest-thumping exercises in patriotism which would cost taxpayers billions in wasted money. Orbital operations, at least, had proven economic worth, what with the vast (and growing) network of telecommunications satellites in geosynchronous orbit.

    Technicians at NASA, along with budget watchdogs who were not necessarily opposed to some (reasonable) spending on the space program (but were vigilant of overspending, befitting their roles as critics of the excesses of Invest in America), pointed out that it would be far less expensive to focus on a space station. Although Man had already been to the Moon, the means by which he had done so had been retired to museums and public displays, the facilities for creating additional means to do so had been dismantled or converted for other purposes, and the minds who had brought those means to bear were now retired, having sought their fortunes in private and public sector alike. Expectations for the next phase of lunar exploration were for NASA to build on the previous triumphs of the Apollo program: Moonie Loonies still clung to their fantasies of lunar colonies by the year 2000, and even the most modest in-house proposals called for much more elaborate and complex bases, possibly semi-permanent and thus reusable, which would play host to missions of much longer duration than even the later Apollo missions. These parameters would necessitate the design and construction of new modules from scratch, and thus both the startup costs and the lead-up time required for a new lunar program would be simply enormous.

    By contrast, the construction of new modules for a space station would claim direct iterative descent from what had come before, and more importantly, from what was still being built on a regular basis. NASA’s Marshall Spaceflight Center, based in Huntsville, Alabama, had extensive experience converting Saturn V rocket stages into modules for the Skylab stations, expertise which was easily adaptable to converting the successor Caelus launchers into modules for a successor station. Johnson Spaceflight Center, in Houston, Texas, was able to draw on its own history as mission control for the Space Shuttle, which had been designed from the outset with the ability to dock with the Skylab stations. As a result, a vast network of suppliers and contractors was already in place, ready, willing, and able to make the minor modifications to the existing (and active!) construction facilities needed to support a large next-generation space station. That one of the key states which stood to benefit from the choice of a space station over a lunar landing program was Alabama did not go without notice, another palpable indicator of the controversial “deal with the devil”... who, true to form, had his finger on the scales.

    Time was another consideration; even on the most aggressive timetables, it was not certain that Man would return to the Moon before the end of the 1980s (much less by the end of Glenn’s projected second term on January 20, 1989). Even the most conservative estimates for a space station, on the other hand, had it mostly complete and already operational by mid-decade; all Glenn would have to do to see it come to fruition was win re-election, and so he did.

    Most importantly, a permanent space station would not have to be a project whose costs NASA had to shoulder alone, which was where the new paradigm of cooperation came in. NASA could farm out the construction of entire modules (alongside other components) to the “lesser” space agencies, forcing them to bear those costs in exchange for becoming partners in the enterprise. Feelers were put out to ESA, the CSA, and NASDA, with all three agencies expressing interest in a collaborative endeavour. [4] This, more than anything else, would tip the scales in favour of a space station, as it made NASA’s limited resources stretch further than might have otherwise been the case. The combination of all these advantages meant that the space station was able to be far more spacious and lavishly constructed than would have been the case had it been a mere sideshow to a return to the Moon.

    Although the reasons for going ahead with a space station instead of a lunar landing were well established, the reasons for going ahead with a space station for its own sake were somewhat more nebulous. One strength of the previous competition-based paradigm at NASA was that nobody needed a reason to go to the Moon, other than to beat the other guy in getting there first. Although scientific research and experimentation was conducted on the lunar surface by the Apollo astronauts, and samples were returned for analysis by earthbound chemists and geologists [5] – which proved, among other things, that Moon rocks had the same composition as Earth rocks (supporting the shared origin theory), and also that terran plant life could survive in returned lunar soil samples (under Earth-like conditions) – these breakthroughs were considered a mere sideshow to the overarching goal: to establish, and then extend, a lead over the Soviets. Once the Soviets effectively abandoned their own lunar exploration plans in the early-1970s, the Apollo program’s days were numbered, despite the rise of the Moonie Loonies and their philosophy of “luna gratia lunaris” – the Moon for the Moon’s sake. [6] After all, the lunar mission – just like all of NASA up to that point – had been sustained by the spirit of competition.

    The spirit of cooperation which had become the new doctrine at NASA would inform the choice of name for the space station. Americentric names such as Freedom, Liberty, Independence, and Revolution were all rejected for fear of seeming exclusive of NASA’s partner agencies. [7] On the other hand, names which focused overtly on the aspect of cooperation – Peace, Brotherhood, Unity, and Concordia were among these [8] – were deemed insufficiently inspiring; a name was needed which would capture the majesty of space, and of the ambition and drive to innovate which the space station would represent. Peace and unity, after all, were earthbound concerns, not necessarily spacebound ones. It was in turning to the classic Greco-Roman wheelhouse that a worthy name was finally chosen: Olympia, for Mount Olympus, the seat of the Greek pantheon, and the highest peak in Greece. Olympia would represent the notion of cooperation (Olympus was shared by twelve deities, all with vastly different philosophies, powers, and interests) as well as the awesomeness of space, and its isolation from the people of Earth.

    The incoming Glenn administration found themselves in an even better position than they realized once it was decided to move forward with plans for a space station. The Humphrey administration, in its later years, had already approved studies for future space stations to follow Skylab – particularly for use as orbital platforms which would host the massive arrays collecting the solar energy which would then be transmitted down to Earth as microwave power. These plans continued under the Reagan administration, coming to an end only when it became apparent that such a means of power generation might not be economically feasible, by which time popular opinion was rapidly shifting against microwave power anyway. [9] However, these plans called for a station – or rather a network of stations – built on a much larger scale than the (comparatively) modest plans for Olympia, and thus it was almost trivially easy to cut them down and adapt them for use on the current project. This greatly reduced the time and money which needed to be funnelled into design and development, allowing NASA to proceed almost immediately to component construction.

    The basic building block of these studies had been massive modules, ten metres in diameter, fashioned from the tanks of the Saturn V (and later Caelus) second stages just as the Skylab stations were built from the smaller third stages, though at roughly four times the volume. [10] Although the original plans for the “superstations” of the 1970s had called for a great many of these modules to be cobbled together into a vast and intricate network, just one C-II rocket module would function as the core of Olympia. This module would house the life support and habitation systems necessary to sustain a much larger crew than Skylab from the outset of operations. In fact, its great size was such that it could not be launched fully outfitted – it would be far too heavy (weighing in at 200 tonnes, nearly twice as much as what Caelus was capable of throwing to low Earth orbit). As a result, an auxiliary module built from a C-III rocket (the same size as Skylab, 6.6 metres in diameter) would be launched as a cargo ferry before later seeing conversion into additional laboratory space for the intended crew manifest – sixteen to twenty astronauts, all told.

    These two core modules would be supplemented by three smaller laboratory modules, one to be built by each of the three “junior partners” to NASA. Massing only 20 tonnes each, these were still too large to be launched natively by the rockets built by the various space agencies; thus they would also be launched by American rockets, though they would dock with the station under their own power. The infrastructure supporting all of these modules would include massive, state-of-the-art arrays of solar panels – a legacy of the failed microwave power experiment – providing the energy needed to power the myriad of laboratories and facilities aboard the station. Once the station was largely in place, the Space Shuttle would handle standard crew and cargo transport duties, though the task was enough of a handful for the original four shuttles that more were ordered to keep supplies flowing, as at least one shuttle would always be docked in rotation at Olympia to function as a “lifeboat” in case of perilous circumstances.

    President Glenn thus formally unveiled his plans for a space station in the summer of 1981, slightly ahead of his proposal for that fiscal year’s budget, presenting it as an accessory to his broader Invest in America initiatives. Although he could not fund NASA by executive fiat – budgeting was the responsibility of Congress alone – he had enough allies in the Democratic-controlled House to pass a budget which called for a bump in NASA funding, which in turn would allow for the space station to get off the ground. The wheels were set in motion at the very instant that Congressional approval was secured in the autumn of 1981, but even though NASA was able to proceed from a relatively advanced starting position, there was little apparent progress for the first few years of development; everything that was happening was behind-the-scenes work, including the final design of Olympia, which would be built in stages. It was within this window that detailed discussions with ESA, the CSA, and NASDA first took place; it was decided that all three would have their modules attach to a single node which would function as the primary artery connecting all of the modules of the station, which was unimaginatively called “Node 1”. It was planned to launch already attached to the core module, which had internally become known as the Olympia Core Module, or OCM; reporters, who were generally less acronym-happy than NASA technicians, were happy calling it simply “the Core”, and the name stuck. Construction on most of these components began in earnest in 1983, after two years of planning, though it wouldn’t be until 1984, when proper structural assembly of these components began in Southern California, that the Core would resemble anything close to its final form. The actual launch of the Core – and thus, effectively, of the station in earnest, as it was designed to be operational even without the other modules – took place in the late summer of 1986, nearly five years after construction was first approved by Congress. The crew for the first mission aboard the Ur-Olympia, flying aboard the USS Enterprise, followed two weeks later, in early September, for a month-long stay.

    From the outset, NASA’s publicity regarding Olympia emphasized the scientific breakthroughs that would be possible on the station, which would be much larger than Skylab and thus capable of housing many more laboratories – and many more test subjects, or rather, astronauts. Advances in telecommunications technology, particularly with regards to broadcast satellites, combined with the relatively close proximity of the station’s planned low Earth orbit to the planetary surface, meant that live transmission and real-time two-way communication would not only be possible, but almost mandatory. The educational possibilities were enormous, with science students the world over able to directly observe the experiments which were to be conducted aboard Olympia. Among the laboratories on the drawing board was one which would study the effects of long-term exposure to the zero-gravity environment of outer space. Loss of bone density and muscle mass were already known to be direct effects of spaceflight of any duration, and testing potential remedies to this problem might also have use on Earth to treat any number of degenerative diseases. The human guinea pigs aboard the station would additionally be joined by other tourists - animal, vegetable, and mineral alike - as they would be subjected to a wide battery of tests measuring their adaptability to extreme and hostile environments, to provide a better understanding of just how insidious and tenacious life could be.

    However, all of these wondrous experiments would be contingent on the launch of the dedicated science modules, which lagged well behind the launch of the Core; the American Scientific Research Module, or SRM, built from the C-III rocket housing, was constructed in parallel with the Core but due to its lower construction priority would not be finished for several months thereafter; the launch was scheduled for 1987. The three modules belonging to the junior partners – ESA’s Jules Verne, the CSA’s Endeavour, and NASDA’s Kibo – were due to arrive later still; it was telling that the SRM, considered a mere accessory to an even larger main project by NASA, was still considerably larger than any of the three partner modules, all of which were the defining prestige projects of the 1980s for their respective agencies. As of the latest timetable projections in mid-1986, those modules were due toward the end of the decade in question. Ironically, despite the doctrine of cooperation, the three agencies were in fierce competition with each other over which module would be completed first, and which one would play host to more impressive facilities. As competition so often does, it drove innovation at ESA, the CSA, and NASDA, each of which was eager to join in a game of oneupmanship over their rivals to prove once and for all… which of them was the third-greatest space agency in the world.

    Even though the Core module would be all by its lonesome for the first year of operations, the crew of astronauts aboard the Ur-Olympia were still able to conduct a surprisingly large number of scientific experiments in their jury-rigged, temporary laboratories. However, the skeleton crew of six still had to prepare the station for future expansion – a much harder job than it would have been for the full crew complement of three to four times that number. Teamster work – loading and unloading the regular cargo shuttles to and from the Earth – comprised a surprisingly large proportion of their schedule as equipment was installed to fill up the Core, a job which was sped up but also made more difficult by the construction of a second run of two additional shuttles, both named for their intended roles as “messengers” to Olympia: Hermes and Iris. The two shuttles were ordered in 1981, as part of the same budget that authorized Olympia; they were ready to launch just in time to begin servicing the Core five years later. In addition to carrying cargo, the shuttles would also ferry crew, with the plan being to drop off the arriving complement and pick up the departing complement aboard the same shuttle, though – especially with the skeleton crews continuing to operate until such time as the station was more complete – there would be some overlap between the end of one mission and the beginning of the next, and a great deal of housekeeping work was able to be completed during these overlaps. Still, even a dozen astronauts was well short of the intended roster at any given moment once the station operated at full capacity. There was sadly very little opportunity for the assembled astronauts to wow earthbound audiences with their audacious experiments.

    Once it became clear during the initial mission that the astronauts would have little time to engage in publicity activities except during the brief “overlap” periods, an idea was hatched to take advantage of the high frequency of these handovers and of their relatively brief duration. Node 1, even in the station’s primitive state, still allowed multiple shuttles to dock with Olympia at once, and each crew complement was embedded with a single shuttle. [11] Each shuttle was designed to carry a full crew, not the skeleton crews currently being ferried until such time as the station was sufficiently complete so as to support them at capacity. This left a lot of empty seats on which sojourners – who arrived with the arriving shuttle and then departed with the departing shuttle, thus only remaining aboard the station for a few days – could hitch a ride. [12] These sojourners would not be expected to be trained in operating the station, given the extremely short durations of their stays, which even allowed for the possibility that they might be… civilians. NASA publicity saw this as a win-win – bring Olympia to the people by bringing one of the people to Olympia. One suggestion considered for Skylab – but ultimately shelved, for lack of time and space – was to have a civilian science teacher serve as mission crew and perform a lesson aboard the station. It was decided in the summer of 1986 that reviving the “Teacher in Space” initiative for Olympia would be the perfect fit. An internal memo from NASA Administrator McDivitt approving this initiative in principle was dated September 19, 1986 (a Friday), in the late morning, right before he left work for the weekend to catch up on his long game at the links.

    The Teacher in Space program was perhaps NASA’s most ambitious attempt to maintain public interest in Olympia, which had started strong but was beginning to wane by the mid-1980s. Olympia, commentators had frequently noted at the time of its selection over a return to the Moon, had the advantage of coming to fruition much more quickly and cheaply than the alternative, and indeed at least the American portions of Olympia were still on track to be completely assembled and operational by Election Day 1988, with comparatively minor delays and setbacks, especially relative to past projects. The “echo boom” generation was increasingly becoming known for a desire for instant gratification, wanting everything yesterday [13], and the unfortunate reality of space exploration was that nothing about it was instant or immediate. Video feeds from Olympia couldn’t capture the wonders of space when the crew being observed were concerning themselves with such mundane tasks as unloading cargo, although some easily amused sorts took delight in the physical acrobatics that accompanied the act of pulling an object – any object – out of a container.

    The astronauts made some efforts to entertain their earthbound viewers during their limited downtime, most notably when the lone non-American of the first Olympia mission – West German astronaut and long-distance runner Alfred Bäcker, from Darmstadt – ran the Berlin Marathon in real-time along the inner circumference of one of the unused floors of the station’s cavernous core, the camera set up in such a way as to mimic the famous master shot of a similar scene from 2001: A Space Odyssey. Speaking to the press afterwards, Bäcker noted his desire not to fall out of practice, and reminded viewers at home that his 42.2-kilometre run [14] was not nearly so taxing as it had been for his fellow competitors in Berlin, as the centripetal forces generated as he ran around the stationary module only produced gravity at roughly 0.3g – wholly dependent on his running speed, which averaged 14 kilometres per hour. This allowed him to finish the marathon in just over three hours – a personal best for him, but still almost an hour slower than the world record, and than the time posted by the winner of the Berlin Marathon on that particular September 14th (a Sunday). [15] The event was a big hit, popular the world over – particularly (obviously) in West Germany, where Bäcker earned the enduring nickname “Der Läufer”, or “the runner”. Although the running craze which had been popular in the last decade had faded somewhat by 1986, this ingenious bit of quick thinking and playing to the crowd by Bäcker would likely set a precedent for the missions ahead…

    Olympia.png

    Space Station Olympia shortly before the commencement of the first Olympia mission in early September, 1986. The Space Shuttle Enterprise is about to dock with the station.

    ---


    [1] 1.25% precisely, Doctor. Measured in your Earth units.

    [2] The idea of water ice on the Moon meaning life on the Moon (often extremophile bacteria among “serious” enthusiasts, with more fanciful proponents suggesting subterranean colonies peopled with complex and intelligent life-forms) was very popular in the later 1970s ITTL, becoming the “Face on Mars” of its day, advocated (among other places) on such shows as In Search Of… (not hosted by Leonard Nimoy ITTL), which never let facts get in the way of a good story. Even the otherwise-rigorous Cosmos used the water ice to advance an argument that alien life could be far more likely than first appearances suggest, earning Sagan one of his (very) few rebukes from the scientific community. (Simply put, the man had blinders on when it came to alien life. He’s even entertained the “ancient astronauts” theory with a straight face, when any remotely serious anthropologist would laugh any such proponent out of the room.)

    [3] Though it should go without saying that all four of those issues still exist in the present day ITTL, just not to the same extent, nor to the same intensity.

    [4] This marks the first time that an agency of the US government has ever formally cooperated with the CSA. Progress is possible!

    [5] This editor’s pedantic nature obliges him to point out that scientists who study lunar soil are properly selenologists, not geologists.

    [6] It should be noted, for the edification of any linguists who might be reading, that luna gratia lunaris is pure dog Latin, derived from the famous MGM motto ars gratia artis - art for art’s sake.

    [7] Freedom, of course, was the name chosen for the OTL equivalent of Olympia (before that project was folded into the International Space Station) but that name passed muster under the more… dogmatic Reagan administration. The Glenn administration is more conciliatory by comparison, and less adamant about choosing a name positively dripping in Americana. Of the other choices, France obviously likes the name Liberty and pushes hard for it, but nobody else – not even her partners in ESA – is nearly as enthusiastic.

    [8] Many of these names have their supporters amongst the various member states of the other agencies – in fact, said agencies often become divided over the issue. Within ESA, France (naturally) supports Brotherhood (or, in French, Fraternité), whereas the other states (particularly West Germany) prefer Unity. In the CSA, the UK is fond of Peace, but Canada prefers Concordia (a name her delegation suggested), as it is the name of a Canadian university formed in 1974 (IOTL and ITTL) through the merger of a Jesuit college and a preexisting secular university. This division prevents any of the “cooperative” names from gaining much traction, allowing the compromise choice of Olympia to emerge by consensus.

    [9] Before the dark times. Before The Greenpoint Dilemma.

    [10] The second stage of the Saturn V three-stage rocket, the S-II, evolved into the Caelus C-II ITTL, just as the third stage of the Saturn V, the S-IVB, evolved into the Caelus C-III. However – and this is definitely not the case for the Caelus first stage (C-I) – the two upper stages of the Caelus rocket are simply modernized updates of their Saturn predecessors, not unlike the incremental (and nigh-imperceptible) changes in car models from year to year. (It should also be noted that the C-II features more cupholders than the S-II.)

    [11] Ideally, up to three shuttles would be docked at the completed Olympia at any one time: the shuttle which would ferry the departing crew back to Earth; the shuttle which had ferried the arriving crew from Earth; and a third shuttle for cargo transport.

    [12] Another option considered was to embed these sojourners with the even more frequent Shuttle missions carrying supplies to the station, which generally flew with just a Commander and Pilot aboard, staying for a few days up to as long as a week.

    [13] The echo boomers being specifically known for wanting everything yesterday is, of course, a reference to the famous early-1980s pop song, “I Want It Yesterday”, performed by a young starlet who went on to become an icon of 1980s fashion and culture. You can probably imagine her OTL analogue, what she’s like and what kind of girl she is.

    [14] This would require approximately 1,343 revolutions about the station. Due to attitude control effects, Bäcker switched directions between clockwise and counterclockwise every few minutes at ground control’s request; he would always do this while obscured from the camera’s view.
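
    For the back-of-the-envelope inclined: the 0.3g, 1,343-lap, and three-hour figures all follow from the Core’s stated 10-metre diameter and Bäcker’s 14 km/h average pace. A quick, purely illustrative sketch (in Python, with the marathon distance taken as the standard 42.195 km) is below.

    ```python
    # Illustrative check of Bäcker's orbital marathon, using only the figures given in the text:
    # a 10 m Core diameter, a 14 km/h average pace, and the standard 42.195 km marathon distance.
    import math

    CORE_DIAMETER_M = 10.0   # Core module diameter
    PACE_KMH = 14.0          # Bäcker's stated average running speed
    MARATHON_KM = 42.195     # standard marathon distance
    G0 = 9.81                # standard gravity, m/s^2

    radius = CORE_DIAMETER_M / 2.0
    speed = PACE_KMH / 3.6   # convert km/h to m/s (~3.89 m/s)

    effective_g = speed ** 2 / radius / G0                    # centripetal acceleration a = v^2/r, in g
    laps = MARATHON_KM * 1000 / (math.pi * CORE_DIAMETER_M)   # laps around the inner circumference
    hours = MARATHON_KM / PACE_KMH                            # time at a constant pace

    print(f"Effective gravity: {effective_g:.2f} g")   # ~0.31 g
    print(f"Laps around the Core: {laps:.0f}")         # ~1,343
    print(f"Finishing time: {hours:.2f} hours")        # ~3.01 hours
    ```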

    [15] IOTL, the Berlin Marathon - one of the marquee marathon events the world over, though for obvious reasons the course only covered West Berlin until 1990 - began in 1974, and takes place on the last Sunday in September. ITTL, butterflies see to it that the event takes place on the second Sunday in September instead.

    ---

    This update was co-written with e of pi, who was also wholly responsible for the design of Space Station Olympia. Thanks also to Dan1988 for assisting with the editing, and to nixonshead for the dazzling render of Olympia as it appears at the conclusion of this TL! If only there were some way to see the finished station in all its glory… if only, if only…

    Also, would you look at that, I knocked out another update in just one week! Could I possibly match that pace and finish this timeline in less than a month? (Probably not.)


     
    Appendix B, Part XI: As the World Turns
  • Appendix B, Part XI: As the World Turns

    At the dawn of the twentieth century, the very height of British imperial power, a small but vocal and enthusiastic minority of the chattering classes at Westminster, led by the quixotic Liberal Unionist Joseph Chamberlain, were agitating for an evolution of the Empire from its traditional form into a cooperative Imperial Federation which would return legislators representing not only the UK but also her Dominions beyond the Seas - initially only the “civilized” pre-WWI White Dominions (Canada, Australia, New Zealand, Newfoundland, and South Africa), but eventually all of the territories spanning the length and breadth of the Empire. This did not happen before World War I, a conflict which would prove nearly as devastating to the relationship between Britain and her Dominions as it had been in terms of lives lost. From that point forward, the Empire - having reached its territorial zenith as the result of gains made in that war - began to drift apart, until it was forced apart after World War II, due to a variety of internal and external factors. An increasing number of Britons believed that their future lay across the Channel, with the continent that had for so long been the object of their vexations, as opposed to beyond the seas with their former countrymen.

    Certain political forces on the continent, however thrilled and grateful they might have been at the part which the British had played in their liberation from Nazi oppression, were not quite so willing to embrace them as partners in peacetime. Once bitten, twice shy, so the expression went, though in the case of the United Kingdom it took three rejections (two from French President Charles de Gaulle, and a third as the result of protracted negotiations between the EEC leadership and British PM Harold Wilson) before Perfidious Albion finally sought friends elsewhere, and even then, their decision to remain a part of the EFTA and seek closer trade relations with the Commonwealth Realms in the meantime was perceived on all sides as a stopgap measure. What the British government had not expected was that a number of key geopolitical realignments would be taking place in the 1970s which would have dramatic repercussions on the ultimate destinies of not only the UK, but all of the continental powers as well…

    With the collapse of the Backwards Bloc in the 1970s, its three European members (Spain, Portugal, and Greece) floated the idea of seeking membership in the common market, also known as the EEC, so as to have greater access to foreign goods, services, capital, and people. The three countries had already reduced trade barriers amongst themselves during the Backwards Bloc years, with Spain and Portugal in particular forging what had become known as the “Iberian Compact”, essentially a common market in miniature (albeit, and unlike the EEC, without erecting trade barriers with the rest of the world). As Portugal was already a member state of the looser EFTA, this in effect made Spain an indirect member of that bloc as well. As a result of the Iberian Sunrise, both countries agreed to pursue a joint destiny for themselves: they would jointly seek admittance to the EEC, or failing that, Portugal would lobby for Spain’s entry into the EFTA. At first the EEC easily seemed the more logical destination for the two Iberian monarchies, with a much greater upside, but gradually the EFTA came to look more and more attractive, as a result of the changes transforming both the EEC and the EFTA beginning in the 1970s, and into the 1980s.

    Both blocs first enlarged from their charter rosters in the 1970s: the EEC admitted Denmark in 1973; the EFTA admitted Iceland in 1970, Ireland at British behest in 1974 - as the Celtic republic had originally sought to join the EEC alongside the UK, but effectively could not do so without them - and Finland, an associate member of the EFTA since 1961, joined as a full member shortly thereafter; both moves were in response to the oil crisis and the Humphrey shock, amongst other economic uncertainties in the mid-1970s. This brought EFTA membership to ten states, though the UK remained the only economic Great Power within the bloc, with the world’s sixth-largest economy in 1975. [1]

    The EEC, by contrast, though it was willing to admit the very large British economy in 1973, eventually changed direction, choosing to focus on economic, social, and political integration of her existing members, the Inner Six plus Denmark. These included three of the world’s ten largest economies: West Germany (#4), France (#5), and Italy (#7). [2] Ironically, Denmark, which arguably stood the most to gain from stronger ties with her more prosperous partners in the EEC, would consistently prove a thorn in the side of integration, its population seeming to prefer that the EEC retain what had up until that point been its primary function as a common market, despite the lofty (and vague) ambitions for “ever closer union” that had been a part of the vision for the bloc since its inception. Many EEC bureaucrats would ruefully remark in the years to come that admitting Denmark in the first place was probably a mistake, but there was no going back now. EEC politicians and economists began to float the idea - which, in various forms, had dated back to the nineteenth century - of a common currency for all EEC states. This began in earnest with the development of the European Unit of Account in 1974, in direct response to the collapse of the Bretton Woods system. Politicians in the EEC were enthusiastic about monetary union, and were eager to make lemonade out of the lemons that the Humphrey shock had handed them, despite the fact that monetary union was made more, not less, difficult with the severe fluctuations brought about with the switch to a pure fiat system. However, when US President Reagan once again tied the value of the US dollar to the Gold Standard in 1977, policymakers within the EEC suddenly found themselves facing a golden opportunity of their own, and monetary union became the driving force of economic and political policy within the EEC from that point forward. Though the EEC continued to invite new applications for membership, expansion had become a definite sideshow to integration. Danish trepidation over having joined the European project, meanwhile, continued to rise.

    Spain, Portugal, and Greece all had reservations about joining a trade bloc where monetary union was on the cards - EEC bureaucrats had gone so far as to insert clauses into draft treaties with these countries making “eventual membership within a European Monetary Union and exclusive usage of its monetary unit for currency” a pre-condition of further negotiation. That gave all these countries pause, and in the end, they joined the EFTA - Spain in 1979, and Greece in 1980 - for the time being. Spain in particular was a boon to the EFTA as the world’s 11th-largest economy in 1980, more prosperous than any other EFTA country save the UK. Greece was no slouch either, with an economy comparable in size to existing members Portugal and Finland. The microstate of Liechtenstein, whose only borders were with two EFTA members (Switzerland and Austria), also joined the bloc at this time, giving the EFTA proper a membership of thirteen states.

    The UK was in a unique situation - as the EFTA, unlike the EEC, did not preclude independent trade agreements with states outside of the bloc, the UK retained trade reciprocity with many of her former colonies, a legacy of the old Imperial Preference system. Nowhere was this more to British benefit than in Australia - one of the world’s ten largest economies in 1975. The two countries had always shared very close cultural ties, although Australia’s population had diversified from its predominantly Anglo-Celtic ancestry after World War II (90% of Australians were of British and/or Irish heritage in 1947) to include settlers from elsewhere in Europe (most notably Italians) and, increasingly, Asians. In joining the EEC, the UK would be required to abandon her trade links with Australia, and British politicians were very much aware of this at the time, but gave it little thought, even though it would have devastated the Australian economy. It was only after the fact, once it was clear that Britain would not be joining the EEC, that commentators sought to make political capital out of existing trade links to the Commonwealth - “our brothers and sisters beyond the Seas”, who had fought in the same wars, spoke the same language, and shared the same culture - and how these would have been jeopardized by EEC membership. Gradually, this helped turn the tide of public opinion against joining the EEC, especially once the Commonwealth Trade Agreement began to take shape.

    The largest economy in the CTA other than the UK itself was Canada, the world’s eighth-largest economy in 1975. Had Canada remained under the leadership of Liberal PM Pierre Trudeau, who disdained Canada’s British heritage and did his best to de-emphasize or even eliminate it from the workings of the Canadian government (a trend kickstarted by his predecessor, Lester Pearson, in the 1960s), it would have been unlikely that Canada would have taken so dominant a role in the emergence of the “New Commonwealth”, as it was sometimes called, in the 1970s. However, Tory PM Robert Stanfield was from Nova Scotia, culturally far more British than Trudeau’s Francophone province of Quebec, and he recognized that the best way to reduce the influence of the American economy and culture over that of Canada was to find ballast to it - Trudeau had favoured Red China to this end, but Stanfield realized that the Commonwealth in general and the UK in particular would be much better partners. The Commonwealth Trade Agreement turned out - much like the European Coal and Steel Community of the 1950s had been for the EEC - to be the first step in something altogether grander and more all-encompassing than a mere trade treaty.

    Many economists and even some politicians in Canada favoured trade reciprocity with the United States, but others feared that America’s much stronger industrial base would cripple the Canadian manufacturing sector, putting hundreds of thousands - if not millions - of jobs at risk, losses which would easily outweigh the benefits of cheaper and more accessible goods and services. The UK and particularly Australia seemed a far more remote threat to Canadian manufacturing interests. Canada was particularly interested in including the various Caribbean island nations that were part of the Commonwealth in the CTA, as crops were grown there which were impractical in the colder Canadian climate - even after the Turks and Caicos had joined Canada as its third territory in 1981, the economy of that small island chain was based largely on tourism and (as increasing numbers of well-off Canadians made their homes there in the winter) the service industry. Canada also had a large Caribbean diaspora population who favoured closer economic ties with their homelands. Jamaica was the most populous Commonwealth Realm in the Caribbean, with nearly 2.2 million people in 1981; Trinidad and Tobago followed with a population of 1.1 million. No other Commonwealth Realm in the Caribbean had a population of over 500,000. Jamaica had a particularly important role in the formation of the CTA as the Commonwealth Heads of Government Meeting 1975 - the event considered the “birth” of the trade bloc - was held in its capital of Kingston. The host of the event, Jamaican Prime Minister Michael Manley, opposed protectionism and favoured closer trade links with the mature, much larger economies who formed the core of the CTA, and most other Commonwealth organizations. These were the “Big Three” of the UK, Canada, and Australia. In 1975, they were three of the world’s ten largest economies, providing their bloc with an important distinction which was, appropriately enough, shared with the EEC.

    As was the case with the EEC, the CTA evolved over time. Initially, the organization committed itself solely to the reduction of trade barriers, but there was also some discussion of possible regulatory functions for the distribution of goods across the “Commonwealth Market”, as commentators, particularly in the UK, came to refer to it. Free movement of people, an important pillar of the EEC, was also discussed, and this suggestion was met with the most enthusiastic response - the problem was that the Big Three were all eager to have their citizens freely live and work amongst their countries, but all feared the problem of opening their borders to the citizens of impoverished Caribbean countries. This was a barrier to a number of moves toward integration - the notion suited the Big Three, and perhaps a select few others, but only when it was limited to just them. As a result, the Big Three began to meet privately amongst themselves to discuss economic and political matters. The first meeting was held in 1977, shortly after the Commonwealth Heads of Government Meeting that year in London, and ostensibly to discuss the particulars of the Commonwealth Space Agency; however, the CTA and how it could serve as a springboard for further economic integration was also a topic of discussion.

    New Zealand, despite its much smaller population than the Big Three (just over 3 million in 1981; the UK had over 56 million, Canada had 25 million, and Australia had 15 million), was a similarly mature economy, usually shared similar objectives and goals with the Big Three, and stood to benefit similarly from possible expansion of the CTA into other competencies. However, it was excluded from the “brain trust” meetings for reasons of optics; if New Zealand, one of the old “White Dominions”, were excluded from the meetings, they would look less like a conspiracy of the oppressors against the oppressed of the former Empire. Many did indeed make this argument regardless; New Zealand’s exclusion merely served to annoy the Kiwis.

    What eventually became known as the Commonwealth Free Movement Area, or CFMA, emerged as the result of discussions beginning at the first formal Big Three Meeting in 1978, held shortly after Canadian PM Robert Stanfield won re-election, at his official retreat in Harrington Lake, in the Gatineau Hills across the Ottawa River from the nation’s capital. [3] All three PMs in attendance agreed that all citizens of all three countries should in principle have freedom to live and work in any of the three, which would supersede the existing paradigm of emigrants from (usually) the UK merely having the right of return. By 1978, all three countries had net immigration, and there was little fear of a flood of emigrants from one or two of them to the others as the result of such an arrangement. All three agreed that New Zealand should also become a charter member of the CFMA, but had strong reservations with regards to the Caribbean Commonwealth Realms; emigration from the region to all of the Big Three countries was already very high in the 1970s, and that was without full freedom of movement. It was, therefore, decided at that meeting that developing the Commonwealth into a more elaborate organization with further-reaching competencies should be a “layered” process: membership of the CTA should not automatically confer membership of the CFMA, but, by contrast, membership of the CFMA was contingent on existing membership of the CTA. Another mechanism agreed upon at this meeting was the right of any and all existing members of any Commonwealth organization to veto further enlargement. After the meeting concluded, New Zealand was informed of plans for the CFMA and showed an interest in becoming a member.

    However, it was during the early-1980s that plans for integration of the core Commonwealth Realms into a multi-layered, quasi-confederal organization had to clear some important ideological roadblocks. The Commonwealth of Nations could not be depicted as a true brotherhood of equals so long as the UK Parliament at Westminster retained the ability to amend the constitutions of other Commonwealth Realms at will, and without recourse on the part of those Realms. It was important to the UK - and to politicians in Canada, Australia, and New Zealand - that the New Commonwealth could not be compared to the Empire of old. To this end, the concept of patriation - the Commonwealth Realms, former Dominions, bringing the ability to amend their own constitutions home from Westminster - entered the public consciousness. Nowhere more than in Canada did this concept catch on with the chattering classes.

    PM Stanfield’s recently-deceased predecessor, John Diefenbaker, had in 1960 introduced what he considered the crowning legislation of his career, the Canadian Bill of Rights, the first codified human rights law in Canadian history. However, as a mere piece of federal legislation, it was no more sacrosanct than any other; it could easily be repealed by any subsequent government. Although this was an accepted fact under the Westminster system (and indeed, the UK Parliament had made parliamentary supremacy a cornerstone of the British political system), Canadians, who were influenced by Americans and their inviolable, supreme, and enduring Constitution - which was far more difficult to amend than by simply ramming a bill through Congress - were disquieted by this notion. The Quebec referendum of 1980 gave those who favoured constitutional reform a once-in-a-generation opportunity to forward their cause, and many seized it. The ensuing 1982 federal election (fought after the Quebec provincial election of 1981 had returned a federalist government to the National Assembly, ensuring their likely cooperation with plans for constitutional reform) was premised largely on differing visions for a Canadian constitution. The Liberals favoured vesting additional powers in the federal government at the expense of provincial governments, enshrining affirmative action and other redistributive programs, mandating official bilingualism, and vesting the power of judicial review upon the Supreme Court of Canada [4]; the Conservatives already had a “blueprint for a Constitution” in Diefenbaker’s Bill of Rights. Stanfield’s personal popularity, his record of solid stewardship, his willingness to work with politicians across the ideological spectrum, and his more developed constitutional platform (PC campaign materials included free copies of the Canadian Bill of Rights, and Tory attack ads focused on the vagueness of what a Liberal Constitution might look like) resulted in his third consecutive majority government, the first (and only) for a Tory PM since Sir John A. Macdonald himself in 1887.

    After the 1982 election, Stanfield immediately set to work drafting the Constitution. Ultimately, the process would take two years. The Canadian Bill of Rights, the centrepiece of the new constitution, was indeed modelled heavily on Diefenbaker’s 1960 legislation, though with some modifications. Official Bilingualism was codified, with the famous “Stanfield Compromise” (French-language services must be provided by the federal government in all regions where the French-language population exceeds the national average) formally enshrined:

    “The official languages of Canada are English and French. The federal government must provide services to its citizens in at least one of these languages, whichever is the language spoken by the larger proportion of native speakers in a given census division. In census divisions where the proportion of native speakers of one of these two languages is in minority, but exceeds the national average, services must also be provided for speakers of this language. Services provided by all of the provinces for their citizens must meet these same criteria, with the exception that the proportion of native speakers of the minority language must exceed the provincial average. If the provinces lack the capacity to provide minority language services for their citizens, the appropriate resources will be made available by the federal government, the other provinces, or private enterprise, where appropriate. Separate funds or surcharges may be raised by these governments to provide for these services if necessary.” [5]

    This provision provided for the rights of French-speakers throughout Canada, and for those of English-speakers in Quebec, albeit only within select regions. The final clauses were inserted at the insistence of Quebec’s Premier, who was only willing to agree to provide English-language services for Anglo-Quebecers if the provincial government did not have to pay for them, or if these Anglo-Quebecers were willing to pay what quickly became known as the “Anglo tax”, which passed in 1985, after he had won re-election. The Supreme Court interpreted the constitution as meaning that the tax could only be used to fund the provision of minority-language services, which alarmed many activists who feared that this was promoting a form of segregation (and indeed, the equality section of the Bill of Rights had to be revised to specifically permit this form of discrimination, again at the behest of the Quebec delegation).

    One of Stanfield’s more minor alterations, which (in typical Canadian fashion) was the most widely reported and the most widely disputed despite its complete lack of relevance to the everyday lives of the Canadian people, was a name change for the nation-state. The name Dominion of Canada naturally implied that it was a British Dominion, and indeed the group of former colonies granted home rule in the late-19th and early-20th centuries (Canada, Australia, South Africa, New Zealand, and Newfoundland) were formally known as the “British Dominions Beyond the Seas”. Therefore, Stanfield decided to give Canada the title it had originally sought at Confederation in 1867: Kingdom of Canada, a title which was ultimately rejected at that time for fear of provoking the United States. By 1984, the United States, though still a staunchly republican nation, was secure enough in its own station that it was nothing more than mildly bemused at Canada deciding to call itself a Kingdom. After all, it was a monarchy, and Elizabeth II had explicitly reigned as Queen of Canada since her coronation in 1953. Stanfield presented Canada’s formal name change to Kingdom of Canada as a reflection of Canada’s status as a mature, fully independent constitutional monarchy. This was met with opposition among republicans - predominantly Québécois - and indeed, the Quebec Premier had originally opposed this name change (the draft constitution referred to the Kingdom of Canada in long form, though “hereafter Canada” in most of the document), but Stanfield insisted on it in exchange for several other concessions. Monarchists, needless to say, were delighted, and some even proposed creating a parallel Canadian peerage, though only granting such titles to the Royal Family (for example, something along the lines of creating Charles, Prince of Wales, as Prince of Ontario, and referring to him in that context as heir apparent to the Canadian throne); however, this was never seriously entertained by the government. The name of the national holiday, Dominion Day, was changed to Canada Day, effective July 1, 1984 (a Sunday).

    The Kingdom of Canada Act 1984 passed through the British Parliament in that year, which among its other provisions formally relinquished any further right by that Parliament to amend or otherwise alter the Canadian constitution. Two other acts, the Commonwealth of Australia Act 1984 and the Kingdom of New Zealand Act 1984, passed immediately thereafter, in sequence; Australia, being formally titled a Commonwealth instead of a Dominion, declined to change its name to Kingdom as Canada and New Zealand had done. Stanfield’s crowning legislative achievement finally having come to pass, he retired from politics at the age of 70, after 12 years as Prime Minister (pending the selection of his replacement), leaving a wide-open leadership race to choose his successor. The frontrunner from the very beginning was the young, charismatic, and urbane Finance Minister, Brian Mulroney. Mulroney was in many ways Stanfield’s opposite: he was a member of the “Blue Tory” wing of the party, which represented Toronto and Montreal business elites, as opposed to the “One Nation” Red Tory wing. Stanfield was, especially by the standards of a politician, straightforward and genuine, whereas Mulroney was very slick and polished. However, that sheen - and the massive financial edge he had over his rivals with the support of those business elites - enabled him to win the PC leadership convention, and with it, the office of Prime Minister. Mulroney was a strong advocate of free trade, having been a staunch supporter of not only the CTA but also the CFMA, and he also spoke frequently of increased trade reciprocity with the United States, although there was a great deal more resistance to lowering trade barriers with an overbearing southern neighbour than there was to lowering them with more distant countries whose exports were far less of a threat to Canadian farmers and manufacturers.

    It was early in Mulroney’s tenure as PM that the CFMA finally came into force. The four charter members were the UK, Canada, Australia, and New Zealand. All member states of the CFMA would retain full border controls, but could not refuse to admit a citizen of any other CFMA member state to reside and work in their own state for an indefinite duration - in effect, full freedom of movement between the four countries. There was some discussion of broadening the CFMA’s powers further - for example, allowing citizens of all member states to vote in local elections, a right already enjoyed in most capacities by Commonwealth citizens in the UK - but this would come later. For the time being, the CFMA had already enshrined itself as the “Core” of the Commonwealth. In essence, it allowed New Zealand to join the Big Three without actually enlarging the Big Three - indeed, once the CFMA came into existence, formal Big Three meetings ceased altogether, as further discussions could continue within the context of CFMA policy meetings instead. Enlargement of the CFMA was an issue which would face considerable debate in the years to come: most of the Commonwealth Realms, and even many Commonwealth republics, were for obvious reasons very eager to join the CFMA - India, in particular, despite having spent the last several decades drifting away from the UK with regards to foreign policy, showed considerable interest in joining both the CTA and the CFMA. India had a massive population base - on track to overtake China in the coming decades, given its higher birthrate - and a seemingly endless supply of cheap labour. For this reason alone, the CFMA states were tremendously wary of admitting India to either organization, even though business interests in the CFMA and the CTA at large were in favour.

    The United Kingdom, the largest economy and - despite all of the political and constitutional changes throughout the Commonwealth in the 1970s and 1980s - still the effective head of the association, had the benefit of also being part of the EFTA, as it had been since 1960. The UK, uniquely, represented the intersection between these two treaty zones, being the only state which was a member of both. The resulting free trade area to which the United Kingdom had exclusive access was 75% the size of the archrival EEC, roughly half that of the United States, and approximately the same size as that of Japan, the world’s third-largest economy. Unsurprisingly, many of the countries in the two trade blocs flanking the UK wanted access to that overlapping free trade zone, although there was surprising resistance in certain corners, particularly the Republic of Ireland.

    Many Irish - particularly those who retained their ancestral disdain for their one-time imperialist oppressors - continued to resent the UK for effectively blocking their entry into the EEC. The Irish national identity was built as much on what it was not - British - as on what it was in and of itself, a trait it shared with another Anglosphere country living alongside a much larger and more overwhelming neighbour: Canada. Ireland had become a republic in 1949, at which time the British Commonwealth and the Commonwealth Realms (those which recognized the British sovereign as their head of state) were synonymous, resulting in its expulsion from the organization; ironically, this would be the impetus toward relaxing these rules, allowing the modern Commonwealth of Nations to become an organization of states with a common cultural heritage, and not necessarily a common head of state. Nevertheless, Ireland had never rejoined the Commonwealth despite this change in membership criteria, and being a member state of the Commonwealth was a precondition to joining the Commonwealth Trade Agreement (indeed, all existing members were Commonwealth Realms). Ireland was nonetheless in many regards - cultural, ethnic, historical - far more similar to the member states of the CTA than were most of the Commonwealth’s actual member states. Irish politicians put out feelers towards entering into some kind of trade agreement with the CTA as early as the late-1970s, but the mere suggestion of rejoining the Commonwealth in order to do so would be touching a third rail in Irish politics. The eventual solution was a set of bilateral treaties between the Republic of Ireland and every member state of the CTA, which had the side benefit of proving to the Irish the benefits of being part of a looser, less restrictive trade association. Had Ireland joined the EEC, it would have been unable to conclude a trade agreement with the CTA on its own - now it stood alongside the UK at the CTA-EFTA intersection. Ireland then signed another bilateral treaty, this time with the CFMA, allowing its citizens to live and work anywhere within not only the United Kingdom, but also Canada, Australia, and New Zealand, and vice-versa. This set of bilateral treaties enabled both sides to have their cake and eat it too: Ireland became a de facto member of the “core” Commonwealth without formally re-joining the organization, whereas the CTA and CFMA membership did not have to worry about having “snubbed” countries within the Commonwealth by passing them over to admit Ireland, as Ireland hadn’t technically “jumped the queue” and been admitted to either organization.

    Nevertheless, the CTA and the CFMA became campaign issues in the elections held in the two largest economies belonging to that bloc: Canada and the United Kingdom. Mulroney’s government had enjoyed a bump in the polls following his taking office in early 1985, and as he was a new PM, there was some pressure for him to win a mandate of his own rather than coast on Stanfield’s 1982 mandate, even though the Canadian constitution allowed him to stay in office without calling an election until 1987. To this end, he called an election in the spring of 1986. John Turner, who had been Leader of the Opposition since 1975, retired after losing the “Constitutional Election” in 1982, becoming the first Liberal leader in a century to have never served as Prime Minister. [6] In the leadership convention that followed, which maintained the Liberal Party tradition of alternating Anglophone and Francophone leaders, Jean Chretien, the MP for Saint-Maurice, Quebec, was chosen as leader. [7] He benefitted from his contrasts with Mulroney, and indeed with Turner; he had humble origins and a folksy image (embodied in his nickname, “the little guy from Shawinigan”), and enjoyed tremendous personal popularity. Although he had played only a relatively minor role in Cabinet in Trudeau’s government (as Minister of Indian Affairs), he had become one of the leading figures in the Shadow Cabinet, also playing a key role in the “Constitutional Election” of 1982, where his passionate, if inarticulate, vision for Canada’s future earned him plaudits (his childhood bout with Bell’s palsy, as comedians famously joked, rendered him “unable to speak in either of Canada’s official languages”). [8] From then on, his future leadership of the party seemed inevitable. Slightly older than Mulroney, he was 52 when the election campaign began. The two smaller parties, the NDP and Social Credit, retained at the helm the men who had led them into the 1978 and 1982 campaigns, Lorne Nystrom and Andre-Gilles Fortin, respectively.

    Still, Chretien couldn’t compete with the very strong economy and Mulroney’s charisma - Mulroney was also a fierce debater, demolishing Chretien in both the English- and French-language debates. As a result, Mulroney won the election easily; most of the new seats created through redistricting after the 1981 census went PC, allowing his party to increase its majority in the House of Commons without actually poaching a great number of ridings from the other parties. Nonetheless, Mulroney’s popularity with Quebec voters enabled his party to do better in Quebec than Stanfield had ever done - Chretien himself came perilously close to losing his seat, surviving only because he was able to come up the middle between the Tory and Socred candidates in his riding. The Anglo elite in Montreal completed their decisive shift towards the PCs in this election, the Tories sweeping the West Island (which included Mulroney’s riding of Dorval), and - in a definite wound to Liberal pride - taking Trudeau’s old riding of Mount Royal, which the Tories had not won in over half a century.

    The UK, on the other hand, was a very different story. The Tories at Westminster had been in power since 1974 - nearly as long as the Tories in Ottawa, who had formed government since 1972 - but unlike their Canadian brethren, the UK Conservatives went into the election with the same old leader: Willie Whitelaw. Although 1986 was a rematch of 1982 - Leader of the Opposition David Owen having held on to the Labour leadership despite the left-wing schism in his party - Owen still managed to convey the impression of being a fresh face, and an antidote to the staleness in the upper echelons of British politics. The British economy had been slower to recover from the late-1970s recession than Canada or much of the rest of the world, despite her ambitious trading and migratory agreements. Whitelaw remained personally popular, but his party machine wasn’t so fortunate.

    A number of Cabinet reshuffles would eventually result in John Major, an MP since 1974, being named to one of the Great Offices of State as Chancellor of the Exchequer. [9] His relative youth despite his prominence in Government (he was aged just 40 when he became Chancellor), coupled with his reputation for competence and for being a dull, steady pair of hands, meant that he was being groomed as Whitelaw’s successor by the power brokers at the Conservative Central Office almost immediately. Whitelaw had been PM for over a decade by this point; his tenure had surpassed that of his immediate predecessor, Harold Wilson, in late 1983, making him the longest-serving Prime Minister of the twentieth century. [10] However, Major’s reputation for extreme dullness (as the son of a circus performer, he was often said to have been the only child who ran away from the circus to become an accountant) was completely shattered by the revelation of a shocking affair with a Tory backbench MP, Edwina Currie; both parties were married with children. Overnight, this destroyed Major’s prospects for moving next door to No. 10 Downing Street from No. 11, although in the end he did vacate No. 11, as he resigned his position; neither Major nor Currie would seek re-election in 1986.

    The Major-Currie sex scandal was merely the culmination of a number of monocle-popping incidents which shook public faith in the Conservatives - the party which had always placed greater stock in family values and in upholding the social contract. The British Board of Film Censors (BBFC) found themselves mired in controversy when one of the films they had certified for release was discovered to contain several frames of what appeared to be an erect penis in one scene, in clear violation of the informal (and unacknowledged) “Mull of Kintyre rule”. [11] This was brought to the general public’s attention through the relentless campaigning of social activist Mary Whitehouse, known for her staunch conservatism. There was considerable controversy over whether the object (for lack of a better word) in question actually was an erect penis - it was depicted only in silhouette, out of focus, and not in such detail as to settle all doubt. Some commentators suggested that the offending object could merely be something phallic in shape, such as a candlestick; however, critics dismissed this possibility as far-fetched, and indeed, polls showed that many who were shown the offending images did believe that they depicted an erection. Whitehouse’s influence with Conservative voters could not be overstated: her lobbying was instrumental in the passage of the Protection of Children Act 1978, which had banned child pornography. [12] As a result, between the BBFC fiasco and the Major Affair, the Tories fell far enough in the polls that Labour consistently placed ahead of them.

    Once it became clear that Owen could win the next election, what support remained for the splinter Democratic Socialist Party continued to evaporate - especially as that party (as is so often the case for parties on the far-left) began to fragment in turn. The Labour Party thus won a small but workable majority in the general election of 1986, with the Tories remaining a fairly robust opposition; Whitelaw resigned as leader of the party shortly thereafter, his successor due to be chosen in the autumn. The Liberal Party, in the end, performed about as well in 1986 as they had in 1982; the collapse of the DSP saw Labour regaining many of their left-wing strongholds, but the Liberals also gained seats at the expense of the Tories, in constituencies where the electorate was not demographically predisposed to vote Labour, but where outrage at the perceived immorality of the Conservative Party proved sufficient to turf the sitting MP in protest. David Owen thus became the next Prime Minister of the United Kingdom; unlike the last changeover between Labour and the Tories, this one would see the new government pursuing a foreign policy consistent with that of the old one. Owen was eager to strengthen trade links with the CTA and the EFTA, even floating the idea of a fully amalgamated CTA-EFTA “super-bloc”, with London serving as its financial and economic centre. However, despite having once been a proponent of joining the EEC, he did not favour the direction that integration was now taking in that bloc, and whereas the Whitelaw government had continued to pay lip-service to the notion of someday re-opening negotiations to join, Owen formally dropped this pretence, making clear in a speech on a state visit to Paris - in the presence of the President of France, no less - that “the economic and political future of the UK lies firmly outside of the EEC”.

    Despite their divergent destinies, both blocs seemed to be evolving in lockstep with each other. As the CFMA came into force, so too did the European Currency Unit, or the ECU (₠) - popularly spelled and pronounced écu, particularly in French, as it shared its name with several French coins. Only member states of the EEC were allowed to mint écu coins and print écu banknotes, and not even all of these chose to do so; Denmark had negotiated an exemption for itself, and continued to use the Danish krone, albeit at an exchange rate pegged to the écu. [13] The EEC quickly negotiated agreements with the microstates of Monaco, San Marino, and the Vatican City, allowing each of them to mint and print their own écus and use them as legal tender - these countries were far too small to ever meet the criteria for EEC admission under normal circumstances, and thus special ones were deemed to apply. The Glenn Administration in the United States had informed the EEC of its plans to once again take the US dollar off the gold standard and convert it to a pure fiat currency, which effectively gave the écu a deadline of January 1, 1982, to come into force before renewed monetary turbulence closed the window of opportunity. Later that year, the “Glenn shock” once again destabilized world currencies, but the EEC countries stuck it out.

    The surprising success of the single currency inspired those who dreamed of a United Europe to push for further integration, or “more Europe”, as the notion was sometimes described. This included a central bank - which became the primary objective, so as to better organize and control the new currency - and a common defence policy. France pushed hard for the common defence policy, having left the NATO command structure and wanting to head a purely European military alliance. However, all of the other member states of the EEC - including West Germany - were already NATO members, and there were concerns that an additional military and defensive alliance between them would be superfluous.

    As it happened, however, co-operation and co-ordination with regards to military technology often took place beyond the borders and auspices of the EEC or any similar supranational organization. The greatest example of this was the ongoing relationship between Britain and France, dating back to the Entente Cordiale of 1904, eighty years earlier. The two used their combined influence and might to force through a carrier-friendly Eurofighter Typhoon, which would eventually serve as the primary fixed-wing combat aircraft for many of Western Europe’s (land-based) air forces, in addition to the air arms of the Royal Navy and the Marine Nationale.

    The 1980s were also a productive time for aircraft carriers. The Invincible-class light carriers (Invincible, Illustrious, and Indomitable) were commissioned, one after the other, in this decade; all three were in active service by 1986. However, these shiny new carriers, though they were by this point the only carriers serving in the Royal Navy, were considered a mere appetizer to the main course which was yet to follow: the two Entente-class supercarriers.

    Truly these ships were worthy of the term supercarrier - they would be larger than any other carrier ever built by any navy other than that of the United States, although this distinction - which was originally unique to the Entente-class - would be shared with the first Soviet supercarrier, also under construction, originally named Riga but renamed for the recently deceased Soviet leader Leonid Brezhnev when she was launched in 1985 (the General Secretary having died in 1982, after her keel had been laid down). [14] The two designs had very similar dimensions in all respects: displacement at maximum load (roughly 60,000 metric tonnes), length overall (about 300 metres), beam (about 75 metres overall, and half that at the waterline), and draught (over ten metres). However, the Entente-class carriers had two decisive advantages over the Leonid Brezhnev: their nuclear propulsion (necessary for Britain and France, both of whom maintained far-flung colonial possessions, as opposed to the Soviet Union, which effectively had no overseas territories), and their flight decks being fitted with catapults, which allowed them to launch heavier, more conventional aircraft; the Leonid Brezhnev was limited to a cheaper, lighter ski-jump design. The nuclear propulsion of the Entente-class vessels was truly worth noting: it marked a first for any aircraft carrier in any navy other than that of the United States. Construction would probably take close to a decade, but naval enthusiasts and nationalists alike were thrilled: the Entente-class carriers were truly great ships fit for Great Powers, capable of meaningful power projection.

    The UK was far from the only Commonwealth Realm to enlarge her fleet in the 1980s. The two Commonwealth-subclass vessels built by the United States for Canada and Australia were completed in 1986. In Canada, one of Brian Mulroney’s first acts as PM was to change the planned name of the new carrier from Diefenbaker - which remained controversial - to Macdonald, after Canada’s first Prime Minister and Father of Confederation; Macdonald, like Mulroney, had been a Tory, but unlike other long-serving Tory PMs such as Borden, Bennett, and Diefenbaker (and apart from the still-popular - and still-living - Stanfield), Macdonald retained a mostly positive legacy and continued to be widely liked by Canadians. HMCS John A. Macdonald, as she was properly known (though she quickly acquired the informal nickname “the John A.”), was commissioned with great fanfare in the summer of 1986, departing from CFB Halifax for a tour of the Arctic. Her motto - officially in Latin, and a variant of the national motto - translated into English as “from sea to sea to sea”, emphasizing her role in protecting Canada’s coastline along the three oceans that it bordered: the Atlantic, the Pacific, and the Arctic.

    Australia received HMAS Australia shortly after the Macdonald had arrived in Halifax; the two sister ships had set sail simultaneously from Pascagoula, Mississippi, where they had been built, but the voyage to Sydney from the US Gulf Coast was a much longer one, though fortunately the ships were, by design, capable of traversing the Panama Canal. Naturally, it was winter in the Southern Hemisphere when the Australia arrived, and thus the Australian government could not send Australia on the equivalent prestige mission (to the Australian territorial claim on the continent of Antarctica) until the new year, after enough of the relentless ice floes had melted away. Instead, Australia went on a tour of every major city as she circumnavigated the country - and the continent - for which she was named. She met with enthusiasm wherever she went - it helped that she had the undivided attention of naval enthusiasts Down Under, as her predecessor, HMAS Melbourne, had been retired in 1982.

    Outside the Commonwealth, the two Iberia-class carriers built by the Spanish shipbuilding firm Bazan for the two Iberian countries - Principe de Asturias for Spain and Infante D. Henrique for Portugal - had both been launched by 1986, built one after another on the same slip. They were still being fitted out, but would be serving their respective navies in active roles before the end of the decade. Spain - which had lost her last far-flung overseas possessions with the coming of democracy as a result of the Iberian Sunrise in the 1970s - planned for the maiden voyage of the Principe de Asturias to be a modest Australian-style coast-to-coast tour, followed by a sojourn to the Canaries. Portugal, on the other hand, planned to send the Infante D. Henrique (already referred to by her inevitable nickname O Navegador - the Navigator) on an ambitious tour of all her far-flung insular possessions, from the Azores through the Panama Canal to Macau and Portuguese Timor, and back again - albeit by continuing onward rather than retracing her route, allowing for a circumnavigation of the globe, which was after all a long-established tradition of Portuguese mariners. [15] Given the long stretches of ocean that such a tour would entail, O Navegador would have to make several detours at friendly ports along the way, including Pearl Harbor in Hawaii and Perth in Australia. Controversially, she would also have to stop in South Africa, a state with which Portugal continued to maintain good relations despite the increasingly tight economic sanctions being imposed upon the apartheid regime by the rest of the free world.

    The free world was changing in many ways, and along trajectories which had been completely unanticipated a quarter-century earlier, at the height of the Cold War. The United Kingdom had shifted from a tentative Continental orientation back to the overseas orientation which had defined the British Empire, but the New Commonwealth was a very different beast, one which promoted cooperation and consensus-building between equal partners. The EEC had not been able to enlarge itself to consume all of Europe, despite multiple attempts, but its focus on integration was showing great promise for the future. Britain and France, despite being leading members of opposing economic blocs, were able to continue working together on projects which were in their mutual interest, the fruits of an alliance which had lasted for over 80 years and showed no signs of ending anytime soon. Canada, which had spent so many years distancing itself from its Imperial heritage, now embraced its status as a core member nation of the Commonwealth. Canada and Australia continued to enjoy a very good working relationship with the United States despite their increasing ties with the United Kingdom. Spain, Portugal, and Greece were all taking important steps toward democracy and economic diversification despite having been charter members of the authoritarian Backwards Bloc in the not-too-distant past. Long-term plans, ironically, had a funny, funny way of changing course in an instant, and it behooved the leaders of world governments to maintain the flexibility and the strong working relationships needed to take advantage of the unexpected.

    ---

    [1] Among the other member states of the EFTA, Sweden (at #16), Switzerland (#18), and Austria (#23) were all in the top 25 world economies, with Norway (#26) just below this threshold. New member Finland was the world’s 30th-largest economy in 1975; Ireland, by contrast, was not within the Top 50. (Portugal, which remained a member of the EFTA despite forming the Backwards Bloc, was the world’s 34th-largest economy in 1975.)

    [2] Among the other EEC countries, The Netherlands (#14) and Belgium (#17) were no slouches either, and thus allowed the EEC to claim a full quarter of the world’s twenty largest economies; even newcomer Denmark (#24), though the weakest economy in the EEC outside tiny Luxembourg, was only a laggard in relative terms, not absolute ones.

    [3] The Harrington Lake retreat, quite conveniently, has two guest cottages in addition to the main cottage (and one for the staff), and therefore the British PM stays in the upper guest cottage, and the Australian PM stays in the lower guest cottage.

    [4] The Liberal policy is extremely similar to their OTL plans under PM Trudeau in the early-1980s, which resulted in the Charter of Rights and Freedoms. ITTL, Trudeau plays a key advisory role in the development of the Liberal platform for constitutional reform, and actively campaigns in support of it, much as he continued to meddle in constitutional affairs after his retirement from federal politics IOTL.

    [5] You may be wondering where this convoluted formula stands in comparison to OTL. IOTL, Canada was enshrined as a fully bilingual country, where services must be provided in English and French nationwide, which put many Anglophone civil servants who lived in largely Anglophone regions of the country (particularly the West, where it was a contributing factor in the “Western alienation” which rose as a political force in the 1980s) out of work. Quebec, on the other hand, discriminated against its Anglophone minority with increasing severity in the 1970s IOTL, starting with Bill 22 (passed in 1974 by the federalist Liberals, who ironically enjoyed - and still enjoy - broad support from Anglophone voters), and culminating in Bill 101, passed by the separatist Parti Quebecois in 1977. This resulted in a mass exodus by many of the province’s Anglophones, who resettled in the Greater Toronto Area; most of Montreal’s financial interests moved to Toronto as well, providing that city (which would likely otherwise meet much the same fate as many American Great Lakes cities of the 1970s) with a critical boost which has allowed it to overtake Montreal as the country’s economic hub. ITTL, the “Stanfield Compromise” is both convoluted and prone to loopholes, but it does represent a compromise (or rapprochement, if you prefer) of a sort, and sometimes optics can mean everything in politics.

    [6] IOTL, Edward Blake, leader of the Liberal Party of Canada (and, therefore, Her Majesty’s Loyal Opposition) from 1880 to 1887, was the last (non-interim) leader of that party not to become Prime Minister until Stephane Dion stepped down from the leadership after his defeat in 2008. (Dion was followed by Michael Ignatieff, who lost his seat in the 2011 election and returned to academia - and, eventually, to the United States, from whence he came.) Turner, for his part, briefly served as PM when he replaced the retiring Pierre Trudeau in 1984 - only to lose to Mulroney in that year’s federal election, one of the largest landslides in Canadian history.

    [7] The Liberal tradition for alternating between Anglophone and Francophone leaders is one which is almost as old as the party itself. After Edward Blake (Anglophone) succeeded Alexander Mackenzie (Anglophone) in the 1880s, every subsequent succession has adhered to this formula:
    • Laurier (Francophone)
    • Mackenzie King (Anglophone)
    • St. Laurent (Francophone)
    • Pearson (Anglophone)
    • P.E. Trudeau (Francophone)
    • Turner (Anglophone)
    • Chretien (Francophone)
    Notably, this chain is identical IOTL and ITTL, though the dates of succession vary in the case of Turner and Chretien. IOTL, it has endured to the present day:
    • Martin (Anglophone)
    • Dion (Francophone)
    • Ignatieff (Anglophone)
    • J. Trudeau (Francophone)
    [8] Chretien’s Bell’s palsy paralyzed one side of his face, and thus when he speaks, it is out of only one side of his mouth. This is immediately obvious visually. Some good-natured ribbing about this (the “fluent in neither of Canada’s official languages” crack is borrowed from a common joke IOTL) is considered acceptable (Canadian political humour can be surprisingly mean-spirited), but a famous PC attack ad against Chretien in the 1993 campaign which drew attention to his disability was widely considered to have crossed the line. Not coincidentally, the PCs (who were the incumbent majority government going into the election) were subsequently reduced to two seats (no, that isn’t a typo) and only 16% of the vote (they had won 43% in 1988 - Canada uses FPTP and thus a majority of the popular vote is not needed to win a majority of the seats). It should be noted that Mulroney, the PM in the 1980s IOTL who had resigned before that election, will not make the same mistake his successor did in this regard.

    [9] John Major was, IOTL, elected to Lambeth Borough Council in 1968, but was defeated in 1971. ITTL, he holds on in 1971 - and then runs for the vacant Streatham seat in 1974, which he wins. (Bill Shelton, the Tory MP who won the seat IOTL, here loses Clapham in 1970, thus depriving him of the springboard needed to contest this seat). This gives Major a decade’s experience in the Commons before he becomes Chancellor - the same amount as he had IOTL before he was appointed to the position (in the twilight years of his predecessor’s tenure).

    [10] Recall that Wilson served ITTL from October, 1964, to February, 1974, without a break: approximately nine-and-a-half years. ITTL, the last PM to serve a longer term than Whitelaw (and Wilson before him) was the Marquess of Salisbury, who held three non-consecutive administrations (1885-86, 1886-92, 1895-1902) for a total of thirteen years in government, a record which no subsequent Prime Minister has bettered even IOTL. (The last Prime Minister with a longer unbroken term was the Earl of Liverpool, who governed uninterrupted for nearly 15 years, 1812-27, which is also the third-longest Premiership ever in its own right, behind Walpole and Pitt the Younger.)

    [11] The “Mull of Kintyre” rule, which for the record the BBFC formally denies ever actually existed, essentially states that no penis depicted onscreen shall have an angle from the vertical in excess of the Kintyre peninsula. Much like the informal “one F-word” rule for the MPAA, it is extremely arbitrary but is nevertheless an unusual example of a clear guideline from an agency otherwise renowned for its vagueness.

    [12] The Protection of Children Act 1978 began life IOTL as a private member’s bill proposed by a Conservative MP after a petition in support of it started by Whitehouse’s organization received well over a million signatures. Her advocacy for the bill, and support of its passage, is generally considered the greatest positive to result from her extremely controversial and polarizing career in social activism. ITTL, the bill is put forward by the Home Secretary, becoming law shortly before the 1978 general election.

    [13] Similar to IOTL, where Denmark has negotiated a permanent exemption from the Euro but remains a member of the Exchange Rate Mechanism, or ERM.

    [14] Unlike the Entente-class, the Leonid Brezhnev is based on an OTL design, which (like most Soviet military projects) has a long and convoluted history: the lead ship was renamed four times over the course of her development. Ordered as the Riga, she was renamed for Brezhnev after his death in 1982; after Gorbachev, who denounced Brezhnev’s legacy, took over in 1985, she was renamed Tbilisi; by 1990, when she was commissioned, the writing was no doubt on the wall for Georgia’s long-term membership in the Soviet Union, and she was renamed one last time, following the Nimitz-class paradigm, for a WWII Fleet Admiral, Nikolay Kuznetsov. Today, she serves as flagship for the Russian Navy. It should be noted that, IOTL, the Kuznetsov is not considered a supercarrier - at least, not by Wikipedia. [CITATION NEEDED]

    [15] The intended route of circumnavigation is approximately as follows:
    • Lisbon
    • Azores
    • Panama Canal
    • Pearl Harbor, Hawaii
    • Okinawa, Japan
    • Hong Kong and Macau
    • Port Hera, Dili, Portuguese Timor
    • Perth, Australia
    • Durban, South Africa
    • Sao Tome and Principe
    • Cape Verde
    • Madeira
    • Lisbon
    ---

    Thus concludes this ante-penultimate update of That Wacky Redhead, the final instalment of Appendix B, and the first update on this third iteration of the forum (or as I like to call it, the “New New Forum”). Thanks, as always, are due to e of pi for assisting with the editing of this update, as well as to Dan1988, Thande, and Electric Monk for their additional input! This update was a lot of fun for me to write, and not only because the material being covered is extremely topical at the moment! I want to say that this will be the last long update, since there are only two left and I plan for both of them to be more direct and focused, but you never know. In the interim, you can expect a guest post from longtime friend to the thread Dan1988, covering some of the material featured in this update from a radically different perspective. Until then, thank you all so much for reading!

    (Fun fact, for those of you who appreciate this sort of thing: this update is one of the very few to mention neither TWR nor her studio.)
     
    Appendix C, Part VII: Smaller, More Personal Pictures
  • Appendix C, Part VII: Smaller, More Personal Pictures

    1986. The day of reckoning had finally arrived. The millions of voices that had been crying out for a sequel were suddenly silenced. People lined up outside the theatres, and those lines went around the block. It had been a long and arduous road from the original release of Journey of the Force in 1977; although Paramount had green-lit a sequel almost immediately, the Trial of the Century had constituted an almost decade-long detour, along which anticipation gradually swelled into desperation. But nobody could make a Journey sequel until it was clear who would profit from doing so, and it had not been clear until the Supreme Court of the United States of America made its final, eagerly-awaited ruling in early 1983. Tens of thousands of die-hard fans had gathered on the plaza just outside, and cheered in excitement when it became clear that the creative forces which had brought Journey to life would have sole responsibility for its sequel.

    However highly anticipated that sequel might have been for the average filmgoer, in many ways those who were most eager for the film were the studio executives. Not that they - unlike the massive Journey fandom - necessarily had any interest in seeing the movie itself; instead, they were intent on finding out how it would perform at the box-office. George Lucas, who had written and directed the original film, would function for the sequel as its executive producer, a role reflecting his position as the man in complete control of the studio’s coffers. He knew he had a lot riding on the success of the Journey sequel - the first original, big-budget tentpole film to be released by the Lucasfilm studio since the acquisition of the former Paramount Pictures. Lucas had invited his dear friend Steven Spielberg to direct the film as his first Lucasfilm picture, as Lucas had no desire to ever direct a film again after the hectic and nightmarish experience that making the first Journey film had been. In exchange, Lucasfilm made a three-picture deal with Spielberg, agreeing to back any of his chosen projects and allow him substantive creative control - though retaining their budgetary veto.

    Marcia Lucas, who had won two Academy Awards for Best Film Editing - one for a film directed by Spielberg, and the other for a film directed by her husband - had been true to her much-publicized declaration that if she ever laid a finger on another Moviola again, it would be too soon. Although she had offices at the Lucasfilm studio space (rented out from the Desilu Gower-Melrose complex) in her capacity as Chief Creative Officer, she spent most of her time on that lot visiting with her old friends, including her friend and mentor Lucille Ball, the head of Desilu Productions. Most of the time, however, she held court at the Lucas family mansion in the Hollywood Hills, trying to get in as many hours of quality time as she possibly could with her family. She had toiled for many years working unglamourous editing jobs, and she felt she had earned the right to live it up now that she and her husband were obscenely wealthy. A stable home life was something she had lacked as a girl, which was all the more incentive for her to provide one for her own children. George was more actively involved in Lucasfilm’s operations, but even he left much of the day-to-day accounting to their third partner, Andy Taylor, who had enjoyed the change of pace from his former law practice.

    Still, Lucas had a public image to preserve. He and Spielberg had built their reputation as blockbuster filmmakers, which was one of many reasons that the Journey sequel needed to be a smash success; the reputation and future of not only the studio but also its key players hinged on it. Running a studio necessitated an entirely different skillset than making a movie, and Lucas, it turned out, had a much better knack for big-picture strategizing (which suited his role as CEO) than for detail-oriented minutiae (which suited auteur filmmakers). His three-picture deal with Spielberg was an example of his desire to promote the works of his many filmmaker cohorts - such as Francis Ford Coppola, Martin Scorsese, and John Milius, among others - and allow them the creative freedom they needed. Mindful of the mistakes which had ended the New Hollywood era, however, he made sure to keep both hands on the studio purse-strings at all times. Because allowing his friends to make the smaller, more personal pictures which would satisfy their creative urges was not a lucrative strategy, he would need to balance such projects with proven moneymakers such as the long-awaited sequel to his blockbuster, Journey of the Force. To this end, George had begun work on the story outline for the Journey sequel almost immediately upon the rendering of the Supreme Court’s verdict, but he largely stepped back from the ensuing pre-production process, trusting the Academy Award-winning team fresh off Prepare for War! to get it right: John Sayles was hired, on Spielberg’s recommendation, as screenwriter.

    All sides were in agreement that the general tone of this sequel needed to be darker and more mature than that of the first one, given the nine-year gap between them; the Echo Boomer children who had watched the first film upon its initial theatrical release were now young adults. The long gap between the two films also raised the question of whether the original trio of stars - William Katt as Annikin, Kurt Russell as Han, and Karen Allen as Princess Leia - ought to return, and if so, in what capacity. Should they be the stars of the sequel? Or should the sequel focus on new characters? After all, the previous film had made clear that the Civil War between the Rebellion and the Empire was a galaxy-spanning conflict; it would be justifiable to tell a story featuring new characters on another front.

    The climax of Journey of the Force was inspired in part by a historical event - Operation Chastise in World War II - indirectly, through its depiction in the film The Dam Busters, as well as by a fictional film, 633 Squadron, which depicted a similar event; Lucas had other ideas based on similar raids on Axis territory by Allied forces, as there was certainly no shortage of them. The idea he liked best was a Rebel Alliance raid on Imperial supply lines, with the raid going south and forcing the main character (Annikin Skywalker in the original outline) to crash-land his starfighter on a remote planet far behind enemy lines and try to find his way back out. In his original outline, Lucas had dwelled heavily on the logistics of Imperial troops and fleets and the impact they had on the strategic objectives of both sides; everyone who read this outline, starting with his wife Marcia, told him to put much more emphasis on the action-adventure elements of the story. After all, even the notoriously meticulous Stanley Kubrick had rightly decided to centre his epic masterpiece Napoleon mostly on le petit caporal’s relationships, as opposed to his logistical mastery, and George was no Kubrick.

    Even in the earliest outlines for the sequel, there was no obvious role for Kurt Russell’s character, the roguish scoundrel Han Solo. In the near-decade since his star-making role as an adult actor, Russell had become one of Hollywood’s biggest action heroes, known for taking on projects which allowed his charm and comedic chops to shine through in otherwise subpar material. His price tag had also risen considerably: for his last film before he entered into contract negotiations for the Journey sequel, he had received a record $10 million payday, becoming the first actor to receive an eight-figure salary. Unsurprisingly, his agent demanded that much for his return to the role of Han Solo, even though the film was budgeted at $30 million total - already a fairly high figure for the era, and a number that Lucas was determined not to see grow any further. In renegotiations, Russell’s agent offered to take a smaller upfront salary (perhaps as little as $1 million) against a whopping 10% of future box-office grosses. Lucas would have none of this, and informed Russell’s agent that contract negotiations would not continue. The film would go on without him, and with Han gone, there was less of a perceived need for Annikin or Leia to return in starring roles either.


    The decision to focus on new protagonists was crystallized one Saturday morning when George and Marcia were watching cartoons with their children. Unsurprisingly, all the members of the Lucas household enjoyed watching The Animated Adventures of Star Trek, and George couldn’t help but notice that the change in cast - and even setting, as the lead ship of the series was not the Enterprise, but the Hyperion - had resulted in new storytelling opportunities, even though many of the new characters filled much the same roles as the ones in the original, live-action Star Trek. That got George to thinking: what would a new hotshot starfighter pilot look like? Or a new smuggler? What about these characters could be different from their original counterparts? How would they relate to each other? In addition, it occurred to him that making a film with new leads also gave the title of his saga an even greater meaning: the main “character” of the films was not any human, but instead the Force itself. That night, after the kids had gone to bed, he immediately started brainstorming ideas with Marcia, and it was she who made a suggestion which changed the entire track of the film’s development: making the new smuggler character a woman.

    In light of his revelation about the Force itself being the main character of the franchise, the overarching title was tweaked to Journeys of the Force. The original film was given the retroactive subtitle The Death Star, which was first seen in the opening credits of the long-awaited home video release in 1984 (since sequels had finally been confirmed), but notably, the packaging for the film itself retained the original title, and almost everyone continued to refer to the film as such. The sequel was given the subtitle Behind Enemy Lines, as it would focus on the new starfighter pilot lead crash-landing on a remote planet after a failed Rebel Alliance raid on Imperial supply lines. Steven Spielberg and John Sayles, once they received the final outline from Lucas, immediately began to reshape it into a workable shooting script. Sayles focused mainly on the plotting, characterization, and dialogue; as had been the case with their previous collaborations, he left most of the detail for the action set pieces to Spielberg.

    The film opened with an audacious raid upon a key supply line for the Imperial forces, only for the Alliance strike force to suffer a painful setback when its forces were overwhelmed. Colonel Annikin Skywalker, who led the spaceborne raid, was downed and captured by the Empire; he was kept prisoner by his old foe from the previous film, Darth Vader, whose presence prevented him from using his “Jedi-bending” to effect an easy escape.

    Meanwhile, one pilot, Lt. Wedge Darklighter, though his starfighter was damaged, was able to evade capture, making an emergency landing on a remote planet on the outskirts of the sector. Making his way to a port town, he attempted to discreetly seek passage back to neutral territory, but the only person able to help him was a young smuggler woman, Pathe Amidala, who instantly saw through his attempts to conceal his identity. She asked him why she shouldn’t just turn him in to the Imperials, but he promised that the Rebellion would pay her double whatever the Empire had to offer.

    Pathe was someone whom most would describe as a collaborator; she profited handsomely from trade with Imperial agents, and agreed to provide services to them - for a hefty price. From the point of view of the Imperials, she was a profiteer - thus, nobody liked her, and fewer still trusted her. She did have a contact within the underground resistance in the region - affiliated with the Rebellion - but had never met him. However, cooperating with him was the only way the pilot could be sure of securing the large payment the smuggler expected, so he insisted upon a face-to-face meeting.

    The underground agent, Arn Riclo, turned out to be nothing like either of them had expected. Optimistic and idealistic, his passion for the cause and hope for a better future for the galaxy was inspiring. He and the smuggler were oil and water - but he shook her with his absolute faith in the potential for her redemption.

    He informed Wedge that Skywalker had been captured and that Resistance agents were attempting to effect a rescue through their mole working in the Imperial base. However, they had a critical shortage of good pilots and good ships - drafting Wedge and Pathe would go a long way toward alleviating both problems. Pathe balked - but relented when Arn promised her a fortune - enough for her to become the Boss of her own syndicate. She accepted gleefully, though Arn was crushed - he had hoped it wouldn’t take all that to win her over, and told her so. Pathe at least had the decency to look ashamed afterward.

    Pathe was particularly concerned that Skywalker was being held on the prison planet as bait for a trap. Nevertheless, the promise of money beyond her wildest dreams won her over. The three escaped to neutral space, where Princess Leia and Kenobi finally appeared at the mission briefing. It was here that Kenobi confirmed what many in the audience had suspected: that the Force was strong with one of the three. Wedge naturally assumed it was him, but to his (and everyone’s) surprise, it turned out to be Pathe!

    Pathe was skeptical, and Kenobi seemed willing to drop the matter, but then in a cunning bit of misdirection, he tricked her into realizing her powers, and she then accepted that the Force was strong with her. Meanwhile, Princess Leia revealed how the rescue operation would succeed: through a two-pronged attack. The Resistance would create a diversion through an armed uprising, distracting the bulk of the Imperial forces and allowing the Rebel Alliance fighters to stage an effective rescue.

    The final act branched off along two paths: the space battle featuring Wedge, and the attempt led by Pathe to effect a rescue of Skywalker. It was Wedge’s job to put up enough of a fight to divert enough Imperial forces to let the rescue party through. The scene of them leaving the Rebel hideout, with Arn and Pathe boarding her ship together, was shot in such a way as to allude to the famous ending scene of Casablanca, a nod to how that film had influenced both their characters: Pathe as Rick Blaine, and Arn as Victor Laszlo, with the two of them sharing the role of Ilsa between them. Kenobi also joined them on their flight to the enemy base, but stayed behind on the ship when they arrived. Arn and Pathe went in to rescue Annikin, but of course they encountered Vader first. Pathe finally proved her virtue once and for all when Vader threatened Arn and she stepped in between them, saving his life while risking her own. But of course she was fine, because Vader’s distraction had allowed Skywalker to serve as the cavalry in his own rescue operation. Skywalker thus killed Vader, ending the story arc of their antagonism, and effected their escape alongside Arn, Pathe, and Kenobi - thus uniting three Jedi-Bendu in one ship. Pathe pledged to study the ways of the Jedi-Bendu - right after she got her money, and only, she insisted, because she felt the need to make an honest man out of Arn and treat him right, since he was joining the Rebellion too. The make-up raid was a smash success, throwing Imperial control of the sector into disarray, and promising to have key strategic effects on other fronts - and in future sequels.

    In a snub which became the talk of Hollywood, the character of Han Solo was never mentioned in the film, even though four other major characters from The Death Star - Annikin, Leia, Kenobi, and Vader - all returned, and each of them discussed their previous adventure, in which Han had played a key part. When asked about Han Solo’s whereabouts, George Lucas simply remarked that “he went back to smuggling”. Kurt Russell declined interviews regarding his failure to make even a cameo appearance in Behind Enemy Lines, and tersely refused to comment when asked the question directly, his new agent informing interviewers that his client would much rather discuss his newest project, which was also scheduled for release in the summer of 1986.

    Russell’s film, Human Target, performed very well at the box-office, as his movies often did, but nonetheless, its tally was dwarfed by the boffo returns managed by Behind Enemy Lines, which had grossed over $300 million by the weekend of September 13-14, 1986; returns even months into release showed no real signs of slowing, making it likely that the film would outgross the original and become the highest-grossing film of all time (not adjusted for inflation). Despite these massive revenues, the film’s overall reception was more muted, tending toward decidedly mixed responses rather than the general acclaim which had met the release of the original - and the fond remembrances associated with it ever since. Many hardcore Journey fans were vehement in their fury at the switch to new leads, disliking the franchise’s shift to an anthology format. The absence of Han - a longtime fan favourite - was a particular sore spot for these fans, who bore more than a passing resemblance to the notorious “Puritans” of Star Trek fandom. Other fans, particularly younger ones, disliked the darker tone of the film and the more muted victory achieved at its conclusion. The contrast between the destruction of a major military installation and the mere opening of a supply line for some future military action left a sour taste in many of their mouths - even though, in real-world warfare, the latter was often more important than the former.

    On the other hand, the world of film criticism was a very different beast. Sayles’ dialogue was praised as far more literate and memorable than the workmanlike dialogue written by Lucas for the original film, as was Spielberg’s kinetic and taut direction. The actors were singled out for praise as well: Tom Cruise as Wedge, Holly Hunter as Pathe, and British-born Cary Elwes as Arn. Hunter and Elwes were praised for their oil-and-water chemistry, with Hunter additionally receiving plaudits for being believable as a seasoned smuggler despite her relative youth - she was born in 1958, and was 27 during principal photography - and inexperience, as Behind Enemy Lines was her breakthrough role on the big-screen. Most importantly, the smash success of Behind Enemy Lines would vindicate Lucasfilm’s financial strategy as well as its creative strategy. Already, George began outlining a third Journeys film, to feature a whole new set of protagonists, in keeping with the anthology format. The film was already doing well enough that he was able to approve budget increases for the projects his friends Martin Scorsese and Francis Ford Coppola were working on, allowing them greater flexibility to weather the inevitable cost overruns he knew would be coming. After all, nobody knew better than George that filmmaking was fraught with unforeseen circumstances…

    ---

    “Miss Ball? Mrs. Lucas is here to see you.”

    “Thank you, Doris, please send her in.”

    Not a moment later, two-time Academy Award-winner and one-third owner of a major Hollywood studio, Marcia Lucas, walked into Lucille Ball’s office, carrying a plastic bag filled with fast-food packaging.

    “I brought lunch!” she said, lifting the bag for emphasis.

    “My hero,” said Lucy, a smile playing across her face. “What is it?”

    “Chinese. From a new place down the block. George and I tried it last week, and we both thought it was real good.”

    “And how is George? Still in his office, counting his money? Don’t think I didn’t notice your little movie is #1 at the box-office again this weekend.”

    Marcia grinned. “Actually, he’s on the phone with our travel agent. Booking us a family vacation to the South Seas.”

    Lucy whistled at this - or rather, she tried to whistle. She settled for a variation on her familiar I Love Lucy wail, but with the pitch shifted in such a way as to express approval. Marcia tried not to giggle.

    “I say you’ve earned it. And then some. And you’ve certainly got the money for it now.”

    “And I’m making real sure George doesn’t pour it all back into the company. I love Marty and Francis like brothers, but if they were as good with managing money as they were at making movies, they wouldn’t be in this situation in the first place.”

    Ball nodded as she and Marcia retrieved the Chinese food from the bag and began to help themselves.

    “We’ve already declared a real big cash dividend,” Marcia continued. “Don’t know how I talked George into paying out so much, but it’s a good thing with all the money the movie’s making. Andy says he’s going to put his share into real estate back in Baltimore.”

    “And what else are you and George doing with your share?”

    “We’re putting a lot of it towards education for the kids. Private school isn’t cheap and neither is a good college. George is hoping one of them will want to go into filmmaking, but I don’t think that’s a possibility. And we’re looking at buying a retreat in Aspen. Not a lot of snow here in Hollywood and I’d love to expose the kids to it. And skiing would be a real fun family activity for all of us. And we’re probably going to adopt again.”

    Ball was floored. “Sounds like you’ve got a lot of your time planned out already. You spend so much time away, I hardly see you anymore. This is the first time we’ve seen each other in person for weeks.”

    Marcia smiled softly. “Lucy, I’m real grateful for our friendship, and all you’ve done for me and George these last few years.”

    Lucy stopped moving her fork midway to her mouth as Marcia said this. “Oh boy, this sounds serious. Good thing I’m already sitting down.” She lowered her fork and reached into her desk drawer for her beloved cancer sticks. “Better take something to calm my nerves before you go on.” She lit one and took a long drag.

    “I think you’ve earned your retirement, Lucy. No woman - no person - in Hollywood has worked as hard as you for as long as you have. And… and I think your leaving is what finally convinced me I should leave, too.”

    “You’re finally leaving showbiz, Marcie?”

    “I’ll be staying on with Lucasfilm as CCO and keeping my seat on the Board, but I’m vacating my office space. I want to spend every spare moment I can with my kids. I don’t want to miss out on a single thing they do.”

    “And Georgie?”

    “He’s cutting his hours back, too. Only going in part-time. He was real hands-off with Behind Enemy Lines, and it turned out so well, I think it finally convinced him he doesn’t have to micromanage everything. I think he’s picking up a real knack for fatherhood, too.” Marcia couldn’t hide the beam of pride that had come over her face. “Andy’s real good with the day-to-day stuff, like he was born to do it. Lucasfilm is in safe hands.”

    “Do you think he’ll ever direct again?”

    Marcia nodded. “Someday. But something completely different. I think he wants to make a real smaller, more personal film. He’s showed me a lot of his story ideas. Some of them are real good.”

    The two of them ate in silence for a moment.

    “I’m really going to miss this, Marcie,” Lucy said, finally.

    Marcia felt tears welling up in her eyes. “Me too.” With that, she suddenly rose from her seat, dashed around the desk, and threw her arms around Lucy. “You know I love you, don’t you?”

    Lucy laughed, before letting out with her increasingly familiar hacking cough. “As long as you know I love you too, kiddo,” she said, gingerly returning the embrace. As she did, she glanced wistfully around her office. There were a lot of things she was really going to miss.

    ---

    Thus concludes the penultimate update of this timeline! Special thanks to e of pi for co-writing this update with me, including helping to develop the plot of Behind Enemy Lines, the sequel to the original Journey of the Force, which is actually described in much greater detail! We hope you enjoyed our take on the sequel’s development and production.
     
    Say It Ain't So, Lucy!
  • Say It Ain’t So, Lucy!

    Official studio photograph of Lucille Ball, on display at the Desilu Gower Visitor Centre from the spring of 1986 onward.

    She’d always hated funerals.

    But there was no getting around it: Lucille Ball just had to attend the funeral of her late ex-husband, Desi Arnaz, the father of her children, and still a close friend long after the divorce, until the day he died. As far as she was concerned, the empire that had been named after them, the one the media had decided that they’d built together, had really been his creation, and had only prospered under his guidance. Even though he hadn’t been involved with the studio in any formal capacity for just about a quarter-century, she still felt as though a part of it had died with him. She had already decided to leave, but any lingering doubts she had about doing so were scattered, along with his ashes, to the four winds.

    They’d asked her what she would do with all her free time, now that she was retiring. Every time they’d asked, she came up with a different “joke” answer, each more corny and ridiculous than the last. In all honesty, she didn’t know what she would do with her free time. She wanted to live each day as it came, a fatalistic outlook she’d come to embrace more and more as she got older. She liked the idea of spending more time with her many grandchildren before it was too late – some of them were still small and she doubted she would live long enough to see them come into the full bloom of adulthood.

    As for her two children, both of them would be plenty busy, as the plan was for them to jointly take the reins of Desilu - but Lucie Arnaz, the responsible, hardworking daughter, would be named President and CEO. Her brother, Desi Arnaz IV, would serve far better as an effective “mascot” for the studio, much as Ball herself had been.

    True to his word, Herb Solow was making good on his own plans to retire. Ball had graciously allowed him to take with him the original three-foot model of the Starship Enterprise from 1964, which had adorned his desk for most of his more than two decades at the studio. She’d asked him to stay on a bit longer, so as to ease the transition between generations, but Solow held firm.

    “I can’t imagine the studio without you,” he had said, over their final cup of coffee at her desk. “I don’t think I ever want to see it, either.”

    “Well, what if I don’t want to work here without you, either?” She was choking back tears, not that she’d ever admit it, or that he would ever ask if she was.

    Solow smiled at this, a wan little smile that didn’t quite reach his eyes. “You can’t always get what you want.”

    He took a moment to reflect upon how she’d changed in all the time he’d worked at Desilu. When he’d started, under Oscar Katz, she’d been content to run The Lucy Show as her personal fiefdom and leave the rest of the studio to her talented underlings. She had even been planning on selling her studio to Gulf+Western - and considering what had happened to Paramount, who knows what might have happened to Desilu if she had? More than once, he’d shuddered at the thought. She’d always credited that dream with changing her mind, but he was never sure he really believed it himself. The only thing he was sure about was how much he was going to miss her, much to his own bemusement.

    “Happy trails, Herbie,” she said to him, at the end of his last day. “And when you write your tell-all exposé about your years working here, promise you’ll go easy on me.”

    He would. He signed a book deal just a few short months later, and the tentatively-titled Inside Desilu Productions: The Real Story was scheduled to hit bookstore shelves in time for Christmas, 1987. Early word on the advance manuscripts pointed to a glowing depiction of Ball - then again, perhaps that was because Solow had saved up all his venom for Brandon Tartikoff.

    As for Tartikoff, he would become SEVP and COO, ascending to Solow’s former position, but wielding considerably more power, as the two Arnaz siblings intended to be far less hands-on as chief executives than their mother or father had been. One idea which tickled their fancy was a semi-autobiographical star vehicle: a sitcom about a responsible elder sister and a layabout younger brother who go into business together. Tartikoff wasn’t crazy about it, but he was willing to produce a pilot in early 1987 with an eye toward a premiere in the 1987-88 season. If ABC, with whom Desilu retained its right-of-first-refusal agreement, wasn’t interested, perhaps the fledgling new PGTV network might be - it was only proper that Desilu got a show broadcast on the new fourth network sooner or later, after all.

    That said, there were other parties interested in the future of Desilu programming beyond the four networks, none more prominent (or vocal) than the titan of Pay-TV, Ted Turner, proprietor of WTBS Atlanta, available throughout the United States (and Canada) as TBS Superstation. He had been seeking an audience with Ball for years, but she had always rebuffed him. When she announced her retirement, she figured that would be the end of him calling on her, but to her surprise, that only seemed to strengthen his resolve. Her only planned involvement with Desilu post-retirement was to assume the title of Chair Emeritus on the studio’s Board of Directors, on which she would continue to nominally have a seat, albeit one she would probably never occupy. That was apparently still enough for Turner, who was eager to “have [her] ear” as she would “have the ear of the new power-brokers of an evolving, growing Desilu”. Slimy businessman-speak to the last, but she finally consented to meet with him in the dog days of summer, 1986, eager to put all the concerns of her job behind her before she formally retired.

    “Miss Ball? Ted Turner here to see you.”

    “Thank you, Doris, send him in.”

    Ted Turner was many things - shy was not one of them. He strode into her office like he owned the place; Turner owned many things, granted, but Desilu Productions was not one of them. Ball did her best to maintain a neutral expression, though her lips were pursed so tightly they were turning white.

    “Miss Ball, I’m privileged to have this opportunity to meet with you today.”

    “Thank you,” she said, making a point not to tell him to call her “Lucy”, as she otherwise always did. She was about to invite him to take a seat when he beat her to the punch, resting his laurels in the chair Herb Solow had usually occupied during their meetings. She wondered if he knew that, and if he did, what he meant by doing it. She was about to find out.

    “I should think by now my work speaks for itself. I’ve always wanted Turner Broadcasting to represent the vanguard of the entertainment industry in the 1980s, and the decades ahead, up to and including the new millennium, in much the same way Desilu has always represented the vanguard of television.”

    “That’s an admirable goal, Mr. Turner.”

    “Now come on, call me Teddy. I know you want to.”

    She grimaced, hesitating, trying to come up with a diplomatic response, but he didn’t bother giving her the time for one.

    “Now I know Desilu has been involved in a lot of parallel ventures. You’ve been working with that video game company since forever ago, you got in on the ground floor of the home video revolution, and you invented the rerun, for which we at TBS - not to mention most every station in this and every other country - are eternally grateful.”

    “That was Desi, he had all the good ideas - ”

    “And modest to a fault!” he interrupted dramatically. “A true lady. The kind of lady I’ve always wanted to do business with.” At this, he tossed his briefcase upon her desk, dramatically opening it and pulling out a dossier which he handed to her, and which she accepted reluctantly.

    “We were all heartbroken by word of your retirement. But I see a golden opportunity for your studio to move in a new direction. The Desilu empire is built on the syndication market; I know it, you know it, everybody knows it. I’ve been spending the last several years and a not-insignificant sum of money investing in ways to make beloved classics more appealing to modern audiences. I can’t help but think how much Desilu’s library might benefit from the new technology we’ve been working on, in the same way the film libraries of so many classic film studios already have, including studios you worked with in the Golden Age. So what I think you really ought to do here is to let our powers combine.”

    She’d heard all about this new technology, and leafing through the dossier had confirmed her worst fears. Colourization. She glanced up at him, unsure if she could hide the pall of dread which had crossed her face; he was grinning like the cat that ate the canary.

    “The wave of the future,” he said. “Now a lot of people, including some of your very close friends, are getting terribly upset at what I’ve done to breathe new life into classic films. And I’m no tyrant, Miss Ball; although I own the copyrights to those films and have every right to do whatever I please with them, I realize it would be much better to seek input from the original creators wherever possible - for public relations purposes, you understand.”

    “That’s very interesting, Mr. Turner, but I don’t see what that has to do with Desilu. You don’t own the copyright to anything this studio has ever produced.”

    Turner chuckled at this, dismissively. “You’re right about that,” he said. “But not for lack of trying. I’ll have you know I was gunning hard to get I Love Lucy myself, before you beat me to the punch.”

    “Really? I hadn’t heard,” she said. She was lying.

    “Oh yes. I had plans for I Love Lucy. Big plans. I say ‘had’, but I still have them, actually. That’s why I’m here. As you know, TBS has been running I Love Lucy - along with Star Trek and many other Desilu programs - for many years now. We’ve been a very loyal customer of yours, Miss Ball.”

    “And we here at Desilu appreciate your continued patronage.” Which was true enough. WTBS paid top dollar to secure syndication rights for the Desilu shows in the populous Atlanta broadcast market, seeing as the station constantly had to outbid all three (and soon to be four!) national network affiliates for the privilege. On top of that, the TBS cable channel also paid for several Desilu shows, in a deal distinct from the ones with WTBS, the (de jure) independent television station. This was why Ball had ultimately felt obliged to meet with Turner, despite her personal distaste for him; he had funnelled a very large amount of syndication money in her studio’s direction over the years.

    Turner seemed to be waiting for her to ask just what his plans for I Love Lucy were, but she adamantly refused to bite. So he launched into his sales pitch. “I can’t stress enough how wonderful a show I Love Lucy is. Still funny, still well-written, still brilliantly-acted. Whole generations have fallen in love with your character, Miss Ball, including yours truly.”

    “Thank you,” she said. She wasn’t sure she could watch I Love Lucy anymore. All three of her co-stars were now dead, and the “Lucy” character who had so defined her public image at that time in her life seemed a whole other person from what she had become.

    “But nobody’s perfect, of course, and I can’t help but think of ways we might be able to make a great show even better.”

    She decided to stop beating around the bush. “Which is why you want to colourize it.”

    “You see, Miss Ball, this is what makes you such a great studio head. You cut right to the chase. And you are exactly right. We live in a world of colour, Miss Ball; what excuse is there for a show to be… black-and-white in this day and age?” He said the offending words in a withering tone, as if he were discussing a leper colony.

    “Well, it was filmed in black-and-white,” she said.

    “Yes, but you had no choice, you would have used colour if you could.”

    “Colour film existed in the 1950s, Mr. Turner. If we’d wanted to shoot in colour, we certainly could have. Other shows in the fifties did.”

    Turner paused, as if taken by surprise, but he covered himself admirably. “Wuh - well - yes, yes, of course, but none of that mattered, because there were no colour TVs back then. You used black-and-white because you had to. Nobody would use black-and-white if they had a choice.”

    “...You have seen The Exorcist, haven’t you, Mr. Turner?”

    “Oh, that was different, that was black-and-white for artistic purposes.”

    “Are you trying to say I Love Lucy wasn’t black-and-white for artistic purposes?” Her eyes narrowed. Her teeth clenched. In another place, at another time, she might even be impressed by his brazenness.

    Turner, wisely, chose not to answer that question. He reached over, leafing through the pages in the dossier he had given her, before he stopped at some colourized photos of the I Love Lucy sets.

    “Here, take a look at how the Tropicana comes to life in these vivid, bright colours.”

    Then she did something he didn’t expect - that she didn’t expect - and burst out laughing, coughing and wheezing so hard it looked as though she was about to run out of breath. Once she had regained her composure, she rose from her seat, heading over to a filing cabinet in the back corner of the room. She opened the top drawer, rifling through several folders, many of which were faded and worn, and had obviously seen better days. She retrieved a single folder and returned to her desk; Turner noted that the label on the folder read I Love Lucy Set Photos. She removed a single photo from the folder and laid it down on the desk.

    “Have a look, but don’t touch,” she said, still giggling, with a big grin on her face.

    It was instantly recognizable as the set of the Tropicana on I Love Lucy - and it was a colour photograph, a rare and invaluable artifact from the Golden Age of Television.

    “What colour is the foliage?” she asked, still unable to hide her grin.

    He muttered something unintelligible.

    “I’m sorry, I didn’t get that?”

    “Pink,” he said, louder. “All the trees are hot pink.”

    “You know why all the trees are hot pink?”

    He muttered again - but this time, she didn’t give him a chance to finish.

    “Do you know who Karl Freund was, Mr. Turner?”

    “Uh, Freund, Freund. Wasn’t he the shrink who thought everybody wanted to sleep with their mother?”

    “That’s Freud. No. Karl Freund was a cinematographer. A leading light of the German expressionist movement, you ever heard of it?”

    “Well, yes, as a matter of fact - ”

    “Desi hired him to shoot I Love Lucy. He figured out how to light the show in such a way as to avoid casting shadows with a three-camera setup. He was a genius, a great man. I didn’t talk to him much, that was Desi’s department. But I feel very privileged to have known him and to have seen him work. Now you’ve actually seen what the sets looked like - how they were painted the most ridiculous colours so they would show up better on black-and-white film! And I’ve seen how wrong you got it trying to figure out what the actual colours were. So you tell me, why would I agree to let you do this?”

    “But… but you have red hair, you joke about it all the time on the show, but how can they really tell if it’s in black-and-white?”

    She scoffed at his ignorance. “Mr. Turner, Desilu has always - always - treated our audience with respect and has never looked down on them, and we’re not about to start just because I’m leaving.”

    “Miss Ball, please. I think you’re letting your nostalgia blind you to this once-in-a-lifetime business opportunity. I’m talking about the wave of the future here. I thought you would understand.”

    “Mr. Turner, you’re a businessman, not an artist. That’s what you always were, and what you always will be. It’s why you’ll never understand why I will never colourize I Love Lucy. It’s about heart.”

    “Heart? What the hell does heart have to do with anything?”

    “If you don’t know, I’m not going to tell you.” At that, Ball slammed Turner’s dossier firmly shut and handed it back to him. “Thank you for stopping by, please show yourself out. And close the door behind you.”

    “Miss Ball - ”

    “Don’t make me call security.”

    “Well, come now, you wouldn’t really do that now, would you?”

    Without a word, she reached over to her phone and lifted up the receiver, her finger hovering over a prominently-displayed bright red panic button. She’d had it installed after Harlan Ellison had barged into her office back in 1967, ranting and raving about re-writes to his Star Trek script.

    Ball gave Turner a sly look. Just try me, it said.

    Reluctantly, he rose from his seat. “I can see there’s no swaying you, Miss Ball. A shame, I think we could have had a very lucrative and productive business relationship. Perhaps your successors might see things differently.”

    “That’ll be difficult, since I intend to tell the board of directors that they’re not to enter into business with you in any way. And that includes making any new syndication agreements with you, once our current ones expire. God only knows what you might do with our shows anyway, if you’re given the opportunity. And I don’t intend to ever find out.”

    Turner snarled, but without another word, turned on his heel and stomped out of her office, slamming the door behind him as he left.

    Ball sighed. At least his visit had clarified one thing. Her legacy at Desilu had been to foster creators, and then to defend their creations from those who might do them harm. It went all the way back to when she had decided not to sell to Gulf+Western, and now it concluded with her refusal to make a deal with Turner. It amazed her how, after all these years, and all the many ways the entertainment industry had changed in that time, the fundamental aspects of how people did business remained exactly the same.

    Ted Turner’s insistence on speaking with her personally, as opposed to her new COO Brandon Tartikoff, or even her children, served to demonstrate how thoroughly her “Boss Lady” image had captured the popular imagination. It was a very different image from the daffy sitcom star of the 1950s and 1960s, but at the end of the day, she wasn’t really sure it was any closer to the real Lucille Ball. She was a performer by profession, and at heart, so it wasn’t surprising that she’d spent her entire career putting up a front to the public, starting as a Goldwyn Girl all the way back in the 1930s. She was tired of being anyone but herself, and she figured she’d done enough to earn her immortality. She’d always be remembered by one of her many masks, so it suited her purposes just fine to live out the rest of her life on her own terms, as her own self.

    At the very end of her very last day, after the movers had finished removing her personal belongings from her office, Brandon Tartikoff appeared in the doorway. “Lucy, can I come in?”

    “You’re always welcome, Brandie,” she said.

    He entered, walking up to stand alongside her as she looked around the mostly-empty room. “It looks so different.”

    “It looked just like this after Desi moved out,” she said, without looking at him. “24 years and it looks exactly the same.”

    Tartikoff knew better than to try coming up with a response to her rhetorical musings, merely smiling wanly at her when she finally glanced over at him. She attempted to match his smile with her own, and shuffled back over to the doorway. The final artifact of her occupancy was the nameplate on the door, which she gingerly slid off its base, clutching it tightly in her hand.

    “Come on Brandie, it’s time to go.”

    “It’s not going to be the same without you, Lucy.”

    “Life goes on, Brandie, life goes on.”

    Tartikoff exited the office, leaving Ball in the doorway, alone with her thoughts. Reaching into her purse for a cigarette, she lit it and took a long, desperate drag, sighing dramatically as she exhaled, and gingerly closed the door behind her without another glance.

    It was time to leave.


    ---

    Thus concludes the final cycle of That Wacky Redhead! The epilogue will follow in short order. Thanks to e of pi, as always, for assisting with the editing of this last update proper, although I must stress once more that the epilogue is still to come, and the timeline will not be well and truly over until then. But until then, please enjoy the official ending theme of That Wacky Redhead, “If You Leave” by Orchestral Manoeuvres in the Dark, as originally featured IOTL in the 1986 film Pretty in Pink (for those who guessed another John Hughes film, close but no cigar). Thanks to vultan for helping me to decide on this particular song as this TL’s sendoff - yes, that’s right, I still remember, Admiral! After all, an elephant never forgets…

    Remember, it ain’t over till it’s over, and the epilogue is yet to come, though it is completed, and will be posted in a couple of days. Until then!
     
    Epilogue: September 20, 1986
  • September 20, 1986


    Lucille Ball in a promotional photo taken shortly before being interviewed by Baba Wawa for the television special aired the evening of September 20, 1986.

    We return to the overly-staged interview set, with WAWA standing next to the chair in which she has been sitting during the interview with BALL. She maintains a studious and neutral expression as she begins speaking.

    WAWA: Welcome back. As Lucy approaches her retirement from Desilu and from public life, I asked her what kind of legacy she feels she left behind, how she has changed the television industry and the programming it produces, for better… and for worse.

    DISSOLVE TO WAWA, now seated, in full-on Interviewer Mode.

    WAWA: What are you going to miss most of all?

    BALL, pensive, thinks for a moment before giving an answer.

    BALL: I’ll miss the way so many of our shows brought people together over the years. All over America - all over the world, really - no matter where you live, how old you are, whether you’re a boy or a girl, what colour your skin is, what you believe in… we all come together to watch TV, and our love for what we’re watching transcends all those boundaries. I know people have felt that way about a few of our shows, and I hope they’ll continue to feel that way long after I’m gone.

    WAWA: Now would you describe that as your legacy, or as the legacy of television?

    BALL: Oh, television, definitely. It’s a powerful medium, just like radio was, just like the movies still are. But there’s an immediacy to television, y’know?

    Hard-cut to WAWA, whose faux-sage nodding once again utterly fails to hide an obvious cut in BALL’s speech. Cut back to BALL.

    BALL: ...an immediacy and an intimacy at the same time. It’s right there in your living room and it seems to be talking to only you, but it’s really talking to all of us. I think that’s the appeal. It’s this shared secret but we’re all in on it.

    WAWA: And what part did you play in helping to spread this shared secret?

    BALL: You know, in the grand scheme of things, I wasn’t really responsible for all that much. I just kept a studio running for a few decades, kept a few thousand people employed. This world is so much bigger than that, really makes you realize how small and unimportant one person can be.

    WAWA: Don’t you think you’re selling yourself short? Your studio has been responsible for some of the most beloved programming in the history of television. I Love Lucy, My Three Sons, The Untouchables, Star Trek, Mission: Impossible, Rock Around the Clock, The Questor Tapes, The Muppet Show…

    Quick-cut to BALL, laughing and raising her hands in mock surrender.


    BALL (through laughter): All right, all right! Enough already.

    Cut back to WAWA, who is completely ignoring BALL, and defiantly continues to list the studio’s greatest hits over her protests.

    WAWA (talking over BALL): ...Three’s Company, The Ropers, The Patriot, Hill Avenue Beat, and Neon City Vice, just to name a few!

    BALL (still laughing): That’s more than a few!

    WAWA: And your studio produced all of them! Don’t you think the lady at the top might have had something to do with that?

    BALL: I didn’t create any of those shows, and I didn’t run them. I didn’t even supervise, I left that to my lieutenants. Oscar Katz, Herb Solow, Bob Justman, Brandon Tartikoff…

    WAWA: But you were the boss! The buck stopped with you!

    BALL: I’ve always been good at giving people what they need to bring out the best in them. If you want to know what the secret of my success was, look no further.

    WAWA: Lucy, you’ve been in show business for a very long time now, and you’ve worn a lot of different hats. How do you want to be remembered in the future?

    BALL: Well, I’ve had a long life, and I may not go on for much longer, but if syndication has taught me anything, it’s that you’re really not dead, as long as you’re remembered. Now, a lot of people still remember me as just that wacky redhead from I Love Lucy, and I have no problem with that. That show brought joy and laughter to so many people, and I’m honoured to have been a part of it, and contributed in some way. Is there more to me than that? Sure there is, but I don’t mind so much if people take the time to figure that out for themselves.

    WAWA: Lucy, do you have any last pieces of advice to those of our viewers who might like to follow in your footsteps and seek out a career in Hollywood?

    BALL: Yeah, no matter what, just follow your dreams! (laughs) Literally, in my case. You really never know just what might come of them if you really believe in them.

    Cut back to WAWA, alone; the interview has concluded.

    WAWA: Lucille Ball will be remembered as an entertainer who sought to deliver the finest quality product possible, whether it was in front of or behind the camera. Her legacy with Desilu will continue to enrich the lives of millions of dedicated viewers throughout the country, and around the world. Thank you for watching. On behalf of NBC News, I’ve been Barbara Walters. Good night.

    Exit WAWA; the lights dim, as we

    FADE TO BLACK.


    ===

    Thus concludes That Wacky Redhead! Thank you all for following this, my first timeline, which would not have been possible without the contributions of many members of this board, including, of course, all of you who posted on this thread, and who contacted me via PM, and who voted for TWR in the Turtledove Awards. Thank you all, from the very bottom of my heart. That said, I feel that a few of you deserve a special citation for going far above and beyond what should be reasonably expected, and without further ado, I intend to do just that.
    Let the end credits music roll

    To @Thande, @statichaos, and @Space Oddity, for each laying out the welcome mat for TWR in your own way:
    • For Cronus Invictus proving that you can write engaging and popular TLs about things other than War and Politics;
    • For AWOLAWOT proving that culture can have wider repercussions on War and Politics, not just the other way round;
    • For Now Blooms the Tudor Rose just making writing an AH TL look like so much fun that I, like TWR herself, just had to “give it a whirl”.
    To @Emperor Norton I and @Lavanya Six, for plugging this timeline just about everywhere you possibly could.

    To @Falkenburg, for writing That Wacky Limerick and codifying the rules of the That Wacky Redhead Drinking Game™.

    To @Clorox23, for doing such a great (and enthusiastic!) job of curating my TV Tropes page.

    To @TheMann, for letting me borrow your idea of a Canadian aircraft carrier post-Bonaventure from Canadian Power, an excellent TL which you should all read.

    To @RogueBeaver, for letting me bounce some ideas about Canadian and Quebec politics off you.

    To @Don_Giorgio, for your help with all matters Greek (it’s all Greek to me)!

    To @juanml82, for your help with all matters Argentine.

    To @Archangel, for your help with all matters Portuguese.

    To @MaskedPickle, for your help with all matters French, and for your part in carrying the torch for pop culture TLs.

    To @Workable Goblin, for your help with space probes and satellites, and for space advice in general.

    To @Chipperback, for encouraging me to step outside my comfort zone in writing about the Big Consequences that such Big Dreams have entailed, which I think have led to an overall much richer TL; for your advice about various sports, events, and personalities, even though sometimes it felt like we were trying to relate to each other in two different languages; and for your recognition of TWR in writing your own TLs.

    To @Andrew T, for your invaluable help in explaining the convoluted vagaries of the U.S. legal system, and for laying out the anatomy of a lawsuit, and its journey all the way to the Supreme Court, along with sundry advice with regards to technology, and a few great suggestions on casting ITTL! Not to mention, more than any of the other pop culture TL writers who followed me, really making a success of Dirty Laundry on its own terms, despite constantly (and unnecessarily) comparing it to my own TL. As far as I’m concerned, if TWR did carry the pop cultural torch, she has now passed it on to Don Henley. May he bear it well!

    To @vultan, for being this TL’s biggest cheerleader, for your passionate enthusiasm for pop culture TLs in general (including writing a few of your own), and especially for brainstorming all of the many, many American election results with me, down to figuring out all the candidates. Wherever you are now, I hope life is treating you well.

    To @Electric Monk, for your enthusiasm, for starting the list of updates, for sharing your thoughts on perhaps the widest variety of subjects pertaining to this TL’s development of any of my consultants (with one exception), and for coming back just in time to help bring this TL to its conclusion. And, of course, for your friendship in turn, one which I feel has already yielded potential for some truly great AH in future, on both sides of the equation.

    To @nixonshead, for being generous enough to share your immense artistic talent with the readers of this thread, bringing my ideas to life in a way I never could have even imagined doing myself. I feel immensely privileged that someone with your gifts has taken such an interest in this TL, and shown such a willingness not only to render images for it, but to also provide input on the various aspects of the writing as well, as we’ve now seen how great a writer, as well as an artist, you are on your own TLs, starting with Kolyma’s Shadow.

    To @Dan1988, for your frankly inexhaustible determination to really dig deep into the minutiae of this TL, focusing on the kind of detail and incidental richness which makes it feel real and like a universe that’s properly lived-in, as opposed to a mere backdrop for a staged story, culminating in the two interlude posts which you were kind enough to write for me (along with your help writing other updates). In addition, you’ve almost certainly written more process notes for this TL than anyone else, quite possibly even including myself, and it’s honestly very touching that anyone would take such an avid interest in making my TL “work”. It’s great to see you finally writing a TL of your own!

    To @Thande (yes, again – you get two mentions, don’t you feel special?), I blame you for creating my TV Tropes page, for urging me to maintain my optimistic tone throughout (which made for all the more contrast with the wave of dystopic timelines which were popular when I started writing), for your frank and forthright advice about British culture and politics, and for recommending this TL often as one of your favourites, high praise indeed from such an influential voice on this board. (It’s finally done; you can now officially add it to your list of favourite finished TLs, after having jumped the gun almost two years ago.) And for having taken the time out of your vacation schedule to meet with me in person on multiple occasions, resulting in much fruitful discussion on AH along with other matters. (Speaking of which, since Roem and Meadow seem to be erring on the side of caution when it comes to publishing pop culture TLs through Sea Lion Press, perhaps the time might finally be right for a resurrection of Cronus Invictus on AH.com? Just a suggestion…)

    And last, but certainly not least…

    To @e of pi. After you sought some advice from me with your own TL, the now-completed Eyes Turned Skyward, I very cannily convinced you to help me with the space program in TWR in exchange, and to be frank, I got by far the better of our now-concluded arrangement. You’ve proofread every update going as far back as I don’t even remember when, you’ve been a sounding board for me to bounce off every conceivable idea on every conceivable subject with regards to this TL’s development, not to mention you’ve provided more than a few of your own. We’ve directly and fruitfully collaborated on many of this TL’s best moments, often after literally hours of discussion brainstorming them. And you’ve always lent your time and your creative energy willingly and indeed with great enthusiasm, because of your honest passion for this TL being the best it can be, despite your lack of personal stake in it; indeed, you’ve often supported TWR and encouraged others to do so even when we were in direct competition with each other, which demonstrates incredible generosity the likes of which very few are capable of mustering (and I am certainly not one of those few, myself). I can’t imagine how much poorer this TL would be if not for your contributions, and despite being an AH writer I don’t think I want to try. And (although this is partly a testament to just how long it took me to write this timeline) you’ve gone from a total stranger to one of my closest friends.

    ...

    Although the content posts have now concluded, of course I intend to continue monitoring this thread and would be more than happy to answer any further questions and respond to any further comments that might arise. As @Cluttered Mind (the very last new poster before the TL concluded!) was good enough to remind everyone, I am continuing to conduct two surveys: one of the birth year of my readers, and one of their favourite episodes of Star Trek, so now is the time to resume canvassing those of you who have not yet participated; I intend to post the final results on July 18, 2016 (a Monday), four years and eight months after I started writing this TL. But don’t feel obliged; whatever you might have to say, I’m more than happy to read it.

    ...

    But whatever you might have to say, for the time being, I have nothing more to say, except…

    That’s All, Folks!

    ===

     