Going Global

    New Hollywood was, above all else, a confluence of disparate stimuli: a response to the situation on the ground in the entertainment industry in the mid-to-late-1960s; the sociological ramifications of the “Generation Gap” so keenly felt by the maturing Baby Boomers; the growing artistic significance of the auteur theory in intellectual discourse; and the direct influences of avant-garde filmmaking movements in other countries, most notably the Nouvelle Vague in France (a country which had always been a trendsetter in motion pictures, dating from the invention of the medium itself in the late-nineteenth century). It was therefore natural for the emerging movement to become a product of its time – a period of profound social tumult, followed by an illusory peace and prosperity, which in turn soon collapsed as a result of the precarious geopolitical situation. But New Hollywood limped on through the 1970s, partly as a result of inertia and partly because the cohort of directors which had emerged from the shiny new film schools were (after a fashion, in some cases) producing major hits – Jaws was a smash for Universal, and Journey of the Force was even bigger for Paramount. This came at a time when many studios were rather desperate for hits, and many directors (other than George Lucas) asked for nothing more than greater creative freedom going forward. However, it soon became apparent that the degree of creative freedom sought by directors directly correlated with the size of their budgets.

    Francis Ford Coppola, one of the leading lights of the New Hollywood movement, had already gone bust in the early-1970s, before he was forced to capitulate to the studio in the making of The Godfather – a smash success, though it did not win Best Picture (losing to the cynical, sophisticated musical Cabaret). The Godfather Part II followed (and also lost Best Picture, to the neo-noir Chinatown, though Coppola at least won for Best Director), but after that, Coppola’s schedule seemed remarkably clear. He had initially intended to adapt Heart of Darkness before letting the writer of the planned screenplay, John Milius, handle the project himself. He then moved on to one of his dream projects, a biopic of the enterprising automotive engineer Preston Tucker, though he had an ambition far beyond the traditional constraints of that hoary genre. In fact, he had planned to follow Godfather with his Tucker biopic before he was distracted by Heart of Darkness. [1] Perhaps the extra time he had to incubate his idea endeared him overly much to certain concepts which he (or anyone else) might otherwise have rejected in the more sober, collaborative conditions of formal Hollywood brainstorming. Marlon Brando was given the lead role of Tucker, despite being a decade too old for the part, and (as it was later discovered) grossly overweight. [2] Paramount, which had produced the Godfather films, balked at the budget demands. So Coppola looked around for a new partner for his idea, and he did find it – or so he thought.

    Outside of Hollywood, the Transamerica Corporation had commenced operations in the insurance sector, where it had enjoyed considerable success; this provided it with the wealth it needed to expand into a conglomerate. Yet by 1981, Transamerica was in trouble; the ambitions the life insurance company-cum-conglomerate had harboured of becoming the next Gulf+Western had come to naught; all their years of hard work had yielded only illusory rewards. Out of this came one of the biggest surprises in Tinseltown.

    This was brought about through the doings of one of Transamerica’s subsidiaries, the venerable United Artists (UA) movie studio. Originally founded in 1919 by the popular movie stars Charlie Chaplin, Douglas Fairbanks, and Mary Pickford, as well as the film director D.W. Griffith [3], as a means of distributing their own films without relying on the major studios, it had since diversified its assets, which had grown to include a music label, a radio station in Philadelphia, and television stations in both Puerto Rico and Cleveland. United Artists had always been one of the smaller studios, which had driven the company to take pronounced creative (and financial) risks in order to attract audiences. However, throughout its existence, United Artists was never consistently successful in doing so. But that which had kept them down in the Golden Age – the absence of a theatre chain of their own – meant that they were not at all hampered by the court-ordered “divorce” spurred by the Paramount Decision in the late-1940s, and from then on United Artists enjoyed a particularly fertile period, with their film properties (and their accompanying soundtracks, available through the studio’s recording division) enjoying unprecedented success. No better example of this synergy could be found than the James Bond franchise, which launched in 1962 and in which they eventually purchased an ownership stake (which compounded the return on their investment). Although the Bond-mania of the 1960s had faded, 007 had seen longevity unrivalled by any film property of the post-serial era. Eleven instalments had been produced by 1980, with a twelfth on the way. [4] On the other hand, their television division had a spotty record, with The Troubleshooters, The Outer Limits (an influence on the development of Star Trek), and the iconic (for better or for worse) Gilligan's Island counting amongst their few successes.
[5] As was the case with many of the other studios, United Artists Television largely focused on syndicating its film library, including many of the pre-1950s Warner Bros. films and their Popeye cartoons. United Artists did not escape from the emerging era of conglomeration, and Transamerica bought them out in 1967 – the year after Gulf+Western had purchased Paramount. Transamerica had been rumoured to have been interested in acquiring Desilu, but Lucille Ball turned them down, claiming that “I wouldn’t sell to Charlie Bluhdorn – and he offered me a lot more than you did.” That Transamerica had wanted to purchase Desilu was not altogether surprising, considering what endeavours they – and their successors – would later attempt under the United Artists umbrella.

    UnitedArtistsLogo1970s.jpg


    When Transamerica took over UA, they created a new logo incorporating the then-current Transamerica “T”.

    The period which followed Transamerica’s acquisition of United Artists was not without its growing pains. New ownership always equated to new ideas, even when they were radical departures from what had been in place before. The studio had frequent clashes with executives at Transamerica; one frequent complaint was that United Artists released an exceptionally large volume of movies which had been rated “X” by the MPAA, usually for the high degree of sexuality present therein. Although all of the studios released “X”-rated films to some degree, and these pictures tended to do well with adult audiences, Transamerica preferred to focus on more “family-friendly” fare. [6] It had repeatedly demanded that the logo and byline “A Transamerica Company” be struck from such prints, and United Artists just as often refused to do so. The conglomerate had even contemplated re-branding, replacing UA with “Transamerica Films” and spinning off the music and television divisions, but the continued success of United Artists Records stayed their hand. Meanwhile, the handful of broadcast stations purchased by the corporation – most notably WUAB-TV, the leading independent station in Cleveland, which they had hoped to develop into a superstation to rival Ted Turner’s WTBS in Atlanta – were also successful, and a ready market for the product provided by the other divisions. (WUAB-TV filled hour upon hour of Saturday mornings with Popeye cartoons, hoping to recreate the immense popularity garnered by the Looney Tunes starting in the 1950s.) [7]

    The impasse between Transamerica and United Artists would finally come to a head with the production of Francis Ford Coppola’s biopic of Preston Tucker. [8] Having finally found his partner – and his source of financing – in United Artists, Coppola – the pioneering New Hollywood auteur, an Academy Award winner, and the director of two proven box-office and critical successes already regarded as being among the greatest films ever made (and the best not to win Best Picture, other than Citizen Kane) – succumbed to his rapidly-inflating ego and became, to put it delicately, rather difficult. Coppola’s perfectionism and exacting aesthetic standards resulted in virtually the entire film being shot on lavishly-constructed soundstages, eschewing the affordability of location shooting in favour of absolute control over what was being filmed. Not surprisingly, production costs very rapidly began to escalate. In fact, the ludicrously-detailed backdrops and sets were – on more than one occasion – rebuilt from the ground up (despite having originally been constructed to Coppola’s own precise specifications) because they didn’t “look right” once they had been realized. In addition, the sets were so large that they filled almost the entire studio space, leading to their popular nickname of “fire traps”. [9] Coppola’s precise attention to detail did not merely encompass what he shot, but also how he shot it, demanding multiple takes until he was satisfied, leading to the popular claim that he exposed one million feet of film stock in the making of the picture. [10] The film was also a musical, with original score and songs composed by Leonard Bernstein with lyrics by Betty Comden and Adolph Green – United Artists had hoped to sell the soundtrack on their label, noting the tremendous success enjoyed by the Greased Lightning soundtrack LP. [11] This necessitated further costs, both for the intricate choreography and for the extended rehearsal time.
Marlon Brando, the lead actor, was dubbed – but only after the studio went to the expense of recording all of his vocals for the soundtrack, in order to save face. Brando also did not bother to study the mannerisms or even the life history of Tucker, instead choosing to use his “Method” to realize the character. Preston Tucker’s real-life children and grandchildren were involved in every aspect of production, frequently clashing with Coppola (particularly with regards to Brando’s portrayal). [12] Delays in pre-production continually pushed back the start of filming, which resulted in further raises to the salaries of the key cast and crew members to keep their schedules clear; principal photography, once it had finally commenced, lagged immediately. After one week of filming, the crew was six days behind schedule. During post-production, studio executives balked at the nearly five-hour length of the workprint that Coppola had assembled – his rough cut was little better, at three-and-a-half hours long (even the most epic of musicals, such as The Sound of Music, rarely topped three hours in length). [13] Even after the film was trimmed considerably, to under three hours, it was critically lambasted and largely ignored by audiences – starting with its disastrous première in Chicago. The film became the largest box-office bomb in history, in terms of its paltry grosses when stacked against its massively inflated budget: costs had ballooned to $40 million, but the film generated revenues of only $3 million, for a 92.5% loss. [14] After a one-week run, the film was pulled from wide release, and United Artists was ruined. The studio declared bankruptcy, and this proved the final straw for Transamerica. The conglomerate bowed to the demands of its stakeholders, withdrew from the motion picture industry altogether, and put United Artists and all of its associated properties up for sale.

    In the face of a major global recession, interest was tepid, until an unlikely buyer came onto the scene, one noted for his persistence and willingness to take risks. When Israel Asper, the owner of Canwest Global Broadcasting, made his offer, he was positioning himself to join a long list of Canadians who had left their home and native land for the bright lights of Hollywood: among them had been the actress Mary Pickford, an original investor in United Artists; the film mogul Louis B. Mayer (the third letter in MGM); and the Warner Brothers. The efforts Asper had made to expand the Global Television Network were paying dividends, thanks in large part to their flagship show, SCTV. However, he had a problem. Thanks to the Canadian Content policies, or CanCon for short, which had been mandated by the federal government, Asper had difficulty finding programming for his growing network – despite the noble (and perhaps overly naive) intentions of legislators, it was American programming, as opposed to Canadian (or even Commonwealth) programming – with exceptions, such as Doctor Who – that continued to draw Canadian audiences to their television sets. This was a fact of simple economics – the American market was large and robust enough that any shows which failed to attract an audience could have their costs written off by the networks and studios, who had counterbalancing successes elsewhere. By contrast, the Canadian market was small; even popular homegrown programming would have great difficulty recouping its costs – let alone offsetting the vast majority of (unprofitable) network offerings – and so it was much cheaper to simply schedule already-produced American series than to invest in an almost-certain loss.
Hence, despite the Canadian programming in its schedule – often cheaply produced, or not under direct threat by American competition, such as local programming, children’s shows, game shows, newsmagazines, and sketch comedy – Global had conversely built its success on American programming, especially dramatic series. [15] Asper needed a way to ensure that he would maintain the schedule he knew his audiences demanded, while staying on the right side of Canadian law.

    United Artists was Asper’s ticket to maintaining Global’s success, and further improving its position relative to rivals CTV and the CBC. Owning an American motion picture and television studio would allow Asper to produce his own programming at the same level as the American production companies. He could produce “Canadian” programming which fulfilled CanCon regulations but was essentially American – shows which just happened to be filmed in Canada with a mostly-Canadian cast and crew (as minute a proportion as Canwest could get away with) and which would easily find ready buyers in the American market; this took advantage of a loophole in the CRTC regulations that supporters of CanCon simply had not anticipated. [16] As with his Canadian acquisitions, Asper saw value in United Artists’ broadcasting presence, despite its relatively small size compared to that of its competitors; there would definitely be room for him to expand his television network in the United States. On all counts, United Artists was the perfect solution to Canwest’s woes. His marketing skills enabled him to assemble a consortium of Canadian investors, including some of the wealthiest families in the country (most notably the Toronto-based Thomson family), to back his proposal – Canwest held a 50.1% stake in the consortium which bid for United Artists.

    Yet he knew that acquiring the studio would be a difficult process, and Transamerica had obvious reservations about whether he was the “right” person to purchase United Artists – even though the board of directors would be perfectly happy to divest themselves of “that troublesome studio”. Regardless, Asper remained persistent. He remembered how in 1974, the original owners of what would become his flagship Winnipeg station – then known as KCND-TV – had balked at his then-radical idea of transferring the station to Manitoba from its original city of licence in Pembina, North Dakota. An unending series of flights to and from Houston to confer with KCND’s owners had finally allowed the transfer to take place; so it would be in this case as well, with Asper becoming a regular visitor to the famed Transamerica Pyramid in San Francisco to meet with the board and persuade them to sell their studio. He also travelled to Hollywood, to ingratiate himself with the UA management.

    Essentially, Asper made Transamerica an offer they couldn’t refuse. He would buy United Artists – lock, stock, and barrel – for $350 million. Building upon a previous offer, made to the original owners of United Artists back in 1951, Asper offered to exercise control of United Artists for five years, and if the studio became profitable, he would have the option to assume permanent ownership of the company. [17] However, the most valuable promise he made was not to Transamerica, but to the people at United Artists itself. He promised to give the studio as much creative freedom as they wanted in making their films. Unlike Transamerica, he would wear the “X” rating with pride. After all, he wanted to have quality content on his Global Television Network. But he insisted that budgets be carefully monitored and controlled; there would not be another Tucker on his watch.

    Once the media on both sides of the 49th parallel caught wind of the acquisition, their reactions varied widely. In Canada, the news made headlines, with whole sections covering the transaction from different perspectives (business, finance, life, entertainment, and – of course – the editorial pages). In fact, considering the media reaction, it came as something of a surprise that “Izzy”, as he was universally known, was not chosen as the Canadian Newsmaker of the Year, given how thoroughly talk of the United Artists buyout overshadowed what many observers considered other, far more important stories. In the United States, however, apart from trade journals – including Variety (which dubbed it “the sale of the century”) – the Wall Street Journal, and the Los Angeles and San Francisco media (most prominently the Los Angeles Times and the San Francisco Chronicle, respectively), coverage was comparatively scant. This made a great deal of sense – for Canada, the purchase of a Golden Age film studio by a Canadian conglomerate was the “new foothold in Hollywood”, but for the Americans, UA had been just one studio among many. More ink was likely devoted to the coverage of the purchase of RKO by Desilu Productions back in 1958, even by the industry press (which took umbrage at upstart, one-time B-listers who had risen to the top of an equally upstart industry and were now attempting to muscle their way into the establishment). On American television, ironically enough, the news was relegated to the entertainment segment of most local newscasts (though every UA-owned station, unsurprisingly, covered the acquisition in considerable detail). During the run-up to the purchase, however, negotiations hit a minor snag with regards to UA’s broadcasting operations – the two television stations and the radio station in Philadelphia (plus a long-standing construction permit for a station in Houston).
Although anti-trust regulations with regards to corporate ownership of broadcasters had been loosened under the Reagan administration, foreign ownership restrictions on media outlets remained firmly in place. A cap on direct foreign investment into broadcast stations, originally set at 25%, had risen only to 50%, less one share. [18] This seeming speedbump threatened to derail Asper’s purchase, and it was undoubtedly the primary criticism from the American media perspective – sensationalism and nativism, after all, often went hand-in-hand. Canadian reporters observed the situation while trying rather desperately not to gloat – those in the culturally protectionist Dominion were glad to see the proverbial shoe on the other foot. Asper did his best to silence them as he devised an extraordinary solution: United Artists’ broadcasting operations would be spun off into a separate company, retaining the United Artists Broadcasting name, with Canwest owning the maximum 50%, less one share, while the local business communities in the respective markets would collectively own the remainder. This was deemed an acceptable solution by all sides, and was integrated into the deal.

    However improbable it might have seemed, Israel Asper and Canwest Global Broadcasting cleared every roadblock to achieve their purchase of United Artists. Although Canwest was forced to adapt to their new, American-based division, day-to-day operations continued apace. Due to his unyielding loyalty to the province of his birth, Manitoba, Asper chose to maintain his headquarters in Winnipeg; he resisted calls to move to either Toronto (home to the flagship station of the Global Television Network) or Los Angeles (home of the United Artists studio). The bookkeeping and financial reporting at UA, and in all its divisions (including United Artists Broadcasting), would be revised in order to comply with Canadian tax laws, which strictly prohibited the “creative accounting” which was rife in Hollywood; this made United Artists the first studio to voluntarily change their accounting standards after the Trial of the Century. Asper, aware of the good press this would bring Canwest, invited the “rogue accountant”, C.A. Baxter, to his studio to accept his commendations. Naturally, this did not earn Asper any friends in Tinseltown, though it remained to be seen if this would matter in the long run. A more immediate focus for Canwest, meanwhile, was expanding Global into new markets; the third-largest network in Canada still had huge gaps in coverage throughout the Dominion. But try as Asper might, United Artists would change Canwest, as a company and as a brand. It was no longer just another conglomerate which managed a television network and acted as a holding company for its owned-and-operated stations. Overnight, it became one of the major media companies in Canada – and in North America…

    ALT United Artists logo under Canwest.png

    The new United Artists logo, incorporating the Canwest Global stylized “G”; otherwise it was a straight lift from the previous logo.

    ---

    [1] Coppola purchased the rights to Preston Tucker’s life story in 1976, IOTL and ITTL. Although he is involved with the production of Heart of Darkness ITTL (winning Best Picture as Producer and notably succeeding where the two Godfather films had failed), the production of what became Apocalypse Now IOTL was a much longer, more arduous process, and Tucker was shelved for ten years, before finally being released in 1988.

    [2] Brando was considered for Tucker (who was 45 years old when the film is set, and who died at age 53) during the 1970s pre-production period, but was both too old and too far out of the studios’ favour by the time production began in the mid-1980s: the role was instead cast with Jeff Bridges (who was actually too young for the part, ironically enough).

    [3] And the former Secretary of the Treasury (and future U.S. Senator from California) William Gibbs McAdoo, son-in-law to President Woodrow Wilson, as the silent partner – despite having an equal share in United Artists, owning 20% of the company’s stock alongside each of the four actual united artists.

    [4] The James Bond franchise had also produced 11 films by 1980 IOTL; neither reckoning includes the “unofficial” 1967 spoof Casino Royale.

    [5] Gilligan’s Island was officially “revived” during the late-1970s revival TV-movie/miniseries fad (which, ITTL, also produced Star Trek: The Next Voyage) IOTL as well as ITTL – the Rescue from Gilligan’s Island two-parter aired in 1978 (which, yes, ends with them stranded on the same island after having been rescued), followed by The Castaways on Gilligan’s Island in 1979 (which doubled as an attempted backdoor pilot for a Love Boat ripoff, of all things). Perhaps the most notorious of the Gilligan movies, The Harlem Globetrotters on Gilligan’s Island, aired in 1981 (yes, ITTL too).

    [6] Recall that, ITTL, the “X”-rating was trademarked by the MPAA in 1972 and was able to maintain serious cachet as an “adults only” rating. Pornography (along with, later, obscene “video nasty” exploitation films) was not rated by the MPAA and only appeared in… “specialty” theatres (and, later, on home video), but was sometimes advertised as “rated A” (for adults only), “rated N” (for naughty, nudity, or not rated), or “rated U” (for unrated). At least one wag would refer to such films as being “in the ANUs”, an exceedingly lowbrow (and, fortunately, quite obscure) joke which was nonetheless wholly appropriate for their standard content.

    [7] Over time IOTL, United Artists sold off much of their music and broadcasting operations. For example, “Rikavisión”, the Puerto Rican TV station in the United Artists Broadcasting portfolio, is now owned by Univisión, and is branded “Tele-Isla”; its affiliate in Mayagüez (on Puerto Rico’s west coast) is also an affiliate of Univisión. Likewise, United Artists Records was eventually absorbed by EMI.

    [8] Tucker has the dubious distinction of being to TTL what Heaven’s Gate was to OTL – the single, wretched monument to the New Hollywood way of doing things that capsized it and everything it stood for. Why does such a film come into being ITTL? Because Heaven’s Gate was only the worst of a bad lot. See also: The Last Movie; New York, New York; At Long Last Love; Coppola’s own One From the Heart; etc., etc. It was basically inevitable that creative freedom would eventually be taken to its logical extreme.

    [9] Coppola turned the same trick in OTL with his 1982 film One from the Heart, which involved re-creating modern-day Las Vegas on sound stages, including a set recreating McCarran Airport and the centrepiece set recreating the Strip… instead of actually flying out (or even driving – it’s just four hours away) to Las Vegas. At least late-1940s Chicago was a time and a place rather divorced from late-1970s Hollywood.

    [10] One million feet of film stock were exposed for Apocalypse Now IOTL.

    [11] IOTL, Coppola’s original plan for the Preston Tucker biopic did indeed include musical numbers. This probably didn’t happen because he (and every studio in Hollywood) had been burned by his musical One from the Heart, leaving us instead with the 1988 film we all love and enjoy.

    [12] Preston Tucker’s children and grandchildren were involved as well with the 1988 film IOTL, where Coppola was keen to be as historically accurate as possible, to the point where Jeff Bridges imitated Preston’s mannerisms. The children even allowed Bridges to wear their (grand)father’s cuff links and ring – they notably did not extend this courtesy to Brando ITTL (one of Tucker’s grandchildren was overheard to rather callously remark that “It wouldn’t fit him!”).

    [13] For comparison, the workprint of Apocalypse Now was 289 minutes IOTL, before Coppola edited it down to 202 minutes (which was released in 2001 as Apocalypse Now Redux) – executives demanded the 153-minute cut released to theatres in 1979.

    [14] Although Apocalypse Now (with a budget of $31.5 million) grossed $79 million domestically upon its release (good for fourth place in the year 1979), One From The Heart (with a budget of $26 million) grossed less than a million dollars. Tucker does better than that, but that’s still not nearly enough to save it – or United Artists.

    [15] Recall that the original Global station in Toronto, ON, had driven itself into bankruptcy because it had an all-CanCon schedule, and it had to include American programming in its schedule in order to remain viable – making it no different from its rivals, primarily CTV (the CBC, though it is known at present for airing exclusively Canadian programming, did in fact air American series with some regularity prior to the mid-1990s, but never in as high a volume as the privately-owned networks did).

    [16] One of the earliest and most famous examples of this loophole abuse was Night Heat, which is rather notoriously set in a city which straddles the 49th parallel – any laws or customs which were exclusive to Canada or the United States were omitted. Considering that Night Heat was a police procedural, that took some doing. Other examples, spanning through the ages, include Forever Knight, Kung Fu: The Legend Continues, and all the Stargate series.

    [17] In 1951, two freelance producers approached the remaining owners of United Artists (it was down to just Pickford and Chaplin by this point), with a similar proposal. Pickford was amenable, but Chaplin was initially opposed, changing his mind only when he ran afoul of the US government.

    [18] This increase never happened IOTL – foreign owners are still restricted to the 25% share, even to the present day.

    ---

    This update was co-written with Dan1988, so my thanks to him for taking the time and effort to help me weave these disparate plot elements together! Thanks also, as usual, to e of pi for assisting with the editing. I hope all of you enjoyed the Heaven’s Gate of TTL – a musical about a man who invented a car. Perhaps not the most bizarre topic for such a genre (the Newsboys Strike of 1899 is still the reigning champ, at least IOTL), but certainly a prime contender for the title. Fun fact: many of the anecdotes regarding the production of Tucker are borrowed from real-life films. Bear that in mind: live by the auteur theory, die by the auteur theory.


    On Which the Sun Never Sets

    The British Empire no longer existed in any meaningful sense of the word; more romantic historians would claim that the Empire had nobly sacrificed itself in World War II, to save the world from tyranny. Others were more likely to assign blame for its collapse to one of any number of more mundane, less honourable causes: the anti-colonial foreign policy of the United States under Presidents Roosevelt and Truman; the decolonization process commenced by Prime Minister Attlee as part of his overall focus on creating “a land fit for heroes” returning from overseas through the establishment of the British welfare state; or the disaster at Suez, which obliterated Britain’s influence and aspirations to superpower status. Most likely, it was a combination of all the above factors. The British Commonwealth of Nations, which promoted peace, co-operation, and unity through diversity, could claim to be a successor organization to the Empire, but it was primarily a ceremonial fellowship, with aspirations to continued economic and cultural integration yet no remaining political ties. In fact, many members of the Commonwealth had become republics and no longer recognized the British monarch as Sovereign, though Queen Elizabeth II remained the symbolic head of the organization, and made sure to mention it and the work it was doing in every annual televised Christmas broadcast she gave.

    Her grandson, His Royal Highness Prince William Arthur Philip Louis of Wales, was born on February 15, 1981, and was second-in-line to the thrones of all the Commonwealth Realms (which recognized Elizabeth II as Queen, as opposed to mere members, which did not) from that time forward, in the direct line of inheritance behind his father, Charles, the Prince of Wales. [1] His mother, the Princess of Wales, was 23 years old when she delivered their son; Prince Charles was 32. Prince William was given a lavish christening in the spring, celebrating his induction into the Church of England, of which all English (and later British) monarchs since Elizabeth I had been Supreme Governor – the event was steeped in rite and ritual, with Prince William (said to be a quiet, well-behaved infant) dressed in the ancestral christening robes which had been tailored for his great-great-great-great-aunt Victoria, the future German Empress, in 1841, and bathed in water brought in from the River Jordan. That Prince Charles had provided a Y-chromosome in the conception of his first child served to quell a raging debate in the run-up to the birth (the parents had declined to find out the sex ahead of time) – spurred in part by the removal of Princess Anne from the line of succession nearly a decade earlier: whether male-preference primogeniture was “fair” in a society where women sought full equality to men, and had been making major strides in that direction. [2] The subject had even been raised during the 1973 Commonwealth Heads of Government Meeting in Ottawa, and had helped to capsize the planned Ottawa Accord – as was so often the case with fundamental constitutional changes, consensus could not emerge with regards to making just one modification and leaving everything else alone.
Absolute primogeniture became a topic of discussion again at the 1981 meeting in Melbourne, but William being a boy meant that any such changes would likely not come into effect for another half-century or more – his hypothetical daughter might not be born for another 30 years. Or he might only have daughters, as George VI did, postponing the issue until the next generation. Perhaps most significantly, all of the other European monarchies continued to use male-preference primogeniture – by its very nature, the United Kingdom was not the sort of monarchy to pioneer such a drastic change to an ancient custom. [3]

    Despite the robustness of the line of succession and the continuing popularity of the British monarchy, the vestiges of what had once been the most extensive empire the world had ever seen were continuing to unravel. Britain’s last remaining colonial possession on the mainland of the Americas, British Honduras, had long sought to follow the lead of many other former colonies in the region and secure independence, though its efforts to do so had been hampered by the active interference of a hostile, irredentist neighbour – Guatemala. The United Kingdom knew better than to withdraw from the territory without any guarantees, which were not forthcoming from the military junta. Being one of the Great Powers, Britain eventually sought to isolate Guatemala in the UN, gradually bringing all of the other countries in the region onside against Guatemala’s claims to British Honduran territory; this was, nevertheless, a process which took many years.

    Meanwhile, in Canada, a proposal tabled by NDP MP Max Saltsman, which also received early support from Dan McKenzie, a (backbench) Government MP, sought to fulfill a long-time national dream: the acquisition of territory in the Caribbean. [4] Sir Robert Borden, the Prime Minister during World War I, had sought to integrate much of the British Caribbean into Confederation, much as Australia was doing with many island territories in Oceania, but was rebuffed by Westminster. In the years since, the “snowbird” phenomenon – referring to Canadians who wintered in warmer climes, typically the Southern United States (and especially Florida) – had made itself known, and providing the Dominion with her own territory in the Caribbean would help to retain tourist dollars in the Canadian economy [5] – it would also furnish part of the greatly impoverished region with the benefits of the robust Canadian welfare state. Saltsman and McKenzie – in a charmingly bipartisan move, and one which highlighted the compatibility between the PCs and the NDP over the party ostensibly “between” them ideologically, the Liberals – favoured the Turks and Caicos Islands, a small crown colony which had until 1959 been a dependency of Jamaica; the Governors of Jamaica and then the Bahamas had overseen the affairs of the small territory until each colony gained independence. The Turks and Caicos were small and sparsely populated – extending the “safety net” over them would not prove overly taxing on Canadian infrastructure. [6] They also received many Canadian tourists already; these would not need to be lured over to their new territory.
    Saltsman tabled his bill shortly into the term of the majority government which Stanfield’s Tories won in 1974, and – though the legislation was watered down from his original proposal, favouring only “creating a dialogue with the United Kingdom in regards to the future disposition of that crown colony called the Turks and Caicos Islands” – Stanfield followed through, communicating with the new PM at Westminster, Willie Whitelaw. The United States was quietly (and secretly) informed of continuing negotiations as well; a handover of British territory to an effectively sovereign state in the Americas was not perceived as being in violation of the Monroe Doctrine, but the USA was a close ally of both Canada and the UK, and it was deemed unseemly not to inform the White House of their intentions. President Humphrey, who was still in office when negotiations commenced, offered them his full support, and negotiations carried on clandestinely, against the backdrop of the more transparent situation regarding British Honduras. Indeed, the fates of the two were eventually intertwined.

    It was decided that the Turks and Caicos Islands would be admitted to Canada as a third territory – a very different one from the two that already existed (the Northwest Territories, established in 1870 with its acquisition from the British Crown, and the Yukon Territory, established in 1898 after the Klondike Gold Rush – both of which were mostly within the Arctic). [7] This territory would enjoy responsible government as the other two did, and would elect one MP to the House of Commons; this MP would enjoy the smallest constituency (an estimated 6,000 people – less than one-third the size of the next-smallest constituency, the 22,000 people of the Yukon) in Canada, and would have one appointed representative in the Senate. [8] The United States suggested – and both Canada and the United Kingdom accepted – that the terms of the agreement be submitted to the people of the colony in a referendum. Most polls showed the referendum passing in a landslide, which it did, with 90% in favour of joining Canada – independence was not included as an option in the referendum, despite minor agitation on the part of local residents. [9] (The US President at the time of the referendum, Ronald Reagan, was said to have been “disappointed” by the lack of this option, but did not challenge the results). British Honduras (renamed “Belize”) gained independence and the Turks and Caicos joined Canada as its third territory (the first enlargement of the Dominion since Newfoundland was admitted in 1949) on the same day: July 1, 1981 – Dominion Day in Canada, and now in Belize as well (which retained Elizabeth II as Sovereign and Head of State). [10]

    One of the reasons that the Turks and Caicos Islands had been transferred so readily to Canadian sovereignty (notwithstanding that Canada and the UK were in personal union) was that they had been considered as the host site for a spaceport which would be jointly operated and funded by both countries. The inspiration was the Guiana Space Centre in Kourou, French Guiana, which had been the base of operations for the French National Centre for Space Studies since 1968, and then for the European Space Agency, or ESA, once it had been formed in the mid-1970s from the merger of several precursor organizations (some of which, such as the European Space Research Organization, had counted the United Kingdom as a member, though Britain voluntarily “withdrew” her membership as it became clear that she would not be joining the EEC). Britain did not have the resources to launch rockets into space alone; partnership with France had been sought in the 1960s to that very end, but joint efforts had proven disastrous. The Commonwealth Trade Agreement which had emerged as a “temporary placeholder” for British (and, by extension, Irish) designs to join the EEC evolved to include a space exploration component, starting with tentative proposals at the Commonwealth Heads of Government Meeting in (appropriately enough) Kingston, Jamaica in 1975. [11] This would allow the other founding nations of what would become known as the Commonwealth Space Agency (CSA) – Canada and Australia, with additional Commonwealth countries potentially joining further down the line – to co-ordinate their own space programs with the existing foundation of British efforts to develop a workable launcher; this joint effort would provide assured launch capabilities and the prestige of a more expansive program which none of the member nations would be able to achieve individually.

    In light of this, the Turks and Caicos were strategically located: rockets launched toward the east (necessitating that no inhabited land lie beneath their flight path) and benefited from launching closer to the Equator – East Caicos, the proposed launch site, lay within the tropics, just south of the Tropic of Cancer, and there was nothing east of the Lucayan Archipelago until the Sahara. However, plans to base CSA launches there were soon abandoned as impractical – although it would have created many jobs for Canada’s newest territory, the islands were too remote to be effectively supplied and too small for effective infrastructure to be maintained. Instead, Australia would host the launch site, continuing her historic connection with the development of British rocketry. However, instead of using the traditional launch site at the Royal Australian Air Force base at Woomera, in South Australia, a new one was to be built in Far North Queensland, near the city of Cairns, which was much closer to the Equator (even when compared to East Caicos) and better-connected to existing transportation links, which made it a better prospect for a long-term, intensive space program. [12] The Premier of Queensland, Joh Bjelke-Petersen, notoriously crowed over this coup by claiming that “Nuclear missiles may have detonated over South Australia, but space-borne rockets will launch from Queensland”, referring (rather callously) to the increasingly controversial legacy of British nuclear testing at Maralinga. [13] The first order of business for the “Big Three” contributors to the CSA enterprise was to determine a means for launching their payloads into space…

    The disparate reaches of the Commonwealth (and even beyond, including those countries that had fought for and won independence from the British Empire) were united by far more mundane and accessible means than grand political ambitions: they were bound together by the common airwaves. One exemplar of how television helped to shorten distances was Are You Being Served?, which entered the 1980s as one of the longest-running and most popular sitcoms in British history, though it had evolved a great deal from how it had originally been conceived. Trevor Bannister, who played the original male lead, Mr Lucas, remained with Are You Being Served? despite his vocal displeasure at seeing what had originally been his star vehicle stolen from him by the exceptionally talented supporting players, largely because his role was expanded following the departure of Arthur Brough as Mr Grainger, the original department head in menswear. [14] John Inman’s character Mr Humphries, the associate in menswear, was promoted to the new senior, though Mr Lucas remained as the junior (with his lack of promotion frequently mined for comedy). Mr Rumbold, the floor manager, was very pleased with this result, famously expounding upon his reasoning in a discussion with Captain Peacock, the floorwalker:

    “Mr Grainger did half the work we’d expect of a salesman at Grace Bros., and so did Mr Lucas. That’s three sets of salaries and benefits paid for the work of two salesmen.”
    “But now you’ve eliminated Mr Grainger’s position and are still half a salesman short.”
    “Yes, but we’re now at one-and-a-half out of two, instead of two out of three – our efficiency ratings have improved from sixty-seven to seventy-five percent!”
    “But surely you could hire a third man who could operate at optimal efficiency?”
    “With our luck we’d just as likely find a fellow who’s even worse than Mr Lucas. Then we’d be worse off than where we started. No, it’s much better off this way.”
    “I’m not sure I quite agree with your calculations, Mr Rumbold.”
    “That’s why you’re just the floorwalker and I’m the executive, Captain Peacock.”

    Are You Being Served? suffered the loss of another cast member, Harold Bennett (as “Young Mr Grace”), who (like Arthur Brough before him) had retired at the end of the season prior to his death. The joke of the character, as his name implied, was that this senile, enfeebled mogul was the younger of the two brothers who had founded and owned the department store – “Old Mr Grace doesn’t get about much these days”, came the constant rejoinder to anyone who asked after him. There was some discussion of casting the role of Old Mr Grace, but the problem of finding the right actor – too old, and he might not last, as had been the case for Brough and Bennett; too young, and he might be unconvincing as an older gentleman – stymied the producers. [15] In addition, Bannister had requested – and received – a rather weighty pay raise in order to continue playing the thankless role of Mr Lucas, which went through largely because Bennett’s retirement freed up the resources in the show’s budget to do so. It was therefore decided to make Young Mr Grace an unseen character, further changing the dynamic of the program and reducing it to the core septet: Mr Rumbold, Captain Peacock, Mrs Slocombe, Mr Humphries, Mr Lucas, Miss Brahms, and Mr Mash, the custodian. [16] These seven would remain for the rest of the show’s run, which showed no signs of slowing even after a decade on the air, an impressively long run for a sitcom on either side of the Atlantic. And speaking of the other side of the Atlantic, Are You Being Served? had by this time emerged as a staple series on American PBS, given its (relatively) large number of episodes. The CBC, in Canada, aired the episodes “first-run” (one of several BBC programs which had a regular berth on the network), though in practice with a delay of several months. [17] Indeed, Are You Being Served? 
could be said to have united the Commonwealth – it was so popular in Australia (no doubt aided by there actually being a department store chain in New South Wales called Grace Bros.) that a remake was produced there; all the names were changed, and an all-new cast of actors was hired. Scripts were recycled from the original series; this was the fatal flaw, as the original series was not only still running, but was very widely televised. The Australian remake lasted only one season, and would go down as one of the worst series in the history of the Commonwealth. In fact, it did so poorly that it made the equally inevitable (and short-lived) American remake, Beane’s of Boston (which ran for one season, from 1980-81), seem merely mediocre by comparison. [18]

    One of the biggest and most acclaimed programs on the BBC in the late-1970s, Fawlty Towers, had been in limbo since the conclusion of the second season in 1977; a third season had remained a possibility, though not a contractually-obligated certainty. John Cleese had said as much in an interview that year on Parkinson, famously claiming that “There won’t be any more Monty Python films, but there may yet be another series of Fawlty Towers”. In the years since, Cleese and his wife, former Doctor Who companion Connie Booth – creative as well as life partners – had divorced, though amicably. Both were concerned that the creative possibilities for their sitcom had been exhausted… after only 16 episodes (a relatively modest run, even by British standards). [19] In the end, Fawlty Towers would not see a third season until four years had passed since the second; it aired in late 1981 (alongside the final season of To the Manor Born, in fact – the two programs even aired back-to-back). The entire core cast returned for the eight additional episodes that were filmed, though a new director was chosen, as the previous one was elsewhere engaged. Critics widely regarded the third season as a slightly disappointing follow-up to the previous two – but viewer reaction was extremely positive; ratings were higher for the third season than for the first two, partly because of the pairing with To the Manor Born (which prompted the BBC Director-General to describe 1981 as “a banner year in light entertainment”). 
    [20] On November 29, 1981, in what came to be known as the “Double Event”, both programs concluded – To the Manor Born with a lavish wedding, and Fawlty Towers with Basil Fawlty being run out of Torquay. Fortunately, the latter program aired first, so that the good feelings brought about by the wedding would not be spoiled. Fawlty Towers had ended in so definitive a manner as to preclude the possibility of a fourth season; Cleese and Booth both felt that they had completely exhausted the narrative potential of the show, and knew that they had no interest in coming back to it. However, they did sell the idea to a pair of American writers (brothers, in fact) who had previously worked on Taxi Drivers…

    The “Double Event”, for its part, became the highest-rated regular, scheduled broadcast in British history, with both programs attracting over 25 million viewers (To the Manor Born slightly outperformed Fawlty Towers); the previous record-holder had been the Royal Variety Performance of 1965. [21] This was certainly the pinnacle of success that could be expected for the BBC: both shows, of course, aired on its flagship network, BBC-1, opposite only two other channels: sister station BBC-2, and commercial rival ITV. But a second commercial network would begin airing on January 1, 1982, the result of legislation passed in 1979, though years in the making before that – many remote controls sold as early as the 1960s were labelled “ITV-2” in anticipation of the fourth channel. The manufacturers’ anticipatory labelling was eventually vindicated when the name of the service was indeed announced as ITV-2, thus leading ITV to follow in the path of the original BBC service and rename itself “ITV-1”. [22] Notably, ITV-1 was actually tuned to channel three, and ITV-2 to channel four, as BBC-1 and BBC-2 occupied channels one and two, respectively. Nevertheless, the delayed creation of the fourth television service was enough to fulfill a dream long held by many. Observers in the United States wondered when they would acquire their own fourth network.

    And then there was Doctor Who. Despite the Yank Years being well and truly behind it, the show continued to maintain a cult audience in the United States, many of whom (to their credit, at least in the eyes of certain British fans) preferred the exploits of the Fourth Doctor to the Third; perhaps this was willful contrarianism, given that Pertwee’s Doctor perpetually topped the popularity polls on both sides of the Pond. Jane Seymour, on the other hand, enjoyed a far more unambiguously positive reception as the Doctor’s primary companion, considered either the best since Connie Booth, or simply the best. Aware of the cachet that the role had given her, Seymour departed from Doctor Who after three seasons to appear as the lead Bond girl in The Spy Who Loved Me, playing the Soviet spy Anya Amasova. [23] She had been convinced to remain as long as she did – departing after the 1978 season – purely so that she would stay with the program as long as her predecessor, the reviled Angela Bowie, had; the producers had sought to cleanse the palate of Bowie, and found that such a task required an enormous quantity of proverbial after-dinner mints. The Fourth Doctor himself, Jim Dale, continued past Seymour’s tenure; indeed, he had a hand in choosing the next companion, Joanna Lumley. Lumley portrayed a character more prim and proper than Seymour’s, and was also older, already past the age of thirty. [24] She remained as the principal companion for three seasons, helping to cement the tradition of companions lasting precisely that long. Dale himself followed in the footsteps of his predecessor in departing after his sixth season on the program, in 1981; he was replaced by Richard Griffiths, who was twelve years his junior (and also a year younger than Lumley, one of the reasons why she left the show at that juncture). 
    [25] Griffiths was the youngest Doctor thus far, aged 34 when he assumed the role, and the first born after World War II; however, he looked a good deal older than his age would suggest (and, indeed, perhaps even older than Jim Dale did at the conclusion of his tenure). The question of what Griffiths would bring to the role intrigued many Who fans worldwide. For while the show no longer ruled the roost in the international market, it was one of the many British programs whose influence extended throughout the former Empire, and further beyond…

    ---

    [1] Yes, he has the same name as his OTL half-brother. The “William” comes from Prince William of Gloucester (who, ITTL, survived and became the Duke of Gloucester); “Arthur” from the mythical King in the Matter of Britain; “Philip” from his paternal grandfather, Prince Philip, Duke of Edinburgh; and “Louis” from his (TTL) maternal great-grandfather, the Earl Mountbatten of Burma. (The father of the Princess of Wales is named “John”, a name which by this time has come to be regarded as bad luck by the Royal Family).

    [2] Prince Charles had two sons IOTL; Lady Amanda Ellingworth had three sons. He didn’t seem predisposed to providing X-chromosomes and she didn’t seem predisposed to accepting them, so it only seems fair that their first child be a son.

    [3] The first European monarchy to switch from male-preference to absolute primogeniture IOTL was, unsurprisingly, Sweden, doing so in 1980 – after the birth of Prince Carl Philip displaced his elder sister, Princess Victoria. Legislation stripped him of his title as Heir Apparent at the age of seven months, granting it to Victoria – the Netherlands were next, in 1983, followed by Norway in 1990, Belgium in 1993, Denmark in 2009, Luxembourg in 2011, and (though the law has not gone into effect) the United Kingdom in 2013 (under the Statute of Westminster 1931, each of the Commonwealth Realms determines the succession individually). Notably, only in the pioneering Swedish instance did this apply retroactively, making Carl Philip the first Heir Apparent in a continuing monarchy to have that status revoked since another Swedish Crown Prince, Gustav (son of Gustav IV Adolf) in 1809. But ITTL, a son was indeed the first-born of Carl XVI Gustav, which delays any movement to push absolute primogeniture succession through.

    [4] Saltsman made his proposal to annex the Turks and Caicos in 1974, at which time the Liberal government of Pierre Trudeau was in power IOTL, and it died before being tabled. Dan McKenzie, though he was in office at the time, sat as an Opposition MP with the PC caucus, and made his own (separate) proposal in 1986 (by which time Saltsman had retired), when his party was in government under Brian Mulroney (who was more of a Reaganite/Thatcherite “Blue Tory” in contrast to Stanfield’s Rockefeller/One Nation “Red Tory”) – however, it died in committee. Stanfield, unlike either Trudeau or Mulroney, is more the type to be amenable to such a proposal.

    [5] This was part of Saltsman’s argument for making his proposal IOTL (very much in keeping with the economic philosophy of his party at the time). Canada did (and does) provide a disproportionately large share of tourists to the islands relative to its population.

    [6] The population of the Turks and Caicos in 1980 was 7,413; at present it is approximately 31,458, a more than fourfold increase. This gives it a population comparable to the other Canadian territories, though it also means that it would cost much more to support that population today.

    [7] The Northwest Territories originally encompassed virtually all of Canada between the Great Lakes watershed and the Rocky Mountains, from the 49th Parallel to the North Pole, before various sections (first Manitoba, then the Yukon Territory, and finally Saskatchewan and Alberta) were separated from it. (IOTL, the eastern two-thirds of the remaining territory was reconstituted as the effectively Inuit-governed territory of Nunavut in 1999.)

    [8] Although the apportionment of electoral districts (which are always assigned by province or territory) is subject to a convoluted formula which has been revised several times, the rule of thumb in recent decades is that there are about 100,000 electors for each MP, nationwide. However, each province and territory must have at least one MP, and no province may lose MPs in reapportionment (or rather, fall below a grandfathered threshold), meaning that older, smaller provinces (and the territories, with their low populations) are better-represented than newer, more populous ones. The Senate is a whole other can of worms that will not be elaborated upon at this time.

    [9] Polling showed support for joining Canada at 90% in the 1990s, which is notable because the Canadian economy underwent a significant downturn in that decade IOTL, worse than the American and other world economies. ITTL, the Canadian economy is performing slightly better, relatively speaking, than the American economy in the early 1980s (because of the “Reagan shock”), which is partly reflected in the tourist presence in the Turks and Caicos, and this ensures strong support for the notion ITTL when it is formally put before the electorate (which never happened IOTL). The question put to the electors ITTL is a simple “yes” or “no” question, with no elaboration.

    [10] Belize won independence a few months later IOTL – on September 21, 1981. As noted, it was the last continental holding of the United Kingdom in the Americas – leaving France as the only remaining European power with a physical presence on the American continent (in French Guiana).

    [11] You will recall that Kingston is also the place where the CTA first emerges as something other than the ultimate placeholder.

    [12] The CSA supplants the Canadian Space Agency as well as the British contributions to the ESA. Australia had earlier collaborated on missile designs with the United Kingdom, but did not have a dedicated space exploration agency until the 1990s IOTL; you can consider their earlier leap into the arena ITTL a beneficiary of their stronger economy (because of the earlier disengagement from the overseas quagmire) alongside the golden opportunity to get in on the ground floor.

    [13] Bjelke-Petersen became the Premier of Queensland in 1968, and remained in that position until 1987 IOTL – his rise was too early for me to realistically butterfly, and considering all he went through in his nearly two decades in office prior to his ultimate downfall (and how that downfall came about), it would be difficult for me to unseat him. But on the plus side, at least it gave me a prominent figure into whose mouth I could insert those incredibly tactless words.

    [14] When Brough left the role IOTL, he was replaced by a succession of department heads, none of whom lasted for more than a season (the last two, before the producers threw up their hands and promoted John Inman – though only de facto, as he had understandably grown superstitious of the position – each lasted only a few episodes). ITTL, with the circumstances of Brough’s departure changed, they decided not to replace him. The advantages of not replacing a character are that above-the-line costs go down and that the opportunity exists to focus more strongly on the remaining characters – which is how Mr Lucas is able to gain a second wind despite petering out IOTL. It’s a rather unfortunate coincidence that, IOTL, Bannister left just as the producers finally gave up on attempting to cast a permanent replacement for Mr Grainger.

    [15] They went with the second option IOTL, choosing Kenneth Waller – who was 28 years younger than Bennett, young enough to be his son – to play the role of Old Mr Grace, caked in unconvincing old-age makeup. He lasted just one season before they reached the obvious conclusion of depicting Young Mr Grace as an unseen character; ITTL, I’m bestowing them with a fair deal more insight than they had IOTL, but the producers were experienced sitcom writers, so it isn’t too much of a stretch.

    [16] Mr Mash left after the third season when the actor, Larry Martyn, made another commitment which he could not break; he was replaced by Mr Harman, played by Arthur English (the only replacement character on Are You Being Served? to outlast the original, and be better-remembered).

    [17] Par for the course with the CBC – they failed to air the first season of Doctor Who until 1965, and lagged nine months behind on broadcasts of Coronation Street (despite airing it five times a week as opposed to the three times it aired on ITV) for many years, though they’ve finally caught up more recently, and are apparently “only” a few weeks behind now. IOTL, the CBC did not air Are You Being Served?, and Canadians became familiar with the show the same way that Americans did: by watching it on PBS.

    [18] The Australian version of Are You Being Served? ran for two seasons IOTL, having scored a singular coup in getting John Inman to reprise his role as Mr Humphries, dispatched to the unnamed Australian city (the program was shot in Melbourne) by Mr Grace himself (his cousin owned the Australian store). However, the show suffered the same revolving-door casting as its English parent (the Mr Rumbold and Miss Brahms actors, who were ironically never recast back at Grace Bros., were swapped out between seasons). Meanwhile, the American version, Beane’s of Boston, never got past the pilot stage IOTL; as you might imagine, ITTL it was watered down considerably (with the solitary exception of the Mr Humphries analogue being made unambiguously gay – which, as in the UK, raised the ire of many gay rights groups for stereotyping).

    [19] Fawlty Towers only ran for 12 episodes IOTL, of course, in two seasons which aired, yes, four years apart.

    [20] An apt comparison for the third season of Fawlty Towers is Let It Be by the Beatles, popularly judged to be inferior to the rest of the Beatles catalogue but still superior to the overwhelming majority of pop music. Certainly, Cleese and Booth are deeply dissatisfied with the third season, and Cleese in particular will spend the rest of his life talking it down. (By contrast, IOTL, his favourite episode is “Basil the Rat”, the very last.)

    [21] The Royal Variety Performance 1965 aired on November 14th of that year, on ITV. Every regular, scheduled broadcast that has since outperformed it IOTL (five broadcasts have done so, starting with the divorce of Den and Angie on EastEnders on Christmas Day 1986) did so as a result of aggregated viewing figures (initial broadcast plus repeats). ITTL, the 25 million viewers figure refers only to those watching the “Double Event” on the night of November 29, 1981.

    [22] IOTL, the fourth television service, unimaginatively entitled “Channel Four”, started transmission on November 2, 1982, almost a year later.

    [23] Amasova was played IOTL by American Barbara Bach, with The Spy Who Loved Me released in 1977, the third of the Roger Moore-era Bond films. Seymour instead portrayed the Bond girl in the earlier Live and Let Die, Moore’s first outing as Bond, in 1973 (at the age of 22). But fear not – despite being several years older ITTL, Seymour is still a full decade younger than Michael Billington. Actresses leaving television series in their prime to appear as Bond girls are a time-honoured tradition in the UK – famously, Honor Blackman and Diana Rigg both did so after their stints on The Avengers. (In an odd coincidence, both actresses were the only Bond girls to be older than the actors playing 007).

    [24] Lumley had also appeared in a Bond film, On Her Majesty’s Secret Service, IOTL (and ITTL), after which point she appeared in The New Avengers.

    [25] Griffiths was twice considered for the role of the Doctor IOTL: in 1981 (for the Fifth), and in 1989 (for the Eighth, had the show not been cancelled). As was the case with Michael Billington as James Bond, I felt obliged to take all those near-misses and turn them into a hit.

    ---

    Thanks to e of pi for his advice in the creation of and assistance in the editing for this update!

    Here we have a look at the “modern Commonwealth” ITTL, and what it is trying to achieve as an organization – where it stands relative to British culture, and what British culture has been producing. Receptive markets within and beyond the Commonwealth will always welcome programming that strikes a chord with audiences, and one thing to consider as the CSA forges on ahead is that the telecommunications satellites it will someday launch should only strengthen these bonds further.
     
    Popular Movement
  • Popular Movement

    Football, throughout most of the world, referred to the sport properly called association football, and also known as soccer; ironically, the countries where this was most often not the case were those in the Anglosphere, other than the British Isles, from which the beautiful game had originated in the mid-19th century. This was because many other sports originating from the British Isles had also come to be known as football, particularly those based on the game of rugby (a full-contact sport which, oddly, involved far less kicking and was not played with a round ball). Rugby had followed colonists throughout the British Empire and into the United States, evolving alongside the immigrant cultures of its new lands into distinct forms. In North America, football described a descendant of rugby often called gridiron football, though the sport was played slightly differently in Canada and the United States (the variants becoming known as “Canadian football” and “American football”, respectively). Though it had initially been popular only amongst amateurs and on college campuses, the sport quickly caught on, definitively overtaking the cherished “National Pastime” of baseball by the 1970s in the United States. In Canada, ice hockey continued to reign supreme, but football was clearly in second place, especially given its perception as a homegrown sport, with a native professional major league which – uniquely – was not shared with the United States. [1] This was in stark contrast to the American imports of baseball and basketball (though the latter had been invented by a Canadian, James Naismith). By 1980, Canada had two teams in each foreign league: the Montreal Expos and the Toronto Blue Jays in MLB; and the Montreal Olympians and the Toronto Huskies in the NBA. By comparison, Canada played host to seven of the twenty-four NHL teams, and supported its own top-level professional football organization, the Canadian Football League, with nine teams. 
The CFL enjoyed the open endorsement of the Prime Minister, Robert Stanfield, who was an avid football fan.

    However, Stanfield, being a politician, had a much larger ambition: he wanted to bring the CFL to his hometown of Halifax (which he also represented in Parliament). The largest city in the most populous province of the Maritime region, Halifax had benefited mightily from the pork brought in by the federal PCs, who dominated both the province and the region. The Halifax metro area had a larger population than one of the cities that already supported a CFL team – Regina, Saskatchewan – though the population of the city proper, at 125,000, was smaller. What it lacked, however, was a proper facility to host a professional football team, and when the investors came calling, the federal government was more than happy to oblige – it was “only fair”, after all, given all the investment that had gone into Montreal for the Olympics facilities, the airport, and the high-speed rail lines. Although Halifax was much smaller and less economically significant than Montreal, surely it deserved at least some of those same perks? In this case, it helped that Nova Scotia Premier John Buchanan, a fellow Tory and close Stanfield ally, had agreed to support the construction of a stadium in nearby Dartmouth – both governments provided the funding needed to cover the estimated $6 million construction costs. The funding was in place by 1982, when the CFL granted an expansion team to the “Atlantic Schooners”, as they were officially known, who would play in a 30,000-seat stadium to be completed in 1984. [2] It was just one example of the “arena fever” which swept North America in the 1980s, as many mid-sized cities sought expansion teams from the professional sports leagues, and had to build facilities to entice them. And if expansions could not be sought, then poaching an existing team was the next best thing, and there were multiple examples of this throughout the early 1980s.

    Indeed, there was even a parallel football story within the parallel football league, in Indianapolis, which was a city on a mission. The former “India-no-place”, the nowhere capital of Flyover Country, was undergoing a thorough revitalization process as the 1980s dawned, pioneered by mayor-turned-Senator Richard Lugar, with the ardent support of local business interests, and continued by his successor, Michael Carroll. [3] In hopes of changing his city from “a place to stop over” into “a place to stay”, Carroll sponsored the construction of the Indiana Convention Centre, a sprawling complex which would be headlined by a new stadium, the Hoosier Dome. But as far as Carroll and other backers were concerned, the facilities were an empty shell without an anchor tenant. They needed an NFL team, and they were determined to get one, by any means necessary. But the NFL was not expanding, so they would have to be… aggressive. The Baltimore Colts, at first, had seemed a tempting target – but the Baltimore press had learned of tentative negotiations to relocate, and was able to agitate the populace sufficiently so as to encourage the Colts ownership to seek out local buyers, who purchased the team shortly thereafter. [4] Surprisingly, it was in sunny, prosperous California that the ideal team was found: the Oakland Raiders. Raiders management were dissatisfied with their stadium and threatened to move unless the city paid for improvements; local officials balked, and though management tried to move to Los Angeles by fiat, this was rejected by the NFL. Oakland, thinking that the team was bluffing, continued to refuse to pay for improvements, but it would ultimately cost them; the team relocated to Indianapolis in 1982, after it was shown that the Hoosier Dome would be built to the exact specifications sought by Raiders management. 
It was a powerful lesson to civic planners everywhere: sports teams were willing to move on the mere promise of what they wanted rather than stay put with what they viewed as sub-par facilities. This sparked a series of economic arms races which came to define the decade and its propensity for public works projects: the bigger, the better.

    And then there was the international example: St. Louis. It stood in stark contrast to Indianapolis, having always been problematic as a location for an NHL team. Though Missouri was technically a Midwestern state, it had always retained a strong Southern heritage: it was the birthplace of Mark Twain; a slave state until 1865, and a border state in the Civil War; and the Ozarks, the hillbilly country immortalized by the Paul Henning rural sitcoms of the 1960s – Henning himself had been born in Independence [5] – extended into the southern half of the state. Hockey was, unsurprisingly, quite unpopular there – locals preferred baseball (the Cardinals, established in 1892, had been the southernmost and westernmost MLB team until the 1950s) and, increasingly, football and stock-car racing. [6] Hockey was about as popular in northern Missouri as it was anywhere else in the Midwest, but the Blues had only been in St. Louis since 1967, and many fans remained true to the Chicago Black Hawks. It didn’t help that Missouri was the smallest Midwestern state to have a hockey team, other than cold-weather Minnesota. Indeed, the state had once supported two teams – the Scouts had briefly played in Kansas City before moving to Denver as the Colorado Rockies, taking much of their Kansan fanbase with them. [7] Although hockey had been very good to the Midwest, the unique circumstances in Missouri had mitigated its rise there, and the St. Louis Blues – whose success in the rink was commensurate with their popularity outside of it – were soon put up for sale. Nobody in the Gateway to the West was interested in keeping the team playing at St. Louis Arena; indeed, throughout the United States, buyers were scarce. [8] Therefore, they became the second team to migrate to the Great White North in the early 1980s (after Atlanta to Calgary) when they were purchased by a consortium based in the Ontario city of Hamilton. 
Its leaders had long hoped to see the NHL return to their city (a team called the Tigers had briefly played there in the 1920s), and to that end, built an arena of their own. Unlike in Halifax, the Hamilton investors sought no government funding to do so; they could draw on the resources of the powerful steel industry based in the city, accordingly known as “Steeltown” – a Pittsburgh of the North, although Gary, Indiana was probably more apt in terms of its locale: a city on the Great Lakes, barely outside the shadow of its much larger neighbour. However, Chicago was a much larger and more dominant counterpart to Gary than Toronto was to Hamilton.

    In fact, by 1981, Toronto proper was diminishing relative to its neighbours: it was home to the provincial legislature and a large university, as well as numerous arts, entertainment, and sports facilities, but these were supported by only 575,000 souls. The population of the city had fallen precipitously since 1971, with many people (as in most American cities) withdrawing to the suburbs; Montreal’s population, on the other hand, remained stable, increasing very slightly to 1.25 million. [9] Indeed, the population of Toronto had fallen slightly behind that of its largest suburb, North York (which had 580,000 residents). [10] It and another Toronto suburb, Mississauga, were among the Top 10 largest cities in the country, at #3 and #9, respectively (with Toronto itself at #4). Hamilton, the Steel City of Canada, was the eighth-largest city in the country, with 325,000 people. [11] That four out of the ten largest cities in Canada were located in the “Golden Horseshoe” region – so named because of the approximate shape of the western shore of Lake Ontario which defined it – spoke to the region’s overall population. Hamilton in many ways seemed a rougher, grittier Toronto in miniature: it had a large, prestigious research university (McMaster) within its borders, and genteel, affluent suburbs of its own, including the wealthy city of Burlington. It was no surprise that it would eventually seek even greater prominence through the construction of a new sports arena (and, by extension, the luring of an NHL team). The project even had a star backer in the person of former NHL player, and current entrepreneur, Tim Horton.

    Horton had been one of the hockey greats of his generation, having won four Stanley Cups as a defenceman for the Toronto Maple Leafs in the 1960s. Hamilton was an adopted hometown of sorts for him; he had opened a doughnut shop there in 1964 which, two decades later, had expanded into a chain of 100 shops scattered throughout the Golden Horseshoe region. Expansion might have been even more rapid if not for Horton’s drunk-driving accident in early 1973, which left him paralyzed from the waist down. [12] A changed man, he became an advocate of wheelchair accessibility – ordering the conversion of all his existing shops to allow for the entry of handicapped customers, well before the practice came into vogue elsewhere – and of the growing lobby in opposition to drunk driving. As a consequence (and for synergistic reasons), the arena, which would be named Copps Coliseum after a long-serving former mayor of Hamilton (Victor K. Copps), would be fully wheelchair-accessible and would offer Tim Horton coffee and doughnuts as an alternative to the beer and other food items also available at the venue. [13]

    The Blues were renamed the Hamilton Bulldogs, in a roundabout reference to the previous NHL team to serve the city. That team had been known as the Tigers, but the name could not be reused because the local pro football team was the Hamilton Tiger-Cats; however, the Tigers had been moved from Quebec City, where they were (informally) known as the Bulldogs. The name was chosen over more Hamilton-specific names like “Steelers” in hopes of appealing to the residents of the surrounding areas – many of whom were already fans of the Toronto Maple Leafs or the Buffalo Sabres. [14] Sold in 1982, the team played one last lame-duck season in St. Louis before moving to Hamilton to begin play in the newly-completed Copps Coliseum for the 1983-84 NHL season. The presence of the Bulldogs boosted the Canadian contingent in the NHL to eight teams, against sixteen American teams – restoring the 2:1 ratio that had prevailed during the “Original Six” era.

    Between football in Halifax, and hockey in Calgary and Hamilton, Canada seemed more united than ever in its love and zeal for sport. But Quebec had always been the odd province out. Prior to the Quiet Revolution of the 1960s, it had been a pastoral, socially conservative province – one of the last bastions of established support for the Roman Catholic Church in the Western world. It had been a province that supported the notion of a Canadian national identity while most of the English-speaking country still felt some allegiance to the increasingly moribund British Empire; just as English Canadians finally began to warm to the concept, Quebecers did an about-face, instead identifying as Québécois and favouring secular, socialist values. Many of them also believed that Quebec would be better off independent from Canada, though the nature and degree of this proposed independence were nebulous, at best. René Lévesque, the leader of the separatist Parti Québécois and Premier of la belle province since 1976, supported a system which he called “sovereignty-association”: basically, the Quebec and Canadian economies would continue to be intertwined, but Quebec would otherwise enjoy complete political and legislative independence. It was a system which defied easy comparisons, because it was notoriously vague on the details – but the EEC was often mentioned as something that the new “partnership” could strive towards. It wasn’t something that Canadians – or the Canadian government – would take lying down, however, and they took whatever steps they could to prevent that eventuality from coming to pass.

    The famous Montreal-to-Mirabel “Rocket” line, completed in 1978, had always been intended as the first in a series of high-speed rail lines planned to criss-cross the populous stretch of land along the Windsor-to-Quebec City axis. Mirabel Airport was thriving, despite its great distance from the downtown core, entirely because the travel time was so short. Other Canadian cities sought to emulate this link, particularly arch-rival Toronto, whose own International Airport, though only half as far from the central business district, was still inconvenient to access by car. [15] The provincial government of Ontario, a classic Red Tory administration led by Premier Bill Davis, campaigned in the 1977 election promising to build that high-speed rail link, insisting that Toronto remain competitive with Montreal. His party narrowly won a majority, largely at the expense of the provincial Liberals (whose power base was mostly outside Toronto – from which they were entirely shut out). The federal government, meanwhile, sought to link Montreal, the largest city, with Ottawa, the national capital – the “Federalist line”, as it came to be known (in French, la ligne fédéraliste), was intended to strengthen the physical connection between Quebec and Canada. Premier Lévesque, aware of the symbolic implications of building such a line, nonetheless contributed to its construction, because (in order to attract provincial investment) the line was planned to remain almost entirely within Quebec: travelling from Montreal via the existing Mirabel “Rocket” line, then continuing along the east bank of the Ottawa River, and crossing the river into Ontario only just outside Ottawa, where a bridge was to be built south of the newly-amalgamated city of Buckingham (and north of the Ontario village of Cumberland, some 30 kilometres due east of Ottawa). 
This route was informed by the presence of the Rivière du Lièvre, flowing through Buckingham, which was itself wide enough to merit the construction of a large bridge (meaning that one would have to be built, one way or the other). On the Ontario side of the river, existing rail lines were planned to be replaced with high-speed rail, and a second Montreal-to-Ottawa connection, bypassing the airport, would eventually be built; or, barring that, a route to Montreal from Kingston, via Cornwall (along the St. Lawrence River).

    12.25% of the population of Quebec (or about 800,000 people) claimed English as their mother tongue in the 1981 census; 13.75% (or 900,000) described it as their home language. In absolute numbers, this was a slight increase from the 1971 census; however, as a share of the population it marked a decline. [16] (This was partly due to the rise of Allophones, who spoke neither English nor French at home, and who made up 7% of the Quebec population in 1981.) Nearly all Anglo-Canadian migrants had settled in the West Island of Montreal, or near Mirabel, where large numbers of English-speaking workers were needed at the new airport to deal with travellers from the rest of Canada (along with the US, the UK, and other countries in the Anglosphere); the new domiciles built in Dorval once the airport there had been demolished were overwhelmingly English-speaking, much like many of the surrounding municipalities. These new migrants came to be known in French as les voleurs, in a pun typical of that language (as “voleur” can mean “flyer”, referring to moving in where an airport had either closed or opened, or “thief”, referring to these migrants having “stolen” opportunities from the Québécois). The Parti Québécois sought to strengthen the position of the French language in Quebec as a prelude to separation from the rest of Canada, and Lévesque’s government passed restrictive language laws almost immediately upon gaining power. They were granted Royal Assent in 1978 [17], over the protests of Prime Minister Robert Stanfield – the PM had no leverage against Lévesque, as he had had against Bourassa before him. His advocacy for the Anglo-Quebecer community, however, won him several seats in Montreal in the 1978 federal election, even as he lost support elsewhere in the province (though all of his MPs held their seats thanks to their personal, idiosyncratic popularity). [18]

    The referendum marked a resounding defeat for the Oui column, with 58% of voters rejecting sovereignty-association. Although the secret ballot prevented exact knowledge of how each linguistic community voted, breaking down the districts poll-by-poll indicated that Francophone voters were about evenly divided on the issue – Anglophones were near-unanimously in favour of Non, as were Aboriginals. Allophones, though less decisive in their support of Non, did break heavily in that direction. An estimated 40% of the population were Oui-voting Québécois; turnout was very high, at 90%, an unheard-of number for federal or provincial elections. Lévesque put on a brave face, even claiming that “If I’ve understood you well, you’re telling me ‘until next time’”. [19] But Lévesque and his party were defeated in the 1981 elections by a resurgent Liberal Party of Quebec, which won thanks to the collapse of the Union Conservateur. [20] Raymond Garneau became the new Premier of Quebec, and sought to redefine the place of Quebec within a united Canada. Robert Stanfield, the Prime Minister of Canada for nearly a decade at this point, sought much the same thing, though for very different reasons…

    Interest in aping the high-speed railway program pioneered in Canada was immediate and powerful in the United States, but federal funding (considered a firm necessity, given the massive infrastructure costs) was not forthcoming until the 97th Congress was seated in early 1981 – the Democratic-controlled House (with support from the Wallace Americans, who were, fortunately for the Democrats, known for their love of pork) was enthusiastic about high-speed rail, in particular freshman Rep. George Takei of California, a key early sponsor, who was eager to implement the reforms he had brought to Los Angeles on a much larger scale. He had the support of President Glenn on this issue. The Republican-controlled Senate played along, aware of the need to rehabilitate its economic reputation after the second major recession of the late-1970s was blamed on Reaganomics. The testbed would be the line from Penn Station to the Newark Airport Station, crossing under the Hudson River by way of the North River Tunnels. Announced as part of Glenn’s initial barrage of initiatives during his first 100 days in office, the rail line was completed in time for the elections of 1981, which likely helped secure Governor Thomas Kean (a Republican) a second term in office – more for the promise of future rewards than for the tangible gain of linking Newark with Midtown Manhattan. Although the line was not given a nickname (unlike the Canadian “Rocket” line), it did have a hockey connection – Penn Station was located immediately underneath Madison Square Garden, home of the New York Rangers. This connection had virtually no impact on planning, however – linking a busy railway station with a busy metro airport was intended as a prelude to connecting the entire Northeast Corridor, from Boston to Washington, DC – Governor Kean had signed on to the agreement largely because Newark-to-Trenton had been promised as the first high-speed rail extension from Newark. 
From there, plans were in place to extend the line through Philadelphia, Wilmington, and Baltimore, on the way to Washington, D.C. Where high-speed rail would go from there was a bigger question, and one which hinged on the Glenn Administration’s vaunted “Invest in America” policy. Rep. Takei, unsurprisingly, favoured introducing it to California (proposing a Los Angeles-to-San Francisco line, jokingly called the “Fault Line” by supporters and opponents alike). Other obvious candidates included the Chicago metro area, and the “Texas Triangle” of Dallas, Houston, and San Antonio. Which of these lines, among others, might come next was a question that would determine the shape of transportation policy for the remainder of the decade.

    ---

    [1] The CFL did eventually (and briefly) expand into the United States in the 1990s, IOTL, seeking (as many professional football leagues did at that time) to function as an “alternate” or “parallel” league to the firmly entrenched NFL. On the other hand, as previously mentioned, no American major football leagues have ever entered Canada, and their one OTL attempt to do so would have been blocked by federal legislation had they not backed down. As has been made clear in this update, Stanfield is a huge football fan, and though he’s probably less culturally-protectionist and reflexively anti-American than Trudeau, he would unquestionably share Trudeau’s opinion on this issue.

    [2] There was every intention for the Schooners to play IOTL – a franchise had been granted, though it was conditional upon the completion of a suitable stadium. However, the funding for said stadium never materialized, and the bid collapsed. The CFL has never again made a serious attempt to permanently expand into the Maritimes IOTL.

    [3] The OTL mayor of Indianapolis at the time, William H. Hudnut, who was responsible for many of the same initiatives as Carroll ITTL, was still in the House at the time that Lugar was elected to Senate ITTL (two years earlier than IOTL, remember) and did not run against Carroll in the subsequent general election.

    [4] The Colts were indeed moved to Indianapolis in 1984, IOTL, to the enduring heartbreak of Baltimoreans. As ITTL, the city and its people were vociferously against such a move, so (as my Hoosier consultant e of pi puts it) Indianapolis stole them in the night, moving them out under cover of darkness. The CFL, of all leagues, were able to exploit the void left behind with their own Baltimore-based team, the only one in their ill-fated expansion into the US to be successful. In fact, the NFL felt so threatened by this that they moved the Cleveland Browns to the city in 1996, where they play as the Ravens to this day. (The CFL team then moved to Montreal and became the new Alouettes).

    [5] Independence, of course, is also famous for having been the jumping-off point of the Oregon Trail, a route plagued with heartbreak, dysentery, and “peperony and chease”.

    [6] Yes, by which I do mean NASCAR. Even today it still retains that redneck, hillbilly stigma – back then, it was still struggling to break through into the mainstream.

    [7] Missouri is an intriguing place, geographically. Its two largest cities (and metro areas) are on opposite ends of the state from each other, each bordering another state (Illinois, across the Mississippi River, in the case of St. Louis; and, fittingly enough, Kansas, in the case of Kansas City – in fact, there is another Kansas City in Kansas, right across the border). St. Louis is often called the “western-most Eastern city”, and Kansas City likewise the “eastern-most Western city”; it helps that the mean centre of population for the whole country has been between them since 1980 (IOTL). Kansas City attracted the NFL Chiefs in 1960, the MLB Royals in 1969, and the NHL Scouts in 1974; although the Scouts left to become the Rockies in 1976 IOTL and ITTL, they would then move again, to Newark (as the New Jersey Devils) in 1982 IOTL.

    [8] There were few interested buyers IOTL, either, and the Blues were very nearly sold to interests in Saskatoon, the second-largest city in Saskatchewan (with barely more than 150,000 people at the time, and a metro area of perhaps 200,000, if that – at the time, St. Louis alone had 450,000 people, and its metro area had 2.9 million) before the NHL nixed the deal. The Blues were purchased at the eleventh hour by Harry Ornest, who kept them in the city, where they remain to this day. (ITTL, Ornest – who would IOTL go on to own the CFL’s Toronto Argonauts – was ironically part of the consortium that would move the Blues to Hamilton.) I should emphasize that, even ITTL, Hamilton is the smallest NHL market by a considerable margin – though it has a lot more people than Saskatoon within a reasonable distance, even though it’s sharing them.

    [9] Montreal’s population declined IOTL, as well, from 1.21 million in 1971 to 980,000 in 1981 (a 19% decline). However, ITTL, the success of the Olympics, the demolition of the airport (and related facilities) on the island, the construction of new infrastructure promoting urban renewal, and the less restrictive English-language laws (though the city proper is largely francophone) all combine to keep Montreal robust (which, by extension, retards the growth – or rather, accelerates the decline – of Toronto).

    [10] Toronto had 600,000 people in the 1981 census IOTL, a 16% decline from 1971. North York, just to the north of Toronto proper (think of it as the Brooklyn to Toronto’s Manhattan) had “only” 560,000 people in 1981, though the fact that it had not yet been granted city status in 1971 shows how massive its growth had been in that decade.

    [11] IOTL, Montreal was at #1, Toronto was at #2, North York was at #5, Mississauga was at #8, and Hamilton was at #9. Rounding out the Top 10 are: Calgary (#3 IOTL, #2 ITTL), Winnipeg (#4 IOTL, #5 ITTL), Edmonton (#6 IOTL and ITTL), Vancouver (#7 IOTL and ITTL), and the nation’s capital of Ottawa (#10 IOTL and ITTL). Worth noting is that 1981 marked the final census in which no city in Canada had over one million residents, thanks to Montreal’s decline in population.

    [12] IOTL, this drunk driving accident happened a year later, and it killed him. His business partner, Ron Joyce, then bought out his shares in the company from his widow and turned the company into the powerhouse it is today. Joyce remains Horton’s partner ITTL, and though Horton slows him down, he’s certainly no slouch either.

    [13] Molson would win the contract, in exchange for their votes in support of the move to Hamilton from St. Louis (and hoping to prevent a second round of the Beer Wars – likewise, Labatt and Carling O’Keefe back the move wholeheartedly). “Coffee beer”, though rather different (and a good deal more literally interpreted) than what we understand the term to mean IOTL, would become very popular at Bulldogs games (to the point of becoming the unofficial drink of the team).

    [14] Maple Leaf Gardens, home arena of the Maple Leafs, is a 45-minute drive from Copps Coliseum (less than 70 km, or about 42 miles). Buffalo Memorial Auditorium, home of the Sabres, is a bit farther away at a 70-minute drive (over 100 km, or about 65 miles). These are, respectively, the shortest and third-shortest distances between two NHL arenas ITTL (in between are those of the New York Rangers and New York Islanders, at 85 km, or 53 miles). Even the distance between Toronto and Buffalo (163 km, or 101 miles) was only (just barely) the fourth-shortest before the move (and at this point IOTL), behind the two New York teams and (in a rather impressive three-way photo-finish) the New York Rangers and Philadelphia (at 160 km, or 99.5 miles), and Hartford and Boston (at 161 km, or 100 miles).

    [15] Toronto International Airport would, IOTL, eventually be named for Lester B. Pearson, winner of the Nobel Peace Prize and Prime Minister for five very eventful years (1963-68). Like Dorval vis-à-vis Montreal, it is not located within Toronto proper but a nearby suburb (Mississauga).

    [16] Given that language laws were much harsher IOTL, and were enacted earlier, the English-speaking population of Quebec cratered through the 1970s, down to 10.9% who spoke it as their mother tongue, and 12.7% who used the language at home (and those numbers would only continue to decline in the coming decades, though they have recently stabilized). Many of those emigrant Anglophones – and their business interests – headed for, you guessed it, Toronto, bolstering that city (and more vitally, its service sector) during a crucial period, and sparing it the fate suffered by many other Great Lakes cities IOTL.

    [17] In yet another linguistic pun, 1978 was referred to ITTL by those sympathetic to the government of Quebec as « an de Mirabel », referring to the (Latin) expression annus mirabilis (despite describing the passage of a bill which strengthened the French language), while also acknowledging that year’s launch of the Rocket line… to Mirabel.

    [18] The narrator has not seen fit to mention that one of these Montreal-area Tories is young lawyer – and key Stanfield ally – Brian Mulroney, who ran at the urgings of the PM himself. Mulroney fittingly won the seat that included Dorval. (IOTL, in 1984 he ran for and won the Eastern Quebec seat that included his hometown of Baie-Comeau, but ITTL he instead runs for a Montreal-area seat where he is deemed to have a better chance.)

    [19] Or, as he said in his native French: « Si je vous ai bien compris, vous êtes en train de nous dire à la prochaine fois ! » He said this IOTL as well, and given that he lost almost 3-to-2 in that election, something tells me that he would have said it no matter what.

    [20] Despite losing the referendum IOTL by an even larger margin, Lévesque was able to hold onto power in the election that followed – because the third party that collapsed in 1981 ITTL had been playing spoiler for the PQ (who won anyway) in 1976, as opposed to the Liberals – who, in turn, won anyway in 1989 IOTL, even with the presence of an English minority-rights party (the Equality Party) siphoning off votes – and then the PQ won in 1993, even though that smaller party had collapsed.

    ---

    And thus concludes the 1981-82 cycle! Just four more to go. Thank you all for your continued and seemingly inexhaustible patience! Thanks also to e of pi for his help and advice in the making of this update, and for assisting with the editing! As always, I endeavour to bring you updates by the end of the month, and I have once again succeeded, however narrowly! Here’s hoping I can keep it up at least once more, but in the meantime, I’m hoping that March will be a banner month for the timeline! :)
     
    1982-83: Sometimes You Want to Go
  • Sometimes You Want to Go (1982-83)

    May 3, 1982

    It was the beginning of another production season at Desilu Gower, and the three senior executives – President and CEO Lucille Ball, SEVP and COO Herbert F. Solow, and EVP Production Brandon Tartikoff – were enjoying a “working lunch” in Ball’s office. Naturally, her desk was buried in assorted paperwork, so the three of them were forced to hold their Chinese food containers in their laps as they ate. But now they had put their food aside; Ball took one last drag on her customary post-meal cigarette, letting out an unpleasant hacking cough as she crushed the butt into her crystal ashtray (which was perched on an equally elegant pedestal).

    “So, have you ever considered changing the logo?” Tartikoff asked her, apropos of nothing, as soon as she had finished… clearing her throat.

    Solow grimaced at this; Ball replied, “Why, what’s wrong with the logo?”

    “Well, a lot of companies have been changing their logos in the last few years. And our logo has a very… mid-century feel to it. Written out across the screen in that cursive font, with the old-style brass fanfare playing… I was thinking something more minimalist, streamlined instead. Maybe with a saxophone, or a bass guitar riff?”

    “Bringing Desilu into the Eighties,” Solow remarked, flatly.

    “Something like that,” Tartikoff said.

    “Our logo’s been the same for thirty years. That’s three decades of brand recognition. What good would changing it do?”

    “We’re building a whole new lineup. Modern shows for modern audiences.”

    Solow let out a sudden noise that sounded remarkably like stifled laughter.

    “So modern audiences don’t watch our older shows?” Ball asked, with an expression on her face that just dared Tartikoff to cross the line.

    To his credit, he didn’t. “Well, no, what I mean is – ”

    “Herbie, could you be a dear and pass the remote control?”

    Solow obligingly dug it out from somewhere underneath all the stacks of papers. A small, infrequently used black-and-white television set, complete with rabbit ears, occupied an otherwise bare and quiet corner of the room. Solow handed the remote to Ball, who turned on the television and began flipping through channels.

    “Lucy, you really don’t have to – ” Tartikoff began, but Ball held up her other hand, silencing him.

    “Here we are, KCAL, channel nine,” Ball remarked, as she put down her remote. “And what do they air for an hour every weekday at this time?”

    Sure enough, it was an episode of I Love Lucy – in fact, it was the one with Lucy and Ethel working at the chocolate factory (“Job Switching”).

    “Check and mate,” Solow muttered.

    The show continued playing in the background as Ball turned back to her EVP Production. “Brandie, you’ve been doing great things for this studio. Please don’t ever think I don’t appreciate what you’ve been doing. But you don’t have to change everything just because you can. Look at General Electric and Coca-Cola, for example. They haven’t changed their logos in the whole 20th century. And why should they?”

    Tartikoff nodded, knowing when he was beaten.

    “If you give me market research that says people hate the Desilu logo, then I guess we’ll have to change it. But not a minute before.”

    “Well, that’s not my department,” Tartikoff pointed out, smiling.

    “True. Herbie, I’m authorizing you to direct the marketing department to conduct some market research on this.”

    Solow pulled out a notepad and jotted this down. “Sure thing, Lucy.”

    “Thanks for listening, Lucy, I’ve got to get back to work,” Tartikoff said, as he backed out of the office.

    “You know, Lucy,” Solow said delicately, once the coast was clear. “He does have a point – I’m not saying you need to completely revamp the logo, but maybe you might tweak it a little? I keep hearing they’re able to do something like the ‘merging circles’ with computers now. And that fanfare with the heavy brass and strings… it does sound rather dated. Maybe you could arrange it differently, have synthesizers play it instead?”

    Ball gave him a long look, similar to the one she had subjected Tartikoff to earlier – but to his credit, Solow didn’t blink. “Wilbur Hatch made that fanfare, Herbie.”

    “Lucy, Wilbur Hatch has been dead for more than a dozen years.”

    She was obviously taken aback by that figure, and stunned into silence. 1969 – the year that Man had landed on the Moon, and the year that the unpleasantness in Southeast Asia had finally ended – it hadn’t always been so far in the past, had it?

    “Marl would do a fine job making sure they pay tribute to what Wilbur did for your logo, Lucy,” Solow continued, once he was sure she’d been given enough time to absorb that information – Marl Young had replaced Wilbur Hatch as the studio’s musical director. Fifteen years younger, Young was by this time about the age that Hatch had been when he’d composed the original Desilu logo fanfare.

    Ball eventually nodded her acquiescence; Solow made a note to head to Tartikoff’s office later. Meanwhile, she turned back to the television set, where the climactic conveyor belt scene was just getting underway. Solow watched her as her eyes were glued to the tube, even though she must have seen that show – aired some thirty years before – countless times by then.

    Solow knew better than to interrupt his boss before the scene had concluded – with the forewoman screaming “Speed it up a little!” as the audience roared with laughter – the kind of laughter that could never have come from a can, something about which everyone at Desilu had always prided themselves. He could see that Lucy had a very wistful expression on her face, her eyes glazed over.

    “Lucy?”

    “Hmmm. Sorry, Herbie – kind of got carried away there. I find now, when I watch the show, I usually spend my time looking at Viv. I can’t believe I missed it back then – how sensational she was.”

    Solow nodded. “That she was.”

    Nothing more needed to be said.

    ---

    Brandon Tartikoff wasn’t one to rest on his laurels. He was already the toast of the town, credited for reinventing perhaps the most tired genre on television, the police procedural, with Hill Avenue Beat in the previous season. But Tartikoff wanted to breathe new life into the bread-and-butter genre of both television in general and his studio in particular: the sitcom. However, he didn’t want to “reinvent the wheel” as much as sponsor the culmination of what had been a continuous, gradual improvement process. He knew he had his chance after hearing a pitch from Glen and Les Charles, brothers and writer-producers on Taxi Drivers. The two were avid fans of Fawlty Towers, and sought to adapt it for American shores; they were convinced that it would be as big a transatlantic smash as Those Were the Days and Three’s Company had been. [1] Desilu, which continued to produce Three’s Company, and had worked closely with the BBC on Doctor Who in the past, seemed a natural first port of call. Paramount Television, which produced Taxi Drivers, was in no position to produce new shows when they could barely afford to keep making the ones they had left.

    It didn’t help that Ted Turner had emerged as the de facto programmer at Paramount, since a ready market for syndication was wholly dependent on his every whim. He never cared for Fawlty Towers, notoriously describing the show as “PBS, not TBS – the one letter makes all the difference”. That the Charles Brothers had already decided that they wanted to set their American adaptation in sleepy, rural New England – the closest equivalent to Torquay – would seal the show’s fate in Turner’s eyes. Naturally, he would have preferred that it be set in Savannah, Georgia, a town much nearer and dearer to his heart but utterly lacking in the character that the Charleses wanted to convey. And even beyond Turner, nobody other than Tartikoff was willing to allow them anywhere near the creative freedom they sought. For their setting, they wanted one that was symbolic not only of their desired editorial independence, but also of national independence.

    They knew that they wanted to set their show somewhere along the Massachusetts coastline, a land rich in that sort of culture and history. They just weren’t sure where – Boston had been their first choice, but the disastrous failure of yet another transatlantic adaptation, Beane’s of Boston, just a few years before made for an ominous precedent. [2] Cape Cod, the area most directly analogous to the resort town of Torquay, was deemed too sleepy a locale for a show which needed to capture the more hot-blooded, aggressive attitude for which Americans were stereotypically known. Therefore, Plymouth, site of the famous Pilgrim colony founded over 350 years earlier, was chosen as a compromise – it was steeped in colonial history, and was less than an hour from Boston. This made it attractive to the sophisticated, cultured tourists who would clash horribly with the coarser locals – not to mention with their intended lead, “Big Dave” Sullivan.

    Dave had been a linebacker for the New England Patriots in the 1970s, before his alcoholism had capsized his career. The hotel was one of the few assets he had retained after hitting rock bottom, and he held onto it for “sentimental reasons”. Originally known as the Mayflower Inn, the hotel changed names to the Patriot once it came into Sullivan’s possession – a name which would punningly lend itself to the title of the series. [3] Sullivan was no Basil Fawlty, being proudly working-class, boorish, and unrefined (in contrast to the would-be social climber of Torquay). Always a people person, at least in his own mind, he was happy to eke out a living running the place and interacting with his guests. He did this with the help of his staff, who formed the core cast; the pilot script took the time-honoured tack of introducing a new character to shake up the established order, while also advancing the “culture clash” theme inherent in the premise. Rebecca Hopkins, scion of the Yankee elite (her ancestry could be traced to those who came over on the Mayflower in 1620), was left at the altar by her fiancé, Sumner Samson, in the first episode – the Patriot (which, as the Mayflower Inn, had been a cherished refuge of the Hopkins family in days gone by) was hosting the wedding reception. [4] Scandalized by the ordeal, and with nowhere else to go, she was forced to accept Sullivan’s offer of employment as a waitress at the Patriot’s restaurant, where she clashed with the other employees: George, the slothful chef, and Carlota, the sassy chambermaid, being foremost among these. Notably, The Patriot rearranged the ethnic balance from Fawlty Towers: in that program, the waiter (Manuel) had been the ethnic stereotype, and the chambermaid had been sensible (and English). Carlota, however, was of Portuguese descent, rather than Spanish, reflecting the large Lusophone diaspora in New England. [5]

    Lucille Ball, when the script finally crossed her desk, wasn’t sure what to make of The Patriot – her tastes had clearly favoured shows like Rock Around the Clock and Three’s Company, both of which had been immensely successful in their day. Desilu was a studio willing to make challenging, thought-provoking dramatic series – but when it came to sitcoms, they had always been known for their broad, kinetic crowd pleasers, going all the way back to their infancy, as a vehicle for producing I Love Lucy. A more character-driven, adversarial show – in the vein of the combative 1970s Norman Lear sitcoms, though with far less of a political axe to grind – would likely not have passed muster with her in the first place had the Charles Brothers not cannily pitched the show as a throwback to the Spencer Tracy and Katharine Hepburn vehicles of the 1940s – the Golden Age of Hollywood, and an era when Ball herself was a B-movie queen. Her primary input into the creative process of The Patriot was a push to make the show warmer and more human than Fawlty Towers – transforming that bitter British cynicism into American-style optimism, though rough around the edges. The starkness of the Lear style was deemed a fashion whose time had passed. In many ways, and not just because the Charles Brothers had written for Taxi Drivers, The Patriot felt like a Paramount Television series, sharing the same warmth and careful attention to character development and interaction.

    The next step was casting. It seemed that every actor under 40 with a football player’s body type was in the running for Dave Sullivan; it helped that the character’s background was working-class Irish Catholic, and there was no shortage of men who fit that description. [6] The decision was eventually made to test the Daves and Rebeccas together, which would also allow the producers to gauge chemistry. An unusual pairing emerged after countless rounds of callbacks: former Pittsburgh Steelers defensive end Ed O’Neill, and stage actress Shelley Long.

    O’Neill had played for the Steelers for seven seasons, from 1969 to 1976, as part of one of the greatest transformations in professional sports history. The Steelers in 1969 had been in such bad shape that they were able to sign a marginal player like O’Neill – who, at 6’4” and 230 lbs., was rather slight, but he was quick and rangy, and that kept him on the team. [7] O’Neill was charismatic and helped to bolster team morale, his quick wit and sense of humour keeping spirits in the locker room high even in the face of rather humiliating losses. As the Steelers continued to improve through the 1970s, O’Neill remained a cornerstone of their lineup, and this culminated in his having played on two Super Bowl-winning teams, in 1974 and 1975. More significantly, he became a prominent figure in the Pittsburgh media through the early-1970s, always available for print and on-camera interviews with sports reporters. Commercials for local businesses soon followed: O’Neill was talented enough to appear in actual, dramatized sketches, as opposed to the more traditional talking-head endorsement – his “character”, perfected in this era, was the bellyacher whose life was instantly (perhaps ludicrously) improved by whatever product or service was being pitched. It gave O’Neill the acting bug, enough to step into amateur theatre during the offseason. The roles were minor, but they were good practice. Sadly, though his acting chops improved with time, his football skills were barely able to keep pace with the dramatic improvements on the Steelers roster, and eventually they fell short. In 1976, after the Steelers failed to make it to the Super Bowl, O’Neill was traded to an expansion team, the Tampa Bay Buccaneers. However, he had other plans: he chose instead to retire from football and pursue acting full-time. One of his connections got him a role in none other than Smokey and the Bandit, the second-biggest hit film of 1977 (behind only Journey of the Force), in which he played Junior Justice. [8]

    Dave was humanized by his relationship with Rebecca, his status as a recovering alcoholic, and his admiration of his former football coach from his days at Boston College, Ernie Pulaski, played by Robert Prosky. [9] Pulaski, always known as “Coach”, served as a father figure to Sullivan. Pulaski, whose mental faculties had been battered by one too many tackles, was nonetheless a character of uncommon sweetness and innocence, and the only one to take an instant shine to Rebecca. He served as bartender at the restaurant, which also aided in their rapport (as the waitress doubled as barmaid at the chronically-understaffed Patriot).

    Fred Silverman at CBS liked The Patriot, agreeing to air it in the autumn of 1982 in one of the network’s few plum timeslots (at Tartikoff’s insistence). Not surprisingly, the show was an instant hit, and was critically acclaimed, drawing (positive) comparisons to both Mary Tyler Moore and Taxi Drivers, which thrilled the Charles Brothers. Lucille Ball, for her part, was reportedly rather annoyed by a review which exclaimed, “Finally, Desilu is making good sitcoms again!” However, The Patriot was not the only hot new comedy of the 1982-83 season. Police Squad!, airing on NBC, was a deconstructive parody of the ubiquitous police procedurals of the 1960s and 1970s, taking the same approach to the genre that Batman had taken with pulp superhero fiction. [10] In this way, Police Squad! competed not only with The Patriot, but also with Hill Avenue Beat – the two shows took diametrically opposed approaches to addressing the shortcomings of past efforts in the genre. Police Squad! was also like Batman in that its absurd situations were played “straight” by the cast, though the jokes were far more conspicuous. This particular style, known as “saturation comedy” (wherein the jokes came so thick and fast, the viewer was bound to be laughing constantly), was borrowed from the predecessor to Police Squad!, which would come to be regarded as one of the funniest films ever made.

    It was Catastrophe!, the hit comedy film written and directed by the same team (also led by two brothers, Jerry and David Zucker, along with their childhood friend, Jim Abrahams), and released in 1980. [11] Catastrophe! was a parody of the environmental disaster film genre of the late-1970s, spoofing in particular The Greenpoint Dilemma, but also The China Syndrome, The Swarm, and King Kong. The title was a reference to many films (such as Greenpoint and China Syndrome) including various words for “disaster” as part of their titles. (The working title of the film was indeed “Disaster”, but this was changed to “Catastrophe” during post-production, as it was deemed a more naturally funny word. This was fortuitous in that, as a running gag, neither word was ever spoken aloud in the film – a backhanded reference to the awkwardly-forced title drops used in other movies. Characters would often struggle to find synonyms for “disaster” or “catastrophe” and use them instead, and one such scene was prominently featured in the theatrical trailer, allowing the narrator to fill it in himself.)

    The direct parody of Greenpoint came in the disaster springing from an innovative source of energy. This “breakthrough” technology, which promised to make electricity “too cheap to meter”, was the innovation of a scientist character named Dr. Powers, in one of the many punny names featured throughout the film.

    As in many disaster films, the two central characters (Ted Wheeler and Aurora Dawn – who was frequently misidentified as “Oreo” throughout the film) were romantically attached – or rather, had been, in the past. Flower children and Moonie Loonies back in the 1960s, the two had grown apart since the salad days of peace, love, and moonshots – a ludicrously over-the-top flashback scene saw the two, in full hippie regalia, passionately making out in front of a wall of television sets in a store window, all broadcasting the Apollo 11 landing, the camera in soft focus (and it only got worse when he pulled out a jar of petroleum jelly and smeared more of it on the camera lens, though not before winking and saying “I’ll be needing the rest of this for later.”) But times had changed; Ted had become an engineer, and an avowed champion of this revolutionary new power source, which promised to end energy woes and keep America moving. Aurora, on the other hand, remained true to her roots. She was a member in good standing of the Society for Quitting Utilization of Energy through Animalistic Kinesiology (or SQUEAK, for short), which maintained a permanent picket line outside the power plant (allowing for some more topical parody of then-President Reagan’s… difficulties with labour unions). She deplored the “exploitation” of the “noble and majestic creatures” that were being “slave-driven” to provide power for their “callous and cruel human masters”. The creatures in question? Hamsters (or so everybody thought), who powered the plant by running on their little hamster wheels. Aurora also worked as a girl reporter in the classic tradition, and this seemingly-legitimate job (and her connections with her old flame) gained her access to the plant, where she met with Jim Hampton (who insisted on being called “Hammy”), the analogue to the beleaguered supervisor character played by Rip Torn in Greenpoint. He was played by William Shatner, and Aurora recognized him immediately, begging him to say his famous lines from Star Trek. [12] Hammy steadfastly denied this connection, and eventually directed her to Dr. Powers. He functioned as a combination of the John Lithgow character from Greenpoint and a more conventional Frankensteinian mad scientist. He revealed that he was breeding giant hamsters, which would power even larger wheels, which would give hamster power an even greater edge.

    “We’ll nuke the competition,” Powers assured her.

    “Do you mean microwave, or nuclear?”

    “That’s what I said.”

    Aurora, naturally, was skeptical, but Powers was defiant: “Just wait until I show you at the demonstration next week. I’ll show you all!” This was followed by gales of inappropriate evil laughter. However, as with most mad scientists, Dr. Powers had made one fatal mistake: he had not stocked his wheels with hamsters, but with gerbils.

    The gerbils in question (with their numbers comically inflated in every subsequent mention) were mostly “played” by stuffed animals and puppets (several hundred of them, in fact), utterly lacking the sophistication or pathos of the creations of Jim Henson, or anyone else for that matter – they were laughably fake, a deliberate stylistic choice done for comedic effect. Real gerbils (and other rodents – appropriately enough, only guinea pigs were seen in Dr. Powers’ laboratory) appeared mostly in non-“action” scenes. This was done for pragmatic purposes as well as aesthetic ones – given that animal rights groups were already being parodied in the film, the ZAZ team made sure to tread lightly so as to avoid further raising the ire of their real-life counterparts. Naturally, the film’s end credits included the standard “no animals were harmed” disclaimer.

    Sure enough, the trouble began when the “hamsters” escaped from their “hamster-proof” enclosures – which, being gerbils, they had no trouble doing. Like most rodents, said gerbils eagerly reproduced and quickly overran the power plant, making power generation impossible (“there are too many hamsters on the wheel!”) and swarming the staff. Hammy then made the obligatory reference to a famous episode of Star Trek when, despite his cocky reassurances that he could handle the crisis, he was soon buried underneath a massive pile of gerbils. Naturally, his only response?

    “Not again…”

    Shatner’s character having been dispatched, the apathetic plant manager “took charge”, so to speak. He lacked the amorality of Jack Nicholson’s character from Greenpoint, being played instead by Nicholson’s co-star from Easy Rider, Dennis Hopper, as a character numbed by narcotics overuse – the burnt-out hippie, an increasingly common sight circa 1980. His complete unwillingness and inability to handle the crisis allowed not only the gerbils but also Dr. Powers to run amok. Meanwhile, comedic interludes of various bystanders without power or swarmed by gerbils – or both – were interspersed with the action.

    The most prominent subplot entailed Aurora’s boss, the editor of the newspaper. He was played by character actor Leslie Nielsen, in a comically-serious, super-straight parody of his previous roles – despite only sharing a few scenes with her, all of which were set in his office. This famous exchange, which contained possibly the most memorable line in the film, served to encapsulate his character:

    “I have some bad news, boss – Mr. Hampton is dead, I watched him get nibbled to death by a wave of gerbils.”

    “I don’t care that he’s dead, I want you to get me that interview for tomorrow’s paper!”

    “Surely you can’t be serious.”

    “I am serious. And don’t call me Shirley.”

    Other characters included the politician (for lack of a better word) played by Lloyd Bridges (variously described as the President, Governor, Mayor, and dogcatcher) and his counterpart at the police force (FBI, US Marshals, state troopers, county sheriff, city police department, etc.) played by Robert Stack. Aurora’s involvement with SQUEAK would also come back to haunt her thanks to a secret agent played by Peter Graves, who was investigating the possibility of terrorist involvement in the… crisis.

    The casting of four heretofore-serious actors (Nielsen, Bridges, Stack, and Graves) in comic roles (though played “straight”, in much the same way as they had been on the old Batman series) breathed new life into their careers. The casting also alluded to their previous roles: Graves was a secret agent, similar to his character of Jim Phelps from Mission: Impossible; Stack played a law enforcement officer, much like Eliot Ness on The Untouchables. In fact, those two shows, along with Star Trek, had all been productions of Desilu, leading Catastrophe! to be known internally (and facetiously) within that studio as “The Film that Paladin Made”. To add to the mystique, one of Nielsen’s most famous roles had been that of Commander Adams from Forbidden Planet, a major influence on Star Trek.

    In a final-act nod to King Kong, the film ended with Powers’ giant gerbils rampaging through the streets of New York City – one of the few times the animals were played by real gerbils (placed in a largely-edible and laughably fake, mostly cardboard scale model of Midtown Manhattan). The climactic scene took place at the Empire State Building (as it had in the original 1933 film – the remake had chosen a more contemporary location). Aurora, cleared of any wrongdoing, and Wheeler, both literally and figuratively powerless to stop the gerbils, reunited and patched up their differences, passionately embracing as the Empire State Building collapsed behind them (through use of the same clever compositing as in King Kong). But they were unfazed. Aurora declared, in the final line of the film, “This is just like Casablanca!” [13]

    The film was a smash hit with critics and audiences alike, the highest-grossing and best-reviewed comedy of 1980. That television would come calling seemed inevitable. And sure enough, when it did, Zucker, Abrahams, and Zucker were ready with Police Squad!. They even cast Leslie Nielsen in the series, recognizing a master of deadpan when they saw one. He was cast as the star of Police Squad!, the detective/sergeant/lieutenant (depending on the situation, in a gag borrowed from Catastrophe!), and his portrayal quickly proved the cornerstone of the series.

    But not every genre-busting comedy could last forever. The primetime soap opera was enjoying unprecedented success as a genre, as shows “inspired” by Texas hit the airwaves in droves. The irony was that the show which had beaten all of them to the punch, Soap, had ended in 1982 after a five-year run. [14] The most popular character on the show, the butler Benson, was the only one allowed to escape the mayhem unscathed, as the network had picked up a spinoff which would star him as the head of the household staff of a politician who was also the cousin of his previous employer – the character was given a recurring role in the final season as a means of introducing him to the audience. Naturally, much of Soap’s final season spoofed the plotlines of the “straight” soaps, but this did little to deter their popularity.

    ABC continued to lead the pack in the overall ratings, with a top-heavy lineup: of their twelve shows in the Top 30, eight were in the Top 10, including Texas, which was the #1 series on the air for the third year in a row. CBS edged NBC for second place, with nine shows in the Top 30, and the remaining two entries in the Top 10. NBC took the remaining nine places in the Top 30 but, ominously, failed to score a single Top 10 hit.

    At the Emmy Awards held in September of 1983, Desilu managed to win for both Outstanding Drama Series (with Hill Avenue Beat repeating for the award) and Outstanding Comedy Series for The Patriot, winning the two-way race with fellow freshman Police Squad! (though Leslie Nielsen did win the Emmy for Outstanding Lead Actor in a Comedy Series). It was the first time that the studio had pulled off the twofer since Star Trek and The Lucy Show had won those respective awards in 1968, helping to secure yet another second wind for Desilu. Lucille Ball, however, deferred most of the credit to Brandon Tartikoff, who was onstage to accept both awards (alongside Cannell in the former case, and the Charles Brothers in the latter). Nevertheless, it was a singular triumph for Ball, given that she had been in charge at the studio for over twenty years.



    ---

    [1] IOTL, their planned adaptation of Fawlty Towers was eventually developed into a show set in a Boston bar, where everybody knows your name. That show was, of course, Cheers, which ran for eleven years, and became one of the most popular and beloved television series of all time. It doesn’t exist ITTL, which is one more piece of evidence that I am not writing a utopia!

    [2] Since Beane’s of Boston never got past the pilot stage IOTL, it did not have the chance to sully the reputation of Boston as a sitcom setting.

    [3] Puns are, of course, very common in British titles, and this one is particularly convoluted: The Patriot can refer to the hotel, the kind of tourist it attracts, or “Big Dave” Sullivan (who played for the New England Patriots).

    [4] Which differs from Cheers, wherein graduate student and teaching assistant Diane Chambers was left at the bar when her fiancé (and employer), Professor Sumner Sloane, went to pick up his engagement ring from his ex-wife, just before the two were due to depart on a flight to be wed in Barbados. Eventually, he succumbed to his ex-wife’s charms, and left on that flight with her instead of Diane.

    [5] George, the chef, is based on a combination of Terry from Fawlty Towers and the character who evolved into Norm Peterson on Cheers. Carlota, naturally, is based on the Italian-American Carla Tortelli, the other barmaid.

    [6] The role of Sam Malone, former relief pitcher for the Boston Red Sox, was originally going to be that of an ex-football player (one reason former Los Angeles Ram Fred Dryer – who left the NFL just one year prior! – was a finalist for the part) before the part was rewritten in recognition of Ted Danson’s wiry frame.

    [7] O’Neill was indeed recruited (not drafted) by the Steelers in 1969 IOTL, though he was cut in training camp and (eventually) went into acting. ITTL, he survives and rides his continuously-improving team to Super Bowl glory, though he’s not nearly as good a player as Dryer (both of them play the same position, amusingly enough: defensive end).

    [8] A role played by yet another NFL player-turned-actor (amusingly, first a Steeler and then a Ram), Mike Henry, IOTL.

    [9] Prosky was up for the role of Coach IOTL (and, later, that of Martin Crane on Frasier). The character’s name on Cheers was Ernie Pantusso, reflective of actor Nicholas Colasanto’s Italian heritage. As Robert Prosky has Polish heritage, the character is instead named Ernie Pulaski to reflect this.

    [10] Police Squad! aired on ABC IOTL – and was cancelled by network President Tony Thomopoulos after only six episodes had aired, because “the viewer had to watch it in order to appreciate it”, a line that really does belong somewhere in a ZAZ script. TV Guide famously described this as “the most stupid reason a network ever gave for ending a series”, and I can’t say I disagree. ITTL, Police Squad! runs the whole season and has been renewed for a second – for better and for worse.

    [11] Yes, this means that Catastrophe! takes the place of Airplane! ITTL.

    [12] Obviously analogous to the scene in Airplane! in which a child continuously agitates Kareem Abdul-Jabbar – excuse me, co-pilot Roger Murdock – though it’s probably even funnier in the context of TTL, since Shatner hasn’t had much luck in movies and it wouldn’t be surprising that he took a job as a shift supervisor at a power plant to pay the bills.

    [13] The irony, for those of you who have not seen Casablanca (and you probably should, if only to play the “famous quote from Casablanca” drinking game), is that the ending here is pretty much the opposite of what happens in that movie.

    [14] Soap ran for only four seasons IOTL, quite notoriously ending on a cliffhanger. Benson left midway through the show’s run to star in his own spinoff, and was replaced by Roscoe Lee Browne as Saunders.

    ---

    And finally, after far too long a delay, I bring you the beginning of the 1982-83 cycle! Thanks to e of pi for his help with this update, including co-writing the Catastrophe! synopsis with me, and thanks also to Chipperback for his wonderfully detailed TTL biography of Ed O'Neill, which is presented here in abridged form.
     
    Appendix A, Part X: Where No Man Has Gamed Before
  • Appendix A, Part X: Where No Man Has Gamed Before

    Hello, everyone! I'm proud today to be able to bring you my guest post for That Wacky Redhead, which I've been working on with Brainbin's generous permission to play in his world--this originally spun out of some discussions around the same time as the April Fool's Post I provided last year, and...well, I think I owe you all some make-up. I hope you'll find this up to the timeline's typical standards as I cover a topic rather dear to my heart:

    ----------------------------------
    Appendix A, Part X: Where No Man Has Gamed Before

    "Space....the final frontier! This is the setting of Star Trek: The Roleplaying Game! Your ongoing mission: to explore strange new worlds, to create new stories and new adventures! To boldly play what no one has played before!"
    -from the opening line of the Starfleet Officer's Manual, 1982

    Simulation and storytelling have long roots in human history, with folk tales and games being integral to the human experience. In ancient times, games like Senet (an Egyptian game simulating the journey after death) or Chess (loosely based on combat from the era in which it originated) functioned not merely as entertainment, but as a way to train the mind. The simulation aspect of gaming came into its own by the beginning of the 19th century, when the Prussian general staff led the way in developing formalized, elaborate war games, in which players assuming the roles of army commanders led forces against an enemy army, commanded by another group of players [1]. With randomizing elements such as dice simulating the luck of battle, such games allowed more in-depth analysis of the strengths or weaknesses of various battle plans before actual combat began.

    Over the following century, similar wargames became important parts of the training and strategic planning of most major militaries. However, where generals and admirals saw a tool to enhance their performance on the battlefield, others saw the opportunity to try their own hand at the command of vast forces as a hobby. Little Wars, an early “toy soldier” game invented by famed author H.G. Wells, aimed to bring such wargames to children and civilians, enabling a kind of “armchair generalship” [2]. The hobby of wargaming incubated throughout the early 20th century, before ultimately being combined with fantastic elements by several people, roughly simultaneously, in the late 1960s. The most influential resulting product was the Chainmail wargame, developed by Gary Gygax in 1971. Though originally focused on realistic medieval combat, the game rapidly expanded to include modeling of common fantasy tropes such as wizards and magical creatures.

    In the same period, wargaming had begun to develop new storytelling aspects. In the early 1970s, some other creators, most notably Dave Wesely and Dave Arneson, took the concept of simulation games and began to apply it not to vast armies engaged in combat, but instead to a small party of characters, each controlled by a single player, all engaged in an overarching plot. The umpire, originally the impartial curator of random events and continuity in a wargame, took on a more authorial role and evolved into the game master, responsible for world-building and setting up the challenges that characters would face over the course of their stories--often based on popular fantasy epics like The Lord of the Rings. Arneson came into contact with Gygax, and the two collaborated on the first major published role-playing game, or RPG, Dungeons & Dragons, released in 1974. The game inspired a cult following and spawned a cottage industry of peripheral supplies and imitators, as other publishers attempted to copy Gygax’s success, or transposed the concept of a character-based RPG into other settings [3].

    One member of this cottage industry was David L. Movello, the owner-operator of a small publishing house that created pre-generated stories that game masters could present to their players, intended for use with existing games like Dungeons & Dragons. However, Movello had an interest in developing such “modules” for original systems--and in particular, since his time as a history student at the University of Dayton, he had been working on a game set in the universe of his favorite television show: Star Trek. While Star Trek had seen several iterations of video games released by the Syzygy Corporation, these had been quite primitive, mostly involving shooting down an infinite series of enemy Klingons and Romulans. To Movello, the story and thoughtfulness of Star Trek had been far more interesting--the big questions and ideas that brought meaning to the struggles of the characters. In RPGs, he saw a way to recreate this experience, and in 1978 he began development of his own “homebrew” system for a Star Trek game.

    By 1980, he was running his system at Star Trek conventions as an event known as “You’re the Crew!” These events put players into the shoes of either the canon crew or new characters, as they faced off against scenarios that Movello had designed, dealing with unfriendly aliens, near-godlike beings, and mysterious phenomena as they explored the galaxy. The events were popular, but Movello knew that unless he could obtain licensing rights for his game (expensive, not to mention challenging), he would need to make the setting more generic before publishing it. He was experimenting with just such a concept—ironically, one lightly based on the show Deep Space, which Herbert F. Solow had, on behalf of Desilu, officially excluded from the Star Trek canon at that very con—in a game at the Cervantes Convention Center in St. Louis in the summer of 1980, when he was approached by one of the players afterward. Alex Garcia was one of the unofficial “scouts” occasionally hired by Desilu to get an inside feel for the Star Trek fandom, and he had heard about “You’re the Crew!” As it turned out, Desilu had been considering licensing the creation of a Star Trek roleplaying game, and Garcia thought that “You’re the Crew!” had the right feel. Moreover, Desilu had been hoping to license the property to a small studio, aiming to (as had been the case with Gold Key, and then Syzygy) be in enough of a position of strength to dictate carefully how their property could be used.

    The shocked Movello eagerly agreed to a meeting with a barrage of lawyers, and arrangements were worked out: Desilu would agree to hire on Movello and his small press (consisting of a few friends and fellow hobbyists), who would contribute their existing “You’re the Crew!” concept and system for expansion into a line of Star Trek RPG books and settings[4]. In return, Desilu would provide a Desilu Publishing imprint for the game and official approval of the game’s content, as well as promoting the game at its conventions. Eager to take a shot at a substantial windfall in exchange for getting to work on the official canon of Star Trek, Movello and his team began work on a series of rulebooks, setting guides, stats for assorted ships, and some “Episode Modules” which would provide starter scenarios for gamers, supplemented by new hires and with some informal assistance from the staff writers who had created the miniseries and wrangled the continuity on the show.

    Movello had originally designed his game around short, often single-sitting adventures with an episodic feel--which had been key to the success of his system on the convention circuit. To help bring new players up to speed, his system was based on the traditional cubical six-sided die, familiar even to non-RPG gamers, unlike the d20 and other more obscure dice from D&D and many of its imitators. A character’s skill increased the number of dice they could roll, instead of adding bonus modifiers to the result. In an innovation he developed to keep play fast-moving in short games at conventions, Movello dispensed with adding each die into a total result, and instead determined success or failure by the number of dice that rolled at or above a threshold. This made resolving checks much easier at the table, eliminating mental math breaks in the heat of action [5]. The system was fast to learn: a character’s skill and physical traits let them add a certain number of dice to their “pool”--explainable around the table as, “See the number of dots? Add your wit and language skill, and roll that total.” This mechanic from Movello’s “You’re the Crew!” was retained for the Desilu Star Trek RPG.
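    The success-counting dice pool described above is simple enough to sketch in a few lines of code. The following is a minimal illustration only--the threshold of 5, the stat values, and the function name are my own assumptions for demonstration, not anything published in the (fictional) rulebooks:

```python
import random

def roll_pool(num_dice, threshold=5):
    """Roll `num_dice` six-sided dice and count how many meet or beat
    `threshold` -- no summing of totals, just counting successes."""
    rolls = [random.randint(1, 6) for _ in range(num_dice)]
    successes = sum(1 for r in rolls if r >= threshold)
    return rolls, successes

# "See the number of dots? Add your wit and language skill, and roll
# that total." -- a character with wit 3 and language skill 2 rolls 5 dice.
rolls, successes = roll_pool(3 + 2)
print(rolls, "->", successes, "successes")
```

The appeal of the mechanic is visible even here: the result of a check is read off the table at a glance, with no arithmetic beyond counting.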

    Skills were similarly stripped down in “You’re the Crew!”, though Movello drew on the D&D class system and the trio of uniform colors seen in the series to create a set of specialties. Skills were ranked either as general, or belonging to one of the three primary divisions: Command, Sciences, and Operations. Medical, engineering, security, helm/navigation, leadership, and scientific disciplines, among others, were then treated as subcategories of skills within their overarching divisions. Characters had to pick a division, and were given more ability to advance in skills that were ranked as either general, or within their division and chosen specialties. Leveling up was an uncommon event in Movello’s original “You’re the Crew!”, and had to be essentially introduced from scratch for the Star Trek RPG. For the experience system, characters gained points from completing objectives or from using their skills--under Movello’s rules, a critical success with a skill could lead to the character gaining better insight in that field [6], and advancing in that skill.

    Character creation was an angle to which Movello had given more attention even before Desilu approached him--to encourage players to create their own unique characters, he framed character creation and initial skill point distribution within a quick simulation of a character’s past training and service with Starfleet. Movello based the “curricular” models on his own college experiences, leavened with some mildly military elements drawn from his studies as a history student. Tables helped a player constructing a new character to begin at “Starfleet Academy”, picking basic stats and their division; then to attend a specific “division” school, where they would be walked through spending skill points; and even to simulate training cruises and past service history--a process intended by Movello to be fast enough to play through before a session at a con, since he found players sometimes became more invested and engaged in the “You’re the Crew!” games when they could create their own characters. The character focus of the process was also in line with the more character-driven style of the game and adventures [7].

    “You’re the Crew!” had also included sections of starship-level play, where characters spent sessions, or portions of sessions, flying the starships they commanded through space, battling enemy ships, or navigating spatial anomalies. These sessions’ rules had originally been based heavily on more traditional wargaming, but gradually evolved; by the time of the fateful St. Louis con, Movello’s rules had come to include opportunities for the players to bring their characters’ skills to bear in flying the ship--engineers had limited abilities to “shift power” to alter the strength of various system stats, while a good helm officer could enhance a ship’s rate of movement or maneuver. The final RPG rules kept this integration of abilities in spite of the segregation of starship play from character-level play. However, due to this and Movello’s focus on the show’s balance of character-based stories with combat and single-ship actions, battles involving more than a few ships could often become cumbersome.

    In 1982, the system was finally ready for release, with a burst of promotion that saw ads in genre magazines, official trial games at conventions, and more. The timing of the Star Trek RPG’s arrival was convenient—it was among several new RPG systems to emerge into the hobby community as D&D led an explosion in the size of the fandom. However, D&D was under heavy public scrutiny at the time, suffering the first of several backlashes alleging that the game was “satanic” or that its players were at risk of being inducted into some kind of cult. Other new releases that followed close to D&D’s formula often featured similar content, and suffered by association. Star Trek, by contrast, was a mass-market property with millions of existing fans, and a known quantity with which many parents were themselves well familiar. This allowed the Star Trek RPG to not only navigate the backlash, but actually emerge as the primary “wholesome” alternative to D&D. In its first year, the game sold out the initial 20,000 copies of the three core books, which went on to sell a total of over 100,000 copies within three years, establishing the game as a persistent rival to D&D for the title of most culturally visible RPG--and spawning its own sci-fi imitators, just as D&D had spawned innumerable fantasy copycats.

    However, not all of these sales were to customers intending to play the game: a widely quoted fact was that the setting guide outsold the player’s handbook by a significant margin. This was unsurprising, in a way, given that the RPG system books were the first officially licensed material dealing with such questions as the absolute speed of warp drive, the composition and cartography of the Federation, the strength of various ships from the show, and the makeup of Starfleet, as well as other topics extensively debated by fandom. In many cases, Movello and his team worked to incorporate their favorite elements of the show canon, comics, and common “fan canon lore” or “fanon”—for instance, the Vulcan homeworld was established to orbit the star known to humans as Epsilon Eridani, which had been a common fan speculation, while the absolute speed of a starship under warp drive was defined using a formula originally published in The Making of Star Trek: the warp factor was the cube root of the ship’s speed expressed as a multiple of the speed of light (making warp 8, for example, 512 times light speed).
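    That cube-root relationship is easy to check directly; as a quick illustration (the function names here are mine, not anything from the sourcebook):

```python
def warp_to_speed(warp_factor):
    """Speed as a multiple of c: the cube of the warp factor,
    per the formula from The Making of Star Trek."""
    return warp_factor ** 3

def speed_to_warp(multiple_of_c):
    """Warp factor: the cube root of speed expressed in multiples of c."""
    return multiple_of_c ** (1.0 / 3.0)

print(warp_to_speed(8))              # warp 8 -> 512 times light speed
print(round(speed_to_warp(512), 6))  # ...and back again -> 8.0
```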

    More general information about the setting was also formally established. The included maps showed the general layout of Federation, Klingon, and Romulan space, and established the powers’ relative fleet sizes. The game also gave more insight into the structure and function of the Federation’s civilian governments, as well as non-Starfleet organizations such as local defense fleets maintained by individual member planets, and the Federation Merchant Marine, which had first been introduced in the person of Captain Curtiss, the guest star of “The Borderland” in Season 5. The Merchant Marine was also established to include the SS Antares, whose crew had appeared briefly in “Charlie X”--common fanon, given that they shared a uniform style and badge with Captain Curtiss. As part of fleshing out the Federation fleet, the game gave details on many ships seen in canon. The Enterprise was established as a Constitution-class and the Excelsior as a Declaration-class, both Starships, the largest and most capable ships in the fleet, typically dispatched on extended patrols in the outreaches of Federation space; the Artemis was an Olympus-class frigate, a type which served escort and survey duties throughout the fleet; and assorted scouts and cruisers (including some designs adapted from common fan-built “kitbashes” of the available model kits) filled out the core Federation fleet. The game was primarily set post-miniseries, with the intention that players would be able to create their own crews and ships--something of a break with the traditional “Roddenberry rule” that material other than official shows would have to follow behind the main canon, filling in the “Lost Years” instead of blazing a new trail ahead. However, for those who did wish to do so, published supplements for the original-configuration Enterprise, and adventures set during the first run of the series or the period between the show and the miniseries, were also produced.

    As a widely-available, Desilu-authorized “official” reference, the RPG sourcebook became something of a lowest common denominator, and many elements which Movello and his team had developed for the game became common presumptions to fill gaps in the show’s canon--turning the gamebooks into some of the more authoritative references in fanon. The appetite for such guides, as shown by the sales differences between the setting guide and the player’s handbook, did not go unnoticed by the heads of Desilu’s licensing department. Some of that appetite came from those itching for larger fleet battles than the main RPG starship combat could support, more along the lines of traditional wargames. Just as Movello and others had experimented to create a character-based RPG, some amateurs created their own rules for adapting official stats, ship models, movement rules, and settings into their own “starfleet battles,” inspired more by the climactic combat seen in “These Were the Voyages,” or the climax of the Miniseries [8]. However, this was done without the approval of Desilu, who preferred to focus on larger-scale ship-to-ship combat through their established Syzygy video game series.

    However, the largest impact of the game on the Star Trek fandom wasn’t its role as a distillation of the official Desilu canon, nor its revisiting of the original, but rather the creativity and new stories generated by players of the game. Prior to the RPG, the vast majority of Star Trek fandom had focused on the Enterprise and her crew. Comics and other tie-ins had primarily followed the adventures of this one ship, while fanfiction had primarily focused on the interactions (and relationships) of the original cast of characters. The release of the RPG meant that many gaming groups were creating their own crews, and fleshing out stories and conflicts not possible with the original characters. As the crossover between gamers and writers of fanfiction was strong, it didn’t take long for such “new crew” fanfiction to begin making its appearance in fanzines alongside more traditional stories.

    With the playerbase of the RPG primarily male-dominated, it was perhaps unsurprising that the “new crew” stories inspired by the game would also primarily be written by men, contrasting with the traditionally woman-dominated and character-focused “old crew” fiction. There was no small number of arguments between the old guard of fanfiction readers and writers, who saw the rise of “new crew” stories as nothing more than shiploads of Mary Sues attempting to steal the luster of the original, and those who preferred the additional depth and possibilities such “new crew” stories offered compared to the continual rehashing of plots in “traditional” stories--particularly, as the arguments went, those that focused on endless variants of the same “slash” relationships based on dubious “subtext” supposedly present in the show.

    Even in the face of such segmentation, the market for material starring new characters in addition to the old was clearly quite strong--a factor also attested to by the fandom’s interest in the new characters introduced to flesh out the crews of the Enterprise and Excelsior in comics set between the original show and the miniseries, and (in its own way) the interest in treating Deep Space as an unofficial sister series. This reassurance that the fandom would accept and even embrace well-done introductions of new characters was carefully noticed by Desilu management, most particularly Brandon Tartikoff.

    However, regardless of the effects the RPG was to have on the shape of fandom and of the future of Star Trek itself in the coming years, it had already carved out a place for itself in fandom, with Desilu’s heavy promotion of the game trial sessions at cons rapidly turning it into a near-universal experience therein, much as fanfiction had been a decade earlier. Just as nearly every fan had at some point read or even written fanfiction, so too would almost every fan come to at least witness a session of the RPG being played, if not participate themselves. The RPG experience, like fanfiction, fanart, and fanzines, rapidly became intertwined as a critical part of the fabric of the Star Trek fandom.

    ----------------------------------

    [1] Significant background for this list of events is drawn from this Wikipedia article.

    [2] Or, in some cases, no chair generalship! It’s a fascinating game to read about, and looking at the pictures of it in play is hilarious--like Steampunk Gencon.

    [3] Up to this point, all is essentially as-OTL, and again significant background is owed to the above article.

    [4] IOTL, the company that received the Star Trek license was FASA, who were also quite small and primarily dealt with expansions and supplements for other games at the time, but their Star Trek efforts were largely self-directed within the scope of the license. ITTL, Desilu is looking for an upstart seed they can grow, but one already showing results, hence picking up the (fictional) Movello’s game. They also keep him on a tighter leash as far as canon, since they’ve got their own plans, and they want to preserve the integrity of the unified Star Trek brand.

    [5] Essentially, I’m having Movello invent a d6-based dice pool, sort of like West End Star Wars or White Wolf’s Storyteller system, as used in Vampire and Werewolf.

    [6] Lightly inspired by Call of Cthulhu’s advancement system.

    [7] This is heavily based on the actual OTL FASA game. Call it convergent evolution. ;)

    [8] IOTL, the Amarillo Design Bureau would actually publish a “Starfleet Battles” game that was much the same--players commanded fleets of ships. Though I never played, I owned several of the models as a kid.
     
    The Fantasy Kitchen Sink
  • The Fantasy Kitchen Sink

    In the years that followed the blockbuster success of Journey of the Force in 1977, sociologists and demographers would posit that it had been so successful primarily because it was among the first films to appeal to what was, at the time, the youngest generation of viewers: the Mini-Boomers, born in the early 1970s. Children functioning as a lucrative backbone demographic was nothing new for the film industry: Snow White and the Seven Dwarfs had become the second-highest grossing film of the 1930s primarily on the basis of ticket sales to children, which also motivated Disney’s septennial re-release policy – spaced just far enough apart for a new cohort of viewers to emerge. The media landscape had changed a great deal in the nearly half-century since then, and more options were available to the consumer than ever before. Despite the many challenges to the preeminence of the cinema, and despite that industry’s often lackluster response thereto, people continued to buy movie tickets in sufficiently large numbers to justify the development of blockbuster pictures with big budgets. Even Lucasfilm v. Paramount did not deter them, though it did result in many uncomfortable questions about just how the expenses accrued in the making of these movies were being determined.

    Lucasfilm v. Paramount did have a detrimental effect on the making of one blockbuster film, however, and that was the planned sequel to Journey of the Force. The Hollywood Reporter best described the situation in their headline “LAWSUIT TO ‘FORCE’ HALT TO SEQUEL PLANS”, thanks to the injunction that would prevent the creation of more Journey-related media until all appeals were exhausted. The Lucases obviously had no intention of working on a film from which Paramount would derive all the profits, some of which were – in their view – rightfully their own. Audiences were not pleased at this turn of events, as demand for a sequel to Journey was strong and immediate, owing to the open-ended nature of the film’s narrative – it told a complete story, but was clearly intended as part of a larger saga, which gave Lucasfilm plenty of opportunity to tell more stories set in the same universe, should there be demand for them. Unfortunately, they had not anticipated that they would be unable to meet this demand. But it was a perfect storm for other creative types in Hollywood, because it generated an appetite that would have to be sated.

    That said, fantasy didn’t owe all of its increased prominence as a film genre to Journey of the Force. The three Lord of the Rings animated films, directed by Ralph Bakshi and released over three consecutive years, had been sleeper successes both in North America and (particularly) Western Europe. In addition to the expected success that the LOTR trilogy had enjoyed in its native Britain, it also performed very well in Ireland (unsurprisingly), as well as the Netherlands and Scandinavia, particularly Sweden. [1] Studio executives, naturally, examined what the two properties had in common, hoping to use their similarities as the basis for their future works in fantasy. The Lord of the Rings films, based on works of literature by a highly esteemed academic, were also considered a more important work, largely because they had borrowed from the established science-fiction tradition of allegorizing modern-day situations. [2] Tolkien’s distaste for industrialization and his naturalist sympathies already provided the basis for a strong pro-environmentalist message, which was given more thematic resonance when juxtaposed with the complacency and isolationism of the Hobbit characters. In addition, the racist undertones in the story, which had been criticized by many commentators and had even come to bother Tolkien himself in his later years, were addressed more directly in the films – an obvious byproduct of cultural movements, such as Blaxploitation, which were popular at the time. Surprisingly little allowance was made to render the films more palatable for young audiences, but they were very popular with that demographic nonetheless. This ethos was a major inspiration for another animator, Don Bluth.

    Don Bluth had left Disney, his employer since 1971, and secured funding to produce The Rats of NIMH, an ambitious and surprisingly mature project based on the Newbery Medal-winning children’s book by Robert C. O’Brien. It represented Bluth’s vision for what animated features should be in this day and age: challenging, artistic, and unwilling to condescend to its young audience. Released in 1982, it performed very well at the box-office, easily recouping its hard-won investment, and attracted rave critical notices. It marked the dawn of Bluth’s burgeoning animation studio as a major player in the industry, followed immediately by his work with AMS on laserdisc-based video games. [3] However, his very lucrative work in that field was something he considered secondary to feature-length films, and to that end his next project was an adaptation of the Norwegian folktale East of the Sun and West of the Moon (a variant of the “search for the lost husband” tale type). Bluth was clearly going back to the well dug by Walt Disney, a personal idol of his; many of Disney’s most beloved films were fairy tales, such as Snow White, Cinderella, and Sleeping Beauty. Several other such stories had long been planned to see the light of day, but never did, such as The Little Mermaid and The Snow Queen. [4]

    In fact, Disney had cancelled their planned adaptation of The Fox and the Hound, a 1967 novel by Daniel P. Mannix; it had been intended as the studio’s next feature after 1977’s The Rescuers, but the success of The Lord of the Rings convinced Ron Miller, the head of the studio, that the human element, sorely missing from every Disney film since Walt had died, needed to be restored – it was likely no coincidence that the last Disney feature with a principally human cast was 1963’s The Sword in the Stone, the last which Walt himself saw to completion. Many blamed the decline on the successive leaders of the company who followed the deaths of the Disney Brothers, the two men who had kept it afloat for decades – first Walt, in 1966, and then Roy, in 1971. By the end of the 1970s, after a succession of ineffectual (and unrelated) leaders, Miller, Walt’s son-in-law, took charge at the studio, becoming President in 1978. The Sword in the Stone had been based on the Arthurian mythos – specifically, T.H. White’s The Once and Future King – and had unusually mature and complex themes for a Disney movie; despite this, it had performed well at the box-office. In many ways, it presaged the success later enjoyed by The Lord of the Rings. And no animation studio had been more aware of the success enjoyed by the Lord of the Rings films than Disney, which was in the midst of perhaps the greatest slump of its existence. It made for a compelling narrative on the part of Miller’s faction in the company: Disney had gone astray, making so many films about animals. It needed to make films about people.

    To that end, they selected Lloyd Alexander’s Chronicles of Prydain series as the basis for their next film. Like the Great Matter, Prydain was stepped in Welsh mythology; like The Lord of the Rings, it was a modern fantasy told in several volumes. Disney had never produced a sequel feature before – each previous film had been a stand-alone story or an anthology. But this was the blockbuster era – many hit films were now likely to spawn sequels. These were often inferior to their predecessors, and underperformed them financially, but The Lord of the Rings was a conspicuous exception, largely because each film had been based on a corresponding book. If Disney were to repeat this strategy with Prydain, they could produce five such films. With that in mind, the first Prydain novel – The Book of Three – was green-lit to be adapted (under the working title Battle of the Trees) with a scheduled release date in 1982. [5] This meant that the studio would be going head-to-head against one of their former animators – which wasn’t lost on either side. Disney was able to take advantage of their established (if atrophied) infrastructure and pumped as many advertising dollars as they could into their first film. Marketing dwelled heavily, even lovingly, on the similarities between The Chronicles of Prydain (it was decided to borrow from the Lord of the Rings playbook and refer to the first film in by the name of the entire series; it could always be changed retroactively), as well as the emphasis on traditional Disney elements, such as cuddly animal sidekicks, and (as noted) Princess Eilonwy, whose feisty, sassy personality also borrowed heavily from the Princess character in Journey of the Force. Unlike Bluth, who cast several well-known screen stars in NIMH – who, in that film’s comparatively sparse advertising, were prominently featured – Disney stuck to a cast of largely-unknown British radio performers. 
[6] It was an easy way to keep expenses down, and in the end it paid off: The Chronicles of Prydain, released shortly after The Rats of NIMH in the summer of 1982, considerably outgrossed the latter film (and performed even better overseas, thanks in large part to the Disney brand). However, both films performed well enough to make a “profit”, for what little meaning that term had in the motion picture industry: Prydain grossed $48 million domestically, compared to $28 million for NIMH. [7] Disney executives green-lit production on The Black Cauldron, which (it was hoped) would be faster and cheaper to make than the first film had been.

    However, the American motion picture industry was not the only game in town when it came to animated fantasy. Perhaps the second-most popular modern fantasy series, The Chronicles of Narnia, written by J.R.R. Tolkien’s close friend and associate C.S. Lewis, was considered a prime candidate for an animated adaptation in the wake of the success enjoyed by LOTR. [8] Though neither Disney nor Don Bluth were interested, a consortium of producers throughout the Commonwealth (primarily in the United Kingdom and Canada) took it upon themselves to complete a film adaptation of the first, and most popular, of the books, The Lion, the Witch, and the Wardrobe. It was released in 1981 in much of the world, but only received a stateside release for Christmas, 1982 – on CED, making it one of the first major releases exclusive to home video in the most lucrative market in the motion picture industry, and spurring the growth of the direct-to-video industry. The film performed very well, prompting many Bible Belt customers to purchase a CED player specifically for the purpose of watching it – The Chronicles of Narnia was infamous for its blatant Christian allegory. The movie went on to become the best-selling CED of 1983, complementing its modest success overseas – Prince Caspian, by all appearances, would soon follow.

    Although the fantasy genre lent itself well to animation, it was by no means limited to it. Journey of the Force had been as big an influence on the fantasy boom as The Lord of the Rings, and its roots in pulp fiction and the film serials of the Golden Age inspired the decision to adapt Robert E. Howard’s Conan the Barbarian for the big screen. Conan was a far more visceral, up-close-and-personal experience than the more sweeping, epic Journey of the Force and Lord of the Rings properties, telling the story of one man and his personal vendettas (for lack of a better word), as opposed to his involvement in a larger story. Cast as Conan the Barbarian was Dolph Lundgren, a newcomer and martial artist who had primarily been working as a model and commercial actor; he was only 24 years old during principal photography. [9] Born in Sweden in 1957, he was a tall, muscular, blue-eyed blond – very much the personification of the Nordic superman common to pulp fiction of this sort. To avoid the unfortunate implications of such casting (and to stand shoulder-to-shoulder with Journey of the Force, which had been recognized for its diverse supporting cast [10]), Conan’s allies were deliberately cast across racial lines, many of them also younger unknowns. Producers deliberately sought a “young Samuel L. Jackson type” and a “young Bruce Lee type” to complement Lundgren; it had been decided that the big money would be spent on the older, supporting characters – played by established actors – and so actually casting Jackson and Lee (whose larger-than-life screen personalities would likely overwhelm that of their star) was mooted. Still, the internationalism and multiracial nature of the cast was highly praised by critics – though the presence of female characters, not so much.
Only one actress, who played Conan’s love interest, received major billing – and she appeared topless in the film, as did several female extras, raising the eternal spectre of sex and violence – the film was very popular with adolescent male audiences, earning over $40 million domestically. Most location shooting was done in Europe, where the film was even more financially successful.

    As with NIMH and Prydain, Conan found itself in a heated box-office battle with a rival film in the spring of 1982. The Sword in the Stone had seemingly inspired not only Ron Miller, but also John Boorman, the noted director of the Oscar-nominated thriller Deliverance and the surreal science-fiction film Zardoz. Boorman had wanted to adapt Malory’s Le Morte d’Arthur into a film for some time, and the fantasy boom gave him the chance he needed. However, a potential roadblock for the positive reception of his Arthurian fantasy film, which was named Excalibur, was that it would be released in a post-Monty Python and Camelot environment. [11] In fact, this reality came to inform the reconstructive approach that the filmmakers sought, in deliberate contrast to Camelot’s irreverent take. (To emphasize this point, the word “Camelot” was never mentioned at any point during the film.) While not totally without comedy, the film went to great pains to highlight the virtuousness and nobility of Arthur and the righteousness of his cause. Cast as the wizard Merlin (and given top billing – Arthur was listed second) was Oscar-winner Alec Guinness, whose portrayal emphasized the mystery and cleverness of the sage. Though Arthur was the rightful king through his descent from Uther and the House of Pendragon, he was shown winning over his supporters through his courageous and benevolent rule – a nod to the more democratic times in which people now lived and, ironically, more true to politics in the British Isles at the time in which the film took place. [12] The Saxons, unsurprisingly, were demonized – but unlike many barbarian hordes in films past, these characters were lily-white. Guinness’ sincere yet subtly comedic performance was highly praised, receiving an Academy Award nomination for Best Supporting Actor – which was, surprisingly, followed by a win.
[13] It was one of several Oscar nominations for the film at the 55th Academy Awards, which also saw the film take home the statuettes for Cinematography, Costume Design, and Art Direction-Set Decoration [14] – as with Conan, much of the location footage was shot in Europe, a venerable tradition dating back to Stanley Kubrick’s film Napoleon. Excalibur was not only the most lavishly-awarded fantasy film of 1982, it was also the most financially successful; it earned $80 million domestically, making it the fourth-highest-grossing film of that year. [15] In the end, Excalibur made for an informative contrast with Camelot; both films were products of their time. Camelot had been steeped in the cynicism, irony, and surrealism of the mid-1970s, while Excalibur was a harbinger of the more optimistic and earnest (though not naïve or humourless) 1980s.

    Somewhere between live-action and animation, there were the Muppets – or rather, Jim Henson and his Muppet Studios. Their beloved variety extravaganza The Muppet Show had ended in 1981, and Henson and his cohorts immediately set to work on the production of a feature film. Initial plans to make a movie starring the Muppet Show characters were scuttled because Desilu – which owned half the copyright on the show – declined to become involved with producing a film. The studio had been burned before upon tentatively stepping into the film industry, and had no intention of trying again. Desilu had also made a number of arrangements with other companies that were predicated on their staying out of the movie business, most of which were through Desilu Post-Production (would the big studios be so willing to support an upstart movie maker’s wholly-owned post-production subsidiary, even if it did the best work in the business?). [16] For that reason, the Muppet characters were out, appearing only in licensed Muppet Show merchandise – Henson himself owned outright only the rights to Kermit the Frog, allowing Kermit to continue appearing on Sesame Street.

    Henson, aware since the mid-1970s that he would not be able to bring the Muppets to the big screen, had written an original story treatment called The Crystal, which told a rather stock modern fantasy story set on an alien world. This eventually became The Broken Crystal, a film which (in order to secure backing from distributors) was forced to abandon plans for an all-puppet cast, with the lead characters instead played by human actors of some recognition. [17] This was obviously intended to boost the film’s box-office prospects, but it was all for naught – the movie was one of the relative failures of the wave of fantasy films in 1982; its relatively high grosses could not overcome its very high budget. [18] Henson was bitterly disappointed, as his reach had so clearly exceeded his grasp, and he returned to television in hopes of bringing his Muppets to Saturday mornings. Some years later, he would take some comfort in The Broken Crystal having become a cult success.

    The many fantasy films released in 1982 were so popular with young audiences that attempts to extend the brand into other media were inevitable. Just as Journey of the Force and The Lord of the Rings had begotten the wave of fantasy films that followed, the successes of those films would in turn spur even more attempts to exploit a rapidly-emerging trend. At the same time, there would never again be such a dense concentration of fantasy films within so brief a period. Indeed, some producers, most notably those visionaries who were able to spot trends before they arrived, were wary that audiences might soon tire of fantasy, as they had tired of science-fiction in the not-too-distant past. On the other hand, they were also aware that audience tastes tended to be cyclical, and fads of the past had a curious habit of re-emerging at the slightest provocation…

    ---

    [1] Why Sweden? It all has to do with this fellow, Åke Ohlmarks, who translated The Lord of the Rings into Swedish in 1959. Tolkien hated the translation, as did many Swedish Tolkien fans, because of its gross inaccuracies and departures from the source material. ITTL, J.R.R. Tolkien’s son and literary executor, Christopher, explicitly requested that the Swedish distributor use subtitles translated by anyone other than Ohlmarks, a request that was granted, and as a result the films (despite both the nature of adaptation and translation concerns) are quite possibly more accurate to Tolkien’s original novel than the Swedish translation! (IOTL and ITTL, Christopher Tolkien would also insist upon any translator other than Ohlmarks, who died in 1984, for the Swedish version of The Silmarillion.)

    [2] It should be pointed out that Tolkien hated allegory and despised all attempts by critics to read it into his work, but of course Tolkien was deceased by the time that production had started on the films, and Bakshi was far less averse to adding allegory than Tolkien would have been.

    [3] Find out more about Bluth’s career as a video game developer in this post.

    [4] Both fairy tales were, of course, eventually produced by Disney IOTL: The Little Mermaid kickstarted the Disney Renaissance in 1989, and The Snow Queen (adapted under the title Frozen) was released in 2013 and became the highest-grossing animated feature of all time (not adjusted for inflation – Snow White has been the highest-grossing animated feature of all time for its entire release history if you account for inflation).

    [5] The Black Cauldron is the name of the second book in the Prydain series; the resulting OTL film was an adaptation of the first two books. Battle of the Trees was a working title for The Book of Three, derived from (the English translation of) the title of a famous Welsh poem.

    [6] As happened IOTL with casting for The Black Cauldron, but not The Fox and the Hound, which starred Mickey Rooney (sadly, yet another actor who has died since I started writing this TL) and Kurt Russell. Bluth was a pioneer of stunt-casting celebrity voice actors in animated features, a practice which continues to the present day.

    [7] IOTL, Disney’s The Fox and the Hound, released in 1981, grossed $39.9 million and finished at #14 for the year, against an estimated budget of $12 million. Bluth’s The Secret of NIMH, released the following year, grossed $14.7 million and finished at #52, against an estimated budget of $7 million – that this was still enough to bankrupt Bluth’s animation studio (for the first – but certainly not the last – time) speaks to the disconnect between reported profits and actual profits in the film industry. The Black Cauldron, on the other hand, grossed $21.2 million and finished at #42 in 1985, against an estimated budget of $44 million. This budget was inflated in large part due to disastrous meddling by Jeffrey Katzenberg, who was brought on after Miller’s ouster in 1984; he would have much the same impact (though with a much better result) in the making of Aladdin.

    [8] The animated telefilm The Lion, the Witch, and the Wardrobe, which was directed by Peanuts veteran Bill Melendez and aired in 1979 IOTL – and won the first Emmy for Outstanding Animated Program – has been butterflied ITTL.

    [9] Lundgren was, IOTL, attending the University of Sydney at this time, studying for his postgraduate degree in chemical engineering, which he completed in 1982. But fear not, Faculty of Engineering; though I may have deprived you of one of your more famous alumni, I have plans for you!

    [10] Yes, really. Amazing what difference filming in California and having the legendarily pro-diversity Desilu in your corner can do for casting, isn’t it?

    [11] Although Monty Python and the Holy Grail had been released prior to Excalibur IOTL as well (in fact, the two films were released only six years apart, as opposed to eight ITTL), the film’s cult following accumulated slowly enough that it did not enjoy the ubiquity it does now until some years after the release of Excalibur.

    [12] Thanks to Space Oddity for linking me to this scene from the OTL film of Excalibur, which does strike me as an effective rebuttal to the many scenes in Holy Grail contrasting Arthur’s splendour (and hygiene) from the peasantry he intends to rule over – consider this famous quote. (The scene also works without inviting comparisons to Holy Grail, because of the effective use of baptismal imagery.) In Excalibur, both IOTL and ITTL, Arthur proves his worthiness to be King based upon his actions, clearly a reconstructionist take on the divine right of the original Arthurian mythos framed for modern times.

    [13] Why does Guinness win Best Supporting Actor for Merlin when he failed to do so for Obi-Wan IOTL? A few reasons: the financial success of The Lord of the Rings, and The Journey of the Force winning the big prize back in 1978, along with Harvey Korman winning for Blazing Saddles, show that the Academy is more open to less “conventional” choices ITTL (and, to be fair, they’ve always been more willing to go off the beaten track when it comes to supporting categories); and the film is, after all, based on a work of classic literature (and can also be considered, however loosely, a “historical biopic”, a category which Oscar loves).

    [14] The only Oscar nomination for the film IOTL (at the 54th Academy Awards, mind you, not the 55th) was for Best Cinematography. Several other awards committees nominated the film for Costume Design, which was the only award it won at the Saturns IOTL.

    [15] Excalibur did reasonably well IOTL, with a $35 million gross, making it the #18 film of 1981, against a budget of approximately $11 million. Worth noting is that, IOTL, a little movie called Star Trek II: The Wrath of Khan made $78.9 million at the box-office in 1982, making it the #6 film of that year.

    [16] Well, they were for Industrial Light & Magic (ILM), a wholly-owned subsidiary of Lucasfilm, IOTL. However, Lucasfilm did not have the dedicated studio facilities or infrastructure to properly compete with the film industry as Desilu does – they released Star Wars through 20th Century Fox, and then Indiana Jones through Paramount (and Howard the Duck through Universal). Nowadays, studios cooperate far more readily on major projects and Disney’s purchase of ILM (via Lucasfilm) has not hurt their prospects so much. But back then, it would have been a very different story, especially since Desilu (unlike Disney) did not have a pre-established presence in the motion picture industry.

    [17] IOTL, the two lead characters – elf-like beings known as “Gelflings” – were played by animatronic puppets that plunged headlong into the uncanny valley.

    [18] The Dark Crystal grossed over $40 million IOTL and about the same ITTL; however, the budget was considerably higher ITTL – in other words, Jim Henson is a victim of his own earlier success creating unreasonable expectations for his big-screen début.

    ---

    Thanks to Thande for the constructive debate regarding our thoughts on fantasy, and, as always to e of pi (and friends) for his usual proofread.

    Oh, and before you ask, there is no E.T. or Blade Runner ITTL. For the millionth time, I remind you all that this is not a utopia! But thanks for reading anyway :)
     
    What Else is On?
  • What Else is On?

    Someday, somebody’s gonna figure out a way to get rid of all these wires, and they’ll be a millionaire.

    Malcolm Richter (played by Donnelly Rhodes), in the Life After Death episode, “What Else is On?”, originally broadcast November 11, 1982

    For more than a quarter-century, the collective output of the three broadcast networks of ABC, NBC, and CBS had enthralled television viewers north of the Rio Grande. There had been challengers to their cultural oligopoly – most notably the CBC in Canada, and its eventual analogue, PBS, in the United States – but, for the most part, these were considered mere sideshows. Programming produced for one of the three networks and aired by them, or by simultaneous substitution, or rerun in syndication, came to define English-language television in North America, and American television throughout the world. It wasn’t until the 1980s that this long-established status quo, in place since the collapse of the DuMont Network in the mid-1950s, was finally rendered obsolete.

    But it didn’t happen overnight, because it hinged on the mass-market adoption of new technologies – always a slow, gradual process. The technologies in question produced much the same result from the consumer perspective despite the differences between them, taking advantage of the same basic infrastructure which was already being laid – and launched – into place. Ironically, this dated all the way back to the period that had marked the dawn of the triopolistic hegemony: 1957, with the launch of the Soviet satellite Sputnik. It was a radio satellite, as were most of those launched in the years since. Satellites had many advantages for telecommunications purposes over old-fashioned, terrestrial transmitters and receivers, the most immediate of which, as far as television was concerned, was the ability to carry a plethora of channels – far more than the dozen or so (most of them low-quality, low-fidelity UHF stations) that could be received through an antenna. Alongside “satellite”, as that form of transmission would become known, was a complementary form of channel distribution (which, indeed, was dependent on satellite): “cable”, in which radio frequency signals were transmitted through coaxial cables. This was a technology that was easier for the average consumer to understand – it strongly resembled existing wiring hookups within the average household, including telephones, another ubiquitous telecommunications device. Thanks to the home video and video game boom of the late-1970s, people were already accustomed to plugging their CED players and VCS consoles into their television sets; what was one more wire to connect? [1]

    The connection between cable and CED technology wasn’t limited to wires that plugged into the television set. That the physical CED disc was shaped remarkably like an LP record went unremarked upon by no one, least of all musicians themselves. The 1970s were a notoriously drug-fueled decade (but then, so were the 1960s before them and the 1980s after them), which likely contributed to the urban legend about the origins of the “music video” and, by extension, the “video album”: that a stoned musician, producer, or engineer had attempted to play a CED disc in the record player – or an LP in the CED player, depending on the source – and then the idea for combining the two formats hit their drug-addled brains like a bolt of lightning. [2] Whatever their origins, music videos would revolutionize more than one industry, largely because of their impact on more than one medium. The format as it is understood in the modern sense originated in the Progressive Rock genre, which was rooted in strong narrative sensibilities – adding a visual component was a logical extension of the epic storytelling which so defined Prog and other like-minded genres.

    Thus, the most popular contender for the title of “first music video” was “Moonraker”, based on Queen’s hit 1974 title song for the James Bond film of the same name. The “promo film” combined scenes of the band performing the song “in concert” (actually, on an empty stage) intercut with scenes from the film. It appeared as an extra on the original LaserDisc release of Moonraker, in 1978 (as part of a concerted last-ditch marketing push for the lagging format), and it was again included on the 1981 release for CED. [3] In the three intervening years, there had been a proliferation of music videos released for VDP players; these enjoyed some popularity with the early adopters of the format, many of whom (unsurprisingly) skewed young and wealthy, and were fond of popular music. Queen, for their part, made extensive use of music videos, which were often featured on British variety shows in place of live performance. It was a win-win situation – Queen was known for utilizing extensive engineering tricks such as overdubs for their single and LP releases, which could not be replicated live. [4] The videos preserved their special vocal effects and also created the feeling of “being there” – much like concert films, which were also very popular at the time, and which were also being released on home video in the late-1970s (an ideal format for them, as they were otherwise never seen once they left theatres). But Queen, who by this time were making music videos of most of their songs, took the novel approach of releasing a CED version of their 1978 LP, Arabesque, partly in an attempt to better make their name known in the American market (where home video was more popular, and variety was on the way out). [5] It was the first “CED album”, and since the discs allowed for thirty minutes of video per side, it was comparable to an LP – the album was about 50 minutes long, divided evenly between sides.

    It was in this environment that Music Television, or MTV, premiered. Also inspired by the tradition of music videos standing in for live performances on televised broadcasts, it struck while the iron was hot, taking advantage of the new “video album” fad started by Arabesque when it launched in the summer of 1979 – on July 20th, to be exact, the tenth anniversary of the Apollo 11 moon landing, at 12:01 AM. [6] Accordingly, it began broadcasting with archive footage of that landing, the first words heard being Neil Armstrong’s famous quote “That’s one small step for a man, one giant leap for mankind”, before seguing into the first music video shown on the network – fittingly enough, Queen’s “Moonraker”. [7] It is likely that being the first music video aired on MTV and being a prominent early music video were conflated in the popular imagination, producing its reputation as the “first music video” simpliciter. Established music critics, many of whom were highly disdainful of the video album trend and of music videos in general, were scathing in their criticism of the nascent MTV. Rolling Stone magazine led the charge; it struck them as wholly appropriate that their long-time bête noire, Queen, had set the tone for the channel. [8] The magazine would infamously predict a quick and brutal death for MTV, and for the music video trend in general.

    But MTV was a smash success, proving to be the nexus of the music video/video album confluence – by the end of 1979, they had played all of Arabesque in sequence, in a meta-broadcast of the video album. However, they would reach their apex in the following year, through their connection with rising star Michael Jackson, whose dance-funk album Right through the Night found even greater success in video album form – after a fashion. [9] Jackson and his producer, the inestimable Quincy Jones, immediately saw the appeal and potential of the music video, creating what were essentially one-song musicals, with a skeleton narrative effectively “interrupted” by the songs themselves. Jackson spent most of 1980 making these videos, all of which premiered on MTV and bolstered the sales of his (non-video) singles, as well as the album itself. Many of the videos were much longer than the original songs, which prevented them from appearing together on an hour-long CED – and by 1980, the LaserDisc format, which allowed for longer play, was essentially dead. VTP was a possibility, but fortunately for CED players, new technology introduced in 1981 allowed two hours of video per disc instead of just one – and Right through the Night: The Video Album was released that year, coming in at over ninety minutes long. By this time, all of the videos had been aired (ad infinitum) on MTV, but the video album still sold very well, because it offered music videos on demand – something beyond the cable and satellite technology of the time.

    MTV continued to thrive, however, because of the holding pattern that was thus established: the audio-only album would be released first, alongside audio-only singles, and then music videos would follow, which would invariably premiere on MTV. Music videos for every track of the album would be produced – including the filler tracks which were never intended to be released as singles – and once every track had an accompanying video, they would be edited together and pressed on CED as the video album. By this time, most of the singles had run their course, and the videos attached to them had been played out on MTV. This allowed MTV to maintain their cutting-edge reputation and serve a “visual Top 40 radio” function while allowing the music industry to “double-dip” releases – Right through the Night was far and away the best-selling video album of 1981 and 1982, by which time Michael Jackson’s next album, Starlight, had been released (in audio format). [10] The notion of producing only a video album was inevitably floated by certain artists; Queen had considered it after Arabesque had been such a success, but the stranglehold of certain financial interests in the recording industry – and the necessity of audio-only formats, such as car radios – prevented this from becoming a reality. Still, the new situation did render certain other formats obsolete, or at least hasten their decline, with old-style musicals and variety shows first and foremost among these. However, many of the more sophisticated music videos – with scenes of spoken dialogue interspersed throughout – were not “adapted” directly to radio; after all, who would want to listen to characters talking when they could be hearing them singing?

    Profit margins on video albums were smaller than they were for traditional audio LPs – fortunately for the producers, consumers had a reasonable expectation that these video albums would cost more than audio albums, which helped to cover the higher costs of shooting the videos. Prices for video albums reached equilibrium in a range similar to that of other home video: cheaply-made ones would be sold for lower prices, whereas lavish, elaborate video albums performed by the hottest artists (with highly desirable music videos attached) would be sold at a premium. A 1983 study on video album pricing confirmed the perceived rule of thumb that the length of the album directly correlated with its price – fortunately for the nascent format, the industry was able to take advantage of established infrastructure which provided significant economies of scale. Unsurprisingly, a company never far from the bleeding edge of media technology, Desilu Productions, was involved. It had three key components to facilitate the production of music videos: ample studio space; the sprawling Forty Acres backlot; and post-production facilities. [11] It was in Desilu’s best interest that CED – a format of which they were a major backer – be as successful and appealing to as many diverse interests as possible. Many music videos which became classics were shot and edited there.

    MTV, meanwhile, would not be alone for long. Given that the station’s niche of providing “visual radio” had rapidly become a phenomenon, rival conglomerates soon began developing music video stations of their own, most of which were first added to cable packages in the 1982-83 period. Music videos were here to stay, and they defined the 1980s perhaps more than any other format, genre, or medium. MTV, for their part, were clamouring to expand beyond the United States, and the first port of call was Canada, where there was also interest in music videos and video albums, thanks to the near-simultaneous introduction of both CED players and cable television to that market. However, given that Canada was far less populous than the US, and that CanCon regulations, introduced in the early-1970s, had been imposed upon the broadcast media supposedly to help protect Canadian musicians, “visual radio” would have to adapt to meet the same standards. What few homegrown Canadian music videos had been released still reflected 1970s production values; many musicians, and even record labels, simply did not have the resources to produce them at the same level of quality as their American counterparts. The knock-on effects of the “video nasty” scandal had also played a part; many of the moral crusaders, who had failed in their attempt to prevent HBO from crossing the 49th parallel, had quickly moved on to the next target: stricter control over the home video market, which extended to video albums, and therefore to music videos in general. The prevalence of raucously suggestive lyrics in many musical genres of the time had often extended to lascivious imagery in music videos, which fuelled the argument that “visual radio” was depraved. The irony that those fighting this crusade were the children of those who had railed against Elvis Presley swiveling his hips on Ed Sullivan in the 1950s did not escape anyone’s notice.
Those who had more liberal sensibilities took the cultural protectionist tack to preventing MTV from coming to the Great White North – the free-market, “Top 40” approach to music videos would overwhelmingly favour popular American acts, leaving Canadian performers in the lurch, other than those who had achieved some measure of stateside success, such as mature, soft rock acts like Anne Murray, who were not predisposed toward releasing music videos.

    MTV wasn’t about to let “concerned citizens” stand in their way; they knew that most ordinary Canadians wanted to have the channel available in their homes. However, they needed native backers to make that a reality – partners who would be willing to go to bat for them with the CRTC, the gatekeepers of Canadian television. They found them in a consortium led by the CTV/TVA broadcasting alliance, which operated bilingually (and therefore had an interest in both English- and French-language channels of the planned “MTV Canada”) and had seen other pay-TV ventures (including those spearheaded by their archrival, the public broadcaster CBC, as well as HBO Canada) launch with great success. They didn’t want to be left behind, and surprisingly, their venture had an unlikely ally: Israel “Izzy” Asper – owner of CanWest, known in the United States for having recently purchased the United Artists film corporation, but known in Canada for running the upstart Global Television Network. Asper’s unique portfolio had given him special incentive to back the MTV deal: he knew that the CRTC would almost certainly demand that any “visual radio” channel abide by CanCon rules comparable to those governing actual radio broadcasting. This meant that Canadian artists would be heavily featured – but they didn’t have to be in Canada. The many Canadian artists who had relocated to Hollywood could shoot their music videos there, as long as the studio was owned by Canadian interests – and there was only one of those in Tinseltown. UA entering the music video business would put them in conflict with Desilu, but Asper cannily suggested an alliance with them as well: UA would stage and film music videos only for Canadian acts, and would direct all of their clients to Desilu Post-Production for all their editing and special effects needs.
In practice, however, doing the post-production work in Canada made it much easier for music videos to meet the minimum CanCon threshold – and record companies, mindful of their potential Canadian revenues, knew this as well.

    Asper was one of a large number of individuals called before the CRTC to testify with regard to allowing MTV onto Canadian television, and he spoke eloquently in support of the notion. However, the CRTC was also made aware of the simple economic reality that people were buying video albums on CED, and that their organization could not control those purchases directly – they could only influence them, as they had influenced single sales by forcing radio stations to showcase Canadian performers. By the same logic, a carefully-regulated Canadian version of MTV would surely have the same influence on CED sales. To this end, MTV Canada (locally known and branded as simply MTV) was launched on January 1, 1982, with two feeds available: one in English, and one in French. [12] The latter was the first dedicated music video station available in that language worldwide. It had taken two-and-a-half years, but Canada finally had its MTV. For those in charge at corporate headquarters in New York, it was only the beginning…

    It was only the beginning for Izzy Asper, as well. Music videos and CanCon regulations made for strange bedfellows, but they were enough to keep the UA studios in Hollywood busy, and their technicians employed, while allowing Asper to take advantage of his assets on both sides of the 49th parallel. Still, the staggering amount of bureaucracy involved in running his operation had inevitably curbed his expansion plans somewhat, as did the unfortunate reality that, in many ways, Canadian telecommunications technology had always been several years behind that of the United States – the relatively short time it had taken to get MTV Canada green-lit had been a conspicuous and fortunate exception. SCTV had proven that a potential American audience for Global programming did exist, but even though the United Artists deal had included the purchase of two American stations (including a Cleveland station, WUAB, which was a local market leader), Global didn’t even cover all of Canada by the dawn of the 1980s. [13]

    In certain areas, such as Southern Ontario, Greater Vancouver, Manitoba, or parts of Alberta, full network availability was guaranteed; outside of them, however, coverage was spotty. To help fill the gap, many stations in those patchy markets, including affiliates of the rival CBC and CTV networks (even stations owned-and-operated by them), would carry Global programming in their stead. From the network’s perspective, however, this was problematic, as such coverage of the Global schedule was patchwork and haphazard; the right to air SCTV, naturally, was hotly contested, but it only came as part of a package deal, and stations that aired Global programming would often “cherry-pick” what they wanted to bring to air. It didn’t help that the CanWest Global System (as the network was officially known) was already highly decentralized: Global did not air a “main” network schedule, per se, unlike the American networks (except for PBS) or domestic rivals CBC and CTV. Thus, affiliate stations and even owned-and-operated stations largely had the freedom to program their own schedules, including when to air the Global network programming feed. In order to emerge as a viable and serious competitor to the rival networks in an age when the established network television order was open to a shakeup, drastic changes had to be made to bring both the network-owned stations and the affiliates in line. This involved ensuring that the network schedule was uniform across all stations, allowing for better demographic and regional breakdowns for ratings analysis. Interestingly enough, Asper had decided to include the Cleveland station inherited from UA, which (though near the Canadian border, across Lake Erie) was not even a Canadian station. However, Asper was armed with what he felt was a secret weapon in bridging the gap on that front.

    But consolidation was not the same as expansion, which was needed to fill in all the holes where Global stations simply did not exist, either by investing in additional infrastructure [14] or by outright purchasing stations in new markets. Asper sought to expand network coverage east of the Ottawa River; Quebec, the second-largest province, had no Global stations or programming available whatsoever. But Global found an opening in Quebec City, where the local (English-language) CBC Television affiliate, CKMI, had long suffered through severe financial problems; unlike Montreal, Quebec City was overwhelmingly unilingual francophone, and as a result the station’s audience was barely large enough to keep it alive. Just before Christmas, 1982, Asper approached Pathonic Communications, which owned CKMI, about purchasing the station. They immediately agreed, and CKMI would officially become a Global station just in time for the start of the 1983-84 season.

    His next acquisition in la Belle Province was a community television station, CHOY, located in a Montreal suburb called Saint-Jérôme. Much like CKMI, CHOY struggled financially, despite government subsidies, making it an easy target for purchase. However, in order to comply with the CRTC’s rules on community television, CHOY had to retain a block of French-language programming for the benefit of the residents of Saint-Jérôme, which included educational programming from TÉLUQ (Télé-université, the distance learning service of the Université du Québec). But this was a small price for Asper to pay to finally gain coverage in the Montreal area, the largest in Canada – rather larger were the capital assets he would have to secure to render the station able to serve the vast conurbation. [15] Global thus resolved that every English-speaking community in Quebec should have access to Global programming, a level of coverage that only the CBC could match. As an opening salvo, a repeater station was built in Sherbrooke, marking the beginning of Global’s extensive repeater network throughout the province. [16] However, acquiring stations which already had the necessary infrastructure in place – as was the case with CHSJ, a station in the Maritime province of New Brunswick, east of Quebec, which Asper also acquired in 1983 – made Global’s job much easier. [17]

    The big loser in the face of Global’s aggressive expansion was the CBC. Their lost market in Quebec City could easily be served with a repeater of the Montreal station, but New Brunswick was a different story altogether. With the acquisition of CHSJ by Asper from the New Brunswick Broadcasting Company, the CBC no longer had any English-language presence on television in that province. To compensate for this, their French-language sister network Radio-Canada’s transmitters in New Brunswick began airing CBC programming until such time as a replacement station could sign on; for cable and satellite TV customers, the feed from the Halifax station was rebroadcast instead. Sure enough, CBAT was soon granted a new licence by the CRTC and signed on for the first time in Moncton, restoring CBC television service to the province. [18] However, the CBC was in some ways glad to be rid of CHSJ, as the station was notorious for pre-empting large chunks of CBC programming, with the province missing out on several shows, including even the successful ones which were critical to the network’s fortunes (greatly annoying many MPs and the CRTC alike); infamously, CHSJ had once unilaterally pre-empted the Stanley Cup finals in favour of broadcasting returns from a provincial election. [19] Whether or not the Irving family, which previously owned CHSJ, had been trying to save face by selling it off to Asper is one of the great mysteries in Global’s history, and the fuel of several conspiracy theories.

    Although Asper felt that he still had much more work to do in giving Global a truly national presence in Canada, he was pleased with what he had accomplished to date. Not only did the recent acquisitions expand the reach of Global, but they also made it that much easier to promote his other major holding, United Artists. UA was providing solid, if unspectacular, revenues for CanWest – and although the use of their studio space to produce music videos for Canadian artists had been a step in the right direction, most of that revenue came from the UA television catalogue (with their continued productions serving as the main occupants of their Hollywood soundstages). CanWest-owned stations never wanted for good movies or reruns, even in the wee small hours of the morning. However, this stable holding pattern was clearly intended by Asper to be the jumping-off point for bigger and better things, including new television series. He had to prove that SCTV was no fluke, and he now had the means, motive, and opportunity to do so. What caught his attention, as was often the case, was his competition. He had noticed that CTV and TVA were able to make their alliance work with their take on the venerable medical drama. He had also heard that the CBC was seeking to revive a previously-established property of theirs, in response to the increased threat from the private networks. Of course, Asper wanted to respond to these new offerings, and preferably in a way which would appeal both to Canadians and to Americans.

    The CTV-TVA “alliance”, as both sides of the cooperative relationship often described it (the Canadian media, given the Anglo-French angle, sometimes called it the “Entente Cordiale”), had helped to turn around CTV’s fortunes and consolidate its position as Canada’s foremost private network. In part, this was due to CTV taking full advantage of established TVA programming, either by dubbing their original series into English, or by producing licenced remakes, using them to fill voids in the network schedule. Their most ambitious undertaking was a jointly-funded series which premiered on TVA, essentially seen as Canada’s answer to the popular Quincy, M.E. (itself a thinly-veiled ripoff of a 1960s Canadian drama called Wojeck). Set in Montreal, with a bilingual cast (which made it easy to reshoot dialogue scenes as the producers saw fit) and a cool jazz soundtrack (befitting one of Montreal’s claims to fame as Canada’s jazz capital), Concordia was an interesting hybrid of programming styles. It largely reflected the conventions of Quebec’s homegrown and ubiquitous téléroman genre, which straddled the line between serialized soap operas on the one hand and more traditional North American episodic television on the other. [20] The series focused on the city medical examiner, played by Eric Donkin, a British-born actor primarily known for his theatrical work with the (Canadian) Stratford Festival – the character was largely old-school and set in his ways, but able to solve all crimes through his meticulous M.E. work. All of this played out against the backdrop of a series that reflected the aesthetics and style of the previous decade far more than those of the new one. [21]

    Not to be outdone, of course, the CBC took their cue by going back to the well and trumping “Canada’s answer to Quincy” in one fell swoop: a remake of Wojeck itself – after all, it had been the direct antecedent to Quincy, M.E. To this end, CBC green-lit the program, though with a new, younger star (the original Wojeck, John Vernon, had been one of the endless stream of Canadian actors departing for better opportunities in the States). [22] Both versions of Wojeck were set in Toronto, the second-largest city in Canada, and the largest in English Canada – the better to complement Concordia’s setting in bilingual Montreal. Toronto was primarily a working-class industrial city in the vein of many others on the American side of the Great Lakes, and despite its great size it was not nearly as self-consciously cosmopolitan as Montreal – it had been overwhelmingly WASP-ish (primarily Methodist, until that church merged into the United Church of Canada in the 1920s) and Orangist until mid-century, which had informed the portrayal of the city in the original 1960s series. In the 1980s series, Toronto was depicted as struggling to carve out a reputation for itself as a world-class city – like Montreal, it was home to MLB and NBA teams, and the world’s tallest freestanding structure in the CN Tower – which was, sadly, a relic of terrestrial radio broadcasting in an era increasingly favouring cable and satellite transmission (hence the reason no other, taller radio tower had yet been built – giving Toronto a solitary, if verbose, distinction).

    Since Global, alone among the three Canadian networks, already had a homegrown success stateside in SCTV, that informed their programming strategy, as did close observation of trends in American television at the time. Hill Avenue Beat had revolutionized American drama, and if Global wanted to produce a series as revolutionary as the CBC had done with the original Wojeck (as opposed to their derivative remake), they would have to follow its lead, or perhaps even go further beyond. Thus came their own take on the medical examiner drama, meaning that all three Canadian networks would have one on the air in the 1982-83 season. Global’s version was called, fittingly, Life After Death, and it was set in Winnipeg, base of operations for Global’s corporate parent, CanWest. (It was also filmed there, thanks to generous tax breaks for film production provided by the provincial government of Manitoba.) [23] It premiered later than both of its rival series (though just weeks after Wojeck hit the airwaves), and starred Donnelly Rhodes, a Canadian actor known on both sides of the border for his appearances on Soap. Rhodes had primarily been known as a dramatic actor prior to Soap, but his role on that program as a bumbling fool had made an impression on viewers (and casting directors) which he hoped to erase through Life After Death. How could he do so on a show which aired only in Canada? Because Asper wanted Life After Death to follow in the footsteps of SCTV (and myriad British programs) and air in US markets. Typically, Canadian programs, like British ones, had shorter seasons of about 13 episodes or fewer, compared with American programs, which averaged 22-26 episodes per season. On this front, Asper preferred to retain the 13-episode season – his insistence on a comparatively short run was in part due to the limited overall budget the series would have, and a marked emphasis on quality over quantity. With United Artists on board with the project, secondary post-production work was done in Hollywood.

    Life After Death was sold into first-run syndication, and picked up by several stations in the United States – including, of course, Asper’s own United Artists Broadcasting stations in Cleveland and Puerto Rico (the latter with Spanish subtitles), but also such superstation heavyweights as WSBK in Boston and WPIX in New York City – airing during the 1982-83 season alongside Canadian broadcasts. It was during this time that the show was noticed by a program executive at third-place CBS, who liked its innovative approach to the medical drama and contacted Asper, picking up the show for broadcast as a midseason replacement in early 1983. If ratings were strong enough, CBS would buy into the second season, contributing to a “full” 22-episode order and airing it in the 1983-84 season, simultaneous with the planned Global broadcasts in Canada. It was the first time that a primetime series produced for Canadian audiences would be broadcast on an American network. However, ratings – though good, especially by CBS standards – were not up to the level that the network wanted, perhaps because the show had already aired in scattered markets throughout the US by the winter of 1983, and the network was trying to pass off these reruns as first-run episodes. Asper was aware of this and argued as much in negotiating a deal for a second season, but to no avail. [24] This “betrayal” by CBS, coupled with the continued backroom compromises with NBC with regards to SCTV, strengthened Asper’s resolve to move into the American market on his own terms, and – much as had been the case with MTV and United Artists – he was determined to find the right opportunity to do so…

    ---

    [1] The typical “keeping-up-with-the-Joneses” household in the early-1980s would be expected to have the following inputs connecting to their television set: cable hookup, CED player, VCS (II) console, stereophonic speaker connection, and camcorder/VTP player (and possibly recorder). That’s five different hookups where, prior to the mid-1970s, there’d be only one – to the antenna/rabbit ears. The wiring situation is only going to get a lot worse before it gets any better, just as was the case IOTL.

    [2] The varying descriptions of the music video’s precise origin help to indicate that the story is an urban legend. In actual fact, music videos as we understand them today evolved in much the same way as the Queen example given in the update – videos of musical acts performing their songs were substituted for live performance where it was not feasible. MTV was directly inspired by broadcasters in New Zealand, which due to their small population size and geographical isolation was rarely able to attract big name acts to their shores – pre-recorded material was therefore used instead.

    [3] The music video of “Moonraker” is sometimes (erroneously) regarded as the first “special feature” on a home video release ITTL.

    [4] Queen turned the same trick IOTL with “Bohemian Rhapsody” – oft-cited as the “first music video” in the modern sense of the term – in which they “performed” the song (inexplicably dressed in Queen II-era outfits from some years before). That the video was intended as a substitute for a live performance is cemented by the fact that the “opera” segment of the video was played at actual concerts, because the band could not possibly have performed it live (they did play the rest of the song themselves, however). Since “Moonraker” has a similar overdubbed harmonizing effect, its music video serves much the same purpose ITTL.

    [5] Queen’s 1978 LP was called Jazz IOTL, though it wasn’t particularly jazzy. (1975’s A Night at the Opera wasn’t very operatic, either – and yes, I know it was a reference to a Marx Brothers film). However, Jazz did feature a delightful (if not remotely faithful) middle-eastern pastiche called “Mustapha”. A similar song (only proggier) appears on the LP ITTL, and that style “lends its name” to the album, in that Arabesque is a real musical genre native to the region which does not appear at all on Arabesque.

    [6] MTV premiered on August 1, 1981 IOTL, about two years later, there being neither threat nor incentive provided by video albums. The first footage shown was the leadup to the inaugural launch of the Space Shuttle Columbia, which had taken place earlier that year – a different Columbia has been flying for several years by 1979 ITTL. However, a montage of still images from the Apollo 11 moon landing was shown IOTL, just as ITTL.

    [7] The first music video played on MTV IOTL, in an impressive display of hubris, was “Video Killed the Radio Star”, by the Buggles. The single was nearly two years old at the time. ITTL, “Moonraker” – about five years old, and available on home video for over a year – is chosen to capitalize on the Moonshot Lunacy nostalgia.

    [8] It is ASB in any universe for Rolling Stone to like Queen, whom they once unironically described as “the first truly fascist rock band”. Elaborate music videos, meanwhile, are on the rise just as the editorial staff happens to be in one of their “three chords and the truth” phases.

    [9] Michael Jackson is one of the youngest people at the POD whose career after it proceeded more-or-less the same as the years progressed: the youngest member of the Jackson 5, discovered and signed by Motown, side solo career, estrangement from his domineering, abusive father, full-fledged solo career. Right through the Night (named for a line from the OTL song “Off the Wall”, written by Jackson, and ripe for double entendre, much like many of his songs on the Off the Wall album) is a good deal funkier than the OTL disco classic, and it also lacks a song by Paul McCartney (what can I say? They ran in different circles ITTL). Although many don’t associate Jackson with music videos until his Thriller album, he did experiment with the format for Off the Wall, though far less creatively and memorably.

    [10] “Starlight” was a working title for the song that eventually became “Thriller” IOTL.

    [11] What the narrator doesn’t see fit to tell you is that Forty Acres is expensive to maintain – it’s sitting on prime Culver City real estate, which is why, IOTL, it was levelled in 1976 and an industrial park built on the land. ITTL, Forty Acres sees constant use from third-party clients and still barely generates more revenues than expenses. (Tours open to the public are another way the backlot generates revenue – for obvious reasons.)

    [12] MuchMusic, Canada’s answer to MTV (which, unlike its inspiration, is still required by the CRTC licence to air at least 50% music videos), did not premiere until September 1, 1984, IOTL. The French-language channel MusiquePlus followed two years later. As was the case for the earlier launch of MTV, video albums are primarily responsible for the accelerated timetable. Neither MuchMusic nor MusiquePlus were affiliated with MTV, which has since twice attempted to enter the Canadian market. Thanks to conglomeration among the major telecommunications firms, the same company currently owns both MuchMusic and MTV Canada, whose broadcast licence specifically prohibits it from airing music videos in order to protect MuchMusic. That’s right; what’s only de facto in the US is de jure in Canada!

    [13] By contrast, the three American networks covered most of the 200+ markets in the United States by the end of the 1950s, speaking to the greater financial base in that country, as well as the industrial/technological component. DuMont had similarly spotty nationwide coverage in the mid-1950s, but of course, it shut down.

    [14] Adding infrastructure can be accomplished by building a network of repeater stations (as Global did in Southern Ontario, and to a lesser extent around the Winnipeg and Calgary stations) or by modifying the coverage area – which CKVU, the Vancouver station ITTL and IOTL (before 2000), successfully did by changing from UHF to VHF, greatly increasing coverage to include much of southwestern British Columbia and northwestern Washington state, including portions of Seattle – while repurposing the old UHF frequency as a booster transmitter for Victoria and southern Vancouver Island (ITTL) and adding a repeater in Courtenay, covering northern Vancouver Island (ITTL and IOTL).

    [15] The station had a strength of only 47 watts, so the CRTC granted CHOY a major transmitter boost, which allowed it to cover Montreal. However, a second, equally powerful repeater transmitter was also required, since CHOY’s newly boosted main signal only covered Laval and the North Shore; the second transmitter would therefore cover Downtown Montreal as well as the South Shore. The second transmitter would also carry the programming from TÉLUQ, but not the additional French-language programming for Saint-Jérôme.

    [16] Questions immediately arose as to which station of origin would be used for each of the transmitters. A compromise was worked out wherein, except for the existing transmitters in Sherbrooke and the Montreal area, CKMI would be the official station of origin for all of Quebec’s Global repeaters. The Outaouais region would be an exception, however, as they already received the CKGN feed from Toronto via a repeater in Hull, Quebec, which served the Ottawa area on the Ontario side of the river.

    [17] In addition to its primary transmitter in Saint John, the largest city in the province, it also had repeaters in eastern New Brunswick (mainly Moncton in the south, Miramichi in the centre, and Campbellton in the north near eastern Quebec) and in western New Brunswick (at Perth-Andover, mainly to gain cable coverage in Maine). Asper therefore didn’t need to expend much effort expanding it, as most of New Brunswick’s population was covered by one of these existing repeaters.

    [18] A second transmitter was quickly put on the air at the same time, covering both Fredericton (the provincial capital) and Saint John; over the next couple of years, more repeaters would be added on, eventually replacing the Halifax station on cable with the Moncton station.

    [19] Despite Canada’s fanatical zeal for hockey, this is indeed based on an OTL event. CHSJ, which was IOTL bought out by the CBC as an owned-and-operated station (it had previously been an affiliate) pre-empted a game of the Stanley Cup finals (between the New Jersey Devils and the Mighty Ducks of Anaheim, so at least a Canadian or Original Six team wasn’t actually playing) to report the results of the 2003 provincial election (which, to be fair, was a very close race).

    [20] The téléroman is a soap opera-like format not too dissimilar to the telenovelas of Latin America. Family dynamics did come into play quite often in Concordia, as is typical of soap operas, but they were normally considered secondary (contrary to téléroman conventions, where they were a major driving factor of the plot – witness the first one, La famille Plouffe, based on the aforementioned novel, for example). Women largely occupied traditional roles as counterparts to male characters (especially in domestic scenes). The episodes themselves were largely formulaic, each starting with a crisis for which there was always a solution by the end – or at least one in sight. This makes the téléroman more complex but also much more encompassing than its Anglo-American equivalents, and it often crosses over with more traditional North American-style “episodic” programming, making Concordia fairly conventional by Quebec standards. Compare that with the typical plot of a telenovela – poor girl falls in love with rich guy, rich guy breaks off relationship with evil rich girl, evil rich girl conspires with parents in attempting to break up the poor girl-rich guy relationship, poor girl turns out to have a rich relative, poor girl and rich guy get married and live happily ever after, evil rich girl gets her just deserts. Quite different, eh?

    [21] You guys probably know the drill by now. Camerawork was staid and overly formal, and the orchestral soundtrack occasionally overwhelmed the action in trying to create the right “atmosphere”. The colours were drab and unappealing, consistent with the washed-out earth-tones that had made the 1970s such a shocking contrast with the loud, colourful, tie-dye era that had preceded it, and against which 1980s aesthetics were such a strong reaction.

    [22] For this reason, the OTL Wojeck ran for only two seasons of 20 episodes, from 1966 to 1968 – a short run, even by Canadian standards. Vernon returned to the role only once, in a 1992 telefilm which aired on the CBC. Wojeck had been based on the career of real-life Toronto coroner Dr. Morton Shulman (who held the job from 1963 to 1967).

    [23] The inspiration for this purely TTL series is a very similar 1990s CBC series, Da Vinci’s Inquest, with veteran actor Nicholas Campbell in the title role of the coroner (and Donnelly Rhodes in a supporting role as a police homicide detective). Da Vinci’s Inquest is itself based on the life and work of the Vancouver city coroner (later Mayor, and currently Senator) Larry Campbell, and as such it takes place in Vancouver, with its real-life problems (albeit somewhat fictionalized) as major plot points.

    [24] This ordeal is similar to CTV’s OTL experiences with CBS in the production of Due South, the quirky 1990s cross-border hit. CBS cancelled the show after the first 22-episode season, but CTV pressed on without them, only for CBS to change their mind and pick it up again once the second season was in the can. Then they cancelled it again at the end of that season, and refused to change their minds; CTV, with the help of outside funding, produced two more half-seasons without American network involvement. In the years since, many Canadian-made shows to become popular stateside have done so on cable (such as Corner Gas and Degrassi).

    ---

    Thanks to Dan1988 for co-writing this update! Thanks also to e of pi, as always, for proofreading it. And to a reader at the TrekBBS known as USS Triumphant who had some cracker-jack timing and made a brilliant observation just in time for me to start writing the “MTV Update”! Serendipity can be a wonderful thing. But to answer the question so many of you have been asking for so long: yes, music videos are indeed “a thing” ITTL. And just as IOTL, they’re the final nail in the coffin for more traditional musical films…
     
  • Appendix B, Part VIII: Don’t Cry for Me, Argentina

    At the beginning of the nineteenth century, the continent of South America, with the exception of the Guiana region, was divided between the same two powers which shared most of the Iberian Peninsula: Spain and Portugal. However, although the Portuguese State of Brazil survived its peaceful separation from the mother country completely intact to the present day, the same could not be said of the various Spanish viceroyalties, which fractured almost immediately upon winning hard-fought independence from Spain. This included the Viceroyalty of the Río de la Plata (or River Plate – referring to the confluence of the Paraná and Uruguay rivers), which had divided into several successor states; the most prominent of these was the Republic of Argentina. However, as a result of the protracted collapse of Spanish rule, Argentina maintained several irredentist land claims to territories which they did not control. These included the Falkland Islands (in Spanish, las islas Malvinas), governed (or occupied) by Britain since 1833, and a small group of islands in the Beagle Channel off Tierra del Fuego, under Chilean control since 1881 but in active dispute between the two neighbouring countries (who shared the Tierra del Fuego archipelago) since 1904. These simmering disputes, which endured even as borders with Argentina’s northern neighbours (Uruguay, Brazil, Paraguay, and Bolivia) had been settled definitively (often on the field of battle), occasionally threatened to boil over, but until the 1970s they had always been contained. Anglo-Argentine relations had historically been very good, with Argentina seen by outside observers as a de facto British colony through most of the nineteenth century. Argentine relations with Chile had decidedly been more tense, but this was par for the course in South America, a continent divided by a common heritage.

    Ever since Juan Perón had been deposed from the Argentine Presidency in a 1955 coup d'état, and the movement named for him (Peronism, represented by his Justicialist Party) had been banned from participating in subsequent elections, that country had experienced constant political instability. Perón’s eventual democratically-elected successor, Arturo Frondizi, was himself deposed in a 1962 coup, as was his democratically-elected successor, Arturo Illia, in 1966. General Juan Carlos Onganía, who led the latter coup, decided to appoint himself President in order to govern Argentina on a permanent basis; this was in contrast to the leaders of previous coups, which had merely established transitional authority pending the results of the elections which they inevitably called, restoring power to the people. Under the banner of his so-called “Argentine Revolution”, Onganía claimed to be restoring the natural state of affairs in Argentina while ensuring economic stability. Normally, as so many other dictators had in the past, he personally favoured corporatism and autarchy. However, considering the state of the Argentine economy at the time, he instead decided to institute policies which were intended to slow inflation. Wages were frozen, trade barriers were brought down, value-added tax was introduced while the inheritance tax was eliminated, collective bargaining rights were suspended, and state pensions were privatized. His regime replaced the old peso with a new peso pegged to the US dollar, although the future value of the peso against the dollar would be announced via monthly timetables in a series of gradual controlled depreciations, which became known as the “tablita”. Petroleum remained a key interest of the state, with a goal of energy self-sufficiency by 1975. 
Ironically, Onganía himself would eventually fall out of favour with the very military that had installed him, and he was summarily deposed in much the same way as he had risen to power – though many of his economic policies would survive him, for better and for worse.

    New elections were held in 1973, and in the nearly twenty years since he had been deposed and exiled, Juan Perón had sought to consolidate power and allow the representatives of his movement to seek elected office; this time, he was ultimately successful. Though he was not allowed to return to Argentina, a proxy candidate, Héctor Cámpora, was permitted to seek the Presidency for his Justicialist Party, winning that election – with an outright majority of the vote. The expectation following that election was that Cámpora would, with the help of the Argentine National Congress (as the Chamber of Deputies had also returned a majority of Peronist lawmakers), clear the way for Perón to make a triumphant return to his homeland; Cámpora would then resign, triggering a new presidential election – which would surely be a coronation for Perón. And perhaps it would have been – had he not died while still in Spain, on June 5, 1973, at the age of 77. [1] His corpse went ahead with the planned journey to Argentina, where it was entombed. Much like his second wife, Eva, Juan Perón would be remembered for the promise of what had been, along with what could have been.

    Cámpora had served under Perón during the 1950s, but no one had intended for him to hold office for an extended period of time. Indeed, even before Perón was due to return, Cámpora did much to agitate the military, primarily by establishing relations with Communist Cuba, since he thought he would have little to fear from them. Then, when it became clear that Cámpora would remain President until his term ended (or sooner, if tradition had anything to say about it), he faced the quandary of how to reconcile his left-populist ideology with the fiscal reforms of the previous administration – which seemed to be doing some good for the economy (even helping it to weather the Oil Crisis, thanks to the emphasis on native oil production), despite not having been supported by the people. Cámpora did his best to square the circle by rectifying what he called “the social cost”, implementing Peronist social policies on top of the existing reforms. Notably, wage and price controls were completely abolished, and through a social pact with the main Argentine trade unions, workplace democracy was introduced. Most of these changes naturally came at the expense of the military-industrial complex, deepening that institution’s enmity for him. His good intentions were additionally hampered by his lack of competence or charisma and the poor execution of his agenda; the military responded by launching yet another coup, and this one was intended to last. [2] Those behind the new junta, euphemistically known as the “National Reorganization Process” (Proceso de Reorganización Nacional, or simply el Proceso, “the Process”), had become convinced that the death of Perón had eliminated the one person whom Argentines could rally around, few people having much sympathy for Cámpora (who eventually fled to Mexico).

    At first, the Process was welcomed by the Argentines, as the tradition of coups toppling corrupt democratic governments was well-established by this point. Even the anomalous Onganía Presidency did not mar the overall positive impression that many people had of their military (since, after all, it was the military that had deposed Onganía himself in the end). However, all the usual suspects were as vociferous in their opposition to undemocratic rule as they had always been, and it soon became apparent that another tradition, that of the military silencing opposition through censorship and restriction from political involvement, would have to be… changed with the times. Opponents of the Process soon developed a rather nasty habit of “disappearing”. Naturally, the junta focused initially on the more notorious figures of the Argentine underground, knowing that the general public would not miss them. Even as more benign figures were targeted, many people willfully ignored the implications: “Algo habrán hecho”, so the expression went. [3] Many Argentines were also fearful for their own well-being, declining to associate with acquaintances for fear that those acquaintances might be among the disappeared – and that any connection would taint them by association. Those opponents of the Process who were safe from repercussions (and able to spread word of them beyond the borders of Argentina) labelled the junta’s implementation of state terror “the Dirty War” (Guerra Sucia), a name that stuck. Naturally, those who were part of the Process that did not fully agree with its methods or objectives soon found themselves drummed out of the inner circle, which rapidly devolved into an echo chamber.

    By the time the Process had come to power, Argentina had more or less achieved fiscal stability: the economy was finally growing again, and, more importantly, inflation was finally under control. The Process, for their part, eliminated most of Cámpora’s social pact, and privatized most of the state-owned enterprises (except for YPF, the state petroleum company). Yet there were lingering problems which would eventually come back to haunt both Argentines in general and the junta in particular. Many retailers and industries went into bankruptcy, unable to compete with the flood of cheap imports that entered Argentina. Poverty and unemployment rose tremendously as a result, and soon the government began taking on debts in order to remain solvent. Unsurprisingly, corruption was rampant. Whenever an industry would be privatized, someone with good connections to the Process would inevitably get their hands on it at bargain rates via bribery and insider information. The tablita, meanwhile, did at least as much harm as good. By 1978, the Argentine peso was the most overvalued currency in the world, which made it easier for Argentines to purchase imports, leaving Argentine-made consumer goods on the shelves. The peso thus became affectionately known as “la plata dulce”, or sweet money. This phenomenon, coupled with the already-present tendency to seek quick, brutal, and violent solutions to existing problems, and the growing international condemnation of the Dirty War (leading Argentina to affiliate with the infamous Backwards Bloc in the twilight years of that group’s existence), resulted in the junta adopting a foreign policy to match the tenor of their domestic policy.

    The growing belligerence of the Argentine junta sparked an international incident in late 1977 when the RRS Shackleton, an unarmed survey ship which operated out of the Falkland Islands, was attacked by the Argentine Navy – specifically, the destroyer ARA Almirante Storni, which fired upon her while she was en route to Antarctica as part of a British Antarctic Survey expedition, scoring a hit which critically damaged her hull. Fortunately, no hands were lost, but HMS Endurance, a patrol vessel which was in port at Stanley at the time, had to be dispatched to rescue the crew from their ship, which was taking on water and gradually sinking into the ocean. [4] The Shackleton would not make it to Antarctica, much like her famous namesake; nor would she successfully return to Stanley or any other friendly port for repairs. Said namesake’s son, the Baron Shackleton, was a peer in the House of Lords, and a member of the Opposition Labour Party. His vocal criticisms of the government’s muted reaction to the sinking of the Shackleton – even as it became an international incident – helped to propel the issue into the British consciousness. Surprisingly, the resolutely left-wing Leader of the Opposition, Michael Foot, denounced the “inaction” of the government regarding the Shackleton Incident, especially when it was discovered that the Argentines had been illegally operating a naval base on Thule Island, in the South Sandwich Islands (British territory, though also claimed by Argentina), since 1976. [5] The combination of the lack of retribution for the loss of the Shackleton, and the passive acceptance of the “squatters in the Sandwich Islands” added up to the largest scandal of the Whitelaw government’s first term, and a major issue in the following year’s general election.
Although there remained considerable opposition to taking a harder line on Argentina on both sides of the aisle, when polls showed Foot leading Whitelaw on the issue (despite his dismal performance on most economic questions), the PM agreed to take a tougher stance – in fact, a task force was dispatched to the Falklands for the duration of the election; en route, they forcibly evicted the Argentine presence from Thule. Argentina bristled, but did nothing – the junta knew they would score more points that way, and might eventually face a government more amenable to continuing their negotiations regarding the Falklands. It turned out that the Shackleton Incident would teach the Process all the wrong lessons…

    The UK general election was decided just weeks before the 1978 FIFA World Cup, to be held in Argentina, was due to commence. A victory by the Argentine soccer team would be a phenomenal propaganda coup for the junta – but it would be hard-won. The UK had two chances to avenge the Shackleton Incident on the soccer field: both England and Scotland were among the ten UEFA teams which had qualified to compete in the World Cup. [6] In the first series of round-robin play, Argentina won the most points of the four countries in her group, advancing to the second round along with Italy (France and the Soviet Union were knocked out). England advanced along with Brazil from another group, as did Scotland and Chile – Portugal and West Germany were the last two countries to reach the second round. It was the first time that Scotland had managed to advance to the second round of the World Cup – all of Great Britain from Land’s End to John O’Groats was abuzz with the possibility of a final between the two halves of the union. [7] Any such match would have to wait until then, however; England and Scotland found themselves in separate groups for the second round-robin. England faced Italy, West Germany, and Portugal, and in what many observers deemed an upset, managed to finish first in that group, over Italy, which came in second and thus moved on to compete for third place. Scotland, on the other hand, finished dead last in her group, which included Argentina, Brazil, and Chile. [8] Argentina came in first, thus earning the right to compete with England in the finals, and Brazil finished second, losing only to Argentina. Both countries defeated Chile and Scotland, and Chile also defeated Scotland, but many Scots were still pleased at their country’s best-ever showing to date. The defeat to Argentina in particular stung, however, given the Shackleton Incident. It was about to get far worse.

    Italy rebounded from their failure to finish first in their round-robin group by narrowly defeating Brazil (2-1) for a third-place overall finish. The following day, on June 25, 1978, Argentina defeated England, 1-0, winning the World Cup on home turf. Considerable unrest was reported throughout England (the less charitable might instead be inclined to describe the events as riots) and the Process was not particularly gracious in their victory – however, the popular report of an official claiming to have “sunk your football team like we sunk the Shackleton” was likely apocryphal, believed to have originated (without verifiable source) in the July 1, 1978, issue of News of the World. Apocryphal or not, the report inflamed the English populace – and it was telling that even the Scots (who, along with the Welsh and the Northern Irish, made it their policy to cheer whichever team was playing England) were noticeably peeved. They weren’t the only ones – Chile and Brazil, both of which had also been defeated by Argentina late in the tournament, held their own independent grudges, which were exacerbated by the events of the World Cup. These four teams – or rather, the three states which supported them – would find additional, and unexpected, common ground against Argentina before the year was out.

    Argentina had been seeking to resolve their longstanding territorial disputes with both the United Kingdom and Chile for some time – in fact, it had been one of the few constants in foreign policy through the tumultuous regime changes of the 1970s. However, how each successive Argentine government chose to approach their territorial disputes varied widely. As far as the Falklands were concerned, covert negotiations had been ongoing for much longer than the British might be willing to admit. The Process decided to change tack from this conciliatory approach and adopt a more aggressive posture, culminating in the illegal naval base and, in a bridge too far, the Shackleton Incident. But before the situation had escalated, and as a gesture of good-faith negotiations, Argentina had invited Her Britannic Majesty to moderate an arbitration process over the Beagle Islands, disputed with Chile. This began in 1971, and lasted for six years. In 1977, the arbitration court in The Hague ruled in favour of Chile – the Queen was permitted under the terms of the agreement to accept this ruling and present it to Argentina, or to reject it. She did the former, no doubt with the encouragement of her government, despite the fact that doing so could disrupt their own negotiations, and indeed it would. As for the dispute with Chile: having failed to get what they wanted from Britain, those in the inner circle of the Process gradually convinced themselves that a swift, decisive military action would give them the leverage they needed to take by force what diplomatic channels had so plainly failed to secure – the islands in the Beagle Channel.

    Argentina chose to launch an invasion of the Beagle Islands in late December of 1978 – just in time for Christmas – as part of what they called Operación Soberanía (or, in English, Operation Sovereignty). The days were long so far south of the Equator, and the particular night of their planned assault, December 22 – the summer solstice – was clear and pleasant. [9] The Papacy, which had always wielded considerable influence in the Southern Cone, offered to mediate the dispute, but Innocent XIV’s ambassador was rebuffed by Argentina; the time for talk had passed. [10] During his Christmas mass a few days later, Pope Innocent would make a point of praying for the souls of the Argentines and Chileans whose blood would be shed in the war. Although Argentina had seized the Beagle Islands without difficulty, the conflict would not long stay confined to the archipelago for which it had been named. Chile declared war against Argentina immediately after the Beagles had been captured, and their forces went on high alert, amassing along the long Andean border between the two countries. The Argentines, naturally, had already mobilized their troops and many of the Chilean brigades found themselves face-to-face with their opposite number by the time they took up their positions. In yet another Christmas miracle, the border managed to remain quiet through the 25th of December. The following morning, however, the peace could no longer hold. One side fired, the other responded, and battles broke out all along the border, at every pass and in every valley. Before the year was out, Argentines in the border city of Mendoza faced a rude surprise when they were bombed by the Chilean Air Force. The war, clearly, was spiralling far beyond the contained conflict originally envisioned by the junta.
Fortunately (or, perhaps, unfortunately) for them, the bombing had incited the populace to seek vengeance against the Chileans (never mind that Argentina had been planning to do much the same thing to the Chilean capital of Santiago) far more effectively than propaganda alone could ever have done. There would be no quick and easy end to the war.

    By this time, the Chilean ambassadors to the Organization of American States (OAS), and to the United Nations, vehemently protested Argentina’s unprovoked and unlawful invasion, and demanded immediate action on the part of the international community. The OAS reacted more quickly than the notoriously bureaucratic UN, with the majority of its member states voting to suspend Argentina from the organization. [11] Uruguay, Paraguay, Bolivia, and Peru were among those who opposed, but their opposition wasn’t enough to keep Argentina from being exiled into a club whose only other member was Cuba, suspended since 1962. However friendly the two countries had been just a few years before, there was certainly no love lost between them anymore. The United Nations Security Council (UNSC), on the other hand, took a more gradual, incremental approach – on December 23, the day after the Argentine invasion began, the UNSC passed a resolution, 14-1 (Bolivia was the lone NAY) condemning Argentina and urging both sides to withdraw to their positions of December 21, and seek a peaceful solution to their territorial dispute. This was standard operating procedure for the Security Council whenever a war erupted anywhere in the world – neither side paid it much heed. What would be far more important was how the war would continue to unfold on the ground, and how the UNSC would respond. The possibility of the international community intervening militarily on behalf of Chile arose almost immediately, even over the Christmas holidays. US President Ronald Reagan liked the idea the more he thought about it; although Argentina was more powerful than Chile and would probably win out eventually in a one-on-one contest, it would be a Pyrrhic victory at best: long, hard-fought, and very costly to both sides.
Neither Soviet Russia nor Red China would intervene, since the war was being fought between two anti-Communist countries – in fact, both were sympathetic to Chile, Argentina having seen a left-leaning, legitimately-elected, and more Communist-friendly government overthrown by a repressive, totalitarian military junta (one which was, ironically, far more similar to those Communist states in character, if not in ideals). Chile, meanwhile, was the ideal “victim”, being one of Latin America’s few longstanding democracies. [12] Apart from the few countries which were aligned with Argentina or in a position to gain from Chilean weakness, most governments in that region backed Chile wholeheartedly – the horror stories of the “Dirty War” had spread far beyond Argentine borders. If the US were to intervene in Argentina, they would do so with widespread support – and, more importantly, without widespread opposition.

    Reagan had ideological reasons to support US involvement in Argentina. The United States had not decisively won a large-scale military conflict since World War II. Korea had been a hard-fought stalemate, and the less said about the overseas quagmire, the better. The morale of American troops would be boosted immensely with an unambiguous victory. As would the morale of the American people – who were growing disillusioned with Reagan after his policies had contributed to the second recession to plague the United States in less than a decade. It was only because of their overwhelming majorities in both Houses of Congress that the Republicans were able to weather the losses they had suffered in the midterm elections of 1978. A “short, sharp shock” to Argentina would, in Reagan’s estimation, be just what he needed to get America back on the right track. It would also vindicate his plans for maintaining a military presence in Iran. And if he did still lose in 1980, at least Argentina would give his legacy something positive to show for it. International disdain for the Process in Argentina and empathy for Chile gave Reagan the opportunity he needed to build a coalition of participants in his planned intervention. After the Christmas holidays had ended, he set out to work seeking partners, and it didn’t take him very long to find them.

    Perhaps surprisingly, Canada responded very well to his initial overtures. This was largely because the planned intervention was to focus on sea and air superiority to complement the Chilean ground presence – possibly Brazil could be persuaded to open up a second front to weaken the Argentine Army further – with bombing, shore bombardment, and amphibious raids being central to any second phase of a military intervention. Large-scale landings of troops, as had been done in previous foreign entanglements, were out of the question – small contingents might be sent to shore up Chilean defences, but the US Army was stretched rather thin as it was, and Reagan knew that boots on the ground would be trouble for his administration, especially in the long run. This approach was right up Prime Minister Stanfield’s alley: Canada lacked the troops necessary to make a meaningful contribution to a ground or amphibious invasion force, but it did have an aircraft carrier – one which cost no small amount to maintain, and which had yet to prove itself. In fact, many in the Official Opposition Liberal caucus had decried HMCS Eagle (CV 23), purchased from the UK in 1973, as a “white elephant that floats” (or, more to the point, a “white whale”, given Stanfield’s passion for flattops) and “not worth buying, not worth keeping”. [13] Being part of a naval strike force in Argentina would give the Eagle a chance to prove herself, and would likely secure her long-term future under the Canadian flag. It would also justify Stanfield’s plans to equip her with a new air wing, to replace the aging F-4 Phantom II air superiority fighters flying off her deck at the time – fortunately, those planes would be good enough to engage the Argentine air forces.

    Even more surprising than Canada’s willingness to participate was that of France, but President François Mitterrand was – ironically enough – more bellicose in some ways than his old rival, Charles de Gaulle, had been. He was also friendlier with NATO, though he took no steps to have France formally rejoin the organization’s command structure. Mitterrand, being a socialist himself, had been displeased at Argentina’s left-leaning government being deposed by the junta. France had two aircraft carriers, and could easily spare one to contribute to an international strike force. Having France involved in addition to Reagan’s planned coalition of the core Anglosphere countries would add legitimacy to the endeavour, and create a united front amongst the three Great Power Western Democracies. This would, of course, necessitate the involvement of the United Kingdom. Prior to the Shackleton Incident, this would not have been likely, despite the “special relationship” between the US and the UK. However, with Anglo-Argentine relations sinking as surely as the Shackleton herself had done, and with Prime Minister Whitelaw promising to defend British sovereignty against any further Argentine provocation, there was little doubt that the UK, too, would be willing to participate. Australia, which had joined the US in the overseas quagmire, also promised to commit ships to a naval intervention – in fact, the timing couldn’t be better, as the USS Enterprise (CVN-65) was docked in Perth at the time of the Argentine invasion, and would be able to accompany HMAS Melbourne (R21), the only remaining Australian carrier, to join any planned strike force from the opposite direction, and shore up Chilean sea and air defences from the Pacific.

    The formation of such a grand coalition of nations was contingent on the UN sanctioning their involvement, which entailed ensuring that neither the USSR nor the PRC would veto any UNSC resolutions to that effect. The USSR, surprisingly, was not only willing to look the other way but actively supported outside involvement in bringing a swift end to the war – Argentina exported a great deal of food to the Soviet Union which was now being diverted to war rations. December and January may have been summer months in Argentina and Chile, but in the northern hemisphere they ushered in the notorious Russian winter that had felled both Hitler and Napoleon in years past. In peacetime, however, the winter’s only victims were the Soviet people themselves, who were forced to rely on imports to feed themselves. What surpluses there were would not last long, and if Argentina and Chile were left to their own devices, the war would likely continue into 1980. Therefore, Russia worked to expedite UNSC resolutions which would lead to military intervention. It was clear that they would not participate themselves – they had plans for an invasion of Afghanistan brewing, and in their mind it was better to leave playing hero to the capitalists. Naturally, they didn’t expect to receive nothing in return for their generosity, and Brezhnev negotiated a secret deal with Reagan that would allow the USSR a free hand in Afghanistan (which also quietly acknowledged Iran as being within the American sphere of influence) in exchange for their support. The media would not learn about this treaty until many years later, though it also explained Reagan allowing American athletes to participate in the 1980 Olympics – hence, “Only Reagan could go to Moscow”.

    This left the People’s Republic of China, which had held China’s Security Council seat since 1971, as the only power which was in any position to veto military intervention. Red China had poor relations with Soviet Russia (preventing the Kremlin from being able to coax Peking into authorizing the use of force) and non-existent relations with the US, which still did not recognize the PRC as the legitimate government of China. On the other hand, that provided Reagan with the leverage he needed – he had something that China wanted. Not to mention that China in 1979 was in no position to condemn anyone for wanting to engage in military action against a foreign country. However, the time needed to arrange a meaningful gesture that would thaw relations between the two powers would probably be much greater than the time needed to organize an operation in Argentina. In addition, Reagan’s Secretary of State was busy negotiating with the foreign ministers of Canada, the UK, France, and Australia, along with several other countries, and could not be diverted from that task to devote his energies to an altogether more delicate affair. Therefore, Peking (reluctantly) agreed to postpone negotiations until after the conclusion of the military intervention, and to abstain on any resolution which would sanction it. With that, the last roadblock to action had finally been cleared away.

    Ultimately, Reagan decided to hold off on any decisive actions until after the 96th Congress had been sworn in on January 3, 1979, so as to avoid having his opponents accuse him of taking advantage of the more heavily Republican lame-duck 95th Congress. However, during this time, and while steps were taken at the UNSC to develop the knee-jerk condemnation of the Argentine invasion of Chile into the authorization of military force, the member states of the coalition worked to coordinate their movements and planned attack strategy. Argentine air power was estimated at approximately 150 fixed-wing aircraft, total – this included their entire air force as well as their naval air arm. The strike force which would intercept the Argentine forces sought a decisive numerical advantage, and ultimately five aircraft carriers were dispatched to the Argentine Sea. Two American supercarriers, the USS Ranger (CV-61) and the USS Constellation (CV-64), both berthed at US Naval Station Norfolk at the time, were dispatched to Argentina, along with their escorts. The two flattops could support over 150 planes between them – enough to match the Argentines. En route to the Azores, where they were to rendezvous with their European allies, they stopped at Bermuda, awaiting HMCS Eagle, which arrived from CFB Halifax a few days later. This would be the first belligerent operation she would undertake flying the Canadian flag; her crew could barely contain their excitement. The Eagle could fly 40 planes off her deck – a respectable contribution compared with what the British and the French (though certainly not the Americans) were flying.

    Meanwhile, the Eagle’s sister ship (both were Audacious-class carriers), HMS Ark Royal (R09), departed from HMNB Devonport in Plymouth – she had not been expected to leave that port when she had entered it in December of 1978, being due for decommissioning in early 1979. She had been hastily re-equipped with her air wing, which had been parted from her in November, for what the British government deemed to be a fine last hurrah – the Ark Royal, despite having spent the majority of her life in refit, was still more capable than HMS Hermes, the only other active UK flattop. (HMS Invincible would not be ready for active service until 1980, after the war had already ended.) The Ark Royal met one of France’s two carriers, the Clemenceau (R98), off the French naval base at Brest, and departed for the Azores. The two European flattops could carry some 80 aircraft between them – each on a par with the Eagle – though even all three combined could not match the air power of the American contribution to the strike force: some three-fifths of the 268 planes which were bound to contest the Argentines for air supremacy. The five carriers met at the Azores, which Portugal (a NATO ally) had allowed them to use as a staging point. [14] From there, they would proceed to the Falkland Islands – the nearest allied territory to Argentina. In fact, the Falklands were close enough to the Argentine coast that enemy planes would be able to strike there, making it crucial that the strike force be able to defend against that possibility.

    By mid-February, the strike force was in place. Just over two months after the commencement of Operación Soberanía, on February 26, 1979 – a Monday – UNSC Resolution 444, authorizing military action against Argentina for their blatant disregard of international law, and their refusal to observe any of the previous UNSC resolutions condemning its actions and recommending alternative solutions, was passed, 14-0. China did not participate in the vote. [15] Both Britain and France jointly authorized “use of force” against Argentina the following day; Canada and Australia followed the day after. US Congress waited until March 1, 1979 – and being the last of the five coalition nations to do so was enough to have the others playfully invoke the memory of both World Wars, which the USA had also joined late; Captain E.N. Anson of the Ark Royal, in acknowledging the official transfer of overall command of the strike force to Rear Admiral Patrick Drake aboard the Constellation, jokingly replied: “About time those Yanks showed up, we’ve been fighting since February!” The Admiral is apocryphally held to have said in response: “And now that we’re here, we can start winning.” That the task force was just outside Argentine waters, and had been assembled for over a month, wasn’t going to stave off any good-natured ribbing between allies, especially as a prelude to fighting their mutual enemy. The strike force itself was quickly given a place within the larger command structure that comprised the entirety of the coalition forces – codenamed Operation Tailwind. [16]

    As the entirety of the Argentine coast was on the Atlantic Ocean, the Commander of the US Atlantic Fleet, Admiral Rembrandt C. Robinson, was placed in operational command of Operation Tailwind. [17] Robinson, who had served in the US Navy since World War II, had little experience with flattops and was chosen primarily because of his amphibious warfare expertise, should the war have to come to that. Rear Admiral Drake, in command of the strike force, held the rank defined in the NATO code as OF-7. However, each carrier group was also commanded by a Rear Admiral, though both were junior to Drake in that their rank was defined in the NATO code as OF-6, despite wearing identical uniforms and being addressed in identical fashion. This would prove extremely confusing to the personnel belonging to their allied navies – especially with the added wrinkle that the commander of the French carrier group, Counter Admiral (contre-amiral) Jacques Antoine Choupin, also held the rank of OF-7, making him senior to every other officer in the strike force save for Drake, with whom he held an equal rank. [18] For comparison, the highest-ranking British officer in the strike force was Commodore J.F. “Sandy” Woodward, and the highest-ranking Canadian officer was Commodore Adrian Jackson, who had supervised the refit of the Eagle for Canadian service, and had been the first to command her under the Canadian flag, before being promoted to command the entire carrier group. Both Woodward and Jackson voiced concerns about the confusion of ranks on the American and French side of the equation. The French, who were no longer part of the NATO command structure and were participating in the operation for their own reasons, were free to ignore these concerns (which was exactly what they did) but this only fed into existing rumblings regarding the “doubling” of the Rear Admiral rank within the US military.

    The strike force remained in position off the Falkland Islands until all five countries that had contributed to it had authorized the use of force against Argentina, in order to give the Argentines one last chance to surrender without any loss of life or devastation brought on by coalition forces. The Argentine government was well aware of the sheer manpower that the strike force had brought to bear, but there was still considerable sentiment to continue the war. Chile had bombed the Argentine city of Mendoza in their opening counter-attack, and this quite effectively silenced vocal resistance to Operación Soberanía going forward. And, indeed, Argentina was doing very well – enjoying a definite (if small) advantage over Chile, though one which was not likely to see material gains as the positions of both armies became bogged down along their formidable Andean border. Some within the junta argued that the strike force was in actuality a blockade force – the Argentine coast was long enough, they reasoned, that only a force of several carriers could effectively block all access to it. However, Argentina still had access to the sea through Peru (via Bolivia), and both sides knew this. It was obvious that the coalition was biding their time, waiting for Argentina to make the first move; clearly, they had planned for a worst-case scenario, and the Process was ready to give it to them. When it became clear that no Argentine forces were forthcoming, Admiral Drake gave the order to advance the strike force toward Tierra del Fuego early in the morning of March 4, 1979 (a Sunday). They soon discovered that the cream of the Argentine Navy had been dispatched to meet them at a sweet spot where the ranges of several Patagonian airbases overlapped, providing the fleet with comprehensive air cover.
It soon became clear that the Argentine Air Force had pulled every plane they could away from the Chilean front lines, in hopes of mitigating the numerical superiority enjoyed by the coalition – in all, approximately 120 planes were mustered to engage the 268 carried by the strike force. In a toxic combination of hubris and desperation, the fleet which had been sent out by the Argentines was led by their capital ships – their only cruiser, the ARA General Belgrano, and their only flattop, the ARA Veinticinco de Mayo (built as HMS Venerable during World War II, before being sold to the Dutch as HNLMS Karel Doorman). The Battle of the Argentine Sea – the first and only battle between aircraft carriers since World War II – was about to commence.

    Shortly after sunrise, the Argentine planes were first spotted by an E-2 Hawkeye flying off the USS Ranger, prompting a scramble aboard all five carriers. The opening salvo of the battle – the first AIM-54 Phoenix missile ever to be fired in anger – was launched by an F-14 Tomcat about half an hour later, aimed at the approaching Argentine planes (which were still far enough away that they could not return fire). This was the first of a volley of Phoenix missiles which proved highly effective – and deadly. Though a few Argentine planes managed to come within attack range of the coalition air wing, by this time, the majority of their planes had already been lost, and by the time the battle was over, 101 of the estimated 120 Argentine Air Force planes had been shot down – one of the most devastatingly lopsided aerial battles of all time. The action would not go off without a hitch – the confusion over rank equivalency would be trumped by the differing units of measurement used by each member nation of the coalition strike force. France, which had invented the metric system nearly two centuries earlier, naturally favoured it; both the United Kingdom and Canada had officially transitioned to metric by this time, but Imperial units remained common in everyday use, including aboard their warships; the United States, meanwhile, stubbornly retained their customary measures – this had been an issue even during peacetime exercises, and was naturally a major problem in co-ordinating fleet actions against an actual enemy force.

    For the Argentine fleet, air cover was the only effective line of defence against the coalition strike planes, which would otherwise be able to attack without fear of reprisal – most of the fleet’s anti-aircraft guns could not reach the planes before they were close enough to fire their deadly payload of anti-ship missiles. Although the Argentines enjoyed an initial numerical advantage, this was quickly whittled down as ever more coalition planes launched from their respective carrier decks. It was during the early phases of the battle that the Argentines scored most of their (relatively few) kills against the coalition; every last one of the Argentine planes which attempted to venture closer to the seaborne strike force was successfully intercepted, though their weaponry would likely not have been effective at attacking ships at any rate. As the overwhelming air superiority enjoyed by the coalition became apparent, the Argentine fighters began to withdraw – many for legitimate reasons (out of ammunition, too heavily damaged to continue fighting, low fuel), though low morale and the clear futility of the struggle no doubt encouraged some pilots to withdraw when they had no proper excuse to do so. Those planes which managed to return to their coastal bases would not sortie from them again for the duration of the battle. The dogfighting had not lasted more than a couple of hours; it was still morning. After the coalition planes had landed, re-armed, and refuelled, they would be able to go on the offensive. It was at this point that the Argentine fleet was ordered to withdraw – which would prove too little, too late.

    The only Argentine planes remaining were those flying off the Veinticinco de Mayo, which were given the order to take out as many incoming planes as they could, in a last-ditch effort to defend the fleet; however, the numerical disadvantage they faced was so great that not a single coalition plane was lost in their assault – though every Argentine plane was. Among the ships to be damaged beyond repair by the coalition in that fateful battle were the carrier Veinticinco de Mayo (it having been said that the zeal to attack a bona fide flattop made it an irresistible target to each and every pilot), the cruiser General Belgrano (sunk by American planes flying off the USS Constellation) and the destroyer Almirante Storni, the same vessel which had attacked the RRS Shackleton two years before. The other destroyers in the Argentine fleet had either been successful in their escape or had surrendered. The two submarines brought in from the Pacific retreated when it became clear which way the battle in the air was headed, and were pursued in turn by the coalition’s nuclear submarines. The ARA Santiago del Estero was sunk by the British nuclear submarine HMS Churchill (S46) which, true to her namesake, managed to catch her through dogged, persistent hunting, attacking when she had to surface for air; this allowed the Churchill to strike before the Santiago del Estero had a chance to “get her pants on”. The other submarine, the ARA Santa Fe, was disabled by the USS Archerfish (SSN-678) off Tierra del Fuego. Neither made it back to the Pacific.

    Dozens of Argentine planes had been downed in the fateful battle, including all but one of those which had managed to take down a coalition plane – that one being flown by “El Suertudo”, or “the lucky one”, as Argentine Air Force Capitán Juan Manuel Lombardi would become known. It was without doubt a title he carried only in comparison to his fallen comrades – he was the only Argentine who could claim an unambiguously successful run in the operation; the same could not be said of anyone else, up to and including the commanding officer of the fleet, Vicealmirante Juan Lombardo, who was killed instantly when a missile exploded on contact with the hull of his flagship, the Veinticinco de Mayo. He was the highest-ranking casualty of the entire war. The junta, eager to salvage some measure of glory out of the unmitigated disaster that the Battle of the Argentine Sea had been, grounded both Lombardi and his plane (ostensibly for repairs) in hopes of using them as propaganda tools in the remaining stages of the war.

    The battle marked the combat debut of the F-14 Tomcats, only recently assigned aboard the decks of the Ranger and the Constellation, and these acquitted themselves most admirably, fleet-defence operations being exactly the type of combat for which the planes were designed. [19] However, they were not without a few tragic losses – two were lost to the Argentines: one downed in combat, the other damaged seriously enough that it had to be written off upon its return. By contrast, the Blackburn Buccaneer strike planes stationed aboard HMS Ark Royal were kept out of the dogfighting, and then out of the initial strike against the fleet (only for the battle to be over before a second strike could be launched), their age and inferior combat capabilities having rendered them a liability in aerial combat. Naturally, word of this soon made its way back to the UK, where the press were rather unforgiving in their assessment of the Royal Navy air arm: The Sun ran the headline “GROUNDED!” with a file photo of several Buccaneers on the deck of the Ark Royal. The subtitle succinctly read: “Yanks, Frogs, Canucks triumph, down 100 Argie planes – but Buccs stay on deck”. (News of the sinking of the Santiago del Estero by the Churchill ran the following day, but the damage was done.) That the strike planes on the Royal Navy’s flagship carrier were seen as outdated – even though the Ark Royal herself was due to be decommissioned and said planes had already been offloaded before being hastily returned to her deck – struck a powerful chord. It helped that the Ark Royal’s sister ship, HMCS Eagle, had been able to launch her Corsairs from the outset, and that they were able to help sink the Veinticinco de Mayo – many within the Senior Service still resented that the Eagle had been decommissioned over the Ark Royal – and would have been sold for scrap had the Canadians not come in and rescued her.
The Canadian tabloid the Toronto Sun, by contrast, ran “ATTA GIRL!” and a file photo of the Eagle as its front page on March 5, 1979 (along with the subtitle “Canuck flattop helps Coalition strike force to sink Argentine fleet”). The divergent fates of the two Audacious-class carriers could not have been illustrated more effectively; in fact, the two Sun newspapers were rumoured to have been displayed side-by-side behind closed doors at the 1979 Commonwealth Heads of Government Meeting in Zambia that August.

    In contrast to the extensive Argentine losses, the coalition lost just seven planes in the Battle of the Argentine Sea. Five of these were American – and amongst those aboard was the highest-ranking coalition officer to perish in the battle, LCDR Warren F. Novak, leader of a squadron of Tomcats. Canada and France lost just one plane apiece. The advantage for Britain in having half of their air wing confined to the deck of the Ark Royal was that it presented the Argentine fighters with fewer targets – they were the only nation fighting the battle to suffer no casualties at all. Many of the more “upmarket” British broadsheets chose to emphasize this; the Daily Telegraph ran the headline “ARGENTINE FLEET CRUSHED” with the subtitle “UK carrier group suffers no casualties”. The Buccaneers would indeed fly strike missions after the Battle of the Argentine Sea, enemy air power having been effectively neutralized – those planes which had survived the battle would be in no position to engage the coalition directly, given subsequent events that would emerge on other fronts. However, since Britain had played no part in the downing of over a hundred warplanes, they could not boast the singular achievement of an American pilot, Captain Ray Heiser, and his radar intercept officer, Commander Simon Johansson, flying aboard a Tomcat, who between them managed to take down nearly 5% of the downed Argentine planes all by themselves, scoring five kills and becoming the last pair to date to achieve the exalted status of ace-in-a-day. The battle produced two additional aces, both of whom were also American, and both of whom had served in the overseas quagmire and had racked up enough kills there that the battle simply brought them over the top (and in one case, just barely – the fifth kill was a helicopter gunship which flew off the Veinticinco de Mayo).
Their successes were impressive, though on the whole, Capitán Lombardi (despite managing just one kill) had arguably logged the more difficult achievement.

    First and foremost among these events was the entry of Brazil into the war on the following day, March 5, 1979. Argentine troops had been stationed along the Brazilian border (staring at the amassed Brazilian troops across the Uruguay River) for the duration of Operación Soberanía – troops who could have been fighting in Chile – but keeping them there proved one of the more sensible decisions made by the Process (unsurprising, given that it was led by Army generals, as opposed to Navy admirals). However, Brazil enjoyed an advantage in manpower and – thanks to the Battle of the Argentine Sea – overwhelming air superiority. The Battle of the Uruguay, the largest land battle of the war, would succeed in breaking the Argentine defensive line, badly damaging troop morale and popular enthusiasm for a war which had suddenly turned very ugly. The few Argentine planes which were able to engage the Brazilians at the Uruguay performed little better than they had at the Battle of the Argentine Sea the day before. As for the Navy, the Admiralty were able to convince the junta to have their remaining surface ships withdraw to port, where they would at least be far more defensible against coalition attacks. The USS Enterprise and HMAS Melbourne arrived shortly thereafter, and would provide air cover from the Pacific coast for the remainder of the war. Notably, her participation in this war saw the Melbourne fire shots in anger for the first time in her lengthy naval career.

    The coalition (which now included Brazil) and Chile enjoyed an air superiority ratio of greater than 10:1 over Argentina once the Enterprise and the Melbourne were in position to join the aerial bombing and shore bombardment activities that formed the core of the operation, devastating Patagonia. Though the region was sparsely populated, the coalition intended for the hopelessness of the situation to force changes in Buenos Aires – before their bombs did. Chilean troops, protected by additional air cover, went back on the offensive – what little Argentine air power remained moved to intercept them. American troops were due to arrive in the Pacific to shore up the Chilean position – in the Atlantic, amphibious assault ships were coming into position. A proper, full-blown amphibious invasion in the vein of Normandy was quickly deemed unnecessary, and was retained only as a contingency plan within the operation. Brazilian troops continued to march across the Uruguay and through enemy territory; it was only a matter of days – weeks, at the most – before neutral Paraguay and Bolivia were cut off from Argentine supply lines, which would finally create an effective blockade against the country. The situation was increasingly grim. Many within the Process now loudly argued what they had once only whispered: that Argentina had fought honourably and could surrender while still maintaining her dignity – before the bombers levelled Buenos Aires, and before hundreds of thousands – even millions – starved to death in the coming winter.

    In addition to the continuous loss of manpower, materiel, and territory, Argentina faced considerable economic hardships as a result of the war as well. The most devastating – and ironic – of these were the attacks on the country’s strategic petroleum reserves, including many of its refineries, which quickly became the principal non-military targets of coalition bombers. This would not only heavily cripple Argentine supply lines, but also the wider economy; Argentina was soon hit by her very own Oil Crisis. The financial situation was even more dire; in late 1978, the Central Bank of Argentina had issued new guidelines pertaining to mortgage and loan repayment schedules. As all loans and mortgages in Argentina had an adjustable rate, all interest payments would be tied to the value of the US dollar as it related to the peso. This was a risky move, as most people had assumed that the tablita would continue to function as normal, and many Argentines began to refinance their mortgages on these newly favourable terms. However, once the Process needed to pay for the War, major balance-of-payment problems emerged, exacerbated by the corruption endemic to the system.

    The Process was also growing concerned with the peso itself; Argentines were making overseas loans in that currency (driving desperately-needed capital out of the country), and most of those loans were financing high-risk ventures, including several Ponzi schemes which had already collapsed. The month before the Battle of the Argentine Sea began, the junta decided to pay for the war by severely devaluing the peso. The tablita was finally shattered, throwing Argentina into major economic turmoil. Unable to pay off their mortgages with their dollar-linked interest payments, Argentines began declaring bankruptcy in droves as their payments rose ten-fold, while financial speculators benefited by being able to write off their debts. A major financial crisis had struck the country, quickly becoming a currency crisis as cash shortages soon became widespread. Adding to Argentina’s woes, the United States Federal Reserve, in a bid to stem the tide of stagflation, raised interest rates and hence limited the supply of dollars outside the US. As a result, the dreaded inflation, seemingly kept under control for so long, came roaring back with a vengeance. Mass public protests against the Process, previously unimaginable, began multiplying throughout the country, demanding an end to both the war and their misery. Desertion became a huge problem for the army, as new conscripts grew more concerned about themselves and their families. The newest member of the inner circle, the war hero and propaganda symbol Capitán Lombardi, was surprisingly among a growing faction who agitated for surrender – or for installing someone who would surrender. To borrow from the popular arcade parlance of the day, it was “game over” when the American reinforcements arrived. US Army troop transports arrived at the port of Antofagasta, in Northern Chile, allowing Chilean troops to prepare for an offensive which would, if all went well, link them up with Brazilian troops, who were marching westward.
Meanwhile, the attack force – with Inchon and Tarawa-class amphibious support “baby carriers” forming its backbone, and including Admiral Robinson’s command ship, the USS Mount Whitney (LCC-20) – arrived off the coast of Buenos Aires, floating just outside attack range. The original strike group was approaching from the southwest, their planes hitting every prime target along the way, while the Enterprise and the Melbourne, along with newly-arrived naval reinforcements which had accompanied the troop transports, continued to hunt down Argentine stragglers in the Pacific and provide air cover for the Chileans as they advanced across the Andes and through Argentine lines. The time of reckoning was nearly at hand. And that was when the conspirators sprang into action.

    The coup which ended five years of totalitarian rule by the Process in Argentina was swiftly executed, and the first thing that the new heads of government in Buenos Aires did once they had secured power was to contact the fleet sitting just outside attack range of their capital in order to request a cease-fire and negotiate a surrender. Admiral Rembrandt C. Robinson arrived in Buenos Aires as commander of the coalition forces without his task force having ever fired a shot. He was accompanied by most of the commanding officers of the coalition strike force which had done most of the heavy lifting in Operation Tailwind, including Rear Admiral Drake, Counter Admiral Choupin, Commodore Woodward, and Commodore Jackson – the latter three of whom were, after all, the highest-ranking officers dispatched by their respective nations. (The Melbourne, which remained in the Pacific, was not able to send an Australian officer in time to be present for the surrender negotiations.) The surrender went into effect on March 21, 1979 – the Argentine War had ended not three months after it had begun. Many of the coalition ships that had been sent to Argentina remained to participate in the humanitarian efforts which followed the armistice – Commodore Jackson, a native of the legendarily tidy city of Toronto, was quoted as saying “Well, we might as well stay and help clean up that mess we made”. Coalition losses (excluding Brazil and Chile) were limited to just fourteen planes – eight American, two French, two Canadian, one British, and one Australian. LCDR Novak of the USN was, as he had been at the Battle of the Argentine Sea, the highest-ranking casualty of these. (Likewise, the highest-ranking Argentine officer to die in battle remained Vicealmirante Juan Lombardo, whose death spared him from the fate shared by many of the other officers in the Process.)

    As had been the typical pattern in years past, the junta organized elections, to be held as soon as possible, and invited UN observers to monitor their fairness and accessibility. In many instances, rubble and debris had to be cleared from thoroughfares and polling stations before the voting could take place, but it would soon come to pass. In the meantime, the “interim” leaders of Argentina signed the Treaty of Montevideo in neighbouring – and neutral – Uruguay, accepting blame for the Argentine War and renouncing Argentine claims on the disputed islands in the Beagle Channel and (at British insistence) the Falkland Islands as well. Raúl Alfonsín, leader of the Radical Party and a vocal critic of the Process throughout its existence, was elected President and soon got to work prosecuting the leaders who had organized and carried out the Dirty War. Although he faced considerable resistance, Alfonsín forged ahead, buoyed by popular opinion which had become vehemently anti-military, and with the help of allies such as Capitán Lombardi (promoted to Mayor after the war). Even with the perpetrators facing justice, Argentina remained in a shambles; inflation was once again out of control and the economy was in desperate need of stimulation. Taking advantage of the need for drastic measures, Alfonsín decided to enact major constitutional reforms, strengthening the existing federal system in Argentina while changing the structure of the federal government from a pure presidential system to a semi-presidential system in the French vein, decentralizing power in hopes of breaking the vicious circle of coups. Perhaps the most intriguing of his ideas was to move the capital from Buenos Aires to Viedma, a small coastal city in Patagonia some 800 miles to the south (which had been bombarded by the coalition strike force, thus necessitating reconstruction, potentially as a planned city), thereby isolating the seat of political power from the centre of economic power. In addition, a new currency was introduced, the austral, one which was intended to be more stable and longer-lasting than the peso had been. These reforms, it was hoped, would put Argentina’s domestic situation back under control, allowing the government to regain the faith of the people.

    How the rest of the world perceived Argentina would change unexpectedly in the 1980s, and (for once) not as a direct result of government actions. It began with the 1980 publication of an English-language nonfiction book about the 1950s heyday of Peronism – retro nostalgia was alive and well even on the literary circuit. Given that the Justicialist Party had run in the 1979 elections but lost to the Radicals, the movement was considered dormant, and was romanticized as an almost mythical part of Argentine history. Eva Perón – who had died tragically young, having accomplished so much in her short life but with the potential to do so much more – was the focal point of Peronistas; she had died in her prime, as opposed to her husband, who had spent nearly two decades in exile. Among those who read Peronistas, which became an instant bestseller, was lyricist Tim Rice. He had written musicals about subject matter as disparate as Jesus Christ and Peter Pan, and he already had a passing acquaintance with Eva Perón through his youthful interest in philately. [20] Alongside his frequent collaborator, composer Andrew Lloyd Webber, Rice immediately got to work writing the book, music, and lyrics of what he was sure would be another smash-hit musical. The recent war, the impact it continued to have on the political discourse in the UK, and the seemingly endless series of printings for Peronistas were enough to attract backers to the notion of a musical set in Argentina. Nevertheless, it would not premiere on the West End stage until June 15, 1983, where it was an instant hit – with a Broadway opening planned for 1984. Reviewers praised the emphasis on authentic Latin rhythms in the score, including (of course) the tango.
The musical’s historical accuracy was suspect – the book on which it had unofficially been based was described by many opponents of Peronism as excessively sycophantic – but it had a strongly pro-democratic message, urging the audience to be wary of granting the military too much power or reverence – a fitting, Eisenhower-esque moral for a play set in the 1950s.

    However, newfound British interest in Argentine history would not totally alleviate tensions between the two countries. As soon as they were able, Alfonsín’s government revoked the prior renunciation of the Argentine claim on the Falklands, arguing that a regime which “did not represent the democratic will of the people of Argentina” had been “coerced” into making it in the first place – however, they took great pains to stress that any future territorial disputes would be resolved only through diplomatic means. The British government, unsurprisingly, was wary of this promise (as the Argentine and Chilean governments had negotiated for several years before the invasion), and sent a permanent naval detachment to patrol the British islands in the South Atlantic, even though there was no longer any substantial Argentine Navy or Air Force which could conceivably challenge British supremacy in the region. It was consistent with a new, emerging defence policy, which would be delineated in the celebrated White Paper of 1980…

    ---

    [1] There are many cumulative changes here. First of all, Onganía remained in power only until 1970 IOTL, with a more traditional transitory junta replacing him after he was deposed until elections were held three years later. As ITTL, the Justicialist Party won the Congress and Cámpora won the Presidential election IOTL, though both just missed out on the majorities they secured here (which you can attribute to greater fatigue with autocracy ITTL). And, of course, Juan Perón survived long enough to return to Argentina and win the second Presidential election of 1973 in a landslide – only to die in office a year later, replaced by his (third) wife, Isabel.

    [2] This coup takes place in 1974 ITTL, as opposed to 1976, as IOTL.

    [3] Translated from the Spanish as: “They ought to have done something (to deserve it)”.

    [4] The Almirante Storni attacked the Shackleton IOTL, in 1976 – their shots missed, though apparently they were indeed meant to hit. ITTL, their aim is true.

    [5] The illegal naval base, on the island of Thule, was allowed to continue operating until the Falklands War in 1982 (or rather, until the British fleet relieved the Argentine occupation of the islands, and the naval base – which, sure enough, was used as a staging point – was destroyed).

    [6] The others were France, West Germany, Italy, the Netherlands, Spain, Portugal, Sweden, and the Soviet Union. IOTL, England did not make it to the 1978 World Cup, and neither did Portugal or the Soviet Union (Austria, Poland, and Hungary did instead).

    [7] Yes, I’m aware that Wales has its own soccer team, and that it failed to qualify for the World Cup. However, I’m sure they’re swept up enough in the excitement of England and Scotland getting in that they ignore this – the stereotype is that any British nation which is not England will always root for the non-English team, so it’s probably safe to assume that the Welsh will support their fellow Celts on the Scottish team.

    [8] ITTL, Chile qualifies for the World Cup in lieu of Bolivia.

    [9] IOTL, Operación Soberanía was aborted in part because a severe storm impeded Argentine operations – however, this event takes place more than thirteen years after our POD, and the butterfly effect is named for changes in weather being dependent on individual events.

    [10] Pope John Paul II would have far more success IOTL, after dispatching his personal envoy, Cardinal Antonio Samorè, to resolve the dispute before it escalated into war. However, Innocent XIV did not place Samorè in the same position ITTL, and this too contributed to the war breaking out.

    [11] Worth noting is that Canada was not a member of the OAS at this time (ITTL or IOTL), and was thus unable to condemn Argentina’s actions in a formal diplomatic body other than the UN, and this helps to bring them onside to the notion of “official” action.

    [12] This is because the results of the 1970 Presidential election are butterflied (since Humphrey’s CIA is more effective in influencing the results than Nixon’s CIA was IOTL), and a pro-US candidate was elected. This butterflied the rise of the Chilean junta in general and of Augusto Pinochet in particular.

    [13] The Liberal Party, while in power, had decommissioned Canada’s last aircraft carrier, HMCS Bonaventure, in 1970, and had not replaced it with anything. IOTL, to this day, Canada has never had another aircraft carrier, or even utility vessels which can function as such (destroyers can only support one or two helicopters apiece).

    [14] As a symbolic gesture of the ancient Anglo-Portuguese Alliance, in force since 1373, Portugal first extended use of the Azores to the United Kingdom alone – and then to the United States, Canada, and France (along with the rest of NATO) about an hour later. This is understood to be Portugal’s principal contribution to the Argentine War, although they do provide humanitarian aid and supplies after the war’s conclusion.

    [15] The number four is considered bad luck in several East Asian languages, including Chinese, because it is a homophone for the word “death”. I’m not saying that Red China skipped the vote just for that reason, but let’s just say it didn’t hurt matters any. (You will note that, IOTL, the PRC skipped the vote on Resolution 444 as well.)

    [16] Operation Tailwind was a term used IOTL during the overseas quagmire in 1970 – it remains available for use ITTL by the coalition to refer to the assault on Argentina.

    [17] Rear Admiral Rembrandt C. Robinson died IOTL on May 8, 1972, when his helicopter crashed in the Gulf of Tonkin. He was the only flag officer of the US Navy to lose his life in the overseas quagmire. ITTL, said quagmire had been over for two years by that point, and Robinson – who had spent his entire career up to that point serving in the Pacific – was transferred to the Atlantic Fleet shortly thereafter.

    [18] The French naval rank of chef de division, equivalent to OF-6, was phased out of the Marine Nationale and no longer exists – one is promoted straight from capitaine de vaisseau (OF-5) to contre-amiral (OF-7).

    [19] Tomcats did not fly off the Ranger or the Constellation IOTL until some time later, though they were active on other American carriers – butterflies account for their deployment to these carriers ITTL, which in turn accounts for them being the ones to be sent to Argentina as part of the strike force.

    [20] Andrew Lloyd Webber and Tim Rice had been planning to work on a Peter Pan musical in 1972 IOTL, but eventually abandoned these plans. At about this time, Rice was listening to a radio show about the life of Eva Perón, which inspired him. Lloyd Webber, however, declined at first in order to work on the (flop) musical Jeeves, but afterwards the two did work on Evita together – the album version was released in 1976, and the West End musical (sometimes described as an opera – there is no spoken dialogue, unlike the version ITTL) in 1978. (Broadway beckoned a year later – and then of course the film adaptation finally arrived in the 1990s.)

    ---

    Well, well, where to start? First of all, I hope you all enjoyed the longest post I’ve yet written for That Wacky Redhead, at 11,616 words. It’s also far and away the most War-And-Politics-heavy update I’ve ever written, which made for a nice challenge, in all the good ways that writing should be. Fun fact: “Argentine” is the correct demonym for Argentina; “Argentinian” is a neologism, and one which I like to think I’ve butterflied ITTL (or at least mitigated the use thereof, if it was already on the rise).

    This update would not have been possible without the following consultants (in alphabetical order): CalBear, Dan1988, e of pi, juanml82, and Thande. Thanks again to TheMann for so graciously allowing me to take HMCS Eagle, originally featured in his timeline Canadian Power, on her own adventure unique to TWR! Extra special thanks must also go to Dan1988 and e of pi for assisting with the writing and editing of the update! I promise that this update will be relevant for the next one, so stay tuned!
     
    Appendix B, Part IX: The Fight-or-Flight Maneuver
  • Appendix B, Part IX: The Fight-or-Flight Maneuver

    Laws, like sausages, cease to inspire respect in proportion as we know how they are made.

    – Popularly (and apocryphally) attributed to Otto von Bismarck, Chancellor of the German Empire

    The aftermath of the Argentine War affected each of the allied combatants differently; all of them sought to make political capital from their triumphs in the conflict, with varying success.

    In Canada, once the humanitarian initiatives which followed the war had concluded, HMCS Eagle (CV 23) returned to Canadian Forces Base (CFB) Halifax, where she was given a hero’s welcome. Commodore Adrian Jackson was personally received at the base by none other than Prime Minister Robert Stanfield; Jackson held the distinction of being Canada’s first flag officer to see combat in a war zone since Korea, more than a quarter-century earlier. He and the officers and crew who served aboard and flew off the Eagle and her escorts were lavished with honours, including the Argentina Medal, created to recognize those who had served during the campaign.

    Although the Corsairs and the Phantoms which had flown off the Eagle in Argentina were not nearly as antiquated as the Buccaneers which had remained aboard the Ark Royal, the Canadian government had already initiated its New Fighter Aircraft (NFA) Program, which sought to replace them, even before the war broke out. The program sought planes with carrier capacity as well as land-based functions for the Royal Canadian Air Force; many candidates were rapidly eliminated until only three remained, and these were formally shortlisted in 1978: the Grumman F-14 Tomcat, the McDonnell-Douglas F/A-18 Hornet, and the Dassault-Breguet Super Étendard. [1] The French Étendard, the only non-American plane in contention, was privately considered the laggard of the three, but was kept in consideration primarily as a bargaining chip, and as a means to avoid the appearance of Canadian over-reliance on American military technology. Though the Tomcat was considered superior to the Hornet by most metrics, the Hornet was a solid performer overall, and much more affordable. A final decision was due in 1980, but the Argentine War intervened. During that conflict, the F-14 Tomcat famously distinguished itself in the Battle of the Argentine Sea, making the acquisition of those planes a very attractive prospect in the eyes of the general public. However, in the end, the estimated price savings of approximately $1 billion tipped the scales in favour of the Hornet – though the difference remained budgeted for Defence, it would be funnelled towards naval flexibility as opposed to aviation.

    The Argentine War had taught other lessons besides demonstrating the impressive combat ability of the Tomcat. The ongoing aftermath of the war, and the humanitarian efforts taking place in the Southern Cone, had allowed US amphibious platforms to demonstrate their versatility and utility in war and in peace – letting them outshine their larger, more warlike fleet carrier brethren in some respects. It was something for the RCN to consider alongside their overriding need to shore up one of their traditional strengths: anti-submarine warfare. The Eagle was to be stocked just about bow-to-stern with Hornets, leaving little room for Sea Kings – the few helicopters aboard were transports, with anti-submarine warfare relegated to a secondary role. Therefore, a supplementary platform – smaller and less offensively oriented – was sought to provide the helicopter support necessary to bolster the flexibility of the fleet. [2] Meanwhile, across the Pacific, Australia sought to retire HMAS Melbourne, and had also learned many of the same lessons from the Argentine War. For many years, Australia and the UK had been in negotiation to transfer ownership of a British carrier. HMS Hermes had been on the table since the 1960s, and almost immediately once Britain began building Invincible-class carriers, one had been on offer to Australia. [3] But those in power in the Land Down Under did not restrict themselves to the mother country in seeking vendors: Italy and Spain, both of which were building V/STOL carriers of their own, offered to build and sell an additional ship to Australia; Spain had already sold a carrier to neighbouring Portugal under similar terms, which was due to begin construction in 1982. [4] The United States, obviously, had no shortage of “baby carriers”, some of which were still under construction at the dawn of the 1980s, and was more than willing to make a deal with Australia, one of its closest allies.

    Canada and Australia, each finding the other in a similar position, agreed to pool their interests, and their buying strength. They had similar needs and would seek out a vendor who would be able to meet them. Neither Spain nor even Italy had the capability to build two new carriers in any reasonable amount of time, eliminating them as contenders. Britain, obviously, was the first port of call for the two Commonwealth Realms, and Whitelaw generously offered to build two new Invincible-class light carriers, one for each of them, on top of the three that were already under construction. Both Canada and Australia agreed to consider the offer, but would not commit until they had fielded other offers, primarily from the United States.

    The USA, naturally, had just the sort of ship that both of them needed, given the sheer size and variety of the American fleet. The US Navy had just received the last Tarawa-class amphibious assault carrier from the contractor, leaving them with the ability and the incentive to start building more of the same right away. Although the Invincible-class was superior as an aircraft carrier, the Tarawa-class had unsurpassed amphibious capabilities – indeed, it had shone like few others in the aftermath of the Argentine War. The ships were also considerably less expensive than those of the Invincible-class, and (in perhaps the deciding factor for Canada) the beam was just narrow enough to traverse the Panama Canal – a crucial advantage for a multi-coastal country such as Canada, as it would take an Invincible weeks longer than a Tarawa to reach Vancouver from Halifax, especially since rounding the Horn entailed passing by a recent war zone. Therefore, after some further negotiations, both Canada and Australia agreed to buy modified Tarawa-class amphibious assault carriers. The US Navy offered them an additional discount if they were given further input into the planned design modifications, as a testbed for their own follow-up ships, and this offer was accepted. [5]

    Australia planned to retire the Melbourne as soon as their new carrier was complete (which was estimated for 1986). It was to be named HMAS Australia. Canada, meanwhile, obviously planned on retaining the Eagle as flagship and primary carrier, and therefore naming a smaller, less well-equipped vessel Canada would be a non-starter. Prime Minister Stanfield announced that the ship would instead be named for the recently-deceased Prime Minister John Diefenbaker, a fellow Tory and hardly the least controversial politician for whom the ship could have been named (especially given his notorious opposition to the lamented Canadian aviation project, the Avro Arrow). [6] But Stanfield was resolute, and the decision went ahead. The ship would receive the designation LHD 1 (for Landing Helicopter Dock) upon commissioning, as opposed to her Tarawa-class predecessors, designated LHA (for Landing Helicopter Assault) – a distinction which reflected the intended peacetime utility of the vessel.

    Although the 1980 Referendum concerning the Sovereignty of the Province of Quebec (as the federal government so cumbersomely described it) had resulted in a decisive defeat for those who favoured that option, and though the pro-separatist Parti Québecois had been soundly defeated at the provincial polls the following year, it was clear to everyone involved that the status quo could not endure. Pierre Trudeau, the only living former Prime Minister (who had served from 1968 to 1972, but had been a private citizen since 1974), had advocated strongly, both during and after the referendum, for “une belle concorde pour la belle province” – which he translated into English as “a better agreement for a better Canada”. [7] The Liberal Party to which Trudeau belonged had long advocated for patriation of the Canadian Constitution – all Canadian constitutional law up to that point had been debated and passed by the British Parliament at Westminster, a power which it still theoretically retained, though in practice did not exercise. [8] The idea had some allure with the governing Tories, given Diefenbaker’s passage of the Bill of Rights back in 1960. However, that had been a purely symbolic document, enumerating rights which could not effectively be enforced. Patriation of the Canadian constitution would likely hinge on the creation of a new bill of rights – each of the three great Western Democracies had one of their own, all centuries old. Technically, the British (or rather, the English) Bill of Rights, passed in 1689, also had jurisdiction in Canada (and all other Commonwealth Realms), but unlike the American and French Bills, it mostly conferred rights upon Parliament, as opposed to the people. [9]

    The PCs and the Liberals were both, in principle, prepared to explore the issue of constitutional reform. The NDP supported drafting a document which reflected “traditional Canadian social democratic values” – causing a stir among the more left-wing members of their caucus over the leadership’s cautious avoidance of the word “socialist”. The Socreds, aware of their Québecois nationalist base, pushed for strong provincial rights, which also attracted support from their traditional Western heartlands. It was a surprisingly serious, complex, and intellectual issue upon which to fight an election, but it was increasingly looking like the main event. Stanfield would defend his platform in the federal election which he called for the autumn of 1982, noting that economic indicators seemed to be improving. Fortunately for him, the Canadian economy had weathered the tumultuous 1970s quite well, which he naturally argued was due in large part to the drastic (and, accordingly, successful) measures he had taken upon forming government. Canada especially looked good next to the United States – but there were signs that this was changing, with the American economy starting to grow more quickly and unemployment dropping more dramatically in recent months. It would probably be some time before these improvements were widely perceived, but it was a huge warning sign against postponing the election until 1983, as was Stanfield’s right under the British North America Act (which mandated elections every five years, barring war or insurrection).

    All four of the same party leaders who had contested the 1978 federal election also planned to fight the 1982 contest – by which time Robert Stanfield had been Prime Minister for a decade, his Tories holding power for slightly longer than the nine years (1963-72) that the Liberals had previously formed the government. This allowed the Grits – along with the other two parties – to argue that it was “time for a change”. [10] John Turner was fifteen years younger than Stanfield, though he had served in federal politics for a longer duration. The problem facing Turner was that the natural bases of support for the Liberal Party had been heavily bombarded – their coalition of voters in Western Canada had defected not only to the Tories but also to the NDP; their stronghold of Quebec had been eroded by increased support for the Tories amongst Anglo-Quebecers and by continuing strength for the Créditistes among nationalist Québecois. The Liberals were becoming nearly every voter’s second choice – not an advantageous position in a first-past-the-post electoral system. Perennial speculation that their party might go the way of their British counterpart – a fate which had continually been postponed thanks to strong leadership and fortuitous circumstances – once again became a prime topic of discussion on Parliament Hill. Lorne Nystrom, the leader of the NDP (simultaneously analogous to the British Labour Party and the turn-of-the-century Progressive movement which had swept North America), was a rural agrarian much in the mould of the party’s founder and spiritual leader, Tommy Douglas, and he campaigned in a left-populist vein of the kind which always struck a chord in recessionary times. And then there were the Socreds.
Their traditional base in Western Canada had been virtually eliminated after the Quebec faction emerged victorious in a long and bloody schism through the 1960s – but their staunchly regionalist tack (avoiding outright separatism, though many within the party were sympathetic to the notion) began to resonate with Western voters, many of whom were feeling increasingly taken for granted by the PCs. André-Gilles Fortin cannily exploited this – most of his rhetoric emphasized the regional identities of Canadians in general, as opposed to merely Quebecers in particular.

    The Socreds found an unexpected ally in Jack Horner, an Alberta PC MP on the party’s right wing, who had long disliked Robert Stanfield (a Red Tory if ever there was one). In the aftermath of the referendum, Horner – displeased with Stanfield’s disregard for regional distinctions – crossed the floor and joined the Socreds, giving them their first MP from English Canada since 1968. In his riding of Crowfoot, the Socreds had indeed come in second place in the last election – with barely more than ten percent of the vote, against the three-quarter share he had won as a PC. It would be an uphill climb for Horner in seeking re-election, but he was richly rewarded by Fortin, who (in part due to pressure from party backers outside Quebec) sought to attract new voters outside of la belle province. Fortin appointed Horner the party’s new Deputy Leader, eager to build support in Alberta and British Columbia – the historical core of Socred support, where the party had formed government on a provincial level (and where it remained incumbent in British Columbia). In Western Canada, support for the Socreds rose in the polls to levels not seen since the 1960s. [11] It appeared likely that the Socreds could become a truly national party for the first time in a generation. The Tories ignored the threat; Jack Horner, after all, had always been a cowboy who had never gotten along with Stanfield, and he had been passed over for promotion to the front benches in favour of other Albertan MPs like Joe Clark and Don Mazankowski. Their cavalier attitude would come back to haunt them.

    Naturally, the rumbling discontent in the House of Commons was matched by Canada’s ten provincial premiers, all of whom agitated for increased provincial rights – any concessions which Stanfield’s government might be prepared to grant to Quebec should logically be extended to all the other provinces. In addition, the Western provinces wanted greater representation in Canada’s upper house, the Senate – whose entire membership (though apportioned amongst the various provinces according to an arbitrary legal formula) was appointed by the vice-regal Governor-General on the “advice” of the Prime Minister. Senate reform was a long-standing demand which had never met with much success (some agitators went a step further and called for outright abolition), in large part because no one could agree on how the Senate should be reformed: an elected Senate was popular, but the method of election was hotly debated, and while those in Western Canada favoured each province having an equal number of Senators (along American lines), the leaders of the provinces which stood to lose the most from this arrangement (Ontario, Quebec, and the Atlantic provinces) naturally opposed it. [12] Each “region” of Canada returned 24 Senators: Ontario, Quebec, the three Maritime provinces, and the four Western provinces; sparsely-inhabited, impoverished Newfoundland was granted six upon joining Confederation in 1949, giving it as many as populous British Columbia and Alberta each had. In an obvious patch job, the three territories (including newly-admitted Turks and Caicos) were each granted one Senator in 1981, bringing the total number to 105. [13]

    In the end, the PCs were narrowly returned with a third consecutive majority – the first for the Tories since 1887 – despite barely maintaining 40% of the popular vote. The Liberals gained seats mostly at the expense of Social Credit – who bled support in Quebec even as they recovered in the West, the exact opposite of the situation in the 1960s. The NDP made considerable gains at both PC and Liberal expense, achieving their best-ever result. In all, 1982 was largely a wash compared to 1978, but the results were clearly interpreted by Stanfield as a note payable to the electorate: they wanted reform, but they trusted his steady – nay, progressive conservative – hand to deliver it to them. The Tories lost a mere two net seats from the previous election, with 147 MPs returned to Parliament. The Opposition Liberals won 78 MPs. The NDP caucus increased to 44 MPs, and the Social Credit Party collapsed, reduced to just 13 MPs – eleven in Quebec and two in Alberta, the re-elected Horner and Gordon Kesler. [14] Although Clark and Mazankowski survived the Socred resurgence in rural Alberta, both came perilously close to defeat; the Social Credit Party actually came in second in the popular vote throughout the province. However, the only seat in Alberta the Tories lost was to the NDP, who won an urban Edmonton riding.

    HMCS Diefenbaker would carry that controversial name into service, despite some rumblings on the campaign trail that she be given a more “suitable” name – the Liberals naturally favoured their own former PM, Laurier (who was just as polarizing in his day as Diefenbaker would later be). The NDP, meanwhile, were torn – J.S. Woodsworth, the father of Canadian socialism, had also been a staunch pacifist – the only Canadian MP to vote against declaring war on Germany in September of 1939. He would have frowned upon any warship – even one with humanitarian capabilities – being named for him. The NDP’s other candidate, “Father of Medicare” Tommy Douglas, was still alive and deflected any suggestions that the ship be named for him; he had never been Prime Minister, nor even come close. The Socreds were divided along linguistic lines – many English-speaking Socreds were accepting of Diefenbaker, but Francophone Créditistes preferred various French-Canadian nationalists, such as Louis-Joseph Papineau or Henri Bourassa. The “consensus” choice – floated by the media and leading in many polls – was the first Prime Minister and Father of Confederation, Sir John A. Macdonald, who (like Diefenbaker) was a Tory. Though colourful (he was a notorious drunkard), his legacy was far less controversial than that of Diefenbaker. However, partisan jockeying prevented any of the other parties from championing him over their own pet candidates. But Canada was far from the only Argentine War combatant with a debate over aircraft carriers dominating the national conversation…

    HMS Ark Royal (R09) had been scheduled to be decommissioned, paid off, and scrapped in 1979, having retired to HMNB Devonport at the end of the previous year – before she was hastily re-equipped with her former air wing and re-manned with her former crew (including her last commanding officer) in order to be dispatched to Argentina. The Ark Royal was the only full-sized fleet carrier remaining in the Royal Navy, and had been since her sister ship, HMS Eagle, was decommissioned in 1972. The British government had no plans to replace the Ark Royal (which had spent over half of her career in refit), willing to rely on the three Invincible-class light carriers which were on order. However, when it became clear that even the conventional strike planes flying off the Ark Royal were not strong enough to enjoy a decisive advantage over those of a second-rate regional power, there was considerable doubt within the Admiralty and within Parliament as to whether the Royal Navy was in a position to defend British overseas territories and project power worldwide. If new planes were needed, then so too was a new fleet carrier. The Ark Royal, upon returning to Devonport in late 1979, was ordered into refit. The two light carriers in service, HMS Bulwark (R08) and HMS Hermes (R12), remained on active duty – initial plans to scrap (in the case of Bulwark) or sell (in the case of Hermes) the ships were postponed. In the year that HMS Invincible (R05) was commissioned, Parliament published the celebrated White Paper of 1980. [15]

    The white paper committed to constructing a full-sized fleet carrier to replace the Ark Royal, and to maintaining a fleet of three light carriers to support it. It also committed to new strike planes which would replace the aging Blackburn Buccaneers. The success of the nuclear-powered submarine HMS Churchill in Argentina (and of British nuclear subs in general), and of the increasingly nuclear-powered fleet of American supercarriers (though both of those which had served alongside the Ark Royal were conventionally powered), also opened the door to exploring the possibility of a nuclear carrier, easily the most controversial and polarizing proposal in the white paper. In the end, the next carrier would become an election issue, one of many to be fought at the ballot box in 1982.

    Michael Foot, the leader of the Labour Party during the 1978 election, was forced to resign from the position shortly thereafter. Many Labour frontbenchers threw their hats into the ring in hopes of replacing him, particularly his close colleagues on the left wing of the party. However, considering that Foot had failed to make great strides with the general population largely due to his own leftist views, many within the party felt that a more moderate candidate would be preferable. Likewise, a younger leader might be more appealing to younger voters. To that end, David Owen emerged victorious. [16] He was just over 40, and had distinguished himself as a junior minister during the twilight years of the Wilson government in the mid-1970s, and even more so in the Shadow Cabinet (serving in the key post of Shadow Foreign Secretary during the Argentine War). Although he had been part of the Europhile faction within Labour, he (unlike some of his allies, such as Roy Jenkins) was willing to abandon this ideology as it became increasingly clear that Britain would remain “with Europe, but not of it” for the foreseeable future. However, this change in direction alienated Labour’s far left – most notably Peter Shore, the Shadow Chancellor and himself a contender for the Labour leadership. Encouraged by the rabble, he and several other officials broke away from the Labour Party in 1980, forming the Democratic Socialist Party of the United Kingdom, popularly known as the DSP.

    The DSP appealed primarily to the socialist “true believers” within the Labour big tent, such as the entryist Militant tendency – many of whom deserted Labour to throw their weight behind the nascent DSP in order to influence its core principles and future direction. This process was accelerated when the Militant tendency was formally expelled (with the support of Owen) in 1981. [17] The DSP attracted attention disproportionate to what most polls showed was its likely share of the vote – largely because it was a party of the vocal minority. Still, psephologists predicted that it would be enough for them to play spoiler in many Labour marginals, allowing the Tories and even the Liberals to make gains at the Opposition’s expense. Owen turned out to be one of Labour’s biggest strengths – he was very well-liked among the electorate, his centre-left viewpoints proving very popular among swing voters who had walked away from Foot.

    As was the case in Canada, the Tories won a third consecutive majority in 1982, on a smaller share of the popular vote than previously – though they had a net loss of a few seats, they maintained a comfortable working majority. Labour continued to see only incremental gains from the previous election, and indeed did worse in their core regions even as they did better in swing regions and (surprisingly) areas of traditional Labour weakness, due to the loss of left-wing votes to the DSP – who, for their part, won four seats nationally, on a higher share of the vote than the Scotland-only SNP (who, thanks to the regionalist tendencies of the FPTP system, won more seats). The Liberals, on the other hand, more than doubled their representation at Westminster – winning many three- or four-way marginal seats by coming up the middle of the anti-Conservative vote. In all, the Tories (including the Ulster Unionists) won 347 seats, down just seven from 1978, though with barely over 41% of the vote; Labour won 208, on 28% of the vote; the Liberals won 46, on 17% of the vote – less than they had received in 1978; the DSP won four seats on nearly 10% of the vote. The Scottish National Party, on just 2% of the national vote (about 20% of the Scottish vote), won a dozen seats, their best-ever showing in Parliament – and all of their gains were at Labour’s expense. The DSP also gained a Scottish seat at Labour’s expense – their other three were all in England: one in London (Shore’s seat) and two in the North.

    It turned out that the United Kingdom and neighbouring France had much the same problem. Though France’s two carriers were neither as old nor in as poor repair as the Ark Royal, they were nearing the end of their operational lifespans, and President Mitterrand was growing increasingly anxious to determine their replacement. Whitelaw and Mitterrand, aware of their mutual conundrum, decided to examine possible alternatives jointly – which they announced amid much fanfare. They were inspired by the precedent of Canada and Australia having done much the same thing shortly beforehand, with great success. In private, however, each side was vocal, and less than cordial, in stressing what it had planned for its own carrier, and how incompatible these desires often were. Both were in agreement on the need for CATOBAR systems, but Britain wanted a larger carrier than France, whose preferred manufacturer was constrained by the size of its largest drydock. [18] France insisted on nuclear power, which in theory would not have been too difficult, as both sides constructed nuclear engines for their submarines. However, French nuclear fuel was far less enriched than British fuel, and therefore less efficient; this played havoc with early attempts to design an engine which would meet their needs.

    It was decided – indeed, this had been one of the reasons that the two had agreed to jointly pursue carrier construction in the first place – that each side would build one carrier, and that the other’s carrier would function as a “backup” where necessary, under the terms and conditions of the Entente Cordiale, which lent its name to the carrier class: Entente-class. In the spirit of the class name, each carrier was initially named after the principal architect of the Entente on each side: King Edward VII for the British carrier (which would be the second ship to carry the name, after a pre-dreadnought battleship commissioned in 1905) and Émile Loubet (after the French President of the time). Both names would ultimately be changed: the King Edward VII was renamed Ark Royal (with the blessing of Edward VII’s great-granddaughter, Elizabeth II) after a popular petition, started by veterans of the recently-decommissioned Ark Royal which had served in Argentina, gathered over 100,000 signatures. [19] By this time, the Loubet had already been renamed Richelieu, after the famous French cardinal and minister to Louis XIII (and a frequently-used name for French warships); it would be changed again to Charles de Gaulle, named for a man whose complicated relationship with the British exemplified the combative nature of the joint carrier project. However, the ever-changing names belied the relative constancy and consistency of the final carrier design shared between both members of the Entente-class, which were scheduled to begin construction in 1986. Each ship would displace 60,000 tons, measure 300 metres overall, and carry an air arm of approximately 50 planes.
[20] One aspect of the design which united both British and French politicians – and the press – was how the carriers were to be described: these were supercarriers, in the American vein (and, indeed, in terms of size and mass the ships were comparable to the conventionally-powered supercarriers of the US Navy, such as the Ranger and the Constellation). [21]

    The composition of the air arm for these supercarriers was another matter entirely. The planes could be chosen, and then built, far more quickly than the ships – in fact, the groundwork had been laid down in 1979, when the Eurofighter project was launched. The recent Argentine War had certainly been a concern in the minds of planners on all sides, but it gave Britain and France – usually at loggerheads when it came to defence policy – a common objective. France, angling to replace her aging carriers Clemenceau and Foch in the medium term, wanted a new, top-of-the-line fighter plane which was carrier-capable. Britain now had an incentive to build another CATOBAR carrier, and the Buccaneers had demonstrated their need to be replaced, so the Whitelaw government was on board with the need for new jet fighters with strike capability. The resultant Eurofighter project became the latest in a string of attempts by an international consortium of European states to produce a common aircraft which would meet their needs. It followed the successful development of the Panavia Tornado land-based fighter-bomber, developed by Britain in cooperation with West Germany and Italy (France was initially involved, but had withdrawn long before any design could get off the drawing board).

    British and French interests were much more closely aligned when it came to designing the planes which would fly off their carriers. France already had several other models in development, or ready to fly, but was lacking a certain oomph in her carrier air wing which the planned Eurofighter plane would provide. The British knew that they would have to see eye-to-eye with the French on this, lest the Republic withdraw once again, as it had done with the Tornado, among many other projects. [22] The “junior” Eurofighter partners – West Germany, Italy, and Spain – were not thrilled, as none of them could support CATOBAR planes on what carriers they did have, but they nevertheless came to accept a multirole aircraft primarily intended for carrier use, but easily adaptable for land-based operations, inspired by the example of the CF-18 Hornet: an air superiority fighter capable of being outfitted for air-to-surface strike missions for land-based use. The planned design, a Mach 2, delta-wing fighter which was in fact of British origin, was given the name Typhoon, and would be built by the British and the French, much like the carriers off which it would fly. All would fly under the power of the same engine, especially designed for the Typhoon by a consortium led by Rolls-Royce and Snecma (so that neither side had the exclusive right to build engines).

    Anglo-French cooperation with regards to aviation in general, and delta-wing supersonic jets in particular, was nothing new – Concorde, the first supersonic airliner to enter commercial service, had been the result of decades of collaboration between the two countries. Concorde was widely regarded as a tremendous feat of engineering, and could make the transatlantic voyage in half the time of conventional jetliners. Despite this, Concorde did not sell nearly as well as had been hoped. Environmental and noise concerns made overland flights unfeasible, restricting the plane to overseas travel. The extremely high operating costs, especially after the Oil Crisis had driven up fuel prices, relegated the plane to the status of “luxury liner” in order for the only airlines that did operate it – in the UK and France – to turn a profit. Other airlines had initially been interested in Concorde when it had still been on the drawing board, as a means of speeding up travel as radically as the transition from passenger ships to airliners had done in the 1950s, but when it became clear that the downsides of supersonic travel could not easily be mitigated, they all withdrew their orders. The manufacturers took a huge loss, since only 20 planes were ever built. The important precedent of Anglo-French cooperation in building and flying such an elaborate project had already shaped the face of European defence, but Concorde’s commercial failures would forever change the shape of the British aerospace industry…

    Meanwhile, in the land where most Concorde flights terminated – ironically enough, all things considered – the political impact of the coalition triumph in the Argentine War was far less long-lasting than it would be elsewhere. Though it had been the first unqualified military victory for the United States since 1945, the very nature of the “short, sharp shock” that Reagan had been hoping for meant that it was quickly forgotten, buried amidst domestic issues. In the end, victory was not enough to salvage Reagan’s bid for re-election in 1980.

    John Glenn, befitting the career for which he had first become known to the American public, went into office with high expectations. At the same time, the public image of the Presidency was becoming increasingly dependent on modern media. Ronald Reagan had been an actor, and then a television presenter, before being elected Governor of California in 1966 and serving for eight years as a stepping stone to the Presidency. After his defeat, he retired with his wife Nancy to their estate in Bel-Air, characteristically quipping to reporters that “I won’t be missing these Washington winters”. John Glenn had been a test pilot, and then an astronaut, before being elected US Senator (after several earlier, abortive attempts to get into politics), and he served a relatively undistinguished decade in that office – the highlight of which had been his landslide re-election victory in 1976, stemming the tide of the massive Republican victory that year. That thin record had allowed him to emerge from a crowded field, by being everything to everyone, while also leaving him vulnerable to attack – “What on Earth has John Glenn done?” was a rhetorical question frequently leveled at his lack of legislative accomplishments. [23] His image was also shaken by the “Deal with the Devil”, which had been made shortly after his departure from the Senate. Party loyalty prevented him from singling out the DSCC and the DCCC, but the famous photograph of Robert Byrd and George Wallace (two Democrats, and two formerly staunch segregationists) shaking hands at the press conference announcing the deal that would see the Democrats retake the House after six years of Republican control would cast a pall over his otherwise triumphant inauguration.

    The incoming President Glenn had much to prove and much to face in rehabilitating the American economy, especially given the looming threat of Japan, which was now encroaching on American commerce much as its military had once encroached on American territory. Glenn determined that encouraging private investment in public works would be the happy medium between the two unsuccessful economic extremes of the 1970s, and sought allies to help him “Invest in America”. The first step was infrastructure, in every sense of the word. That was where Glennrail came in.

    The Glenn administration was popularly credited with introducing the term “ready-build” to describe public works projects which had already been conceptualized and were able to commence development immediately, pending an infusion of funding. [24] Infrastructure was a particular focus of the Invest in America program, especially bridges (“America is building bridges” was popularly featured in advertising during the 1982 midterms), tunnels, and maintenance and repairs of existing roads and rail lines. Transportation being fundamental to infrastructure and to manufacturing interests, it was also one of the first industries to see greater expansion. Enlarging mass transit networks in urban areas was very popular and easy to implement; it often took as little as buying more buses. Rep. George Takei was the celebrity Congressman who carried the torch for transportation funding – he served on the House Committee on Transportation and Infrastructure, and appeared in that capacity on Meet the Press in early 1981, trying to sell high-speed rail to the American public. [25] Fortunately for him, his idea caught on in the Glenn Administration, thus becoming known, by detractors and supporters alike, as “Glennrail”. It was not a misnomer, as Glenn would come to be associated with HSR as strongly as President Eisenhower before him had been associated with the Interstate Highway System. He even made a Kennedy-esque promise early in his term, vowing that “people all over these United States will enjoy widespread and convenient high-speed rail travel before the end of this decade”. He was able to do so largely because much of the preliminary work had already been done by the Humphrey Administration in the mid-1970s, shortly after the Oil Crisis – work which was then shelved by the incoming Reagan administration. 
The “Invest in America” component, however, entailed the involvement of private corporations, which were encouraged to “do their part” through tax incentives and (where warranted) low-interest business loans. The federal government, for its part, sought to pool its resources with state and local governments – many of which were required to maintain balanced budgets and therefore had to be more flexible in terms of providing funding.

    Glennrail was still in its infancy by the autumn of 1982, and it had many powerful opponents, including (not surprisingly) the automotive industry – the American automakers had suffered through the Oil Crisis and the influx of Japanese imports encroaching on their market share, and resented that a mode of transportation they thought had been driven out of business decades earlier was being brought back to life on Uncle Sam’s dime. Naturally, since the automotive industry provided millions of jobs and billions of dollars in tax revenues – in both cases, largely in industrial swing states – the Invest in America program also reached out to the American automakers (no seed money was offered to Japanese manufacturers for expansion, even though these would also have provided jobs and tax revenues, because of the “Buy American” nativist attitude of the time). American Motors, the fourth-largest (and perennial laggard) of these, was particularly threatened by Japanese innovation, and received a business loan of nearly $100 million from federal and state governments, staving off a buyout attempt by a French conglomerate; the money was invested in modernizing the production facilities at its main plant in Kenosha, Wisconsin – the oldest continuously operating automobile manufacturing plant in the world. [26] This didn’t stop lobbyists from insisting that money spent on “rails instead of roads” was “money ill-spent”, but it did help to generate goodwill from one of the biggest enemies of Glennrail. It was just one yarn in a whole storybook of how the politics of Invest in America had a huge impact on every sector and every industry. However, as with so many other investments, it would take some time to pay dividends.

    It didn’t help that economic recovery was somewhat sluggish – the country was officially out of recession by the third quarter of 1982, but unemployment remained stubbornly high, and GDP per capita was not enjoying the growth rates it had seen during the “rebound” period of the mid-1970s. The US dollar was removed from the gold standard once and for all, creating a “Glenn shock” just as the monetary system had recovered from the “Humphrey shock” of nearly a decade before (once Reagan had reinstated the old system). The system of taxation was reorganized once again – however, 50% remained the maximum rate levied upon individuals. In addition to economic arguments based on revenue maximization curves, the symbolic significance – that a taxpayer should be entitled to at least half of his or her earnings – was difficult to debate effectively, and Glenn made no attempt to push Congress to lower that ceiling. Fiscally conservative Republicans (still labelled “Reaganites” after their fallen leader, though the term “fiscon” began to supplant it by 1982) pushed for further reduction of taxes, but they were drowned out by liberal Democrats (and even a few liberal Republicans) pushing for greater expansion of the welfare state, often (facetiously) described as a “Greater Society”. The 1970s had seen this see-sawing produce two massive recessions, prompting moderates on both sides of the aisle (led by Glenn) to seek a third alternative – which was where his “Invest in America” platform was put into practice.
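The “revenue maximization curve” argument can be made concrete with a toy, Laffer-style model (purely illustrative – the functional form, the shrinkage parameter, and the location of the peak are assumptions for demonstration, not figures from the timeline):

```python
# Toy Laffer-style revenue curve: revenue = rate * taxable_base(rate),
# where the taxable base is assumed (for illustration only) to shrink
# linearly as rates rise. Real-world elasticities are far messier.
def revenue(rate, shrinkage=1.0):
    return rate * (1.0 - rate) ** shrinkage

# Scan rates from 0% to 100% to find the revenue-maximizing rate.
best_revenue, best_rate = max((revenue(r / 100), r) for r in range(101))

# Under this assumed model the peak happens to sit at a 50% rate,
# echoing (coincidentally) the symbolic ceiling described above.
print(f"Revenue peaks at a {best_rate}% rate")
```

The point of such curves in the debate was not the exact peak – which depends entirely on the assumed shrinkage – but the qualitative claim that pushing rates past some point yields no additional revenue.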

    President Glenn’s moderate, third-alternative-seeking tendencies, coupled with his history as a spaceman, immediately invited comparisons to one of the iconic authority figures – and spacemen – of popular culture, Captain James T. Kirk. A persistent legend had it that the Republicans had initially planned to attack Glenn during the 1980 campaign by derisively painting him as a “would-be President Kirk”, only for focus groups to respond to Glenn more positively after such a comparison had been made; this potential bit of apocrypha may have been born of the sudden, renewed enthusiasm for Glenn from a formerly dormant group: the Moonie Loonies. Retro nostalgia, which had cast them aside some years before, was now beginning to look more favourably upon them. The first man in space, Yuri Gagarin, had flown his historic flight in 1961 – the fifth, President Glenn himself, had done so in 1962. Retro nostalgia tended to lag twenty years behind, which meant that by the early-1980s, people were looking back quite fondly indeed at the time of the very first space explorers. Glenn, during his Presidential campaign, had delivered vague platitudes about the importance of the space program, and expressed outrage that President Reagan had allowed it to atrophy, but he carefully avoided making any promises or timetables regarding new missions or projects once he took office – such a pledge would remind everyone of JFK and his Moon landing promise, and most everyone continued to be fatigued by invocations of the tapped-out Camelot legacy. Space would happen when it would happen, and until then, President Glenn made it his mission to invest in America.

    The first high-speed rail line in the United States – from Penn Station to Newark International Airport Station, built and completed in 1981 – ran trains on a chassis built by the Electro-Motive Division (EMD) of General Motors, though with a powertrain licenced from Bombardier, which built the high-speed trains in Canada (starting with the famous “Rocket” line from Montreal to Mirabel). [27] This jury-rigging would be replaced in due time by a complete, purpose-built train fully constructed by EMD, which would service every high-speed rail line built throughout the United States, and these were legion. Rep. George Takei, who represented an LA-area district (which contained Hollywood itself, unsurprisingly) favoured a Southland-to-Bay Area route – nicknamed the “Fault Line” because it would travel along the San Andreas Fault. However, political and economic realities instead favoured developing the densely-populated Northeastern conurbation. On from Newark was Trenton, the capital of New Jersey, followed by Philadelphia, Wilmington, and Baltimore en route to Washington, DC. In the other direction from New York, of course, was much of Southern New England on the way to Boston. However, as much as the Northeast dominated the American economy and political scene, it was not the only densely-populated region in the United States. [28]

    To Rep. Takei’s delight, California was quickly added to the list. Gov. Houston I. Flournoy supported the high-speed rail plan, partly in hopes of leaving a strong legacy (in accordance with the two-term tradition, he wasn’t running again in 1982). In addition to the obvious Southland-to-Bay Area line, plans were in place to continue the line to the state capital at Sacramento, with a “fork” near Merced (and, eventually, a spur linking Sacramento to San Francisco) – as well as a line from Los Angeles to San Diego, a major border city and home to the largest military base in the Pacific. It was here that the “Invest in America” strategy of attracting private investment paid dividends – the Walt Disney Company became a partner in the California high-speed rail program on the condition that a line to Anaheim – home of Disneyland – be built as soon as possible. For that reason, two crews would operate out of Los Angeles – one heading north to Bakersfield (via Palmdale) and the other south to Anaheim. At the same time, crews worked to link San Francisco to nearby San Jose – fortunately, they knew the way – en route to Merced, where another crew operating out of Sacramento would rendezvous with them. Ground was broken in time for the 1982 midterms, allowing Takei to campaign on the progress he was helping to deliver (not that he would have lost anyway, in a safe Democratic seat, though it certainly helped to show that he wasn’t resting on his laurels or coasting on his celebrity).

    However, high-speed rail construction wasn’t restricted to the two coasts. President Glenn was himself from Ohio, and he knew that his victory had depended on the Midwestern swing states, which he also hoped to support by bolstering their shaky industrial and manufacturing sectors. It also helped that the EMD headquarters were based in suburban Chicago, and that a major Paxrail yard was in Indianapolis – the line between the two cities (which would, in turn, continue on to Cincinnati) was one of several emanating from Chicago in an intricate “radial” pattern which would, when completed, resemble a web. The others headed for Minneapolis (via Milwaukee and Madison), St. Louis (via the Illinois state capital of Springfield), and both Cleveland and Detroit, via Toledo. Chicago’s central location and long history as a rail hub would serve it well in the coming transition to high-speed lines.

    Vice-President Jimmy Carter was naturally inclined to support a high-speed rail line operating out of his native Georgia, and the Peach State did have the advantage of hosting another major transportation hub in its capital city of Atlanta. Carter found several unlikely allies (and investors) in his endeavour: one of these was Senator George Wallace, who still held the purse strings of the Alabama state government, and who had a longstanding desire to bring home the bacon – which a rail line from Atlanta to his home state’s largest city of Birmingham, covering a distance of over 150 miles, would provide. Another supporter of high-speed rail in Georgia was a man who was in many ways the opposite of Wallace: media mogul Ted Turner. The iconoclastic Turner had strong ideological reasons for supporting high-speed rail, his strident environmentalism first among these. Unlike Wallace, he had no particular target in mind for a route originating in Atlanta – and though he was loath to support Birmingham, it made a great deal of sense: from Birmingham, the high-speed line would likely cut clean across Mississippi, smoking into New Orleans – the same destination as any line originating from Houston once the Triangle Line had been completed. Birmingham was not the only attractive destination for an Atlanta-based line. The Walt Disney Company, having provided the funding necessary to secure high-speed rail from Los Angeles to Anaheim, the home of Disneyland, was no less interested in connecting the much larger and more isolated Walt Disney World complex to the high-speed rail network. Another possibility was a line extending east from Atlanta to the Carolinas, and on from there to Washington, D.C. from the South, linking with the Northeast Corridor. However, a roadblock in this plan was the Governor of Georgia, a member of the AIP, who refused to approve the funding for such a project. 
Thus, instead of originating in Atlanta, the first high-speed rail line in the south east would be located in Florida, originating from Miami and heading northward through Orlando (via, among other stops, a station near Cape Canaveral, home of the Kennedy Space Center), and from there onward to Tampa and Jacksonville, perhaps eventually to connect to Atlanta – or bypassing it entirely and heading through Savannah to the Carolinas.

    Other than the pioneering Penn Station-to-Newark line, the first high-speed rail line to cross state lines also crossed an international one – the border between the United States and Canada, along the 49th parallel north. A line connecting Seattle, the transportation hub of the northwest (though, to be fair, pretty much by default), to Vancouver, the only major city on Canada’s Pacific coast, became popular as a means for Canada to invest in high-speed rail west of the Quebec City-to-Windsor axis (where half the country’s population lived), in a region that would take over a decade to connect to it. Granted, Seattle was also fairly isolated from the other rail lines (even Sacramento was years away), but connecting the two cities strengthened the positive relationship between the two countries and allowed both Stanfield and Glenn to convince their electorates that they cared about an oft-overlooked corner of their respective countries – and recognized the commonalities that crossed the border between them. Their shared region, the Pacific Northwest, was also known as Cascadia, referring to the Cascade mountain range which traversed it, and the new rail line (which would eventually continue south to Portland, Oregon, with the eventual goal of connecting to a line from Sacramento) was therefore referred to as the Cascadia Line.

    Of the seven rail networks laid out in the original “Glennrail” proposal, five (Northeast, Fault Line, Midwest, Cascadia, and Florida) were accepted right away, and one (Southeast) was flatly declined. The seventh, Texas Triangle, was in a very precarious position. The line was intended to link the three largest urban areas in the populous, prosperous, and resource-rich state of Texas: Dallas-Fort Worth in the north, San Antonio in the southwest (via the state capital, Austin), and Houston in the southeast, on the Gulf Coast. The Texan legislature was reluctant to invest so much money in public works – however, the project found an unlikely champion in the US Senate, in the state’s senior Senator, George Bush. Whether his act of bipartisanship was intended as a gesture to make him more attractive to the general electorate for a rumoured run in 1984, or whether it reflected his genuine moderation, it worked; oil interests in Houston and financial titans in Dallas both supported the rail line, making it – surprisingly – the one that required the lowest outlay of cash from the public sector relative to total costs (naturally, the private corporations would be rewarded in the long run, through generous tax breaks, nearly equivalent to what they would have received spending their money on charitable donations). The line between San Antonio and Austin was scheduled to be completed first. By 1982, crews were in operation across the country, with signs promising high-speed connections – not unlike those which had promised much the same through superhighways in the 1950s. The pace of construction seemed unusually rapid, but only because many of the original proposals for high-speed rail lines could be implemented with minimal modification from their original specifications.

    The Democratic Party campaigned heavily on Invest in America in the 1982 midterms – but they were forced to do so alongside George Wallace (a powerful supporter of the initiative) and his faction, linking perhaps their biggest strength with their Achilles heel. The Republican caucus did not present a united response to the Democratic platform – the fiscon faithful were among Invest in America’s loudest opponents, but moderates were more lukewarm in their opposition, some even offering qualified support. Many Republicans pointed out that the economy was not much better than it had been when Glenn took office – both sides were aware of the same indicators that Prime Minister Stanfield had noticed, showing that the economy was improving and that this would soon become readily apparent – but not by November 1982. Charts and percentages and abstract data points could not compete with anecdotal evidence as far as the average voter was concerned, and when the question “Are you better off than you were two years ago?” was met with a resounding “No!”, the choice of whom to vote for seemed clear. Notably, the 1982 election cycle was the first covered by the Cable News Network (CNN), one of the many brainchildren of Atlanta media tycoon Ted Turner, and initial attempts to focus on cold, hard facts soon gave way to pretty pictures, vox populi, and sensationalism – just as had been the case for every other form of news media before it. Nevertheless, political junkies enjoyed the round-the-clock coverage of the national campaign – cable channels didn’t need to sign off in the wee hours of the morning, though advertisers and demographers weren’t sure what to make of the kinds of people who would be watching rehashes of yesterday’s breaking news stories at 3 o’clock in the AM.

    CNN’s appeal to the high-awareness, high-interest voter did little to counteract the general apathy that came with midterm elections, however, and turnout declined considerably from 1980, since there was no Presidential contest at the head of the ticket. The Republicans did succeed in regaining the House, giving them control of both chambers of Congress – both the GOP and the Democrats gained seats from the AIP, who retrenched to their “strongholds” (less Alabama), leaving the party reeling. In yet another parallel with Eisenhower, President Glenn would have to face a hostile Congress for the rest of his term – at least, until 1984. The Republicans won 228 seats in the House, an increase of 24 from 1980, to give them a comfortable majority. Their gains in the upper chamber were more modest, bringing them to 53 seats there. The AIP were reduced to single digits in the House, allowing the Democrats to hold their net losses to fewer than 20 seats overall. Although this was definitely a setback for the Democrats, they considered it residual inertia from the prior state of affairs. The changes that the Glenn administration had set into motion would soon overtake them.

    All things considered, the early-1980s were a period in which many of the world’s leading democracies had to make tough decisions and see them through. Voters were willing to stay the course to get them done, but only time could tell whether the outcome would be anything close to what had been anticipated…

    ---

    [1] The New Fighter Aircraft Program also took place within the same timeframe IOTL, and also selected the F/A-18 Hornet as the basis for an entirely land-based fleet of aircraft. Fortunately, the Hornet is just about equally suited to carrier-based flight, which does not hinder its selection ITTL. Certainly if money were no object, the Canadian government would have chosen the Tomcat, but it was, so they didn’t. The Super Étendard, meanwhile, was never a contender IOTL, apparently because it was never on offer by the French (though, as ITTL, it would have been a laggard).

    [2] It should be noted that, IOTL, the Canadian government has occasionally considered acquiring a helicopter carrier for that same purpose, but has never firmly committed to that objective.

    [3] Britain was willing to sell HMS Invincible herself – that’s right, the lead ship of her class and the eventual flagship of the Royal Navy – to Australia prior to her having completed construction. If that isn’t a palpable demonstration of the sorry state the Senior Service was in during the 1980s, I don’t know what is.

    [4] When Portugal has more islands to defend (especially in potentially hostile regions of the world), a better economy, and a slightly larger population, a light carrier becomes very attractive. Consider that Thailand purchased a carrier from Spain in the 1990s (just a decade later) IOTL, and it doesn’t seem so outlandish that Portugal might do so.

    [5] Many of these modifications are similar to those which would be integrated into the design of the OTL successor class to the Tarawa, the Wasp-class. Note also that, IOTL, the Wasp-class is classified “LHD” (like the HMCS Diefenbaker ITTL) as opposed to “LHA” like the Tarawa-class.

    [6] Diefenbaker died in 1979, IOTL and ITTL, serving as an MP until his death. IOTL, he lived long enough to see the Tories finally (and briefly) form government (for the first time since his defeat in 1963), and died before the government collapsed the following year – ITTL, he lived long enough to see the Argentine War to its conclusion.

    [7] Literally, the phrase means “a beautiful concord [harmony, or state of agreement] for the beautiful province” – la belle province is the popular nickname for Quebec, even in English. Trudeau deliberately mistranslates the phrase to get the same point across to Anglophone audiences who live outside Quebec and are therefore less interested in its well-being specifically.

    [8] Patriation is a neologism from the word repatriation, as the power to revoke or amend Canadian constitutional law had never been held by the Canadian government. The term was coined in 1966 by then-Prime Minister Lester B. Pearson, well-known for his independent streak (he brokered a peace at Suez, and then replaced the Red Ensign with the Maple Leaf flag).

    [9] The Bill of Rights 1689 is considered the culmination of the Glorious Revolution, which definitively established the supremacy of Parliament over the Sovereign (and gave the former the unimpeachable right to select whom the latter would be, codifying a precedent dating back to Henry VIII). It inspired the first ten amendments to the US Constitution, collectively known as the Bill of Rights, and passed in 1791. The principles of the American Revolution in turn inspired the Declaration of the Rights of Man and of the Citizen during the French Revolution, and that document can be found embedded within the Constitution of the French Fifth Republic (dating to 1958) and those from several of its predecessor states.

    [10] The longest duration a single party has controlled the government in Canadian history was the period of 1935-57, during which time the Liberals won five consecutive majorities. Their loss to the PCs in the 1957 election (despite winning a higher share of the popular vote) came as a huge upset to media observers of the time. Of course, as far as Turner is concerned, that’s ancient history, and there’s no way any Tory government should be allowed to hold power for that long. (The longest stretch of uninterrupted Conservative governance was 1878-96, sustained by four consecutive majorities).

    [11] The Social Credit Party won four seats in English Canada in both the elections of 1962 and 1963 – and, despite the schism of the party along linguistic lines shortly after the latter election, they actually won five in 1965, on 3.66% of the vote; by comparison, the Socreds won 26 seats in Quebec in 1962, and 20 in 1963, falling to only nine seats and 4.66% of the popular vote in 1965, after the schism. (In 1962, the Socreds got 11.60% of the vote nationally; in 1963, 11.92% – in all respects, they were stronger together than apart.) However, in 1968 the Socreds were shut out of Parliament (on an anemic 0.85% of the vote – the party had never done so badly) whereas the Ralliement rebounded to win 14 seats (on a much more efficient use of their vote: 4.43%), clearing the way for the parties to reunite in 1971 – and the rest is history (either ITTL or IOTL, whichever you prefer).

    [12] The Western proposal would be consolidated into what would become known circa the late-1980s IOTL as the Triple-E Senate: equal, elected, and effective. Notably, this would become a cornerstone of the platform of a Western protest party known as the Reform Party at about the same time.

    [13] The two existing territories at the time (Yukon and the Northwest Territories) were each awarded a Senator in 1975 IOTL, bringing the total to 104. The 105th Senator would only be added when Nunavut became a separate territory in 1999. (If the Turks and Caicos ever manage to become a Canadian territory, they would certainly receive a 106th seat.)

    [14] Kesler enjoys notoriety IOTL as the one-time leader of the Western Canada Concept, a regional separatist party. He was elected to the provincial legislature of Alberta in a famous 1982 by-election, becoming the only separatist to be elected in any province outside of Quebec since the 1870s. However, Kesler was often accused of not being a “true believer” by the founder of the movement, Doug Christie, and therefore ITTL joins the strongly pro-regionalist Socreds instead.

    [15] In contrast to the White Paper of 1981 IOTL – notably one year before the situation in Argentina came to a head as opposed to one year after.

    [16] David Owen was IOTL a member of the “Gang of Four”, a group of prominent moderates within the Labour Party, who left in order to form the Social Democratic Party (SDP), which would form an electoral alliance with the Liberals and would prove a thorn in Labour’s side through the 1980s, before the two allied parties merged into the Liberal Democrats, who endure to this day.

    [17] Militant were not effectively expelled from Labour until much later IOTL – though they were formally sanctioned in 1982.

    [18] A constraint that France faced IOTL which limited the overall length of the Charles de Gaulle. The story goes that there was another drydock in the same area which could handle longer ships – but it wasn’t owned by the company which had received the construction contract, so it might as well have been halfway across the world.

    [19] The third and last of the Invincible-class light carriers was given the name Ark Royal IOTL, also by popular demand – she was previously known as the Indomitable, and she will be commissioned under the name Ark Royal IOTL, alongside her sisters, the Invincible and the Illustrious.

    [20] For reference, the Audacious-class Ark Royal (R09) was 245 metres long (presumably at the waterline), had a beam of 52 metres, a draught of 10 metres, and displaced 43,000 metric tonnes (dry weight), with a complement of ~43 planes, following her 1970 refit. The French Clemenceau-class ships were 265 metres long (this time at deck height), had a beam of 51.2 metres, a draught of 8.6 metres, and displaced a mere 22,000 tonnes, with a complement of ~40 aircraft. The OTL Charles de Gaulle is 238 metres long at the waterline (and 261.5 metres at deck height), has a beam of 31.5 metres at the waterline and 64.36 metres at deck height, a draught of 9.43 metres, and a displacement of 37,000 tonnes. The British design from which our Entente-class borrows most heavily is the unused CVA-01 design, which was 273.1 metres long at the waterline and 282 metres long at deck level, had a beam of 37.2 metres at the waterline and 56 metres at deck level, had a draught of 10.2 metres, and displaced 54,500 tonnes. The extra weight comes from the nuclear reactor as opposed to the diesel-electric engines of the OTL design, and the extra length comes from the attempt to match the fineness of the OTL Charles de Gaulle (fineness is defined as waterline length over waterline beam – the magic fraction in this case is 68/9, and the Entente-class betters that ever-so-slightly).
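    For the curious, the “magic fraction” quoted above checks out exactly. A quick sketch (purely illustrative, using only the figures given in this footnote):

    ```python
    # Illustrative check of the "fineness" figures quoted above:
    # fineness = waterline length / waterline beam.
    from fractions import Fraction

    # OTL Charles de Gaulle: 238 m waterline length, 31.5 m waterline beam
    cdg_fineness = Fraction(2380, 315)   # 238 / 31.5, kept as an exact fraction

    # The "magic fraction" named in the footnote
    magic = Fraction(68, 9)

    print(cdg_fineness == magic)   # → True: 238/31.5 reduces exactly to 68/9
    ```

    So the Entente-class only needs to edge past 68/9 ≈ 7.56 to “better” the OTL ship, as stated.
    
    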

    [21] The OTL Queen Elizabeth-class carriers, currently under construction, have been described as “supercarriers” by the UK press due to their sheer size, despite their many other deficiencies relative to the supercarriers of the US Navy. Here, the difference is considerably less extreme, giving the term more legitimacy vis-a-vis OTL.

    [22] France withdrew from the Eurofighter project in 1984 IOTL, successfully dragging Spain out of it as well (though the latter would relent the following year). France wanted a (CATOBAR) carrier-capable aircraft for what would eventually become the Charles de Gaulle, but was alone on that front, since none of the other partners could support such an aircraft. ITTL, since the UK also wants what France wants, the two are able to come together and leverage all of the other junior partners quite effectively – and flying a carrier-capable plane off land bases would still meet most of their needs anyway.

    [23] As has already been mentioned, this catchphrase enjoyed great currency ITTL and IOTL – though you may note that it has never been executed successfully.

    [24] The OTL term analogous to “ready-build” is “shovel-ready”. Formally, the term for “ready-build” ITTL is “planned projects awaiting only approval/funding that can be quickly executed”.

    [25] Even IOTL, Takei (or should I say “Mr. Sulu from Star Trek”) has a proud history of promoting mass transit nationwide.

    [26] That money obviously never materialized IOTL, leading Renault (the French company in question) to receive ownership of AMC (in lieu of providing credit) in exchange for a cash infusion of $90 million after the banks refused to extend further credit to them.

    [27] Bombardier also built the trains for the only extant high-speed rail line in the United States as of this writing: the Acela Express along the Northeast Corridor.

    [28] All of the rail corridors mentioned below have been popular candidates for first-stage development of a potential national high-speed rail system IOTL. Investigating such possibilities on Google will return many pretty maps, with widely varying degrees of plausibility.

    ---

    My apologies to everyone for the continued delay, as well as the extreme length (this is my second consecutive update to break the 10,000-word mark), though as you might have guessed, it was in part the latter that accounted for the former. Thanks as always to e of pi for proofreading and sounding board services, and also to Thande for his advice on certain matters.
     
    Appendix C, Part VI: Rendering of the Verdict
  • Appendix C, Part VI: Rendering of the Verdict

    A long time ago, in a courthouse
    far, far away (from Hollywood)


    The front facade of the Supreme Court Building at 1 First Street NE, Washington, D.C., which houses the Supreme Court of the United States. The ultimate appeal in Lucasfilm Limited v. Paramount Pictures Corporation was argued here in September 1982, and decided on February 23, 1983, nearly five years after the lawsuit was first filed.

    “For almost its entire history, the Hollywood motion picture industry has been determined to play by its own rules, as though it deserved special treatment over the farmers, manufacturers, and servicemen and women who keep our country running. The only people who’ve been willing to stand in their way have been the honourable justices of this Supreme Court. From their very inceptions, the movie studios have willfully and deliberately violated the established antitrust laws of this country, seeking to control the means of production and distribution and to quash any competition to their emerging oligopoly. Then, as now, they went about pretending that commercial and financial regulations didn’t apply to them the way they did to everybody else. In 1948, this court put a stop to that in the Paramount decision – but the last holdout, Loew’s, parent of MGM, didn’t complete its divestiture until 1959, eleven years later. At a time when the studios made it their solemn duty to circumvent the First Amendment in the name of antiquated standards of decency, this court denied them that opportunity in the Miracle decision – though, I might add, it took them fourteen years to replace the Hays Code with a ratings system, and one which only operates through corruption and cronyism. The major film studios have, throughout their history, shown no regard for any rules and regulations but their own, and have consistently disrespected the will of Congress and the wisdom of this Court. They must be punished for their misbehaviour. Our evidence is clear and overwhelming, and a message must be sent to the motion picture industry in your ruling: that they cannot be allowed to continue flouting the law with impunity. Thank you.”

    Andy Taylor, in his closing argument before the Supreme Court of the United States, arguing Lucasfilm v. Paramount on September 24, 1982

    It had all come down to this.

    Andy Taylor could never have imagined, even in his wildest dreams, that he would be returning home under these circumstances. He had been born and raised in the Old Line State, and though Washington was technically separate from Maryland, it was near enough to his hometown of Baltimore that he could commute there by train in less than an hour – even before the planned high-speed rail extension reached the Chesapeake. [1] And he made the trip back and forth countless times, trading on his celebrity status as the plucky young lawyer from Charm City made good, and catching the home games of his beloved Orioles in the midst of a playoff run that would culminate in their winning the 1982 World Series. [2] Baseball took up every spare moment of his time – moments that were few and far between, as the bulk of his time was devoted to preparing his arguments before the Supreme Court of the United States, the final court of appeal in the judicial system. Despite his reputation as the David to the army of Goliaths that constituted Paramount’s legal team, Taylor’s lustre had faded somewhat after his loss on appeal before the Ninth Circuit. That reversal of a verdict rendered by the good people of a Los Angeles jury stung not only the plaintiff but also the public – and, incensed with righteous populist indignation, the people exhorted their legislators to “do something”.

    A key early sponsor of the “something” that Congress had in mind was none other than Taylor’s old friend, freshman Sen. Marlin DeAngelo, a Democrat from California – a clear indicator of the groundswell of public support for closing the loopholes that allowed the entertainment industry to understate profits, depriving stakeholders and tax collectors alike of their due. His sponsorship provoked a palpable sense of betrayal among many of those who had helped DeAngelo win the party nomination, and then the election, two years before – the district he had represented as a Congressman included Hollywood itself. However, DeAngelo seemed to have ambitions loftier and broader than pandering to a core constituency (especially when that constituency was severely outnumbered). DeAngelo’s recent replacement, friend, and ally, Rep. George Takei, also voted in favour of the eventual legislation, though he did not play a major role in sponsoring the bill beforehand. The Financial Accounting Bill, as it became known, did not go unopposed in Congress, however. DeAngelo and Takei were among the few legislators whose constituents included the entertainment industry yet who were not in its pocket [3], and resistance to the bill became a symbolic last stand for the fading Reaganite faction; in fact, the term “fiscon” first came into widespread use during the coverage of debates over the legislation.

    The bill, which became law upon being signed by President Glenn, created the Financial Accounting Commission (FAC), which was assigned many of the roles and responsibilities previously held by the voluntary, self-regulatory Financial Accounting Foundation. [4] Where these responsibilities overlapped with those held by the extant Securities and Exchange Commission, they would be assigned to the new agency. As the SEC reported directly to the Executive Office of the President and not to any department, so too would the FAC. President Glenn offered the inaugural Chairmanship of the Financial Accounting Commission to C.A. Baxter, forensic accountant and star witness for the plaintiff, who declined pending the conclusion of Lucasfilm v. Paramount’s appeal to the Supreme Court.

    In the meantime, given the Byzantine legalese that regulating the accounting and financial professions would require, most media outlets lost interest in covering the legislation almost the moment it was tabled, their business experts assuring them (and their audiences) that it would suffice to close the loopholes that the entertainment industry had exploited for so long. There was considerable buzz surrounding Baxter’s possible appointment to the new Commission – with many observers claiming that it would serve as the culmination of, appropriately enough, a real-life example of one of the great Hollywood little-guy-takes-on-the-big-bad-machine stories, Mr. Smith Goes to Washington – but when Baxter turned down Glenn’s offer, what little public interest remained in the legislative side of the saga dissipated. [5] The responsibilities outlined for the new agency in the bill that was ultimately passed were so dense and complex that they ran to hundreds of pages; every newspaper that covered the evolving legislation and the agency it produced relegated the details to deep within its business section.

    By a margin of 5-4, the Supreme Court ruled in favour of the plaintiff on February 23, 1983 (a Wednesday), awarding Lucasfilm $1,000,000,000 in damages. The ruling led all the headlines the following day – at least, everywhere that the harsh winter storms didn’t impede communications or transportation. The Wall Street Journal, which naturally covered the goings-on from a business perspective, led with “LUCASFILM GOES FOR BROKE; PARAMOUNT GOES BROKE”. The New York Times instead went with “MR. TAYLOR WINS IN WASHINGTON”; across the country, the Los Angeles Times announced the “JOURNEY OF THE LAWSUIT”, reserving the actual verdict for a subtitle (“Lucasfilm wins verdict, 5-4”). The New York Post went with the most regrettable headline possible (in hindsight), “BLOODIED BLUHDORN”, emphasizing the damage the ruling had done to the owner of Paramount. The National Enquirer lazily recycled their headline from the original 1980 verdict when they announced that “SUPREME COURT USES ‘FORCE’ ON PARAMOUNT FOR $1B RULING”. Every article focused, at least fleetingly, on what the ruling meant for the three principals: George and Marcia Lucas, and Andy Taylor (who was entitled to a whopping percentage of the award, as he had worked on contingency). Unsurprisingly, stock prices for Gulf+Western and all its publicly-traded subsidiaries plummeted as a result of the verdict. There would be no further appeals, as there had been after the 1980 ruling; the conglomerate was on the hook for a billion dollars. Then the situation went from bad to worse.

    Two days after the verdict was handed down, on February 25, 1983 (a Friday), Charles Bluhdorn was found dead in his bedroom at age 56. [6] The cause of death was a massive heart attack; observers ranging from his closest intimates to his fiercest critics judged that the shock of the verdict had essentially killed him. It seemed unreal, like something out of one of the many movies his studio had produced, but the fact remained: the fate of Bluhdorn’s crumbling empire would have to be determined without him. By this time, Marcia Lucas had officially resigned from her position at Desilu Post-Production, ending a ten-year stint in the cutting room – one which had earned her two Academy Awards. [7] It was now clear that working full-time as a partner in Lucasfilm would be a far more lucrative (and demanding) business. C.A. Baxter, who had previously served as an auditor, was hired by Lucasfilm to perform a valuation of Paramount’s net assets. The larger, older studio was not sufficiently liquid to have $900 million [8] cash on hand, nor could they easily liquidate their current assets; only a fire-sale of their vast capital portfolio would have provided them with what they needed.

    The Lucases wanted to make movies, and they believed they would be better placed to do so with Paramount’s existing capital and intangible assets under their control than by immediately re-investing their newfound windfall into the acquisition or manufacture of new capital assets and intangibles. Both sides therefore entered into intense negotiations. The Lucases agreed in principle to accept a transfer of Paramount’s net assets – in effect, of the Paramount Pictures Corporation itself, transferred from the ownership of Gulf+Western Industries, Inc. – as payment of the damages owed to them. This included assuming all debts incurred by Paramount; by law, creditors were entitled to repayment ahead of stockholders, and the Lucases did not want Paramount or Gulf+Western to default on their obligations, lest their damages become uncollectable. [9] The remainder of the Gulf+Western conglomerate would continue operations as an independent entity, with Bluhdorn’s estate and heirs left to determine its destiny, mostly outside of the entertainment industry. The details, however, would have to be ironed out.

    Baxter judged that it was a good deal for both sides; Paramount would be insolvent if it sought to pay the damages it owed in cash, and the capital assets – and particularly the intangibles, such as pre-existing contracts – which Lucasfilm would be recovering had a value in use far greater than any the studio would likely be able to generate through new investments. Baxter agreed to waive his consulting fees in exchange for being granted permanent access to all papers relating to the lawsuit, including the negotiations taking place in its aftermath; in addition, he sought publishing rights for the sordid story, and licensing rights for any adaptations. All three principals – George and Marcia Lucas, and Andy Taylor – agreed to allow Baxter to publish the “official” account of the Trial of the Century, though they would obviously retain editorial control and final approval of the finished product. Conveniently, Baxter had written a number of general reference books for public consumption, providing him with a measure of fame among the public (which had resulted in his “bid” for a seat on the Financial Accounting Commission) and a reputation as an educator of the masses.

    The tome which resulted was given the title David and Goliath: The Authorized Account of Lucasfilm v. Paramount, the Trial of the Century. C. A. Baxter was credited as sole author. Like most of his books, David and Goliath was a fairly dry read, just this side of inaccessible; it was also an interminable slog, often criticized as a “doorstopper”. The book’s scope was relatively narrow – starting with the initial contracts between the two parties, signed in the mid-1970s during pre-production on Journey of the Force, and concluding with the 1983 “sellout” of Paramount to Lucasfilm – but it was many hundreds of pages long. The length and complexity of David and Goliath invited comparisons to the work of the Italian medievalist, semiotician, and literary critic Umberto Eco, whose first novel, finally translated into English, also hit the shelves in 1983. [10] Both books were best-sellers despite their shared inscrutable density: Baxter’s nonfiction book was a long-awaited exposé straight from the horse’s mouth, coming right on the heels of Lucasfilm’s triumph before the Supreme Court; Eco’s novel was crammed with labyrinthine historical allusions and convoluted theming and symbolism, making it attractive to intellectuals of all stripes (including those who wished to appear intellectual – a much larger demographic than anyone was willing to admit). Both books, accordingly, were awarded the sobriquet “the best-selling book nobody ever read”. [11] Both, however, were quickly optioned for adaptations, as they shared a key trait: there was a strong story buried beneath the layers of extraneous posturing.

    Once the adaptation rights to his story had been sold, Baxter’s schedule was sufficiently clear that he could accept an appointment to, and the Chairmanship of, the Financial Accounting Commission, which he did, effective as of the beginning of 1984 (once he was confirmed by the Senate). He would remain in the post for the next five years, marking the high point of his career. Baxter had been intimately involved in the process leading up to the creation of the Commission on which he now served, ever since he had first been contacted by Andy Taylor during preliminary fact-finding in 1978. This granted him valuable insight for the role, but it also led him to greatly inflate his own importance, and perhaps to dwell too heavily on his own perspective to the exclusion of others. Granted, this was a weakness shared by no less a writer than Winston Churchill (Baxter himself did not make the comparison, though others did – he merely said “anyone mentioned in the same breath as Winston Churchill and Umberto Eco is in good company, even if it is because of their mutual flaws”). [12] Baxter’s pomposity and grandstanding had always stood in stark contrast to his associate Andy Taylor’s temperate modesty and prudence – Marcia Lucas, some years later, aptly noted that “If you didn’t know, you’d have been real sure Cal was the lawyer and Andy was the accountant.”

    However, despite their non-stereotypical personalities, both men were obviously well suited to their respective vocations. In a matter of weeks, Lucasfilm and Gulf+Western (representing the interests of Paramount) devised a suitable plan for Lucasfilm to recover the $900 million they were duly owed without Paramount defaulting on their debt, and even allowing them to continue with their operations (in a manner of speaking). Paramount Pictures, and all assets and liabilities associated therewith, would be transferred in its entirety to Lucasfilm Limited. To pay down those liabilities, Lucasfilm chose to liquidate those assets which would not suit their strategic purposes. To cover the contingent legal fees owed to Andy Taylor (which amounted to hundreds of millions of dollars), he was given an equal partnership in the new Lucasfilm with each of the Lucases – he would own one-third of the company going forward, alongside both George and Marcia – along with a “modest” eight-figure lump sum, mostly derived from the $100 million bond payment which had been held in escrow since 1980.

    George Lucas had a sufficiently well-developed ego that he chose to retain the “Lucasfilm” name to carry on film development, production, and distribution, rather than continue using the Paramount name (thus abandoning a brand with over seven decades of history). Lucasfilm sold all brands, trademarks, and logos associated with Paramount Pictures to CanWest, which had not derived nearly as much benefit from the United Artists name as it had hoped it would. CanWest formally changed its name to CanWest Paramount to reflect this acquisition, which was entirely superficial and had virtually no bearing on the day-to-day operations at Global Television or (the former) United Artists Corporation – which was swiftly renamed Paramount Pictures Corporation, despite having no legal continuity with the Paramount of 1912-83. [13] The remaining film and television library belonging to Paramount was sold to Ted Turner, with the sole and obvious exception of Journey of the Force, which went into re-release just in time for the Christmas season of 1983, the first time it had been in theatres in six years. A home video release, meanwhile, was planned for 1984.

    And, finally, the massive studio space on Melrose Avenue, the crown jewel of the Paramount studio, would be sold to the neighbouring Desilu Productions at a bargain price, as a personal reward and show of gratitude for Lucille Ball’s enduring friendship and support of the Lucases during their many years on the Hollywood blacklist. The large, imposing wall (known internally simply as “The Wall”) which separated the two studio lots was ceremoniously demolished, with a long ribbon strung up in its place. [14] Lucille Ball, along with George and Marcia Lucas, would cut this ribbon in a grand ceremony featuring many Desilu and Paramount stars and staffers, past and present. Ball famously quipped: “If this wall can finally come down, maybe there’s hope for the one in Berlin, too.”

    Although Desilu was only purchasing the studio space, Ball (who was, after all, a veteran of the Golden Age of Hollywood) promised to establish a permanent exhibit recognizing the storied film history made at the location under the auspices of Paramount Pictures – in addition to honouring the past, it would also make an excellent (and lucrative!) tourist attraction. [15] Desilu didn’t really need the room anyway – they already had the most studio space of any production company in the United States – and were happy to rent much of it out, as they were already doing at Cahuenga. Lucasfilm was given the right of first refusal on unused studio space for rental purposes – duplicating the original agreement between Desilu and Paramount from 1967 – and their offices would be housed there as well.

    George and Marcia Lucas were living the dream: they had started with nothing and now controlled their own production and distribution companies. One of the first press releases issued by the newly consolidated Lucasfilm promised the long-awaited sequel to Journey of the Force, with a tentative release date set for the summer of 1986…

    ---

    [1] The high-speed rail line between Washington, D.C. (terminus of the Northeast Corridor) and Baltimore was not scheduled for completion until the autumn of 1984 – just in time for election season.

    [2] The Baltimore Orioles finished second in the American League East during the 1982 season IOTL, falling one game short of completing a remarkable comeback, losing to the team that was ahead of them, the Milwaukee Brewers, in the final game of the season, which took place (at home) on October 3, 1982 (a Sunday). As a result, the Brewers had a record of 95-67, against 94-68 for the Orioles.

    [3] Variety summed up the palpable feelings of betrayal amongst Hollywood insiders at one of their own taking a stand against them in their memorable headline: “HAS SULU GONE SPACEY?”

    [4] The body of the text doesn’t quite elaborate as to how comparatively extreme a decision this is – even after Enron and the other accounting scandals of the early 2000s, the accounting and financial professions continue to self-regulate, though under guidelines and mandates from the US government (most notably the Sarbanes-Oxley Act). Basically, accounting fraud is now being monitored as closely and punished as severely as securities fraud, which requires (in the opinion of Congress) direct government oversight.

    [5] Glenn’s offer to Baxter was not made public by his administration until after Baxter accepted his later appointment to the Commission (at the press conference: “as you all know, we’ve been after Mr. Baxter for some time”), but political commentators on the Hill knew about it all the same, thanks to Glenn’s preliminary attempts to secure the necessary support from a majority of Senators; Baxter did not belong to either major political party and would sit as an Independent on the Commission.

    [6] IOTL, Charles G. Bluhdorn died on February 20, 1983 – as ITTL, the cause of death is listed as a heart attack.

    [7] Marcia resigned by telegram, which That Wacky Redhead would display in her office. It reads, in its entirety: “TENDERING RESIGNATION – WINDFALL RECEIVED – EFFECTIVE IMMEDIATELY”.

    [8] $100 million has been kept in escrow since 1980, as the bond on the initial verdict. Therefore, Paramount effectively owes “only” the $900 million difference.

    [9] Remember that Lucasfilm is collecting on its share of the profits from Journey of the Force, which makes them investors as opposed to creditors. Investors receive dividend revenue; creditors receive interest revenue. In addition, creditors generally expect the principal – that is, the total of what was initially loaned out – to be returned to them, either in whole (as with bonds) or over time (as with loans), whereas investors generally forfeit their initial investment (in Lucasfilm’s case, that would be the film itself, and all associated copyrights, licences and trademarks which can be derived therefrom) in exchange for the promise of their rightful share of the future returns. In this instance, it just so happens that Lucasfilm will be receiving Journey of the Force back – the best of both worlds.

    [10] The Name of the Rose, Eco’s first novel, was translated into English at around this time IOTL. Even an author as detail-oriented and deliberate as Eco probably would not have written exactly the same book more than a decade after the POD. That said, it is still a historical mystery with a lead character plainly inspired by Sherlock Holmes – probably a monk, like William of Baskerville (it’s not as though many other professions in the Middle Ages lent themselves as well to scholasticism and scientific deduction).

    [11] This description is paraphrased from an (OTL) Publishers Weekly review of Eco’s next novel, Foucault’s Pendulum (though describing The Name of the Rose) – which has the reputation of being the Finnegans Wake to The Name of the Rose’s Ulysses. (Note that it, unlike The Name of the Rose, never got a film adaptation.)

    [12] Indeed, several of the many reviewers mentioning Churchill and/or Eco can be paraphrased as saying “all of [their] flaws; none of [their] virtues”.

    [13] As a result, as of January 1, 1984 ITTL, only the following Golden Age studios remain in operation: MGM, 20th Century Fox, Universal, Columbia, and Warner Bros.

    [14] A very similar ceremony took place IOTL, in 1967, though it commemorated the opposite transaction: Desilu being sold to Gulf+Western.

    [15] It will not surprise you to learn that, IOTL, Paramount pays little or no heed to ¼ of their Melrose lot having once been the possession of RKO, and then Desilu – even though a certain property which was developed by Desilu kept that studio afloat through some very lean years.

    ---

    Unlike the pompous C.A. Baxter (what a jerk, am I right?), I’m not nearly so immodest as to withhold credit from those with whom I have consulted extensively in the writing of this update, so I must thank Andrew T, Dan1988, and e of pi for their input and advice at various stages of the development of both this update and Appendix C as a whole. I apologize for not ending this post with a dialogue-free montage scored by John Williams, but I can provide a link which might inspire you to imagine one. May the force be with you… always.

    (And yes, I will write about the sequel… that’s why it’s coming out in 1986, and not 1987, as e of pi cruelly yet characteristically suggested. Happy birthday, you great big troll :p)
     
    1983-84: Out with the Old, In with the New
  • Out with the Old, In with the New (1983-84)

    “Located at the northeast corner of the intersection at Gower Street and Melrose Avenue, in the heart of Old Hollywood, Desilu is headquartered in the former home of two Big Five studios from Tinseltown’s Golden Age. While Desilu has ample studio space all across Los Angeles, the historic Gower lot has been home to countless film productions under the auspices of RKO and Paramount Pictures, including Citizen Kane, Sunset Boulevard, and The Ten Commandments. Desilu also owns the famous Forty Acres lot, where Gone with the Wind was filmed, along with such classic television series as The Andy Griffith Show, Star Trek and Mission: Impossible. Their Cahuenga lot, once home to I Love Lucy, currently hosts Rock Around the Clock. Guided tours – highly recommended for anyone interested in the history of Hollywood – are available by appointment in the morning and afternoon, Monday to Friday, with memorabilia including props and costumes on display in areas open to the public.”

    – Excerpt from the Travel Guide to Los Angeles, published by the American Automobile Association (AAA), 1984 edition

    After buying out the former Paramount lot from Lucasfilm, Desilu seemed to be going from strength to strength. But as is always the case after digesting such a massive acquisition, the fat would have to be trimmed to keep the studio fighting fit. An unfortunate casualty came with the cancellation of Deep Space at the end of its third season. [1] That show’s creator and showrunner, Gene Roddenberry – whose Norway Corporation had kept its offices on the Gower lot for nearly two decades – was sufficiently aggrieved by what he perceived as a lack of faith and support on the part of the studio that he formally severed ties with Desilu, with whom he had also collaborated to bring Star Trek, Re-Genesis, and The Questor Tapes to the masses. Desilu reserved all rights pertaining to those properties, as well as Deep Space, leaving him with little more than his grievances over his perceived mistreatment, which he simply refused to let go. Naturally, he blamed Brandon Tartikoff – the man who had ruined Deep Space, in his not-so-humble opinion, and someone far less sympathetic to him than the two previous Heads of Production – his friends and allies Robert H. Justman and Herbert F. Solow – had been. In fact, Roddenberry viewed Solow as having “betrayed” him; as Senior Executive Vice-President and Chief Operations Officer, Solow could have overruled Tartikoff, but consistently chose not to do so. [2] Roddenberry, fortunately for him, had a cadre of loyal followers who cheered him on whenever he read the persecution act on the lecture circuit.

    Fandom, naturally, was divided. Star Trek’s Puritan faction sided with Roddenberry – Desilu’s “betrayal” and Tartikoff’s role therein confirmed their pre-existing biases and reinforced the criticisms they had levelled against Star Trek: The Next Voyage. Some of the more indignant (and painfully naïve) of these even went so far as to demand that Desilu “restore” ownership of Star Trek to Roddenberry, or at least grant him the freedom to do with it what he pleased; Desilu didn’t even dignify these outrageous demands with an official comment. [3] Mainstream Trekkies viewed Roddenberry’s departure as more bittersweet, and purely symbolic – the end of an era, certainly, but the fact remained that he had ceased providing meaningful, constructive contributions to Star Trek by 1967. Puritans conflated him with Gene L. Coon (it didn’t help that both were named “Gene”) who had tragically died in 1973. If there was a reason that Deep Space was inferior, many reasoned, it was because Coon wasn’t there to flesh it out, as he had done with Star Trek, or to set it off on the right foot, as he had done for The Questor Tapes. Simply put, Roddenberry was an “idea man” who needed to be surrounded by the right kinds of people – like Coon, like D.C. Fontana, like Herb Solow and Bob Justman. When that didn’t happen, his imposing mystique would inevitably fade in the harsh, clinical light of reality. Ever since Journey of the Force had been released in 1977, many observers noted the similarities between Roddenberry and George Lucas, a filmmaker with limited creative ability and myriad weaknesses – which were shored up by the strengths of his closest collaborators. As the years went by, these similarities coalesced to such a degree that each of the two seemed a reflection, or an echo, of the other.

    The cancellation of Deep Space would presage an even more significant shakeup on the Desilu programming roster. Two of the studio’s longest-running, most dependable hits – Rock Around the Clock and Three’s Company – would both end their runs in 1984. The plan was to launch two simultaneous spin-offs for the latter show, corresponding to the two spin-offs of the British original, Man About the House. In the first, the newly-married Robby and Chrissy (unlike in the original, where Chrissy ran off with Robin’s brother and Robin married an entirely new character) would balance home and work life as Robby opened his own trendy restaurant. In the second, the landlord and landlady at the apartment complex, Mr. and Mrs. Roper, would sell it and move into a retirement community. The dramatic shift in focus (from swinging singles sharing a bachelor pad to a young married couple and an elderly married couple, respectively) would, it was hoped, refresh the creative staff, which largely carried over to both new shows. In fact, the initial idea had been a straight retool of Three’s Company into a show which would star both Robby and Chrissy and the Ropers (improbably finding themselves neighbours in their new home), but this was quickly dismissed; the cancellation of Deep Space had freed up the studio space to run two shows instead of just one, and besides, the “move to get away from each other only to wind up in the same place again” premise was so ludicrous that even the studio which had once produced I Love Lucy could not in good conscience produce it.

    Therefore, straight remakes of Robin’s Nest and George and Mildred were in order. Robby’s Roadhouse, as expected, would target young professionals, but The Ropers targeted an audience largely ignored since the infamous Rural Purge of 1971: older viewers. The retirement community was populated by a “mature” cast of kooky characters, alluding to past sitcom archetypes: the nosy neighbours, the promiscuous “cougar” (or older woman), the naïve, unworldly type who had somehow managed to live into old age, and the grumpy, humourless old man. In other sitcoms, these characters served as the peripheral comic relief; here they were the main attraction, and the comedy would come from the Ropers as straight men (the role that suited them best). The show would also move away (as The Patriot had) from the traditional Desilu sitcom formula of slapstick and farce, toward more sophisticated and elaborate, character-driven wordplay and gags.

    Perhaps the focus on older characters was a sop to Desilu being run by a woman in her seventies, who (for obvious reasons) deeply connected with the premise and the notion that there could be life after 65. No doubt, were she still an actress, she would want to appear in such a series, but the fact remained that her plate was more than full just running the studio. She wasn’t the same woman she had been even two decades ago, when she was in her fifties, running the studio and starring in The Lucy Show. But Ball knew that she wasn’t alone; Desilu was willing to bet that an old dog could be taught new tricks (or at least, convinced to buy new goods and services), and that The Ropers could tap into an unserved market. They also wagered on the universality of the experiences facing the older generation, their memories, hopes, and fears. Anyone who lived long enough would grow old, after all, and it did not escape the notice of many demographers and sociologists that the oldest of the Baby Boomers were fast approaching the big 4-0 and middle age – the eldest of their parents’ generation, traditionally born just after WWI, had been the same age during the 1950s.

    Desilu already had another show that appealed to nostalgia for the 1950s – or at least it did: Rock Around the Clock would end production in the 1983-84 season, though chronologically set in 1963 – having fallen slightly behind its original commitment to remaining exactly two decades behind the present, for fear of having to “cover” the Kennedy assassination, which is usually perceived as marking the end of the “cultural” 1950s (a very long “decade”, which ran from 1946 to 1963). Politics would occasionally intrude on the “idyllic” 1950s Milwaukee, though never to the extent or blatancy of a Norman Lear sitcom: a famous running storyline was the re-enactment of the 1960 election, which divided the Cunningham family as cleanly as it divided the whole country: Harold and Marion preferred Vice-President Nixon, whereas their children favoured Senator Kennedy. (Nixon would carry Wisconsin in the 1960 election, as indeed he would in his failed 1968 bid.) The famous two-part 1980-81 season premiere episode, “The Debate”, was set during the first Presidential debate of September 26, 1960, and took place in real time (a conceit which earned the two-parter the writing and directing Emmys the following year) – it borrowed the “bottle show” approach made famous by an earlier Desilu production, Star Trek, and refined it further, borrowing from televised stage-plays such as Captain Miller and demonstrating the impact of the 1970s character-based model. The series finale of Rock Around the Clock, airing in May 1984, tied up all the loose ends for all of the major and recurring characters in the show’s long history, a far cry from many shows airing in the actual 1950s and 1960s, which often ended quite abruptly without any closure. The finale was a massive ratings triumph, as series finales increasingly were in the modern age. It further demonstrated that more modern production techniques were even beginning to leak into programming explicitly conceived as a throwback.

    The first season of Police Squad! was already being acclaimed as perhaps the funniest television series in the history of the medium – a bold claim, given television’s long and rich comedic history, but one typical of the increasingly present-oriented critical mindset – and ratings were solid enough to merit a second season. However, when The Patriot, a more tragicomic [4], sentimental program, won Outstanding Comedy Series in September of 1983, many observers believed it was because of the longstanding disdain held by awards shows for pure comedies – in the opinion of many sophisticates, that had been made clear when Anhedonia [5], the brilliant Woody Allen comedy about relationships, lost out for Best Picture to Journey of the Force, a childish, whiz-bang action-adventure film dredged right out of the 1940s serials. Allen would only win Best Picture two years later, for Manhattan, a more serious, self-consciously “artistic” (the film was shot in black-and-white, the first to win Best Picture since 1960’s The Apartment) movie with a “heavier”, more controversial storyline involving casual infidelity and the middle-aged lead in a relationship with a schoolgirl.

    Similarly, it was widely felt that Police Squad! lost out to The Patriot because it was the less serious of the two series – as the acting categories had shown, actors in the ostensibly “comedic” category generally won only when they submitted an episode which was heavily dramatic. Jean Stapleton, who played the sweet, daffy “dingbat” Edith Bunker on Those Were the Days, once submitted the famous episode in which her character was nearly raped – and won the Emmy for Outstanding Lead Actress in a Comedy Series for it. To the credit of the producers of Police Squad!, they did not attempt to make their show or their characters in the least bit serious, soldiering on with their wall-to-wall comedic potpourri. Unfortunately, they had already written 22 episodes, in the process using up many of their best gags – and maintaining such a high volume and density of jokes would prove too much for anyone. As the second season wore on, Police Squad! began an irreversible decline, and the ratings began to reflect this. The show would end not with a bang, but a whimper – however, its cancellation was announced early enough to give the show a proper finale in May 1984.

    For all the acclaim and all the awards recognition, the most popular shows on primetime belonged to a genre long held in low esteem: the soap opera. Texas endured as a smash-hit despite several years having passed since the height of the “Who Shot T.R.?” craze – which had contributed to the mass popularity of season-ending cliffhangers. Over at Desilu, The Patriot had – surprisingly – ended its first season with “Big Dave” Sullivan and Rebecca Hopkins on the cusp of resolving their year-long sexual tension… which they ultimately did in the second season premiere, to boffo ratings. [6] Ratings improved further still with Dave and Rebecca as a couple, though the writers (as had been the case on Rhoda, so many moons ago) tired of their relationship and thought them more intriguing apart, albeit with an undercurrent of sexual tension. However, audiences disagreed, and, surprisingly, so did critics, who enjoyed the realistic depiction of a flawed but loving relationship wherein both partners continued to change and grow as characters. The Charles Brothers still favoured splitting them up in the second season finale, but the network refused. In compensation, their chef George, who had separated from his (unseen) wife as a counterpoint to Dave and Rebecca’s new union, did not reconcile with her as originally planned. [7] The Charles Brothers also quietly withdrew from active showrunning, passing that responsibility on to one of their senior producers – industry rumblings would always blame the network pulling rank on them. [8]

    However, soap operas themselves continued to exist in pure form, undiluted by overtures at more traditionally popular primetime genres such as the sitcom. Direct imitations of Texas had emerged, and most notable among these was Wasps, a series about… WASPs (as in, White Anglo-Saxon Protestants) – more specifically, well-heeled high Episcopalians living in New England who could trace their ancestry back to the Mayflower, along with other founding populations of the early 17th century. The working title was the more precise Yankees; however, this term was scrapped due to its ambiguity and the potential for viewer confusion. [9] Produced by Aaron Spelling, who reused some of the cast from his previous productions (including The Alley Cats), Wasps followed the tried-and-true daytime soap formula of pitting two large clans against each other. One family was the Parkhursts, the aforementioned well-heeled Episcopalians [10]; the other was the Mulroneys, a more nouveau riche (relatively speaking – they had only made their fortune in the 20th century) family descended from immigrants fleeing the Irish famine, wrapped in a dark past which included mob ties, Prohibition-era bootlegging, and other obvious indicators that the family was intended as a blatant pastiche of the Kennedys, including extensive political connections.

    The great success of these primetime soap operas, along with the gradual encroachment of continuing plotlines into seemingly episodic formats such as the sitcom (a trend with its origins in the 1970s), demonstrated a growing movement toward serialization on American television – and showed, in yet another way, how far ahead of its time Soap, the primetime sitcom parody of what was then a strictly daytime genre, had been. However, this serialization was happening in a more organic, subdued fashion, as opposed to the melodramatic and abrupt cliffhangers so common to soap operas, with their lingering reaction shots and sweeping crescendos – viewers were no longer asking “I wonder what they’re up to this week”, but instead “I wonder what’s happening next, after last week.”

    CBS continued to struggle in primetime, despite doing very well in daytime – The Price is Right, hosted by Dennis James, continued to dominate all comers (fittingly, the primetime version had been cancelled, as weekly game shows had fallen out of favour with audiences). The CBS lineup of soap operas in the early afternoon continually outperformed the ABC and NBC offerings. Merv Griffin remained a solid second-place contender in the late-night sweepstakes – though only by default, as ABC had finally tired of Dick Cavett and sought to take advantage of the growing interest in newsmagazine programming by airing one in the late-night timeslot. Still, when it came to news, CBS was the undisputed champion. Walter Cronkite, the man whom they had nearly relegated to early retirement, enjoyed unchallenged supremacy over the nightly news, bringing in the ratings and the accolades. 60 Minutes, meanwhile, continued to be the one unqualified, enduring success for CBS in primetime during this very lean period – no doubt encouraging ABC to devote more resources to their news division, resulting in the rival 20/20, created by network guru Roone Arledge. Yet, at least in this case, the imitation paled in comparison to the genuine article, both in terms of critical acclaim and especially with regard to the ratings.

    In the end, though, all three networks saw their share of the viewership pie decline with the continuing ascendance of cable television. MTV and CNN were the two basic cable juggernauts of the early-1980s, but other channels were on the rise as well. Most households had multiple TV sets – two or even three were increasingly common. The head of household could watch late-night entertainment in the master bedroom; the homemaker or the children could watch alternative programming in the kitchen or dining room. Families were less likely to come together for all but the biggest “event television”. This was where live sporting events came in: Monday Night Football, a moderate success in the 1970s, saw its ratings remain stable as other long-runners saw their audience shares decline.

    MNF, as it was commonly known, finished in the Top 20 for ABC during the 1983-84 season, though Texas was solidly ensconced at #1 once again – heading the six shows in the Top 10 at the alphabet network. NBC improved on the previous season, with two shows in the Top 10, including Wasps; CBS also cracked the Top 10 with two shows, 60 Minutes and their own take on the lap-of-luxury soap opera, Vintage. [11] Notably, of the five most popular shows on American television, three were soap operas. [12] Within the Top 30, the three networks found themselves in a surprising photo-finish: ABC had 11 shows there, NBC had ten, and CBS had nine (most of which were mired in the 20s). Fortunately for the networks, even the most popular cable channels proved utterly unable to crack the Top 30.

    Despite the major audience and critical disappointment in Police Squad!, and the cancellation that had ensued in the off-season, it was still nominated for Outstanding Comedy Series once again, with many feeling that it would win over The Patriot in consolation for the previous year – but that was how the Oscars worked, not the Emmys. The Patriot would repeat for Outstanding Comedy Series. Outstanding Drama Series went – for the third year in a row – to Hill Avenue Beat. This meant another coronation for Desilu, which surprised nobody. Brandon Tartikoff took the podium for both wins, making clear that Desilu hoped to continue to keep audiences happy with their upcoming offerings. Lucille Ball, sitting in the front row in the Pasadena Civic Auditorium that night alongside her husband, Gary Morton, smiled for the cameras, beaming and applauding with each win for her studio. No matter how old she got, the approval of her peers always made her feel young again…

    ---

    [1] Deep Space was cancelled (without a proper sendoff, which was increasingly rare by this time) in May of 1983, after a run of three seasons, with 78 episodes in the can (almost, but not quite, enough for syndication). Do those figures sound familiar to you? They should!

    [2] Unfortunately, this left Justman, a close friend of both Solow and Roddenberry, stuck in the middle of their rift, attempting (in vain) to broker a reconciliation between them.

    [3] Gene Roddenberry, in all likelihood, ceased to own Star Trek when Desilu agreed to produce a pilot based on his concept all the way back in 1964 – he apparently lacked the clout to negotiate some residual claim to the show’s copyright, as many more experienced (and successful) producers would do (including, for example, Sherwood Schwartz, who was a co-owner of The Brady Bunch with Paramount Television – director John Rich also owned a stake which he foolishly sold to Paramount, though he would land on his feet and direct for All in the Family shortly thereafter).

    [4] The term “tragicomic” is used in this instance as the term “dramedy” (actually a subset of the wider tragicomic genre) would be IOTL; the term “dramedy” is widely held to have been popularized in reference to the innovative, genre-busting series Moonlighting, which began airing in 1985. However, Cheers is in many ways a proto-Moonlighting (basically bridging the gap between the “MTM school” of hyper-realistic, character-driven sitcoms and the drama with heavy comedic elements pioneered by Moonlighting), and the same is also true of its TTL sister series.

    [5] Anhedonia is defined as the inability to experience pleasure. It was the working title for a film which, of course, became known IOTL as Annie Hall, re-named for its lead, Diane “Annie” Keaton (née Hall). And yes, just as IOTL, cineastes ITTL debate over which of those two movies deserved to win Best Picture. Only instead of populists demeaning the “elitists” as IOTL, it’s intellectuals demeaning the “rabble”. Either way, the implications aren’t pretty.

    [6] IOTL, Cheers actually ended its first season with the more traditional consecutive weekly two-parter, entitled “Showdown” (Parts I & II). This is likely because the show’s ratings were so low that there was a fear of cancellation, and thus of ending on an unresolved cliffhanger. Therefore, “Showdown, Part II” famously ended with Sam asking Diane: “Are you as turned on as I am?” and Diane lustily replying “More!” before they threw themselves into a dramatic kiss which closed the episode (and the season). However, they certainly made up for the lost opportunity: all four of the subsequent season finales during the Sam/Diane years were cliffhangers. (Tellingly, after the show’s transition to a more purely comedic series after Shelley Long’s departure, Cheers largely avoided cliffhangers in the Rebecca years.)

    [7] Norm and Vera separated during the second season IOTL, for much the same reason – as a counterpoint to Sam and Diane. True to form, they reconciled, and would remain married (and Vera largely unseen) for the remainder of the show’s run.

    [8] Glen and Les Charles left as showrunners after the second season IOTL as well – here they just have a convenient excuse to do so. They would continue to write a handful of episodes, though always season premieres or finales, intermittently all the way up to the series finale.

    [9] An aphorism credited to author E.B. White summarizes the varying geographical definition of the term “Yankee”:

    To foreigners, a Yankee is an American.
    To Americans, a Yankee is a Northerner.
    To Northerners, a Yankee is an Easterner.
    To Easterners, a Yankee is a New Englander.
    To New Englanders, a Yankee is a Vermonter.
    And in Vermont, a Yankee is somebody who eats pie for breakfast.

    Of course, this is a comedic exaggeration – the first four lines are accurate but the last two are not. The term “Yankee” is accepted throughout New England to refer to someone of predominantly colonial ancestry, which makes Yankees an ethnic group as opposed to a geographical one. You’ll note that, up to this point, I’ve predominantly used the term “Yankee” (or “Yank”) from the perspective of a foreigner, referring to the United States of America, and the totality of its people and culture.

    [10] “Parkhurst” was the working name for the family which became immortalized under the name Carrington.

    [11] Vintage (about rival winemaking families in the Napa Valley, a region which by this time already enjoyed prominence as the premier viticultural hotspot in the United States) is TTL’s version of Falcon Crest, which (yes) aired on CBS. (Dynasty aired on ABC IOTL, but Wasps airs on NBC ITTL.)

    [12] IOTL, it was “merely” the top seven: Dallas at #1, Dynasty at #3, and Falcon Crest at #7. Yet another soap, Hotel, also finished within the Top 10. Notably, only one sitcom (Kate & Allie, at #8) cracked the Top 10 in 1983-84, helping to explain the widely-held belief at the time that the genre was moribund. But soon, a man in a funky sweater would come along to change all that...

    ---

    Thanks, as always, to e of pi for assisting with the editing for this update. Thus begins the ante-penultimate cycle, 1983-84! I hope you all enjoy what I have in store for you in the updates to come :)
     
    All You Have to Do is Count Down from Ten
  • All You Have To Do Is Count Down From Ten

    From the Counterfactual Collabase:

    British Space Program

    Saved by Harold Wilson.

    ---

    Television shows such as Rock Around the Clock served to demonstrate that nostalgia for the past was not fixed, but seemed to move along a floating continuum; it was eternally tethered approximately two decades before the present. In the mid-1980s, naturally, this meant nostalgia for the glory days of the space race, especially with a “spaceman” as the incumbent President. The “problem”, such as it was, which stymied hopes of re-creating that glorious rivalry with the Soviets was that the Reds were in no position to pose much of a threat to the preeminence of the American space program, their last gasp of success having been their narrow (and largely symbolic) victory in the “Race to Mars” over a decade earlier. In addition, relations between the two superpowers had been cordial in the years since; even the term in office of the staunchly anti-communist Ronald Reagan had seen the furthering of détente between them. [1] The Glenn administration made no attempt to reverse this trend, though the United States did finally open diplomatic relations with the People’s Republic of China in 1981, which was not terribly well-received at the Kremlin. [2] As far as the space program was concerned, the challenge facing Glenn – a major NASA booster who had even appointed his fellow astronaut, Gemini and Apollo veteran James McDivitt, [3] as NASA Administrator shortly after taking office – was how to replicate the 1960s zeal for spaceflight without reigniting the brinkmanship that had spawned it.

    The engine that would power the NASA revival had been designed and built in the 1970s: the Caelus rocket. In form and function, it was a variant of the Saturn line (the name Uranus was immediately rejected for obvious reasons, leading engineers to propose Caelus, the name of the equivalent Roman deity, instead), with only the S-IC first stage overhauled substantially for this new iteration. By modifying the S-IC to discard four of its five F-1A rocket engines during ascent as fuel was consumed (an innovation pioneered by the earlier Atlas rocket), the newly-dubbed C-I could function as a launch vehicle in its own right. This allowed the rocket to deliver more than 20 tonnes of payload to Low Earth Orbit (LEO), the location of Skylab and, potentially, any future space stations and depots. [4] In this stage-and-a-half configuration, with the detachable engine modules recovered downrange for reuse, the Caelus-I was a cheap and highly effective launch vehicle, considering the payload it could carry. With Caelus-I functioning as the lower stage of a multi-stage rocket (the old S-II and S-IVB merely given new, upgraded engines and renamed C-II and C-III, respectively), it could ditch the engines on the lower stage, and then each successive upper stage in turn, in order to launch up to 150 tonnes into LEO – significantly in excess of the Saturn V, making Caelus the most powerful rocket ever developed. This impressive capacity had yet to be fully employed, however, with the Caelus instead used solely as a ferry for the Space Shuttle (which was sufficiently light to be launched by the C-I rocket alone). Space enthusiasts naturally decried the inefficiency of this arrangement and exhorted NASA to develop programs which would take full advantage of the rocket’s impressive design.

    That was where Glenn and McDivitt came in. They had two key policy goals: a new space station to replace the aged and overextended Skylab, and a return to the Moon to follow up on promising leads that NASA had been forced to abandon in the mid-1970s. However, it was clear that all of the existing infrastructure favoured a new space station, and none of it favoured going back to the Moon, which had effectively been abandoned for over a decade. Moreover, Congress would not provide the funding to cover both projects; it had to be one or the other – and they would lean heavily toward the one that was cheaper and easier. Although Glenn made speeches and offered platitudes about NASA’s eventual return to the Moon, he (unlike John F. Kennedy before him) deliberately avoided assigning a deadline for that accomplishment. However, he did promise that a replacement for Skylab would begin development immediately.

    Like the Skylab series of space stations which had preceded it, the new “skybase” (which, in the grand tradition of naming celestial objects after mythological figures, was codenamed Olympia) would consist of modified rocket stages serving as its primary habitable spaces. However, unlike Skylab, which had used the 290-cubic-meter S-IVB tanks, Olympia’s core would be a converted S-II (or C-II) stage, offering more than 1200 cubic meters. The vessel itself could be launched in a single flight, but most of the fittings (such as additional solar arrays and cargo for outfitting) would have to be sent along with a second module, a converted S-IVB stage. Together, the modules would offer the capacity for as many as twenty crew members, launched aboard Space Shuttles (though not all at once), and would contain large laboratories for materials, physics, and biological experiments, while exterior facilities could be used for experiments which required access to the vacuum of space. However, even after adopting Skylab’s budget-conscious approach to construction and leveraging existing stocks of Apollo-era stages for conversion, Olympia would be expensive. One of the most important means of reducing costs for the planned “skybase” was shunting some of them onto willing international partners, which also provided valuable political cover and promoted good diplomatic relations with American allies. At the height of the Space Race, only the two superpowers had been able to keep up the great expense of research and development for space travel and exploration. However, in the years since, the decreasing cost of technology had made it feasible for other countries to develop their own space programs. Indeed, several of these countries would pool their resources, infrastructure, and manpower in the creation of supranational entities, which offered them useful potential roles as junior partners to NASA.

    One of these newly-formed organizations was the Commonwealth Space Agency (CSA), which, much like its sister organization, the Commonwealth Trade Agreement, had developed out of ad hoc arrangements between the largest economies in the Commonwealth of Nations: the United Kingdom, Canada, and Australia. It formed after repeated, failed attempts by the UK to co-operate with France on the development of launch vehicles and satellites, in a largely abortive “third front” of the Space Race in the 1960s. As with the CTA, the timing of the CSA’s formation was driven by exigent circumstances: the European Economic Community shut the door on admitting the UK (for the third time) in 1974. Concurrently, the organizations under whose auspices the UK and France had attempted to work together, the European Launcher Development Organization (ELDO) and the European Space Research Organization (ESRO), were merged into the European Space Agency (ESA). The UK cut off all ties with the newly-formed organization, leaving France as the uncontested top dog in the ESA. However, the other heavyweights amongst the remaining partners, West Germany and Italy, retained the leverage to veto total French domination of the new organization. The CSA and the ESA were, accordingly, well-placed to emerge as rival second-tier space powers, each backed by three of the world’s ten largest economies and with approximately half of the assets and technologies developed by their precursor organizations at their disposal.

    That said, the ESA was far better able to play to its strengths at a much earlier juncture. The Guiana Space Centre, in the French overseas department of the same name, had been operational since 1968, and was naturally chosen as the main ESA spaceport, given its ideal location and pre-existing infrastructure. By contrast, choosing a launch site to develop was the CSA’s very first order of business – the established test site at Woomera, South Australia, was ill-equipped for the kinds of equatorial satellite launches which would be the organization’s bread and butter. The newly-chosen site, in Far North Queensland, was fairly remote, and would have to be built from the ground up – committing the CSA to a late start in the race for commercial contracts, even as the competition (from the ESA, along with American launch providers) had already begun in earnest.


    The British Blue Streak rocket.

    It was the latest in a long series of protracted delays and rearrangements that defined the British space program, going all the way back to the development of the Blue Streak – its signature rocket – which had started out as a ballistic missile intended to carry the British nuclear deterrent. However, land-based nuclear missiles were impractical to house so near to the enemy they were intended to deter, and so Blue Streak was re-purposed into a civilian launch vehicle, as so many missiles had been in years past. [5] In the years following the Suez Crisis, though, spending on research and development had taken a nosedive, and that necessitated finding partners. Thus, ELDO was formed, and the plan was for Blue Streak to serve as the first stage of a multi-stage rocket (which was naturally given the name Europa), to be completed with stages developed in France and West Germany. Blue Streak would prove an unqualified success; however, the French second stage, Coralie, and the West German third stage, Astris, were disastrous failures. If Blue Streak were ever to fly, it would not be as part of Europa; that project was finally scrapped in 1971.


    The British Black Arrow rocket at Woomera on the morning of October 28, 1971, before its first successful flight carrying the Puck 1 satellite.

    Supposed “sideshow” projects, such as the Black Arrow satellite carrier rocket and the Puck series of satellites [6] that it launched into orbit, were much more promising than Europa, and the idea eventually occurred to the government of Prime Minister Harold Wilson that Black Arrow had roughly the right power and mass to serve just as well as an upper stage for a multi-stage rocket anchored by Blue Streak as the Europa stages would have done; better, in fact, since both Blue Streak and Black Arrow were demonstrable successes. [7] Thus, the CSA did have a launch vehicle ready to go, albeit a jury-rigged one which was severely underpowered even by 1970s standards. It was given the name Black Prince in reference to long-abandoned plans for an all-British multi-stage rocket, for which Blue Streak was to be the first stage. In this case, though, the rocket would be all-Commonwealth, and any improvements to the Black Prince design would have to be iterative. By contrast, though the ESA didn’t have a rocket ready to go per se, they were developing one which would be given the name Ariane. It would be larger, more technologically advanced, more efficient, and more powerful than Black Prince.

    The component stages of the Black Prince 1 rocket – the UK-built Blue Streak as the first stage, and all three stages of the UK-built Black Arrow as the second through fourth stages. The final Waxwing stage was intended to be built by Australian engineers as soon as they were able, but the first Canadian-built Centaur stages were ready before this, resulting in a switch to the Black Prince 2 configuration.

    But Black Prince had the advantage of being ready to launch ahead of Ariane, since all the component parts had already been designed and built. The first stage was a Blue Streak rocket; the second, third, and fourth stages were the three component stages of the Black Arrow. After several successful test launches, the first proper Black Prince mission took off from Woomera in 1977, sending a communications satellite into geosynchronous orbit. [8] The launch site would not relocate to the more suitable location in Far North Queensland until 1979, by which time the Black Prince would see a major overhaul of its component stages. [9] This was the result of increased Canadian and Australian investment and involvement in CSA research and development, to better reflect the international orientation of the agency, as opposed to one dominated by a single partner (in this case, the UK). The second and third stages (both borrowed from the Black Arrow) were removed, replaced with an American Centaur rocket, though built (under licence) by de Havilland Canada. The previous fourth stage (the Waxwing stage) remained as the new third stage, though it was no longer built in the United Kingdom, but in Australia, thus allowing each of the “Big Three” countries in the CSA to contribute their own stage to Black Prince. The first satellite launched on this new “Black Prince 2” was the Canadian Anik B1, which (massing at just under 900 kilograms) was barely within payload capacity. However, the ESA Ariane rocket, completed in the same year, could throw double that payload into geosynchronous orbit, making it the preferred launcher for the European telecommunications industries at what could have been a critical point in time.


    The component stages of the Black Prince 2 rocket – the UK-built Blue Streak as the first stage, the Canadian-built Centaur-CA as the second stage, and the Australian-built Waxwing as the third stage.

    However, in the grand scheme of things, both the CSA and the ESA were competing with the venerable United States aerospace industry. The Reagan administration, wanting to spur the economic development of the telecommunications industry without footing the costs on the public dime, encouraged privatization by selling payload space aboard Titan III rockets to firms hoping to launch satellites to LEO. [10] Naturally, most American service providers would gravitate toward this option – even without any subsidy, it would be cheaper and capable of larger payloads than any alternative that would be devised by any foreign space agency for several years to come. This competition did not preclude the opportunity for cooperation, however, as launching telecommunications satellites was not the only purpose of even the “lesser” space programs. Nurturing native aerospace industries was considered a key step towards fostering homegrown astronauts and sending them into space, invaluable instruments of prestige. The rockets eventually developed by both the CSA and the ESA would have been sufficient as launchers of crew vehicles into LEO, but the speculative design costs for such vehicles were so great as to incentivize cooperation with NASA in order to achieve their desired results. Several proposals were made on all sides, but consensus soon arrived at the CSA and the ESA each sending their own laboratory modules to be connected to the Olympia complex. The Japanese space agency, NASDA, was invited to participate on the same terms at this time – negotiations were also opened with tertiary space powers, such as Brazil, Israel, Turkey, Iran, and India, offering a “ticket to space” as a tool of diplomacy.

    Unlike both the UK and Canada, with their established aerospace industries and sufficient infrastructure and expertise to take on their fair share of the CSA’s burden, Australia was comparatively lacking; Woomera had remained a legacy of the UK space program. All of the money that the Land Down Under would put into the CSA had to be reinvested into native industry, and this suited the needs of the Australian government very nicely. The Waxwing schematics brought in from the United Kingdom had nobody to build from them until faculty and students at the University of Sydney’s Department of Aeronautical Engineering were recruited by the government to design a prototype. Aided by the small size of the task – the Waxwing massed in at just under 270 kilos, and was small enough to fit in a phone booth – the bootstrapping of native Australian solid rocket manufacture was successfully achieved, with the aid of advisers who had worked on Waxwing manufacture in the UK. The resulting team ended up forming a corporation – the Australian Aeronautics Corporation (AAC) – and were contracted to manufacture Waxwing rockets for the new Black Princes. Prior to this, Australian engineers had only worked on the fairings encasing the launch payloads – proportionally, more akin to the contributions expected of a minor partner, such as New Zealand. They would make up for their lag in a big way later on – the AAC would eventually be commissioned to design an all-new third stage for a later iteration of the Black Prince – and the first purpose-built design in the history of the line – which (given its small size but potential for great “leaps” into space) was given the name Wallaby.


    The component stages of the Black Prince 3 “Heavy” (or 34, for the number of attached boosters) rocket – the UK-built Blue Streak, modified to include attachment points as well as central core reinforcement; the Canadian-built Centaur-CA rocket as the second stage; and two options for the third stage, either the Australian-built Waxwing or (later) the Australian-designed and built Wallaby.

    Wallaby topped what would become known as Black Prince 3, which otherwise would have strongly resembled Black Prince 2 – the UK-built Blue Streak as the first stage and the Canadian-built (and American-licenced) Centaur as the second stage – if not for a key engineering innovation: multiple Blue Streak rockets – up to five – could now be clustered together, greatly boosting the potential payload of Black Prince launches. Indeed, the Black Prince 34 – or “Heavy” – more than quintupled the payload to geosynchronous orbit over the Black Prince 2, from 900 kilograms to 4.6 tonnes. Most importantly, this was more than the latest iteration of Ariane, unveiled that same year, could manage (at only 2.7 tonnes maximum). The mass production of Blue Streak rockets in the UK reduced unit production costs and provided steady jobs, allowing the CSA to take advantage of economies of scale. It was one of the more tangible successes of the amalgamated British Aerospace PLC, which was formed from the merger of the British Aircraft Corporation and Hawker Siddeley in 1973, at the behest of Prime Minister Wilson.


    All five variations of the Black Prince rocket family.

    The merger, one of Wilson’s major policy contributions during his time as PM, was the result of several external forces coalescing upon the fragile British aviation industry. While the British had kickstarted the jet age with their de Havilland Comet, most of their makes and models in the years since had been overshadowed by more economical options from American corporations like Boeing and McDonnell-Douglas, and while aircraft like the Vickers VC-10 and the BAC-111 had seen success with British flag-carrier airlines, they had largely failed to tap into the large overseas markets that competitors like the DC-9 and 737 had developed. What was supposed to be the British Aircraft Corporation’s crowning achievement – the supersonic jet Concorde – had seen most of the orders put in for it by various carriers worldwide withdrawn in the face of increasing fuel prices and noise pollution concerns. By 1973, virtually all of the potential buyers had withdrawn their orders, with only three interested customers remaining: the respective national airlines of the United Kingdom and France, along with Air India. In the face of such a devastating loss of future inflows, the British government felt compelled to take decisive action. The several key players in the industry were each too small to support themselves on their own, mostly subsisting on government contracts and limited orders for small fleets of aircraft. It was an unsustainable situation, but if they could be convinced to band together, they might stand a chance.

    As a result, British Aerospace PLC came into being. It was a revival of earlier plans to convince the leading lights of the industry to merge into a new giant, but this time they were successful. The government held a minority stake in the new corporation, though the majority would remain held by private investors. [11] However, Wilson and his Labour government were thrown out of office shortly thereafter, and British Aerospace found that the incoming Whitelaw government was not as loose with the purse-strings as they would have hoped. Their plans for commercial aviation would have to be streamlined, and this led to a renewed focus on the One-Eleven, their most successful commercial jetliner. They were able to take their pitch to Air India when they finally got the memo and withdrew their order for the Concorde in 1975 – in exchange, the airline was the first (even ahead of native British customers) to place an order in principle for the new and improved One-Eleven. This newest model was called the 111-700, and was intended to fly with a Spey engine built by Rolls-Royce. The intent was to make a grand announcement at the Farnborough Air Show in July, 1976. However, as the re-design was carried out, technicians soon encountered a serious setback: the new engine’s thrust was lacking, and its fuel efficiency was abysmal, which translated to higher operational costs and therefore lower market demand. The engineers at Rolls could not promise anything concrete within any reasonable deadline – or indeed, even within the next decade. Waiting for Rolls to get the engine right would take years, which would surely repel Air India, the only potential foreign buyer, and mark an inauspicious start to the attempts by British Aerospace to right the course of their national aviation industry. Without the proper engines, the plane might never get off the ground – literally.

    Management then decided to approach other engine manufacturers as a “stopgap” measure, to furnish the One-Eleven-700 with what it needed until Rolls had a suitable engine. Naturally, Rolls was irate: this took a huge chunk out of potential sales. However, the government sided with BAe; better more British planes with fewer British engines than no British planes or engines. Snecma International, a consortium jointly owned by the American firm General Electric and the French firm Snecma, was contacted and agreed to sell their CFM-56 engines to British Aerospace for the 111-700. The powerful engine would also provide the thrust necessary to propel a longer, heavier, higher-capacity airplane than previous models of the One-Eleven; this in turn left a niche for smaller-capacity planes that could take advantage of the less-powerful engines that Rolls could reliably manufacture, allowing the two companies to continue their working relationship, if only as a sideshow to what was quickly shaping up to be the marquee BAe commercial airliner. Air India was pleased, as was the newly-formed British Airways, and both carriers “officially” made their interests known.

    The BAe 111-700 was formally announced at the 1978 Farnborough Airshow, astonishing many rival firms. Foremost among these was Boeing, perhaps the most successful airliner manufacturer in the world, who were planning on rolling out a new model of their 737, which was in the same class as the One-Eleven. British Aerospace had scooped Boeing with their big announcement, attracting interested buyers whom Boeing had thought would be in their pocket. Meanwhile, in Europe, the emerging Airbus Industrie consortium were behind schedule on their own competitor, the first of their A320 family, which would ultimately not be announced until the Paris Air Show on June 14, 1981. [12] The 1980s promised to be a cutthroat period for manufacturers of twin-engine single-aisle airplanes (seating approximately 150 passengers) – the class was popular with regional and national airlines on many “bread-and-butter” routes, as such planes were fully capable of flying between most any two points in Continental Europe, or the continental United States, Canada, or Australia – routes which lacked the glamour of Concorde but were far more significant for commercial purposes.

    Early in 1982, the first completed 111-700 rolled off the line at the Bournemouth Assembly, with its first successful flight on February 15, 1982. The initial orders would be shipped out late in the year, just in time for them to be pressed into service over the 1982 holiday season. Among those airlines which took deliveries within the first year of commercial operations were British Airways, Air India, Canadian Pacific Airlines, and Trans Australia Airlines. However, sales were not limited to the Commonwealth; a major coup for BAe had been the order of One-Elevens by American Airlines, to replace their fleet of aged Boeing 707 planes. (It helped that American Airlines had previously operated an earlier make of the One-Eleven.) By 1984, they had sold as many of the 111-700 model as they had all previous versions of the One-Eleven combined. Sales continued to be brisk enough that demand outstripped production capacity, even as orders picked up for rival airliners in the same size class. This allowed British Aerospace to become a viable competitor in the global aviation industry, and ultimately allowed the One-Eleven to become one of the best-selling commercial airliners in British history. [13]

    All planes sold in those first two years shipped with CFM-56 engines. Starting in 1984, the option to purchase airplanes with either the CFM-56 or the Rolls-Royce Tay engines became available; in order to make Tay engines more attractive to foreign buyers (as British Airways symbolically “exchanged” the 111-700s they had purchased in 1982 for a new fleet with Tay engines), the British government agreed to subsidize their purchase price. However, despite this incentive, many buyers (especially repeat buyers) naturally preferred the higher-performance CFM-56, resulting in a scenario that Rolls had dreaded. Fortunately for them, a contingency plan they had been fostering for some time would soon come to fruition…

    Although American influences were increasingly felt in the realm of European transportation and technology, the reverse was also true. One of the most controversial pieces of legislation passed by the 97th United States Congress, and signed by President John Glenn, was the Metric Conversion Act of 1982, which built on efforts by the Humphrey Administration (which had passed exploratory legislation in 1975 before the matter was suspended under the Reagan Administration) to convert from customary units to metric weights and measures for the purposes of trade, commerce, defence, transportation, industry, athletics, and education. This met with considerable resistance, as metrication so often did in other countries: it was systematically imposed from above with little regard for the will of the common people, for their own good (or so its supporters argued). [14] In practice, most people (especially those who were not educated in metric) continued to use customary weights and measures in everyday conversation, as was the case in the United Kingdom and Canada, among other countries. It helped that consumer products and services which had always been distributed in customary units were not changed – their measurements were just converted to metric (internally). For example, the standard aluminum beverage can, 12 fluid ounces (fl. oz.) in volume, was re-labelled 355 millilitres (ml). However, the average barfly continued to order pints of beer, rather than converting (to the awkward 473 ml). Bottles of wine and distilled spirits continued to be known as “fifths”, even though a fifth-gallon was almost exactly three-quarters of a litre (757 ml) – and by this time, liquor was sold in 750 ml bottles worldwide, including the United States.
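
    The re-labellings above are easy to check; a short sketch (in Python, purely as an illustration) derives each metric figure from the standard US customary definitions of the fluid ounce and gallon:

```python
# Deriving the metric re-labellings of US customary volumes from first principles.
ML_PER_FL_OZ = 29.5735   # one US fluid ounce, in millilitres
FL_OZ_PER_GALLON = 128   # one US gallon, in fluid ounces

can = 12 * ML_PER_FL_OZ                      # standard 12 fl. oz. beverage can
pint = 16 * ML_PER_FL_OZ                     # one US pint (16 fl. oz.)
fifth = FL_OZ_PER_GALLON / 5 * ML_PER_FL_OZ  # one-fifth of a gallon

print(round(can))    # 355 ml
print(round(pint))   # 473 ml
print(round(fifth))  # 757 ml
```

    The 757 ml "fifth" falling so close to three-quarters of a litre is what made the worldwide 750 ml bottle such an easy sell.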

    Although it was the responsibility of the United States Metric Board to educate the consumer on the importance, definition, and use of metric units, their campaign was a largely ineffectual one. It would be left to future generations to learn metric units in a formal educational setting – and education was the responsibility of the states, not the federal government. Schools which were controlled by the federal government (primarily military schools) did convert to metric-only education; many states chose to educate their students in both systems, which was certainly the most practical method, since it facilitated conversion and kept mnemonic writers employed for years. Others obstinately ignored metric, though they only had the luxury of doing so in states with industries which were not reliant on international trade and commerce. Since most of the most populous states did have a vested economic interest in converting to metric, this meant that most American schoolchildren of the 1980s memorized iconic mnemonic devices such as “Kittens Hate Dogs But Do Chase Mice”. It also gave the fledgling Electric Company a shot in the arm with the show’s most famous sketch of the 1980s: “All You Have to Do is Count to Ten” (which won the show that year’s Emmy Award for Outstanding Children’s Program), featuring a song written by occasional contributor Tom Lehrer. [15] The genius of the sketch, and song, was in combining the counting-to-ten element (so basic that it was actually part of the Sesame Street curriculum) along with mnemonics for the metric prefixes – although it did not teach the metric system per se, it laid down the groundwork for education in metric units and – more impressively – multiplication, a concept otherwise just a bit beyond the target audience for the show (though not the peripheral adult demographic who watched for the subversive sketches). In fact, this effort was so successful that students who did learn about customary units were frustrated at how unsystematic they were compared to the metric system. (Some years later, Lehrer would remark that it was perhaps the most popular song he had ever written.)

    However, their parents did not enjoy nearly as uniform or as complete an education on the metric system, and many people continued to think of metres or kilograms as units to convert to from the customary units with which they were familiar, resulting in what was sometimes called “miles thinking” or “pounds thinking”. But this phenomenon also had distinct advantages: Glennrail speeds, at the behest of the Department of Transportation, were always quoted on press releases, signs, and promotional materials in kilometres per hour, and the media, after a fashion, followed suit. This approach took advantage of there being about 1.6 kilometres to every mile; top speeds achievable on high-speed rail were over 185 miles per hour… or over 300 kilometres per hour. (Naturally, in oral communication, speeds were often quoted without reference to a specific unit, such as “250 an hour”, to further confuse the issue.) However, this same “miles thinking” backfired on the Department of Transportation when, despite the ample conversion signs posted on highways and (starting with the 1982 model year) speedometers which quoted both mph and kph, speeding became an epidemic. Then again, many jurisdictions – which were funded by the income from speeding tickets – saw unexpected windfalls from this development. That said, the most exciting development that resulted from the customary-to-metric conversion involved neither road nor rail, but the skies…
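
    The “miles thinking” arithmetic can be sketched in a few lines of Python (the 187 mph figure here is an illustrative assumption, chosen to land just over the 300 km/h mark quoted above):

```python
KM_PER_MILE = 1.609344  # exact: one statute mile in kilometres

def mph_to_kmh(mph: float) -> float:
    """Convert a speed from miles per hour to kilometres per hour."""
    return mph * KM_PER_MILE

# The same high-speed rail top speed, expressed in both systems:
print(round(mph_to_kmh(187)))    # 301 – i.e. "over 300 kilometres per hour"
print(round(300 / KM_PER_MILE))  # 186 – i.e. "over 185 miles per hour"
```

    The round number sounds far more impressive in metric, which is exactly why the Department of Transportation quoted it that way.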

    The date was November 3, 1983 (a Thursday). At 5:00 PM, Republic Airlines Flight 307, a McDonnell-Douglas DC-9-51 jetliner with the registry number N766NC, departed from Hubert H. Humphrey International Airport (airport code MSP) [16], bound for Seattle-Tacoma International Airport, where it was expected to arrive about four hours later. However, though Flight 307 cleared the runway at Humphrey and left the Twin Cities behind without incident, it would never make it to Seattle…

    The problems started on the tarmac. N766NC had flown in earlier that day from O’Hare in Chicago, and was thus in need of additional fuel to complete the flight to Seattle. Metrication having rolled out gradually, unit confusion remained a tricky situation. This was compounded when it came to fueling an airplane, because the fuel was loaded into the tanks by volume (gallons or litres) but was measured by the plane’s instruments by weight (pounds or kilos), which doubled the opportunities for error. Unfortunately for the passengers of Flight 307, the crew – wearied by a long day of travel – fell headlong into just such an error on the afternoon of November 3rd.

    As a result of the botched conversion, far less fuel was loaded onto the DC-9 than would be needed for the voyage of over 2,000 kilometres. The one foolproof way to catch this error was the traditional hydraulic fuel gauge, which was out of service and due to be repaired once the plane had landed in Seattle. Consequently, the flight crew only had their own manual input logs and calculations to verify the fuel levels – which were based on faulty assumptions due to the incorrect conversions. Essentially, the pilots, Captain David P. Nylund and First Officer Frank Neville, were flying “blind” – lacking the fuel needed to complete their journey, and indeed the knowledge that they were lacking it in the first place.
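
    The account above describes the error only in outline, but the shape of a pounds-versus-kilograms fueling slip can be sketched hypothetically (the fuel volume and density factors below are illustrative assumptions, not figures from the accident record):

```python
# Hypothetical reconstruction of a pounds-versus-kilograms fueling slip.
LB_PER_LITRE = 1.77  # approximate weight of jet fuel, in pounds per litre
KG_PER_LITRE = 0.80  # approximate mass of jet fuel, in kilograms per litre

litres_loaded = 5000  # fuel aboard, measured on the ground by volume (illustrative)

# Correct arithmetic: convert the volume to a mass in kilograms.
actual_kg = litres_loaded * KG_PER_LITRE
# The slip: multiplying by the pounds-per-litre factor, then logging the
# result as if it were kilograms, overstates the fuel on board.
assumed_kg = litres_loaded * LB_PER_LITRE

print(round(actual_kg))                  # 4000 kg actually in the tanks
print(round(assumed_kg))                 # 8850 "kg" in the crew's logs
print(round(assumed_kg / actual_kg, 2))  # 2.21 – more than double the real figure
```

    A crew working from such logs would believe they had over twice the fuel actually aboard, which is precisely the kind of discrepancy a working fuel gauge would have caught.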

    Shortly after crossing the border from North Dakota into Montana, as the plane approached the small city of Glendive, the warning system in the cockpit sounded an alarm – the fuel pressure was too low on the right side of the plane. This was because of the reduced fuel quantity – the drop in pressure indicated the emptiness of the tank. Because of their previous calculations, Nylund and Neville believed that the tank had sufficient fuel (even though their gauge was non-functioning) and therefore blamed the alarm on an error with the fuel pump. Nevertheless, as was protocol in situations such as these, they informed air traffic control and prepared to divert to an alternate landing site if necessary. Just as they were finishing their report, the right-side engine shut down entirely, having consumed the last of its fuel. The pilots barely had time to inform air traffic control of this new development, and their plans to divert to the nearest major airport in Billings, Montana’s largest city, when the alarm sounded for the other engine. As it was vanishingly unlikely that both pumps could be malfunctioning, it became clear that the problem was with the fuel; Nylund and Neville mistakenly suspected contamination and attempted to restart the engines. These attempts quickly proved futile, and it became apparent that all they could do was land what had become a glider as safely as possible. Billings, over 300 kilometres away, was no longer an option, and no other airport within their range in the sparsely-populated region of Eastern Montana could receive them – except for the one they had just flown past, Dawson Community Airport in Glendive. In order to bleed off altitude and airspeed, the crew began a series of long, steady circles around the town, trading altitude for time as Glendive and the surrounding areas hurriedly assembled every fire company and ambulance they could muster.

    Glendive was a small city of 6,000 souls, whose primary claim to fame up to this point was being the smallest television market in the United States – only one of the three broadcast networks, CBS, had even bothered to affiliate with a local station: KXGN-5. That station’s only news reporter, Ed Agre, who doubled as its news editor, was in the field when he overheard on his radio scanner that a major jetliner had run out of gas and was going to glide in for a landing at their airport (and that all emergency vehicles and personnel should immediately proceed there). It was a once-in-a-lifetime opportunity for the “crack reporting team” at KXGN (namely, Agre and his cameraman) – besides, it wasn’t like they had anywhere else to go, and it wasn’t like anyone else would be providing live coverage. It took them less than fifteen minutes to arrive at the airport, by which time a crowd was already beginning to form (word always spreads fastest in a small town).

    The plane first became visible a few minutes before it was due to land, descending in an eerie silence, headed directly for the growing crowd, and the assembled emergency services personnel. The one video camera at the scene was pointed directly at the approaching plane, with Agre’s sonorous tones being the only thing to break the hush that had come over the crowd.

    “We see it now, Republic Airlines Flight 307, coming in for an unexpected landing. Although the engines have no power, officials here at Dawson Community Airport in Glendive inform me that the pilots should be able to land the plane as if it were a glider.”

    Finally, after the plane had completed its final turn and was headed straight for the runway, the landing gear was deployed – but because there was no power, the plane’s hydraulics could not be used, forcing the crew to rely on gravity to do the job for them in a pinch. It worked – the first parts of the plane to hit the asphalt were the main landing gear under the wings. The runway was 1,739 metres (or 5,704 feet) long – the recommended landing margin for a DC-9 was 1,400 metres, which allowed Nylund (who was piloting the glider) plenty of room for error. Fortunately, they didn’t need it, executing what countless observers in the years to come would describe as a “flawless” landing – one recorded, in its entirety, for posterity on videotape by the entire reporting team at KXGN-5. When the brakes finally brought the plane to a stop about halfway down the runway, the crowd suddenly and boisterously broke their rapt silence with elated cheering. After their joyful noise faded into a dull roar, Ed Agre gracefully positioned himself in between the camera and the airplane it continued to film, having been struck (appropriately enough) by an eleventh-hour burst of inspiration:

    “It’s come to a complete stop. Ladies and gentlemen, the Glendive Glider has safely landed here on the runway at Dawson Community Airport.”

    The name stuck.

    The plane officially landed at 5:49 PM, Mountain Standard Time, which was too late to be covered on that night’s national news, since it was already past eight o’clock Eastern by the time Agre returned to the studio and set to work editing the raw footage into what would become the only news story on his five-minute nightly newscast at 9:55. However, he made sure to make duplicate copies of his original recordings to send out to, among other places, the head offices of CBS News in New York City.

    Meanwhile, the 79 passengers of Flight 307 were compensated by Republic Airlines with free accommodations for the evening of November 3rd – fortunately, Glendive being on I-94 meant that there was sufficient motel space for everyone – and hired buses to drive them into Billings the next morning, from which they would be given a free flight to either Minneapolis or Seattle – on top of their ticket prices for Flight 307 being fully refunded.

    The story hit the newspapers nationwide on Friday morning, which was when President Glenn became aware of the Glendive Glider and its remarkable feat of piloting. The Federal Aviation Administration was beginning its investigation, but already there were rumblings that the plane could only have run out of gas as a result of unit conversion gone awry – and that went back to the metrication efforts which his administration had championed. Eager to make lemonade before the lemons could be harvested by his critics, Glenn had his people contact Nylund and Neville, hoping for a photo opportunity and a chance to tie his image with the happy ending that had resulted from the situation, as opposed to the mishap which had created it.

    That evening, at 6:55 PM, Eastern Standard Time, Walter Cronkite presented the story of the Glendive Glider on the CBS Evening News, complete with the footage from the previous evening, though with new audio overdubbed, and a brief additional interview with Ed Agre, the reporter who had captured the story, for follow-up information (concerning the health and safety of the crew and passengers aboard the Glendive Glider). As for the plane itself, N766NC was refueled at Dawson Community Airport – with just enough to get to Billings, from which the plane was able to fly back to the Republic Airlines maintenance base in Minneapolis, where it arrived just over 24 hours after it had departed the previous evening. The plane was thoroughly inspected by airline officials as well as the FAA, but it was found to be completely undamaged by the ordeal, with only the maintenance problems which had been noted before its departure as Flight 307 found to be in need of repair. These repairs were completed in short order, and N766NC was flying passengers again in time for the Christmas holidays.


    The famous Glendive Glider, N766NC, (finally) coming in for a landing at Seattle-Tacoma International Airport in December, 1983.

    ---

    [1] Remember, as the saying goes: “Only Reagan could go to Moscow”.

    [2] A belated fulfillment of the promise to Red China that they would extend some form of recognition on behalf of their abstention when the time came to vote on intervention in Argentina.

    [3] James McDivitt, IOTL, served as Program Director on Apollos 12 through 16, giving him valuable administrative experience. ITTL, he remained in that position through the end of the Apollo Program and was then transferred to oversee the Space Shuttle, rather than leaving NASA and becoming a private citizen. Therefore, although there is favouritism in Glenn’s pick (choosing an astronaut over a military officer or scientist as NASA Administrator) it is not an example of promoting an unqualified person to the position.

    [4] The Caelus family is based largely on the OTL “Saturn S-1D” concept from Boeing: the outer four F-1 engines on the stage would be fitted to a ring structure that would drop off at about 70% fuel consumed, leaving the remaining “sustainer” center engine to carry the payload to space. You can read more here. Since it uses the F-1A instead of the F-1, with improved thrust and efficiency, the “Caelus” as implemented here actually has improved performance beyond these studies. Though it loses a bit of performance due to the parachutes and such on the droppable rings, it seems like a logical step to reduce costs of the launcher, which otherwise compares poorly to more conventional systems. (The H-1, which was similar in era to the F-1, was actually tested and found operable after saltwater immersion, so the requirements for reuse of the booster rings should be reasonable.) ITTL, the concept was to use Caelus as the first stage for Block III Saturns as well as the Space Shuttle’s launch vehicle, which is then cut down to just the Shuttle when the moonbase plans die out.

    [5] The Blue Streak can be compared to its American contemporary, the Atlas, which continues to fly to this day – one of its most famous passengers being none other than John Glenn himself!

    [6] The Puck series of satellites was IOTL just the one satellite, Prospero, which remains the only British satellite to have been successfully launched into orbit by a British rocket. (The name was changed from Puck to Prospero when it became clear with cutbacks to the program that there would be only one satellite – Prospero is the protagonist of The Tempest, which concludes with him renouncing the magical powers he had learned at the beginning of the play.)

    [7] There is some evidence that Black Arrow’s designers may have intended this secondary use for the rocket (which, sadly, never came to pass IOTL, as both Blue Streak and Black Arrow were scrapped by the Heath government).

    [8] According to our calculations, the Black Prince 1 could only launch 350 kilograms of payload to geosynchronous orbit. For comparison, Puck 1 (IOTL, Prospero), launched into LEO, massed 66 kilograms. The Canadian Anik A telecommunications satellites (an HS-333 design) launched in the early-1970s (IOTL and ITTL) massed 560 kilograms; Black Prince 1 could not have thrown them into geosynchronous orbit even if it had been ready in time to do so (American Delta rockets launched the Anik satellites.)

    [9] As previously mentioned, the nearest urban area of any prominence is Cairns, Queensland – though the launch site is actually near Cooktown, 167 kilometres northwest of Cairns as the crow flies (about a four-hour drive). For comparative purposes, Cooktown is located at 15.5 degrees south, compared to Kourou at five degrees north and Cape Canaveral at 28.5 degrees north.

    [10] IOTL, the US government focused on the Space Shuttle as its commercial launch platform, forcing US payloads of certain classes suitable for launch on that vehicle to use it, while other options like Titan remained unused. Here, with TTL’s Space Shuttle being just a crew and cargo ferry, that pressure doesn’t exist. Titan is selected over Caelus (which could achieve higher payload for similar launch costs) mainly because the payload of the latter is too large to be commercially useful, coupled with a certain preference of Reagan for the military-developed Titan over the NASA Caelus.

    [11] The Wilson government devised similar schemes in the late-1960s IOTL, but these were eventually abandoned and not revisited before their defeat in 1970. The constituent companies were ultimately nationalized and merged into British Aerospace by Wilson’s successor James Callaghan in 1977, though his successor privatized the conglomerate in 1980.

    [12] Because the British government is not involved with the funding of the A320, there are fewer delays relating to work spread between the various European partners (though there are still plenty), which accelerates its development vis-à-vis OTL.

    [13] The most commercially successful passenger liner of British manufacture IOTL was the de Havilland DH.89 Dragon Rapide, which first flew in 1934. 731 units were manufactured, but the capacity was a mere eight passengers – a load which today all but the smallest planes can exceed – within a range of 920 kilometres (shorter than the distance from Land’s End to John o’ Groats as the crow flies).

    [14] This is common to many “compliance” reforms switching from customary (and often arbitrary) to standardized (and usually systematic) usage. The ensuing backlash is inevitable and occasionally leads to reforms being watered down, as they were in the case of US metrication IOTL. Other examples include the decimalization of the pound in the UK (and Ireland) in 1971 (which was successful) and the German orthographic reforms of 1996 (some of which were eventually reversed).

    [15] Lehrer, though retired from public performance in the early-1970s, composed ten songs for The Electric Company IOTL, and ITTL continues his association with the series into the 1980s.

    [16] ITTL, what was previously known as Minneapolis-Saint Paul International Airport was renamed after the late President Hubert H. Humphrey, who hailed from Minnesota, in the late-1970s. It retained the old airport code MSP because HHH was already taken by the Hilton Head Airport, a small commercial and general use airport in South Carolina.

    ---

    This update was co-written with e of pi, whose extensive knowledge of aerospace technology was invaluable during every stage of its development. As the opening section implies, one of our goals was to rehabilitate someone we feel to have been wrongly blamed for the failure of the British space program IOTL – whatever his government’s other failings, that was not among them. Thanks must also go to nixonshead for having taken the time to create these gorgeous renderings of the Black Prince rocket family, which in my opinion really helped to bring this update to life. It’s such a thrill to finally be able to share them with all of you after having sat on them for quite some time. I certainly hope you’re all reading his timeline, Kolyma’s Shadow, which has even more incredible artwork!

    Believe it or not, the story of the Glendive Glider is based entirely on the equally incredible OTL experience of the Gimli Glider.
     
    Cel-ing Abroad
  • Cel-ing Abroad

    By the time Walt Disney died in the closing days of 1966, the American animation industry which he had personified had already been in decline for some time. The primary culprit was none other than the medium that Disney had so cannily embraced: television. Theatrical animated shorts had sustained the industry from its infancy, but these old cartoons – much like newsreels and film serials before them – were being supplanted by equivalents available on television. Thanks to another key innovation brought about by television, the rerun, some of these equivalents were in fact many of the animated shorts which had been produced as early as the 1930s – though largely from the 1940s and 1950s. Children, being the target demographic for these cartoons, had short memories and attention spans and, as it was soon discovered, didn’t need any more cartoons than were already available – or at the very least, none produced at top dollar by bloated studios whose animators were grossly overpaid. [1] If more animation were needed, it would be made on a budget, so as to allow its creators to generate a reasonable profit in a timely fashion. Corners would have to be cut as a matter of course; framerates were reduced, in-betweening was virtually eliminated, and character designs were simplified, among many other shortcuts. The “limited animation” technique that resulted was aptly named.

    Ironically, the pioneers of this method, William Hanna and Joseph Barbera, had worked for MGM during the Golden Age of Animation, producing their beloved (and lavish) Tom and Jerry cartoons. Their greatest success following their move to television was The Flintstones, essentially a prehistoric animated version of The Honeymooners [2] – they would again ape live-action trends in the 1970s with Wait Till Your Father Gets Home, clearly inspired by Those Were The Days. In between, much like Disney and Warner Bros. before them, the studio created an entire stable of animal characters, including Huckleberry Hound, Yogi Bear, and Scooby-Doo. What could be said for Hanna-Barbera was that their characters had personality, if perhaps to excess, and that (charitably speaking) this could compensate for the lack of animation in any of their cartoons. The studio that emerged as Hanna-Barbera’s great rival in first-run television animation could make no such lofty claims, however. Filmation Associates was founded in 1962 by Lou Scheimer, Hal Sutherland and Norm Prescott. Sutherland would serve as the primary animation director at the studio through most of its history, despite being colour-blind – perhaps no single fact was more emblematic of the overall lack of care with which Filmation treated their product. [3] But no studio could produce more cheaply, nor accomplish quicker turnaround – Saturday mornings of the 1970s were flooded with Filmation cartoons, always in 22-episode packages.

    This glut of limited animation spread beyond the small screen and onto the larger one, with lazy “efforts” such as Robin Hood and The Rescuers showing how far the studio that had once produced Snow White, Pinocchio, Fantasia, Bambi, Cinderella, and Sleeping Beauty had fallen, leaving the task of revitalizing the industry to outsiders. Even the Lord of the Rings trilogy, which was eventually credited with spurring the 1980s animation “renaissance”, was not without flaws and the cutting of corners, though one of that particular movement’s leading lights, Don Bluth, decried this practice, and the moderate success his films enjoyed helped lead to its decline. (By contrast, Ralph Bakshi was far more willing to hold back – with mixed results, as the Lord of the Rings films proved.)

    Granted, the situation had improved considerably from the days of television animation’s infancy. Clutch Cargo had made infamous the “synchro-vox” technique, in which static images were “animated” by overlaying a filmed image of a person’s mouth moving (as he or she read lines, providing lip-sync). This would supposedly compensate for what was otherwise a rather blatant presentation of a series of stills, more like a slideshow or comic book than an actual cartoon. Stop-motion animation, which enjoyed considerable cachet in the 1950s and 1960s with the works of George Pal, Art Clokey, and Rankin-Bass, also had a nefarious corner-cutting cousin in “chuckimation”, which was essentially the technique used by little kids when playing with their dolls and action figures: holding their toys (with their hands kept carefully out of frame) as they shook them for emphasis to indicate speech or reaction.

    Even as early as the 1960s, some of the best animation produced in this period was not made by American hands. Rankin-Bass produced their celebrated stop-motion animated Christmas specials, Rudolph the Red-Nosed Reindeer, Frosty the Snowman [4], Santa Claus is Comin’ to Town, The Year Without a Santa Claus, and Jack Frost, starting in 1964, and continuing into the 1980s – the stop-motion animation itself was farmed out to studios in Japan, in one of the earliest examples of that country’s animation industry receiving contract work from American studios. Like many post-war Japanese industries, it emerged seemingly out of nowhere.

    Animators in the Land of the Rising Sun, however far-flung a locale it might have been, were also influenced by the Disney tradition, as part of a broader “Americanization” of Japanese culture which followed World War II. That Japanimation was already visually distinct from contemporary American animation by the 1960s was a powerful example of divergent artistic evolution, informed by the vast differences between American and Japanese cultural tropes and aesthetics. American audiences, however, regarded the few products of Japanimation to successfully cross the Pacific as mere curiosities, and often their origins were deliberately obscured from audiences (though to little effect, as Speed Racer so famously made clear).

    The growing number of productions filmed overseas, including in Japan, accompanied by the wholesale broadcast of foreign animated series, alarmed the cartoonists’ union, aware that their sub-par work (which, to be fair, came at the behest of their penny-pinching bosses) was being threatened by this competition. In 1978, the union went on strike specifically to head off “runaway” productions, as they were called; they were (temporarily) successful, though they were only granted a five-year reprieve, as the issue would surely come up again in subsequent renegotiations. In the meantime, Japan – with a sufficiently large native consumer base and a rapidly-growing economy, already in the global Top 5 by 1960 and displacing the “superpower” Soviet Union to become the #2 economic power in the world (behind the United States) by 1980 – could continue to develop their industry and refine their technique without American investment.

    Still, animated adaptations of popular Western media formed a bread-and-butter genre of Japanimation. The Japanese, in addition to their love for Disney – to the point of licensing the construction of a Disneyland theme park in Tokyo, which opened in 1983 – were notoriously fond of pastoral Americana (and Canadiana, if their Anne of Green Gables fandom was anything to go by). One of the properties this love extended to was L. Frank Baum’s The Wonderful Wizard of Oz series – which, in the West, had been almost completely overshadowed by the iconic 1939 MGM film adaptation of the first book in that series. However, in the decades since that film’s release, the rights to many of the Oz books written by Baum had fallen into the public domain; creating a new adaptation of The Wonderful Wizard of Oz had fewer barriers to entry than ever before. Given that Oz was an institution in the United States, it wasn’t a surprise that it was ultimately the project to attract American backers.

    Chosen as director was the young, up-and-coming animator Hayao Miyazaki, in what career retrospectives would later mark as the last – and best – product from the “work-for-hire” phase of his career. Miyazaki had some fondness for the property, since it touched on a few pet themes of his, including a strong, assertive female protagonist in young Dorothy Gale, and the association of evil with exploitation of the environment and its people, and goodness with harmony and nature. He chose to emphasize these points in his adaptation. Japanese culture also had a pronounced fondness for witches, again due largely to an American influence (the classic 1960s sitcom Bewitched), and greater focus was placed on all four of the cardinal witches – the 1939 film had conflated the two Good Witches into a single character, and the Wicked Witch of the East had appeared only long enough to be crushed by Dorothy’s house. In one of the few nods to the MGM version, the Wicked Witch of the West was depicted with green skin – notably, an almost-cyan hue identical to that of the Emerald City, to emphasize the “evil” inherent in that particular colour. The Ruby Slippers (as opposed to the Silver Slippers of the book) also returned, again for reasons of colour theming: they were the only prominent objects to be crimson in hue, emphasizing their uniqueness and appeal.

    Miyazaki’s deliberate pace (his film would have an even longer runtime than the 1939 live-action movie) and the lack of songs would give the movie a very different tone from The Wizard of Oz which was best known to Western audiences. Nevertheless, the film was shown at the Cannes Film Festival and was released near-simultaneously on both sides of the Pacific: in October, 1982, stateside, and that December in Japan. [5] Surprisingly, it was fairly successful – enough so that MGM rushed out a re-release of The Wizard of Oz the following year, even though the film had already been released on home video. [6] It would be re-released on video in a “45th Anniversary Special Edition” in 1984, with new interviews from the two surviving cast members, Ray Bolger and Margaret Hamilton. The Japanese film was itself released on CED, Beta, and VHS that same year. It did even better business on home video than it did on the big screen, particularly in Japan; this gave Miyazaki the cachet he needed to buy out the Topcraft animation studio which had produced the film, renaming it Studio Aurora, where he would enjoy complete creative control as an animator, writer, and producer of Japanimation films. [7]

    The widespread release and moderate success of The Wizard of Oz was only the most visible sign of a sea change in how Japanese animation quality was perceived by gaijin. In fact, the difference in quality between the average overseas product and what could be produced stateside for roughly the same price was widening into a chasm. Several of the major Japanese animation studios, including Topcraft/Aurora, Toei, and Tokyo Movie Shinsha (TMS), caught the attention of American investors who were interested in further co-productions. They weren’t the only ones, either – even other countries in the Anglosphere were getting in on the act, such as Canada and the United Kingdom, as were countries in Western Europe such as France and Italy. However, the threat of Japan encroaching on American territory was most thoroughly emphasized in trade papers, reflecting the wider concerns of Japan’s economic boom and how that phenomenon was a modern take on the old “Yellow Peril”.

    American studios who held more firmly to homegrown animation than Rankin-Bass did – such as Hanna-Barbera, Filmation, and Ruby-Spears – all felt the threat, but upping their game would be difficult without facing down the animators’ unions. Their collective bargaining agreements were due for renegotiation in 1983, which was dreadful timing on their part. By this time, especially after the successful Wizard of Oz release worldwide, the writing was on the wall. The animators lost their protections in the new collective bargaining agreements, effectively marking the end of television animation made by American hands. (Animators continued to work stateside on major motion pictures which – fortunately for them – were not uncommon in the 1980s.) Many of these studios optioned adaptations of popular properties, such as the Franco-Belgian comic series The Smurfs, as well as the Conan the Barbarian film series (sanitized from the more brutal and explicitly violent films). [8]

    The Wizard of Oz itself would receive a television continuation – the movie, in what would become a tradition in 1980s animation, was itself divided into several episodes which formed the premiere arc of the series. Hayao Miyazaki had no direct involvement with the program, though he worked unofficially (and informally) as a consultant behind-the-scenes. It was followed by loose adaptations of the later Oz books, starting with the second, The Marvelous Land of Oz. That book infamously featured a story arc in which a girl (Princess Ozma of Oz) had been turned into a boy (Tippetarius, or “Tip” for short) through the nefarious powers of a Wicked Witch, Mombi, and this was faithfully depicted in the cartoon. Surprisingly, this cartoon was also brought over to the United States, though it aired in syndication as opposed to one of the three networks’ Saturday Morning lineups, and was largely uncensored – a stark contrast from the considerable re-working that earlier Japanimation went through upon reaching American shores. [9] Granted, the series was adapted from a beloved series of American children’s books, which had a vocal fanbase, and perhaps this stayed the hands of distributors – but it set a curious precedent for the years to come. One of a great many, in fact, in a medium and a period on the weekly schedule which had seemingly fallen into stagnation not so long ago…

    ---

    After the tumult of the acquisition of what had once been Paramount Melrose, things finally seemed to be settling down at Desilu. Lucille Ball and her long-time lieutenant, Herbert F. Solow, continued to celebrate their good fortune. But another VP at Desilu, one much younger than either of them, was not one to rest on his laurels. Brandon Tartikoff wanted to let it ride, and luckily for him, he was always coming up with new ideas. They weren’t all winners, but enough of them were that even the most radical were given their due consideration. “Star Trek overall merchandise sales are down slightly from last year,” he announced, as he entered Ball’s office with a bundle of file folders cradled under his arm. “I think we might want to do something about that.”

    “Aren’t sales for those little role-playing booklets up big from last year?” Solow asked.

    “Yes, but they’re a niche segment of the overall marketing mix. Most of our big-ticket segments like action figures and playsets are down. CED sales are still solid, but we don’t have any more to sell – the whole show and the miniseries are available on home video. Convention ticket sales are stagnant, there’s no fresh blood coming in. We have to consider that the Mini-Boomers are getting old enough to be very attractive to advertisers.”

    Ball knew a pitch was coming. By now she had a sixth sense about it. “What’d you have in mind, Brandie?”

    “It’s been buried in the back pages of the trade papers lately, what with all the reporters hanging around here trying to get the latest scoop about what company belongs where now, but the Animator’s Strike? Went really bad for them. They lost their runaway protection – that means any studio can farm out their animation to any shop anywhere in the world.”

    “That’s very nice, but what does it have to do with us?” Ball asked.

    “I’ve been reading through the studio archives. I know there was talk, about a decade ago, of making a Star Trek cartoon. But you didn’t like the animation style – if I can call it that – on offer from any of the cartoon houses. But times have changed since then.” With this, he reached into one of the folders and pulled out a collection of exemplar animation stills and storyboards from many of the foreign studios who were catching the eyes of American executives – now including those at Desilu. “The advantage to making a cartoon was always the money we would save on building the sets or paying the actors their salaries and dealing with their egos. Not to mention, with a science-fiction series like Star Trek, physical limitations mean we can only do so much to create fantastic worlds and aliens and visuals in the first place. We’ve always prided ourselves on our quality product, and I think the opportunity exists for us to pursue something very special here.”

    Solow regarded the artwork carefully. “Any idiot can draw a picture – do you have anything from these guys in motion?”

    He hadn’t even finished his sentence before Tartikoff produced a videotape – it looked like VHS, which was just as well, because the TV in the corner was hooked up to a VTR player that played VHS tapes, a fairly rare commodity outside of the industry – and strode over to set it up. As he was doing so, Solow turned to Ball, who was also regarding the images.

    “What do you make of this?” he asked his boss.

    “The fellas at Hanna-Barbera drew some pretty pictures when we talked about it ten years ago,” Ball recalled. “But didn’t they want a talking dog to join the crew or something like that?”

    “Right, with a speech impediment.”

    “That’s right. And then the people at Filmation sent us a tape with the same few frames of walking animation playing for 22 minutes straight.”

    “And everyone was wearing hot pink,” Solow remembered.

    “God, it’s no wonder these animators are losing their jobs.”

    “Here we are, this is from the new Aurora studio that produced the Wizard of Oz movie from last year. They’ve turned it into a regular cartoon series, I think one of the American networks is looking at picking it up and airing it over here.” As Tartikoff said this, the cartoon started playing in mid-scene, with loud Japanese dialogue nearly drowning him out. He turned down the volume.

    “What, are they gonna dub that over?” Ball asked.

    “Yes, last I heard, they wanted John Drew Barrymore’s daughter to do Dorothy’s voice.”

    “Here’s hoping she bothers to show up for work,” Solow groused. [10]

    “Notice the animation quality. Much more fluid and expressive, and less repetitive than what was on offer back in the ‘70s.”

    “It’s… very stylized,” Solow said, a puzzled expression on his face.

    “That’s the way they like it in Japan. Obviously, we could ask them to animate it differently. That’s the point – they can animate, and not just loop the same few frames over and over again.”

    “It’s head and shoulders above a lot of the stuff I’ve seen, Herbie,” Ball said. “You should see some of the stuff they’re passing off as cartoons nowadays. I’ve watched it with my grandkids – it ain’t pretty. This… this looks alright. I mean, it’s not Fantasia or anything, but it works.”

    “And they’re just one of several quality animation studios in Japan. I’ve sent out feelers to all of them – they’re all very interested in having a hand in projects for American properties like Star Trek. And Herb, if you think their style is too ‘out there’, we can always farm out the character designs to an American studio like Hanna-Barbera – Lucy, I know you’ve had close ties with them for a while.”

    “They did the opening animations for I Love Lucy,” she said. “Of course, nobody sees them anymore anyway, since they were shilling for Philip Morris, and that’s not allowed.” She did her best to hide the bitterness in her voice, but did not entirely succeed. “Now it’s just that valentine card.”

    After mentioning Philip Morris, she suddenly felt a craving and reached into her drawer for a cigarette, lighting it as Tartikoff continued.

    “If we have American writers sketching the plot outlines, and American artists designing the characters, and American actors doing the voices, I think bringing that together with Japanimation at cut-rate prices is a winning combination. We’re not the only people thinking about this, either. I heard one of the big toy companies wants to partner with the Japanese to make a cartoon to sell a line of robots they’re developing.”

    “The problem is getting all the actors back,” said Ball. “Even if they’re not showing up in the flesh and they don’t have to age anymore – wouldn’t that be great – it still took a miracle to get them all back for the miniseries.”

    “And George and John’s characters are dead anyway – they’re not coming back,” Solow added.

    “Well, maybe we might consider using different characters?” Tartikoff suggested. “This isn’t live-action, so a lot of people won’t consider it ‘real’ Star Trek anyway. Might as well take advantage of that.”

    Star Trek without Kirk and Spock? I can’t even imagine that,” Ball said.

    “Well, as you know, the fan community generates a lot of material regarding original characters and situations. I think they’d be more open to the idea than you’re giving them credit for.”

    “Those role-playing game sales prove it – Trekkies will eat up anything with the Star Trek name, even if it doesn’t have Kirk and Spock and all the other, familiar characters attached to it. They even wanted that God-forsaken Deep Space show to be a part of the Trek universe, for crying out loud!”

    Tartikoff, who’d played a key role in the development of Deep Space, shot Solow an annoyed glare. “You know, Herb, just because you weren’t a fan of Deep Space doesn’t mean – ”

    “All right, all right, boys, that’s enough,” Ball interrupted, before the argument could boil over any further. “That show’s dead and buried now, let’s not dig it up again, please.” She took a long drag on her cigarette, hacking rather violently as she exhaled. “Brandie, I gotta say, your idea sounds interesting. You have my blessing to start making official inquiries.”

    “Thank you, Lucy – you won’t regret it!”

    “Hell, I’m old enough I probably won’t even live to regret it.”

    She’d meant it as a joke, but wasn’t surprised when nobody laughed. She was never as funny when she worked off-script.

    ---

    [1] In the opinion of short-sighted, avaricious executives, not this editor. The adage that “you get what you pay for” is no less true with animation than it is with anything else.

    [2] A version made without the permission of that show’s creator, producer, and star, Jackie Gleason, who planned to sue Hanna-Barbera but backed down upon deciding that he’d rather not be known as “the man who killed Fred Flintstone”.

    [3] One of Filmation’s most acclaimed productions IOTL, of course, was the 1973-74 Star Trek animated series, which miraculously achieved some measure of success despite the aforementioned colour palette problems, in addition to extremely limited animation (frames were constantly reused), egregious repetition of a minuscule library of stock music – two minutes total, not counting the minute-long theme song, mostly in the form of brief stings, constantly repeated over 22 episodes and tracked into other, later Filmation cartoons – and abysmally choppy pacing.

    [4] Frosty the Snowman was produced IOTL as a hand-drawn animated special, as opposed to most of Rankin-Bass’ other specials, which were indeed filmed in stop-motion. Frosty was in fact the first Rankin-Bass special to utilize hand-drawn animation IOTL, and the studio largely returned to the technique only for subsequent Frosty specials (and for serial productions, up to and including the beloved ThunderCats series of the 1980s).

    [5] IOTL, The Wizard of Oz did not receive a film release until 1986 in Japan, the same year an otherwise-unrelated serial adaptation of the books began to be broadcast (it was eventually dubbed and brought over as well, and is actually the very first anime this editor ever remembers watching).

    [6] The Wizard of Oz was among the earliest classic films to be released on home video (as it was IOTL: on VHS and Beta in 1980, and Laserdisc and CED in 1982), in time for the film’s 40th anniversary in 1979. (On two CED discs; later pressings – remember, CEDs have short lifespans – were released on a single disc, though otherwise unaltered, starting in 1981.) For the 45th Anniversary Special Edition IOTL, in the admitted highlight of the special features, Bolger and Hamilton are recruited for traditional “talking head” style interviews about the film, for which they sit separately – and together (as the two were close friends in real life).

    [7] As opposed to, of course, Studio Ghibli. Both names have Italian origins – Ghibli was the nickname for the Caproni Ca.309, a reconnaissance plane used by the Regia Aeronautica in WWII, in reference to the Arabic name (as the plane flew predominantly in Libya) for one of the prevailing Mediterranean winds, the Sirocco. The intention in using this name was to signal that the studio would “blow a new wind through the anime industry”. ITTL, the term aurora is used instead (a reference to the Italian – and, of course, the Latin – word for dawn), to represent a “new dawn” for the anime industry instead, a reference also to Japan’s status as the Land of the Rising Sun. Artistically, of course, the aurora representing light (and colour, and vision) is also more meaningful. Linguistically, an advantage is that the word aurora is more accurately phonetically translated into Japanese (as “orora”) than ghibli (as “jiburi” – in Italian, that’s a hard “g” sound).

    [8] It has long been a popular belief that the Masters of the Universe toy line began life as an adaptation of Conan, and was only changed at the last minute to an “original” line because of the adult content in the Conan franchise. ITTL, the Masters of the Universe line does not exist (because Mattel never developed it as their latest desperate attempt to catch in a bottle the lightning they’d missed out on by rejecting Star Wars) and a straight, sanitized Conan adaptation airs instead. “Adult” properties being adapted into kids’ cartoons was certainly not uncommon in the 1980s.

    [9] By contrast, the 1986 Wizard of Oz anime IOTL was heavily censored by the Western studio which bought the distribution rights, pulling the classic Speed Racer technique of completely obfuscating the show’s origins and passing it off as American-made (with about as much success).

    [JDB] John Drew Barrymore – IOTL, the father of Drew, ITTL her identically-named “sister”, since she was born in 1975, well after the POD – was originally cast as Lazarus in the Star Trek episode “The Alternative Factor”, but failed to show up for work when filming began in mid-November, 1966 (shortly after the POD, but I’ve decided to keep the event intact ITTL – simply because we can’t butterfly every howler away to our advantage). As a result, Desilu filed a grievance and the SAG suspended Barrymore for six months.

    ---

    Thanks, as usual, to e of pi for assisting in the editing of the update, and for providing the horrendous pun of a title.

    And yes, you finally have your answer as to how Star Trek will continue ITTL – as an animated series! But one made in the 1980s instead of the 1970s, and therefore with a much, much better shot of being a quality (and long-running) show, even notwithstanding the absence of Kirk, Spock, Bones, Scotty, Uhura, and the rest (barring occasional guest appearances).

    And yes, Japan Takes Over The World is still very much a trope ITTL. The miraculous recovery of the Japanese economy was very much a reality even at the POD. It’s difficult for me to imagine a set of circumstances which would derail that growth before it happened IOTL. Fortunately for me, the rise of Japanimation (the term anime would not come into vogue in English-speaking countries until the 1990s) in the West is an excellent demonstration of how economic power and cultural power are closely intertwined (as are video games, for that matter).

    Go ahead, ask me why I bothered to discuss the plot points in The Wizard of Oz, the one film virtually everyone (in America, at least) has seen – but then I am nothing if not thorough :p

    For those of you who can’t quite picture “Synchro-Vox” from my explanation, I implore you to watch this pitch-perfect parody of the “technique”. (I think it may be the best thing Pixar has ever made.)
     
    It's All Fun and Games, Until...
  • It’s All Fun and Games, Until…

    For three consecutive cycles, the prestige of the Games of the Summer Olympiad had been marred by political scandal: the 1972 games in Munich were disrupted by the hostage crisis and subsequent massacre; the 1976 games in Montreal were scandalized by rampant overspending on infrastructure projects, many of which had to be hastily completed in time for the opening ceremonies; and the 1980 games in Moscow were overshadowed by the Soviet invasion of Afghanistan, which would have triggered a boycott had it not been for President Reagan’s backchannel negotiations. Although the last of these public relations nightmares had not yet taken place by 1978 – when the host city for the 1984 Games was chosen – Montreal’s massive hemorrhaging of public funds (the Government of Canada had to bail the city out, resulting in a substantially increased federal deficit) scared away many would-be hosts, increasingly wary of the growing commercialism (and expense) of the Games.
    Only the two superpowers – the United States and the Soviet Union – appeared even remotely interested in hosting the Games of the XXIII Olympiad, but there were political obstacles in the way. As of 1978, an American city – Denver – had already hosted the (Winter) Olympics just two years before, meaning that both the Summer and Winter Games of 1976 had been hosted in North America. In addition, Vancouver-Garibaldi was the only candidate to host the 1980 Winter Games, meaning that three out of four Games in only two cycles would be hosted in North America. Therefore, both New York City (which had never hosted the Games before, and was hobbled by a reputation for crime and squalor besides) and Los Angeles (which had hosted the 1932 Games) were rebuffed by the International Olympic Committee when they expressed an interest in bidding for the 1984 Games. A Soviet city was obviously right out, since the Soviets would be hosting the 1980 Summer Games. For a time, there were no contenders at all – to the point that the IOC was willing to reconsider its “request” that the United States not mount a bid – before Tehran, the capital of Iran, put in a bid of its own. [1]

    Geographically, Iran was an attractive host country for a number of reasons. This would be only the third Summer Olympiad to be hosted in Asia (after Tokyo 1964 and Moscow 1980), and the first Games of any kind to be celebrated in the Middle East. [2] Considerable doubts remained – and were vocally expressed – about Tehran’s (and Iran’s) ability to host the Games, given the repressive and undemocratic state regime, which faced vigorous popular opposition. However, the IOC’s hands were tied – there were effectively no other candidates to host the Summer Games.

    By contrast, the bid for the 1984 Winter Olympics was vigorously contested between Sarajevo, Yugoslavia, and Sapporo, Japan. The recency of the 1976 Summer and Winter Games, both held in North America, was the deciding factor in awarding the 1984 Winter Games to Sapporo, ensuring that both Games of 1984 would be celebrated in Asia – the first time that this would be the case for the world’s largest and most populous continent. [3] Notably, although the prospect of Los Angeles hosting the Summer Olympiad had been rebuffed partly to avoid a repeat host, Sapporo had itself hosted the Winter Olympiad as recently as 1972. American sportswriters were just about the only people who cared enough about this latest bit of hypocrisy on the part of the IOC to make note of it, however. The Reaganomics of the era were such that many commentators believed the United States had dodged a bullet in avoiding having to fork over funding to support an Olympic bid, especially when it became clear that winning that bid (and therefore actually having to host the Games) was a near-certainty.

    In stark contrast to the heated (in more ways than one) Summer Olympics which would come later in the year, the Winter Olympics at Sapporo proceeded much more smoothly, from the preparation – many of the facilities constructed for 1972 were reused in 1984 – to the popular response. Although they were not hosting the more prestigious Summer Games, the Japanese government still sought to show off how far the country had come in the four decades since it had been left in ruin by World War II. Japan was now the second-strongest economy in the world and, as many increasingly feared, might possibly threaten the United States for #1 in the not-too-distant future. However, their economic prowess did not translate so easily to athletic prowess – particularly on the ice and snow. Though Japan had won all three medals in the normal hill ski jumping event in 1972, they failed to place in a single other discipline at those games, and won nothing at Denver in 1976. However, they staged a partial comeback in 1980 at Vancouver, winning silver and bronze in the normal hill event. [4] The Japanese Olympic Committee wanted very much to sweep the normal hill event once more in 1984, again on their home turf. The JOC also set ambitious goals for men’s and women’s figure skating, in which they had performed well (if not well enough to place) in the past, and for speed skating. That said, they fielded a full complement of athletes, including an ice hockey team, though they were well aware of being a dwarf among giants in what was arguably the marquee event of the Winter Olympics. To secure the funding necessary to develop these teams, the JOC turned to corporate sponsors for the Sapporo Olympics, which drew the ire of many in the IOC, and of Olympics “traditionalists” who favoured the intended “amateur spirit” of the Games. 
[5] One of the primary sponsors was, unsurprisingly, Sapporo Brewery – although for an unexpected reason: the Olympiad happened to coincide with a planned expansion by that firm into the US market, and the brewery’s executives were well aware of the tremendous visibility the Winter Games enjoyed in the US. This strategy met with some resistance stateside, however; moral guardians questioned the propriety of an alcohol manufacturer sponsoring a family-friendly event, and cultural nativists and protectionists bristled at yet another facet of the growing “Japanese Invasion”. [6] “First they get us to drive their cars, now they want us to drink their beer,” commented the American Party strategist Pat Buchanan.

    As was typical during the Winter Games, sportswriters were abuzz with regards to the “main event”, ice hockey. Canada, in the famous “Miracle on Ice”, had won the gold medal in 1980, and the Canadian government and media were high on their chances of defending their title and scoring a repeat victory. Competition was fierce, however – primarily from their most storied rivals, the Soviet Union. 1980 had been an absolution for the tragic loss at the 1972 Summit Series, and the Politburo was determined to regain the upper hand in this, one of the more literal fronts of the Cold War. Other countries on both sides of the Iron Curtain were eager to prove their mettle, however, such as the United States, Czechoslovakia, Sweden, and West Germany. The host country of Japan had never placed better than 8th at the event (in 1960), and were largely considered a non-entity, having finished 12th in the qualifying rounds – they were then eliminated in the semi-final round-robin.

    Advancing to the final four were the Soviet Union and West Germany from Group A, and Canada and the United States from Group B. Czechoslovakia, Sweden, Finland, and Poland rounded out the Top Eight. West Germany’s finish in the top four was considered a fluke against tough competition in Sweden and Czechoslovakia, and indeed they would be shut out in each game against their three opponents. Canada, sadly, lost not only to the Soviets in their decisive rematch (scoring a goal in the third period to spare themselves the ignominy of a shutout), but also to the United States, and took home the bronze medal. The final game scheduled was that between the Soviets and the Yankees, which would determine the winner of the gold medal. Americans had a complex relationship with ice hockey, but as a rule they always took an interest in any event where they were deemed to be the underdog – especially against their arch-nemesis. As soon as mention of “another Miracle on Ice” faded from Canadian newspapers, the phrase began popping up in American ones, confidently predicting that a Western power would defeat the Soviets for the second consecutive Olympiad – and at a game which the Soviets had so dominated in the 1960s and 1970s, no less. However, it was not to be. The Soviets won the gold medal match by a score of 3-2; the United States was awarded the silver, still their best performance in the ice hockey event since 1972, when they had also won the silver. Canadians who were heartbroken about losing the gold so soon after having finally won it back could at least take solace in having won something, and although none of them were particularly thrilled at the Americans finishing ahead of them, at least they hadn’t lost to the Soviets in that heart-pounding nail-biter of a final game. Besides, the Canadians had other wins to celebrate.

    The first was a major coup in scoring recognition for curling, which had been an official Olympic sport at the 1924 Winter Games in Chamonix, France – and never since. It had been played as a demonstration sport in 1932, and then again in 1980 at Vancouver, and its great success there – unsurprisingly, as Canada was the heartland of the sport despite its creation in Scotland (no doubt due to the large Scottish-Canadian population) – spurred the IOC to add the sport to the program in an official capacity in time for the 1984 Games. [7] To the surprise of many, curling attracted some curious onlookers from the United States, where the sport had never enjoyed much native popularity – attributed to its unique novelty (it was the only new sport added to the program in the 1984 Winter Games), approachability (unlike more intensive winter sports such as speed skating or alpine skiing), and (of course) the fact that the American team was doing quite well in the sport, and received plenty of ink and airtime as a result – depicted, of course, as the plucky underdog against such world powers as Canada and Great Britain (technically the defending champions, having won the gold in 1924). The quaint etiquette surrounding the sport (called the “Spirit of Curling”) also won its share of admirers. For all of these reasons, the surge of interest in the sport lingered despite the US ultimately finishing only third, taking home the bronze medal. However, as was the case with most fads, public interest would eventually ebb. Great Britain, meanwhile, failed to defend its title, taking home the silver – and Canada won the gold medal, a victory which would help to soothe the bitterness of the lost hockey gold. (As would Canadian teams dominating the NHL throughout the 1980s.)

    Japan failed to win any of the big-ticket events, but repeated their feat of sweeping the normal hill event of the ski jumping discipline, also winning the gold in the large hill event, two silver medals in speed skating (500m and 1000m), and a bronze in women’s figure skating – the two golds, three silvers, and two bronzes won by Japan were good enough for that country to finish in eighth place overall. As usual, the leaderboard was dominated by the three perennial athletic superpowers: East Germany in first place (with 25 medals and nine golds – more than any other country on both counts), the Soviet Union in second, and the United States in third. Canada won four gold medals, finishing fourth for the second Olympic Games in a row, following their triumph in Vancouver – ahead of the three Nordic countries: Finland, Sweden, and Norway. Switzerland and West Germany rounded out the Top 10. All in all, despite the Cold War tensions being fought out by proxy in the hockey arena, the Winter Olympics were, as they had ever been, a more congenial, sportsmanlike variation on the more cutthroat, vicious Summer Games – a contrast which would perhaps never be more palpable than in 1984.

    Tensions were high in late-1970s Iran, with many critics of the Pahlavi dynasty viewing the upcoming Tehran Olympics as yet another attempt by the Shah’s regime to glorify themselves at the expense of the common man (akin to the 2,500-year anniversary celebrations of the monarchy in 1971). However, the unpopular (and secular progressive) Shah Mohammad died of cancer in 1980, and was succeeded by his more devout, traditionalist son, Reza II, who granted a new, democratic constitution which placated all but the radical forces at the extreme left and right of the Iranian political spectrum. Notably, his government mitigated the customary emphasis on ethnic nationalism that had defined the Pahlavi dynasty, instead choosing to focus on the unifying force that more accurately defined the vast majority of his people: Islam – about 99% of the Iranian population was at least nominally Muslim, compared to a mere three-fifths who were of Persian ethnicity.

    That said, though the constitution recognized Shia Islam as the state religion, it officially provided for the five most prominent minority creeds in the Empire – Sunni Islam (at over 5% of the population, by far the largest religious minority group), Zoroastrianism (the oldest religion in Iran, and one native to the region), Christianity (mostly Oriental Orthodox), Judaism, and the Baha’i faith. State tolerance (up to and including special representation in the legislature) of these last two groups rankled religious fundamentalists; the Ayatollah Ruhollah Khomeini, their spiritual leader (in exile since 1964), was a vocal opponent of the Baha’i faith, and considered the new Shah’s tolerance of it to be heretical to the point of invalidating his supposed devotion to his faith, and all of the conciliatory steps he had otherwise taken to overturn the secularism of his father. [8] However, most Iranians – despite their personal distaste for Baha’is, and disapproval of their being tolerated by the state – did not agree, and popular demonstrations against the Pahlavi dynasty had abated so completely that US President Glenn was able to announce a withdrawal schedule for American troops from Iran in 1981; by the end of 1982, only a specialist cadre of trainers and military intelligence personnel remained. By this time, the ranks of the Iranian military were sufficiently loyal to the Shah and his government that they were perceived as able to root out and quell isolated terror cells and dissidents loyal to the Ayatollah’s ideals on their own.

    Shah Reza II, for his part, cannily promoted the upcoming Tehran Olympics as a celebration of regional solidarity and a chance to demonstrate to the world the prowess of Islamic athletes and athletics – a strategy which he claimed would unite his nation, but which just so happened to appeal to hundreds of millions of people, spanning from Morocco to Indonesia. Iran was already well known on the world stage for its wrestling team; other countries sought to carve out similar niches for themselves, including Pakistan, known for their field hockey team, and Morocco, known for their athletics team. In addition, many countries in the Islamic world were renowned for their talents in boxing, wrestling, and weightlifting.

    Tehran, being in northern Iran, was near enough to the Caspian Sea (about a three-hour drive from the Olympic Village) that the sea would play host to most of the boating events – giving the neighbouring Soviet Union (with whom Iran shared the Caspian Sea) an additional edge. [9] Ironically, no Iranian competitors in any of those events would make it past the qualifying rounds. [10] The National Olympic Committee of the Imperial State of Iran decided to focus its limited resources (most of which, after all, had been diverted to fund the completion of the planned Olympic infrastructure in time for mid-1984) on its areas of traditional strength, aware that, as in past Summer Olympiads held in smaller countries, the host would be dwarfed in the medal count by traditional titans such as the United States, the Soviet Union, and East Germany. The national committee was well aware of the prospect of becoming the first host nation in the history of the Summer Games not to win a gold medal, and strove to prevent that nightmare scenario. [11] However, and perhaps as a result, other nightmare scenarios were not as adequately prepared for as they might otherwise have been…

    The Games of the XXIII Olympiad were officially opened by Shah Reza II in July of 1984, to a generally warm reception from the people of Tehran, and from Iranians in general. The international community responded well to the opening ceremony, seeing it as a symbol of the triumph of liberal democracy and constitutionalism in a country which had successfully been brought “back from the brink” in the years since it had been awarded the Games. However, security was heightened, and for good reason; the Islamic world was mostly happy to have the Games taking place within its borders (and Islamic iconography was given equal space with traditional Persian cultural displays during the opening ceremony), but the dissenting voices could be heard loud and clear.

    Ayatollah Khomeini, though tolerant of Iranian Jewry (unlike some of his fellow fundamentalists), was openly hostile toward Zionism and despised the cordial relationship between Israel and Iran. He supported calls for Iran to ban Israel from participating in the Games at Tehran; however, only the IOC had this authority, and refused to exercise it. Iran, indeed, had no legitimate reason to bar Israel from the Games even if it could have done so; it had diplomatically recognized Israel for decades, and was not a member of the Arab League, having no ethnic solidarity with that organization’s cause. Factions within many of the Arab nations lobbied for a boycott of the Tehran Games, but this would fizzle for many reasons; for one, the Arab nations and Israel had competed alongside each other without difficulty at past Olympiads, which many diplomats and commentators did not hesitate to point out. [12] The lack of any sanction by Iran against Israel only served to fuel the fire.

    Mindful of the lessons of 1972, organizers kept security at the opening ceremony very tight (the Israeli delegation was placed under heavy guard), and concessions were made to appease the fundamentalists (no Baha’i athletes competed for Iran, though several did for other countries, and they too were placed under heavy guard). Both groups being placed under protection unfortunately helped to reinforce the connection between the Baha’i faith and Israel in the popular imagination. Khomeini in particular exploited this connection in his speeches, which were broadcast on Iraqi radio and could easily be heard across the Iranian border, calling for popular revolution and acts of terror against the state regime. His pleas were heard, and one of his devotees, operating in a sleeper cell out of Tehran, decided to answer them.

    The method used to carry out this act of terror was suicide bombing, a tactic to which Western audiences had last been exposed through the audacious kamikaze attacks of Japanese fighter pilots in World War II. [13] The event chosen as the target for this attack was women’s fencing, which seemed to hit every single ideological bugbear for the terrorists and protesters: Israel was competing (and, notably, not a single other Middle Eastern country was), one of the fencers (an American) was Baha’i, and the sport itself (with its martial overtones) was deemed by them to be unsuitable for female competitors. Therefore, on the first day of August, 1984 (a Wednesday), the Tehran Plot was executed. A lone suicide bomber gained entry to the event (as was later discovered, with the help of an inside accomplice, who fled the scene and was apprehended by the authorities in Tabriz, en route to a safe haven in Iraq). But the damage had already been done; the bomber struck when the American (Carol Wilson) and the Israeli (Tamar Dahan) were competing against each other, throwing himself between them and detonating his bomb, killing all three of them instantly. Miraculously, nobody else was killed, though the referee and many of the other competitors (seated on the floor just a few feet away) were injured, some seriously, by shrapnel. The arena was evacuated and the event was scrapped entirely by organizers; ultimately, no medals were awarded. [14]

    The rest of the Summer Games proceeded without incident, for whatever that was worth, and the closing ceremonies (which, much like the opening ceremonies, were placed under heavy guard) included a lengthy tribute to Wilson and Dahan – however, some scattered boos and jeers could be heard from the audience. The controversy at Tehran, coupled with Munich 12 years before, and the great expense of Montreal, had made the Olympic Games a far less desirable and prestigious event than they had been in years past. Los Angeles had already been chosen to host the Games in 1988 – the only other city which put in a bid was Seoul, the capital of South Korea. The IOC was frankly relieved at avoiding the risk of political turmoil that came with hosting the Games in a dictatorship.

    The Soviet Union finished first in the medals tally, dominating (as predicted) the boating events set on the Caspian Sea. East Germany finished second, and the United States third. Iran won three gold medals in wrestling – all within the freestyle category – and two in weightlifting; the five golds overall were good enough to put the host country within the Top 10 by gold medals won. Neighbouring Pakistan won the gold medal in field hockey – ahead of arch-rival India, which won the silver. Morocco won two gold medals, both in athletics. As had been touted and predicted in propaganda preceding the Olympics, these Games would force the rest of the world to take notice of the Middle East – though perhaps not for all the right reasons…

    ---

    [1] Tehran was indeed the only other city to bid for the 1984 Summer Olympics besides Los Angeles IOTL, though it quickly withdrew as the pressures leading to the Iranian Revolution continued to build. ITTL, of course, the presence of sufficient American troop coverage in Iran during a critical period has provided enough (enforced) stability for the bid to continue; this, coupled with the larger number of North American Games in the years preceding 1984, cleared the way for Tehran (IOTL, only two such Games were held in North America following Mexico City 1968 – Montreal 1976 and Lake Placid 1980).

    [2] No Olympiad has ever taken place in the Middle East IOTL. For obvious reasons, the Winter Games are never likely to be celebrated there, leaving the Summer Games as the only realistic prospect.

    [3] The host city of the 1984 Winter Olympics IOTL was Sarajevo, winning in the second round over Sapporo by three votes, 39-36, in a come-from-behind victory. (Also-ran Gothenburg, Sweden, was eliminated in the first round with 10 votes to 31 for Sarajevo and 33 for Sapporo.)

    [4] Japan only won the silver medal in that category at Lake Placid 1980 IOTL, tied with East Germany for second place behind Austria (and therefore, no bronze medal was awarded). Considering how vanishingly unlikely such a tie would be over a dozen years out from the POD, it doesn’t happen ITTL and instead the fourth-place finisher for Japan in that event finishes third instead, leaving poor Manfred Deckert of East Germany in fourth, and without his only Olympic medal IOTL. (Considering how many medals East Germany has won, IOTL and ITTL, I don’t think depriving them of one will be much of a loss.)

    [5] The (Summer) Olympic Games of 1984 IOTL, in Los Angeles, were also heavily sponsored, generally reckoned as the first such Games with extensive corporate sponsorship. Not coincidentally, those Games were incredibly financially successful, Montreal having served as a cautionary tale. Since then, many other Games (particularly the two since held in the US, Atlanta 1996 and Salt Lake City 2002, along with Calgary 1988 in Canada) have also been heavily sponsored, always with opposition from certain corners of the IOC, but always to great financial success.

    [6] Sapporo Brewery signs, posters, and billboards plastered throughout the event always include the English-language slogan “Drink Sapporo in Sapporo!”, due to the Japanese cultural fascination with the English language. Of course, these slogans are only an appendage to a far more verbose Japanese-language message, but keep in mind that most American viewers can only read the English.

    [7] Curling was restored to the Winter Olympics program in 1998 IOTL, at which time it was introduced as both a men’s and a women’s event. ITTL, it is only recognized as a men’s event – 1998 was something of a banner year for women at the Olympics, that being the first year in which women’s ice hockey was contested as well. Worth noting is that, after 1932, curling was not played as a demonstration sport again until 1988 IOTL, also at a Canadian Games (in Calgary). Also worth noting is that the Games at which curling makes its grand return were celebrated in Japan both IOTL (Nagano 1998) and ITTL (Sapporo 1984).

    [8] Khomeini spent the vast majority of his exile (following a year-long sojourn in Bursa, Turkey) in the holy city of Najaf, Iraq (fittingly, given his advocacy of Sunni-Shiite reconciliation, it was a holy city for both major denominations of the Islamic faith). IOTL, he departed that city in 1978, less than a year before the Iranian Revolution, on the advice of then-Iraqi Vice-President Saddam Hussein (yes, that Saddam Hussein), residing in Paris until such time as it was clear that he could enter Iran without opposition. ITTL, he remains in Najaf through 1984, continuing to advocate the overthrow of the monarchy and the establishment of an Islamic Republic, preferably with himself at its head – and vehemently opposing the Tehran Olympics, a symbol of Shahist and pro-Western decadence.

    [9] Despite the narrator’s description, “boating” is not a discipline at the Olympics, but rather a group of disciplines: canoeing and kayaking, sailing, and rowing. It is distinguished from the discipline of aquatics (swimming, synchronized swimming, and water polo) in that the events take place outdoors, and not in an Olympic swimming pool – and therefore requires an appropriate locale. (This is more of a problem in the Winter Olympics, wherein most events are on-location.)

    [10] Despite being a maritime nation bordering not only the Caspian Sea but also the Persian Gulf, Iran never fielded any competitors in the boating group of disciplines IOTL until 2000 (in canoeing) and 2008 (in rowing) – and none at all in sailing.

    [11] IOTL, Canada earned that dubious distinction at Montreal 1976, which remains the only time that a Summer Olympics host nation has failed to win a gold medal (something that has happened on several occasions at the Winter Games, including again in Canada at Calgary 1988). ITTL, therefore, the possibility remains very real for Iran to get the black eye instead.

    [12] Also, no large-scale boycott of the Olympic Games had ever been attempted by this point ITTL, compared to the two consecutive boycotts of OTL (followed by a third in 1984). Many diplomats and commentators outside the Arab world made clear that the Olympics were to be seen as apolitical (they weren’t, of course, but they were to be seen that way), and accordingly, such disputes should be set aside in the name of sport.

    [13] IOTL, the first suicide bomber as we understand the term today (as opposed to previous incarnations of the same general idea, such as the kamikaze pilots of World War II) was Hossein Fahmideh, a 13-year-old Iranian boy who threw himself under an Iraqi tank with a grenade during the opening phases of the Iran-Iraq War in 1980. However, suicide bombers would not gain notoriety in the West until 1983, during the Lebanese Civil War.

[14] As far as I am able to determine, medals going unawarded for a contested event is unprecedented in the history of the modern Olympic Games – it has never happened IOTL.

    ---

    Thanks to e of pi for assisting with the editing of the update, as usual.

    I realize that the video game update was originally scheduled to be posted ahead of the Olympics update, and I apologize for disappointing the surprisingly large number of you who wanted to see that one first – however, this one was much further along and I felt the need to end the involuntary hiatus sooner, rather than later. (Relatively speaking, that is.)

    Also, every now and again I feel the need to remind my readers that I’m not writing a utopia!
     
    A Challenger Appears!
  • A Challenger Appears!

    I think there is a world market for maybe five computers.

    Apocryphally attributed to Thomas J. Watson, Chairman and CEO of International Business Machines (IBM), 1943

    Whether at the office, at school, or at home, the question now appears to be who doesn’t need a computer, and companies are lining up to prove they have the best ‘computer for everyone’…

    Excerpted from “The Macromarket for Microcomputers,” as featured in the July, 1983 issue of Circuit Magazine [1]

Desilu Productions had no shortage of cash cows in the 1980s, but perhaps one of the most unexpected – at least in the bemused opinion of the burgeoning conglomerate’s owner, Lucille Ball – was its share in Syzygy, a video game company which had licenced many of Desilu’s television properties and developed video games adapted from them – and in so doing, created an incredibly profitable new revenue stream for their parent studio. Star Trek had been adapted into several different video games by this point, first for the arcade, and then for the home VCS – followed by the more advanced VCS II. The success of the Star Trek (tabletop) role-playing game then inspired a new layer of synergy for Syzygy. Immediately after the RPG had been released, amateur programmers got to work writing a version of the game that could be played on the emerging microcomputers of the era. Needless to say, these versions could not be sold legally; however, as with many other Star Trek fan works, distributors were not prosecuted so long as plausible deniability could be maintained (and so long as a profit was not sought in said distribution). Indeed, when those in charge at Syzygy became aware of the Star Trek RPGs produced for personal computers (through, longstanding rumour had it, a copy smuggled into the Syzygy offices and played on Nolan Bushnell’s microcomputer), they suggested creating an officially licenced version of the game for their HCS microcomputer – a more sophisticated and versatile variant of their dedicated VCS console. This suggestion was accepted by Desilu’s licencing division, and The Official STAR TREK Role-Playing Game for SYZYGY HCS was released shortly thereafter – programmed in only a few weeks by one of the several fans of the tabletop RPG on staff. It was the first and only HCS game not to include a graphical interface – the game shipped on floppy disks which were filled to the brim with flavour text and had no room for graphics. [2]

The HCS was among the most powerful microcomputers on the market, thanks in large part to the TMS9900 16-bit microprocessor at its core, which it shared with the VCS. This legacy provided the infrastructure and economies of scale necessary for the HCS to be comparably priced with other high-end machines, with the additional edge of being extremely user-friendly. This was a result of the guiding design principles of the chief architect of the HCS, Steve Wozniak. He had first become involved with Syzygy in 1973, when he was informally commissioned by an employee, Steve Jobs, to reduce the number of computer chips in the Breakout arcade cabinets. Syzygy was offering their employees $100 for each chip they could eliminate, and Jobs had promised Wozniak that he would split the bonus – without disclosing the amount of said bonus. Wozniak had eliminated 50 chips, but Jobs gave him only $350 – a mere 14% of the amount to which he was rightfully entitled under their arrangement – having insisted that the total bonus was only $700. He justified this by claiming that the stripped-down design was incompatible with a coin slot or scoring mechanism (both vital for arcade play). Still, Syzygy saw potential in the design despite its limitations, and no less an authority than Nolan Bushnell sought to confer with Jobs about it. As Jobs could not possibly hope to answer his questions, Wozniak was invited along – and it wasn’t long before the truth emerged. [3] Of the seven deadly sins, the only one which Jobs possessed in greater abundance than avarice was pride, and so he resigned from Syzygy before Bushnell could fire him, teaming with his fellow deserter Ronald Wayne to form a new company, Apple Computer. [4] This company would be forced to change its name following a successful lawsuit against it by Apple Corps, the Beatles’ record label – legal costs capsized the nascent company and drove Wayne into bankruptcy. 
[5] (Jobs, who was in his early twenties, had no appreciable assets which could be seized by creditors.) Wozniak was hired to take Jobs’ place at Syzygy, quickly rising through the ranks to become their most prominent technical mind. He was the principal architect behind Syzygy’s home computing hardware during the company’s Golden Age. [6]

The Home Computing System, Wozniak’s baby, was introduced in 1981, at a time when much of its competition in the microcomputer market (especially on the lower end) was still powered by 8-bit microprocessors. Thanks to Wozniak’s genius for streamlining, along with vertical integration, the computer sold with the introductory price of $1,111. [7] By this time, the VCS II was selling for about $200 – less than one-fifth the price, as the bells and whistles associated with the HCS added up to a lot. The HCS was essentially a very large keyboard (in roughly the same shape as the VCS consoles, though with a hard beige plastic casing as opposed to faux-wood paneling) with four ports for joystick controllers, as well as a port for the monitor (sold separately), and an audio output (either speakers or headphones). [8] Unlike most microcomputers of the era, the HCS had a graphical interface – instead of exclusively entering text commands, the user could employ their joystick controller to move an arrow icon across the screen, then press a button while the arrow hovered over another icon to “hit” it and run the associated program or function. [9]

The HCS competed with a number of other microcomputers which were marketed for personal use – most notably the Commodore 64, cheaper than the HCS by a considerable margin but far less powerful and slower to boot, with a text-only interface (though graphical programs – and games – could be played on it). IBM, long a titan in the industry (where it was affectionately known as “Big Blue”), sold their “Personal Computer” or PC (which, despite the name, was intended primarily for business use) for more than the HCS, focusing on office applications such as word processing and spreadsheet technology. [10] IBM did not develop this software in-house – instead outsourcing this development to the prominent Tippecanoe Software company, headquartered in West Lafayette, Indiana, which created both the first widely-used integrated “PC Office” suite of programs and the dedicated “Personal Computer Operating System”, or PCOS. [11] Along with the Syzygy HCS, the PC was one of the first microcomputers that did not use a derivative of the BASIC programming language – first developed in 1964 – as its primary interface (though both included on-board BASIC interpreters).

Syzygy, having two parallel lines of hardware, marketed them as competitors to the two different types of microcomputers. “Budget” lines with spartan features and technological limitations – a market dominated by Commodore – were competitors of the VCS (which eventually received a keyboard peripheral, though never a “true” native operating system in the classical sense). More lavish hardware with extensive software suites and cutting-edge technology competed with the HCS – primarily the IBM PC. The two were neck-and-neck on the sales charts through the early-1980s; Syzygy marketed their HCS more toward home hobbyists, whereas the IBM PC was sold to commercial customers. Institutional buyers were split down the middle – schools and libraries tended to prefer the HCS due to its ease of use and the larger library of games available, but bureaucratic offices naturally preferred the IBM “PC Office” suite – which included top-of-the-line word processors and spreadsheets (with graphical printout capabilities!). However, over time, the battle lines began to blur – the team of programmers at Syzygy quickly developed an in-house knockoff of the PC Office suite (called “SyzygyWorks”), whereas Tippecanoe Software’s experience developing games (including for the VCS) helped to stock the PC’s game library – and they also eventually co-developed an equivalent to the Syzygy point-and-hit interface, which was codenamed Wabash after the river which flows through West Lafayette (though it was publicly known as PCOS v.2.0).

However great the strides in electronic gaming had been in recent years, the contemporary rise of the role-playing game proved that tabletops could still capture the interest and imagination of even the youngest generation of players. Surprisingly, the surest evidence of this phenomenon did not originate stateside, but in Japan, a country known for its booming economy and otherwise-slavish devotion to the hottest technological trends. One of the hottest hobbies of the 1980s – not just in the Land of the Rising Sun but eventually worldwide – was the product of a company founded in the 1880s…

The Nintendo Playing Card Company (Nintendo being a Japanese phrase meaning “leave luck to heaven”) was established in 1889 in Meiji-era Japan. The company originally manufactured hanafuda cards, local playing cards which were used in a wide variety of games, similar to the 52-card French deck standard in Western play. However, the profitability of this line of business declined in the years after World War II, forcing the company – under the stewardship of its leader, Hiroshi Yamauchi – into other avenues. Most of these were radical departures (including, but not limited to, “love hotels”, taxi services, food manufacturing, and television production) but the company would eventually find lasting success aping the Danish toy behemoth, LEGO, with their N&B Blocks. [12] Nintendo did not shy away from direct comparisons of the two products, believing that theirs came out more favourably due to the larger variety of distinct shapes. LEGO, on the other hand, chose to sue for copyright and trademark infringement. (At about the same time, they also sued a Canadian company, Mega Bloks.) [13] However, LEGO would eventually lose the suit – though Nintendo found that their innovations worked far better on paper than in reality. The oddly-shaped N&B block designs (including their popular cylinder, cone, and dog-bone shapes) did not tessellate properly, limiting the potential of buyers to innovate on pre-planned designs with their own creations – a disadvantage not shared by LEGO pieces (and Mega Bloks, for that matter). Therefore, the N&B product line was revamped – pieces which did not tessellate were treated as ancillary, primarily for decoration and detailing, with the core product being composed of more standard geometric shapes, such as squares, rectangles, triangles, and hexagons. Nintendo also built on their previous history of licencing arrangements by seeking to produce playsets of established properties, primarily Disney (which was hugely popular in Japan). 
[14] N&B Block playsets of famous set pieces – including the home of the Seven Dwarfs from Snow White, Captain Hook’s ship from Peter Pan, and the spaghetti restaurant from Lady and the Tramp – became hot sellers for Nintendo. However, this did not completely sideline their traditional playing card business. In fact, their newfound yen for licences would culminate in Nintendo becoming the exclusive manufacturer of collectible baseball cards for Nippon Professional Baseball, an eminently logical move which paid off almost immediately – and even more so, in the longer term…

However, Nintendo wouldn’t make a name for itself stateside until the 1980s, when it introduced an innovative new children’s card game which, against all odds, became a worldwide fad. And it all started with a young beetle-fighting enthusiast and would-be entomologist named Satoshi Tajiri. Born in 1961, Tajiri had by the early 1980s begun writing and publishing his own magazine, Game Freak, focused on toys and games of all kinds. An avowed Nintendo fanboy, he dreamed of working for the company someday – a dream that came much closer to reality after he met a fan of his magazine, artist Ken Sugimori. [15] The two became fast friends and bonded over Tajiri’s brainchild – a game in which pocket-sized monsters (similar to the stag beetles of his childhood) fought one another in sporting matches. They named their concept Pocket Monsters. It became apparent that the fighting system that Tajiri had in mind was an RPG – a burgeoning genre even in Japan – as his pocket monsters would need to fight each other according to a structured system of rules. The brainstorm which enabled him and Sugimori to bring the game to Nintendo’s attention was that many of these rules could be included on the backs of playing cards. Indeed, the prototype presented to Nintendo executives was modeled directly on their NPL baseball cards. [16]

The Pocket Monsters battle system was based on the rock-paper-scissors model, in which each monster “type” (roughly equivalent to a class in conventional RPGs) had strengths and weaknesses against certain of the other types. This meant that no monster could possibly be strong or weak against all other monsters, and thus that there was no dominant strategy of the sort seen in other competitive games. For instance, the various “types” included fire (weak to water), water (weak to plant), and plant (weak to fire); in fact, each type was strong and weak against multiple other types. Plant was strong against earth, but weak to bird and insect types. Bird types were in turn strong against insect types – but weak to ice and lightning. The original generation of Pocket Monsters cards had twelve types: Fire, Water, Plant, Earth, Bird, Insect, Ice, Lightning, Fighting, Poison, Spirit, and Metal. Each type had multiple exemplars. [17]
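For the curious, the type-matchup logic described above amounts to a simple lookup table. Here is a minimal sketch encoding only the matchups the text actually names – the 2.0/0.5 damage multipliers, the function name, and the empty entries for the unspecified types are illustrative assumptions, not anything established in-universe:

```python
# Partial sketch of the fictional Pocket Monsters type chart.
# Only the matchups named in the narrative are encoded; the remaining
# types' matchups are left empty because the full chart is never given.
STRONG_AGAINST = {
    "Fire":      {"Plant"},            # plant is weak to fire
    "Water":     {"Fire"},             # fire is weak to water
    "Plant":     {"Water", "Earth"},   # water is weak to plant; plant beats earth
    "Bird":      {"Plant", "Insect"},  # plant and insect are weak to bird
    "Insect":    {"Plant"},            # plant is weak to insect
    "Ice":       {"Bird"},             # bird is weak to ice
    "Lightning": {"Bird"},             # bird is weak to lightning
    # Matchups for these types are not specified in the text:
    "Earth": set(), "Fighting": set(), "Poison": set(),
    "Spirit": set(), "Metal": set(),
}

def effectiveness(attacker: str, defender: str) -> float:
    """Damage multiplier: 2.0 if the attacker is strong against the
    defender, 0.5 if weak against it, and 1.0 otherwise."""
    if defender in STRONG_AGAINST.get(attacker, set()):
        return 2.0
    if attacker in STRONG_AGAINST.get(defender, set()):
        return 0.5
    return 1.0
```

Note how the structure automatically enforces the rule that no type can be strong (or weak) against everything – each type's set of favourable matchups covers only a fraction of the chart.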

To have a battle, the two players would first agree on the number of pocket monsters each of them would be allowed to bring to the fight. The official rules for Pocket Monsters called for each competitor to bring an equal number of monsters to any battle – a natural consequence of the win condition being the elimination of all of the other player’s monsters. This allowed for a fair amount of flexibility in determining the lengths of matches – a two-monster match would be enough for recess or a coffee break, whereas a dozen or so could last for an hour or more. (The standard match-up was nine-to-nine, inspired by the size of a baseball team – which also meant that only nine of the twelve types could be represented in a match, allowing for greater strategic variations in play.)

    Each player would be allowed to assemble their deck secretly, but once the match began, they would have to show their roster to the other player, facilitated by large full-colour illustrations of each monster on the obverse, or front, of the cards. On the reverse, as was the case with the baseball cards which had helped to inspire them, the cards featured capsule descriptions of the monsters, along with basic statistics (speed, to determine which monster attacked first, and hit points, which were represented in play by counter units) and the list of attacks each monster could make. However, some of these moves would need to be “unlocked” through defeating a certain number of enemy monsters over the course of a battle. This served to differentiate monsters within the same type category – some had a wide array of mediocre moves available to them from the outset, whereas others had exceptionally powerful moves that could only be accessed after defeating several opponents.

    In yet another inspiration from the sales and marketing of collectible baseball cards, only one card was sold in each pack. However, the content of these packs was concealed from the end consumer by opaque foil wrapping – as was the case with a box of chocolates, the purchaser had no way of knowing which of the 72 pocket monsters they were going to get. Many cards (especially the more powerful ones) were much rarer than others, and were often distributed only to specific geographical regions – usually those which were particularly far-flung from major urban centres. Lore abounded of that one hard-to-find card being shipped off to remote locations like Hokkaido. These rumours had unintended effects when the 1984 Sapporo Olympics brought a large number of tourists from Kanto and Kansai to the island, many of whom stopped by local card shops just in case – clearing out inventories and causing widespread shortages. This spoke to the sheer number of interested buyers, and Nintendo was already on its way to manufacturing the second generation of Pocket Monsters to meet the demand. The high degree of collectability naturally attracted consumers who had no interest in playing the game; indeed, the inability to know which cards were inside their original packaging before purchase led to a burgeoning secondary market, with cards carefully unwrapped and stored in plastic or laminated. The rarest of the cards sold for tens of thousands of yen. As a consequence, counterfeiting was rampant, and imitators clogged the shelves. [18]

    The 1984 Sapporo Olympics did help to introduce Pocket Monsters to a wider audience, as Nintendo was one of the main sponsors of the games. Pocket Monsters had already been exported to the Americas and Europe, as well as to Red China, by this time. In the particularly toyetic 1980s, they became a huge hit, especially when they were paired with ancillary licenced product, such as plastic models, plush toys, N&B Block playsets, and a manga and anime series whose English-language translation reached American shores that same year. A video game based on that anime was planned for release on SEGA home consoles in 1984, and localization into foreign markets was virtually assured.

SEGA was originally known as Service Games, a company formed in Hawaii in 1940. In 1951, after the war, the company’s owners decided to relocate their operations to Japan – which at that time was still rebuilding its industrial base and had plenty of cheap labour with the technical expertise needed to manufacture the jukeboxes, slot machines, and later arcade cabinets produced by the company. In 1965, the company officially changed its name to SEGA, a syllabic acronym typical of the Japanese language, and was one of a great many companies to fall under the corporate umbrella of none other than Gulf+Western – that conglomerate bought out the original owners in 1969, in a move by Charles Bluhdorn to expand into one of the world’s fastest-growing economies. [19] It was in direct competition with the Desilu-Syzygy tandem that Bluhdorn and his underlings agitated for SEGA to expand their video game operations to cover home consoles – the Syzygy VCS was a smash success from 1977 onward, and Desilu was raking in a fortune from their product licencing. 1977 was also the year of Journey of the Force – part of the same genre, loosely defined, as Star Trek, and one which featured a heavy “space dogfighting” element. Ironically, some observers noted, the Star Trek video game for arcades (which was ported for the VCS) would have made a much better Journey game. A transparent Star Trek clone bearing the Journey name made it to arcades on SEGA cabinets in 1978 – the game was still in development when Lucasfilm launched their infamous lawsuit against Paramount Pictures, but as the licence had already been awarded to SEGA (intra-corporate negotiations being what they were), the game was completed, and sold briskly. Surely, executives reasoned, a port made for a notional SEGA home console would sell briskly as well, with the added benefit of the consoles themselves selling briskly? 
George Lucas couldn’t touch a penny of the sales from those, so even if (heaven forbid!) Paramount were to lose his frivolous lawsuit, there could still be some profit in the Journey property after all.

As a result, SEGA began development on a home video game console in late 1978. The SG-1000, short for SEGA Game 1000 (given an “English” name by its Japanese designers due to the trendiness of that language in their culture), was launched in 1979, becoming the first home console introduced in Japan – and, after the Syzygy VCS, only the second in the world of any consequence. [20] Journey of the Force was a launch title for the SG-1000 – attempts were made to introduce the console into the American and European markets, though this did not happen before the verdict in the Trial of the Century came through in 1980, depriving SEGA of the seed money it desperately needed to expand (especially in recessionary times). However, Paramount never licenced the Journey game for the American home console market, in anticipation of this planned expansion – thus making it one of the earliest “exclusive” titles, in the modern sense of the term. (Belatedly, in 1984, Lucasfilm – having assumed control of all Journey of the Force licences – authorized VCS/HCS versions of the game.) SEGA itself was divested by what remained of Gulf+Western after the Trial of the Century gutted the organization (and an ill-timed fatal heart attack left it rudderless), sold to another conglomerate, this one based out of Japan, in 1984. This conglomerate almost immediately moved to release a new console (successive iterations of the SG-1000 proving increasingly inferior to the VCS II), and one which would be marketed in the rest of the world…

    ---

    [1] Circuit magazine is TTL’s version of Byte magazine, a microcomputing publication which ran from 1975 to 1998, and was known (and respected) for its broad editorial scope.

[2] The floppy disks in question are 5¼-inch disks, each capable of storing 140 kilobytes of information.

    [3] Jobs misleading Wozniak and depriving him of his rightful share of the money is all OTL, though Wozniak didn’t find out about this until ten years later – butterflies result in Bushnell spilling the beans much earlier ITTL. What might have been water under the bridge after a decade had passed and both Steves had made a mint from Apple is a whole other animal ITTL, and it mortally wounds their friendship before the fateful decision to leave Syzygy/Atari is made in the first place.

[4] Ronald Wayne formed part of the initial partnership in Apple Computer alongside the two Steves in 1975, holding a 10% stake in the company – which he sold less than two weeks later, for $800. (He would receive $1,500 the following year, after Apple had incorporated, in exchange for forfeiting any further claims against the company.) Had he retained his 10% share to the present day, it (and he) would be worth over $60 billion. However, he sold his interest IOTL for fear of exactly what capsizes him ITTL – he, unlike the two Steves, had assets which could be seized by creditors.

[5] IOTL, Apple Corps agreed to settle their lawsuit against Apple Computer in 1981, for what would be revealed decades later to have been a minuscule sum of $80,000. (Contemporary legal experts had estimated anywhere between $50 million and $250 million.) Why does the suit press on ITTL? Butterflies, mostly – and John Lennon being a greedy, ruthless shark. (More like “Give Me A Piece”, am I right?)

[6] After the collapse of Apple Computer following their loss in Apple Corps v. Apple Computer, Steve Jobs decided to change tack and eke out a career in sales. As of 1984, a banner year for his career ITTL as well as IOTL, he is an infomercial pitchman for his own highly successful direct retail company in the vein of Popeil Bros./Ronco. Admit it, the patented Steve Jobs lectures of OTL translate very well to the classic infomercial style of the 1980s: “One more thing…” is, after all, little more than a variant of “But Wait, There’s More!”

    [7] IOTL, the IBM PC released in 1981, with an introductory price of $1,565. The Apple III, the most direct analogue to the HCS released by Apple in this period, had an introductory price set at a whopping $4,340 (for the basic model) in 1980. With such a huge pricing gap, it’s easy to see why the PC became the industry standard, relegating the Apple to a niche it has enjoyed ever since.

    [8] Later production runs of the HCS would come in different finishes, including the “classic” faux-wood of the VCS, re-emphasizing the “it’s not just a video game, it’s a piece of furniture” aesthetic.

    [9] The point-and-hit interface of TTL is obviously based on the point-and-click interface popularized (though not created) by Apple IOTL – the TTL Syzygy interface is more simplified, however, because the HCS is not quite as powerful as the Macintosh, and it’s more based on video games since that’s what the hardware was designed to support.

[10] The main difference between the IBM PC of OTL and that of TTL is what makes it tick: the Intel 8088 (a 16-bit chip hobbled by an 8-bit external data bus) IOTL, and the fully 16-bit Intel 8086 ITTL. Although the 8086 is more expensive than the 8088 (which is why IBM went with the 8088 IOTL), here the more widespread use and acceptance of 16-bit technology (along with the HCS having a 16-bit microprocessor, as opposed to the 8-bit microprocessors used by Apple) forces their hand. As a result, the PC is a bit more expensive than IOTL – about half again the price of the HCS ($1,699 for a basic, no-frills model).

    [11] Widely known amongst detractors as the “POS” – and yes, the PCOS is indeed TTL’s equivalent of DOS.

    [12] N&B Blocks were 100% OTL, right down to the unusual shapes (which were, naturally, advertised as a selling point). Though they ultimately were not successful IOTL, they are immortalized in a level of Super Mario Land 2: 6 Golden Coins. (The final level in the Mario Zone contains a castle made out of them – yes, I bet you thought they were LEGO bricks when you were a kid too, just as I did.)

    [13] LEGO’s litigiousness is the stuff of legend, and Mega Bloks, an OTL company, is the most well-known target of their (failed) lawsuits.

    [14] This is well before LEGO caught a second wind with licencing, which is arguably what they’re best known for today. In a classic example of Newer Than They Think, the first licenced LEGO playset dates to as recently as 1999, with LEGO Star Wars (along with Winnie-the-Pooh), and was a direct result of the company experiencing a consistent decline in profitability over the previous several years.

    [15] All of this is as per OTL – Tajiri did indeed publish what we in the West would call a fanzine in the early-1980s, and it was through these efforts that he met with Ken Sugimori. Game Freak was later used as the name of the video game developer at which the two work to this day (IOTL, it was formed in 1989), and that studio’s most famous product is, of course, Pokémon.

[16] IOTL, the original Pokémon types were Normal, Fire, Water, Grass, Electric, Ice, Rock, Ground, Fighting, Flying, Psychic, Poison, Ghost, and Dragon – ITTL, Psychic and Ghost have been merged into Spirit, Rock and Ground have been merged into Earth, and Normal has been eliminated entirely. The original 150 Pokémon included multiple types with only a single exemplar (or a single evolutionary line): Ghost-types were represented only by the Gastly family and Dragon-types were represented only by the Dratini family. In addition, dual-typing does not (yet) exist for the initial Pocket Monsters ITTL – that would be introduced later for greater variety. IOTL, many of the 150 Pokémon were dual-types – indeed, there were no pure Flying-type, Ice-type, Rock-type or Ghost-type Pokémon in the original game (the Gastly line was part-Poison, which helped to capsize their advertised strength against the juggernaut Psychic-types), and Tangela was the only pure Grass-type.

    [17] Basically, the game plays like a cross between tabletop RPGs in the Dungeons & Dragons vein and something akin to the Strat-O-Matic system, using the stats of each player featured on their collectible cards. It’s quite different from the OTL Pokémon Trading Card Game, a simplified derivative of Magic: The Gathering (first published in 1993).

    [18] Speaking from personal experience. Maybe I couldn’t find a legitimate Squirtle – maybe I had to resort to getting one of those ugly cards instead.

    [19] How could I possibly resist working SEGA into this TL when Gulf+Western owned the company from 1969 to 1984 IOTL, right? Their ownership of SEGA actually preceded the more famous acquisition of Atari by Warner Communications (Bushnell sold to them in 1976), thus creating the delightful (or terrifying, depending on your outlook) possibility of vertical integration between movie studios and video game developers as early as the 1970s… something that never quite happened IOTL.

    [20] This gives SEGA a lead of five years over OTL – where the SG-1000 was not released in Japan until 1983, and (as a result of that year’s Great Crash) never saw release outside of the Land of the Rising Sun. (Their follow-up console, the Master System, was released in North America and Europe in the wake of the Nintendo Entertainment System’s enormous success.) This is despite the core of the SG-1000 being a variant of the Zilog Z80 8-bit microchip, first released as early as 1976 – another example of advances in computing technology greatly outpacing the necessary infrastructure in this era. Obviously, the SG-1000 is not exactly the same as that of OTL, but they’re far more alike than the five-year gap might lead you to believe.

    ---

    And now, to properly celebrate reaching the million-view milestone, here is a proper update! This post was co-written with e of pi, who also deserves credit for tirelessly encouraging me to work on it!

    There you have it, I think that just about covered most of the major players people were interested in reading about – and you know what they say: be careful what you wish for, you may get it. (Although I imagine some of my British readers will be very pleased with me, for reasons which should be obvious if you know anything about video game culture in the UK.) And as for high-end microcomputers? PC vs. HCS? A far more even battle than IOTL, to be sure. What can I say, I love Clashes of the Titans. (SNES vs. Sega Genesis/Mega Drive? Best. Console War. Ever.)
     
    Upsetting the Applecart
  • Upsetting the Applecart

The final verdict in the Trial of the Century was framed as the culmination of an epic, years-long power struggle, but it did not result in a Hollywood ending. The aftershocks of that verdict took some time to fully settle. The protracted negotiations that ensued as a result of the collapse of Paramount Pictures would see new power players emerging from the ashes of the old, and quickly rising to the top – the Lucases (and Andy Taylor) and their associates at Lucasfilm, Israel Asper at CanWest, and Ted Turner of Turner Broadcasting were all among them. They weren’t the only ones, either – indeed, the dissolution and reorganization of Paramount had followed on the heels of several other movie studios changing hands. Of the major studios from Hollywood’s Golden Age, only MGM remained independent by the mid-1980s. [1] It was only natural that the entertainment industry would become vulnerable to outside interference in such a moment of weakness and uncertainty.

    Kirk Kerkorian, a Las Vegas mogul and venture capitalist, had attempted – ultimately without success – to wrest control of MGM from Bronfman in the late-1960s. In the wake of such smash successes as Napoleon and Ryan’s Daughter, MGM became a less viable target for him, forcing him to set his sights elsewhere. There were plenty of other studios in Hollywood, after all, and most of them struggled through the 1970s. Kerkorian managed to acquire an interest in one of the old “Little Three” studios, Universal, in 1978. [2] This enabled him to put his long-standing plan into action; purchasing a film studio was merely a means to an end. That end was to exploit the connection in order to shore up his other interests, namely those in the hospitality industry. Therefore, he sponsored the construction of the Universal Grand Hotel in Las Vegas, which opened in 1982 and featured “the world’s largest globe” dominating the atrium (and visible through a glass entryway from the Strip proper). [3] This suited Kerkorian’s needs just fine – it was said that, though disappointed at failing to gain control of one of the Big Five, he ultimately chose Universal for its impressive name as well as for the globe imagery of its logo, which was exploited not only outside the resort complex, but as part of the showgirl revue which headlined the resort.

    With the absorption of United Artists into the CanWest fold, the other remaining “Little Three” studio besides Universal was Columbia Pictures, which found an interested buyer in RCA. NBC already fell under its corporate umbrella – buying a film studio in addition to a television network was a natural outgrowth for the media conglomerate. Its SelectaVision VDPs had provided the company with the capital it needed to acquire the studio, and it did so in the early 1980s, ahead of other interested buyers. The hope was that Columbia’s existing television division (Screen Gems, later re-branded as Columbia Pictures Television) would produce shows which would air on NBC – both ABC and CBS remained independent at this time and had no formal association with any production studios. [4] In-house productions were an expensive and risky proposition; executives at RCA and NBC both hoped that this would change going forward, with Columbia bearing the brunt of the production costs. There was a special irony in Columbia coming to be owned by RCA, as the “C” in CBS stood for Columbia – though Columbia Pictures and CBS had never been affiliated, the name referring to a national personification of the United States of America, akin to Britannia in the United Kingdom, Marianne in France, and Mother Russia (or Mother “Homeland”, as the multiethnic Soviet Union referred to her) in… Russia.

    It was difficult for any company to top either Kerkorian for naked greed or RCA for blatant opportunism, but the soft drink manufacturer Coca-Cola managed the trick, buying out 20th Century Fox – which, unlike Columbia and Universal, had actually been one of the Big Five in days of yore – in order to fill all of that company’s future output to the brim with product placement. [5] Never again would any character be seen drinking, or be heard discussing, Pepsi in a Fox film. Critic Roger Ebert, one-half of the beloved critical duo of Ebert and Siskel and a noted opponent of product placement, briefly alluded to his distaste for this acquisition in one of his columns, noting that “I think we can all agree that The Sound of Music would have been a much stronger film if the characters took some bottles of Coke with them when they went hiking through the Alps. That way, if they ever ran into the Nazis, they could start flinging bottle caps at them!”

    Nevertheless, despite these dramatic changes in ownership – which brought every major studio in Hollywood (excluding the Walt Disney Company) into the hands of some conglomerate or another – the media continued to focus on the players who had gained the most from the collapse of Paramount: outsiders who had wanted in, all of whom had built their own media empires. Both Asper and Turner were profoundly ambitious; the Lucases, who had (obviously) gained by far the most from Paramount, were surprisingly rather less so.

    At Lucasfilm, George Lucas quickly found that, as a wise man once said, having a thing was not so pleasing, after all, as wanting it. What he’d wanted had been the studio infrastructure and financial capital necessary to produce motion pictures without any outside interference – and he sought much the same for his old friends and colleagues (to the point that industry wags sometimes decried Lucasfilm as the “USC Film School Alumni Association”). Steven Spielberg sat on the studio’s Board of Directors alongside many of George’s other old friends and fellow New Hollywood auteurs: Francis Ford Coppola, who found himself on the cusp of a career resurrection; Martin Scorsese, who – like many in Hollywood – had cleaned up his act and shunned cocaine after the death of Robin Williams, and was rewarded for getting his life together with a seat on the board; and John Milius. Lucille Ball was offered a seat as well, but declined it, as she had done with many such appointments in the past. Lucas wanted his studio to become known as a bastion for auteur filmmaking – although (in the wake of Coppola’s disaster with Tucker, foremost among other examples) Lucas made clear that budget overruns and falling behind schedule would not be tolerated. George would – and did – continue to write and produce films himself, but chose not to direct the upcoming Journey of the Force sequel, leaving that to Spielberg, as the first of what was intended to be many Steven Spielberg films for Lucasfilm Studios. [6]

    Still, George didn’t like being a studio chief – as Lucille Ball could (and often did) tell him, it was grueling work – and his wife, Marcia, wanted no part of running a film studio herself. She was quoted during a media scrum held on the steps of the US Supreme Court immediately after her landmark victory as saying, “I don’t know what the future holds in store for us, but I can tell you I won’t be using a Moviola again for a real long time.” [7] Although she had won two Academy Awards for Film Editing, she was totally serious, and stuck by her impromptu pledge – happy to raise a family away from the spotlight, because she wanted more kids. Their daughter Amber was 10 years old – another daughter, or even a son, would be nice. The Lucases’ mutual decision to take a hands-off approach to running the studio had also been inspired by Ball, whose technique of hiring capable, competent, and trusted underlings to make their own decisions, only reining them in or vetoing them when necessary, had obviously been very successful for Desilu.

    That left the studio’s third partner, Andy Taylor, Esq., to step up to the plate. After years of researching and immersing himself in the filmmaking industry, and with his landmark legal victory surely a tough act to follow, he felt a change in careers would be appropriate. Therefore, Taylor was formally appointed President and COO of Lucasfilm Limited – and (naturally) its general counsel. George served as CEO, a position that befitted his skill with “big-picture” concepts, as well as CCO, or Chief Creative Officer. Marcia served as CPO, or Chief Product Officer, a position with fairly vague responsibilities, because a rigid corporate hierarchy did not suit her versatile skillset – in practice she functioned much as she had for most of her professional career thus far: as an editor, a polisher, a fine-tuner, and a sounding board. As Lucasfilm’s offices continued to occupy space in the former Paramount Melrose lot, she enjoyed lunching with her former co-workers at Desilu Post-Production, not to mention her one-time boss Lucille Ball. Both George and Marcia found plenty of time to spend away from their offices once they adopted their second child, a son whom they named Anakin.

    That said, the Lucases did not divorce themselves from Hollywood politics, taking a stand on perhaps the most controversial issue of the day, at least once the Trial of the Century itself (along with the Hollywood Accounting debate that it had brought to the fore) had been largely resolved. This concerned the modification of motion pictures following their initial release, most often by firms and/or individuals who had the legal right to do so (being in possession of the copyrights) but had played no part in the production of the films in question. The most famous practitioner of such modifications was Ted Turner, the media mogul who, by the 1980s, found himself in control of a plurality of the motion pictures produced during Hollywood’s Golden Age.

    Turner was alone among the newer power players in Hollywood in favouring post-release modification of motion pictures. CanWest Paramount and Lucasfilm both supported preserving films in the form of their original releases – though George Lucas, ever the auteur and still vexed by his experiences with the studio in the making of American Graffiti, added the caveat that this was imperative primarily in the case of films which embodied the vision of a single creator. Citizen Kane was the obvious example, one which would become a rallying cry for the movement, especially after Turner (who owned much of the RKO library) facetiously suggested colourizing the film, forever earning him the ire of cineastes and film critics everywhere. [8] It should be noted, of course, that Turner had a vested interest in allowing edited re-releases, considering his investment in post-production technology that facilitated certain processes (such as colourization), along with a truly impressive library of films on which to use this technology; Lucasfilm and CanWest Paramount, on the other hand, had sold their acquired film libraries to other interests and effectively had no dog in the fight.

    Ted Turner had quietly accumulated the largest library of classic American films owned by anyone in the world – the Paramount, RKO, and 20th Century Fox oeuvres all belonged to him. Like Lucille Ball at Desilu, one of his openly acknowledged inspirations, he invested in many innovative technologies, most notably colourization. Turner fancied himself a visionary, and viewed black-and-white as an unfortunate technical limitation of its era – surely one that filmmakers would have eschewed, had the option to film in colour been available. Naturally, he believed that his view of black-and-white photography was shared by everyone in the industry – and he invested heavily in those startups that sought to bring colour to black-and-white footage. However, he faced opposition almost immediately, starting with America’s two most beloved film critics, Roger Ebert and Gene Siskel, who devoted an entire half-hour episode of their program, Coming Attractions, to the artistic reasons for filming in black-and-white, and why the visuals would be degraded by adding a false layer of colour. [9]

    In many ways, it was not surprising that colourization would become popular in the 1980s, a visually “loud” decade of neon and pastels that followed the drab, earth-tone, stylistically-monochrome look of the 1970s. The garishness of the visual aesthetic was in many ways far more reminiscent of the 1960s, the first decade in which colour fully superseded black-and-white, both in film and on television, the medium upon which Turner built his empire. However, the technology to convert existing black-and-white media to colour had not yet been developed at the time. Indeed, many shows of the 1960s – including two of the most popular shows of the era, The Andy Griffith Show and The Beverly Hillbillies – had transitioned from black-and-white to colour partway through their runs, usually to jarring effect. Turner, who syndicated many of these classic 1960s series on TBS, proposed that colourization could allow syndicators to present such shows in a single, consistent picture format from first episode to last. This was an eminently practical and fairly uncontroversial suggestion – and because the later seasons of these shows were filmed in colour, true frames of reference existed for all of the characters, settings, and props. When Turner presented this proposal to syndicators, he was met with positive responses across the board – except from Desilu Sales, the syndication arm of the famous television studio of the same name (whose own The Lucy Show had started out shooting in black-and-white and then switched to colour early in its run), which did not answer any of his marketing department’s calls.

    This tentative triumph merely served as a smokescreen for the main event, however. Turner’s company was taking old black-and-white movies and colourizing the footage – without the consent of their creators (who were either dead or – even worse – actively hostile), which it was fully within its power to do, since it owned the copyrights; no court of law would grant an injunction to stop it. The court of public opinion, however, was an entirely different matter. Ted Turner, a heartless mogul born with a silver spoon in his mouth, was facing off against the actors and directors who had defined the Golden Age of Hollywood, perhaps the pinnacle of American popular cultural influence – and many of these stars had risen from obscurity and squalor to do so.

    Even four decades later, these men and women loomed large in the collective consciousness, though many of them had since shuffled off this mortal coil. Along with Ebert and Siskel, many of the survivors began advocating that the government archive the original prints of classic Hollywood films in order to protect them from being permanently altered (and, effectively, lost) to technological “restoration”. In perhaps the greatest assemblage of Golden Age stars ever seen outside of Hollywood, many of them began to actively petition for film preservation – Jimmy Stewart, Cary Grant, Fred Astaire, Gregory Peck, Katharine Hepburn, Bette Davis, and Barbara Stanwyck were all among them. Before their deaths in the early-1980s, Henry Fonda and Ingrid Bergman both expressed similar sentiments. Lauren Bacall spoke on her own behalf as well as that of her legendary (and late) husband, Humphrey Bogart, in denouncing colourization. Although most of her films had been shot in colour, HSH Princess Grace of Monaco [10] also spoke out in favour of film preservation when asked about the subject at the Cannes Film Festival – making her the second-most prominent political figure to support it.

    The Smithsonian Institution and the Library of Congress were both frequent recipients of petitions and appeals of this nature, both from these Golden Age celebrities and from everyday people. This campaign dovetailed with the movement to preserve films that were, in the normal course of events, genuinely in danger of becoming lost, as many pioneering movies of the silent era already had been. (Home video, which had helped to save many of them – every Academy Award-winner for Best Picture prior to 1980 was released on home video by 1984, Journey of the Force being the last of these – could only do so much.) [11] A consensus was emerging that films were, indeed, works of art – and that, like any works of art, their destruction should be opposed and prevented whenever and wherever possible. These two initiatives, taken together, would soon yield dramatic results.

    Former President Ronald Reagan, himself a former actor (and one-time President not only of the United States of America, but also of the Screen Actors Guild), was an early supporter of preserving motion pictures within the Library of Congress, and this advocacy served to rehabilitate his image somewhat post-Presidency. In a famous speech given to SAG, Reagan called on the US government to enact legislation which would create a film preservation board. “Movies inspire us to laugh, to learn, to live,” he said at the event. “They also help us to escape from cold, hard reality. People who come to own the copyrights for these films by happenstance, and decide to ‘tinker’ with them – compromising the vision of the filmmakers, cast, and crew for their own whims – only remind us of that cold, hard reality.” Reagan, always known for his sense of humour and self-deprecation, was also famously quoted as saying “There are some who would say my starring alongside a chimpanzee in a B-movie did more good for this country than my four years as President.” [12] This earned him a standing ovation; true to form, once the applause died down he added “I’m not sure if you’re clapping because you think that’s funny or because you agree with me.” Reagan’s post-Presidential involvement with the film preservation movement marked the beginning of his rehabilitation in the public eye – though his Presidency had been considered uneven at best, disastrous at worst.

    The incumbent President, John Glenn, was not an actor but an astronaut, though he shared Reagan’s appreciation for the power of modern mass media. His own exploits had been brought to the masses through television – and, soon enough, would be dramatized in the film adaptation of the popular non-fiction book about the Mercury Seven, Seven Up [13] – and he too was open to the idea of an archive, though he favoured preserving not only film but also television, a medium whose earliest works were not archived nearly as completely as those of film. Rep. George Takei, also a former actor and a key Glenn ally, supported the Library of Congress admitting works in film and television for preservation, as did Senator Marlin DeAngelo (who represented the state of California), along with many other politicians and media figures. [14]

    There was support from some quarters for extending these new protections to television and radio in addition to motion pictures; however, this was not deemed feasible, owing to the sheer volume of serialized material produced in each medium. This was despite the fact that a great deal of radio and especially television programming had already been wiped from network archives because of that very volume – arguably proving a greater need to preserve what remained. Far more of the earliest years of television programming had been lost than had been preserved, and oftentimes what little had been saved was down to the combination of pluck and foresight possessed by individual producers and archivists. Tape wiping was endemic to television production, regardless of location; the BBC, for example, was as guilty of the crime as any of the American commercial networks. Nevertheless, logistics would have necessitated library space far beyond what Congress was willing to provide the Department of the Interior for this undertaking. [15]

    The Republicans had taken back the House of Representatives by 1983, when legislation creating the National Film Registry (the National Film Preservation Bill) was tabled, but many of these Republicans were of an urbane stripe – patrons of the arts in the Rockefeller mould – and even many fiscons were willing to support the bill because of Reagan’s advocacy for it. As a result, the National Film Preservation Act passed in 1984, creating the National Film Registry. [16] 1985 would be the inaugural year of induction for films into the registry, which naturally would be celebrated in a lavish annual ceremony of the kind to which Hollywood types were well accustomed.

    This would force Ted Turner’s hand. He had influential – and vocal – opponents who were far more persuasive and charismatic than he was, united only in their opposition to him – on one side of the political spectrum, Chicago liberals Ebert and Siskel on PBS; on the other, Californian conservative Ronald Reagan, former President of the United States. However, Turner’s tenacity was as much his downfall as it was a virtue; he stubbornly continued to colourize the films in his possession, and showed only those versions on his TBS, which by now had not just a nationwide reach, but in fact an international one, as it could also be seen on most cable packages offered in Canada. [17] Nevertheless, Turner did redirect his efforts somewhat in the wake of the film registry’s creation. He had acquired a sizeable television library prior to moving into films, and his deepest regret was not being able to snap up I Love Lucy – still one of the most popular shows on TBS – before Desilu was in a position to buy the show back. (Turner failed to realize that doing so would have earned him even more bad press in the court of public opinion.) Still, he knew that Lucille Ball was a reasonable woman; like him, she was a skilled entrepreneur, and perhaps if he were to meet with her, he might be able to win her over.

    Whereas Lucasfilm had an ideological motive for supporting film preservation, rooted in the auteur philosophy of its chief creative force, CanWest Paramount was both more cynical and more pragmatic; the owner of that studio, Israel Asper, was a foreigner, and sought to endear himself and his company to the Hollywood intelligentsia (ambivalent at best, and hostile at worst, to his presence). As the Film Preservation Act was making its way through Congress, however, his native Canadian film industry was also undergoing significant changes. Most notably, a new ratings system was introduced during this time, inspired by the MPAA ratings system used in the United States, but with some key differences. [18] The Canadian film ratings system was one of many worldwide which introduced a restriction on pre-adolescent moviegoers, something many judged to be sorely lacking in the MPAA system.

    Edgar Bronfman, scion of the Seagram distillery dynasty, also owned a major film studio – MGM – and, like Asper, was also Canadian – the two even shared their Jewish ethnicity. Naturally, then, they were exceptionally fierce rivals, but their combined positions of strength allowed them to apply considerable leverage upon both the Canadian and the American film industries – and to encourage those two industries to co-ordinate their laws and regulations more harmoniously. Both the Asper and the Bronfman families were supporters of Canada’s Liberal Party, which had been in opposition since 1972 – the governing Tories, under Prime Minister Robert Stanfield, were not their natural allies.

    However, they were able to make this work to their advantage – trading their support in exchange for concessions from the Canadian government. Both favoured a loosening of the famed Canadian content restrictions, though each in a different area: Asper, who also controlled Global Television, wanted fewer mandatory broadcast hours of Canadian-made programming on his stations; Bronfman wanted a lower threshold of manpower and materiel to qualify a production as “Canadian” (both to receive funding and incentives from the federal and provincial governments, and to allow more dubiously “Canadian” programming in under even the weakened regulations that would remain). Asper supported this endeavour as well, viewing it as key to his company’s overall strategy. Both were able to secure pledges from Stanfield’s Minister of Communications in the run-up to the 1982 federal election (in exchange for support from the Aspers and the Bronfmans).

    When the backroom deal was inevitably exposed during the election, the opposition Liberals attempted to turn it into a scandal – an attempt that backfired when they promised Canadians that a Liberal government would “protect Canadian talent” by forcing Canadian viewers to watch more of their own subpar programming on Canadian stations. Coupled with the unpopular simultaneous substitution system, which often prevented Canadians from picking up the feed from American stations, this allowed the Tories to frame the Liberals as promising viewers “fewer viewing options than anywhere else on this side of the Iron Curtain”. It was not the first time that the Liberals had been accused of Communist sympathies, nor was it the first time that their attempt to appeal to the country’s intelligentsia resulted in their being outmanoeuvred by the Tories and their more populist approach.

    That said, Canadians did favour some degree of cultural protectionism from the Americans, if not as pervasive as the Liberals would have it. In that respect, Stanfield was insulated by his vociferous support for Canadian sport – protecting the CFL from the encroaching NFL being foremost among his accomplishments in the eyes of many. Just as only Reagan could go to Moscow, only Stanfield could sell out to Hollywood. Indeed, sporting events “involving at least one team whose membership or management is at least 50% Canadian in origin” – essentially, every NHL and CFL team, as well as every Canadian team in the other leagues – were given more “points” in the rejigged CanCon system, incentivizing stations and networks to bolster Canadian sports coverage, even as dramatized genres fell by the wayside. [19] After the PCs formed another majority government in the aftermath of that election, legislation loosening the CanCon restrictions was formally introduced in Parliament, and would ultimately receive Royal Assent in 1983 – just in time for Asper and CanWest to reap the rewards of a favourable ruling in the Trial of the Century.

    One company that was not able to win over an influential clique was the Japanese conglomerate Sony, developer and manufacturer of the Beta video format, whose appeal in the landmark Sony Corporation of America vs. Universal City Studios, Inc. case reached all the way to the Supreme Court, in one of several important media-related cases that body deliberated in the 1980s. Essentially, the plaintiffs (a consortium of Hollywood studios, led by Universal) alleged that VTR technology (unlike VDP technology) allowed the end user to record transmitted images onto tape as well as play back existing recordings. This, in turn, allowed consumers to engage in time-shifting practices – watching live programming, such as sporting events, after the fact – which, the plaintiffs argued, violated the copyrights of those who produced such media. This opinion was held by the majority of producers in the American entertainment industry, though not all of them – time-shifting had some defenders, such as Mr. Fred Rogers, who hosted and produced Mister Rogers’ Neighborhood, and testified in its favour before the Supreme Court. However, his influence was ultimately limited.

    The Supreme Court of the United States ruled that the use of VTR technology which allowed for the recording of existing media by the end consumer, be it through direct transmission or tape duplication, was a violation of copyright and that the availability of such technology (which was, on most VTR machines sold in the United States, simply a “record” button on the control panel) would have to be eliminated. [20] At the same time, the court took pains to confirm that camcorders and VTR recording for commercial and industrial use remained legal, because these resulted in the creation of original content and did not have the potential for copyright violation. Therefore, the sale of VTR players with recording capability was to be tightly restricted, a responsibility which would later be deemed to fall under the purview of the Federal Communications Commission.

    Although the verdict technically did not outright ban the existing VHS and Beta formats, their only real edge over the CED and Laserdisc formats had been eliminated. Sony, recognizing when they had been beaten, withdrew Beta from the US market. However, both the Canadian and the Mexican judiciaries would eventually rule that time-shifting was legal, allowing for the emergence of a black market which made acquiring an “old-style” VTR trivial, in an echo of the Prohibition era (to which many critics naturally compared the VTR ruling).

    The collapse of the domestic VTR market was treated in many of the trade papers as the culmination of the “new order” having swept the entertainment industry by the mid-1980s, but it seemed the only true constant was change. The time for stagnation, as in most other sectors of the economy, had passed; the new (and surviving) power-brokers, having consolidated their gains and shrugged off their losses, were poised to make new strides in the years, and quite possibly the decades, ahead…

    ---

    [1] IOTL, of course, MGM was purchased by Kirk Kerkorian in 1969, and merged with United Artists (in the wake of that studio’s collapse as a result of Heaven’s Gate). Kerkorian would stubbornly hold onto the studio for decades, only briefly selling to none other than Ted Turner (who was unable to retain the studio, but did keep the library).

    [2] Kerkorian also attempted to buy out Columbia in 1978 IOTL, but ultimately failed because he already owned MGM – spurring antitrust action against him which was ultimately successful and forced him to sell his interest back to the previous owners. ITTL, he doesn’t own any other film studios, allowing him a clear shot to Columbia.

    [3] The MGM Grand opened much later IOTL – Kerkorian purchased an existing hotel (the Marina) in 1989 and closed it to develop the MGM Grand on the land, which then opened in 1993.

    [4] CBS would remain independent until it was acquired by Westinghouse in 1995; that same year, the Walt Disney Company bought out ABC. NBC belonged to RCA until 1986, when RCA was sold to General Electric, which had previously owned RCA (and therefore NBC) until 1930, when it was forced to divest them as a result of antitrust legislation.

    [5] Coca-Cola bought out Columbia Pictures IOTL, and subsequently rammed that studio’s films full of product placement – including, most notoriously, the Bill Cosby vehicle Leonard Part 6. (The Coca-Cola Company also produced New Coke during this era, so the 1980s were not the most fertile period for their marketing department.)

    [6] Before anyone asks, this is the first full-on collaboration between George Lucas, the screenwriter, and Steven Spielberg, the director. There may be more to come thereafter, perhaps even involving a throwback to the old cliffhanger adventure serials, but not for the foreseeable future.

    [7] Marcia Lucas, who (unlike her husband George) did not enjoy a comfortable upbringing, was content to enjoy the high life once the Lucases had made their fortune from Star Wars IOTL. The only difference here is that George (exhausted from years of marginal living as a result of the lawsuit) finds his own views more compatible with those of his wife after coming into their hard-earned money.

    [8] Orson Welles was said IOTL (on his deathbed, no less!) to have said “Don’t let Ted Turner deface my movie with his crayons”.

    [9] Siskel and Ebert did this IOTL as well – in a 1986 special entitled “Colorization: Hollywood’s New Vandalism”, which can be found online.

    [10] Yes, she’s still alive ITTL. She had a stroke, but not while she was driving, and it was mild enough that she made a full recovery.

    [11] It should be noted that Oscar recognition correlates very strongly with the long-term survival of a film; not a single Best Picture winner (IOTL or ITTL) has been lost, nor have any of the films nominated, with a solitary exception: 1928’s The Patriot. (It is, therefore, sadly impossible to determine which film should have won Best Picture at the 2nd Academy Awards.)

    [12] Reagan refers here to 1951’s Bedtime for Bonzo, certainly the most notorious movie in his filmography. Fun fact: apparently he never saw the picture himself until 1984.

    [13] TTL’s version of The Right Stuff, though not written by Tom Wolfe (remember, New Journalism isn’t as prominent ITTL), and written later (published in late 1980, just in time to cash in on Glenn’s presidential run) and more complimentary towards Glenn (who, IOTL, did not care for The Right Stuff because he didn’t like how he was depicted in the book).

    [14] One of the reasons the act passes on an earlier timetable ITTL – key TTL-only sponsors of the relevant legislation.

    [15] Once we reach the (IOTL, present) era when entire series can be stored on thumb drives, then we can talk about a National Television Registry. Alas, this TL ends in 1986…

    [16] The National Film Preservation Act passed in 1988 IOTL, and the first batch of films was inducted into the National Film Registry the following year.

    [17] Fun fact: the CRTC granted a licence for WTBS specifically to be broadcast in Canada, and this was never extended to cover the TBS network that emerged from it. Therefore, Canadians continued to receive the TBS Atlanta feed, and were exposed to the many delightful ambulance chasers and fly-by-night “career schools” in the area through their commercials. When, some years ago, WTBS was rebranded as “Peachtree TV”, divorced entirely from TBS… the CRTC refused to change the licence to accommodate this, and therefore Peachtree currently serves Atlanta… and Canada.

    [18] For fear of overcrowding the body of the text, I present to you a simple index of the Canadian motion picture ratings:

    C For Children Aged 10 and Under
    E Visa pour les enfants 10 ans et avant

    F For Families and All Ages
    F Visa général et pour les familles

    PD Parental Discretion Advised
    SP Surveillance parentale suggérée

    PD-M Parental Discretion Recommended – Mature
    SP-M Surveillance parentale recommandée – Thèmes mûrs

    13+ Ages 13 and over / 13 ans et plus

    16+ Ages 16 and over / 16 ans et plus

    18+ Ages 18 and over / 18 ans et plus

    AO/AS Adults Only / Pour les adultes seulement

    [19] This is easier ITTL than IOTL, because there are more Canadian major league sports teams: by 1984, there were 8 NHL teams (7 IOTL), 10 CFL teams (9 IOTL – barely, as Montreal’s CFL presence was… tenuous), 2 MLB teams (the same as IOTL), and 2 NBA teams (none IOTL).

    [20] Yes, that’s the reverse of OTL’s verdict, which (narrowly – the margin was 5-4) ruled in favour of time-shifting. This could set some intriguing precedents…

    ---

    Thanks to e of pi for his assistance in the editing of the update, and thanks also to Dan1988 for his help with the alternate Canadian film ratings system!

    And yes, although it may have taken me half a year, that concludes the 1983-84 cycle! Just two more to go…
     
    1984-85: Virtue and Vice
  • Virtue and Vice (1984-85)

    “I DON’T LOVE DESI” – Splitsville as Patty Duke Files for Divorce from Lucy’s Son
    – From the front cover of the National Enquirer

    “DESI DYING OF LIVER CANCER” – Lucy’s Heartbreaking Discovery!
    – From the front cover of Star magazine [1]

    ---

    May 4, 1984

    “Nice to see the supermarket tabloids getting it right for once,” Herb Solow joked, gamely attempting to inject some levity into the increasingly sombre atmosphere at Desilu’s head offices. It was in vain, however. He’d never seen his boss so down in the dumps. With some of the headlines in the supermarket tabloids strewn across her desk, it was no wonder. Ironically, she’d always hated her soon-to-be ex-daughter-in-law, having opposed the relationship between Patty Duke and her son from the very first.

    (The tabloids had reported on that, too – they’d always had a knack for getting things right when it came to her and her family’s disastrous love lives.)

    Ball could barely muster a gravelly groan in response to her right-hand man. She’d been letting her cigarette burn into ashes as it perched between her index and middle fingers, dangling over a charming ornamental crystal ashtray she’d received for her 70th birthday.

    There was a long pause after that. Solow hated the awkwardness of the situation, but his decades of working with Ball had taught him when not to say anything more until prompted.

    But Tartikoff played by his own rules.
    “Lucy, are you okay? How do you feel?”

    “Old,” she said quietly, barely loud enough to be heard over the din of the air conditioner. “I feel old.” Noticing the column of ash which had formed on the end of her cigarette, she flicked it into the ashtray and took a long drag on what remained – almost as if she could will all her troubles away if she sucked hard enough.

    She let loose with a hacking cough. The stress of the situation had led her to ramp up her daily cigarette consumption. Even she’d lost track of how many packs a day she went through. Philip Morris directly delivered shipments of cigarettes to Desilu’s loading dock on a weekly basis – one of the perks of a long and profitable association with the studio’s head honcho – and with each delivery, at least one carton would always find its way to her office.

    Various people in her life would occasionally try to talk her into quitting. It was the 1980s; smoking-related illnesses had already claimed the lives of several beloved celebrities, and they were worried she might be next. All that got them, though, was a promise that she, too, would participate in anti-smoking PSAs if ever she got lung cancer; she considered that a fair compromise.

    Having exhausted the consumable nicotine from this latest cigarette, she crushed the butt into the ashtray with, perhaps, a bit more force than was necessary. “At least now she won’t get any part of this,” she said, gesturing to her surroundings. “I can give Desi a share of the studio, and if he ever gets married again, I’m having him sign a pre-nup first.”

    Solow and Tartikoff silently exchanged glances at this. Desi was over 30 years old, and was clearly not as beholden to his mother as she would have liked to believe. After all, he had run off to Reno to elope with Patty Duke at the age of 17, shortly after he had, in Ball’s own characteristically blunt words, “knocked her up”. He was older now, wiser, more mature – which was one of the reasons he and Duke were divorcing in the first place, as a bid for her to confront her inner demons and for him to finally, successfully, beat his own addictions. But that also made him even more independent.

    Ball noticed Solow and Tartikoff’s skepticism. “Don’t worry, I’ve already talked with my lawyer. If he decides to run off to Reno again, that automatically revokes his claim to any share and residue in Desilu Productions, Incorporated. And I’ll have you know I’m jumping through an awful lot of hoops to make the arrangements.”

    “You’re putting an awful lot of thought into your estate planning lately, Lucy,” Solow said.

    “In case you haven’t noticed, I’m no spring chicken,” she replied. “I’m 73 years old and running a studio. I should have retired a decade ago. I keep wondering why I haven’t already.”

    “Isn’t it because of that dream you had?” Tartikoff asked. “The one with… was it Claudette Colbert?”

    “Carole Lombard,” Ball and Solow corrected him in unison.

    “Claudette Colbert is still alive, I think,” Solow added.

    “She is,” Ball said. “She lives out east now, still does quite a bit of stage acting. She and Carole were both big in screwball. Between that and the similar names, I can see why you got them confused.”

    “Nice to see she’s still so active,” Tartikoff said. “See? You’re not the only one.”

    “A lot of us are still kicking around,” Ball replied. “But none of us are getting any younger.”

    They’d had conversations along these lines a thousand times, but somehow, Solow knew this time was different.

    Sure enough, Ball lit another cigarette, taking a long drag before she finally dropped the bombshell.

    “I think, after this next year, I’m finally going to pack it in.”

    ---

    The industry was abuzz over – and, in its own view, vindicated by – the Supreme Court’s decision to rule against the legality of time-shifting. Critics of this decision never failed to note that it was handed down in 1984, the year in which the famous George Orwell novel of the same name was set; among its themes were repressive state control and censorship of the media. As a result, Orwell’s classic dystopia was the best-selling novel of the eponymous year in question, excluding new releases. Despite, or perhaps because of, the ruling, 1984 was a banner year for the home video industry.

    More CED releases – and players – sold than ever before. All remaining “recordable” VHS and Beta players on the market had to be returned to the manufacturer and scrapped, though many naturally slipped through the cracks, and a thriving black market was soon established. Critics naturally took to calling it “the new Prohibition” – political cartoonists would take to drawing people (illegally) recording television broadcasts as if they were bootlegging bathtub gin, and the resultant (clandestine) “viewing parties” were drawn to look as if they were taking place at speakeasies. The “old” VTR devices were still common enough that nearly everyone in a populated area knew someone who owned one – and very few had been returned. It didn’t help that the Supreme Court of Canada upheld time-shifting, and that the Canada–US border was the longest in the world, and also the freest; sojourners didn’t even require passports, and were rarely searched by border patrols.

    Thus, it was ludicrously easy to smuggle VTR devices from Windsor to Detroit, Toronto to Buffalo, Montreal to New York, Vancouver to Seattle, or Winnipeg to Minneapolis, among other, less-frequented routes. This created an additional incentive for governments on both sides to further develop their high-speed rail infrastructure – it was much easier to smuggle bulky VTRs across the border in a station wagon or minivan than it was onboard a train. VTRs and their continuing popularity despite their newfound illegality were, if nothing else, a microcosm for the human tendency toward nostalgia. After all, people taped current programs as a means of preserving them and being able to revisit them at will – perhaps not entirely necessary when most classic movies (and, increasingly, television series) were available for mass consumption on CED; not to mention the proliferation of pay-TV channels, many of which (known as “rerun farms”) broadcast exclusively second-run syndicated programming. Nostalgia allowed for the enduring popularity of all these old shows and movies from two decades before, up to and including a certain ubiquitous science-fiction program which, over a decade after it had ended production, was only now seeing development of a sequel series...

    On Saturday, September 8, 1984 (eighteen years to the day after Star Trek had first started airing in 1966) [2], the first episode of Star Trek: The Animated Adventures began airing, at 9 AM on NBC. It began life as a pilot movie (two hours long with commercials, capable of being edited into four episodes) which aired on August 11, 1984, as a special presentation for the network. Such was the escalation of Star Trek that there had to be a pilot movie “event” – Star Trek had not aired a normal-length episode in first-run since March, 1971. The pilot movie (in a marked contrast to both pilots of Star Trek, which began in medias res) showed the assembly of the crew aboard the new starship (of a new class, the better to sell more merchandise) and their inaugural mission – which was a success. The show was otherwise planned to follow a largely episodic format, much as the original Star Trek had done – the heavy serialization of the miniseries had deeply divided Trekkies. The pacing had to be altered as well – Saturday morning cartoons were just 22 minutes long, in contrast to the 51-minute runtime of hourlong dramatic television in the 1960s.

    The popularity of The Journey of the Force in Japan – unsurprising, given the influence of the native jidaigeki period dramas on the film – was such that many Japanese animation studios clamoured for the opportunity to produce a Journey adaptation. It was eventually decided that an animated series would premiere shortly after the release of the second Journey film in the summer of 1986 – just in time for the beginning of the 1986-87 season, on Saturday mornings. It was given the working title Journeys to a Faraway Galaxy, alluding to the famous opening line of the original film (which would be reused in the sequel): “A long time ago, in a faraway galaxy…”. Lucasfilm’s close association with Desilu would pay dividends here as well; NBC would pick up the show for airing, and the show was planned to follow the already-running Star Trek back-to-back in a one-hour block, with Journey airing at 9:30 AM. This was part of NBC’s attempt at theming various hours in their Saturday morning schedule.

    The networks, too, were not ones to resist giving in to sensationalism. In time for May Sweeps, CBS had a telefilm ready dramatizing the terrorist attack on the Women’s Fencing event at the 1984 Summer Olympics in Tehran. (Rival network NBC had broadcast the Olympics.) It was a big hit, although not without its flaws – it was told from the perspective of the American victim, Carol Wilson, with her meeting the other victim, Israeli Tamar Dahan, only in the film’s closing minutes. It doubled as part of a general attempt by the media to raise awareness of the religious persecution of Baha’is, which had become a cause célèbre in the wake of Tehran, appearing in newsmagazine programs regularly throughout the 1984-85 season.

    Over at Desilu Productions, network-level strategies were important, but tended to take a backseat to studio-level political manoeuvring. When it came to the studio’s biggest hit, The Patriot, the challenge facing the producers was how to keep the Dave/Rebecca relationship intriguing despite the resolution of the sexual tension between them. This had driven the original showrunners, Glen and Les Charles, to quit. Surprisingly, fans continued to respond well to the relationship between them despite many writers’ misgivings – the writers brought over from Paramount Television, many of whom had worked on Rhoda, had already noted this peculiar disconnect, and coined the phrase “Rhoda problem” to refer to it. Desilu’s corporate culture had always stressed the closeness of the studio with fans of its programs – and many of the writers who opposed keeping Dave and Rebecca together fought an uphill battle as a result. It didn’t help that many of them quite obviously wanted to write Dave and Rebecca as single because it was easier for them, and were not shy about saying so; this irked Brandon Tartikoff, in his capacity as the executive in charge of production.

    “My job is to hire the best writers in Hollywood,” Tartikoff was quoted as saying in an interview on the subject published in The Hollywood Reporter. “Their job is to do the best writing possible. If they think it’s too much of a challenge to write Dave and Rebecca as a couple, then it’s my job to fire them and hire writers who will embrace that challenge. That’s what I’m paying them for. Writing is a job, it’s not supposed to be easy. If it were so easy, if just anyone could do it, we wouldn’t be paying them for it.”

    Tartikoff had proven in his dealings with Gene Roddenberry that he was not afraid to play hardball with writers and producers, and the fact that Roddenberry’s long tenure and considerable success with the studio had been unable to sway Tartikoff did much to bring other writers and producers at the studio into line. The studio chief, Lucille Ball, was increasingly aware of Tartikoff’s uncompromising resolve, which worried her somewhat. She had built her reputation on lionizing the efforts of her writers – though this adulation came after having laid off most of her writers from The Lucy Show, the very same writers who had worked on I Love Lucy – and she didn’t want Tartikoff to foil these efforts as she became increasingly conscious of leaving a legacy. (She had reconciled with her I Love Lucy writers in the interim.) [3] Her right-hand man (and the only other person superior to Tartikoff in the studio hierarchy), Herb Solow, had suggested providing writers and producers with additional perks and benefits, to soften the blow from any studio edicts. Desilu, which had always been a studio known for a soft touch in terms of content demands, would also have to pick their battles. Insisting that their producers not arbitrarily reverse course on their shifts in creative direction seemed fair and reasonable, but by the same token, it wasn’t the studio’s place to insist on new directions or plotlines, as long as the series was successful. In addition, Desilu would throw its weight around and go to bat for producers in any battles with the network. This would have a two-pronged “good cop, bad cop” effect: Tartikoff could be a taskmaster while Solow and Ball were more benevolent and giving; likewise, the network could (and would) be blunt, harsh, and unyielding in their demands, whereas Desilu could be more accommodating – but when the studio did make demands, the producers would know that Desilu meant business. (Networks, by contrast, tended to be more fickle and ephemeral when it came to what they claimed to want from the shows they aired.)

    However, this arrangement had resulted in considerable friction during the production of Deep Space, and the network had tired of the headaches that came with mediating the battles between the studio and the crew – especially since the middling ratings that the show received were not nearly worth the trouble. As a result, the show was cancelled in 1983. Desilu was able to shrug off the negative buzz they had accrued within the industry as a result of the situation, but the same could not be said for Gene Roddenberry, who quickly emerged as the scapegoat. Roddenberry, the creator of Star Trek, thus ended his two-decade-long association with Desilu Productions, and sought to sell his ideas elsewhere. He went back to the same well with another science-fiction series – his sixth attempt to market a show in the genre, following Star Trek, Assignment: Earth, Re-Genesis, The Questor Tapes, and Deep Space. Most of these shows had been more optimistic than his latest project, Battleground: Earth. [4]

    Apparently influenced by the classic Arthur C. Clarke novel Childhood’s End (Roddenberry was an acknowledged admirer of Clarke’s), the series depicted the arrival of seemingly benevolent aliens (deliberately evocative of Vulcans, advanced beyond human understanding and with inscrutable motives). The medical and environmental advances freely provided by the aliens (generally called “the Companions” within the show’s universe – they claimed that their proper name was unpronounceable by humans) allowed for the elimination of diseases and pollution, while their services as adjudicators and arbitrators allowed for the end of war. In this way, they evoked several alien races from Star Trek, including the Organians, who had ended the war between the Federation and the Klingons. However, over time, humans had become increasingly dependent on these aliens to meet their basic needs, and (as in many episodes of Star Trek) this complacency would have disastrous consequences for the vibrancy and ambitious nature of humanity – resulting in a rebellion against the presence of the Companions.

    The situation was morally ambiguous – the desire for self-sufficiency and the obviously condescending nature and imperialist, colonialist allegory of the Companions were contrasted with the fact that many members of the Resistance had joined for less altruistic motives: xenophobia and nativism were so common as to be typical of its membership. However, in the grand tradition of (among other examples) To Serve Man, the aliens did indeed have a nefarious purpose: they intended to make humans totally dependent on them, only to then deprive them of their resources in exchange for their continued services – ultimately bleeding the Earth dry and leaving its people helpless and doomed to a slow, painful death. Although this eliminated the prospect of moral ambiguity between the two sides, it did tick off a number of boxes on Roddenberry’s ideological checklist: the theme of the show was anti-capitalist as well as anti-imperialist (Roddenberry would cite both India and China as two examples of real-life sites of similar exploitation by “Companions”), environmentalist (Roddenberry never failed to stress that Man was doomed to deplete the Earth’s resources on his own), and encouraged humanity to focus on personal (and collective) self-improvement through innovation and ingenuity, as opposed to reliance on outside, seemingly-omnipotent forces (vaguely anti-religious, though this theme was difficult to reconcile with the more concrete anti-imperialist angle and was not overly emphasized).

    Meanwhile, Roddenberry’s nemesis Tartikoff continued to prove his worth as an “idea man” for Desilu, even as he clashed with the production-level creative types to whom he inevitably handed these ideas off for development. His hottest new idea even got him a co-creator credit on Desilu’s latest action-drama, even though said idea had allegedly consisted of merely two words: “MTV cops”. Tartikoff wanted a cop show that was the antithesis to Hill Avenue Beat, the yin to its yang. This new show would be the “style” to Hill Avenue Beat’s “substance”.

    Since it was a show that would borrow heavily from the MTV aesthetic, he wanted “sexy” crimes and criminals to be thwarted – something high-stakes. Hill Avenue Beat, by contrast, borrowed from Captain Miller’s precedent of depicting the mundane, everyday lives of the average beat cop (hence the name of the show). “MTV cops” needed to be more exciting, more glamorous. Tartikoff first approached Stephen J. Cannell, who produced Hill Avenue Beat, to develop a second series for Desilu. [5] Together, the two hit on the idea of vice cops – drug cartels were the Prohibition-era gangsters of the 1980s, after all, a fact cemented by the recent – and highly-successful – remake of Scarface, directed by Sidney Lumet and starring Al Pacino as a Cuban refugee-turned-drug kingpin. That film had won Pacino his long-awaited first Academy Award for Best Actor. [6] It had been set – and shot – in Miami, the central hub of the Cuban-American community and still a fairly exotic locale up to that point – a seeming tropical paradise with a seedy underbelly. However, location shooting was an indulgence beyond Desilu, so the location was changed to San Diego, and the drug cartels changed from Cuban to Mexican in origin. (This actually reflected the shift from Miami to Mexican border cities taking place in the drug trade at that time, a happy coincidence.) San Diego was near enough – especially with the recent completion of the high-speed rail line between there and Los Angeles – that sojourns for location shooting would be feasible and relatively inexpensive. Ultimately, however, Tartikoff and Cannell would set the show in a fictional border city, which they named San Andreas (for the fault line which travelled along the California coast). The show would focus on a unit of the San Andreas Police Department Vice Squad – Vice Squad was the working title for the show, but it was ultimately dropped due to similarity with the parody cop show Police Squad!. After some deliberation, the show was titled Neon City Vice, after the in-universe nickname for San Andreas. Although drugs would be the show’s primary focus, prostitution, gambling, and alcohol could also be addressed, reflecting the scope of most real-life vice squads. Indeed, as a result, San Andreas was given an active nightlife, including casino resorts – San Diego by way of Atlantic City, a coastal gambling mecca – leading to the city’s famous nickname of “Neon City”, an allusion to Vegas and Reno, San Andreas’s desert counterparts.

    Casting for Neon City Vice was tricky – Scarface had faced considerable backlash from Cuban-Americans for their depiction in that film, and that (unsurprisingly) struck a chord with Lucille Ball, who after all had once been married to a Cuban, and had seen first-hand the discrimination he had faced (as well as what she herself had faced, being married to him). Any number of Mexican “baddies” would have to be countered by at least one unambiguous good guy of Mexican descent. This suggestion was supported by the actor who was first approached to play Senor Gutierrez, “the Al Capone of San Andreas”, the primary antagonist of the series (who would appear only intermittently, borrowing a convention from The Untouchables, an old Desilu show in which Al Capone and Frank Nitti were recurring characters). His name was Ricardo Montalban, and he was just finishing a lengthy tenure on Fantasy Island. Being a staunch advocate of positive depictions in the media, he would agree to appear only if – if – at least two Mexican actors be cast among the “good guys”, one of whom would be a co-lead and appear in most every episode. He offered the services of the Nosotros Foundation he had co-founded in finding and casting the right individuals, an offer that was accepted. Montalban relished the role of Gutierrez as it was written because the character was witty, suave, and charismatic – and unfailingly loyal to those who demonstrated loyalty to him in kind. It was very much a post-Godfather portrayal of a drug kingpin – but to compensate for this, and to better reflect the realities of the cutthroat cartels, he was unthinkably ruthless and brutal (as much as could be depicted on 1980s network television, at least) to his opponents, particularly those who betrayed him.

    Gutierrez quickly became the show’s breakout character, eclipsing T.R. Walsh as the villain television audiences loved to hate. Montalban was listed only as a “special guest star” for the episodes in which he appeared during the show’s first season, but his popularity ensured that he would be promoted to the opening titles (receiving the coveted “And” credit) for the second season in 1985-86. Likewise, the setting of San Andreas captured the popular imagination, being a composite of many attractive locales (Atlantic City, Las Vegas, Reno, and San Diego) and consistently depicted as glamorous and thrilling, dangerous and seductive. In a story widely believed to have been concocted by Desilu’s publicity department, Variety reported that the studio received letters (“by the truckload”) from viewers who sought directions to the fabled Neon City, unable to find it on road maps. Desilu had not enjoyed such a hold on the public since its previous commercial peak, with Rock Around the Clock and Three’s Company in the late-1970s.

    Unfortunately, television was a zero-sum game, and the success enjoyed by the shiny and new Neon City Vice directly detracted from the old-school Three’s Company continuation, Robby’s Roadhouse, which also found itself mired in an awkward position within the Desilu roster. It was unique only in that it depicted the adventures of a young(-ish; John Ritter and Pam Dawber were both over 30) married couple with no children; the Barefoot in the Park formula. Barefoot had struck a chord with many Baby Boomers in the 1970s, but perhaps the time for such a show had passed. With the economy improving, and with Boomers aging, many of them were beginning to settle down with children and in comfortable middle management jobs. Two plucky kids trying to start a new business (the titular restaurant) didn’t capture the pulse of the era in which new ventures tended to be multi-million-dollar public-private enterprises involving infrastructure or industrial complexes. Small business certainly existed, and remained vital to the American economic recovery, but it didn’t capture the popular imagination as much as the big-money, high-stakes activities which served as window-dressing for shows like Texas, Wasps, and Vintages, not to mention popular movies of the time.

    Ultimately, Robby’s Roadhouse was cancelled after just one season. This was partly because Robby’s Roadhouse was a show not only at war with itself, but also with Desilu’s other offerings: focusing too much on the workplace elements, it was decided, made the show too much like The Patriot (it didn’t help that two Desilu sitcoms out of four were titled after a hospitality establishment workplace). Focusing too much on the domestic situation, on the other hand, made the show too closely resemble sister series The Ropers, whose primary setting was the Ropers’ homestead, with a never-ending stream of drop-in neighbours. Surprisingly, The Ropers scored something of a coup when it nabbed a recurring cast member in the role of Helen Roper’s mother: Bette Davis. She and the studio chief, Lucille Ball, were very old friends, who had attended drama school together in the 1920s, and she accepted the role upon Ball’s personal request (there was no audition or casting process). It was the silver screen legend’s first role on a sitcom, for which she would win the 1984 Emmy for Outstanding Supporting Actress in a Comedy Series – a nomination which she was at first highly reluctant to accept, being from a generation of movie stars who, upon being elevated to lead, stayed leads for the rest of their career. Davis was only 11 years the senior of Betty Garrett, who played her daughter, but wryly remarked upon being so informed that she had always played older than she was for her entire career, and that this would be no exception. [7] Davis, who portrayed “Mother” with the same boldness which characterized all of her most famous roles, found herself popular with the post-Baby Boomer audience for the first time – for whatever reason, younger people loved The Ropers, despite that show focusing primarily on people beyond even the upper limit of the cherished 18-49 demographic. [8]

    Despite the success enjoyed by both The Patriot and The Ropers, Robby’s Roadhouse wasn’t the only Desilu sitcom that had to change creative direction to remain afloat; Eunice, which by 1984 had rather improbably become one of the longest-running shows on the studio’s current roster, also had to make changes. In general, primetime soap operas continued to perform very well indeed, despite Desilu’s curious disinclination to tackle the genre. The genre that the studio had pioneered, the sitcom, continued to be in dire straits. The Patriot was the only sitcom in the 1984-85 season to finish within the Top 10, which lent additional leverage to Tartikoff’s insistence that the producers not reverse the show’s creative direction. Arguably, however, what benefited The Patriot was to the detriment of the sitcom genre in general, which thrived on sticking to the status quo. Soap operas, by contrast, thrived on constant change. Although Desilu chose not to produce a bona fide soap opera, they did address the popularity of the genre by introducing parodic soap opera elements into their established Eunice sitcom, previously a relatively straight take on the kitchen sink realism style popularized by the Norman Lear sitcoms of the 1970s. Carol Burnett, the show’s star and producer, supported this shift in direction largely because she was a fan of soap operas, particularly All My Children. Longtime fans of the show, however, were more divided. Although never a ratings powerhouse (Eunice was Desilu’s lowest-rated show, though never too far behind critical darling Hill Avenue Beat), the show enjoyed a loyal – and vocal – cult audience, and this change brought about a deluge of protest letters to Desilu. But the studio couldn’t argue with results – ratings improved considerably, saving Eunice from a grisly fate, and leaving Robby’s Roadhouse to get the ax – the only Desilu show cancelled at the end of the 1984-85 season.

    In addition to the soap opera genre, the anthology format continued to be very popular for dramatic series in this era, irrespective of genre. The creators of Columbo, William Link and Richard Levinson, re-teamed (along with that show’s producer, Peter S. Fischer) for an Agatha Christie-inspired series which told the story of a middle-aged widow living in a sleepy seaside town who becomes a mystery writer: essentially, a composite of Christie’s beloved Miss Marple character with Christie herself. This was their second attempt to strike gold with a mystery series featuring a novelist protagonist, following Ellery Queen in the 1970s, and they scored an impressive coup when Jean “Edith Bunker” Stapleton, late of Those Were the Days, agreed to star. [9] She had been severely typecast by her seven years on that iconic 1970s sitcom, despite the small collection of Emmy wins she had accumulated in the role. Stapleton was naturally drawn to the role of Jennifer “Jenny” Stoner (or J.B. Stoner, as she was known professionally) because the character was so intelligent – a breath of fresh air after playing the ditzy Edith Bunker for so many years. It was a chance for her to show her range as an actress, even if it did mean another weekly series, a prospect about which she had some misgivings. Nonetheless, she figured the benefits outweighed the risks and took the job – a decision she would never regret. Indeed, her typecasting was almost immediately a thing of the past, as the show became the second-highest-rated new drama of the season (behind only Neon City Vice).

    Also popular during this era were revival anthology series, a trend kicked off by a big-screen version of The Twilight Zone, which in turn inspired CBS to revive the original series (produced in-house). The original show’s host, Rod Serling, obviously could not return due to his intervening death in 1975. However, Serling had not only hosted The Twilight Zone, but also Night Gallery, which had been produced by Desilu; as a result, Fred Silverman at ABC inquired as to whether the studio would be willing to mount a revival of that show as well. The word was out that NBC was planning a revival of Alfred Hitchcock Presents – Hitchcock, like Serling, was deceased, but unlike Serling, his framing segments pertained little to the episodes to which they were attached, and could be reused with impunity. Naturally, given the trends of the time, these would be colourized. Silverman also made clear that if Desilu were not interested in producing The Night Gallery, then he would be happy to work on a remake of The Outer Limits instead; the rights to that show had fallen into the lap of Ted Turner’s conglomerate. (Lucille Ball sarcastically mused as to why Turner wasn’t behind the remake of Alfred Hitchcock Presents instead, the crass colourization being “more his bag”.) Ultimately, Desilu declined Silverman’s offer – but, surprisingly, made a counter-proposal of its own.

    Desilu, after all, was once the House that Paladin Built, so named not just for Gene Roddenberry’s contributions, but also for those of another Have Gun – Will Travel writer: Bruce Geller. Geller had created Mission: Impossible and then Mannix for the studio before setting his sights on the silver screen, where he enjoyed moderate success as a writer and producer of action-thriller pictures. Unlike Roddenberry, Geller was not an increasingly deluded egomaniac, and maintained good relations with the studio whose residuals cheques had enabled him to lead a very comfortable life – and Tartikoff, eager to prove that he could keep longstanding producers in the fold, was keen to support a continued working relationship. [10] Indeed, Mission: Impossible was the third-most popular Desilu production in syndication, behind only I Love Lucy and Star Trek themselves – and although those two towered over just about all others on that front, Mission: Impossible was no slouch. Unlike many other classic series (including Star Trek), the show did not receive any sort of continuation or revival during the miniseries craze of the late-1970s, as it had ended as recently as 1973.

    Much like Star Trek, however, the show’s later years were seen as distinctly weak, marked by: plots shifting from Cold War foreign adventures to gangland crime syndicates (which required fewer purpose-built sets and less location shooting, offsetting the mounting expense of keeping Martin Landau and Barbara Bain on the payroll); the addition of Lynda Day as Dana, a protégée to Cinnamon, to add sex appeal for the younger male audience (which did not sit well with the aging Bain, nor did her more conventional damsel-in-distress character sit well with audiences); and the wholesale replacement of Willy Armitage (played by Peter Lupus) with Dr. Doug Robert (played by Sam Elliott). [11] Many fans – with a few quixotic exceptions, especially in the case of Dana – were happy to write off these developments in any would-be continuation project, and Bruce Geller (who had left the show long before such changes were introduced) was inclined to agree. That said, by the mid-1980s, many things had changed about the world. Japan and, to a lesser extent, Red China were major global players; détente with Russia had fully taken hold; and the stock enemy in international intrigue stories tended to be unreconstructed Backwards Bloc-type countries exemplified by junta-era Argentina. Satellite technology had widespread influence in industry and commerce; computers were a fact of life, common enough that the majority of households had a microcomputer (at its most broadly defined) by 1984. Thus, although the basic formula for the revival series would remain the same, specific plot nuances would change drastically. The plan was for the series to begin airing in the 1985-86 season, which would allow Lucille Ball’s effective reign as studio chief at Desilu to come full circle.

    In a shocking upset, NBC’s Wasps finished ahead of ABC’s Texas as the #1 rated series of the 1984-85 season; it was one of just two shows to place in the Top 10 for the Peacock Network that year. Texas itself finished at #2, heading the ABC roster in the ratings. The CBS soap opera Vintages finished at #4, a near-photo finish for the “Big Three” soap operas of the 1980s during their peak season, and (along with 60 Minutes) one of the few smash hits for the Eye in a lean period for that network. The highest-rated new entry of the season was Neon City Vice, which cracked the Top 10 for ABC, giving the Alphabet Network six shows in that highest echelon. The Patriot and The Ropers also cracked the Top 10 for Desilu; Eunice, surprisingly, fell just short of that threshold – though this was still a marked improvement on the previous season. ABC managed a lucky 13 entries in the Top 30, and NBC had eleven; CBS had a mere six. Even those CBS hits which did crack the Top 30 tended to skew hopelessly older; it was the 1960s all over again. History really did tend to repeat itself; this was as true in the ratings as it would prove to be at the awards shows.

    Hill Avenue Beat repeated once again for Outstanding Drama Series at the 37th Emmy Awards, marking four wins in total, all of them consecutive – feats which broke Emmy records previously (and jointly) held by Playhouse 90 and The Defenders. [12] Hill Avenue Beat won against hot competition from Desilu stablemate Neon City Vice, which nonetheless won several technical awards, along with Outstanding Supporting Actor in a Drama Series for Ricardo Montalban. The Ropers won for Outstanding Comedy Series, marking the third time in a row that Desilu claimed both top awards on the same night. Norman Fell and Betty Garrett also won for Outstanding Lead Actor and Lead Actress in a Comedy Series, which, accompanying the win by Bette Davis, made for a near-sweep of the acting awards in the comedy category. Lucille Ball, once again sitting in the front row at the awards ceremony with her husband, obligingly smiled for the cameras with each win for her studio, just as she did every year.

    But the smiles couldn’t quite mask her disillusion. The magic was gone. It was time to go out while she was still on top…


    ---

    [1] Desi Arnaz died of lung cancer IOTL, not liver cancer. However, given his heavy drinking and smoking, I feel that he was bound to be afflicted with one or the other.

    [2] Except in Canada, of course, where it aired two days earlier, on September 6, 1966 (a Tuesday).

    [3] And those I Love Lucy writers would work with her on what IOTL was her final starring vehicle, Life with Lucy, which began airing on September 20, 1986 (coincidence? I think not). This no doubt played a considerable part in that series – critically excoriated, and unwatched and unloved by viewers – being derided as so aged and passé. Despite I Love Lucy’s enduring popularity in reruns, popular tastes had changed somewhat in the 35 years since its premiere.

    [4] Known IOTL as Earth: Final Conflict, and not developed until after Roddenberry’s death (by his widow, Majel Barrett-Roddenberry). The name was changed because of its resemblance to the L. Ron Hubbard novel Battlefield Earth, a film adaptation of which was in production by that time. In addition, ITTL, Assignment: Earth never saw production in any form, not even as a backdoor pilot episode of Star Trek, and therefore Roddenberry’s decision to use that title format is not a retread.

    [5] This concept, which IOTL was set in the same city as Scarface and was thus called Miami Vice, was indeed said to have originated from Tartikoff’s two-word pitch (which, it must be said, is a great pitch). IOTL, Anthony Yerkovich, a writer/producer for Hill Street Blues, was handed the pitch and developed it without further active involvement from Tartikoff (Yerkovich is credited as the sole creator). However, the show’s creative direction and tone are usually attributed to executive producer Michael Mann. ITTL, Tartikoff, being lower on the creative totem pole and with fewer connections, merely brainstorms the idea with Steven J. Cannell, leading to both being credited as co-creators. (Cannell receives the sole development credit.)

    [6] Of course, Al Pacino appeared in Scarface IOTL as well – though there the film was produced as a rather bombastic, exploitative picture by a known exhibitionist, Brian DePalma. ITTL, the film is directed as a spiritual sequel to Dog Day Afternoon; it is the third pairing of director Lumet – who was originally attached to Scarface IOTL – and actor Pacino following that film and Serpico (note that all three films are about crime and punishment, as were many of Pacino’s other early films, such as ...And Justice For All and – of course – The Godfather and its sequel). Given Lumet’s directorial style, the film is much more deliberate and intellectual than the OTL version – and more attractive to the Academy as a result. After all, even by 1982, it could be argued that Pacino had been robbed of several Oscars (he had been nominated for all five of the films I previously mentioned).

    [7] “Mother” on The Ropers was played IOTL by Lucille Benson, born on July 17, 1914. Mrs. Helen Roper was played by Audra Lindley, born on September 24, 1918, for a mere four-year age difference.

    [8] Much like the OTL show which inspires much of its tone, The Golden Girls. Also like The Golden Girls (and another reason why Davis’ inclusion has proven so successful), the show enjoys a large gay audience – many of whom identify with both Mrs. Roper and her mother.

    [9] I couldn’t resist tweaking one of television’s all-time greatest casting WIs in having Jean Stapleton (the first choice of the producers of Murder, She Wrote for the role of Jessica “J.B.” Fletcher IOTL) accept the role here. Why? Well, she was only on Those Were the Days for eight seasons (as opposed to the ten seasons she was on All in the Family and Archie Bunker’s Place IOTL), making her less wary of a long-time regular commitment. And yes, this means that ITTL, Angela Lansbury has yet to have that breakthrough which makes her a household name and national treasure. How many times have I said it now? I’m not writing a utopia!

    [10] Also unlike Roddenberry, Geller was dead by this point IOTL. However, his cause of death was an easily butterflied plane crash. Other butterflies which predate that one enabled him to enjoy a relatively successful film career, though he wasn’t exactly setting the world on fire.

    [11] Not to put too fine a point on it, but Barbara Bain turned 40 in 1971 (at the beginning of the show’s fifth season), and even the five Emmys she won for her role as Cinnamon would probably not stop producers from thinking that her sex appeal might be fading. As a result, Lynda Day (George), who was the final female lead during the last two seasons IOTL, was cast as a young ingenue type starting in the show’s sixth season. On-set tensions between Bain and Day became the stuff of legend around the Desilu lot, and (at Bain’s insistence) Day was never added to the opening credits. However, Day’s vulnerable character (in contrast to her hyper-competent IMF teammates) did receive her share of praise from certain commentators (granted, many years down the line). Still, most people hate her, calling her pandering, demeaning, and distracting. However, perhaps Elliott’s character receives more audience hate, since Dana is at least not replacing Cinnamon outright. By the way, the character is named “Dana”, rather than “Casey”, as Lesley (Ann) Warren was never cast on the show ITTL, leaving the name of her character available for use.

    [12] Hill Street Blues won Outstanding Drama Series four years in a row IOTL, the first show in Emmy history to do so. (The West Wing and Mad Men would later share this distinction; L.A. Law would also win four times, though not consecutively).

    ---

    Thanks to e of pi, as always, for assisting with the editing of this update. Thanks also to Space Oddity for his input on a particular subject covered in some depth in this update (and to be covered further in a later update). Thus begins the penultimate cycle of the timeline! You may notice That Wacky Redhead is beginning to put her affairs in order. She’s been considering retirement for quite some time, as some of you might have suspected (even notwithstanding that this TL opens with her announcing her retirement), and the stars are finally aligned – but for how much longer?
     
    Appendix A, Part XI: Persistence of Vision
  • Appendix A, Part XI: Persistence of Vision

    The decision to produce an animated spinoff of Star Trek was, it could be argued, a long time in coming. Desilu had been receiving pitches from animation studios for over a decade before they finally decided to seek out a creative partner for the venture; their timing couldn’t have been better, given the changes in the industry. Following a recent (and unsuccessful) strike action, the Animation Guild had lost their protections over runaway productions, which had up until then prevented subcontracting to overseas workshops. [1] As a result, the Japanese Studio Aurora was hired to lend their talents to Star Trek, but Desilu didn’t have the logistical capability to work with them directly - an intermediary was needed. That intermediary was Hanna-Barbera, a studio which had worked with Desilu since the halcyon days of I Love Lucy. In collaboration with the producers and executives at Desilu, Hanna-Barbera designers would create the characters and settings, but all the actual animation would be handled by Aurora. Initial plans at Hanna-Barbera were to revisit their own early-1970s pitches for an animated Star Trek spinoff, but Brandon Tartikoff and Herb Solow would have none of it - the show’s concept was developed not only by Tartikoff but also by two key personnel from the making of the original series and the 1978 miniseries The Next Voyage: D.C. Fontana and David Gerrold.

    The decision was made very early in development that the crew of the Enterprise (and her successor ships, the Excelsior and the Artemis) would not be the stars of this series, which would take place several years after the conclusion of Star Trek: The Next Voyage. Instead, the USS Hyperion, a Titan-class Starship, under the command of Captain George Probst, was the primary setting for the show’s adventures. Captain Probst, by design, was a very different Captain from the iconic James T. Kirk - although he admired Kirk enough to name his own son, Jimmy, after him.

    The presence of the Captain’s son aboard the Hyperion was a remnant from the recently-cancelled Deep Space, also conceptualized by Tartikoff, who had felt that Gene Roddenberry had not taken this idea to its fullest potential. Indeed, Deep Space fans loathed the “tagalong kid” character, Wesley, believing that Roddenberry favoured him to the detriment of others; it didn’t help that “Wesley” was, in fact, Gene Roddenberry’s middle name. Deep Space fandom would come to use the term “Wesley” to describe any such character in any work of fiction; this usage proved infectious. [2] Jimmy Probst, by contrast, was less obnoxious, but would serve his intended purpose as a surrogate for the audience - curious and inquisitive, but far less reckless and resourceful than young Wesley had been. Like most young boys in cartoon shows, Jimmy was voiced by a grown woman. The fate of Jimmy’s mother - George’s wife - was deliberately left vague; D.C. Fontana favoured having her be divorced from George and willingly absent from Jimmy’s life. Other writers argued that a more potent narrative drive could be derived from her death - ultimately, all sides agreed to leave her fate uncertain.

    The fandom immediately sought to deduce the identity of Mrs. George Probst - and the theory that caught on was perhaps the most tantalizing: that Jimmy Probst was not named in tribute to James T. Kirk, but after his mother, who had in turn been named directly for James T. Kirk, her own godfather. The woman in question was Jame Finney, daughter of Lt. Cdr. Ben Finney, who had appeared in “Court Martial”. The explanation was distinctly non-canonical, but proved tenaciously popular.

    Although none of the regulars from the original series returned in that continued capacity, there were strong connections between the two generations, most notably personified in the Chief Medical Officer - Dr. Joanna McCoy, daughter of Leonard McCoy. A single, thirtysomething woman married to her job, the obvious (but low-key) attraction between her and the Captain became a running plot point in many episodes, as did her relationship with Jimmy - ever in search of a surrogate mother. The romance between George and Joanna was not overt, this being a show where much of the audience was actively opposed to romance. But it was a romance for the 1980s, between a single parent and a working woman. D.C. Fontana had always been fond of romantic plotlines, and had always felt an affinity for Joanna, ever since introducing her in the eponymous Star Trek episode back in 1969, and it became clear as time went on that the character was something of a surrogate for her.

    Another returning character was Freeman - the first gay character in Star Trek, as featured in The Next Voyage. This was David Gerrold’s idea. He saw the Ensign Freeman who appeared in several episodes of the original series (usually played by Shatner’s stunt double Paul Baxley), the Lieutenant Freeman who appeared in The Next Voyage, and the Commander Freeman who served as Security Chief and led all the landing parties (a long-held bugbear of Gerrold, who had grown to dislike when the Captain and First Officer beamed down to a hostile planet, putting themselves in mortal danger and potentially depriving their crew of strong and decisive leadership in a desperate situation) as all being one and the same, despite being played by three different actors. Commander Freeman’s sexuality was never explicitly mentioned in the cartoon, nor was he ever shown in a romantic relationship; the character was established as the same (to those in the know) fairly early on, given his mentions of having served aboard the Enterprise alongside James T. Kirk. All involved considered it an unsatisfying compromise, but even Star Trek had its limits. [3] As was the case with Joanna and Fontana, Freeman was obviously a surrogate for his creator, Gerrold (whose birth name was Jerrold David Friedman).

    The First Officer and ship’s Helmsman was a more assertive, Kirk-like figure named Cdr. Msizi Khumalo, a Zulu. Just like Uhura, he was an African character - and just like Uhura, his specific ethnicity was meant to capture the zeitgeist. Uhura had been made Swahili because of the affinity many black activists of the 1960s (including Nichelle Nichols herself) held for that culture; Khumalo was made Zulu as a statement against the Apartheid regime of South Africa - the Zulus were one of the main ethnic groups in that country. Notably, background fluff had Khumalo hailing from the “United States of Africa”, the same polity from which Uhura originated, implying a substantial territorial extent. [4] Probst and Khumalo disagreed frequently over preferred courses of action - their relationship was meant to emphasize the importance of consensus-building and collaborative decision-making. However, one of Khumalo’s primary flaws as a character was his occasional reluctance to accept Probst’s decisions as Captain - and it was also important to impart obedience to trusted and responsible authority figures (and institutions, here personified by Probst). Khumalo’s hotheaded and impulsive assertiveness was influenced in part by the enduring legacy of Blaxploitation heroes, especially the later wave of “Motherland” movies in that genre - though obviously sanitized for young audiences - and this was not without controversy. Khumalo also allowed the writers to critique Kirk’s “cowboy” reputation and how it could be obstructive - the irony being that Commodore Kirk himself was, in turn, Probst’s direct superior.

    Commodore James T. Kirk, for his part, was among the most frequently-appearing “legacy” characters, which was ironically achievable due to what the producers had initially feared would be a restriction: William Shatner did not return to play Kirk. Many of his co-stars did agree to reprise their roles, voicing their characters from Desilu’s studios in Los Angeles - voice recording for the regular cast was done in Toronto - but Shatner was the lone holdout. As a result, local talent had to be sought in the Toronto area, and a young impressionist comedian named Maurice LaMarche proved singularly able. [5] LaMarche’s skill and versatility would lead him to be hired to voice several characters for the series, and Star Trek would mark the beginning of his voice acting career in earnest. (Though he played the role fairly straight on the show, bootlegs from warm-ups and between takes of him goofing on the “Hammy Shatner” became hot commodities at Star Trek conventions ever after.)

    During development, Kirk had been earmarked for an overdue promotion to Rear Admiral (or even Vice-Admiral, matching the rank of his most frequently-appearing superior, Admiral Komack), but the Under-Secretary of the Navy, made aware of these plans through one of his assistants, personally requested that Kirk remain a Commodore so as to increase the legitimacy of the newly-restored rank of Commodore in the actual US Navy, which had met with considerable resistance among senior Captains (many of whom had held the courtesy title of “Commodore” before that was eliminated so as to prevent confusion - indeed, it was replaced with “Fleet Captain”, itself borrowed from Star Trek). This request was almost immediately leaked to the press (many suspected Gerrold was responsible, though he denied it), and though certain corners tried to make political hay of the issue, it was ultimately consigned to the same set of anecdotes as the $600 toilet seat. In-universe, Kirk continuing to hold the rank of Commodore was justified as a deliberate decision on his part, as it was the highest rank that enabled him to remain on front-line duty - anything above that would require him to fly a desk back on Earth. Indeed, he had resumed command of the USS Excelsior.

    And then there was the show’s most unique and inventive character: an android capable of directly interfacing with the Hyperion’s library computer itself. Named Internet (which was a reference to an obscure means of online information transmission used only by the military and research universities), the character stood in for Spock when it came time for the racial tolerance allegories. Internet also personified technology and artificial intelligence, two rather prominent fears in 1980s society. Although voiced by a woman, Internet was androgynous in appearance and had no gender in any meaningful sense of the word [6] - however, characters on the show (and the fandom at large) tended to use feminine pronouns for convenience’s sake, which she accepted. Her inquisitive questioning of gender roles additionally allowed the show to allegorize the changing realities of and expectations for men and women in modern society. (Characters who used dehumanizing pronouns such as “it” and “that” to refer to Internet were always depicted as being in the wrong for doing so - “she’s a valued member of our crew, not a thing or an object”).

    The primary advantage to switching formats from live-action to animation was that the latter allowed for countless settings and character designs, limited only by the imaginations of the artists and writers. By contrast, the original series had only been able to suggest locales more exotic than a well-dressed set or outdoor location through the judicious use (and, occasionally, reuse) of matte paintings for establishing background shots - characters, for their part, could only be as elaborate as the physical limitations of makeup, costuming, or prop-making technology allowed. The whole reason that Spock was depicted as half-Vulcan in the first place had been to imply that full Vulcans were more “alien” in appearance to ordinary humans - it was only as Spock became more integral as a character, and as the setting evolved from a relatively isolated “Wild West in space” to a more populated “Cold War in space”, that the need to depict aliens on a regular basis emerged. Not only Vulcans but also Romulans were differentiated from regular humans only by their pointed ears and eyebrows, and the greenish tint to their skin (which was difficult to discern on television sets of the 1960s). Even Vulcans could not appear in large numbers (most members of crowds tended to wear helmets which obscured their ears); more elaborate aliens, such as the blue-skinned Andorians and the porcine Tellarites, were very scarce. Even simple alien designs such as the Tribbles could only be produced in limited quantities - “More Tribbles, More Troubles” was produced only because most of the original tribble props from “The Trouble with Tribbles” had been saved, allowing for most of the budget to be spent on the other alien life featured in the episode.

    The animated series would change all that. Surprisingly, though, among the core characters, only Internet was non-human, and she was - if anything - easier to draw than most humans, given her androgynous and generically humanoid physique. However, among the supporting players, non-humans made their presence known. T’Pel, the female Vulcan Chief Engineer, was an attempt by D.C. Fontana to rehabilitate the reputation of the women of that species, generally remembered as either imperious and standoffish (like T’Pau) or shrewish and conniving (like T’Pring). T’Pel was also an Engineer (wearing operations red) to prove that not all Vulcans were theoreticians like Spock - though still (loosely speaking) a scientist (David Gerrold, discussing the cast of characters at the 1984 “Summer of Star Trek” convention, jokingly remarked that “all Vulcans work in the STEM fields”). Much as T’Pel was created to help defy stereotypes, so too was the Andorian Communications Officer, Ensign Thelos. Andorians were a notoriously warlike, cunning race - but Communications was by its very nature a passive, reactive position. Thelos was described internally as “the pacifist Andorian” - and he inherited the “forsaken his family legacy” plot point from Spock’s character, as his father was a typical Andorian martial officer who demanded that his son follow in his footsteps; instead, with the help of his Uncle, Captain Thelin of the USS Ares, he joined Starfleet. Despite not being a warrior, however, Thelos retained the Andorian mindset and had some difficulty adapting it to his new vocation (being a greenhorn fresh from the Academy). Rounding out the key officers of the Hyperion was the Navigator, Lt. Lora Quo [7] from China - the Chekov of the series, in that her role was intended to demonstrate a future in which Red China was harmoniously integrated with the rest of the world, similar to the role Chekov had played in the original series, representing the Soviet Union.

    In addition to Commodore Kirk, the classic crew of the USS Enterprise appeared in various capacities. In contrast to William Shatner’s absence, most of the original cast did reprise their roles.

    Leonard Nimoy returned as Spock, who continued to serve as the Vulcan Ambassador to the Federation, including in an episode loosely based on the original “Journey to Babel”, with the Hyperion serving much the same role that the Enterprise did (with the twist that Spock is not the suspected murderer but instead a would-be murder victim!). Likewise, Mark Lenard returned as his father, Sarek, informally “the President” but properly “President of the Federation Council”, though he rarely interacted with the main cast and existed primarily for expository purposes.

    Dr. Leonard “Bones” McCoy, being the father of one of the main characters, also appeared; DeForest Kelley reprised his role as well. Bones, like his friend Jim Kirk, held the rank of Commodore, and served as Chief of the Medical Research Department at Starfleet Headquarters - a job that enabled him to continue working with patients and in the field, as necessary.

    James Doohan voiced Commodore Montgomery “Scotty” Scott. Like Bones, he had returned to Earth to work at Starfleet Headquarters - specifically, the Research and Development Department. Scotty made it his mission to develop and test the necessary engine components to keep the Federation on the bleeding edge of Warp and Impulse propulsion technology. Functionally, this meant that Scotty appeared to introduce experimental engine equipment that needed to be “road-tested” on an active-duty Starfleet vessel - the Hyperion, being a modern front-line Starship, was often chosen for these tasks. Scotty often appeared in an episode only to brief the crew regarding the new engine component, leading many older fans to compare him with Q from the James Bond films.

    Captain Penda Uhura remained the CO of the USS Enterprise, as she had been at the conclusion of The Next Voyage, and continued to be voiced by Nichelle Nichols. Meanwhile, Captain Pavel Chekov was placed in command of the USS Ares, a frigate and sister ship to the lost USS Artemis, the doomed command of his onetime crewmate, the late Captain Walter Sulu. Chekov was voiced by Walter Koenig, who also contributed scripts to the new series. [8] (Other actors, including Rep. George Takei, the former Mr. Sulu, provided some story concepts, though obviously not all were used.)

    The character of Dr. Christine Chapel, human wife of Ambassador Spock and former Head Nurse aboard the Enterprise, was deemed too important to exclude from the continuing universe, despite Chapel having previously been played by Majel Barrett-Roddenberry, who declined to participate in the new series out of solidarity with her husband. As a result, the same actress who portrayed Internet also portrayed the ship’s computer voice and Dr. Chapel herself. The key difference between the computer voice and Internet’s voice was that the former spoke in a harsh, clipped, and robotic monotone - Internet spoke with a more natural voice, though it was still overly formal and prosaic, in the best tradition of Mr. Spock. [9] Chapel spoke with a fully naturalistic cadence, and without the reverb effect applied to both the computer voice and Internet (along with other computers and robots) to differentiate them from “organics”.

    Continuing on from the plotlines of the miniseries, relations between the three galactic Great Powers (the Federation, the Klingons, and the Romulans) remained tense. However, this remained in the background for the most part - Klingons and Romulans appeared infrequently. As befitted most cartoon series of the era, the adventures of the Hyperion were largely episodic, and educational. Like the original Star Trek, the animated series tended to quickly settle into formula. Usually, the Hyperion encountered an undiscovered planet, and the crew beamed down to initiate contact and open relations with the natives. Usually, these natives (who were rarely humanoid, and always inscrutable, taking fullest advantage of the animated format) were hostile, but oftentimes the two sides would be able to come to an agreement or compromise by the end. There were plenty of opportunities for conflict - senior bridge officers often disagreed on just how to handle the aliens. One of the staff writers, Mark Evanier, is credited with the idea - a fairly novel one in cartoons of the day - to give most opinions (with the exception of obviously “wrong” ones) a fair shot. Occasionally, dissenting characters were proven right by events, or after further debate - this flew in the face of the convention which Evanier described as “the complainer is always wrong”, and it found favour with Fontana and Gerrold as well.

    Another of the show’s writers, Paul Dini, tended to focus on the mystery elements of Star Trek. The newly-discovered world or alien would inevitably be hiding secrets which would have to be divined by the crew of the Hyperion before they could solve their problems. Perhaps the most ambitious of the show’s writers was J. Michael Straczynski, who reminded both Fontana and Gerrold of the late, great Gene L. Coon, with his penchant for world-building and running storylines. Obviously, there were limitations on how far he could take his ideas, but he relished the opportunity to give characters on all sides complex motivations. For this reason, Straczynski was named the show’s Story Editor, a position once held by both Fontana and then Gerrold on the original series. Straczynski also took up Gerrold’s torch in communicating regularly and openly with the fandom. The only other female staff writer was Gerrold’s former assistant, and ardent Trekkie, Diane Duane. By this time, she had begun writing her own original novels, but relished the opportunity to write “canon” Star Trek material.

    But all of the staff writers were acutely aware of the fandom, and how it would react to this new Star Trek series. The original Star Trek had certainly been accessible to children in its day, as most primetime shows in the 1960s were by design. However, it was not explicitly a “kids’ show” - it didn’t air on Saturday mornings, alongside Bugs Bunny and the myriad Westerns where the bad guys wore black hats and the good guys wore white ones. There was no getting around that this incarnation of Star Trek, by contrast, was primarily intended for a young audience, mostly the “Mini-Boomers” who had been born in the years following the end of Star Trek’s original run, the generation following the original Boomers who had watched Star Trek in such large numbers. To this end, the entire writing staff did their best to ensure that the new Star Trek could be enjoyed by the whole family - that parents who watched alongside their children could enjoy the show. The ample references to the original series, and vague hinting at political machinations (the classic Star Trek technique of making the galaxy seem much larger than it really was) both paid dividends here. The writers borrowed from shows like Sesame Street and The Electric Company, using humour and wordplay which appealed to both children and adults. In terms of character development, the dual appeal was achieved through a dual narrative focus: many episodes would focus on the plot from the perspective of both the father (Captain Probst) and his son (Jimmy). Other characters (such as Joanna or Khumalo) would mediate or allow for a compromise between these two perspectives. Sometimes Jimmy himself would serve as the mediator, most memorably in the holiday special.

    When the show began airing in the autumn of 1984, critical reviews were very positive. The fandom, on the other hand, was more mixed. Unsurprisingly, the old battle lines re-emerged with remarkable swiftness, as though it were 1978 all over again. In one corner, of course, were the Puritans, many of whom had washed their hands of Desilu after Roddenberry left that studio. The animated series being such a radical departure gave them further resolve - first Star Trek had mutated into a soap opera, and now it was a Saturday morning cartoon. Many in particular resented the presence of both a cute kid and a robot in the cast - the two character types that were traditionally anathema to science-fiction. Surprisingly, this gave them some common ground with Straczynski, who was vocal about his distaste for cute kids and robots, but as a writer, appreciated the challenge of making them work within the context of the show. However, this did not sway the Puritans one iota. In their eyes, the two men who had made Star Trek great no longer had anything to do with the franchise - one was dead and buried, the other’s career was dead and buried.

    Moderate fans tended to be more sanguine about the show. A consensus had emerged over time that the magic of the original, 1966-71 hourlong live-action series was lightning in a bottle and could not be replicated - and Desilu was wise not to try. One of the key objectives of this animated spinoff was to sell more toys based on the crew and ship designs featured, with object lessons (more educational and less philosophical than those of old) being a secondary concern, at best. Still, in their eyes, the new show was a continuation of Star Trek in every meaningful sense of the word. And, just as with the original series, the new show was a smash-hit with target audiences, quickly emerging as the #1 show on Saturday mornings with all demographics under the age of 18 - and, technically, with all audiences, though this was a lesser concern for advertisers; commercials for playsets and toys (including Star Trek playsets and toys!) dominated the timeslot.

    On July 31, 1985 (a Wednesday), The Animated Adventures of Star Trek won the inaugural Daytime Emmy Award for Outstanding Animated Program, the first “series” win for the franchise since 1971. D.C. Fontana and David Gerrold, the two showrunners, accepted the award. [10] The Daytime Emmys were held in New York City, and therefore most of the Desilu brass did not attend, but the Emmy win remained a powerful vindication of the studio’s decision to branch out into animation - and to have waited until the time was right to do so.

    ---

    [1] IOTL, the Animation Guild went on strike twice over runaway production: once in 1978 (they won), and again in 1983 (they lost).

    [2] Yes, ITTL, “Wesley” is the term fandom culture uses instead of OTL’s “Scrappy”. Note that TV Tropes formerly used the term “Wesley” to describe a related trope, the Creator’s Pet; ITTL, the term “Wesley” refers to OTL’s “Scrappy” and “Wesley” types, in much the same way that IOTL the term “Mary Sue” specifically referred to what is now called the “Mary Sue Classic” but broadened over time to refer to other Sue-types as well. (Also, yes, Roddenberry was a big fan of the name “Wesley” - the Star Trek character you’re likely thinking of right now was not the first to be so named).

    [3] And it’s still better than what the franchise ever managed IOTL!

    [4] The implication in the 1960s and early 1970s (ITTL, at least) was that the “United States of Africa” covered only the Swahili-speaking regions of Africa (Tanzania, Kenya, Uganda, Rwanda, and Burundi) but subsequently the US of Africa was implied to also cover the minority-ruled territories which became a political cause célèbre in the 1970s (Angola, Mozambique, Rhodesia, and of course, South Africa - including South-West Africa) - and, by extension, countries within their sphere of influence (Botswana, Zambia, Malawi, etc.). Thus, the United States of Africa, shown on a map of Earth in a Star Trek reference book printed in the early 1980s, covers all of Africa south of Zaire, Sudan (including South Sudan, obviously), Ethiopia, and Somalia - it also excludes the Angolan exclave of Cabinda, because that’s not very neatly space-filling, now is it? (The rest of Africa is deliberately left as a blank, undefined mass to avoid any unintended implications).

    [5] Maurice LaMarche got his start as an impressionist comedian, and toured with Rodney Dangerfield as his opening act for a time (appearing in that capacity on the 9th Annual Young Comedians Special in 1984). By this time IOTL, he had already appeared in holiday specials for the Canadian animation studio Nelvana, impersonating various celebrities. His first “real” gig IOTL is generally accepted to be his role on Inspector Gadget, starting in 1985; a series of tragedies in his personal life would lead him to abandon stand-up comedy for voice acting full-time some years later. ITTL, his skilled Shatner impression (which can be heard most anywhere on the internet) wins him the part, but in contrast to his OTL oeuvre of comedic and parodic takes on the character, he plays the role very straight. He never really did that IOTL, but consider Orson Welles, another of his famous voices, which he did play straight exactly once in his career: for Ed Wood (yes, that’s Vincent D’Onofrio playing his body, but it is LaMarche’s voice). I think it’s reasonable to assume that he can do for Shatner ITTL what he did for Welles IOTL.

    [6] This is a key difference from her OTL analogue, an android that was emphatically described as “fully functional” and engaged in sexual relations. That obviously wouldn’t fly on Saturday morning.

    [7] Quo is the name of the character in the show’s bible and in written promotional materials. The Chinese character representing her surname is 郭, which in the Pinyin script officially sanctioned by the People’s Republic of China is transliterated as Guō; however, this script does not enjoy widespread recognition in the West ITTL. The transliteration Quo is chosen over the more standard Kuo in order to look more “exotic”, though the name is obviously not seen by much of the viewership, given the lack of nametags. Quo has an English given name (unlike Khumalo, though as with her surname it has an unusual spelling) due to their widespread usage among the Chinese diaspora in the Anglosphere, which is therefore (mistakenly) perceived as standard.

    [8] Walter Koenig wrote a script for the animated series IOTL, “The Infinite Vulcan”. He was the only member of the original cast who ever wrote an episode of any series - Leonard Nimoy received story credits for The Voyage Home and The Undiscovered Country, and William Shatner received a story credit for The Final Frontier.

    [9] Essentially, the computer voice speaks with its original cadence, whereas Internet speaks with the same cadence that the computer voice adopted in later OTL productions.

    [10] The inaugural winner of the Daytime Emmy for Outstanding Animated Program IOTL was Muppet Babies, in the first of four consecutive wins (yes, even at the Daytime Emmys, they love their repetition). As ITTL, the award was first handed out at the 12th Daytime Emmys in 1985, signifying the new generation of Saturday morning cartoons entering the mainstream (in addition to Muppet Babies, the other nominees were The Smurfs, Alvin and the Chipmunks, and “old guard” holdover Fat Albert and the Cosby Kids).

    ---

    Thanks to e of pi for assisting in the editing of this update, as usual.

    And there we have it! The Animated Adventures of Star Trek, for your viewing enjoyment! Might I suggest a bowl of sugar cereal and footy pyjamas in order to replicate the full experience? :D
     
  • Appendix B, Part X: Faster than a Speeding Bullet Train

    John Glenn may not have been the first American in space [1], but he was the first astronaut to have ever served in Congress, and the first to become President. [2] (Sadly, and unlike his fellow Ohioan President, William Howard Taft, he had no plans to join the judiciary and complete the trifecta.) He was the latest in a long line of chief executives to hail from the Buckeye State – indeed, his inauguration would make Ohio the state to have produced the largest number of Presidents, with seven. [3] Glenn was the first Democrat among these – five of his six predecessors had been Republicans. (The first President from Ohio, William Henry Harrison, had been a Whig.) However, despite Ohio’s large population, and the “favourite son” effect enabling him to clinch the political bellwether state, Glenn’s victory over the incumbent President Reagan in 1980 had been rather narrow, given the unpopularity of the administration’s policies and the economic woes plaguing the United States (and the world) at the time. Psephologists explained this by pointing out that both minor parties (the AIP and Senator William Proxmire’s Earth Party) largely leached votes from the Democrats, while the Democrats in turn reduced the Republicans to their core voter base, plus perhaps a few voters who were personally drawn to Reagan, whose likeability ratings and charisma remained very high.

    Both the American Party and the Earth Party were in very different places by 1984 than they had been in 1980. Proxmire was aware that his act of open rebellion against the Democratic machine would have repercussions on his own chances for reelection in 1982, and indeed lost the Democratic Party’s nomination for Senate that year. Ideally, he would have sought to run on a “fusion ticket” of his Earth Party (which he kept active as a pressure group attempting to influence the Democrats from without) and the Democratic Party, but this electoral tactic had been disallowed in the state of Wisconsin. [4] He therefore sought – and received – the Earth Party’s nomination for Senate in 1982. He also promoted Earth Party machines not only within his home state, but elsewhere, touting the “progressive tradition” of the Upper Midwest, which had backed Theodore Roosevelt in 1912, had produced Robert M. La Follette in 1924, and had resulted in the Farmer-Labor Party of Minnesota and the Non-Partisan League of North Dakota. Unsurprisingly, then, the Earth Party would run candidates for Congress and the state legislatures throughout the Upper Midwest, most notably in Wisconsin, Minnesota, North Dakota, South Dakota, Iowa, and Michigan. The other target region for the Earth Party was the Northeast (particularly New York City, Boston, and Northern New England), and there they performed especially well in Vermont (which was traditionally Republican, but had never been crazy about Reagan’s conservatism).

    The coalition of Midwestern rural progressives and Northeastern urban liberals who had given the Earth Party 5% of the national vote in 1980 had remained in place for 1982, though not in the same numbers. Ultimately, the party would win no seats in the US Congress – Proxmire himself lost his seat to the Republican candidate, who came up the middle to win with less than 40% of the vote. However, he was vindicated in that he finished second, with over a third of the vote – the only Earth Party candidate for Senate who did (a number of Earth Party candidates for House finished second, but only because they ran in seats where the Democrats would have finished with supermajorities, or otherwise ran unopposed). In spite of these limited successes for his new party, Proxmire declined to carry the torch in future election cycles, entering a lucrative second career as a political commentator, emerging as one of the strongest left-wing critics of the Glenn administration. Though he praised the “Invest in America” program’s focus on infrastructure, he (naturally) vehemently and vocally opposed any further investment in the space program – President Reagan reducing NASA’s budget during his term was one of the few policies for which he had praised the Gipper. However, Proxmire could only do so much from outside Congress. Glenn had new allies in both Houses who supported the space program, from every section of the Democratic Party’s big tent. And to be fair, investment in the space program was very much a sideshow within the context of the broader Invest in America initiative, which focused largely on infrastructure and business development, and which Proxmire did support.

    In contrast to the slow and gradual decline of the Earth Party, the American Party flamed out in spectacular fashion. Like the Earth Party, the American Party had been the pet project of one man with a vision and a personality cult – and a former Democrat who left that party only to insist that the party, instead, had left him (a viewpoint shared by Democrat-turned-Republican Ronald Reagan, no less). However, George Wallace was welcomed back to the Democratic fold with open arms, if only because he continued to lead a numerically significant cadre of Congressmen despite his party’s decline (in contrast to Proxmire’s one-man operation). Although Wallace had not run for President in 1976 or 1980, he remained a highly visible and vocal backer of the American Party in the media and on the stump – until the Deal with the Devil went through. As was the case with many political organizations that had endured after the galvanizing issue of their formation had been resolved, the American Party had been forced to develop a more comprehensive agenda beyond support for segregation and opposition to further civil rights legislation. This was especially true in the states where they had taken control of the legislature, or of the Governor’s mansion. In general, many of their policies had been consistent with those of the Southern Democrats who had preceded them (or whom they had previously been themselves), and a “big tent” gradually emerged (as was the case for the Democratic and Republican parties, giving political commentators the opportunity to deride Washington as a “three-ring circus”). Wallace was the ringmaster; his authority alone kept the factions united through the 1970s, and once it had disappeared, each faction rapidly asserted itself and fought with the others for dominance.

    The faction to which Wallace himself had ostensibly belonged, and the oldest in the party, was the segregationist, or “traditionalist”, faction. All AIP senators belonged to this faction, which allowed the leader of the party in the upper chamber to double as the de facto head of that faction. After 1982, this was Trent Lott, the junior senator from Mississippi, who had always been considered a moderate – no doubt to appear more “palatable” to the general public. The second faction had emerged later in the 1970s, as part of the emergence of religious conservatism as a political force. The “evangelical” faction, as it was known, actually comprised both evangelical Christians (typically Baptists) and charismatic Christians (typically Pentecostals) – the two major theological movements in the South (though both enjoyed popularity in much of the United States). Many popular televangelists, who broadcast their sermons across the airwaves, were affiliated with this particular faction of the American Party, and many within this faction felt that their ideals were the future for the party. The third and final major faction were the labour unions who had affiliated themselves with the American Party, a legacy of Wallace having aggressively courted working-class voters (the “Archie Bunker vote”, as it was known at the time) in 1968 and 1972. Many within this faction were religious – though generally more of the Mainline Protestant or Catholic persuasion. Many were socially conservative and had reservations about the repercussions of desegregation, but generally considered that a sideshow to the bread-and-butter issues often tackled by labour unions. A common criticism of the “union” faction – from within and without the American Party – was that it had links to organized crime.

    All three factions would put forward their own candidates for President in the 1984 election: although both remaining US Senators were approached to run, both declined, focusing on their uphill reelection campaigns as opposed to what amounted to vanity runs. Eventually, Arnold Thibodeaux, who had served for eight years as a Congressman from Louisiana before he was defeated in 1982, was chosen to represent the traditionalist faction in the American Party primaries. Of the candidates representing the factions, Thibodeaux was the only one with political experience. Unfortunately, he was also a Grand Wizard at his local chapter of the Ku Klux Klan, and did not shy away from this association, making him highly unpalatable to all but the truest of true believers.

    More charismatic – in more ways than one – was the chosen candidate from the evangelical faction, the Rev. Ted Murray, a charismatic preacher and televangelist (originally belonging to the largest Pentecostal church in the country, the Assemblies of God, before separating to form his own congregation), whose syndicated series, Miracles, was a veritable institution in much of the South. It was known for its spectacle – Murray, like many televangelists, spoke in tongues, promoted direct revelation, and practiced faith healing, all live on the air. This was purely a vanity run for Murray; he did not even suspend the filming of his series during his campaign, though he did end each episode with a plea for his viewers to donate to his campaign fund – until the FCC ruled this illegal in mid-1984 (he could only ask for donations to his church, as the registered not-for-profit organization which was sponsoring the broadcast). Unsurprisingly, his campaign was beset by financing scandals. Many of his fellow televangelists accused him of vanity, pride, and various other Biblical sins, for seeking glory for himself as opposed to taking the traditional tack of influencing government from without.

    The third candidate was Tony Russo, a public-sector union agitator with long-suspected (but never confirmed) mob ties, who was controversial within the wider labour movement for his social conservatism and perceived unwillingness to “fight the good fight” on behalf of his workers (earning him the derisive nickname “Brother Tony”). The union faction rallied behind him largely due to their opposition to both other candidates: though many held what might charitably be called “traditional” views on race relations, endorsing a Grand Wizard of the KKK was beyond the pale; likewise, the many Catholics within the union faction felt little affinity for a radical Protestant televangelist (who himself was not averse to indulging in hoary anti-Catholic rhetoric when the mood struck him). Russo attempted to take advantage of this “wedge” factor to emerge as a compromise candidate for the wider American Party, or at the very least a kingmaker; however, the bosses within the party, desperate to cling to their dwindling legitimacy as a major force in American politics, were loath to select as their candidate a man who had never won (or sought) the votes of anyone beyond the few hundred members of his local.

    Ultimately, the American Party could not unite behind any of the three candidates, and fractured along factional lines. All three candidates would appear on the ballot, running variously on tickets labelled “American Party”, “American Independent Party”, “American Democratic Party”, “Populist Party”, “American Populist Party”, “Independent Populist Party”, “Christian Party”, “American Christian Party”, or “Independent Christian Party”, depending on the state – indeed, each of the three candidates used, in at least one state, a party label that a rival candidate used in another.

    And then there was the main opposition to the incumbent Democratic administration: the Republican Party. In the GOP, the question of who would earn the chance to oppose President Glenn in 1984 was directly contingent on the incumbent’s popularity. His approval ratings had begun to recover as 1983 wore on and the economy continued to improve; by the end of the year, it was clear that nobody had any serious chance of defeating him. As a result, many popular state Governors – or former Governors – sat out the campaign, not eager to deplete the goodwill in their favour for a vanity run. Instead, the “elder statesmen” of the Republican Party decided to indulge their egos with vanity runs of their own – at the end of the day, in addition to their legislative accomplishments, they could also say that they were their party’s nominee for President. And in the unlikely event of Glenn becoming vulnerable during a campaign, they would be excellent fallback choices for disillusioned swing voters – experienced, responsible, and mature candidates standing in opposition to the reckless cowboy astronaut currently in office. (Republicans had stopped disparaging Glenn as “Captain Kirk” once their market research and polling advisors informed them that making such comparisons consistently improved his likeability numbers.)

    The three titans of the contest were all members of the upper chamber: Sen. Bob Dole of Kansas, Sen. George Bush of Texas, and Sen. Howard Baker of Tennessee – the Senate Majority Leader. All three were perceived as moderates – a shift in direction from the conservative Reagan, and a reflection of the moderate wing of the party resuming control of its upper echelons following their 1980 loss. (It helped that Reagan had few likely successors within the party’s fiscon wing – his most obvious heir, Sam Steiger, the Junior Senator from Arizona, was deemed unelectable as President because he was Jewish). Many pundits felt that Sen. Bush had the edge over the other two: he had been elected three times in one of the most populous and economically powerful states in the Union (granted, all three times had been in good years for the GOP nationally), but despite the cultural conservatism of the Lone Star state, was relatively moderate – but unlike Howard Baker, not excessively so, nor exceptionally conciliatory towards the opposition, a common criticism within the party’s grassroots as regarded Baker’s tenure as Senate Majority Leader. Bush also had an edge over Baker (whose power base was restricted to the South) and Dole (likewise limited to the Midwest) in that his family was originally from New England (indeed, his father had been a US Senator from Connecticut) and still maintained a summer residence there (in their family compound in the small seaside town of Kennebunkport, Maine). Crucially, this enabled him to gain an edge in the critical first primary of the election season, which was held in New Hampshire, on February 28, 1984. [5] This followed on the heels of his narrow victory over Dole in the Iowa caucus, and rather improbably saw him march to a near-sweep of the states (Dole and Baker won their home states of Kansas and Tennessee, respectively).

    The 1984 Republican National Convention was held in July, by which time Bush had long since sewn up the nomination – accordingly, the event was treated as a coronation for him. Dole and Baker both spoke at the convention, as did former President Reagan and Vice-President Nixon. Media coverage of the event centred largely on the question of whom Bush would select as his running-mate. Being a Texan from New England, many pundits predicted a running-mate from the Midwest or West Coast to balance the ticket. He ultimately chose Houston I. Flournoy, the former Governor of California (who had served from 1975 to 1983), creating the fifth consecutive Republican ticket with a candidate from the West Coast. [6] Republican operatives fretted that Flournoy, who had worked with then-Governor Reagan as State Controller, might be too closely associated with the Gipper by the general public – however, Flournoy (unlike Reagan) was a moderate and had governed as such. Conservatives within the party were annoyed at an all-moderate ticket, but the fiscon faction continued to be in shambles from their comprehensive defeat in 1980. Flournoy, who had retired voluntarily in 1982 (though he likely would not have won re-election if he had run), had initially planned to retire from politics altogether, but (unlike many Vice-Presidential candidates) saw the perceived uselessness of the office of Vice-President as a benefit, not a drawback – it would give him plenty of opportunity to focus on other pursuits, or so he thought.

    And then there was the incumbent Democratic ticket. As was typical for incumbents in the 20th century (aberrations such as 1968 notwithstanding), President John Glenn faced no serious opposition for the Democratic Party presidential nomination in 1984, the many caucuses and primaries serving largely as a coronation. He would be elected unanimously at the Democratic National Convention in August, 100% of delegates having been pledged for him as a result of the primaries, where he received over 98% of the vote. [7] He re-selected Vice-President Jimmy Carter as his running-mate. Carter, like Vice-President Mathias before him, was chosen to appeal to an ancient and formerly integral but increasingly marginalized segment of his party. In the case of Mathias, it had been the liberal Rockefeller Republicans, and in the case of Carter, it was the Southern Democrats. Carter being an evangelical Christian enabled him to appeal to the religious conservatives who felt increasingly alienated from an ever more liberal and secular society, and in this respect he was very successful.

    Indeed, Carter played as large a part in the collapse of the American Party as George Wallace – Wallace swayed the elected representatives of that party, whereas Carter swayed its membership. In many respects, however, Carter was very similar to President Glenn, particularly in terms of ideology. Both were economic and social moderates. To this end, Glenn allowed his Vice-President (who, as a former Governor of Georgia, had more executive experience than Glenn himself upon taking office) considerable powers, and an office in the West Wing of the White House itself. [8] Indeed, by 1982, political commentators were already describing Carter as the most powerful Vice-President in the nation’s history, more powerful than even Richard M. Nixon, the last man upon whom that sobriquet had been bestowed, had been. In addition, Carter viewed himself as an ambassador of the Glenn administration, and his folksy, down-home charm informed his many appearances in the media. Polls consistently ranked him as the most popular member of the Glenn administration – even the First Lady, Annie Glenn, placed below him. Carter no doubt hoped to follow the lead of the late Hubert H. Humphrey, who in 1968 had been the first sitting Vice-President to be elected to the Presidency since Martin Van Buren in 1836. [9]

    Carter’s popularity throughout the South, coupled with the American Party implosion, convinced Democratic Party strategists to set the objective of winning every state east of the Mississippi – something no Presidential ticket had ever done. During the early years of the Republic, not all territories east of the Mississippi had yet achieved statehood, and in all the elections since, even in landslides, there were always holdouts for one side or the other. In the case of Republican landslides, the Solid South had always held firm for the Democrats; in the case of Democratic landslides, at least some of the Republican bastions in New England could be counted on to come through for the GOP. But Glenn was polling well in New England, and Carter ensured that the Democrats had restored much of their Southern support, so the possibility was a very real one. There was a setback once Bush – the only Republican candidate who had any ties to New England – was chosen as the Presidential nominee, but Democratic strategists continued to be optimistic. By 1984, they finally had reason to be. After a slow start, the Invest in America program was finally beginning to pay dividends.

    The cornerstone of the Invest in America agenda had always been infrastructure, for the simple reason that it best enabled future growth in other sectors. To this end, the most visible and highly-touted accomplishment of the program had been the development of the high-speed rail network, whose tendrils gradually spread from the central nexus points in various conurbations coast-to-coast, to increasingly distant terminus points. President Glenn, his Secretary of Transportation, and Congress had carefully arranged for many marquee lines to be operational in time for the election, and indeed, by the autumn of 1984, they were ready on schedule – which was, in and of itself, a crowning accomplishment. Investors from all across the country were eager to reap the potential benefits from transporting passengers along these lines, and the most eager of these were the casino resort hotel operators in Las Vegas.

    Thanks largely to their investments, the Los Angeles-to-Las Vegas rail line – which, due to its length, was considered a second- or third-tier priority – was on track to be completed far ahead of schedule, and with considerable private funding. In fact, the LA-to-Vegas line had the largest proportion of private-to-public funding of any line being built in the country – even more than the LA-to-Anaheim spur partly funded by Disney and Gene Autry. The objective was clear – bring in more gamblers to Vegas. The HSR line, once completed, would likely be cheaper than air travel, and certainly more convenient. Contrary to stereotype, however, the moguls and mobs who ran Sin City tended to hedge their bets; therefore, the city would eventually develop into both a flight hub and a rail junction – with plans to build lines to Los Angeles, San Francisco (via Reno), Phoenix, and Salt Lake City, though many of these lines were years from commencement, let alone completion. Still, that didn’t stop Vegas impresarios from dreaming of inviting daytrippers, overnighters, and weekenders from coast to coast to visit their famed showgirl revues… or their strip clubs. The seediness of Vegas additionally made for a delightful contrast with the wholesomeness of Anaheim, which unlike Vegas was already part of the HSR grid.

    The increasingly nationwide HSR grid was connected to the nationwide power grid, a decision made by the Glenn administration as part of what was touted as the “holistic approach” of the Invest in America program, in contrast to the more ad hoc, piecemeal public works projects of eras past. The Secretary of Energy focused on several alternative sources of power generation: natural gas, abundant in the United States, unlike primarily imported crude oil; hydroelectric power, to take advantage of the plentiful waterways in the vast American countryside; and, most prominently, solar power. Solar panel technology had taken a great leap forward in the 1970s, thanks largely to the need to develop it for use in the infamously failed “microwave power” experiment. At least some good could be derived from that white elephant by reusing the panels developed for it on terra firma instead. The largest solar plant to be built was capable of generating 50 megawatts of electricity, and was located in perpetually sunny Phoenix, Arizona; the panels were fabricated in the “Glass City” of Toledo, Ohio. The Buckeye State, unsurprisingly, saw an influx of high-tech manufacturing jobs under the Invest in America initiatives, being not only a key swing state but also the President’s home state.

    However, the star of the Glenn administration’s plan for the energy sector was the lowly atom. Anti-nuclear activism had been a cornerstone of the hippie and later environmentalist movements, due in large part to its association with the bomb and with the (negligible) risk of a catastrophic meltdown, respectively. However, the former concern was discredited with the constructive use of nuclear-powered carriers and submarines, and the part they played in the relief efforts in Argentina following the war in that region; the latter concern was (temporarily, but critically) overshadowed by fear of the microwaves “cooking” unsuspecting earthbound victims, as memorably dramatized in The Greenpoint Dilemma. Hysteria shifted in that direction for just long enough that Invest in America cleared the funding for the continued construction of almost forty additional nuclear power plants, almost half of which went into operation throughout Glenn’s first term. Continuing breakthroughs in safety and efficiency (with particular emphasis on thorium and molten salt reactors) convinced many in Congress (particularly within the GOP) and in the White House that the future of American energy rested upon the atom – nuclear fission was more than adequate to meet the needs of the American people for the next half-century, by which time fusion power might finally become a reality. In the meantime, switching to nuclear (alongside other alternative power sources) would ideally end dependency on foreign fossil fuels, which since the Oil Crisis a decade before had proven a perilous strategy. In addition, the high-speed rail lines would be powered by these nuclear plants, and would in turn reduce consumer dependence on automobiles powered by gasoline, itself refined from crude oil. [10]

    The goal of reduced reliance on foreign oil formed a powerful undercurrent of the Glenn administration’s approach to foreign policy. Traditionally strong relations with Saudi Arabia grew increasingly strained as American diplomats felt more free to agitate for liberalization of their oppressive regime – in turn, Saudi ambassadors became less inclined to conceal their displeasure at US recognition of and support for the state of Israel. [11] However, no country in the Middle East felt the brunt of US political pressure more than Libya, which had formerly (as the Kingdom of Libya) been a US ally, but had undergone a violent revolution in 1969, and was now under the leadership of the brutal and flamboyant dictator, Muammar Gaddafi, often derided as “Gaddafi Duck” in US media. Argentina had provided a vivid example of what could happen when a tin-pot dictatorship pushed its limits too far, and Libya – being in the Mediterranean – was much closer to the NATO heartlands than far-flung Argentina. Not to mention that Spain and Portugal, which had recently joined NATO, were eager to bring their own aircraft carriers (then under development) to bear against an easily-defeated enemy – alongside Italy (which was too close to effectively ignore any threat from Libya) in addition to Britain, France, and the United States. As a result, Libya was frequently lampooned in the media as a classically ineffectual third-world state – neither the Soviets nor Red China wanted much to do with Gaddafi’s regime either. Even other pariah states looked down on him.

    By contrast, US relations with the two Communist Great Powers were in a better place, though not without their growing pains. Relations with Red China were tense, but by the same token, relations actually existed. Ironically, the very fact that relations now existed between the United States and Red China resulted in a downturn in relations between the two superpowers – the détente that had existed between the United States and the Soviet Union under Humphrey and then Reagan did not fade entirely, but the situation was considerably more tense than it had been in a time when the Soviet Union had actually voted YEA to a proposal by NATO-led forces to invade Argentina. This was reflected in the popular culture of the day, where China (along with Japan) gained increasing prominence at the expense of Russia and the Soviet bloc. With regard to more traditional US allies, Glenn personally got along well with many foreign leaders, including Prime Minister Stanfield in Canada, Prime Minister Whitelaw in the UK, and President Mitterrand in France, all of whom found him more their style than his predecessor. Glenn was widely liked and admired at home as well.

    Although he was often compared to Captain/Commodore James T. Kirk and other “space cowboys” of his ilk, perhaps the most pertinent comparison made of Glenn was to President Eisenhower. Like Eisenhower, Glenn served in World War II, though as a low-ranked officer in the Marine Corps (he had been promoted to Captain – equivalent to the Naval rank of Lieutenant – just before the end of the war), fighting mostly in the Pacific – as opposed to Eisenhower, who fought in North Africa and Europe and reached the rank of General of the Army and Supreme Commander of the Allied Forces stationed in those theatres. Like Eisenhower, even during and subsequent to his Presidency, he would be remembered in the popular imagination largely for his career before entering politics; in Glenn’s case, it was his time as an astronaut which would define him most vividly. Eisenhower and Glenn also resembled each other in their domestic policies; both Presidents focused heavily on infrastructure, particularly transportation infrastructure. Eisenhower created the Interstate Highway System; Glenn spearheaded the high-speed rail network which, it was hoped, would criss-cross the country by the year 2000. His goal was for it to be more convenient and comfortable to cross the country by rail than by plane, and faster and more economical than to do so by car. The interstates would ideally be the province of the shipping and hauling industries – in addition to Eisenhower’s intended purpose, that of facilitating transport of troops and materiel. True mutual exclusivity was never likely to come to pass, but the division of labour did allow for high-speed rail to focus solely on passenger transportation going forward.

    It was along these battle lines that the election was ultimately contested in the autumn of 1984. Fittingly, for the first time since Eisenhower’s reelection in 1956, the campaign was fairly quiet and without much controversy on either side. Bush made the decision early on to attempt to demonize his opponent; Glenn, already intimately familiar with smear attempts by past opponents, chose to run a “high road” campaign, avoiding directly challenging Bush in his campaign materials and advertising. To this end, his most famous campaign commercial, entitled “A Brand New Day is Dawning for America”, featured the popular 1982 Don Henley song of the same name (officially titled simply “Brand New Day”). [12] Henley, a devout Democrat, allowed Glenn’s campaign to use his song free of charge. He also memorably performed “Brand New Day” at the Democratic National Convention that summer, in which he did indeed add the lyric “for America” to his refrain “a brand new day is dawning”. In the end, the GOP could not hope to compete with this message of optimism against the backdrop of a stable and improving economy. It became clear by mid-1984 that the Republicans would struggle to maintain the 27 states and 215 electoral votes that Reagan had won in 1980, even allowing for the reapportionment following the most recent census, which had given many states a different number of electoral votes than had previously been the case. The election results on the evening of November 6, 1984, would ultimately confirm the futility of their challenge.

    Nationwide, the Republicans ran ahead of the Democrats only in the Great Plains and Mountain states, including Utah, Wyoming, and Idaho. Bush led in his home state of Texas, but by a far closer margin than the state’s robust conservatism might otherwise imply (after all, the Democrats had won the state as recently as 1972, without a native son on the ticket or in the White House). Because Bush had been forced to retrench to traditional Republican strongholds, Glenn was able to fill the void he had left behind and conduct a fifty-state campaign, often appearing with prominent figures local to each state to celebrate whichever specific Invest in America initiative had brought home the pork for the local electorate. Hollywood, naturally, served as the epicentre of this demonstration – the 1984 Democratic National Convention was held in Los Angeles, after all, and among those who held court on the stage of the Forum were Senator Marlin DeAngelo and Rep. George Takei, both of whom played key roles in working to seize control of Congress back for the Democratic Party. However, though 1984 was a good year for the party, it was not without setbacks.

    The Democrats ultimately regained control of the House of Representatives, eking out an absolute majority of elected seats for the first time since 1972 – their control of the 97th Congress having been secured only through the controversial defection of Wallace’s faction of the American Party. Even without taking those Congressmen (Alabama’s entire Democratic delegation) into account, the Democrats still won enough seats to form a majority – though they remained under the 240-seat threshold, a once-reliable minimum size for a majority caucus through the 1960s. Far more of their gains in this election came at the expense of the American Party, which (in the face of fractionalization at the Presidential level) lost every seat in the lower house, leaving it shut out for the first time since 1972. This allowed the Republicans to remain above 200 seats, preserving a robust and vigorous minority caucus; the Speaker of the House, Tip O’Neill (resuming the position he had briefly held from 1981 to 1983), saw fit to eliminate the distinction between “Opposition Leader” and “Minority Leader” which the Republicans had introduced in the 1970s to play the Democrats and the Americans against one another.

    The Senate, however, remained more elusive. The GOP won 51 seats in the upper chamber; just enough to prevent the Democrats from gaining control there (had there been a 50-50 tie, Democratic Vice-President Jimmy Carter would have been able to cast tiebreaker votes). Both remaining incumbent AIP Senators (Trent Lott of Mississippi, and James Waggoner of Louisiana) were up for re-election, and their neighbouring states enabled them to campaign together with relative ease – both were members of the same faction of the party (the traditionalist, segregationist faction), which enabled them to support Thibodeaux for President, and allowed him to campaign with them as well. In the end, however, it didn’t do either Lott or Waggoner much good – in Mississippi, Lott finished second behind Rep. Thad Cochran, a Republican, who won with the support of the state’s black population (the most Republican black voters in the country), giving the state two Republican Senators for the first time since Reconstruction. It was the only Senate race in the country where the Democrats finished third. However, the Democrats did win Louisiana, the only Senate race in the country where the Republicans finished third. The AIP was shut out from the Senate, and except for the independent Henry Howell of Virginia (a former Democrat who caucused with his old party, and who had faced no Democratic opponent in the election in exchange for his support), the Senate of the 99th Congress would have no members who did not belong to either the Democratic or Republican parties.

    The Democrats were frustrated at not winning total control of Congress, but many Americans were satisfied – after the extreme polarization, dramatic shifts, and hyper-partisanship of the 1970s – that no one side had total control of the US government. Howard Baker, the popular, moderate, and congenial Senate Majority Leader, stepped down in 1984 after losing the Republican nomination for President – his successor would be another failed candidate for the Presidency, Bob Dole. (Bush briefly entertained the notion of attempting to parachute into the position, but ultimately decided against it.)

    The relatively close results in the races for Congress belied the more decisive result in the race for the Presidency. Glenn didn’t merely echo Eisenhower in terms of his career history, nor in terms of his governance, but also in terms of his electoral results: the 1984 election echoed 1952 and 1956, in that Glenn – like Eisenhower – won by a landslide in the popular vote, and a lopsided victory in the Electoral College; in 1952 and 1956, the Democrats under Adlai Stevenson were reduced to their core bastions in the “Solid South”, which in 1984 the Glenn/Carter ticket swept for the first time since FDR four decades before. [13] Likewise, in this election the Republicans were reduced to their core bastions in the Great Plains, producing an electoral map not dissimilar to their showings in the 1940s. Senator Bush narrowly avoided humiliation by winning his home state of Texas, which provided the GOP ticket with 28 electoral votes. In all, his ticket would carry twelve states – a number which belied the small populations of many of them. Only seven of those twelve states had more Congressmen than Senators, and only Texas had an electoral vote count in the double-digits. However, Bush’s strength in New England – as was, once again, traditional among Republicans – enabled him to narrowly win New Hampshire, preventing the Glenn/Carter ticket from pulling off the first sweep of every state east of the Mississippi. Nonetheless, Glenn could take comfort in having won Vermont – he was only the second Democrat ever to do so, after LBJ in 1964. The only state which went to the Republicans in both 1964 and 1984 – and therefore, the most Republican state in the nation – was Arizona, which had last voted Democratic in 1948. [14]

    More importantly, for the first time since 1964, not a single state had been won by a third-party ticket – all told, the three AIP candidates received just over 1% of the popular vote. The remaining third-party candidates won less than one-half of one percent of the vote; the two major party candidates, between them, won 98.6% of the popular vote, the largest such share since 1964. It appeared that the surprisingly long and tenacious tenure of the AIP had finally come to an end. Then again, there might yet be a possibility for the turkey to arise, like a phoenix, from the ashes…

    …but probably not.



    Map of Presidential election results. Red denotes states won by Glenn and Carter; Blue denotes states won by Bush and Flournoy. The Democratic ticket received 54.4% of the popular vote and 452 electoral votes; the Republican ticket received 44.2% of the popular vote and 86 electoral votes. The ten-point popular vote margin enjoyed by Glenn/Carter was the largest for any ticket since the 1964 Johnson/Humphrey landslide, as was their total of 452 votes in the Electoral College (narrowly edging the 450 electoral votes received by Reagan/Mathias in 1976).

    ---


    [1] For whatever reason, although popular culture consistently remembers John Glenn as the first American in space, he was actually the first American to orbit the Earth – an important distinction. Glenn was actually the third American in space; Alan Shepard was the first – although at least Shepard reached the moon (on Apollo 14, IOTL and ITTL), and became the first to golf thereupon. (The second American in space was Gus Grissom, who tragically died in the Apollo 1 fire, again IOTL and ITTL.)

    [2] IOTL and ITTL, though OTL has a technicality: Sen. Jake Garn (R-UT), who was inaugurated three days before John Glenn (on December 21, 1974), flew aboard the Space Shuttle Discovery (while still a sitting US Senator!) as a Payload Specialist in April of 1985. This makes Garn the first person to serve in Congress who would also fly in space. (Not to be outdone, Glenn also flew in space while serving in Congress, in late 1998, by which time Garn had retired from the Senate.)

    [3] There have been six Presidents IOTL whose home state was Ohio, and these were, in order: William Henry Harrison (1841); Rutherford B. Hayes (1877-81); James A. Garfield (1881); William McKinley (1897-1901); William Howard Taft (1909-13); and Warren G. Harding (1921-23). Apart from Harrison, all were Republicans. Notably, none of these six Presidents served two full terms, and indeed, a whopping four of the eight Presidents (50%) to have died in office (and two of the four Presidents to have been assassinated) hailed from the Buckeye State. Consider that, ITTL, the Curse of Tippecanoe has yet to be broken, and that Glenn – elected in 1980 – is from Ohio, and the superstitious are understandably leery of his chances for serving those two full terms, even if he is re-elected.

    [4] Electoral fusion – the practice of a single candidate running on the line of more than one party – was outlawed in many states in the early 20th century, and Wisconsin is among these states. (It remains legal in a handful of states, including New York, where it is widely practiced.)

    [5] IOTL, in the most directly comparable set of GOP primaries (held in 1980), George Bush won 32% of the vote in the Iowa caucuses, narrowly beating the frontrunner, Ronald Reagan (with 30%), and declared that he had “the Big Mo”, which lasted one month and five days before Reagan crushed him in the New Hampshire primary (50% to 23%). Given that New Hampshire was in New England and Bush had connections to two other states in the region (Maine and Connecticut, both of which he did win), a better showing for him was expected. Ultimately, he won only six of 50 states, and New Hampshire would prove a thorn in his side in 1988 and 1992 as well (though he ultimately won the Granite State both times).

    [6] This is just another one of those peculiar streaks associated with Presidential tickets – IOTL, for example, every winning Republican ticket since 1928 has had either Richard Nixon or a Bush on it.

    [7] It bears noting that no incumbent Democratic President has ever received so large a share of the primary vote in the modern era: Jimmy Carter fended off a strong challenge from Ted Kennedy in 1980; Bill Clinton in 1996 and Barack Obama in 2012 each won a relatively modest 89% of the primary vote. (By contrast, Ronald Reagan won 98.8% of the primary vote in 1984 IOTL.)

    [8] This honour was first bestowed IOTL by Jimmy Carter upon his own Vice-President, Walter Mondale.

    [9] Richard Nixon narrowly thwarted Humphrey’s attempt to turn the trick in 1968 IOTL, of course, which was payback for his own attempt to do so having been thwarted by JFK in 1960. IOTL, the first (and to date, the only) sitting Vice-President since Van Buren to be elected President was George Bush, in 1988.

    [10] Electric cars, deriving their power from rechargeable batteries (along with hybrids, which supplement their internal combustion engines with battery power as needed), are considered a medium- to long-term solution to the problem of gas guzzlers at this point ITTL, with the research needed to develop their viable manufacture and usage deemed decades away at the time of the Oil Crisis, and not much sooner in the early 1980s, when the Transportation Department funded a comprehensive study on the matter.

    [11] Relations between the United States and Saudi Arabia IOTL, on the other hand, have always been very strong, which has had decidedly mixed results.

    [12] The “Brand New Day” commercial is, of course, based on the OTL “Morning in America” commercial for Reagan’s 1984 campaign. Reagan also memorably used “Born in the USA” as a campaign theme (briefly) before the pointed subtlety of the verses emerged from beneath the anthemic chorus and (more importantly) Springsteen himself vehemently protested the use of his song by the GOP (this seemingly happens every election cycle – old-school rockers naturally don’t tend to get along with “the Man”, and yet “the Man” always tries to co-opt one of their anthems).

    [13] IOTL, no Democratic candidate for President has won every state in the Old Confederacy since FDR in 1944. Truman in 1948 lost four states in the Deep South to Strom Thurmond’s splinter “Dixiecrat” ticket; Stevenson lost three states to Eisenhower in 1952 and four in 1956; JFK lost three to Nixon and two to Harry Byrd in 1960; LBJ lost four to Goldwater in 1964; Humphrey lost five to Nixon and five to Wallace in 1968 (carrying only Texas); Nixon swept the South (and everywhere else, except Massachusetts) in 1972; even Carter still lost Virginia in 1976 (by 1.34%), and that’s the closest any Democrat has come to sweeping the Old Confederacy since FDR (he also lost Oklahoma, not yet a state in the 1860s but generally regarded as having been Confederate territory, by 1.21%).

    [14] Although Arizona voted all four times for FDR, and quite decisively for Truman in 1948, it has been a solidly Republican state ever since – favourite son Barry Goldwater won it in 1964 (his only win outside the Deep South) – with one curious aberration: Bill Clinton carried the state in 1996, despite having lost it in 1992; along with Florida, it was one of two states the Republicans won in the earlier election only to lose in the later one. ITTL, the GOP wins Arizona partly due to Goldwater’s legacy and partly due to the staunch support of the state’s other Senator, Sam Steiger.

    ---

    Thanks to e of pi for assisting with the editing, as usual. Thanks also to vultan for his help with the preliminary planning for this update, all those eons ago, and to Asnys for his advice regarding nuclear power. I hope you all enjoyed this last US Presidential election to be covered in That Wacky Redhead! Who will run in 1988? Sadly, the world will never know…
     
    The Thrill of Victory

    Historic Alnwick Castle, seat of the Duke of Northumberland, and primary setting of location shooting for The Crookback, in early 1983.

    British television had been broadcasting regularly scheduled programming since before the Second World War, giving it the longest history of any national television broadcasting industry in the world. This longevity, when combined with the tremendous prestige accrued by the state-owned BBC, unsurprisingly fostered an atmosphere of elitism and conservatism within the highest echelons of that industry. Private broadcasters were only (and belatedly) introduced by legislative fiat, rather than by free enterprise as had been the case in the United States, and even these were carefully regulated, though they depended on advertising revenue as did most private broadcasters elsewhere. The BBC, by contrast, was funded not only from allocations by the Exchequer, but also through the television licencing scheme, giving viewers (and critics) a personal stake in the quality, variety, and modernity of the service’s programming. ITV was less directly accountable to viewers, but counter-intuitively owed more to their loyalty; advertising revenues, based on viewership figures, were more fickle than the licencing revenues, which were forwarded to the BBC no matter what.

    When it became clear that a fourth television service was finally and definitively going to begin broadcasting in the early 1980s (as a result of the Broadcasting Acts passed through Parliament in 1979 and 1980, fulfilling a campaign promise made by the governing Conservatives in the 1978 election), the question of how to fill the newly-vacant airspace – one-third again as much as had been available previously – dominated the planning process for executives at ITV, who were due to be awarded the fourth service (to be branded ITV-2) in 1982. Game shows were quite popular in this era, and could be produced both cheaply and quickly, but a channel could not be built on game shows alone. Delivering on the consistent – and insistent – demand for regularly-broadcast league football games was rightly seen as an avenue with vastly more potential. The notion of there not being enough room for them on the schedule, an overriding concern in the 1970s, had evaporated. Thus, negotiations commenced between the Independent Broadcasting Authority, which operated both ITV channels, and the Football League, which governed the professional game in England and Wales, for broadcast rights.

    Association football was, even by 1980, the most popular sport in the world – but television coverage of the beautiful game had been surprisingly sparse in its homeland up to that point. The quadrennial World Cup had been available starting in 1954, and broadcast live from 1966, but regularly scheduled top-flight league programming – already a reality for most sports in North America by this time – remained largely unavailable into the 1970s. League highlights were broadcast on the Match of the Day program each Saturday evening starting in 1965, but these merely served to whet the appetite for a feast which was years in the making. It was only the commencement of broadcasts by ITV-2 which finally brought an end to this famine – and, surprisingly, the more established and mainstream (and obligingly-renamed) ITV-1 found itself getting into the act as well, though only for marquee Football League matches. The first major league football match broadcast on ITV-1 was the 1982 Football League Cup Final, on March 13 (a Saturday). Aston Villa defeated Nottingham Forest to win that Cup; they also won the Football League Championship that same year.

    Each ITV-2 affiliate station carried matches played by prominent local teams, particularly where such teams were members of the First Division. Where this was not possible (due to the lack of First Division clubs in the region), Second Division matches were carried instead, giving the clubs belonging to that cohort some much-needed exposure with regional audiences, and allowing them to enlarge their fanbase. On occasion, special matches between a First Division club and a club from a lower division were carried, where such matches had great significance within the region, or to the specific First Division club in contention. In general, the following ITV-2 stations broadcast matches involving the following clubs:

    • Tyne Tees: Sunderland, Newcastle United, Middlesbrough
    • Yorkshire Television: Leeds United, Sheffield Wednesday, Sheffield United, Rotherham, Barnsley
    • Granada Television: Manchester United, Manchester City, Liverpool, Everton, Stoke City
    • ATV: West Bromwich Albion, Aston Villa, Birmingham, Coventry, Nottingham Forest
    • Anglia Television: Norwich City, Ipswich Town
    • Thames Television: Arsenal, Chelsea, Fulham, Tottenham Hotspur, West Ham United
    • Southern Television: Southampton, Portsmouth, Brighton & Hove Albion
    • Westward Television: Bristol Rovers, Bristol City, Plymouth Argyle, Bournemouth, Exeter City
    • Border Television: Carlisle United [1]
    The various ITV-2 affiliates were not allowed to broadcast games on Sunday evening, which was when ITV-1 broadcast their flagship “Sunday Night Football” program, carrying a Football League match between two First Division teams. In addition, as part of an agreement with the Football League, neither ITV-1 nor ITV-2 could broadcast games on Saturday after 3 PM, as this was the traditional kickoff time, and the Football League worried that viewers would rather stay home to watch matches than come out to the grounds. As a result, an increasing number of games (in order to be broadcast live) were scheduled for any time other than Saturday afternoon or evening – a seemingly inevitable consequence which the Football League had nevertheless somehow failed to foresee.

    The popularity of football was not without a dark side. The ugly spectre following the beautiful game wherever it went was hooliganism, especially in the form of football riots. Although these were hardly particular to English football fans, or indeed to football fans in general, they were nevertheless strongly associated with them – football hooliganism had come to be known around the world as the English Disease, and it had grown so raucous by the onset of the 1980s that several English football clubs were banned from competing on the Continent. This was extended to a blanket ban covering all English clubs in 1981, as a result of the notorious Parc des Princes disaster at that year’s European Cup Final, when fans of Aston Villa FC charged a retaining wall separating the spectators from the field of play, crushing and killing over a dozen fans of rival club Real Madrid – a tragedy which made international headlines.

    Although the ban only affected English teams (those from Scotland, Wales, and Northern Ireland were all exempt), England’s absence from the European football scene was emblematic of the growing gulf between the islands and the Continent – a strikingly literal insularity. The United Kingdom – and by extension, though emphatically not willingly, the Republic of Ireland – had been growing apart from Europe and closer to the Commonwealth nations, with whom they shared a common language, culture, and heritage. Ongoing advances in transportation and telecommunications technology continued to shorten the once-immense distances between the far-flung reaches of the former British Empire, even as cultural posturing and trade barriers widened the much narrower gap from the Continent.

    Although Aston Villa FC was the club at the epicentre of the Parc des Princes disaster, it should be emphasized that the supporters of the Villa were not especially more vociferous in their hooliganism than were the supporters of other teams – it was simply a matter of their being in the right place at the right time. They were able to be there because of the club’s tremendous success in the early 1980s: a streak of first-place finishes in the Football League and FA Cup victories which was finally ended (in one fell swoop) by the twin victory of Southampton FC in the 1983-84 season. Southampton finished one point ahead of Aston Villa in the final standings, and knocked them out of the FA Cup on the road to defeating Everton in the final, a victory which reached a live audience of millions. By 1984, league football on live television was a reality and a mainstay – it was already difficult to imagine life without it, despite it having been introduced within living memory for even the youngest viewers. Indeed, in the years to come, television would result in a dramatic shift in the core audience for league football.

    The popularity of football in Europe provided an excellent example of how British culture had lopsidedly influenced that of the Continent since the Industrial Revolution had established the British Empire as the pre-eminent world power – the first in a great many centuries (dating back to Roman times, if not further still). The modern incarnation of the beautiful game had been codified in England in the nineteenth century, spreading first to the rest of the UK, then to Europe, and finally to the far-flung then-current (and even former, as South Americans could attest) colonial empires of the various European powers, taking hold pretty much everywhere it went… with one prominent and perennial exception (despite a number of false starts): the United States. Indeed, even as regularly-scheduled (association) football broadcasts brought the already astounding popularity of the game to new heights in the UK, soccer (as it was universally known in the United States, so as to avoid confusion with the locally popular, rugby-derived gridiron football) was faltering stateside. Despite having shown such great promise during its peak years in the 1970s, the top-flight North American Soccer League (NASL) would ultimately fold before the end of the following decade; rapid overexpansion during uncertain economic times, and fatal one-upmanship amongst the league’s owners – all of whom were increasingly willing to pay big bucks for foreign superstars past their prime in a desperate attempt to attract audiences – were primarily responsible. Given soccer’s disproportionate popularity with immigrant groups, the focus on recruiting foreign players was unsurprising, but rampant inflation, along with the law of diminishing returns, eventually rendered the policy unsustainable.

    Individual owners who could not afford to sustain seven-figure losses soon went bankrupt trying to keep up with the Joneses, and their teams folded; even those large corporations with deep pockets (and there were several) who could sustain substantial losses in the longer term were forced by their shareholders to divest themselves of what were seen as “unprofitable divisions” in an era of widespread belt-tightening and corporate restructuring. By 1984, the NASL was officially defunct, ending soccer’s aspirations for representation within the major professional sports leagues for the second time in the 20th century. American soccer fans lamented what could have been, but the NASL did leave an enduring legacy in helping to establish soccer as one of the most popular league and intramural sports among American youth. Even as individual NASL owners were playing the short game, the league itself had been playing the long game – one which, perhaps, might yield future benefits…

    Until then, the failure of soccer to gain traction with the American populace – just as so many American sports had utterly failed to gain British converts in years past – was emblematic of the peculiar stalemate between the two dominant powers in the Anglosphere. Although British culture had disproportionate influence on that of the Continent, the familiar tug-of-war which had defined the interactions between British and American culture for close to a century continued into the 1980s. The 1960s saw British pop and rock music – itself heavily influenced by American rock-and-roll of the 1950s – topping the charts stateside, with the Beatles leading the charge. On the small screen, many of the most popular American series of the 1970s – Those Were the Days, Sanford and Son, and Three’s Company, among others – had been based on British mainstays. James Bond, whose dominance had begun under Sean Connery in the 1960s, continued under Michael Billington through the 1970s and 1980s. However, eventually, the Americans found ways to exert their own cultural influence over their one-time colonial masters, albeit in unexpected fashion…

    As previously noted, sporting events weren’t the only ways television programmers filled their newly-available timeslots on British television. Game shows were as popular with executives as they were with viewers, as they were in general cheap to produce while simultaneously providing the audience with sufficient spectacle and the opportunity to win cold, hard cash and fabulous prizes. However, the conception and production of game shows was a surprisingly delicate balance, one which required considerable patience and hard work to get right. It was far easier, executives reasoned, to simply import what was already a successful format from another source, especially such an apparently inexhaustible one. A great variety of game shows were popular during the daytime hours in the USA. Many involved a “quirky” take on the traditional question-and-answer format. Match Game involved a fill-in-the-blanks test with a group of celebrity panelists. Hollywood Squares invited contestants to agree or disagree with celebrity assertions. The Dating Game sought to pair a bachelor (or bachelorette) with one of three contestants, based on whose answers to a series of questions proved most compatible. Family Feud, a game show specifically tailored around the popular Match Game panelist Richard Dawson (formerly of Hogan’s Heroes), brought on two teams – each consisting of a family unit – and asked them to guess the most popular answers to a series of survey questions. Few of these games involved a significant element of random chance, though such elements certainly existed.

    Many of these would see import to the UK, though often with at least token changes made for the benefit of a subtly different audience. Family Feud, for example, saw its name changed to Family Fortunes. However, this seemingly dramatic alteration was also a superficial one – the gameplay remained largely intact. The same was true of the remake of Match Game – or rather, Blankety Blank, a title which emphasized the process of the gameplay as opposed to its objective. However unnecessary these changes might have seemed to outside observers, they continued a proud transatlantic tradition dating all the way back to Noah Webster and his revisionist “dictionary”. To be fair, a handful of British game shows, including some of the most popular, were imported not from the United States but from other foreign countries, including Countdown (from France) and 3-2-1 (from Spain) – thus allowing the Continent some influence over British culture after all, if only because there were so many game shows on British television in the 1980s that producers had very likely run out of suitable English-language formats to adapt.

    The genre was so popular that not even the high-minded BBC could ignore it entirely, and indeed the Corporation would go on to commission several game shows of its own. However, in this respect, the BBC were not quite so highbrow as their reputation suggested, particularly given the enduring success of Match of the Day, along with their popular televised “talent show” program, Opportunity Knocks, and Top of the Pops, a weekly rundown of the UK Singles Chart. On the whole, however, the BBC chose to combat the glut of newly-available populist offerings, even on their own networks, through “counter-programming” – after all, the state-owned broadcaster had a mandate to educate and inform as well as to entertain. Granted, their own broadcast schedule was not nearly so highbrow as they liked to pretend, nor as removed from topicality. Nevertheless, the BBC made it their endeavour to present light entertainment which was both respectable and timeless.

    One of the more ambitious efforts at “light entertainment” during this period was The Crookback, a “secret history” set during the tail-end of the Wars of the Roses, and written by Richard Curtis and Rowan Atkinson, two alumni of Oxford University who had cut their teeth working on productions put on by the Oxford University Dramatic Society. Both had done work for the BBC before, but The Crookback was their first traditional sitcom for the network. As the name implied, The Crookback told the story of Richard III, and was produced with an eye to premiering on the 500th anniversary of his historical usurpation of the throne of England from his nephew, Edward V – and indeed, the first episode would air on June 26, 1983 on BBC-1. Atkinson himself starred as Richard III – though in the program itself, he was still Duke of Gloucester, given the complicated premise. The “secret history” posited that Edward V peacefully succeeded his father, Edward IV, as King in 1483, despite his minority. However, the Regency Council that had been formed to govern until such time as Edward was of age quickly factionalized into two groups: the “Yorkists”, led by Gloucester and his ally, the Duke of Buckingham; and the “Woodvilles”, led by the Queen Mother, Elizabeth Woodville, her brother, the Earl Rivers, and her son (and Edward IV’s stepson), the Marquess of Dorset. On the fringes of the English Court were the rump Lancastrians, dormant since 1471. Their claimant since then, Henry Tudor, Earl of Richmond, was in exile in Brittany; however, their spiritual leader, Lady Margaret Beaufort, Countess of Richmond, through whom Henry derived his claim, remained in England, though theoretically kept under the watchful eye of her “Yorkist” husband, Thomas Stanley, 2nd Baron Stanley.

    In the show’s premiere episode, Edward IV (Peter Cook) was on his deathbed, beseeching his beloved younger brother to take good care of his children, Edward and Richard. However, none of Edward IV’s family particularly cared for his widow, the newly-Dowager Queen Elizabeth (Miriam Margolyes) – who was portrayed as an over-the-hill harlot and prima donna, accustomed to winning men (including her late husband) over purely on her looks and sex appeal, which had long since faded with age (and after bearing so many children). As the Dowager Queen was also the Queen Mother, she was often formally referred to as Queen Elizabeth, the Queen Mother – or, more familiarly, the “Queen Mum”, an explicit and humorous nod to the real-life person presently referred to as such. On a more crass show (in the Monty Python vein), she would have been played by a man – here, however, Margolyes handled the thankless role with aplomb. Queen Elizabeth was supported by her close relatives, parvenus all: her brother, the loud, boisterous, and obnoxious Earl Rivers (Brian Blessed), and her son, the brutish but moronic Marquess of Dorset (Mel Smith).

    Gloucester, though very intelligent and cunning, was totally lacking in charisma, and thus utterly unable to rally other aristocrats to his cause, save for the duplicitous Duke of Buckingham (Griff Rhys Jones), who himself had a dynastic claim to the throne, and who was presented as the archetypal bumbling upper-class twit, allowing the program to engage in typical class-conscious comedy. Meanwhile, Gloucester’s own wife, Anne Neville (Miranda Richardson), was a beautiful but shrewish woman who constantly belittled him, and was unfaithful – something oft noted sardonically by Gloucester himself (“my loving and ever-faithful wife”). Their son (if indeed he was their son), Edward of Middleham, had no respect for him. (Of the three child actors in the cast – those playing Edward V, Richard, Duke of York, and Edward of Middleham – the actor playing Middleham was by far the most prominent, though even his was a minor role.) It seemed that the only remotely competent person in the English court was the only one with no power or influence whatsoever: Lady Margaret Beaufort (Elspet Gray) – though even in her case, humour was often found in her proclamations that she was “biding [her] time” for the perfect moment to summon her son and brother-in-law from across the Channel, even when the latest court catastrophe, often taking place right in front of her, had seemingly furnished the perfect moment time and again. Nevertheless, her obvious scheming was always smoothly covered up by her husband, Baron Stanley (Tim McInnerny), through a display of cleverly obfuscating stupidity.

    All six episodes of the series featured the three factions jockeying for position against each other in highly contrived and farcical ways, though not without some light satire of government bureaucracy. The perennial victims of these internecine plots and struggles were the common people of England, personified by the long-suffering Baldrick the Dung-Gatherer, played by Tony Robinson. His presence allowed the program to indulge further in the class-based humour which defined British comedy. (Naturally, Baldrick and Buckingham played off each other at least once an episode.) Previously written narratives set in the period were frequent targets of parody, up to and including the works of William Shakespeare. Many lines from his Richard III were borrowed for The Crookback, though often delivered in a different context from how they had appeared in the play. (On one memorable occasion, Richard himself broke the fourth wall after being on the receiving end of yet another Shakespearean quotation, saying “I swear I’ve heard that line before, but it might just have been an allusion.”)

    The final episode ended, as it ought to have done, with the “Yorkist” faction finally emerging victorious, killing the Earl Rivers (who, being played by Brian Blessed, naturally got a spectacularly hammy death scene) for good measure, though not without great cost: Richard’s closest ally Buckingham, his wife Anne, and his son Edward all died as well. Richard took this in stride, declaring that he would marry his niece, Elizabeth of York, prompting outrage – Baldrick encapsulated the visceral popular reaction by delivering his most famous line: “you’re a Plantagenet, not a bloody Hapsburg!” [2] – and leading Margaret Beaufort to finally decide that this was the perfect moment for her son, the Earl of Richmond (Robert Bathurst), to launch his invasion, so she duly summoned the “King Over the Water” (a deliberately anachronistic Jacobite reference). He arrived just in time to disrupt the coronation of King Richard III, who engaged him personally, eager to prove his mettle as King – possibly with the aid of liquid courage, for, still drunk from his revelry, he fell from his horse in battle – and the chaos this created among his ranks led Stanley, watching from the hill overlooking Bosworth Field, to launch his assault, ultimately killing Richard III (whose last line was, of course, “A horse! My kingdom for a horse!” – Curtis and Atkinson knew better than to try to top Shakespeare). The newly-crowned Henry VII vowed to rewrite the history books as regarded the reigns of Edward V and Richard III, and his vision prevailed. The closing narration wryly noted that there were those who would seek to rehabilitate the reputation of Richard III (the Ricardians, a small but extremely vocal minority by the 1980s) [3], though perhaps they would do best to leave well enough alone…

    The Crookback was at the vanguard of a new generation of comedians and entertainers, but though the established generation was now forced to share space with them, it did not vanish entirely. Mike Yarwood, whose political impressions were the headlining feature of his massively popular comedy specials in the 1970s, continued to draw high viewership numbers into the 1980s. He was fortunate in having developed a killer impression of Prime Minister Willie Whitelaw; his efforts at impersonating the current Leader of the Opposition, David Owen, were not quite so fruitful as his impressions of previous Labour leaders had been, though this ultimately did neither his career nor his reputation much damage. [4] Yarwood’s longevity was such that he became more-or-less the undisputed elder statesman of British television comedy upon the death of Eric Morecambe in 1984, which ended the iconic Morecambe & Wise partnership that had stood as the only significant rival for that title. The ambition of The Crookback – and the innovative, if unsuccessful, alternative comedy programs which had preceded it – stood in marked contrast to the complacency of Yarwood’s shows. Political satirists, in particular, spurned Yarwood for his consistent pattern of playing it safe – however, British audiences, used to a government which very much tended to play it safe, were seemingly quite happy to go with the flow…

    ---

    [1] Border Television served mainly the Scottish Borders, in addition to the English county of Cumbria, physically separated from the rest of the Northwest (save for a narrow bottleneck along the coast) by Yorkshire’s protrusion inland. This explains their being served by a different affiliate. The absence of any Scottish teams on Border Television is naturally explained by their playing in a different league from the (English) Football League. IOTL, the Scottish Football League did not have live match broadcasts until 1986, three years after the (English) Football League.

    [2] Another deliberate anachronism: in the fifteenth century, the Hapsburgs were no more prone to incestuous marriages than most other dynasties – it was the Spanish dynasties that were big on the practice, and of course the Hapsburgs married into Spanish royalty at the end of the fifteenth century, inheriting the tradition, as it were. The reason the line is considered especially funny in-universe is that Baldrick’s lines up to this point have been mostly monosyllabic – a great “shocked” take from Atkinson at Robinson’s delivery hammers home the surprise that Baldrick can even pronounce the word “Plantagenet”, let alone that he knows what it means (or, being working-class – born with a wooden spoon in his mouth, and all that – that he would even care).

    [3] The oldest and most prominent Ricardian organization (IOTL and ITTL), the Richard III Society, lacks the patronage of Prince Richard of Gloucester ITTL, because – well, note the difference from his OTL style: Prince Richard, Duke of Gloucester. He isn’t the Duke ITTL, because his elder brother Prince William survives, and it is William who succeeded their father (Prince Henry, son of George V) as Duke of Gloucester in 1974. Prince William, despite sharing a title with Richard III (himself Duke of Gloucester before his accession), does not share his name, and thus is not tickled by the idea of joining an organization devoted to his historical rehabilitation.

    [4] As opposed to the situation IOTL, in which he proved utterly (but understandably) unable to impersonate the female Prime Minister from 1979 onward.

    ---

    Happy Anniversary! Today marks four years since I first began posting That Wacky Redhead to this forum! I hope you’ve all enjoyed reading along as much as I’ve enjoyed writing. There are still a few more updates to go before we reach September 20, 1986, so I hope that you’ll all continue to enjoy the home stretch with me!

    Thanks, as always, to e of pi for assisting with the editing. Thanks also to Thande for serving as the sounding board for my alt-Blackadder, and to nixonshead for serving as my Official Footy Consultant!

    [Attached image: Crookback Castle.jpg]