Urban Renewal
  • Urban Renewal

    The metaphorical “dawn” of the 1970s had never seemed a more apt term, having followed the sheer darkness and pandemonium that had reigned as the 1960s came to a tumultuous close. That decade, though it had been dominated by social upheaval, saw a great deal of economic uncertainty as well. Indeed, those two concepts were very much intertwined; costly and unpopular overseas entanglements, race riots, and high-profile assassinations naturally played havoc with consumer and investor confidence in the world’s largest economy. Heavy industries which dominated American cities in the Midwest and Northeast through to the 1950s – a decade which, not coincidentally, was the focal point of the retro nostalgia which also served to characterize the 1970s – had been facing significant and increasingly rapid decline ever since. It was no wonder that the now middle-aged generation of men and women who had fought in World War II (whether on the battle front or the home front) faced great hardship and alienation, ironically enough in much the same fashion as their children did on the other side of the “Generation Gap”. As the factories closed down, the railroads that had once served as the backbone of passage from sea to shining sea also found themselves facing an uncertain future. The beginnings of this decline, too, could be traced to Those Golden Fifties: President Eisenhower had overseen the development of the Interstate Highway system, which replaced the haphazard, uneven backroads with modern, well-maintained, multi-lane superhighways. This had been an ambition of his for several decades, ever since he had participated in an Army convoy which had taken two months to travel across the Continent by road, shortly after the First World War. The National Interstate and Defense Highways Act was passed in 1956; the following year, perhaps by coincidence, Major League Baseball’s Giants and Dodgers both departed the Big Apple for the Golden State. New York City – the largest urban centre not just in the United States, but in all of North America – stood alone as the prime representative of what America had once been, and the harsh reality of what it had become.

    Though the name “interstate” to describe these highways that Eisenhower had built was technically correct, perhaps “intercity” might have been more precise; these new routes, despite their great lengths and courses running parallel to all the others (at least, those which also moved either north-south or east-west), generally passed through (or near) large urban centres wherever possible. Appropriately enough, cities which had previously tended to be rail hubs inevitably found themselves at the junction of multiple interstate highways. As a result, the passenger rail which had formerly dominated those same hubs fell into rapid, seemingly irreversible decline. The 1950s and 1960s came to be known as iconic automotive decades, and certainly not locomotive ones. The rise of sports and muscle cars and the popular image of them cruising on the open road informed the culture of transportation in the United States in the mid-20th century. Even the hottest competition thereto came in the form of motorcycles, another definitive post-war mode of transport, though the subcultures associated with those particular vehicles were, perhaps, not quite so wholesome and all-American as they were with cars. The wealthy, for their part, preferred private air travel, with modern, luxurious models first becoming available to end consumers in this era; this allowed them something akin to the sense of freedom and adventure felt by the barnstormers and thrill-seekers of the early days of flight. As with many other new technologies, the popular name for these kinds of planes came to be associated with the largest manufacturer thereof, Cessna, whose Model 172 Skyhawk became an instant bestseller, remaining so well into the 1970s. With all of these new and exciting modes of transport available to the everyday American consumer, the boring, steady, and dependable locomotive was a mere relic as far as passenger travel was concerned, relegated solely to hauling freight. Even that saw declines in the 1970s when contrasted against trucking, which was far more “cool” and “rebellious” – and thus more indicative of the American Spirit – than the comfortable, efficient, European-style trains. The rise in popularity of Citizens’ Band (or CB) Radio, the preferred mode of communication for truckers everywhere, played a key part in this image; the appeal of talking over the air with complete strangers – previously the exclusive province of call-in radio shows – was infectious, and many people had the radios installed in their station wagons for their own long drives along the interstate.

    However, despite the overall popularity of automobiles (and all else which travelled over asphalt), moving into the 1970s, the American companies – such as Ford, Chrysler, General Motors, and American Motors – which had previously dominated that industry saw themselves losing ground to the Japanese (in another recurring theme of the 1970s), whose cars were cheaper and better-made. Manufacturers from the Land of the Rising Sun began to build their own factories stateside, as this would allow them to “beat” the high tariffs that would come from having to import their inventory across the Pacific. This decision had the collateral advantages of creating additional jobs, and stabilizing the automotive industry despite the continuing decline in performance of the stateside manufacturers therein. Additionally, innovation, which had been a hallmark of that particular industry ever since the days of Henry Ford and his assembly line, found itself a new outlet when the Oil Crisis of 1973 created the need for greater fuel efficiency in order to displace the “gas guzzlers” which were increasingly problematic in the new era of OPEC and trade embargoes. The interventionist administration of President Hubert H. Humphrey supported a bill imposing new fleet-wide average mileage standards, which became the last major piece of legislation passed into law by the lame-duck Democratic-controlled 93rd Congress in late 1974. [1] Though members of the incoming 94th Congress denigrated it as “the last gasp of the Great Society”, they nevertheless would not act to repeal that legislation. Simply put, under a framework of regulations called the Corporate Average Fuel Economy (or CAFE) standards, the onus was on the automakers to develop more fuel-efficient vehicles, or face severe penalties for noncompliance. However, because the regulations put into place a weighted-averages measurement system, they would be allowed to continue selling their less-efficient land-yacht models, so long as the new, more-efficient cars they developed sold better. Though compliance would not become mandatory until 1978, most consumers had voted with their wallets by that point, sending many of the most beloved marques of the previous decades to their doom. [2]
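    As a rough illustration of how such a weighted average shakes out (the sales figures here are purely hypothetical, and the sales-weighted harmonic-mean formula is the one used by the OTL CAFE rules, assumed here to carry over ITTL): an automaker selling 600,000 compacts rated at 30 miles per gallon alongside 400,000 land yachts rated at 15 miles per gallon would post a fleet average of

    (600,000 + 400,000) ÷ (600,000 ÷ 30 + 400,000 ÷ 15) ≈ 21.4 miles per gallon,

    the figure being dragged down because each car is weighted by the fuel it actually burns – which is precisely why shifting sales toward the more efficient models was the surest path to compliance.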

    But despite the big bump in the road for American automakers, their brethren in the passenger rail industry would still manage to beat them in a race to the (financial) bottom. Several former titans of the business – such as Penn Central and Union Pacific – had each declared bankruptcy and made plans to cease operations by the late 1960s; but such a symbol of American progress and unity no longer being viable would prove a titanic blow to the overall reputation of the American Dream. Thus, the federal government shouldered the responsibility of attempting to salvage the system, which it did under the terms of the Rail Passenger Service Act in 1970. The government would begin funding passenger rail service on a large scale, forming the National Railroad Passenger Corporation (or NRPC) in order to do so. Rail lines merged into the national entity took the brand name “Paxrail”, an eminently topical designation for a country newly released from a certain overseas quagmire. [3] However, all sides clearly intended for this to only serve as a temporary solution, not a permanent one; and, indeed, a more lasting solution was reached, once again, during the twilight years of the “Great Society”. The National Intercity Railways Act of 1973 became the largest-ever public works project in American history [4], creating a mammoth bureaucracy that would fund and oversee all passenger rail in the United States, with the surviving transit companies formally joining into the scheme. In later years, none other than George Takei would take credit for inspiring the creation of this bureaucracy, claiming to have discussed the issue with Humphrey at the Democratic National Convention of 1972. At the time, however, this discussion was said to have consisted largely of Takei coaxing the President into naming the first space shuttle after the USS Enterprise. [5] Takei, in rebuttal, insisted that “it took me all of fifteen seconds to get him to change the name”, and that most of their several-minutes-long conversation was indeed devoted to transportation policy, of which he had always been a fan.

    Meanwhile, across the northern border in Canada, that country’s largest city, Montreal, had taken increasing strides toward becoming a major world city since it had famously hosted the Expo in 1967; it rose in the global estimation in a roughly proportional fashion to the decline of New York City, which was just six hours down Autoroute 15 and, after a border crossing, Interstate 87. Montreal had become the first Canadian city to host a Major League Baseball club (which was named for the Expo) in 1969; the National Basketball Association, after having completed its merger process with the rival American Basketball Association in 1974, also saw the addition of a local team (named, like the Expos, for a recent event hosted by the city – in this case, the Olympics) four years later, as part of league expansion into Canada. [6] With the arrival of the Olympians, the island city now had a representative club in each of the four major professional sports of North America at that time, though its football team (the Alouettes) played not for the National Football League, but for the Canadian Football League, as did eight other teams throughout the Great White North. The teams competed for the Grey Cup – which, like the Stanley Cup before it, had been bestowed upon players of the sport by a former Governor-General; in this case, the donor was Albert Grey, the 4th Earl Grey. His grandfather, the 2nd Earl, had served as Prime Minister of the United Kingdom during the passage of the Great Reform Act (and later gave his name to a famous blend of tea). Interests within the NFL had repeatedly attempted to expand into either Montreal or Toronto, as MLB and the NBA had done, but the government of Prime Minister Stanfield – whose own favourite sport was, in fact, Canadian football – would always threaten to table a bill that would legally prohibit such an occurrence. [7] Amusingly, this was one of the few points of agreement between Stanfield and his greatest rival, former Prime Minister Pierre Elliott Trudeau.

    But in terms of civic planning and infrastructure, the defining triumph of the Montreal region, despite the many developments therein throughout the 1970s, was the completion of the high-speed rail line linking the Central Station in the downtown core with the terminals of Mirabel International Airport – a distance of over 30 miles, or roughly 50 kilometres. [8] Serving as the centrepiece to a vast transportation network linking the island with the airport, the line – nicknamed “the Rocket”, both for its powerful velocity and as an homage to the legendary hockey player Maurice Richard, who attended the opening ceremonies – was completed in 1978, “only” two years after the Olympics had concluded. The Rocket could complete the journey between Central Station and Mirabel in just under twenty minutes – half the time that it would take to make the same trip by automobile, a figure which did not even include the cumbersome and often frustrating arrangement of securing parking, nor the time-consuming trip from the often fairly remote lots to the thick of the action. In the case of both the airport and the downtown stations, their respective rail platforms were centrally located. The Rocket was able to attain a maximum operational velocity of 130 miles per hour (or roughly 210 kilometres per hour). Canadian Prime Minister Robert Stanfield had accepted a bid from the Montreal Locomotive Works for the construction contract, having been very impressed by their prototype; he was also quite ecstatic that the company was based in the city itself, and would therefore be providing jobs for the locals. Construction on the line had begun in 1974, and shortly thereafter the Montreal Locomotive Works were purchased by the conglomerate Bombardier. [9] An acquisition of that sheer scale had to be approved by the federal government, especially given their existing relationship with one of the involved companies. In the end, it was permitted only because Bombardier, too, was a Canadian-owned corporation; in fact, up to that point, they had primarily been involved in the manufacture of snowmobiles, which, granted, were a necessity in the wintery province of Quebec. But most importantly, the Rocket was intended by Stanfield as a potential pilot project for a far more ambitious planned high-speed rail connection which would eventually connect Quebec City in the northeast to Windsor in the southwest; the two cities, over seven hundred miles apart, had most of Canada’s population living within two hours of the corridor of road and rail links between them. [10] An “express” high-speed train following that route would take about six hours to complete its trip; a car making the same journey would arrive in one city well into the evening, if it departed from the other at the crack of dawn. Gasoline was ever more expensive, and despite manufacturers working to improve fuel efficiency in Canada as well as in the United States (again, not without government incentive to do so), the “gas guzzlers” which so defined automobiles of the 1970s would make any such trip ruinously expensive. Also worth noting, though from a far more hypothetical perspective, was that this distance was actually greater than the one between Boston and Washington, DC – between which lay the largest, most populous and densest conurbation in the United States, described by futurist Herman Kahn as the BosWash megalopolis in 1967.

    Meanwhile, Toronto, the second-largest city in Canada (and a perennial rival to Montreal for nearly 150 years by this point) and the capital of the country’s most populous province of Ontario, also saw the construction of a massive infrastructure project. In 1976, the year of the Olympics in Montreal, a broadcasting tower constructed by and named for the Canadian National Railway (though always properly the “CN Tower”) was finally completed, at which time it became the world’s tallest freestanding structure. [11] Toronto also saw benefits from the continuing expansions of major sports leagues through the 1970s; though it lacked the storied baseball history of Montreal – where Jackie Robinson himself had once played – it still received a Major League Baseball club in 1977, to complement the Montreal Expos. The team was named the Blue Jays, a reference to blue being the primary colour of both the city in general and all the athletic clubs playing there in particular. In addition, a basketball team, the Toronto Huskies, had played for one season (1946-47) in the city shortly after World War II, as part of the Basketball Association of America; that league had merged with another to form the modern National Basketball Association in 1949. The newly formed Toronto Huskies joined the Montreal Olympians as part of the NBA expansion into Canada in 1978, giving “Hogtown” two new clubs in consecutive years. [12] The Huskies and the Olympians naturally became immediate rivals, helped along immensely by their play in the very same division of the NBA (the Central Division of the Eastern Conference), in contrast to their more distant situation in MLB and the NHL.

    Canada, after consistently paling in comparison to the economic powerhouse that was the United States, had finally gained some ground on its neighbour, enjoying a consistently strong economy through most of the 1970s, in contrast to the more erratic situation stateside. However, this was not to say that government efforts to ameliorate the American situation did not bear fruit. In fact, the combined effect of the many infrastructure and industry-boosting laws and regulations passed by the Humphrey administration in tandem with Congress, coupled with his protectionist policies, worked to somewhat revitalize the industrial states of the Northern U.S., nicknamed “the Foundry” [13], not to mention their core cities, most of which had previously been in decline. However, despite this “Manufacturing Miracle”, that sector continued to decline in those regions as a proportion of the American total – though the decline did slow considerably in contrast to the 1960s. But even the most optimistic statistical findings were not without exception. The Empire State of New York continued to see its standing in freefall, largely tied to that of New York City, which was riddled with crime, poverty, homelessness, unemployment and unrest. Despite the turnaround achieved elsewhere, rehabilitating the Big Apple would take more than just a miracle.

    Several Midwestern cities unexpectedly flourished even above and beyond the Manufacturing Miracle of the 1970s. One of the most seemingly unlikely examples was Indianapolis – a rail hub completely surrounded by the archetypal rural, agricultural state – which prospered quite fortuitously through a confluence of the many forces dominating economics and public policy in the 1970s. The creation of Paxrail, followed by the Intercity Railways Act, prevented what otherwise might have been the loss of hundreds, if not thousands, of jobs in one of the largest industries in the region. The merger of the ABA into the NBA saw the city’s team, the Indiana Pacers (founded in 1967), joining the top professional league in basketball, that sport which Hoosiers adored above all else. In addition, their nearest and fiercest rivals, the Kentucky Colonels, based in Louisville (two hours down I-65, across the Ohio River), were also an ABA team which successfully merged into the NBA. [14] But Indianapolis (one of those cities which attracted nicknames like moths to a flame) was unique in being a host city not only to an ABA team which joined the NBA in 1974, but also to a WHA team which joined the NHL in 1977. These two league mergers, along with a political amalgamation of Indianapolis proper with the surrounding communities in the rest of Marion County in 1970, earned it the enduring nickname of “Mergertown”. The Indianapolis Racers and the Indiana Pacers both played in Market Square Arena, completed in 1974 under the auspices of then-mayor Richard Lugar, less than two months before he was elected to the United States Senate; Deputy Mayor Michael A. Carroll replaced him, but continued to implement his strategy to revitalize the downtown core through the promotion of new and exciting sporting events, hosted in venues that would surely bring tourists into the region, and suburbanites into the city. [15] It should be noted that this strategy did not take into account the already-established Indianapolis 500, one leg of the Triple Crown of Motorsport, though it certainly helped to cement “Mergertown” as a sports mecca at the heart of the Crossroads of America. Perhaps this reputation precluded the need for the construction of a major amusement park of the kind that had proliferated in the wake of the success of Disneyland – perhaps the crowning creative achievement of Uncle Walt in his lifetime, and certainly the one which had the greatest impact on the travel and recreational activities of families throughout the United States.

    Much as I-65 linked Mergertown with Louisville, across the Ohio, I-74 linked Indianapolis with Cincinnati, which was situated on the Ohio, and in the state of Ohio. The two cities not only shared an interstate, but also a status as home to a WHA-turned-NHL team; in the case of Cincinnati, the Stingers. However, Cincinnati, unlike Indianapolis, did have a shiny new theme park in King’s Island (completed in 1973), though it was actually located in the fairly remote suburb of Mason. [16] Cincinnati also did not quite enjoy the booming success of Indianapolis, but it was nonetheless given a new lease on life in the 1970s primarily through notoriety, as opposed to solid public policy. “The Queen City”, as it was known, had by happenstance been chosen as the setting of the cult sitcom WMTM in Cincinnati; the quirks attributed to the city were matched in real life by the man who was Mayor when the show began its run, Jerry Springer. [17] The earnest and eloquent Springer, known for his serious, thoughtful demeanour, had been an advisor for the ill-fated Presidential campaign of Robert F. Kennedy in 1968; after that position had expired, courtesy of an assassin’s bullet, he decided to transition to a run for Congress, losing against the incumbent Republican, Donald D. Clancy, in 1970. However, his valiant run in a tough district for his party impressed President Hubert H. Humphrey, who urged him to run for a second time, while also taking up a key role in the 1972 Presidential campaign for the Buckeye State. Springer juggled both tasks with aplomb, helping Humphrey to narrowly clinch Ohio while winning the seat of OH-02 from Clancy in a rematch. His victory was short-lived, though; he was easily swept aside in the Republican wave of 1974, but reluctantly acceded to the demands of local Democrats and ran for municipal council the following year. In 1977, he was elected Mayor; WMTM, dutifully, occasionally mentioned “Mayor Springer” in various episodes. He, in turn, played along, consistently referring to the (fictional) station as “home to the best music in the tri-state”. [18] With regards to sporting events beyond hockey, though Cincinnati lacked a professional basketball team, their Bengals had played football since 1968; the Cincinnati Reds, on the other hand, had played since 1881, making them one of the oldest teams in Major League Baseball.

    And then there was Denver. Like Montreal, it had flourished in the years following its hosting of the Olympics (though those of the Winter, rather than of the Summer) in 1976. The Olympic Sports Arena built there housed the Denver Rockets [19], a basketball team that had been a charter member of the ABA, and the Colorado Rockies, an NHL team formerly located in Kansas City before travelling west along I-70 in the very same year that the city had hosted the Olympics in the first place. Sport, perhaps more than was the case in any other American city to thrive in the 1970s, was very much the lifeblood of Denver. Even as a tourist attraction, the “Mile High City” and its environs were primarily known as the site of some of the finest skiing in the world; newer outdoor activities, such as snowboarding and snowmobiling, also became very popular. However, the high mountains were treacherous, and accidents were plentiful. Indeed, for those wealthy travellers flying their own private planes to Colorado, there was an additional element of thrill and danger even beyond the typical risks of flight: the Cessna 172 had a service ceiling of roughly 13,500 feet above sea level, as the thinner air at higher altitudes sapped the engine and wings of the power and lift needed to climb any further. As it happened, the highest airport in the United States, Lake County Airport, was located in Leadville, Colorado; its location made it a potential stop-off for private flights from many points in the American Southwest. It sat nearly 10,000 feet above sea level, which was enough to make taking off and staying in the air quite an exhilarating challenge for the aspiring pilot. Despite Denver perhaps managing to be an even bigger sports mecca than Indianapolis, its suburbs were selected to play host to a brand-new amusement park, this one intended as a true theme park in every sense of the word. Science-Fiction Land, inspired in part by the pulp novels, movie serials, and comic books of yesteryear, by venerable genre films, and by popular television shows and movies of the recent past and present (including all the usual suspects – Star Trek, Journey of the Force, Galactica, 2001: A Space Odyssey, and Moonraker, among many others), broke ground in Aurora, Colorado, in 1979. [20] An eccentric by the name of Barry Ira Geller – truly, the Howard Hughes of the modern day – served as the impresario behind the theme park; that such a venture could be financed at all was perhaps a testament to the economic recovery and the drive for free enterprise which characterised the mid-1970s, not to mention the strong fundamentals and good growth in Colorado in general and Denver in particular as a result of the post-Olympics boom. And the buzz that surrounded the construction – seemingly out-of-this-world promises about the coming attractions and events that were in store – was enticing, even enthralling. The 1970s being a decade of nostalgia, as infrastructure and modes of transportation tried to rebuild themselves to the heights of past glories, it was not altogether surprising that – coupled with other contemporary factors – the Moonie Loonies would gradually rear their heads once more; after all, a theme park was about as close as most people would ever get to the stratosphere.

    ---

    [1] IOTL, the CAFE standards were passed in 1975, under the Ford administration.

    [2] Compliance becomes mandatory as of the starting day of the first new fiscal year on or following January 1, 1978, for each automaker.

    [3] “Paxrail” was the working title IOTL, as well, before the NRPC instead settled on the name “Amtrak”, which remains in use to this day.

    [4] The National Interstate and Defense Highways Act was valued at $25 billion in 1956, making it the most expensive public works program in American history up to that point. Adjusted for inflation, this would be equal to $40 billion in 1973; however, because the common masses are judged by the media to have a problem ascertaining the time value of money, the Intercity Railways Act need only exceed the original $25 billion to be valued as the most expensive public works program in American history, even if it is worth less than $40 billion in 1973 dollars. (For the record, $25 billion in 1956 is equivalent to over $210 billion in today’s dollars. Also, the term “billion”, as is always the case with American numbering, refers to the short scale of one thousand million, or 10 raised to the 9th power.)

    [5] Whether Takei – an avowed fan of public transit, IOTL and ITTL – actually did as much as he claimed he did to convince Humphrey to enact a nationwide railway bureaucracy is to be left to the individual interpretations of each reader. Granted, it probably wouldn’t take much to talk Humphrey out of naming a shuttle Endurance; but Takei is a politician, and at the time that he was quoted as taking credit for the decision, Humphrey was already dead and… well, I’m getting ahead of myself.

    [6] The National Basketball Association and the American Basketball Association did not merge until 1976 IOTL. Putting an effective moratorium on negotiations for several years was a lawsuit proceeding through the courts; IOTL, Congress came close to circumventing it but ultimately let it run its course. ITTL, on the other hand, the vote goes the other way, and both sides are at the negotiation tables by 1973. Note that the basketball merger still precedes the hockey merger by three seasons.

    [7] This actually happened IOTL in 1974, when a team from an American football league (and not even the NFL; this was a bush-league that didn’t last two seasons) attempted to play in Toronto. The bill passed second reading (there are three readings in the House of Commons before they move on to the Senate, and then Royal Assent) before the owner of the team backed down. Today, it is widely accepted (due to cultural protectionism being a cornerstone of Canadian public policy) that a similar law will develop if the NFL ever attempts to establish a presence in Canada (the reason why they are the only one of the major leagues which has never done so).

    [8] The high-speed rail line was tentatively planned, but never built (and nor were most of the highways, for that matter). No such line would exist in all of North America until the Acela Express began operations in 2000, along the aforementioned Northeast Corridor, though on average at much slower speeds than even the relatively modest 130 miles per hour managed by the Rocket ITTL (in the 1970s, mind).

    [9] Bombardier purchased Montreal Locomotive Works in 1975, IOTL, as part of its initial expansion into the railway industry (they have since also gone into aviation). Not coincidentally, Bombardier built the Acela HSR line IOTL.

    [10] The Quebec City-to-Windsor “corridor”, named after the train which travels from one to the other IOTL, is an incredibly vital route which is, essentially, the backbone of the Canadian economy. However, the train (part of the VIA Rail system, essentially the Canadian answer to Amtrak) is not high-speed, and rough calculations place the cheapest ticket prices at about equal to the cost of a road trip by car, with at least the same travel time (and that’s being very generous with the number of stops by car). The much lower opportunity cost of travelling by an HSR route, were one available as an option (assuming that prices are relatively even, and they would have to be subsidized to that point in order to become viable in car-loving North America), makes it far more attractive.

    [11] The CN Tower remained the world’s tallest freestanding structure (an oddly precise superlative which, nonetheless, most Torontonians have memorized) for just over three decades IOTL, before being dethroned by the one-upmanship construction blitz that was Dubai prior to the recession (just prior, in fact – in 2007). Note that at the time of construction, the builders did not expect the tower to hold the record for very long, let alone for three decades. Such are the happenstances of history.

    [12] It should be noted that the NBA did not attempt this “Canadian expansion” until 1995 IOTL, at which time they moved into Toronto… and Vancouver. By that time, the “Huskies” name had been completely swept under the rug due to post-Jurassic Park dinosaur hysteria and thus, the name “Raptors” was chosen instead. (The Vancouver team became known as the “Grizzlies”, a name which has since served them well in Memphis, a city on the Mississippi Delta, thousands of miles from their habitation range.)

    [13] The “Foundry” is a historic name for that which, IOTL, would become known as the “Rust Belt”, a name which only achieved widespread prevalence in the early 1980s, by which time the “Manufacturing Miracle” is well and truly apparent. Consider it a “scouring” of the Rust Belt, if you will. And yes, that pun would have gone into the body of the text if it weren’t an anachronism.

    [14] The Kentucky Colonels were (quite notoriously) left out of the NBA-ABA merger IOTL despite their strong record and popularity with audiences, but (as with the WHA) the larger pool of teams from the earlier merger ITTL enables their salvation. This allows the Colonels-Pacers rivalry which dominated the ABA to carry on into the NBA.

    [15] As Lugar was unsuccessful in his 1974 Senate bid against Birch Bayh IOTL, he continued as Mayor of Indianapolis until 1976; his successor, William H. Hudnut, was first elected in 1975. Hudnut had briefly served as a Congressman, representing IN-11 (an Indianapolis-area seat) from 1973 to 1975; but ITTL, he won his Congressional re-election bid in 1974, thus creating the need for Carroll to step up and replace Lugar upon his midterm resignation that year (winning a full term in his own right the following year).

    [16] Those of you who are fans of The Brady Bunch (which, I remind you all despite your shocking apathy, does not exist ITTL) may recall the family visiting King’s Island in a late fifth-season episode (so late that Cousin Oliver actually joins them). Mason is actually not that far from inner-city Cincinnati, but Barry Williams (who played Greg Brady) remembered it as being rather remote in his Growing Up Brady memoir, and his recollection suits my purposes for evoking a sense of time and place.

    [17] Yes, that Jerry Springer. Yes, he really was a Democratic insider in the late 1960s and ran for Congress in the early 1970s. Yes, he really did serve on Cincinnati City Council through the 1970s, even becoming Mayor in 1977. However, IOTL, he did not run for Congress a second time, instead running directly for Council in 1971, where he remained until he resigned in 1974 after having been caught hiring a prostitute (his cheque bounced). He was actually returned to Council the following year (an early example of the Clintonian “sex scandals boost electability” factor), subsequently becoming Mayor. His career since then has… well, let’s just say it’s been a mixed bag.

    [18] Based on a persistent rumour of OTL references on WKRP in Cincinnati to a “Mayor Springer”. His term ended in 1978, so the overlap between it and the show’s run was vanishingly small, if indeed it existed at all. IMDb (which, granted, is perhaps even less reliable as a resource than Wikipedia) explicitly disclaims the rumour.

    [19] IOTL, the name was changed to Nuggets in 1974, the year of the merger ITTL. Given the more space-happy atmosphere of society in general (especially with the big boost from Moonraker that year), Rockets remains in place.

    [20] Science-Fiction Land was planned IOTL, but never happened, partly because the economy was a disaster in the late-1970s, and partly because the idea simply collapsed under the weight of its own hubris. It’s highly unlikely that what Geller has conceptualized for the park will ever be fully realized, but at the very least, we’re going to get an honest stab at it. After all, TTL is probably the most hospitable for such a bold gamble!

    ---

    My thanks to e of pi for his advice and suggestions in the making of this update, as well as for directly assisting during the editing process.

    And so, there you have it! Something of an overview of how cities in the United States (and Canada, for a comparison and contrast) dealt with the 1970s ITTL. This is probably one of the more economics-heavy updates you’ll be seeing in this timeline, and I’ve deliberately left out most of Reagan’s actions, because that’s what the Appendix B update this cycle is going to focus on. But hopefully, this ties up a few loose ends from earlier updates, and gives you an impression of what politicians are going to be attempting as we move into the 1980s. But on the whole, manufacturing is a far more robust sector in the United States as the 1970s draw to a close ITTL, even though the “Manufacturing Miracle” is far more a clotting of the wound than actual healing. Then again, it’s not as if Americans ITTL will have cause to refer to anything else in this era as a miracle…
     
    Back to British Telly
  • Back To British Telly

    The great ocean – often called, with characteristic irony, “The Pond” – which separated the United States from the United Kingdom had been successfully traversed by transatlantic telegraph cable in 1866, allowing for permanent telecommunications between the two powers from that point forward. Over a century later, as technologies had evolved far beyond the capabilities of that original medium, ever-faster ships – followed by, eventually, ever-faster planes – reduced trips that had once taken weeks to a matter of hours; still, the vast physical distances remained. Television, the most advanced and ubiquitous form of media from the 1950s onward, was wholly dependent on radio signals transmitted over the airwaves. Though this state of affairs was starting to change by the late 1970s, the status quo ensured that signals produced stateside or in the British Isles could never cross, which resulted in programming largely unique to either side. The use of film, videotape, and syndication did allow for a certain degree of overlap, however, especially as each market became more aware of trends taking place in the other. And trends in the British Isles were certainly most noteworthy…

    Echoing the many major television events taking place stateside in the mid-to-late-1970s, perhaps the pièce de résistance of British television in this era – and certainly the most widely-viewed event of such, with an audience of over 30 million [1], more than half the population at the time – was the appearance on the Morecambe and Wise Christmas Special of December 25, 1977, by HRH The Prince of Wales (often, though incorrectly, referred to as simply Prince Charles), the Heir Apparent to the throne of the United Kingdom and that of all the other Commonwealth Realms. His Royal Highness had been an admirer of Morecambe and Wise, as were most of his future subjects throughout Great Britain; and, against the advice of his courtiers, he consented to making this unprecedented (though pre-recorded) appearance. He featured in the climax to a sketch in which both Morecambe and Wise bluffed about having made the acquaintances of various people, none of whom, obviously, either of them had ever met. Both of them, naturally, took the time to note their intimate friendship with the Prince of Wales –
    through female relatives, given his status as the world’s most eligible bachelor – only for them to suddenly cross his path, at which point he firmly (though very politely) rebuffed their appeals to recall having ever met them. “I’m afraid I’m not familiar with you at all,” was his famous line, an obvious bit of meta-humour at the pair being household names (“even at Buckingham Palace”, writer Eddie Braben shrewdly observed). This show-stopping turn by the Prince of Wales was very well-received by the press, though one reviewer was jokingly dismissive: “Call me when they manage to have Hirohito make a guest appearance”. This was in reference to the singular talents of producer John Ammonds, who was said to have the ability to book anyone to appear on Morecambe and Wise. [2] Whether Ammonds actually tried to get in touch with the Emperor of Japan (and former adversary to the British Empire) in response to this challenge was unknown.

    However, Morecambe and Wise, though undeniably popular, were only renowned on their side of the Pond. One of the several phenomena which was felt on both sides of the Pond, on the other hand, was Monty Python’s Flying Circus, the seminal, anarchic, and Dadaist sketch-comedy series. The long-awaited second film to be produced by the troupe (the first had been a compilation of popular sketches) was finally released in 1974, after repeated delays for additional fundraising, re-writes, and re-shoots. [3] American investors wielded considerable influence on the ultimate shape of the film at every stage of development – beginning with the title. A comedic spin on the Arthurian legend, the film was originally to be called Monty Python and the Matter of Britain before this was turned down flat, given what producers perceived as an obvious lack of appeal to potential American viewers. Cleese, in regaling later audiences with the labyrinthine story of the film’s development, would often attribute this veto to an unnamed Texan investor, putting on a ridiculous accent as he imitated him: “He was a short little fellow – half his height alone came from his ten-gallon hat and his impressively high-heeled cowboy boots with the spurs attached”. Monty Python and the Holy Grail was rejected after continued plot revisions, with the title Monty Python and Camelot eventually being chosen instead. In the United States, the term “Camelot” was associated almost entirely with the Kennedy administration by the mid-1970s, thanks to the incumbent President Humphrey referring quite incessantly to his predecessor. “It’s about time we took that back for ourselves,” another investor was apocryphally claimed to have said. [4]

    The ultimate question of the movie’s plot, naturally a subject of great deliberation amongst the sextet of Pythons, was essentially answered for them after producers on both sides of the Pond insisted that Connie Booth, known for her past role on Doctor Who (and more popular as an individual than any of the Pythons, including her husband John Cleese), play a key part in the film. It was a relatively new situation for the Pythons, a men’s club through and through, for they had played most of the female parts in Flying Circus themselves, unless the role specifically called for a genuinely attractive woman – in which case they usually enlisted Carol Cleveland, whose… “assets” as a performer were best demonstrated by their affectionate nickname for her, “Carol Cleavage”. Cleveland would indeed be involved in the film, but Booth would have to play the female lead – Queen Guinevere. [5] This was actually quite serendipitous for the Pythons, as they had already decided that Cleese would play Lancelot. Graham Chapman – who was openly homosexual – played King Arthur, allowing for plenty of comedy to be wrung out of that very skewed interpretation of the classic love triangle. This romantic element was set against a backdrop of post-Roman Britain defending itself against the Saxon incursions starting in the fifth century. In fact, the opening scene of the film depicted the last Roman general (facetiously named “Biggus Dickus”) leading his troops off the island of Britannia and leaving the Britons “in God’s hands”. [6] This set up the righteousness of Arthur’s reign (his pulling of the Sword from the Stone was depicted in animations by Terry Gilliam) before the film settled into a groove of comedic Arthurian set pieces. Though only a modest hit at the box office (released as it was in 1974, its tagline in the United States was “If you only see one British film this year, make it Moonraker” [7]), it did well enough for investors to entertain the possibility of a second film, though none of the Pythons were particularly fond of any of the suggestions for plots or settings; the only one which many of them felt showed real promise – a satire of organized religion set during Biblical times –
    was flatly rejected by American investors. [8]

    Cleese, whose departure from Monty Python’s Flying Circus had sealed its fate, was unsurprisingly not thrilled with working on the film, despite it having been a good experience overall, and had no desire to work on a sequel anytime soon. His true passion, one that had remained with him through the years, was to tell a story about a man running a hotel. The seed had been planted by an unforgettable experience at the Gleneagles Hotel in Torquay, whose proprietor was a highly eccentric individual by the name of Donald Sinclair. He was one of those people who proved that reality was indeed stranger than fiction; and Cleese knew, after having been exposed to his shenanigans, that he would love to write and star in a television program about a similar character. The BBC, having been the previous home of the Pythons, was the obvious first port of call. But despite their established relationship, executives at the network were initially lukewarm on his rather peculiar pitch; that is, until Cleese informed them that he wanted his wife to co-write and co-star in the show with him (she had virtually no role in the scripting for Camelot, but the pair had written together for the stage). At that point, the excitement became palpable – even as late as 1975, Booth continued to… arouse strong feelings from the right kinds of audiences on both sides of the Pond. At the same time, they were wary of both Cleese and Booth, given their established record for abandoning popular shows at their very peak. Although there was something to be said for quitting while you were ahead, that was not a notion which carried much weight in the television industry – even at the publicly-owned and operated BBC. The couple offered to write for a single season of six episodes, with an option for a second season to be exercised at their discretion. The network countered with a demand for two thirteen-episode seasons upfront, with an option for a third to be exercised at the discretion of the BBC. This unusually high number of episodes (for a British sitcom, at any rate) made a great deal more sense when factoring in the potential for sales to the lucrative American market – feelers were even put out to Desilu, as it was becoming apparent by the mid-1970s that the American studio was increasingly returning to its sitcom roots (as 1975 had notably marked the end of the Doctor Who distribution agreement).

    As for Cleese and Booth, they flat-out refused what they saw as a ludicrous proposal, despite the fact that both Monty Python and Doctor Who had production seasons of at least thirteen episodes during their respective runs. They threatened to take their pitch to ITV, before negotiations yielded a sensible compromise (though Cleese would later admit that his threat was a bluff). There would be one season of eight episodes, followed by an option for a second, at the discretion of the BBC, and then possibly a third, though at the discretion of the creator couple. [9] Fawlty Towers would star Cleese as pompous, overbearing hotelier Basil Fawlty; Booth was not to play his wife Sybil (the only woman who could instill fear in Basil), but the mild-mannered, sensible maid, Polly (who also attended art school). Rounding out the staff were the Spanish waiter, Manuel (who could barely speak English), and the Cockney chef, Terry; [10] the only other regulars were the long-term guests residing at the hotel. The first season aired in February and March of 1976 and was – unsurprisingly, given its pedigree – very successful. The second season, immediately green-lit by the BBC, had not yet gone into production by the time the first season had aired stateside, on PBS. Meanwhile, in Canada, the first season of Fawlty Towers aired on the CBC in the autumn of 1976, as part of the expanded roster of British-made shows on that network (the result of a separate situation entirely). The second season of Fawlty Towers aired in the autumn of 1977. Working so diligently on their scripts (revisions on some of them took months) put a tremendous strain on the marriage of Cleese and Booth; the two divorced in early 1978. [11] Naturally, they also declined to exercise the option to write and star in a third season of Fawlty Towers, though perhaps someday they could be persuaded to change their minds…


    Doctor Who, meanwhile, continued apace, the Yank Years now well and truly behind them. The budget, as a result, dropped precipitously; fans were ambivalent. The show certainly no longer looked anywhere near as good as it originally had, but there was an undeniable (and to a certain contingent of viewers, irresistible) back-to-basics feel that evoked the tenures of the First and Second Doctors. Incidentally, the BBC had wiped the episodes depicting their adventures from their archives; this necessitated a sheepish call on their part to Desilu, which retained the syndication copies made back in 1971. [12] Jim Dale and Jane Seymour had excellent chemistry and were strong candidates for being the finest Doctor-Companion pairing in the history of the program. Seymour remained with Doctor Who for four seasons, the longest consecutive tenure of any companion, departing only when she was chosen to appear as the female lead in the James Bond film, The Spy Who Loved Me, which would be released in 1980. [13] Aware that she would be departing, producers allowed her reduced participation in her final season, working to groom replacement Companions – male and female – for the Doctor. Dale, for his part, soldiered on without Seymour, happy with his role as the Fourth Doctor; in the process, he endeared himself to British audiences, cementing the “Pertwee vs. Dale” debates that would dominate the fandom on that side of the Pond forever after. And besides, no matter how far the visual effects and overall art direction for Doctor Who had declined from its height during the years of the collaboration between the BBC and Desilu, it still remained head-and-shoulders above any other genre programming on British television. The only show(s) that had been able to compete, the UFO series of programs, had come to an end shortly after the departure of Michael Billington, in order to become James Bond; this was a recurring theme on many high-profile British shows, dating back to the 1960s. Jane Seymour was the latest in a very long line of these performers, which also included two co-stars of The Avengers: Honor Blackman, who played Cathy Gale, left to appear in Goldfinger as the iconic Pussy Galore; Diana Rigg, who replaced her as the legendary Mrs Emma Peel, would also depart for On Her Majesty’s Secret Service, playing Teresa di Vicenzo, better known as Tracy Bond, the one woman who had captured the heart of 007. But as far as inferior visuals went, rebuttals to the decline of Doctor Who came in the form of three simple words: The Tomorrow People. Indeed, it was perhaps the endurance of that ITV “ripoff” (which actually owed a great deal more to Star Trek than to Doctor Who, as did so many 1970s science-fiction programs on both sides of the Pond) that made the “shock” of Doctor Who coming back down to Earth seem less abrupt – even at its very best, The Tomorrow People was clearly inferior to Doctor Who at its very worst, in that department. Notably, it failed to make the crossing of the Pond, even after the science-fiction revival started in the mid-1970s. Another product of this movement was the dystopic Blake’s 7, created by one-time Doctor Who writer Terry Nation (the creator of that program’s most notorious adversary, the Daleks). As with many other science-fiction properties of the time, Blake’s 7 owed a great deal to the Western genre (despite the fact that said genre obviously did not have cultural resonance on the islands which had not constituted “frontier” territory for over a millennium). 
Nation had developed Blake’s 7 largely in response to the optimism of Star Trek (though in reality, the political situation depicted in that series was a great deal more nuanced and complex than the interstellar utopia often depicted in parodies), and indeed he had shopped the show around stateside first, but had no luck. American viewers were willing to accept depressing shows that depicted harsh realities set in the here and now, but far-future settings were much more the province of escapism. It was only when Nation reluctantly returned to England that he was able to sell his show to the BBC. It told the tale of a group of escaped prisoners (the titular seven, led by none other than Blake) on the run from the evil, totalitarian Confederation. [14]

    Outside of speculative fiction, many shows found themselves crossing the Pond, continuing a tradition that had been kicked into high gear with Till Death Us Do Part and Steptoe and Son. The popular Man About the House sitcom had been successfully adapted in the form of Three’s Company, though the two spinoffs produced from the original series (Mildred and George, and Robin’s Nest) had not yet seen transatlantic counterparts, largely because Desilu (the studio producing the show stateside) did not want to tamper with a winning formula. In addition, another hit show, The Liver Birds – which featured two young women from the northern industrial city of Liverpool – was also to be sent over to the United States, set in Baltimore.
    Greater emphasis, however, was placed on the female leads working their blue-collar jobs as opposed to their home lives, reflecting the socioeconomic realities of the time. [15] Yet none of these shows were quite as successful as one which did not make the crossing: the classic situation of the impoverished aristocrat and the nouveau riche lower-class merchant was given a romantic twist with To The Manor Born, the story of a widow who could not afford to maintain her late husband’s estate, being forced to sell it to a grocer whose family hailed from Eastern Europe. She continued to live in a cottage on the estate, with her faithful servant, and often found herself in conflict with her new landlord. Given the primacy of class-based humour in the United Kingdom, it was not surprising that the show was so successful, and that the more egalitarian-oriented United States might not prove as receptive a market for it. But adults weren’t the only people watching television, of course; children were less conscious of class, and also more receptive to visual styles that might charitably be described as “trippy”. This was especially true of those programs produced by Stop Frame Productions, which (as the name might suggest) was particularly fond of stop-motion animation – specifically the clay-based variety popularly known as “claymation”, given the use of modelling clay as the preferred artistic medium. [16] The more psychedelic children’s entertainment produced by studios such as Stop Frame was juxtaposed with renewed interest in an old standby, the children’s puppet show Sooty, which saw a changing of the guard from the original puppeteer, Harry Corbett, to his son, Matthew. Its popularity withstood even the arrival of a British-produced sister program to the American Sesame Street, which was named Sesame Square; the alliteration had allowed that name to win out over the more fun-sounding Sesame Circus. [17] It was produced by ITV (the BBC having been unimpressed, and even somewhat dismayed, by their pedagogical techniques). The Spanish-language and American culture segments on the original program were replaced with those deemed more relevant to British children; these were filmed at Elstree Studios.

    Long-running British sitcoms, such as Are You Being Served?, continued to remain popular, even after casting changes – venerable character actor Arthur Brough, who played senior menswear salesman Mr Ernest Grainger (portrayed as an over-the-hill, somewhat cuddly Winston Churchill type), had decided to retire from the program following the death of his beloved wife – the episode “Goodbye Mr Grainger” was written to give him a proper send-off. [18] The decision was made to promote the popular Mr Humphries character to senior, and the layabout “everyman” type Mr Lucas to associate. This was considered a key strategic decision, as Lucas – played by Trevor Bannister – was originally written as the main character, before becoming eclipsed by his talented roster of co-stars. One fewer person to compete with was just fine by his reckoning. [19] Are You Being Served? co-writer David Croft obviously had proven experience with older, veteran actors, as they were the entire focus (as opposed to playing nominally supporting roles) in another sitcom of his, Dad’s Army, which was a period piece set on the Home Front of World War II. The show actually predated Are You Being Served?, and ended prior to the death of Brough; in fact, the idea of re-casting Grainger with one of the now-unemployed actors from the earlier program was briefly floated, though just as quickly dismissed. Dad’s Army, as the name might suggest, depicted the lives of those men who were too old to serve in combat, and were obliged to contribute through less direct – but no less essential – means. Initial concerns that the program would belittle or dismiss the efforts of the Home Guard (a running gag on Are You Being Served? was the denigration of the “Captain Peacock” character, who claimed to have fought against Rommel but was actually an army caterer [20]) were fortunately found unwarranted. In fact, the same writing team behind Dad’s Army worked on a later program, It Ain’t Half Hot Mum, which actually was set on the Front, in what was then known as the British Raj (though in the closing days of the Second World War, so as to avoid excessive encounters with the Japanese). The plot entailed the day-to-day interactions of the British troops with the Indian natives – creating an obvious parallel subtext to the situation in the United Kingdom, which saw large influxes of immigrants from the former Raj settling in the domain of their one-time colonial masters. It was a classic shorthand for the changes facing the world as distances continued to shrink in the modern age, including those once-vast oceans…

    ---

    [1] IOTL, “only” 28 million viewers are claimed to have watched their 1977 Christmas Special; the presence of Prince Charles is enough to boost those figures further.

    [2] Prince Charles did not appear on Morecambe and Wise IOTL, though Ammonds vigorously sought to secure his presence, and he is thus “the one that got away”.

    [3] Still a year earlier than IOTL, which saw Monty Python and the Holy Grail released in 1975.

    [4] Obviously, IOTL, they went with “…and the Holy Grail” instead of “…and Camelot”.

    [5] Booth appeared in only a single scene IOTL, playing a (suspected) witch. ITTL, her higher profile radically alters the plot structure of the film, which now focuses on the Arthur-Guinevere-Lancelot triangle as a result. This does allow them to spoof a fair number of love triangle and romance-related tropes that they could not IOTL.

    [6] The film was a good deal less historically accurate IOTL, set in “England 932 A.D.” – which was during the reign of Athelstan, by some reckonings the first effective King of England, though by this time any “King of the Britons” would have far more to fear from the Danes than from the Saxons, for obvious reasons. The character of Biggus Dickus, of course, appeared IOTL in a later Python film – and with all due respect, I have no doubt that it took the Pythons all of five seconds to come up with that name.

    [7] Inspired by the OTL tagline: “Makes Ben-Hur look like an epic”.

    [8] Greater mainstream success earlier on can be a double-edged sword. Thus, roadblocks have prevented anything like the OTL Life of Brian film from coming to fruition.

    [9] The OTL agreement yielded just one season of six episodes; it soon became clear that the option to produce a second, of equal length, rested with Cleese and Booth.

    [10] Sybil Fawlty is not played by Prunella Scales ITTL, as Cleese wanted another (unnamed) actress for the part IOTL, and thus it is she who plays Sybil. Note also that the chef Terry (who joined the cast for the second season IOTL) is part of the show from the outset.

    [11] IOTL, the first season aired in the autumn of 1975; the second would not follow until 1979, by which time Cleese and Booth had divorced (just as they would do ITTL).

    [12] A sheepish call that, sadly, could not have been made IOTL, resulting in a perpetual search for new leads to recover those lost episodes for the past three decades.

    [13] Seymour was the principal Bond Girl in Live and Let Die IOTL, which was released in 1973. The more mature Seymour ITTL plays the more complicated role of Anya Amasova (portrayed IOTL by Barbara Bach), regarded as one of the finest Bond Girls, up there in the highest echelons with Honey Ryder, Pussy Galore, and Tracy Bond.

    [14] Or, in a particularly blatant nod to the program’s modus operandi as the anti-Star Trek, simply the Federation IOTL. The BBC, who ITTL have a good working relationship with Desilu and intend to keep it that way, refuse to allow Nation to go ahead with calling his evil empire that, much to his chagrin.

    [15] The Liver Birds never made the transatlantic crossing IOTL; but note the lack of a Laverne & Shirley ITTL, and then re-examine the plans for the adaptation.

    [16] Stop Frame ceased to be in 1975 IOTL, but was resurrected as Cosgrove Hall, under which name most of the principal artists did some of their most successful work. ITTL, the original company survives, due to the vagaries of the business world, though its output is largely similar to that of the later OTL company.

    [17] For an example of how a British version of Sesame Street might graft local colour onto the skeleton of the American program, consider the OTL (and TTL) example of Canadian Sesame Street. And yes, those sets at Elstree are the very same ones that, IOTL, housed those of The Muppet Show.

    [18] The episode “Goodbye, Mr Grainger” existed IOTL as well, though it ended with the character remaining at Grace Bros. Department Store; Brough’s wife died during the off-season, at which point he declined to return, necessitating a re-cast (actually, a series thereof). Brough died shortly thereafter, IOTL and ITTL.

    [19] Bannister departed from the program in 1979 IOTL, but will remain in place until the bitter end ITTL.

    [20] Precise character details on Are You Being Served? were generally subject to the Rule of Funny, but consensus seems to be that Captain Peacock served in the Royal Army Service Corps, which (needless to say) made his contributions invaluable toward the war effort, despite the lack of glory from serving on the front lines.

    ---

    Thanks to Thande for his advice and suggestions in the making of this update!

    So here we have another look at the Telly across the Pond! I hope that this helps to answer some of your many questions. There will be more, of course – this timeline still has the better part of a decade to go, after all – and we’ll be hearing a good deal more about British society in the very immediate future. But I certainly can’t start talking about politics without exploring the underlying popular culture first! I mean, who could possibly imagine doing such a thing? ;)
     
    Appendix B, Part VI: Everybody Votes, Sometimes
  • Appendix B, Part VI: Everybody Votes, Sometimes

    Europe in 1975.png
Map of the European economic situation in 1975. Countries in BLUE are members of the European Economic Community (EEC). Countries in GREEN are members (full or associate) of the European Free Trade Association (EFTA). Countries in RED are members of the Warsaw Pact. Countries in GOLD are aligned with the People’s Republic of China. Countries in BROWN are members of the Forward Coalition (Backwards Bloc). Note that Portugal is a member of both the Backwards Bloc and the EFTA.

    Indeed, it has been said that democracy is the worst form of government except all those other forms that have been tried from time to time.

    Winston Churchill, addressing the House of Commons; November 11, 1947

For all that the 1960s were increasingly remembered as a decade of turmoil and unrest, the 1970s were not particularly stable either: they were a period of exceptional highs and lows, marked by pronounced economic fluctuations and ever-shifting socio-political attitudes. Very few of the countries of the world – be they part of the First, Second, or Third – found themselves in much the same place at the close of the 1970s as they had been when the decade began. The only constant was change, whether the societies in which these changes took place were democratic or as far away from that idealized concept as was possible while still maintaining the overall façade…

Ronald Reagan was inaugurated as the 38th President of the United States on January 20, 1977, seventeen days after the members of the 95th United States Congress. For the first time in more than two decades, the Chief Executive and control of both Houses of Congress belonged to the Republican Party. (President Eisenhower and the 83rd Congress had been the most recent tandem to accomplish this feat). Many of the members of the GOP caucus were supporters of the more libertarian, conservative ideological wing of the party, fostered by Reagan’s idol, Sen. Barry Goldwater, Sr., of Arizona, and shepherded by Reagan himself. This faction was named the “Reaganites”, after their President and de facto leader, as a counterpoint to the more liberal “Rockefeller Republicans”, who had supported the former Governor of New York; these were now personified by Reagan’s running-mate, Vice-President Charles Mathias. There was plenty of room for both factions in both houses of Congress, as the Republican domination could not be overstated: the GOP held 259 of the 435 seats in the House of Representatives, and 62 of the 100 seats in the Senate. Attempts were quickly made by the newly dominant party to institute legislative reforms – particularly those which sought to mitigate the dreaded filibuster tactic in the Senate [1] – but these were (unsurprisingly) met with a great deal of hostility by both the Democratic and American parties (both masters of the technique). However, the Republicans did achieve a measure of success in working to divide and conquer, depriving the Democrats of such titles as “Minority Leader” and “Minority Whip” by arguing that, since another party with a minority of legislators also served in both Houses, such titles were a misnomer. [2] This had been a sticking point with the Republicans for some time, as the Democrats had not held a majority of seats in the Senate of the 93rd Congress (in fact, they officially held only 48 of the 100 seats, tying them with the Republicans; independents who caucused with the party effectively gave them the majority). The American Party, unlike the Progressives (in their many incarnations through the first half of the 20th century), seemed very much here to stay by the mid-1970s, so the Republicans (who knew that the party appealed at least as much to Democratic core voters as to those who might support the Republicans) sought to take advantage. Talk of the “permanent Republican majority” emerged at this time, though (as is always the case) such confidence in their supremacy would not last forever.

But President Reagan certainly had a chance to enact some of his policies in the meantime, and that’s precisely what he did. First and foremost, at the insistence of many influential economic minds within his party, he placed the United States back on the gold standard, as the “shock” from the collapse of the Bretton Woods system under Humphrey was believed to have contributed to the worsening of what had at first seemed a relatively mild recession following the Oil Crisis. [3] But President Reagan’s crowning fiscal accomplishment, by the reckoning of many of his supporters, was the reduction of the income tax rate for the top bracket of earners from 70% to 50%, as part of his economic strategy (nicknamed “Reaganomics”) and based on the “Laffer curve”, which postulated that marginal rates in excess of 50% so disincentivized taxpayers from working to earn additional income that they reduced potential revenues for the government (and consumer spending besides). This rate was the lowest for the top bracket since prior to the Great Depression (it had been raised from 25% to 63% in 1932). Fifty percent was also an important milestone for optics, and indeed many policy-makers argued that it was only logical for a person to keep at least half of what he earned through wages, interest, or profits from business and investment (capital gains were another matter entirely) – though obviously most states also imposed income taxes of their own, which, when combined with the federal rate, would still claim more than half of the income earned by those at the very top. At the same time, Congress also simplified the tax structure, reducing the number of brackets from 25 to 14, with the lowest bracket also seeing its rate reduced, from 14% to 12%, allowing lawmakers to trumpet a “lowering of taxes across the board”. [4] Naturally, the lowering of taxes was wildly popular among the masses.

The social policy bills passed under the Reagan administration, on the other hand, were more controversial. The tent-pole piece of such legislation was known as the Liddy Act, named for its primary sponsor, Rep. G. Gordon Liddy of New York. [5] As with the lowering of taxes, it was a decided triumph for the libertarian faction of the Republican Party, described as “forever vindicating freedom of speech over fear of indecency in all forms of media”. This rhetoric was grossly exaggerated – the near-omnipotent FCC, though restructured, retained most of its censorship powers (in fact, it was the privately-operated MPAA that seemed more shaken by the new regulations) – but it nevertheless captured the spirit of the law very well. Talk radio – prior to the passage of this legislation, a largely benign and “soft” segment of the radio industry (that which remained, at least, following the migration of scripted, dramatic programming to television) – found itself given powerful teeth, as a new breed of host became stars on the national stage. One of the defining early examples was Sam Steiger, a strong libertarian figure from a strong libertarian state – Arizona – who defied the “racist” label attached to many controversial figures within the movement by virtue of his ethnicity and religion, being Jewish. [6] One side effect of the bill was to eliminate the Family Viewing Hour, due to opposition from the right as well as the left. However, it remained in place de facto at all three networks, due to the preponderance of young children who were still awake at 8 o’clock in the evening. It was obviously too late to save Those Were the Days, though Norman Lear did (very reluctantly) support the bill (the First Amendment made for strange bedfellows), later describing it as the only worthwhile piece of legislation passed by the Reagan administration. Certainly, it would prove one of the most enduring and influential.

Despite Reagan working to pass social legislation at the federal level, the true cause célèbre was a proposition in his home state of California. The Briggs Initiative, formally Proposition 6, proposed that all gay and lesbian persons (and even those who were sympathetic to their plight) be banned from teaching, or even working, in the California public school system, for fear that they would “contaminate” children. Homosexuality had been delisted as a mental disorder from the DSM-II by the American Psychiatric Association in 1974, after extensive lobbying by gay rights activists, who had joined the Sexual Revolution after the Stonewall Riots of 1969. However, public acceptance of the condition (or lifestyle, or orientation – terminology varied even more widely than views on the matter) remained limited, even as the 1970s came to a close. The mid-to-late-1970s were, in general, a reactionary period, which had culminated in this proposition. However, and perhaps surprisingly, the conservative establishment largely opposed the proposition; the most surprising naysayer was President Ronald Reagan himself, to the astonishment of his staffers and supporters. “Whatever else it is, homosexuality is not a contagious disease like the measles. Prevailing scientific opinion is that an individual’s sexuality is determined at a very early age and that a child’s teachers do not really influence this” [7], he famously announced at a campaign appearance for his former running-mate, Governor Houston I. Flournoy, who was running for re-election (and who also opposed the initiative, as did his Democratic challenger, Jerry Brown). Reagan’s opposition ensured the collapse of the proposition, but he was hardly the only high-profile politician to oppose it. Los Angeles City Councilman George Takei, riding high from his appearance in Star Trek: The Next Voyage earlier in the year (his first acting role of any kind since 1971), made his first foray into larger-scale politics when he vocally condemned the Briggs Initiative while taking questions at a press conference announcing his candidacy for the Democratic nomination in the (federal) 28th Congressional District. [8] He would eventually lose that nomination – both that seat and his present council district (whose boundaries fell partially within it) were largely African-American, and indeed Takei had been returned to council in the 1975 general election by a surprisingly narrow margin against a black opponent. Takei would announce his retirement from municipal politics in 1979, having largely completed his objectives with regards to developing an advanced mass transit system for the city of Los Angeles, and now inspired to take his ideas “on the road”, as it were.

Just as was the case in the United States, the government in the United Kingdom had a seemingly insurmountable lead in its (elected) legislature, the House of Commons. William Whitelaw, like Ronald Reagan, had seen his party swept in as a reaction to the perceived mismanagement by the ruling party of the time – across the Pond, it had been the Democrats, but in the British Isles, it had been Labour, led by Harold Wilson. And though Hubert H. Humphrey had only survived the end of his term by mere months, Wilson would enjoy a much longer life out of office; after resigning his seat in late 1974, he sought a second career as a television personality, meeting with varying success. [9] But on the whole, he faced much less hassle than the party he left behind; Labour was bitterly divided between the more established, moderate, pro-Europe right wing and the younger, grassroots, anti-European left (which opposed union on socialist or even Marxist grounds). Labour, being on the left side of the British political spectrum, and having been forced to withdraw to their electoral strongholds after their devastating defeat in the 1974 election, was mostly dominated by this leftist rabble thereafter; they naturally chose one of their own, Michael Foot, as Leader of Her Majesty’s Loyal Opposition (and, potentially, a future Prime Minister) to challenge Whitelaw and his Tories in the next election. [10] That was a long time in coming, however, and despite headaches in facing (or rather, putting off facing) the trade unions, the economy was mostly good in the intervening years. Polls showed the Conservatives with a solid lead throughout their first term back in Government. Largely, this was because Whitelaw did his best not to make waves.

That said, he was capable of taking a stand, if need be. For example, his Tories, in contrast to Labour, were far more united on the issue of joining the European Economic Community (being mostly in favour). But despite having campaigned in 1974 on a promise to “reach a fair and equitable arrangement for admission to the EEC”, he eventually resisted working to bring this about, for a myriad of reasons. For one thing, the “temporary” arrangement which had been worked out with Canada, at the behest of its Prime Minister Robert Stanfield, to solidify trade relations between those two countries (and fortify existing treaties with Australia and New Zealand) after the EEC postponed the admissions process in the wake of the Oil Crisis was proving a surprisingly profitable enterprise for all involved parties – it helped that, between them, the United Kingdom, Canada, and Australia were three of the ten largest economies in the world, even as late as 1975. The linchpin, however, was a fatal mistake made by the French President, François Mitterrand, in continuing negotiations with Whitelaw and his Cabinet. Mitterrand, who was the de facto leader of the EEC, had been spearheading closer economic ties between its members in the wake of the Oil Crisis (which was naturally quite devastating to the oil-poor organization), and his pet project was a joint currency. [11] As the British pound sterling had only recently been decimalized by the mid-1970s, he felt that the attachment of the British people to their ancient currency was not so strong as to prohibit discussions with the British government which would entail joining the new currency. Mitterrand, ordinarily a fairly canny politician, would later describe this as the biggest mistake of his political career. Though the French President had sought to make clear that monetary union was intended as a long-term objective which would follow years or even decades of integration, the headlines that greeted the proposal in the British press naturally made far greater waves than the fine print. The overall attitude of the British population toward joining the organization, which had been generally neutral beforehand (depending, of course, on the precise phrasing of polling questions), grew increasingly hostile, and Whitelaw reluctantly put the issue on the back-burner. The “Commonwealth Trade Agreement”, meant only as a temporary substitute for integration with the EEC, seemed ever-more permanent with time. However, this in turn created a big problem in Ireland, whose economy and trade relations were tied so closely to those of the United Kingdom that it effectively could not join the EEC, even though it, unlike the UK, had agreed to all of the entry conditions. The Irish trade issue was the primary source of tension between the two countries of the British Isles through the 1970s: the British felt obliged to “look after” a country which had left the fold decades before, whereas the Irish resented the British controlling their destinies; they were emphatically not interested in joining anything with the “Commonwealth” label attached, the avoidance of which would require some legalistic wrangling.

However, that section of “Free” Europe outside of the EEC was not restricted solely to the British Isles (particularly since many in the British Isles believed themselves “With Europe, But Not Of It”). In fact, against the backdrop of British attempts to sort out their position relative to Ireland and to France, several countries in Southern Europe found themselves living out entirely different situations. Fascism, though largely discredited and certainly stigmatized in the aftermath of World War II, continued to endure on the Continent. Indeed, two regimes from that era remained in place into the 1970s: Nationalist Spain, and the Estado Novo in Portugal. A third far-right government, the military junta in Greece, had moved ever further rightward since the King was effectively exiled in 1967 – though Greece remained a de jure monarchy in the tradition of interwar Hungary in the meantime – the process greatly accelerated by the Cyprus Incident of 1974. All three countries, despite doing very well indeed economically prior to the Oil Crisis of 1973, were hurt very badly by it, especially as their European neighbours entrenched themselves in their various economic alliances; Portugal was a member of the EFTA, which also contained oil-rich Norway, but it was embroiled in a lengthy and costly colonial war, which served to mitigate that advantage. Thus, these pariah states decided to seek closer ties, forming an association which was described internally as the “Forward Coalition”, espousing quasi-fascist policies and anti-communism; opponents (generally on the European left) quickly labelled it the “Backwards Bloc”, a name that only gained more currency when South Africa and Rhodesia, both under white minority rule, joined the organization. South Africa, despite having been a pre-WWI “White Dominion” and a founding member of the Commonwealth, had been suspended from that organization as a result of its apartheid policies. Rhodesia, which had illegally declared independence from the United Kingdom, was certainly not on speaking terms with her erstwhile sister nations either. These five countries were united in their opposition to socialist thought in general and to Communism in particular (anti-communism being a cornerstone of fascist ideology), with Red China serving as the preeminent boogeyman of the far right in the 1970s. Several South American countries were openly sympathetic to the Backwards Bloc, but did not formally join it.

The long-standing dictator of Portugal, Antonio de Oliveira Salazar, was already a very old man by the turn of the 1970s. An accident in his own home in 1968, causing minor injury, served to reinforce this fact, and reminded him uncomfortably of his own mortality. [12] The Estado Novo regime which he had formed would need a figurehead to continue on without him. Taking inspiration from his longtime friend and colleague, Generalissimo Francisco Franco, the dictator of neighbouring Spain, Salazar decided to appoint Duarte Nuno, the Duke of Braganza and Pretender to the long-vacant throne of Portugal (the nation having been a republic since 1910), to the position. Salazar had toyed many times in the past with installing Duarte Nuno as a figurehead monarch; now, he felt, it was an idea whose time had finally come. In fact, Salazar had only recently died when Duarte Nuno became King of the restored Kingdom of Portugal in 1972, replacing Americo Tomas, who had been President since 1958 (re-elected by the legislature in 1965, which similarly acclaimed Duarte Nuno in 1972), at the end of his term. [13] Duarte Nuno took the throne as Edward II, King of Portugal and the Algarves, and came to be known in Portuguese as “O Regressado”, or “the Returnee”. As the last King of Portugal, Manuel II, had died in 1932, by which time Duarte Nuno was his Heir Presumptive, this allowed for an unbroken line of succession. However, the elderly King, aware of his precarious position on the newly restored throne, constantly deferred to the Estado Novo government, despite his very presence securing the support of conservative, ultra-religious, and reactionary elements within Portugal. His opponents had many, rather less complimentary names for the King, such as “O Impotente” – the Impotent – or (more to the point) Vitor Emanuel, a reference to the penultimate King of Italy, whose reign had been dominated by his fascist ministers, most notably Mussolini. The regnal name Edward II was appropriate as well, for it evoked the legendarily ineffectual English King of the same name.

However, Duarte Nuno did have an effect on the Francoist regime next-door in Spain. Generalissimo Franco, himself a very old man, had planned on reinstating the monarchy to succeed him, though he favoured Juan Carlos over his father, the pretender Infante Juan, whom Franco believed too liberal for the job. Edward II died in early 1976, having “enjoyed” his restoration for barely four years, which were fraught with stress and anguish. His son Duarte Pio, the Prince Royal of Portugal, succeeded him as Edward III, against the better judgement of many within the Estado Novo regime. Their fears were not unfounded, as the new King immediately began working to pass liberalizing reforms. The situation in Portugal, appropriately enough, was echoed almost precisely in Spain; Franco, as noted, did not choose the nominal Pretender (Infante Juan, Count of Barcelona, and son of the last reigning King, Alfonso XIII), but that man’s son, Juan Carlos, creating him the Prince of Spain in 1969 and designating him heir-apparent, hoping to groom him to continue the Francoist regime after his passing. Generalissimo Francisco Franco died in late 1975, and the following year (at which time he was still dead), Juan Carlos began instituting reforms. The two monarchs, born seven years apart, became close friends, and each made his first official state visit in his monarchical capacity to the other in 1976. After constitutional reforms were complete, Portugal withdrew from the “Backwards Bloc” in 1977, at the same time signing a peace treaty with India recognizing that country’s “lawful and legitimate” annexation of Portugal’s former colonies on the subcontinent, and withdrawing from Angola, Mozambique, and mainland Portuguese Guinea; this marked the final departure of European colonial forces from the African continent. However, Portugal retained control of its insular colonies throughout the world, including Madeira, the Azores, Cape Verde, Sao Tome and Principe, Macau, and part of the island of Timor in the East Indies. [14] Spain followed a similar trajectory, though it retained none of its colonies. Democracy coming to both countries near-simultaneously, and through similar means (constitutional monarchy), resulted in what commentators described as the “Iberian Sunrise”. Edward III thus acquired the popular nickname “O Democrata”, or “the Democrat”. “I serve at the Will of the People” became his catch-phrase, and later his motto.

And then there was the last remaining Backwards Bloc member in Europe. The sudden – and, from the point of view of the Greek population, far from tragic – death of the Queen Mother, Frederica of Hanover, in surgery did much to bolster monarchism in Greece, given her meddling in the reign of her son during his active rule in the 1960s. [15] The threat of her being allowed to do so again should Constantine II return to Greece had loomed over any attempts at his restoration, but her death allowed him to be perceived as his own man, for better and for worse. The junta regime dragged on through the 1970s, and the Greek reputation as a pariah state was solidified by its continued affiliation with the Backwards Bloc even as Spain and then Portugal withdrew from the union; this left Greece as the sole European representative thereof, as the two major Western European economic blocs – the EEC and the EFTA – further crystallized. This proved devastating to Greece in the wake of the major global recession that commenced in 1978, which was the straw that broke the camel’s back as far as the junta was concerned. Democratic uprisings broke out throughout Greece, and (inspired by the Iberian Sunrise) various political leaders invited the now-orphaned Constantine II to return to his throne in early 1979. [16] The white-controlled government of Rhodesia finally collapsed under its own weight that same year, being without allies in Europe; this left South Africa as the sole surviving member of the Backwards Bloc as the decade came to a close.

Greece wasn’t the only country whose people took inspiration from the Iberian Sunrise. In fact, a resurgent wave of monarchism spread throughout Europe, even reaching the three solidly republican titans of the EEC, though ultimately to no effect: Italy, where the constitution forbade any revision of the republican form of government (and barred the male heirs of the House of Savoy from even entering the country), which naturally scuttled any organized support for a restoration; France, where, in addition to disagreement between the three pretenders for the throne (Legitimist, Orleanist, and Bonapartist), a century of republicanism had led the political class to accept the system as the best of the worst; and even West Germany, though support there was also divided between two pretenders: the legitimate Prussian heir, Louis Ferdinand von Hohenzollern, and the “Pan-German” candidate, Otto von Habsburg. Otto was (obviously) the candidate of choice among Austrian monarchists as well, popular and respected among all classes for his vocally liberal-democratic political stances. However, the notion of his installation as a “pan-German” monarch of a combined West German and Austrian state was a pipe dream, though it may perhaps have hampered his (slightly) more realistic chances at his dynasty being restored to the throne of Austria. (The rest of the former Habsburg Dominions, all of which were Communist, obviously saw little popular support for such a restoration). In the end, the popularity of monarchism in these republican countries was the result of a vocal minority (as well, perhaps, as a certain romantic nostalgia), and the odds of their success ranged from small to negligible. Then again, the same could have been said for those countries which did see monarchical restorations in the 1970s. [17]

And then there was the exact opposite situation: a monarch facing the risk of being popularly deposed. At the request of the Shah of Iran, whom President Reagan viewed as “one of our most important allies in the Middle East”, American troops were dispatched for anti-insurrectionist purposes. This attracted ire from certain sections of the American population, fearful of repeating the overseas quagmire of a decade before, but the key difference was that the American military of the late-1970s was all-volunteer, and much more strictly regulated than the free-for-all of decades past. The “task force” dispatched to Tehran was a small core of elite units; in the opinion of many strategists, the Shah was overestimating the possibility of an uprising. Nevertheless, they worked to keep the peace; this proved a more difficult task than anticipated, especially in the face of numerous armed uprisings, starting in late 1978 (caused by a wide variety of disparate factions, most in opposition to, and a few in support of, the regime). [18] The American government, having been fed reports by forces on the ground, began urging the Shah to consolidate his support base in the country, which subsequent fact-finding investigations found to be perilously thin. It soon became clear that American peacekeeping forces, rather than being an unnecessary dalliance, were the glue keeping the regime together, and the Shah was reluctantly obliged to accede to their demands, implementing reforms which would bring the government closer to the Persian Constitution of 1906 in order to appease the bourgeoisie. In spite of this, an American military presence would remain for some time, amid fears of rebellion not only by the omnipresent Communist insurgent threat but also by the fundamentalist followers of the Ayatollah. The Shah, notably, was one of the few monarchs whose popularity did not see a boost in the late 1970s, despite that era being one of the best periods for public support of that institution in the latter half of the 20th century. Still, the peace held, however tenuously.

The atmosphere of Détente with the Soviet Union cultivated by both Hubert H. Humphrey in the United States and his counterpart, Comrade Brezhnev in the Kremlin, (surprisingly) continued, though in a more muted fashion, under the staunchly anti-communist and far more bellicose Ronald Reagan. Despite this, Cold War tensions did not wholly dissipate. In fact, the situation in Europe appeared to present a microcosm of the global situation in the 1970s; the nebulous tripartite situation of years past was beginning to crystallize, though one of those blocs – the Chinese – most certainly did not collapse under its own weight, in contrast to the Backwards Bloc. Indeed, that ill-fated organization’s arch-nemesis, Red China, decided to pick up the pieces in Southeast Asia where the United States had left off a decade earlier, invading the very country in which American troops had become mired not so long ago, and facing hostile international reaction (not to mention finding themselves in a very similar situation). Red China had entered the 1970s as a power on the rise; most of the world had finally, belatedly recognized that they controlled the vast territory and population lost by Generalissimo Chiang and his Kuomintang in the 1940s. However, the Cultural Revolution, starting in the late 1960s, had proved utterly devastating to the people and the economy of the would-be Great Power. As was so often the case in totalitarian regimes, including many of the Backwards Bloc states which stood diametrically opposed to Red China (autocracy made for strange bedfellows), the country fell into disarray upon the death of Chairman Mao Tse-tung [19], centre of the cult of personality called Maoism, in 1976. It was decided that Red China would continue to follow their late leader’s economic policies, resisting any and all attempts to shift toward capitalism, and prove their mettle by extending their sphere of influence – and not just in Southeast Asia, but also into India, the second-largest country in the world by population, and one of the most significant of Third World, non-aligned states. Many of those countries which had reluctantly switched recognition from the Republic of China (Taiwan) to the People’s Republic were now sorely regretting it. The United States, notably, never extended even informal acknowledgement of Red China (lacking the political capital to do so under Humphrey, and then the will to do so under Reagan). Of all the NATO countries which had acknowledged the People’s Republic, the one which came closest to revoking that recognition was Canada, which had made the initial gestures of goodwill under Prime Minister Pierre Trudeau (also a good friend of the Cuban Communist dictator, Fidel Castro); Trudeau was followed by the far more traditionalist Robert Stanfield, who had even caused a kerfuffle at the 1976 Montreal Olympics when he refused to exclude the Taiwanese Olympic team, despite the demands of the delegation from Peking.

    With regards to domestic issues in the Great White North, Stanfield found his attentions divided between the four corners of the Dominion that stretched from sea to sea, and from the river to the ends of the Earth. Quebec had been a major focus of his premiership, and that alone raised the ire of most of the rest of Canada, particularly Alberta, which was the economic engine for the whole country after the Oil Crisis. Although Stanfield and the Premier of Alberta, Peter Lougheed, were on fairly good terms, it very much seemed to Stanfield that it would be far more difficult to alienate Albertans than it was to appease Quebec voters, and Quebec had many more seats than any of the Western provinces (despite the latter being a longtime base of support for his party). Then again, he was between a rock and a hard place. The New Democratic Party, eager to take advantage of previously missed opportunities in the West, selected as their leader Rural Saskatchewan MP Lorne Nystrom; the Opposition Liberals, on the other hand, were bound by party convention to select an Anglophone leader to succeed the Francophone Trudeau. They chose John Turner, an English-born MP from Ottawa, who sadly lacked the charisma and dynamism of his controversial predecessor, and thus failed to appeal to either English or French Canada. Finally, Réal Caouette, the leader of the Social Credit Party, and never in the best of health, decided to retire from politics, to be replaced by the younger André-Gilles Fortin, who emerged from the hotly-contested party leadership convention as the staunchly federalist alternative (versus his separatist opponent). [20] The Socreds, like the NDP, had once been much stronger in the West, and Fortin decided to attempt to rebuild the party there in a way that his Quebec-oriented predecessor had never done (as his leadership had been defined largely in opposition to English Canada). The sense that there was enough for everyone was cemented by a reapportionment of the electoral districts, or ridings, which had come into effect for the 1978 federal election. The House of Commons gained 18 seats, increasing from 264 to 282. Vote-rich Ontario saw the plurality of these gains, but both Quebec and the West saw greater representation as well. None of the four Atlantic provinces, where Stanfield was most personally popular, gained a single seat.

It was largely this reapportionment which allowed the Tories to be returned with a slight majority, winning 149 seats (a net gain of seven on top of the 142 that they had won in 1974). The vagaries of the First-Past-The-Post electoral system saw them gain seats (despite losing vote share, though they remained above the vital 40% threshold) largely because the PCs lost support mostly in areas where their support was overwhelming, but maintained their popularity, or even saw boosts, in more competitive regions, which tended to be more highly populated; this resulted in their picking up the lion’s share of the “new seats” apportioned for the new Parliament. Stanfield maintained strong support in his Atlantic home base, and the Liberals – led by new Opposition Leader John Turner – failed to catch on in Western Canada, barely holding their own even in their Quebec strongholds (losing several seats in the English-language regions of Montreal to the Tories). The Tories’ surprising strength in Canada’s largest city – they had won all of one seat on the Island of Montreal in 1974, but took nearly half-a-dozen in the region in 1978, including the former Liberal stronghold of Mount Royal, once held by Pierre Trudeau himself [21] – echoed the provincial situation, in which the former French-Canadian parochial nationalist party, the Union Nationale (which had merged with a social credit splinter group and was subsequently renamed the Union Conservateur), won significant support among Anglophone voters for the first time. However, this was against the backdrop of the victory of an avowedly separatist party – the Parti Québécois, led by the charismatic René Lévesque – in the National Assembly of Quebec. [22] Unlike the previous premier, Bourassa, the new Premier was not particularly willing to play nice with the government in Ottawa, and immediately got to work attempting to establish the primacy of the French language in the province. Despite their disappointing results, the Liberals remained the Official Opposition; the NDP made marginal gains, though these were below expectations for the most part. The Socreds lost several of their Quebec seats, to both the Tories and the Liberals, and though they became competitive in Western Canada once again, they failed to actually break through and win seats there; in fact, they served mainly as spoilers, splitting the right-wing vote with the PCs and allowing NDP candidates to come up the middle between them. (The Liberals, once again, failed to win a single seat west of Manitoba.)

Across the Pond, in another Commonwealth Realm, in another election held in early 1978, the status quo also endured. Though they lost seats, the Whitelaw Conservatives had far too great a lead in the House of Commons for their majority to be whittled away, and the campaign by Michael Foot (who was on the far left of even his own party) did not endear him to the moderate swing voters, many of whom felt that Whitelaw was being far too easy on the trade unions. (Foot did not make the disastrous gaffe of claiming that Whitelaw was being too hard on them, but that was one of the few “risky positions” he did not take, with predictable results.) The Liberal Party performed moderately well, gaining ground from the Conservatives, as did the Scottish National Party, whose gains came solely at the expense of the Tories. However, the Conservatives (through their Ulster Unionist allies) performed well in Northern Ireland, where the situation had improved considerably from the late-1960s. Whitelaw was thus returned in 1978 with a majority of 73 (reduced from 142 in 1974) [23], though with the expectation that a confrontation with the trade unions – deferred during his first term thanks to the improving economy – would be inevitable in his second, especially after the second recession settled in by the end of that year.

In the United States, the elections of 1978 were, predictably, not terribly good news for the governing Republicans, who had reached their peak and had nowhere to go but down. That said, it was unfortunate for President Reagan that the late-1970s recession hit in the third quarter of 1978 – late enough that the Canadian federal election and United Kingdom general election had already taken place, but not the American midterms, which were bound by law to occur on the first Tuesday following the first Monday in November of every even-numbered year. Even so, a particular sticking point for many within the party was that many of the liberal and moderate Republicans who had worked with Democrats to stymie some of the proposed bills favoured by Reaganites had been returned, including Clifford P. Case in New Jersey and, surprisingly, Edward Brooke in Massachusetts, maintaining the two-member standing of minorities within the Upper Chamber. [24] But perhaps not coincidentally, the Republicans saw better-than-expected voter retention with minorities; an estimated quarter of black voters stayed true to the GOP, due largely to the resurgent AIP/ADP allowing Republicans to point to that party as the home of racists, segregationists, and the intolerant. Interestingly enough, and despite the Iranian regime being depicted by its opponents as overly secular and hostile to the historical Islamic communities in the region, the Republicans did better among Muslims than among any other ethno-religious minority in the United States (Arabs and other “White Muslims” voted overwhelmingly for the GOP; ironically enough, Black Muslims voted far less Republican than either non-Black Muslims or non-Muslim Blacks). It was primarily the working-class white voters who had come out so strongly for Reagan in 1976 who were turning against him now; the socially conservative “Archie Bunker vote” lived on, despite the end of Those Were the Days. Archie Bunker himself would no doubt have been a steadfast supporter of the Briggs Initiative (just as firmly as Carroll O’Connor, the actor who had played him, was a staunch supporter of gay rights), and would have felt deeply betrayed by his one-time idol, “Ree-gan”, for whom he so vigorously campaigned in 1976 in hopes of bringing back the “good old days”. The generation gap and the alienation of older people from ever-changing social mores weren’t going away anytime soon.

The American Party, despite being hammered by both the Republicans and the Democrats as “racist”, “fascist”, and “reactionary” – allusions to Adolf Hitler abounded – performed better than it ever had in the 1978 election, winning 22 seats in the House of Representatives and an astonishing five in the Senate. This jump was partly attributable to the death of incumbent Alabama Senator James Allen, at which time Governor George Wallace (who was term-limited, and could not run for that office again) appointed himself to replace him, winning the special Senate election of 1978 without serious opposition; even the “National Democrats” did not run anyone against him, focusing their energies on the other seat, vacated by Independent Democrat John Sparkman, which was also won by the ADP (House Leader Walter Flowers sought that golden opportunity to move on up). Their success was obviously due to their having co-opted the social conservative vote, even in areas far afield from their traditional Southern base. Their 1972 candidate for Vice-President, John G. Schmitz, ran for and won a State Senate seat in California, even as the Briggs Initiative which he supported went down in flames. [25] However, it was not all sunshine and roses for the American Party; their 1976 candidate for President, and Leader in the Senate, Lester Maddox, was narrowly defeated in a three-way race to retain his Georgia seat, losing to another former Governor, Democrat Jimmy Carter. [26] It was an unquestionable bright spot for the Democratic Party, given their underwhelming result opposite the Republicans, who, despite moderate losses in both chambers of Congress, maintained their commanding leads there – the full impact of the recession would not become obvious until 1979, and many voters remained wary of handing power back to the Democrats (who had enjoyed unfettered control of the US government for fourteen years) so soon, which was one of the reasons that the third-option American Party did so well. The AIP/ADP won more seats in the House than any third party since the Populists in 1896 (though, it must be said, the Populists won a larger percentage of seats, as the House was smaller at that time).

Though the old Senate Majority Leader, Sen. Hugh Scott of Pennsylvania, remained in the Senate, he stepped down from his leadership position, tired of the constant squabbles between the various factions of the Republican Party, and decided to serve out the remainder of his term as a backbencher. A more diplomatic candidate, Sen. Howard Baker of Tennessee, was chosen to replace him. In the House, on the other hand, Rep. Gerald Ford continued to serve as Speaker; though he, like Scott, was a moderate, he thoroughly enjoyed holding the Speakership and was well-regarded by his fellow Representatives on both sides of the aisle.

Perhaps the most widely-followed election of 1978 had nothing to do with any of the democratic governments of the world, but with the Conclave in the Vatican City, which was due to select another Pope of the Roman Catholic Church, the previous Pontiff (Paul VI) having died after a fifteen-year tenure. There were some 750 million Catholics in the world – more than the population of any single country save for Red China – and so the question of who would be chosen as the spiritual leader of more than one-sixth of the global population was obviously one which would have major ramifications. Many from within and without the Catholic community were vocal in their opinions as to which sort of man should become the next Bishop of Rome; or, more accurately, as to which sort of man should not. No non-Italian had served as Pope since Adrian VI, 450 years earlier, and indeed many felt that the next Pontiff should come not merely from outside Italy but from outside Europe entirely, as the Catholic population was burgeoning in the Third World. However, of the historic assemblage of over 100 Cardinals, more than half were European, and a full quarter were Italian; that proportion included all of the serious candidates, just as had always been the case. In fact, the list of papabile in 1978 included a notable contender from previous Conclaves in Cardinal Giuseppe Siri. A staunch conservative, he failed to consolidate support from the liberal and moderate Cardinals; however, said liberals and moderates also failed to coalesce around one of their own as an alternative. This impasse resulted in a compromise candidate after the third day of voting: the affable, well-regarded, and perpetually smiling Cardinal Sebastiano Baggio, who took the Papal name of Innocent XIV, which was something of a throwback, as the previous Pope Innocent had reigned over 250 years before. [27] The famous proclamation “Habemus Papam!”, followed by the crowning of Pope Innocent XIV with the venerable Papal Tiara, came on March 15, 1978 (ironically, the Ides of March, a fateful day indeed for a prior Pontifex Maximus); just in time for the new Pope to prepare for Easter celebrations the following week. [28] And so, the cycle began anew…

    ---

[1] The filibuster, though hardly invented by American politicians, was certainly perfected by them. IOTL, legislation was passed in 1975 which allowed the support of only three-fifths of serving Senators (usually 60) to invoke cloture (ending debate). ITTL, this legislation did not pass, for a variety of reasons (up to and including it having been filibustered by the acknowledged masters, the Democrats and the Americans). After 1976, the GOP was finally able to pass a similar law ITTL.

[2] The last party with widespread national support to have a sustained tenure in Congress – the Populists – predated the establishment of formal leadership positions in the House and the Senate. IOTL, no such third party has yet emerged since then, but the AIP seems to be here to stay by the late 1970s ITTL.

    [3] IOTL, the United States never returned to the Gold Standard after the “Nixon Shock” of 1971. ITTL, the “Golden Interregnum” lasted for about three years.

    [4] Most of this tax legislation was passed in 1981-82 IOTL.

[5] G. Gordon Liddy is best known IOTL for his involvement in the White House Plumbers under President Richard M. Nixon. However, this followed a failed attempt to primary Rep. Hamilton Fish IV in the 28th Congressional District of New York in 1968, which was then in Dutchess County. He would be elected in the Republican Revolution of 1974 ITTL.

[6] Steiger, ITTL, ran for President in 1976, as opposed to running for Senate; IOTL, he won that primary against paleoconservative John Bertrand Conlan, who did make great hay of Steiger’s ethnicity as part of a very ugly and hard-fought campaign. Conlan thus won the Senate seat ITTL, and serves alongside none other than Barry Goldwater, Sr. Steiger later switched to the Libertarian Party, on whose ticket he ran for Governor in 1982, before indeed becoming a talk radio host in later life.

    [7] This comes verbatim from an OTL editorial written by Reagan, and published on November 1, 1978.

    [8] No, Takei does not “come out” at any point while emphasizing his opposition to the Briggs Initiative, especially as he makes plans for a national run.

    [9] Wilson did the same thing IOTL, though he was on the whole more successful in doing so ITTL (though he’ll never be the next Parkinson or the like).

    [10] The more moderate James Callaghan became Labour leader upon Wilson’s resignation in 1976 IOTL. Foot would not become the party leader until 1980, after Callaghan was massively defeated by the Tories, led by a Mrs Thatcher, in the 1979 general election.

    [11] Mitterrand, IOTL, the President of France from 1981 to 1995, was the strongest advocate for what became the Euro, using it as a bargaining chip during the German reunification process. By this time, of course, the United Kingdom was already part of the EEC (soon to become the EU), though it still opted out of the Euro.

    [12] That accident caused him a severe brain hemorrhage IOTL, which led him to effectively step down from his longtime position as Prime Minister in 1968; ITTL, he is able to secure his succession before his (slightly later) death.

    [13] Tomas remained President until the Carnation Revolution of 1974 IOTL.

    [14] Portugal retained only Macau IOTL. East Timor, upon gaining independence, was immediately invaded by Indonesia.

    [15] IOTL, the unpopular Queen Mother also died in (elective and cosmetic) surgery, in 1981. ITTL, this happens in 1976 instead.

    [16] A popular vote taking place in early 1979 to confirm the restoration of Constantine II as King of the Hellenes gives him 55% support; IOTL, he received 30% support in the 1974 referendum (though this was up 10% from another referendum, held in the previous year). It should also be noted that Constantine II has largely stayed away from Greek politics during his exile, though not always by choice; however, this endears him to his people and gives his promises to reign as a constitutional monarch some additional weight.

    [17] Support for the monarchy in many of these European countries is, perhaps, overstated by monarchists and their sympathisers; even in Portugal and Greece, it was a near-run thing. However, and as you might imagine, it does not stop alternate historians from suggesting that various other European countries (most frequently Italy – after all, the May King, Umberto II, was still alive at the time) might have gone monarchist in the 1970s.

    [18] These minor incidents rapidly escalated by early 1979, of course, developing into the Iranian Revolution. The Shah repeatedly appealed to the United States for help in the late 1970s IOTL, but was met with continued rejections; ITTL, the staunchly anti-communist and interventionist President Ronald Reagan sends troops in the summer of 1978, against the advice of his staffers, and is able to delay the fatal blows long enough for last-ditch reforms to take hold.

    [19] The use of the traditional Romanized spelling Tse-tung for his name, as opposed to the “official” Pinyin of Zedong, is deliberate, as is the use of Peking (reserved in the present-day of OTL only for duck) instead of Beijing.

    [20] Turner did indeed replace Trudeau as leader upon the latter’s retirement in 1984 IOTL, serving briefly as PM before his party suffered a massive defeat in that year’s election. For the NDP, Ed Broadbent (representing the urban, working-class riding of Oshawa) was chosen as leader over Nystrom upon the resignation of David Lewis IOTL, but Broadbent was defeated by Michael Starr in 1972 ITTL and (after a failed 1974 rematch) returned to academia. Finally, after Caouette retired from politics (greatly injured in a snowmobile accident), Fortin did indeed replace him as leader, before he died tragically in a car crash.

    [21] Mount Royal, purportedly the strongest Liberal riding in Canada, nearly went Tory in their landslide victories of 1958 and 1984, as well as in 2011 IOTL. Though it went Liberal in the by-election following Trudeau’s resignation, that came before the investment by the government in Montreal bore fruit (with the successful Olympic Games, and then the launch of the Montreal-to-Mirabel “Rocket” line).

    [22] The Union Conservateur won over 12 seats in 1976 ITTL (Union Nationale won only 11 seats IOTL), with over 20% of the vote (only 18% IOTL), clearing both thresholds for Official Party Status in the National Assembly (it is not clear if the Union Nationale was recognized as such IOTL, as sometimes exceptions have been made).

    [23] In 1979 IOTL, the Tories formed a workable majority of 42.

    [24] Both Case and Brooke lost in 1978 IOTL – in fact, Case was successfully primaried.

    [25] Schmitz is the highest-profile AIP/ADP legislator from a nominally “liberal” state, being a member of the Upper House of the state legislature.

    [26] Carter did not attempt his maverick Presidential run in 1976 ITTL, given the Muskie-Jackson Battle of the Titans.

    [27] IOTL, of course, Cardinal Albino Luciani was chosen as the compromise candidate instead, and he selected the Papal name John Paul, in honour of his two immediate predecessors, John XXIII and Paul VI. It was the first double-barrelled Papal name in history, and the first new Papal name chosen in over a millennium (since Pope Lando in the early tenth century). He lasted about a month in the position before he was succeeded by Cardinal Karol Wojtyla, who took the name John Paul II in his honour.

    [28] John Paul I retired the Papal Tiara IOTL; Paul VI had used it only for the Papal coronation, a tradition that Innocent XIV chooses to continue.

    ---

    Many thanks to Thande, vultan, and my newest consultant, Dan1988, for their invaluable advice in the making of this update! Also, thanks to Archangel for his help with Portugal, and to Don_Giorgio for his help with Greece. And, finally, thanks again to e of pi for proofing. It took quite some time to bring everything together, but I hope that you all found it as enjoyable to read as I found it rewarding to write! Though, I must admit, this update was never too far from threatening to run away from me.

There is an awful lot of information to digest in this very long update, for which I must apologize. I’ll be happy to answer any and all questions you might have, and I hope to post infoboxes with regards to the vital statistics of the various elections I’ve discussed in this update (except for the Papal Conclave, of course).


[Attached image: Europe in 1975]
     
    Brand New Hollywood, Same Old Industry
  • Brand New Hollywood, Same Old Industry

Welcome to the Academy Awards, or, as it’s known at my house, Passover.

– Bob Hope, in the first words of his opening speech as Host of the 40th Academy Awards, April 10, 1968

    Welcome to two hours of sparkling entertainment spread out over a four-hour show.

– Johnny Carson, in the first words of his opening speech as Host of the 51st Academy Awards, April 9, 1979 [1]


For almost as long as the film industry had existed in Hollywood, those who were a part of that industry had possessed a phenomenally inflated opinion of themselves, and of the work that they did. This was common to virtually all kinds of entertainers, but filmmakers had truly elevated their pomposity to an art form. Fittingly so, for as of the release of The Birth of a Nation in 1915, motion pictures themselves were no longer deemed mere frivolous entertainment; they too were true art. In 1927, the studios – on the suggestion of Louis B. Mayer, the quintessential studio chief – created the Academy Awards, popularly known as the Oscars (a nickname of disputed origins), and the most famous self-congratulatory event in popular culture. With each passing year, the ceremony grew longer, more bloated, and more decadent than the last. Even wiseguy, bubble-bursting hosts like Bob Hope and Tonight Show host Johnny Carson could not quite deflate the egos on display. Perhaps nobody could – not in a single night, at least.
    Then again, even over longer periods of time, the chattering class seemed utterly oblivious to changes shaking the very foundations of their industry. By the 1970s, the Golden Age of Hollywood was well and truly past – fortunately, this decade also saw a tremendous wave of retro nostalgia, primarily focused on the 1950s, which were among other things the waning years of the Studio System, and the final period of dominance for many formerly-iconic but now-passé genres such as film noir, musicals, and westerns. Those studios which were still extant, if reeling from the collapse of the decades-long status quo, continued to have a great deal of difficulty adapting to new realities. All of the seven major studios (MGM, Universal, Paramount, Columbia, Warner Bros., 20th Century Fox, and United Artists) had expanded their operations into television by the 1970s – though some were dragged into doing so, kicking and screaming – and were increasingly forced to divest their historically valuable but obsolete assets. The eighth major studio of the Golden Age, RKO, had dissolved in 1959, having already sold most of its backlots to the upstart television studio, Desilu Productions, some years before.

    MGM, perhaps the defining studio of the Golden Age – which had continued to pay dividends throughout the Great Depression – had long ago been reduced to a holding pattern where their entire film division was almost wholly dependent on one major hit per year. This continued into the 1970s: Ryan’s Daughter had been very successful for them in the opening year of the decade, with Napoleon proving a veritable smash-hit the following year. It had also won the studio its first Best Picture trophy in over a decade, restoring some desperately-needed prestige at a critical time. A surprise hit for MGM in 1971 had been the pioneering Blaxploitation film, Shaft, allowing the studio to take advantage of a burgeoning genre, which would serve them well in the lean years ahead. For in 1972, they only managed to perform well with the first of the Shaft sequels; this situation repeated itself in 1973 with yet another sequel to that film. The success of MGM being increasingly tied to black audiences was reminiscent of a similar situation at NBC in the same era. It didn’t help that their more traditional successes, Ryan’s Daughter and then Napoleon, were bound to the whims of their directors – David Lean and Stanley Kubrick, respectively – who were meticulous perfectionists, and often took years to churn out their next pictures. Ryan’s Daughter had only been Lean’s fourth film since 1957. Stanley Kubrick was only slightly more prolific – he had made six films in the intervening years. [2] In the same span of time, a filmmaker of similar renown, Alfred Hitchcock, had directed seven films (with an eighth to come in 1972) and numerous episodes of his Alfred Hitchcock Presents series, on top of further work for television.

As far as Kubrick was concerned, MGM did what they could to accommodate his ever-fickle muse, for better and for worse. Ongoing discussions about adapting The Lord of the Rings, the revered trilogy of fantasy novels by J.R.R. Tolkien, had first caught the attention of Kubrick in 1969, when the Beatles, who were planning on producing and starring in the film themselves, approached him and suggested that he direct. At the time, United Artists had owned the rights to adapt the books for the screen, having secured them directly from Tolkien himself (who, for the record, did not endorse the involvement of the Fab Four). Kubrick was uncertain about the viability of such a tremendous undertaking, given the immense logistical complications involved, but the point was mooted by his work on Napoleon, which commenced later that year. Kubrick had promised to revisit The Lord of the Rings, despite his misgivings, once his historical war epic was completed. But by the time that production had wrapped on Napoleon in 1971, the Beatles had separated for good – though this dissolution had only come after MGM had purchased the rights to The Lord of the Rings from UA, much to the chagrin of studio executives. However, Kubrick was true to his word, and set out on preliminary work in order to bring the trilogy to the screen. But after attempting several treatments, reading the books back-to-front, and even scouting out locations, Kubrick finally abandoned adapting The Lord of the Rings in early 1972. Perhaps he had simply tired of epics, having directed two of the most exhausting films in a row (2001, and then Napoleon); by this time, he had become intrigued with the prospect of telling a story about the Holocaust. Perhaps he found such daunting subject matter invigorating, just as he had found the threat of Mutually Assured Destruction when he decided to adapt Red Alert (though that turned into the very different Dr. Strangelove). Among the ideas rejected outright by Kubrick were adaptations of A Clockwork Orange and The Luck of Barry Lyndon. [3]


It was later in 1972 that a most unlikely candidate to direct the Lord of the Rings films emerged. It was the height of Porno Chic, and two of the five highest-grossing films of the year were X-rated pornographic pictures: Deep Throat and Behind the Green Door. Also finishing in the Top 10 was the animated Fritz the Cat, directed by Ralph Bakshi, a maverick who – it so happened – was also a devoted fan of the Lord of the Rings trilogy, and had been ever since the 1950s, when he had first floated the idea of adapting the books for animation. At the time, he did not have artistic clout or commercial success behind him, but that all changed with Fritz. And when he learned that Kubrick had decided not to go ahead with the project, he seized the opportunity. On the whole, MGM bigwigs weren’t sure what to make of his proposal, but Edgar Bronfman, the studio chief, was eager to revitalize the reputation of his company, whose cartoon unit – which had produced and distributed the Tom and Jerry shorts – had shut down in 1957. [4] Granted, the directors of those shorts, William Hanna and Joseph Barbera, were now contributing to the degradation of the medium (at least, in the opinion of many animators, including Bakshi) with their “limited animation” style. Nevertheless, they had still produced major hits in the past, such as The Flintstones (which ran for six seasons), and in the present, like Wait Till Your Father Gets Home, which managed a five-season run before it ended in early 1977. [5] With regards to feature-length films, the primacy of the Disney studio had been badly shaken by the death of Uncle Walt himself in 1966. The Aristocats, the first film produced without him, had grossed well but critical reception had been lukewarm at best. And despite the time-consuming and laborious nature of animation, Bronfman reasoned that expenses would still come in well below analogous costs for three live-action pictures. With luck, for a moderate investment, Bakshi’s project would produce three high-grossing films in a row, in addition to grosses from MGM’s live-action roster.

One much-discussed technique in Fritz the Cat was the use of watercolour backgrounds, which were traced from original photographs; this was deemed suitable for use in the Lord of the Rings project as well. It would also take advantage of the extensive scouting photography done by Kubrick’s team, which remained in the hands of MGM despite the director’s departure from the project – in fact, these were pooled with photographs from the pre-production of Napoleon, resulting in some overlap of “settings” between the two projects. However, plans to use extensive rotoscoping of live actors in the animation process were quickly scrapped. “It didn’t look good in Snow White forty years ago, and it doesn’t look any better now”, an executive sagely observed. [6] Rotoscoping was to be reserved only for the extensive battle scenes, which would take months or even years to animate without it. That said, extensive use was made of live-action reference. In contrast to the revolving door of animators working on Fritz the Cat, Bakshi was able to assemble a dream team to work on The Lord of the Rings (which, granted, included several artists who had worked on Fritz), as the early 1970s had left a great many animators who had once worked at the now-closed cartoon divisions of the studios out of work and happy to put their talents to use. MGM was, perhaps, slightly less guarded with money than they might otherwise have been, had Bakshi been more forthright with cost projections, but they were convinced they were onto a good thing with the release of Disney’s Robin Hood in 1973, which was heavily criticized for its recycled animation, from sources as old as Snow White [7] – though the film still performed well at the box-office.

As much preliminary work was done simultaneously on all three films as was possible, in order to ensure a pattern of consistent annual releases. It thus took three years from the beginning of “principal photography” in 1973 for the first film, The Fellowship of the Ring, to be released in 1976. [8] The screenplays for the entire trilogy of films were written by Peter S. Beagle, a fantasy author of some renown, though working from the plotting and storyboarding of Bakshi himself, who had consulted extensively with Tolkien’s daughter, Priscilla, in doing so (Tolkien himself having died in 1973, living long enough only to express some misgivings about seeing Middle-earth depicted in “cartoon” form). [9] The Fellowship of the Ring cost over $5 million to produce – appropriately, about as much as Napoleon had cost MGM some years before. However, it was also a hit, grossing over $40 million in the United States alone, cracking the Top 10 for 1976 and proving to be MGM’s highest-grosser of the year – finishing in first place was the Elvis Presley/Barbra Streisand remake of A Star Is Born, which netted the King an Oscar nomination for Best Actor, the first of his career. [10] Streisand was also nominated for Best Actress for the third time, though as with her previous two attempts (and like fellow gay icon Judy Garland in the same role, over two decades before), she did not win; however, she did take home the Oscar for Best Original Song. [11] As for The Fellowship of the Ring, it (unsurprisingly, given the Academy bias against animation and fantasy) neither won nor was nominated for any awards, but it was well-liked by audiences, even the Tolkien fandom, for being very faithful to the original novel (though the character of Tom Bombadil and subplots related to him were excised for dramatic irrelevance), with critical praise going toward the art direction and the voice acting (the actual animation was deemed merely above-average – better than Disney and leagues above Filmation or Hanna-Barbera, not that that was saying very much).

The Two Towers followed in 1977. Though it obviously stood (as so many films did that year) in the long shadow cast by The Journey of the Force, it still finished sixth overall – one of two MGM films to finish in the Top 10, grossing over $50 million on a budget of less than $4 million. However, it was regarded as a disappointment in that it failed to out-gross the animated competition from Disney, The Rescuers; in fact, it also failed to finish as the top-grossing MGM film of the year. The Robert De Niro vehicle Bogart Slept Here, for which the thespian won an Academy Award for Best Actor, did so instead. [12] Again, The Lord of the Rings was shut out of the Oscars. The film, despite its impressive grosses, received a more lukewarm reception from critics and audiences alike, suffering, as so many middle instalments of trilogies do, from that certain “directionless” feeling. Despite the better raw grosses and (notional) profit margins over Fellowship, MGM brass were “concerned” at the direction of their series, and insisted on keeping Bakshi on a tight leash for the final film, The Return of the King, which was released in 1978. Fortunately for all involved parties, it would perform the best of all three movies, grossing $60 million (again reaching the Top 10, and becoming the biggest hit of the year for MGM). Critical acclaim was stronger for this film, and the Academy finally took notice, so to speak, in awarding Ralph Bakshi a special Oscar “for his creative and artistic adaptation of a modern fantasy classic to the screen through the use of animation”. [13] The film was perhaps about as different from the top box-office hit of the year (the long-awaited adaptation of the retro-nostalgia musical Greased Lightning) as was possible, but it spoke to the tremendous diversity of popular films throughout the decade. Nonetheless, as the years went by, definite trends emerged.

Had claims to being “the first blockbuster” not been foisted upon Moonraker, then they definitely would have gone to Jaws, released a year later. That Universal film was directed by the young wunderkind, Steven Spielberg, and advance word was so strong that EON Productions chose to hire the Hollywood Brat to direct the James Bond films – he would helm 1976’s Live and Let Die and 1978’s The Man with the Golden Gun before moving on. Jaws was a man-against-nature thriller, a cousin to the disaster films that dominated the box-office through the decade, and was based on the bestselling novel by Peter Benchley, which depicted the “enemy” as a great white shark. These fish were popular romantic enemies of Man and had been for millennia, despite the absence of evidence implicating the creatures as particularly fond of human flesh. But no matter; they were known to be bloodthirsty and intimidating, which made for a great story. The film told the story of a small coastal New England town hounded by the titular shark, which led a trio of locals to take it on once and for all. Jon Voight played the role of Hooper; veteran actors Roy Scheider and Robert Shaw played Brody and Quint, respectively. [14] The film’s production in 1974 was not without problems, particularly centred on the shark (the animatronic design did not respond to remote control, and had great difficulty staying afloat, sinking more than once). The draft script was re-written frequently, with Spielberg associate (and fellow Hollywood Brat) John Milius eventually receiving the screen credit. [15] It would earn him an Academy Award nomination for Best Adapted Screenplay, one of six received by the film, which translated into five Oscars, including those for Best Picture and Best Director – Spielberg, at age 29, became the youngest person ever to win that award (and the first Baby Boomer to do so). [16] It also won for Best Original Score, awarded to John Williams (in his second win, after Fiddler on the Roof), Best Sound, and Best Film Editing, awarded to Marcia Lucas at Desilu Post-Production Editing Unit B. [17] But all of the critical acclaim and awards-show recognition in the world could not compete with the grosses mustered by Jaws, which raked in nearly a quarter of a billion dollars in the United States alone. The success of Moonraker in 1974 was no fluke; a new order was rapidly emerging.

But as had been the case for the past quarter-century, the motion picture industry wished to demonstrate that it was artistically, morally, and intellectually superior to that most threatening upstart medium: television. One of the most outspoken screenwriters working in film, Paddy Chayefsky, had strong feelings about the small screen, and decided to write them down in hopes of coming up with something constructive – or perhaps, suitably destructive. The resulting polemic was a script called Network, which depicted the goings-on at a struggling (and fictional) fourth broadcast network (named the United Broadcasting System, or UBS). MGM produced and released the film in 1976, despite the executives and producers working within their own relatively prosperous television division having… reservations about the plot, which entailed an embittered, embattled news anchor, the “Mad Prophet of the Airwaves”, forced out of his position by low ratings, only for his inspired ravings to draw an unexpected audience. A young and particularly ruthless female network executive, played by the ubiquitous Jane Fonda [18] (despite her being nearly 40 by this time), took advantage of his new popularity, and the rest of the film charted his resulting descent – in more ways than one. Chayefsky, a member of the generation which fought in World War II (he himself had been an army veteran), was on the other end of the famed “generation gap” which had so defined the last decade; this informed a wry – and inaccurate – observation about Fonda’s baby-boomer character (“She’s television generation. She learned life from Bugs Bunny.”) which, ironically enough, the great screenwriter failed to appreciate – Bugs Bunny, though kept alive by television reruns, had begun life on the big screen, in cartoon short-subjects, and virtually all of the content now seen on Saturday morning cartoons had originally been produced as such. Then again, Chayefsky had a decidedly conflicted relationship with television as a medium; he had gotten his start there as a writer during the Golden Age in the 1950s, and wrote a teleplay which would eventually become the Oscar-winning Best Picture of 1955: Marty. However, he eventually bought into the hype, seeing his launching-pad as a “vast wasteland” like so many of his fellows. Network, like his earlier Hospital, was a scathing, self-important satire, and in addition to winning Best Picture, it became only the second film to win the Big Five Oscars (Picture, Director, Actor, Actress, and Screenplay), after It Happened One Night had done so forty years before. [19] Director Sidney Lumet, who, like Chayefsky, had first risen to prominence in the 1950s, represented a “bridging” generation between the established studio hacks of yore and the New Hollywood auteurs. Jane Fonda won her second Oscar (after Klute), and Chayefsky won his third (after Marty and Hospital), making him the first person to win three Screenplay Oscars solo – the previous three to pull the hat-trick (Billy Wilder, Charles Brackett, and Francis Ford Coppola) having done so as part of a team. MGM, which had produced Network, won its second Best Picture statuette of the 1970s, though obviously for a film that could scarcely be more different from their first.

In subsequent years, and along similar aspirational lines, one of the great undertakings, spoken of only in hushed tones among the Hollywood Brats, was their planned adaptation of the famed Joseph Conrad novella, Heart of Darkness. The novella had been written at the turn of the century as a critique of the notorious Congo Free State – a personal fiefdom of the King of the Belgians, Leopold II – and the prospect of adapting it for the screen became a timely one in 1960, with the Congo Crisis that saw the former Congo Free State (annexed to Belgium in 1908) gain its independence (amidst a wave which swept many African countries in that era). The New Hollywood generation would not achieve critical mass until the late-1960s, but Africa very much remained in the headlines even into the ensuing decade, for a variety of reasons. The formerly “Dark” Continent was coming into its own on the world’s stage: it formed an ideological battleground between capitalism and communism (though virtually the entire continent technically remained in the Third World as opposed to formally joining the First or the Second); Portugal, alone among the imperialist powers of the previous century, continued to fight to maintain its colonies (as “integral provinces”), finally conceding Angola, Mozambique, and Guinea-Bissau (but maintaining its insular territories) in 1977; the death (though disputed by Rastafarians) of the Emperor Haile Selassie of Ethiopia, and his replacement by his young, charismatic, and Westernized grandson, Zera Jacob Selassie – a graduate of Oxford University – who took the regnal name Constantine III after a previous Nəgusä Nägäst with the same birth name [20]; and, perhaps most significantly, the celebrated “Rumble in the Jungle” taking place in Zaire, the former Belgian Congo itself, between two of the greatest boxers in the world, Muhammad Ali and George Foreman, in 1974. This fight had been a boon to the image of the Zairian dictator Mobutu, who was rebuilding the capital of Léopoldville (renamed Kinshasa in 1966) in his own image. Among his grandiose projects was a film studio; Zaire was a francophone country, and the stretch of the Congo River on which the city was located was densely populated, even by Sub-Saharan African standards.

Mobutu may have already suspected that increased Hollywood interest in Africa might lure prospective filmmakers across the Atlantic. The African Queen had famously shot on the Congo a quarter-century before, but visiting Hollywood productions had been very scarce in the years since – granted, most of Africa had still been under European control at the time, and the continent had since been swept by revolutions, dictatorships, and very poor living conditions. Even the relatively well-off countries (such as South Africa) were abhorrent to the Western Allies for altogether different reasons. But the story of Heart of Darkness was the story of a trip up the Congo – for the purposes of authenticity (and what was New Hollywood if not inspired by the principles of cinema vérité?), any auteur director would have to film there. It was a big gamble for Mobutu, and a very close call indeed. For the producer, Francis Ford Coppola, had hoped to “modernize” the story at the core of Heart of Darkness by transposing the era to the late-1960s, and the setting to Southeast Asia. [21] But no studio was willing to touch his concept, even after the success of his two Godfather films. “We don’t need another M*A*S*H” was the common rejoinder. Long after that film had been forgotten by the general public, it continued to serve as a cautionary tale. Coppola grudgingly accepted their verdict, deciding to hand over the project to his screenwriter, John Milius. By this time, Blaxploitation had become entranced with “Brother Against Brother In The Motherland” themes, with multiple pictures being shot in the newly-built Kinshasa studios. The technology available there fit perfectly with the fast-and-dirty exploitation aesthetic of that particular genre; however, it perhaps lacked the refinement of the more lavish Hollywood studios. Location filming in the Congo would be workable, though certainly not ideal. But in the end, authenticity was worth something to Milius, especially as he sought to stake his claim in popular culture, as all of his peers had done by the mid-1970s.

Filming was a challenge – largely because all sides sought to gain optimal control of the film’s creative direction. Mobutu’s government had been made aware of an Afrocentric critique of the original novella by a Nigerian academic, in which the book (despite being condemnatory of imperialism and colonialism) continued to depict native Africans as the shadow archetypes, the “other”, whose savageness and barbarism threatened the complacency of the White Man’s existence from amid the harshness of the jungle. [22] They demanded that greater emphasis be placed on the brutality of the occupying powers, even above and beyond the incidents portrayed in the novella, and that the humanity and dignity of the Congolese people always remain in evidence. Environmentalist and animal rights organizations insisted on increased demonization of the ivory trade (which was present in the original book), as the elephant population was in rapid decline, the pachyderm already having been wiped out in large parts of the continent as a result of the trade in previous centuries. [23] As the weaponry used to dispatch elephants had advanced greatly in the past eighty years, Milius balked at this demand; he informed representatives from the World Wildlife Fund that the most effective way to dissuade viewers from supporting the ivory trade would be to hire native extras to re-create a historical hunt, which would in all likelihood result in the death of one or more actual elephants. In a compromise, the film did feature the live animals (borrowed from the Kinshasa Zoo), allowing the lead character, Marlow, to comment on the beauty of the creatures and lament their value only as a commodity. Chosen to play Marlow was Harvey Keitel, who had worked with another Hollywood Brat, Martin Scorsese, on his film Mean Streets. [24] Steve McQueen, who had originally been offered the role of Marlow, declined, not wanting to spend too much time filming in Zaire; he accepted the smaller role of Kurtz instead. The part was in fact shrunk further in rewrites, which increased the mystique of the character by having him become mythologized prior to his first onscreen appearance in the final act. The decision to cast McQueen, one of the most potent actors of his generation, as Kurtz served to solidify this character arc. [25] Despite the role being quite atypical for McQueen, he surprisingly relished the opportunity to prove his chops – at least, for a hefty salary and top billing. It would prove the last such role of his career; the actor had been diagnosed with an incurable form of cancer prior to the film’s release. [26]

With regards to post-production, Milius secured the services of an editor whose work he held in the highest respect, Marcia Lucas, by now a two-time Oscar winner. Despite considerable difficulties in retaining her services during what proved to be a very tumultuous year for both her and the industry, he was insistent, and she would receive an Oscar nomination for her work. It was one of many received by the film; it won Best Picture and Best Actor for McQueen, whose performance, less than fifteen minutes long, was the shortest ever to win that award [27]; Milius won two Oscars, for Adapted Screenplay and for Director. However, Lucas did not win a third Oscar for Best Film Editing, for reasons that were widely perceived to be political. Milius went out of his way to thank her specifically in his acceptance speech for Best Director, as did Producer Francis Ford Coppola, who was accepting Best Picture for the first time, having notably lost the award for both Godfather films. In addition, the film performed well at the box office, grossing $80 million in the United States alone. [28] It was another vindication for the Hollywood Brats, who had managed to achieve success by working with the major studios. However, the uneasy peace between the radical, revolutionary forces of New Hollywood and the staid, complacent establishment of the retrenched studio system came to a definitive end on the morning of April 6, 1978. Less than 72 hours after George and Marcia Lucas had won their Oscars for The Journey of the Force, they (on behalf of their studio, Lucasfilm Limited) filed suit against Paramount Pictures for breach of contract, fraud, and negligent misrepresentation. Thus began the Trial of the Century…

    ---

    Academy Award-Winners for Best Picture [29]

    42nd (1969-70): Midnight Cowboy (dir. John Schlesinger)
    43rd (1970-71): Patton (dir. Franklin J. Schaffner)
    44th (1971-72): Napoleon (dir. Stanley Kubrick)
    45th (1972-73): Cabaret (dir. Bob Fosse)
    46th (1973-74): The Exorcist (dir. Peter Bogdanovich)
    47th (1974-75): Chinatown (dir. Peter Bogdanovich)
    48th (1975-76): Jaws (dir. Steven Spielberg)
    49th (1976-77): Network (dir. Sidney Lumet)
    50th (1977-78): The Journey of the Force (dir. George Lucas)
    51st (1978-79): Heart of Darkness (dir. John Milius)

    Top-Grossing Films of the Year in the USA and Canada [30]

    1969: Butch Cassidy and the Sundance Kid (over $100 million)
    1970: Love Story (over $100 million)
    1971: Napoleon (over $100 million)
    1972: The Godfather (over $125 million)
    1973: The Exorcist (over $200 million)
    1974: Moonraker (nearly $175 million)
    1975: Jaws (nearly $250 million)
    1976: A Star Is Born (over $100 million)
    1977: The Journey of the Force (over $300 million)
    1978: Greased Lightning (nearly $175 million)


    ---

    [1] Both of these quotes are as per OTL. Granted, you may consider it a stretch that Carson would say the exact same thing he said IOTL, given the dozen years of butterflies that have accumulated ITTL, but I’ll allow it simply because it’s such an incredibly obvious observation. For the record, at that time, no telecast had yet run for four hours, IOTL or ITTL. Indeed, the longest-running ceremony was that of the 12th Academy Awards, which celebrated the films produced in 1939, the annus mirabilis of the Golden Age of Hollywood. In fact, as recently as 1972, a telecast had come in under two hours, but by 1979, the last several ceremonies had each run for well over three hours.

    [2] IOTL, Lean would not direct another film after Ryan’s Daughter (which, it should be noted, was a good deal more successful, critically and commercially, ITTL than IOTL) until A Passage to India in 1984. Kubrick, though he worked at a (relatively) moderate pace through the 1960s, saw his pace slow dramatically after 2001 was released in 1967: he would direct only five more films in the next thirty-two years. This was largely due to his all-consuming search for an ideal project, which rarely satisfied him.

    [3] Of course, A Clockwork Orange and then Barry Lyndon were Kubrick’s two films released after 2001 IOTL. Given that the director was extremely fickle about which projects he would bring to the screen, I’m going to posit that the window of opportunity for A Clockwork Orange has well and truly closed, allowing for it to be brought to screen later in the decade under the auspices of some lesser filmmaker, and obviously failing to achieve anything close to its OTL notoriety.

    [4] Recall from a previous post that Bronfman was able to cement his position as the Chairman of MGM strongly enough to fend off a challenge from Kirk Kerkorian which resulted in his deposition IOTL. In the ensuing years, with the relative success of Ryan’s Daughter and the boon of Napoleon, Bronfman was able to consolidate his position, having proven himself as good at making movies as his father was at making liquor. Speaking of which, upon the death of Samuel Bronfman in 1971, Edgar inherited the lion’s share of his father’s empire (which, in addition to Seagram, also included an oil company), and became one of the wealthiest men in the world, especially after the Oil Crisis of 1973. However, like a certain other fabulously wealthy media tycoon, Bronfman found himself accustomed to running the operations of his studio, and remained primarily focused on that enterprise; fortunately, he had many members of his large family to leave in charge of keeping the booze and the crude flowing.

    [5] Wait Till Your Father Gets Home ended in 1974, after two full seasons and a truncated third (quite common in animated series, for whatever reason) IOTL. In the death throes of the Great Society ITTL, the show has more resonance and finds a larger audience. Not coincidentally, the show finally wraps after Reagan is elected in 1976.

    [6] Yes, someone with a modicum of good sense has pre-emptively kiboshed the ludicrous overuse of rotoscoping by Bakshi. You’re welcome.

    [7] Sometimes a picture really is worth a thousand words – this moving picture should be worth about a million.

    [8] The one and only Lord of the Rings animated film to be directed by Bakshi IOTL (which depicted all of Fellowship and most of Two Towers) was released in 1978. Though the film was a box-office success, it did not receive a proper sequel, though Rankin-Bass released a (musical!) version of The Return of the King in 1980.

[9] IOTL, the early 1970s saw an attempt by John Boorman to adapt the novels into a (single) film, during which time he did indeed correspond with Tolkien on the matter; the later plans for an animated version helmed by Bakshi would not come to fruition until after the author’s death in 1973, and therefore Tolkien never learned of them.

    [10] Elvis and his manager, Tom Hulett, agreed to take the part for scale in exchange for top billing; Streisand, a massive prima donna, forced a compromise of “diagonal billing” (pioneered for The Towering Inferno, which co-starred Paul Newman and Steve McQueen, a few years earlier, IOTL and ITTL) in which the King’s name would appear in the lower-left and her name would appear in the upper-right, with both names above the title. (Elvis was then offered, and accepted, a larger salary.)


[11] IOTL, Barbra Streisand won the Academy Award for Best Actress at the 41st Oscar ceremony in 1969, for Funny Girl – in a tie with Katharine Hepburn (becoming the first actress to win for the third time) for The Lion in Winter. This is one of only two ties ever in the history of all the acting categories, and unlike the previous “tie”, between Wallace Beery and Fredric March in 1932 – which reports variously held to be a lead for March of between one and three votes – this one was an exact tie. However, it was awarded on April 14, 1969 (a Monday), more than two years after our POD (and right in the industry where it takes place), also after the election and inauguration of President Humphrey and during the resolution of the overseas quagmire. All of this allows Hepburn to (narrowly) win her third Oscar solo ITTL.


    [12] Bogart Slept Here was based on the Neil Simon screenplay that would, IOTL, become The Goodbye Girl, which happened after Robert De Niro was deemed not right for the part (a takeoff on the Dustin Hoffman story – perhaps he might have been?). The role was then recast with noted movie actor Richard Dreyfuss, who had just appeared in Jaws (and, prior to that, in American Graffiti), but ITTL, why would anyone cast Richard “Meathead” Higgins in a romantic role? This gives Robert De Niro the Oscar for Best Actor in Bogart Slept Here, which is largely considered an “apology” for his shocking loss of the Best Supporting Actor trophy to Harvey Korman (which was nothing new even then – decades before, Jimmy Stewart won for The Philadelphia Story because he lost for Mr. Smith Goes To Washington), thus allowing the cycle of Oscar entitlement to begin anew.

    [13] Oscar chose the same tactic IOTL for Who Framed Roger Rabbit.

    [14] Hooper is another, far more visible role that Richard Dreyfuss was unable to play ITTL due to his commitment to Those Were the Days. Voight had been a finalist for the role IOTL. This naturally scuttles any further collaborations between Spielberg and Dreyfuss in the future, most notably Close Encounters of the Third Kind.

    [15] Milius was merely a ghostwriter for the film IOTL (one of many, as it happens). Screenplay credit was awarded solely to Benchley himself, along with Carl Gottlieb.

[16] Spielberg, quite notoriously, would not win Best Director until 1994 IOTL, for Schindler’s List, at the age of 47. Indeed, he was first recognized by the Academy as a producer (receiving the Irving Thalberg Memorial Award in 1987), by which point he had been snubbed even for nomination on numerous occasions (including for Jaws itself, as well as for his first major “Oscar bait” film, The Color Purple). Among the films for which he lost Best Director IOTL: Close Encounters, Raiders, and E.T.

    [17] The editing job – and the Oscar – went to Marcia’s mentor, Verna Fields, IOTL. ITTL, Marcia became the first woman to win Best Film Editing twice upon receiving the Oscar for The Journey of the Force. Thus, when George Lucas won his Oscar for Best Director that same night, he was still one behind his wife.

    [18] The female network executive was played IOTL by Faye Dunaway. But ITTL, Jane Fonda, whose career was not (temporarily) hobbled by her actions in support of an enemy of the state, wins the part instead, despite really being too old for the role (though, granted, this has never stopped Hollywood before, nor has it ever since IOTL).

    [19] IOTL, One Flew Over The Cuckoo’s Nest won the “Big Five” Oscars the year before, becoming the second film to turn the trick. Network won Best Actor, Best Actress, and Best Original Screenplay, but lost both Best Picture and Best Director to one of the screen’s greatest Cinderella stories, Rocky, a film which does not exist ITTL.

[20] IOTL, Haile Selassie was deposed in 1974, in a coup by insurgents who were supported by pro-Communist elements, which ITTL the CIA works to defuse. This, coupled with the death of his Heir Apparent, Amha Selassie, from a severe stroke in 1973 (a stroke which he survived IOTL, living for another quarter century), paves the way for his young and liberal grandson to take the throne upon his death. Fortunately for the young Constantine III, he is taking the throne amidst a wave of pro-monarchical sentiment (to which his own accession indeed contributes), which helps to blunt initial opposition to his reign, before he is able to assert himself and win over his people.

    [21] This produced the resultant film Apocalypse Now IOTL, one of several films of the late-1970s which were utterly obsessed with rehashing the overseas quagmire.

[22] The academic, Chinua Achebe, was also an Afrocentric novelist of some renown, and wrote his critique in February, 1975, IOTL.

    [23] Worth noting, in another entry for the “Suddenly Always This Way” file, is that the ivory trade was only banned in 1990 IOTL.

    [24] Keitel was originally chosen for the role of Capt. Willard in Apocalypse Now IOTL, before he was dismissed and replaced by Martin Sheen.

    [25] McQueen was the first choice for the role of Willard IOTL, but declined to participate due to the extensive shooting that would be required in the Philippines (which turned out to be far more than anyone could have realized, and likely would have killed him). The suggestion by Milius to cast him as Kurtz is an invention for TTL.

    [26] As IOTL, sadly; McQueen died on November 7, 1980, though by late 1978 he had developed a persistent cough which plagued him for the rest of his life.

[27] The shortest performance to win a lead acting Oscar IOTL was that of David Niven, who won Best Actor for Separate Tables, in a performance lasting for fifteen minutes and thirty-eight seconds of screentime. Also worth noting is that Beatrice Straight, whose supporting performance in Network was the shortest ever to win an Oscar (at five minutes and forty seconds), did not appear in that role ITTL, and her equivalent did not win the Oscar either.

    [28] About par with the OTL grosses for Apocalypse Now in 1979.


    [29] Midnight Cowboy and Patton are as IOTL. All subsequent winners differ from OTL: Cabaret wins instead of The Godfather; The Exorcist wins instead of The Sting; Chinatown wins instead of The Godfather Part II; Jaws wins instead of One Flew Over The Cuckoo’s Nest; Network wins instead of Rocky; The Journey of the Force wins instead of Annie Hall; and Heart of Darkness wins instead of The Deer Hunter. All of these also win Best Director except for Chinatown; Francis Ford Coppola wins for The Godfather Part II.

    [30] Butch Cassidy and the Sundance Kid and Love Story are both as per OTL. Subsequently, the top-grossing films of their respective years IOTL were as follows: Fiddler on the Roof in 1971 (with $80 million); The Godfather in 1972 (with $135 million); The Exorcist in 1973 (with $193 million); Blazing Saddles in 1974 (with $120 million); Jaws in 1975 (with $260 million); Rocky in 1976 (with $120 million); Star Wars in 1977 (with about $300 million); and Grease in 1978 (with $160 million).

    ---

This was originally going to be a smaller, more intimate update, before the length burgeoned to 7,000 words and I accrued 30 footnotes – the most I have ever had in any update I’ve written; and I honestly thought that I would never top the 28 that I managed to include with the previous update. Nevertheless, I want to thank all of you for reading, and I hope that this gives you a good impression of American cinema in the 1970s and how it compares to that of OTL. And as you can see, there’s now a spectre looming over Hollywood that could well threaten to shake the status quo as nothing has done before. We will be revisiting that, of course. Many times, in fact…
     
    Meanwhile, At the Hall of Justice...
  • Meanwhile, At the Hall of Justice…

    You will believe a man can fly!

    – Tagline for the Superman film, 1978

Comic books, though possessing direct antecedents dating back to the nineteenth century, if not further, truly came of age in the late 1930s; in doing so, they formed the mosaic for one of the most tumultuous eras in global history. The Golden Age of Comic Books, as it came to be known, was (in a rarity among historians) universally agreed to have commenced with the publication of Action Comics #1, on April 18, 1938. That first issue saw the debut appearance of the character known as Superman, the first modern superhero, in whose wake a great many would follow. Notably, the Golden Age of Comic Books overlapped with those of both the motion picture and radio industries; all three spanned the entire Second World War, a demonstration of how the flourishing of popular culture worked to cement that conflict as the most iconic in world history. Comic books had never been more popular, or more important, than they were in the 1940s; being pulp literature, their depictions of men (and women) with superpowers fighting alongside the troops against the Nazis in Europe, and the Japanese in the Pacific, struck an instant and indelible chord with the general public. However, given the unapologetic demonization of the enemy, many images propagated by these comics… did not age very well, to put it delicately. To put it blatantly, their depiction of the Japanese in particular was horrendously racist, perhaps even for the era. It was part of a barrage of dehumanization of people belonging to that ethnicity during that conflict, a matter of which great political hay would be made in the future. Then again, the visual depiction of virtually all minority races – in every medium – very much left something to be desired.

It was not surprising, however, that during a conflict which saw the unprecedented co-opting of the privately-owned-and-operated media for propaganda purposes, comic book readers would take so voraciously to superheroes (often with very humble and unexceptional origins) fighting the enemy. In fact, Superman was not even the most popular of superheroes during the Golden Age which he had kick-started; a character who might charitably be called a “knockoff”, Captain Marvel, held that title instead, with his comic being the best-selling of the 1940s. Captain Marvel also beat Superman to the silver screen, with a twelve-part film serial of his adventures released in 1941. It even predated the celebrated Fleischer Superman cartoons – which, in another sign of the times, evolved from relatively apolitical science-fiction plots to pure wartime propaganda in later shorts, after the Fleischers had been bought out by Famous Studios. However, Captain Marvel (nicknamed “The Big Red Cheese”) did not age nearly as well as Superman (“The Man of Steel”), becoming a relic of the Golden Age, with publisher Fawcett Publications cancelling his comic in 1953. The early 1950s were a transitional era (and not just for comic books) in which many popularly-held preconceptions about the world and the people who lived in it had to be reassessed. The hated Japanese had been defeated, through the use of a heretofore unknown weapon as mighty as anything seen in the pages of those wartime comic books. Their American cousins, all of whom without exception had demonstrated unwavering loyalty to their new homeland, had been interned without due process of law, entirely as a result of their ethnicity. The “Negro” soldiers, though still segregated from white units, had served with distinction on every front, and in every service, of the United States Armed Forces. The wildly popular entertainment form that existed primarily to mock and belittle them, the minstrel show, was rapidly falling out of vogue. This could be demonstrated on the screen: the 1942 musical Holiday Inn had featured a “blackface” minstrel performance, whereas its 1954 remake, White Christmas, did not. This was representative of barriers being broken down throughout society in this era: segregation of the armed forces, fittingly, had ended once and for all that very same year. The drive for civil rights was a fact of life.

By about this time, multiple live-action Superman serials had been released to theatres; and, more importantly, the famed radio show which had run for over a decade had since evolved into the Adventures of Superman, the 1950s television series which starred George Reeves as the Man of Steel. One of the most popular and enduring action-adventure series of its era, it had an impact on popular culture that was confirmed when Reeves put in an appearance on none other than I Love Lucy, playing himself (though, for the benefit of young viewers at home, he was identified only as “Superman”). Adventures of Superman bridged the transition between two distinct aesthetics: it departed a dark, cynical, morally ambiguous period – the years between the end of World War II and the Korean War, which established the Cold War hegemony and replaced the threats of Nazi Germany and Imperial Japan with the emerging superpowers of Soviet Russia and Red China – for one of glossy, overly-affected and societally-imposed sunshine and cheer: the 1950s, the era of suburbia and the burgeoning middle-class baby boomer families. The first two seasons, which were filmed in black-and-white, were influenced by film noir styles (still omnipresent in the early 1950s – Humphrey Bogart was still alive, after all), but later seasons, filmed in colour (a pioneering and prescient move by producers, which would have boffo financial results in the years to come), took on the campier tones that would define comic books and their derivative media in the Silver Age. In a way, it presaged the Batman series that would follow, even if it did not delve into the same wretched excess. Of course, the history of the program could not be complete without its infamously tragic coda, when George Reeves committed suicide in 1959; production on Adventures of Superman had ended the year before.

Though the start of the Golden Age of Comic Books could be dated with unusual precision, finding the end date was far more problematic. The very height of the comic industry had been during wartime, and sales of superhero comics had declined immediately after V-J Day; the genre was soon supplanted by other burgeoning ones, particularly romance and horror. These more visceral topics were depicted with increasing frankness on the pages of bright and colourful books that were popularly (if incorrectly) perceived as being intended exclusively for young audiences. And therein lay the problem: though there was an increased awareness of the need for racial tolerance and integration in society, the 1950s were in many other ways quite culturally conservative. Nobody knew this better than Dr. Fredric Wertham, the author of an enormously influential exposé on the comic book industry, Seduction of the Innocent, published in 1954. One common thread of Golden Age comics had been the intimate same-sex friendships that had formed between many characters. This was also reflective of the World War II backdrop, in which young men from disparate corners of the Union would form instant and unimpeachable bonds with the other men in their unit, with nothing more than pictures and the occasional care package reminding them of their girls waiting for them back home. But Wertham saw a subtext there that discomforted him, and he wrote in great detail about it, cherry-picking and even manufacturing evidence whole-cloth to suit his premises, and to raise the ire of his readers, in the most sensationalistic fashion possible. Batman, one of the most popular figures of the Golden Age, had since 1940 been accompanied on his crime-fighting adventures by his young, pubescent ward, Robin. Their secret identities were, respectively, Bruce Wayne, a millionaire bachelor playboy, and Dick Grayson, an orphaned circus acrobat. The two had been shown in some issues sharing the same bed. Wertham immediately came to what he saw as the only obvious conclusion: the two were homosexual lovers, in the tradition of the ancient Greek pederasts. This was the smoking gun, as it were, in his laundry list of complaints about all genres of comic books; he demanded that some form of regulatory body be established to censor the impropriety of the fledgling medium. The result was the Comics Code Authority, a Hays Office for pulp literature. Comic books would never be the same... though, ironically, Batman (and, in fact, most superhero comics) would survive the purges that followed the institution of the Comics Code. The once-rising romance and horror genres, on the other hand, were not so fortunate – it had proved relatively easy to adapt superhero comics to the specifications of the Code, but titillation and shock factor were crucial to the success of those other genres, which found themselves eviscerated by the overwhelming restrictions thereupon. Those which were not immediately cancelled simply tapered off into oblivion. Some, like MAD Magazine, found entirely new niches, and were very successful.

For all its notoriety, it was in fact Batman itself which, for better or for worse, came to define the popular perception of comic books during the era later described as the Silver Age. It had started off as a purely film-noir-derived comic, with the titular character taking the guise of a bat in order to strike fear into the hearts of criminals. As was the case for Superman, multiple film serials would follow. But the live-action television adaptation which premiered in 1966, and starred Adam West as Batman and Burt Ward as Robin, was loud, colourful, absurd, and campy – the one difference was that the comics were shockingly sincere in their lavish ludicrousness, but Batman – having been brought to the small screen by a cynical producer, William Dozier, who refused to take the material seriously – would furnish every sight gag or bit of convoluted exposition with a knowing wink. And in the show’s early years, the delicate balance between cotton-candy sights and sounds, and the mocking cynicism buried just beneath the surface, was maintained by the head writer, Lorenzo Semple, Jr. His departure, followed by the end of the famous cliffhangers that had come with airing twice a week (in its final season, the show was reduced to the standard once-weekly schedule), saw a decided decline in the show’s perfectly-honed quality, and thus its popularity. To put it more bluntly, the show went off the rails. Even the introduction of the Batgirl character, played by Yvonne Craig, could not forestall the inevitable, nor could a bizarre running plotline set in swinging London (described in the show as “Londinium”). Batman was unceremoniously cancelled in 1969. However, its stars would continue to portray the characters, primarily in animation, through to the end of the 1970s. In many ways this continued association with the Dynamic Duo was forced upon them by typecasting; West, the story went, had turned down the role of James Bond, and Ward had rather desperately sought the role of Benjamin Braddock in The Graduate, only to lose it to Dustin Hoffman.

Many shows had followed in the footsteps of Batman; formerly serious, if equally outrageous, programming like The Man From U.N.C.L.E. had been capsized by a shift from sincerity to camp. The producers of Star Trek, on the other hand, had made a conscious decision to avoid moving in that direction; this paid dividends when Batman was cancelled just as Star Trek began its ascent into becoming a legitimate pop cultural phenomenon – charmingly earnest and laden with warts-and-all sincerity – as the 1960s came to a close. [1] The new wave of optimism sweeping American culture as a result of the end of the overseas quagmire and the exhilaration of Moonshot Lunacy found a peculiar reflection in comic books, however. Just as the devastating conflict that was World War II had been mirrored by a Golden Age of fun and adventure on the page, the sunshine and roses of the early 1970s saw a counter-intuitive move to focus on the visceral and harsh realities of the seedy underbelly. Largely, though, this new “Bronze Age” which had emerged stood in contrast to the Silver Age which had just concluded. The children who had kept superhero comics alive were now growing up, and (as the Mini-Boom proved) were having children of their own. Television shows such as Mary Tyler Moore and Those Were the Days reflected a new paradigm: optimism and confidence for the future did not have to go hand-in-hand with willful ignorance or sheltering the vulnerable from the truth. This movement made for strange bedfellows when many pedagogical techniques, including those championed by Mr. Fred Rogers on his PBS series, took the same tack with childhood education.

The Bronze Age of Comic Books marked a shift in censorship policy – echoing that which had already taken place in American cinema, some years before. The code administered by the governing body of the comic book publishing industry, the Comics Code Authority, was continually revised in the early 1970s. [2] Gold Key Comics, effectively a satellite company of Desilu Productions by the mid-1970s, was not bound by the Comics Code, however, and did not seek to become so. Star Trek, the most popular comic not published by either DC or Marvel, was thus able to delve into adult themes in even greater detail than the television series had done, always keeping one step ahead of the ever-relaxing censorship restrictions which bound the larger companies. [3] The Bronze Age was, above all else, a backlash against the Silver Age which had preceded it (as many new periods tend to be). Again, Star Trek had played a part, albeit as one piece of the greater Moonie Loonie mosaic of the era. Genre fiction was being taken seriously by an ever-larger number of consumers, and superhero fiction fell under that umbrella. Retro nostalgia, counter-intuitive as it might have seemed, helped too: prior to Dr. Wertham, comic books had enjoyed darker plots, influenced by film noir of course, but also by the realities of the conflict that had framed much of the Golden Age. War, death, murder, and brutality had all been facts of life in the 1940s. The defanged “bad guys” of the Silver Age were a joke. A new, rising generation of writers who were willing to push the envelope was emerging, and they felt that serious issues deserved proper coverage, and that their audience, regardless of its composition, deserved proper respect. Themes which had been completely ignored in the Batman television series (always brushed aside for the sake of a laugh) became topics of serious, almost withering analysis in the new comics: the psychology of superheroism, the ethics of vigilantism, the allure of crime, and many others. Social issues also took on greater importance.

    The 1970s were obviously a decade of great strides for women’s rights, continuing trends which had begun with the Sexual Revolution of the late 1960s. “Liberated” female characters were demanded by women in each and every medium, with comic books being no exception. The most prominent female superhero, Wonder Woman, had been created during the Golden Age by psychologist William Moulton Marston, who had also been a pioneer in the invention of the lie detector, which explained one of the character’s most famous powers: the use of a lasso which could bind her opponents and compel them to speak the truth. Wonder Woman was given an Amazonian heritage, allowing writers to exploit Greek mythology in portraying her origins, characterization, and powers. Her abilities were plainly superhuman, though the character was briefly de-powered in the late 1960s to bring her more in-line with popular heroines of the time, such as Mrs. Emma Peel. Intense backlash, including from many women’s rights activists, saw her powers quickly reinstated. Wonder Woman entered the 1970s as the definitive superheroine, and one of the Big Three of DC Comics, alongside Superman and Batman. And naturally, with Superman having been brought to the small screen in the 1950s, and Batman having followed in the 1960s, the question of Wonder Woman following in their footsteps was a matter of “when”, not “if”. Technically, Wonder Woman had already made her small-screen debut in the iconic Superfriends cartoon, which had premiered in 1973. [4] This cartoon, very much in the mould of the “limited animation” popular in the era, carried on the Silver Age aesthetic even into the 1980s. Wonder Woman – like most of her stablemates – was far from unscathed by her presence in that program, with satirists mocking the infamous sequences of the character “flying” through the skies in her invisible airplane (as, unlike Superman, she could not fly under her own power). The following year, in 1974, a pilot movie was produced. Owing a great deal to retro nostalgia, the decision was made to avoid the modernization affecting the character in then-current comics, and instead to place the movie (and the show which would result therefrom) in a vintage, World War II setting. [5] The pilot movie saw Princess Diana of the mythical Paradise Island transport the fallen Maj. Steve Trevor of the USAAF back to the States; after hijinks ensued, she found herself permanently stationed with the USAAF as yeoman and secretary to Maj. Trevor, under her civilian identity as Diana Prince. As a superhero, however, she became known as Wonder Woman.

    The role of Wonder Woman was portrayed by Lynda Carter, an actress, singer, and model, who had been named Miss World USA in 1972. [6] Her physical attractiveness was matched by her enthusiasm and her willingness to perform her own stunts. Her earnest performance endeared her to fans and critics alike; the essential “powerful femininity” of Wonder Woman had always defined her character, and Carter worked tirelessly to channel that into her performance. Wonder Woman proved a reliable hit for ABC for the five seasons it aired, from 1974 to 1979, with a total of 133 episodes to its name. [7] The series finale, which aired (in the standard 1970s fashion) as a telefilm, depicted the conclusion of World War II, and posed the question of whether Wonder Woman would return to Paradise Island, or remain in the United States. Unsurprisingly, she chose to become an American, having fallen in love with the country to which she had immigrated, as so many generations had done before her. Diana Prince, in the meantime, accepted Steve Trevor’s offer (as he was no longer her superior, having been honourably discharged) to begin seeing him on a personal basis... though only after she revealed her secret identity to him. [8] To his credit, he responded as well as any man in his circumstances might have done, and even endorsed her desire to continue working as a professional, despite the overwhelming drive for most of her fellow women in the workforce to return to their past, domestic lives.

    Like DC, Marvel Comics saw one of its marquee properties succeed on television with an adaptation of The Incredible Hulk (the superlative adjective being something of a trademark of Marvel properties). Most of the Marvel properties developed from the 1960s onward, primarily by the writing tandem of Stan “The Man” Lee and Jack “King” Kirby (with an occasional assist from Steve Ditko, among others), allegorized specific societal ills of the era; the Hulk, for his part, represented the horrors of war. The character, a modernized take on the old Jekyll-and-Hyde story (with elements of Frankenstein, in modern science having created a monster), was the involuntary product of an unauthorized scientific experiment gone very wrong; the human behind it, Dr. Bruce Banner, was depicted as meek and withdrawn, and highly intellectual. This was, of course, to better contrast with the monosyllabic Hulk monstrosity. The Incredible Hulk was favoured for adaptation to live-action television because the Hulk was a lone wolf with no obligations to anyone (unlike Spider-Man, the Fantastic Four, or the X-Men), and this would allow him to walk the Earth, a premise that matched many popular action-adventure series: The Fugitive, The Way of the Warrior, and The Questor Tapes among them. The decision was made to have Dr. Bruce Banner played by a seemingly milquetoast actor, and the Incredible Hulk played by a bodybuilder; after an extensive search, two unknowns were cast in the respective parts. Ted Danson, who had up to that point appeared primarily in soap operas, was chosen as Dr. Bruce Banner. Though he was in reality a handsome man who did not physically suit the role of a timid academic, this was disguised with some well-employed costuming, in particular the use of large, horn-rimmed glasses. [9] For his alter-ego, the Hulk, an “actor” was chosen who did not resemble Danson, but this didn’t matter, and neither did the fact that he spoke little English. A six-time Mr. Olympia, the Austrian bodybuilder Arnold Schwarzenegger nonetheless had an undeniable screen charisma, and was very effective at playing a loutish, barbaric brute. [10] His “dialogue”, such as it was, was dubbed over by Ted “Lurch” Cassidy, a veteran at providing booming, contrabass voices. Despite this, both Danson and Schwarzenegger became iconic in their portrayals of the respective Jekyll and Hyde characters, Schwarzenegger in particular making his mark on popular culture far above and beyond what one would expect of a mere bodybuilder.

    Despite the great popularity on television of both Wonder Woman and The Incredible Hulk, the Alpha and the Omega was, and remained, Superman. Plans for a full-length motion picture (about the only format the character had not explored by the 1970s) had been discussed for many years. Independent producers Ilya and Alexander Salkind had secured the rights from DC Comics in 1973, with a laundry list of potential actors and directors for the project. Chosen to direct was Guy Hamilton, who had directed the iconic James Bond film Goldfinger, and who took an active role in every step of the production. [11]

    After an exhaustive talent search, a virtually unknown actor named Kirk Allen was chosen to play the Man of Steel. Classically handsome and athletic, with a boy-next-door-all-grown-up appearance, Allen’s only departure from the look of Superman was his light blond hair, which was corrected with a rather caustic – but effective – hair dye. Allen played Superman and his alter-ego, Clark Kent, very differently, often exaggerating the traits of each character in order to keep them separate. It was, perhaps, a somewhat blunt approach, but it was crudely effective. [12] Chosen to star opposite Allen as Superman (and Clark Kent)’s eternal love interest, Lois Lane, was Stockard Channing. Though she was older than Allen during principal photography (33 to his 29), she won the part thanks to her mature, urbane attractiveness and her singing ability (as the part called for Lois to perform an internal monologue as if it were a musical number). [13] Veteran actor Dustin Hoffman, a proven box-office draw, was selected to portray the primary villain, mad scientist Lex Luthor, and was given top billing – and the film’s largest paycheque – for doing so. [14] The other above-the-title star was the Golden Age icon, Jimmy Stewart, who portrayed Pa Kent, Clark’s adoptive father (a character who dies tragically at the end of the first act). The production team could not resist the opportunity to stunt-cast Ma Kent, choosing Donna Reed (Stewart’s one-time co-star in It’s A Wonderful Life, his personal favourite film) for the role. On-set lore had Stewart continuously flubbing his lines by referring to Reed’s character as “Mary” instead of “Martha”. [15] The film was well-received critically; the score, special effects, and simple but well-told story were all highly praised. The earnest, if somewhat clumsy performance by Allen was given good marks, though most reviewers agreed that Channing, Hoffman, and Stewart all stole the show. Stewart would surprisingly receive an Academy Award nomination for Best Supporting Actor for his performance in Superman; ironically, his role was longer than the fifteen-minute turn by Steve McQueen that won for Lead Actor at that year’s ceremonies. However, and most importantly, Superman proved a box-office hit, grossing over $150 million that year, coming in a close second to Greased Lightning, and guaranteeing a sequel to continue the story. [16] It was a triumphant return to the peak of mainstream popularity and relevance for Superman, and for the world of superhero comics as a whole. The Man of Steel, who was “faster than a speeding bullet, more powerful than a locomotive, able to leap tall buildings in a single bound”, and who stood for “truth, justice, and the American Way”, had once again captured the hearts and minds of audiences everywhere.

    ---

    [1] IOTL, of course, the final season of Star Trek, which I have so affectionately described on multiple past occasions as the “Turd Season”, did dive headlong into camp under the auspices of the new showrunner, Fred Freiberger, and in particular his grossly unqualified story editor, Arthur Singer.

    [2] A single, cataclysmic event (a request by the Department of Health, Education, and Welfare to run an anti-drug storyline in the pages of The Amazing Spider-Man in the early 1970s) resulted in the chain reaction that sent the Code down the long road to irrelevance. However, ITTL, the Nixon Administration does not exist, and therefore that request is never made (the War on Drugs isn’t exactly high on the list of priorities for the Great Society). This allows the CCA to adapt further, and continue to exist for the longer-term, just like the MPAA had done a few years before with the switch from Hays to the ratings system.

    [3] Gold Key did not adhere to the Code IOTL, either.

    [4] Superfriends premiered in 1973 IOTL as well. It was, in fact, produced by Hanna-Barbera, one of the two pillar studios of limited animation.

    [5] The original pilot movie took a different tack IOTL, instead attempting to adapt then-current storylines (which had controversially modernized Wonder Woman), to lukewarm response. After retooling, a second pilot movie was released which much more strongly resembled the show which was to come.

    [6] Yes, I’ve cast Carter as Wonder Woman ITTL. What about the butterflies? She wasn’t cast for the original OTL pilot movie! It starred Cathy Lee Crosby instead.

    [7] The complete adventures of Wonder Woman lasted for three seasons and fewer than 60 episodes IOTL (from 1976 to 1979).

    [8] Though Maj. Trevor (played by Lyle Waggoner, of all people) was intended as the love interest, he and Diana did not hook up IOTL.

    [9] IOTL, the role of Dr. David Banner (the name was changed from Bruce because the alliteration seemed to overtly betray its comic-book origins) was played by the established, and older, actor, Bill Bixby. Danson, of course, would go on to become known for appearing in Cheers, as the former relief pitcher of the Boston Red Sox, Samuel “Mayday” Malone. Don’t believe he could pull off the “nerdy” look? I submit to you his appearance in the 1981 film, Body Heat.

    [10] The role was won by Lou Ferrigno IOTL, who had appeared alongside Schwarzenegger in the 1975 documentary film Pumping Iron, chronicling a Mr. Olympia contest. Schwarzenegger, for his part, did not achieve success in mainstream film or television at all during the 1970s, going on to win his seventh and final Mr. Olympia title in 1980. His career from that point forward was unfortunately somewhat obscure, and cannot be reliably determined.

    [11] IOTL, Hamilton was chosen to direct, but was forced to drop out due to his tax exile status in the United Kingdom, where filming was moved on account of Marlon Brando facing an obscenity charge (for Last Tango in Paris) in Italy, the originally planned shooting location. Fortunately, Brando is not involved in this film at all ITTL, and Superman is shot largely at the famed Cinecittà Studios.

    [12] Allen is an original character - the first to be introduced so far for this timeline, but not the last!

    [13] Channing auditioned for the role of Lois IOTL, losing it to Margot Kidder, who has proven a rather contentious choice. Channing then went on to appear in Grease, as Rizzo, playing a high schooler at, yes, the age of 33.

    [14] Gene Hackman played Luthor IOTL, receiving second billing behind Brando.

    [15] Stewart plays Pa Kent instead of Glenn Ford ITTL, taking a much more modest paycheque than Marlon Brando did IOTL for Jor-El (who is accordingly played by a nobody). The chance to stunt-cast Reed (who, like Stewart and Hoffman, is an Oscar-winner) proved irresistible, especially once Stewart recommended her for the part. Believe it or not, by the late 1970s IOTL, It’s A Wonderful Life, though rising in popularity, was not yet the perennial Christmas classic it would become in later years; and of course, ITTL, the 1974 clerical error that allowed it to fall out of copyright did not happen.

    [16] Superman and Superman II were filmed together IOTL, but for administrative reasons, that was not the case ITTL.

    ---

    Thanks to e of pi for his assistance in the editing of this update! Speaking of which, this is the first of a double-barrelled update for the long weekend; his guest interlude should be ready tomorrow, just in time to close out the month. In fact, as I write this, I’m also asking him if he’s sure he’ll have it ready, and his response is most promising.
     
    1979-80: Evening in America
  • Evening in America (1979-80)

    December 31, 1979

    The good people at Desilu Productions worked hard, so it stood to reason that they knew how to play hard as well, and so they did. Everyone in the employ of the studio, and all of their families, had been invited to the massive New Year’s Eve bash, which spanned the entire lot. Thousands had turned out, a marked contrast to the eerie quiet on the other side of the wall, at Paramount. Those in charge over there were in no mood to celebrate. But the atmosphere was jovial at what was once known as the House that Paladin Built, with Lucille Ball holding court over her well-run empire. The 1970s had been very kind to her, despite some ups and downs along the way. Over the course of that decade, virtually her entire staff had turned over, except for those alongside her at the very top – her husband, Gary Morton, and her right-hand man, Herbert F. Solow. They were close at hand, along with many friends, old and new, and employees, current and former, going all the way back to the studio’s first heyday in the 1950s.

    For the first time since the I Love Lucy 25th Anniversary Special in 1976, Ball and her ex-husband, Desi Arnaz, were appearing together at a public event, having reconciled after becoming estranged due to the publication of Arnaz’s dirty-laundry-airing, tell-all autobiography, A Book, that same year. Arnaz was accompanied by his second wife, Edith Mack Hirsch, an old family friend; to everyone’s credit, the night went more than smoothly, not least of all because the grandchildren were present. Desi Arnaz, Jr. (properly Desiderio Alberto Arnaz y Ball IV) and his wife, Patty Duke, had brought their children with them to the party: son Desi V (Desiderio Alberto Arnaz y Duke V) and daughter Lucille Patricia Arnaz, named for her grandmother, aunt, and mother, and known by her nickname of “Lulu”. [1] Among the other children present was Amber Lucas, daughter of George and Marcia; she spent much of the night being chased around rather persistently by Eugene Wesley Roddenberry, Jr., or “Rod”, the son of Gene and Majel Roddenberry. “Just like his father”, multiple observers were overheard to remark. However, Amber wasn’t the only Lucas who felt rather like a deer caught in the headlights.

    “I want to thank you for inviting all of us tonight, Lucy,” Marcia said. “I can’t tell you how much your support has meant to me and George. You’ve been a real rock for us.”

    “Well, you know I don’t like to get involved in politics,” Ball said. “So I have to judge you on your individual merits, and you’ve been nothing but a treasure to me and this studio. Just because you’re having a little squabble with ol’ Charlie Bluhdorn next door doesn’t mean it has anything to do with me. I like you anyway!”

    Marcia grinned; only Lucy could refer to her industry-shaking multi-million-dollar lawsuit as a “little squabble”.

    “Besides,” Ball continued, “there’s so much to celebrate. Have you seen that issue of Variety, Marcie?” As she posed the question, she brandished a copy of the trade paper in question, opening it and flipping through the pages before coming to a sudden stop. “The 1970s in Review. They threw together a very flattering write-up about this studio.”

    Marcia chuckled. “You’ve showed it to me a dozen times before, Lucy,” she said. “I think it’s real great, and I’m proud to have some small part in it.”

    As she said this, her husband George ambled up to them. “Someday I’d love to have the chance to make Lucasfilm as big and successful a studio as Desilu,” he said.

    Ball smiled indulgently; she’d heard George say this more often than she herself had told Marcia about the Variety article. “Anything’s possible, Georgie,” she said, obligingly.

    Her right-hand man, Herb Solow, came to join them. “You remember, Lucy? You were in Variety at the end of the Sixties, too.” He produced the paper in question, having retrieved a copy from his office; he had stored it in the drawer underneath the spot on his desk upon which the original three-foot model of the USS Enterprise rested. [2]

    “God, was that really ten years ago?” she asked in reply, staring dazedly at the article in question. “We see no reason that she won’t continue to be as firm a fixture in the coming decade as she has been in the last two,” she read. “Lazy journalists! They reused the exact same line in this version. They just swapped three for two.”

    “I’d say it feels like ten years,” Marcia remarked. “A lotta things were real different back then. I mean, back in sixty-nine, me and George had only just got married.”

    “I don’t know if it’s really felt like that long,” said Solow. “Seems like just last year. Maybe sixteen months ago.”

    Ball guffawed at this. “Yeah, back when you were producing four shows in a single season. This decade’s been a breeze for you compared to that.”

    “Says the woman who doesn’t have to deal with Fred Silverman on a daily basis,” he shot back, though good-naturedly. Ball immediately burst into laughter, and was soon overcome by shortness of breath, letting out a hacking cough. Marcia offered Ball her drink, which she gulped down, taking a deep breath before lighting up a cigarette.

    After taking a long drag, she said, “You got me there, Herbie,” as if nothing had happened.

    Suddenly, Amber Lucas dashed out of the crowd, quickly hiding behind her mother’s legs from the pursuing Rod Roddenberry, who nearly knocked poor Ball over.

    “Kids, kids!” she cried. “Careful, or Auntie Lucy’s going to drop her cigarette.” She laughed again at this. “Ah, sweet bird of youth. Tell me, Herbie, were we ever that young?”

    Her right-hand man grimaced; he got that question a lot, with the frequency only rising as the years continued to pass.

    ---

    Smiley Face.jpg

    “Censorface” – the heavily-derided logo for The George Carlin Show.

    Desilu Productions, much like Lucille Ball herself, was a veritable Gibraltar in the trying economic times that marked the close of the 1970s. And the studio did it by keeping their audiences laughing… for the most part. Eunice premiered in September, 1979, surprising audiences with its melodramatic overtones, despite the broad and largely comedic characterizations: Carol Burnett, as the perpetual loser and “good daughter”, Eunice Harper Higgins; Vicki Lawrence, as the bitter and sarcastic old crone, “Mama” Thelma Harper; and Betty White, as the self-absorbed housewife, Ellen Harper Jackson, whose character bore a passing resemblance to her previous role of Sue Ann Nivens on The Mary Tyler Moore Show. Roddy McDowall, the fourth pillar of the cast, functioned as the “straight man”; his character, Philip, achieved great success outside of the small town in which all the characters lived, but was compelled to return largely by the guilt he felt for having “abandoned” his family. His bitter and vindictive relatives begrudged him every ounce of the fame and fortune he earned through his career as a novelist. Eunice was in many ways a “throwback” when compared to other new series from that season, reminiscent in style and tone of the bleaker, more muted and hyper-realistic sitcoms of the earlier 1970s, as opposed to the higher-concept, loud and colourful oeuvre on the rest of the Desilu roster, all of which continued to perform well as the “dinosaurs” of the past decade went into rapid decline. This was to the credit, in very large part, of the excellent cast, whose variety-show heritage had trained them well to go from the sublime to the ridiculous and back again, each and every week.

    One of those “dinosaurs” was Captain Miller, a show whose name was synonymous with quality writing and sterling performances, and which was considered by far the most realistic of the many cop shows on the air, despite being a quirky sitcom. It could not recover from the death of the beloved Jack Soo, who proved as crucial a cog in the well-oiled machine that was the show’s cast as his character, Sgt. Nick Yamamoto, had been at the 12th Precinct of the NYPD. A touching and thoughtful tribute to the actor had marked the premiere of the previous season, as he had died shortly after the show had gone into its summer hiatus, and though ratings for that episode had been very strong, they had been on a steady decline ever since. [3] The show had managed to cling to the very bottom rungs of the Top 30 even in the 1979-80 season solely by virtue of being part of an exceptionally strong lineup; it was not the first time that a favourable timeslot would prove so beneficial for an otherwise struggling program.

    Soap aired immediately after Captain Miller, and it showed no signs of slowing or fading from the headlines despite entering a third season. Jessica Tate, the matriarch of her household, had separated from her husband, as both had been carrying on affairs during their marriage; the writers made the decision to ratchet up the tension even further, when Jessica began a relationship with her butler, Benson. Everyone involved delighted in parodying the Guess Who’s Coming To Dinner paradigm, and twisting the traditional morals of such “message” movies. Indeed, Robert Guillaume made it a point to play Benson as unlike a Poitier character as he possibly could. “Dignity does not make people laugh,” as he explained in a contemporary interview. [4] The potential for controversy – the pairing crossed racial and class boundaries, and as Jessica was still married, was technically adulterous – was intense, and indeed in any season other than 1979-80, it might have emerged as the top story in the entertainment press. Even so, it was still one of the most talked-about plotlines of any show on television; Baba Wawa frequently discussed the topic on The Today Show, despite that program being on a different network than Soap was. It may have had something to do with the Jessica/Benson romance having been partly inspired by the real-life liaison between Wawa herself, and Massachusetts Sen. Edward Brooke. It had done nothing but good things for his career, as he was comfortably re-elected for a third term in 1978 despite it being a bad year overall for his Republican Party. [5] Wawa, on the other hand, continued to languish away on a morning show, holding out in vain for when John Chancellor finally retired.

    The sustained popularity of Soap was reflective of the genre that it parodied having finally reached the mainstream – which is to say, primetime. It was a natural outgrowth of the popularity of another format, the miniseries, in the late-1970s; many of these were, in and of themselves, highly melodramatic in presentation. Sumptuous romances, a genre which featured frequently in the miniseries format, were also long-standing, wildly-successful pieces of Americana. Gone with the Wind had been a wartime romance, and so had Casablanca. Peyton Place had been a smash novel, and then a smash film, before becoming a smash series. History was about to repeat itself.

    Texas Tea, a miniseries airing in the late spring of 1978, was a mishmash of styles. Set in the Lone Star State, it was evocative of the western and frontier programs which had defined television for the past thirty years, and motion pictures for the last half-century, though in a bizarre fusion with the creature comforts of the suburbs. Houston, the city in which the show was set, had become a thriving and prosperous coastal city akin to those (much older) metropolitan areas all along the eastern seaboard. [6] Texas Tea chronicled the lives of the Walsh family, in particular the trio of brothers who were sons of the family patriarch, oil tycoon Thomas R. Walsh, Sr. [7] His eldest son, T.R., quickly emerged as the key protagonist, however. It was the casting for T.R. Walsh which proved revelatory: chosen for the role of the cunning, unscrupulous scion of the plutocratic dynasty was Larry Hagman, formerly known as Tony Nelson, male lead of the frothy, fantastical sitcom I Dream of Jeannie, one-time occupant of the fabled NBC Monday night lineup in the late-1960s. He was actually the second astronaut-made-good from that show, following Bill Daily (who played second-banana Roger Healey) and his part on The Bob Newhart Show (which had ended the very same year that the Texas Tea miniseries had premiered). In converting to a regular series, the “Tea” was dropped from the title, and the show came to be known as simply Texas. Naming the show for the city in which it was set (Houston) was deemed insufficiently sweeping and romantic. After all, the city (as was the case for Hagman’s previous sitcom, which was set in Cape Kennedy, Florida) was by this time known primarily for its connection to the space program – during the height of Moonshot Lunacy a few years before, tee-shirts and bumper stickers addressing “Houston”, the nerve centre of NASA, had been positively ubiquitous.

    But Texas wasn’t the only show to redefine a setting. As far as skewed interpretations of westerns went, as always, Gene “Wagon Train to the Stars” Roddenberry was the reigning champ. Roddenberry, in seeking a blueprint for his series, decided to build off his original work for Star Trek. The complex political situation of the Federation had primarily been the doing of Gene L. Coon, when he joined the show in the middle of the first season; prior to that, the Enterprise had been depicted largely as a frontier ship, remote and isolated from any organized society. Thus would be the case for his space station – it would be way off the beaten trail, on the farthest spur of the most erratic trade route imaginable. The space station which would function as the primary setting of the show was given the twee name “Eagle’s Nest Station”, and was to orbit a marginally habitable planet of mostly scrubland (allowing for the use of the ubiquitous Vasquez Rocks Park in the Sierra Pelona, where a distinctive formation had already become internationally known as “Kirk’s Rock” from its many appearances in Star Trek). Neither the planet, nor the red dwarf star it orbited, was distinctive enough for a proper name, and they were often described as simply “the planet” and “the sun”. The star was located in the “Eagle Cluster”, several thousand light-years from the core of “the Systems Commonwealth”. [8] Brandon Tartikoff, who had taken an active interest in the show’s development from the very beginning, encouraged a vibrant alien cast. Tartikoff also chose the eventual title for the series: Deep Space [9] (Eagle’s Nest was flatly rejected, as focus groups had expected a show about anthropomorphic birds). The pilot movie aired in February, 1980, and was a solid success; the timing was impeccable, as a hunger for more space-based science fiction was emerging (Galactica would end its five-season run that May, with the Colonial Fleet finally arriving at Earth). Ratings were good and the show was picked up for a full-season order starting in September. Tartikoff had floated the idea of tying Deep Space to Star Trek with his superior, Herb Solow, but this was flatly rejected. “Star Trek is those characters, those ships,” Solow noted in a memo. “And I’m pretty sure you can’t do any star trekking on a station orbiting a planet.” Tartikoff disagreed, but he had to yield to the power structure in place at Desilu.

    In stark contrast to the strict hierarchy at that studio, production on The Richard Pryor Show was as haphazard and slipshod as was possible for a weekly primetime series. It was only the show’s bravura ratings which kept it afloat; network executives would forgive a great deal if it translated into advertising dollars. And Richard Pryor was a solid hit for NBC in what was otherwise a relatively lean period for them. But the show itself was a mess, plain and simple. If its spiritual predecessor, Laugh-In, had perfected the illusion of an anarchic ruckus passing itself off as a variety show, Richard Pryor had made it a reality. This was even explicitly referenced on the show itself, whenever one of that older program’s cast members guested. However, it was the off-set antics of Pryor and Robin Williams that drew media attention and made them tabloid fixtures, including their infamous partying at the notorious Medina nightclub, off the Beverly Hills Freeway in Westside Los Angeles, where liquor, drugs, and prostitutes were never in short supply. [10]

    If Richard Pryor could be described as Laugh-In for a new decade, the inevitable rush of imitators that followed in its wake naturally included a Turn-On. George Carlin, who, like Pryor, was a drug-fueled provocateur comedian (and who took to profanity like few before, or since), was offered the chance to host his own show in a rather ill-advised move by the desperate CBS. [11] Carlin, however, was less apt to “play nice” than Pryor, and insisted that he be allowed to deliver his stand-up routines intact. Carlin was one of those comedians who viewed his profession as important, prone to postulating on the “meaning” and “purpose” of comedy in a societal context (and seldom hitting on the obvious answer: making people laugh). A stumbling block was that his signature routine was entitled “Seven Words You Can Never Say On Television” – which spoke for itself. The compromise, such as it was, would entail Carlin being allowed to say whatever he liked, though Broadcast Standards and Practices would of course censor any offending words with the traditional “bleep”, as well as a smiley-face superimposed over his mouth. This smiley-face would become the logo of The George Carlin Show, in one of the more… curious creative decisions behind the show’s production. [12] Carlin did not participate in any of the sketches, leaving that to his cast – which did not include a single potential breakout star in the Williams mould. He would deliver several monologues throughout the show – more so than Pryor, who generally stuck to an intro and an outro.

    The premiere episode of Carlin – cobbled together from the original pilot and subsequently-taped episodes – aired on Monday, March 24, 1980 (after months of delays). CBS heavily promoted the series, and as a result, ratings were fairly solid; and critics and audiences alike found it funny – hilarious, in fact – but for all the wrong reasons. Carlin appeared in three monologue segments throughout the half-hour program, and did not utter a single complete sentence without being interrupted, often multiple times, by his smiley-face logo. Carlin himself (who would later claim to have been “baked out of my mind” while filming all of his routines) strongly implied that he deliberately went over the top (even by his standards) in order to demonstrate the absurdity of censorship regulations; his little experiment, to put it bluntly, went horribly right. [13] Several network affiliates did not return to the show after its first commercial break – this on top of over a dozen that had refused to air the show in the first place. Ratings were good enough for a second episode, and five more had already been taped, but overwhelmingly negative reaction (including to the sketches, which were deemed mediocre and forgettable), coupled with righteous indignation from watchdog groups and the FCC, ensured that Carlin would be a one-and-done affair. The remaining episodes would eventually air, but not on network television. Ardent Carlin fans had recorded the episode during its one and only airing, often making drinking games out of how many times “Censorface”, as the logo came to be known, would bleep his monologue. (The question of how to count the myriad instances of Censorface proved problematic, as Carlin would often utter several verboten words in a row, all covered up by a single bleep). Carlin proved just the latest in a string of variety shows to crash and burn, and in retrospect came to be regarded as the straw that broke the camel’s back. “Variety shows just can’t work in the 1980s,” concluded TV Guide at the end of the season, “unless the stars are covered in fur”. This referred, of course, to the two exceptions, Pryor and The Muppet Show. [14] That said, Jim Henson was beginning to tire of the format himself, and hoped to transition to more ambitious projects on the big screen. With Eunice and Deep Space on the table on top of the two other established Desilu hits, it seemed that the time was right to allow The Muppet Show to take its bow. Ball, though reluctant to see it come to an end (surprisingly enough, considering her initial misgivings about the show), agreed that the coming 1980-81 season would be the show’s last. [15] Fortunately for Ball, both Rock Around the Clock and Three’s Company showed no signs of slowing down, remaining firmly ensconced in the Top 10.

    For the second consecutive season, Pryor was the #1 show on the air, though in absolute terms, ratings had declined from the previous year. As had been the case a decade before, the singular variety-show smash had bolstered NBC and allowed it to punch above the weight of the rest of its schedule. The Peacock Network had broad but shallow viewership support, eking out a respectable nine slots in the Top 30, though only one of these, Pryor, had ranked in the Top 10. CBS tumbled even further from their already dangerously low vantage point, with just four shows in the Top 30; like NBC, they managed just one finish in the Top 10, with the newsmagazine program 60 Minutes proving their last bastion of relevance. Even the once-reliable Rhoda was fading fast. This left ABC with seventeen of the Top 30 shows, and a whopping eight of the Top 10, at this, the zenith of the Alphabet Network’s popularity. Their failure to secure the top-rated show on television was the one feather missing from their cap. [16]

    At the Emmy Awards that autumn, Soap won Outstanding Comedy Series over Captain Miller, Taxi Drivers, WMTM, and Three’s Company. It also repeated for Lead Actress and Supporting Actor, allowing Katherine Helmond and Robert Guillaume to collect their second Emmys in a row. “We’re a pair, and now so are our Emmys,” Helmond joked backstage. The two obligingly shared a (chaste) kiss for the cameras, to top their famous embrace from the previous ceremony. “But if we win again next year, we’re not going any further than that,” said Guillaume. Judd Hirsch won Outstanding Lead Actor in a Comedy Series for Taxi Drivers; there was some controversy over Guillaume not having been nominated for Lead himself. Outstanding Variety Series went to The Richard Pryor Show, naturally; however, Robin Williams did not repeat his win for Individual Performance.

    Finally, Texas won Outstanding Dramatic Series, and Hagman won Lead Actor for his role as T.R. Walsh; in both cases, there were… extenuating (and topical) circumstances. [17] The season finale of Texas had ended on a shocking cliffhanger a few months prior. Someone had shot T.R., though who that might have been was left a tantalizing mystery. T.R., being a megalomaniacal villain in the finest moustache-twirling tradition, had left a trail of enemies in his wake, any one of whom might have had the means, motive, and opportunity to pull the trigger. The legendary question, “Who Shot T.R.?”, was one which would come to define the first year of the new decade, and not just domestically, but abroad as well (as the show had become a major international sensation). Naturally, wags would consistently answer that question with “Schrank” (the man who shot Theodore Roosevelt – also known as TR – during the 1912 Presidential campaign); this joke, which quickly grew tiresome the more frequently it was heard, was credited to the most literate and highbrow of the late night talk show hosts, Dick Cavett. [18] (Cavett was not known as a jokester, and this episode helped to demonstrate why that was the case.) It was all part of the long buildup on the way to finding out the answer, which devotees both old and new would have to endure, with no end in sight…

    ---

    [1] Duke married Arnaz instead of John Astin ITTL, and consequently little Desi V and Lulu were born instead of Sean and Mackenzie Astin. (Technically, Sean was the biological son of Michael Tell, Duke’s previous spouse, though he was adopted by Astin and has always regarded him as his “real” father – but Desi V is the son of Desi IV.)

    [2] The three-foot Enterprise model was the first to be completed, construction having started on November 8, 1964, based on the plans created by Matt Jefferies and approved by Roddenberry and Solow. The completed model was delivered to Roddenberry on December 14, 1964, though it (like the subsequent eleven-foot model) was extensively refurbished over time. IOTL, Paramount held onto the model after the show had been cancelled (while sending the eleven-footer off to the Smithsonian) until May of 1975, at which point it was given to Roddenberry (who had commenced pre-production on Phase II). Apparently, he then lent the model out sometime in the late 1970s, but had forgotten to whom he had lent it. It has been missing ever since. ITTL, on the other hand, Solow is given the original Enterprise model in 1971, as a gift for his involvement in the production of Star Trek from start to finish. It therefore occupies a spot on his desk, in similar fashion to this OTL photo (with Roddenberry).

    [3] Soo died on January 11, 1979, living long enough to continue making appearances into the 1978-79 season of Barney Miller IOTL. However, he died on April 11, 1978 ITTL, nine months earlier. (In both cases, he died of esophageal cancer, resulting in his famous deathbed quip: “it must have been the coffee”).

    [4] Guillaume has spoken of not wanting to play Benson with “dignity” IOTL; the quote is a paraphrase of one which you can find in this video interview.

    [5] Brooke was one of several Rockefeller Republicans to lose his seat in the 1978 midterms IOTL, despite 1978 being a very good year for the GOP in general; ITTL, the opposite happens, and only the Rockefeller Republicans do well for the most part (including Brooke, Sen. Case in New Jersey, and newly-elected Rep. Green in Manhattan, among others).


    [6] The city in which the action took place IOTL was, of course, the eponymous Dallas.

    [7] Thomas R. Walsh, Sr., was known IOTL as John Ross “Jock” Ewing, Sr. Just as ITTL, his son was named for him.

    [8] Roddenberry would use this name for his planned Andromeda series, which was developed and produced after his death IOTL.

    [9] As previously noted, Tartikoff was involved in the development of an OTL spinoff of Star Trek which has a very similar name and premise.

    [10] Medina is a fictional nightclub; given that we’re over a decade from the POD, along with the ephemeral nature of trendiness in late-night hotspots, the OTL haunts of the late-1970s would likely not achieve popularity ITTL. Medina is so named by analogy to Mecca; the slogan “Pilgrims go to Mecca, partygoers come to Medina” is frequently heard in the Los Angeles clubbing scene of the time. (Devout Muslims naturally aren’t thrilled by the comparison, but then, they wouldn’t be likely to visit that Medina anyway). Also, you may note that Medina is located off a highway extension which was never built IOTL (Houston I. Flournoy may be a moderate, but he’s no Jerry Brown).

    [11] No, this never happened IOTL. Even with the comparatively looser content restrictions on network television at this stage ITTL, it’s frankly a ludicrous proposition.

    [12] Yes, Carlin would have hated being associated with a smiley-face. You might say the joke was on him :) (Sometimes lucidity has its advantages.)

    [13] Carlin was also under the influence while performing his monologues as he hosted the first-ever episode of Saturday Night Live in 1975, IOTL.

    [14] Recall that Robin Williams is an extremely hirsute individual.

    [15] The Muppet Show ended in 1981 IOTL, as well, for many of the same reasons.

    [16] IOTL, in the 1979-80 season, ABC had fifteen shows in the Top 30, though only two in the Top 10 (though one of them had been the top-rated Three’s Company); CBS had eleven shows in the Top 30, but a truly impressive eight of these cleared the Top 10; and NBC brought up the rear with just four shows in the Top 30, none of which cleared the Top 10. Believe it or not, CBS is still doing slightly better ITTL than NBC did IOTL; this is despite Silverman (who worked wonders at both CBS and ABC) having been in charge at the Peacock Network since 1978. In fact, Silverman was responsible for two (thankfully butterflied) flops by this time: Supertrain and Pink Lady (and Jeff).

    [17] Taxi won for Outstanding Comedy Series IOTL, with Cathryn Damon winning for Lead Actress rather than Helmond (who was also nominated). Her TV husband, Richard Mulligan, won for Lead Actor. Supporting Actor went to Harry Morgan for M*A*S*H, a show which I remind all of you does not exist ITTL. Outstanding Variety or Music Program (as opposed to Series) went to Baryshnikov on Broadway; Outstanding Drama Series went to Lou Grant, with Ed Asner taking home the Lead Actor Emmy for playing the eponymous character. Also worth noting is that, IOTL, the Emmy Awards ceremony took place during an SAG strike; all but one (Powers Boothe) of the nominated actors boycotted the ceremony as a result. ITTL, the circumstances leading up to the strike are… considerably altered, as you will soon discover.

    [18] It may not surprise you to learn that Cavett was no longer appearing on private network television by this point IOTL, having sought asylum at PBS.

    ---

    And there we have our Pink Lady of TTL! The show that stands up and tells the world: “Variety is dead!” The sad reality is that the genre simply cannot cope with the changing technology that becomes predominant in the 1980s, at least in the United States, in any timeline with as late a POD as mine. As long as the average American home has more than one television, and as long as each television has more than three or four channels available, variety television is doomed to become superfluous.

    That said, thank you all once again for your patience and understanding! I welcome you all to the 1979-80 cycle! Here’s hoping that May will be flowering with updates :D


    Dieu et Mon Droit
  • Dieu Et Mon Droit

    British Royal Family, 1968.jpeg
    The British Royal Family in 1968.
    Clockwise from upper left: HRH Princess Anne; HRH The Prince of Wales; HRH Prince Andrew; HM The Queen; HRH Prince Edward; HRH The Duke of Edinburgh.


    At her birth in 1926, the baby girl then known as Her Royal Highness Princess Elizabeth of York did not seem terribly likely to ascend to the throne of the British Empire, which was held at the time by her grandfather, George V. The Heir Apparent was his eldest son, and her uncle, Edward, the Prince of Wales. Granted, he was already over 30 by this time, still unmarried, and notorious for his womanizing ways. But surely he would be bound to settle down eventually; and even if he didn’t, the second-in-line, Elizabeth’s father Prince Albert, the Duke of York, could easily have a son, who would displace her in the line of succession. But, in keeping with the purported wishes of George V himself, nothing ever came between “Lilibet” and the throne. For upon his accession as Edward VIII in 1936, the former Prince of Wales scandalized British society by announcing his intention to marry a 40-year-old divorcée, Mrs Wallis Simpson. Parliament was incensed, and the government of the day (the National Government, led by Stanley Baldwin) threatened to resign over the issue, which would have obliterated the carefully-groomed appearance of neutrality in political matters that the monarchy had maintained for the past century. Such an action would have shaken the moral foundation of the United Kingdom to its very core, at a time of rising tensions and uncertainties throughout Europe, and indeed, the wider world. PM Baldwin led the charge in compelling him to choose between his lady love and his throne – and so he did, abdicating at the end of the first year of his reign. His younger brother Prince Albert became George VI, despite his own strong reservations about assuming the role; Princess Elizabeth then became the Heiress Presumptive. With the steadfast support of his wife (and her mother), Queen Elizabeth, the King led the country through the horrors of World War II, forever endearing himself to his subjects, and proving a tremendous success in restoring the dignity of the monarchy. (Meanwhile, his elder brother – who had been created the Duke of Windsor shortly after his abdication – was discovered to be an admirer of fascism and in particular Nazi Germany, leading the King to send him overseas to the Bahamas for the duration of the conflict.)


    Her Royal Highness The Princess Elizabeth, as she became known upon the accession of her father, was first-in-line to the throne, and it seemed increasingly unlikely that her parents would have any sons to displace her. Her only sibling – a sister, Princess Margaret – had been born in 1930. Given her gender, serving in a combat role during the War was not an option for Princess Elizabeth, even notwithstanding her tender age; however, like many young women, she served on the home front, and also proved (like the rest of the Royal Family) a bulwark for the people. Although London was devastated by German blitz bombings, neither Princess Elizabeth nor her sister intended to leave for the safety of Canada without their mother the Queen, who in turn would not leave without the King, who would simply never leave. It was the future husband of the Princess, Prince Philip of Greece and Denmark, with whom at this time she was already well-acquainted, who served with distinction in the Royal Navy. The two married soon after the war ended, on November 20, 1947, by which time he had renounced his foreign titles, and was instead created the Duke of Edinburgh and given the style of His Royal Highness by his father-in-law, George VI. The bride and groom were second cousins once removed, through their common ancestor, Christian IX of Denmark, “the father-in-law of Europe”. They were also third cousins through Queen Victoria, “the grandmother of Europe”. Intermarriage between such relatives had been the standard within European monarchies for many centuries; Elizabeth’s parents had been a rare exception, her father marrying the daughter of an aristocrat (though a powerful Scottish Earl) as opposed to a foreign princess. Upon her marriage, she became formally styled Her Royal Highness The Princess Elizabeth, Duchess of Edinburgh, which she remained for the rest of her father’s reign, during which time she had two children: a son, Charles, in 1948; and a daughter, Anne, in 1950. As female-line grandchildren of the Sovereign, they would not ordinarily be entitled to be called Prince or Princess, nor to the style of Royal Highness; however, George VI decided to authorize their use by letters patent, as it was plainly evident that any children born to the Duke and Duchess of Edinburgh would eventually become Princes and Princesses of the United Kingdom. (This action also had precedent, as the King’s grandfather, Edward VII, had granted a similar privilege to the daughters of his own daughter, Princess Louise). The Duke and Duchess of Edinburgh toured the British Commonwealth extensively on behalf of the King, whose health – badly shaken by the strains of the War – was beginning to fail. He would not survive his daughter’s marriage by even five years, dying in early 1952. Per ancient custom, his daughter – who was in Kenya with her husband at the time – immediately succeeded him as the British Sovereign.

    Though simply known as Her Majesty The Queen, her full list of titles and styles was simply enormous; she had taken the regnal name of Elizabeth II, becoming the first to reign by that name since the previous Queen Elizabeth, who had acceded to the English throne nearly four centuries earlier (the other half of the Union, Scotland, had never known a Queen regnant by that name). The years which marked the first quarter-century of her reign were (naturally) profoundly eventful ones, though obviously not quite so palpable and immediate as those of World War II. With regards to her personal affairs, she created her husband, the Duke of Edinburgh, a British Prince in 1957; the following year, her son, Prince Charles, was created the Prince of Wales, the customary title awarded to the Heir Apparent, at the age of nine. The Queen then had two more sons: Prince Andrew, in 1960, and Prince Edward, in 1964. Her only daughter, Princess Anne, became the first of the children to marry, though not without controversy, when she wed Major Andrew Parker-Bowles in 1973, after a lengthy engagement (as the two had been dating since 1970). [1] He was more than a decade her senior, and (far more importantly) Catholic – in fact, he was a descendant on his mother’s side of a notable family which had been recusant from the Protestant Church of England for centuries. On his father’s side, however, he was the descendant of the aristocratic Parkers of Macclesfield. The engagement between Princess Anne and Major Parker-Bowles became the subject of great debate due to the Act of Settlement 1701, which removed anyone who converted to or married a member of the Roman Catholic Church from the line of succession.

    At the time of her marriage, Princess Anne had been fourth-in-line to the throne, behind her three brothers (including the two who were younger than her). In some corners, the ancient and discriminatory law which would remove her from the succession had been deemed severely outdated, not least of all by the substantial Catholic population in the United Kingdom. Indeed, by this time, the Queen reigned over her many realms individually, and the laws of succession applied separately to each of them; the Act of Settlement was certainly no more popular in many of these. This was an early impetus for closer diplomatic and economic ties between her Commonwealth realms, thanks in large part to the showing by Canadian Prime Minister Robert Stanfield at the Commonwealth Heads of Government Meeting in 1973, which he hosted in his nation’s capital of Ottawa that August – a few months before Princess Anne and Major Parker-Bowles were due to marry. [2] He suggested an accord which would unite all of the Commonwealth Realms into parallel legislative action on the matter (as required by the Statute of Westminster 1931). However, an overall lack of interest in the plan by those at Whitehall (despite the traditional support of the Labour government by the Catholic electorate) rendered the abortive “Ottawa Accord” moot before it could even get off the ground. But Stanfield had made himself known throughout the Commonwealth, and particularly in Westminster, which would serve him – and his peculiar agenda – well in the coming years.

    Princess Anne married that November, and in so doing was struck from the line of succession, in keeping with the Act, though she had sought (and received) permission from the Sovereign (her mother) to wed in accordance with the Royal Marriages Act 1772. For the Queen, this was a moment of making amends for past misjudgements; she had withheld her assent for her sister, Princess Margaret, to marry the man she loved (who, like Mrs Simpson, was divorced), over two decades before (on the advice of her ministers), and the man her sister had then married instead had proven… not so compatible (she would divorce him the year after her niece had married for love). [3] On the morning of the wedding, Princess Anne had been given the customary title of Princess Royal (which was granted to the eldest daughter of the Sovereign); her husband-to-be, for his part, was created the Earl of Crewe, the first of the second creation, chosen because of its proximity to his ancestral title of Macclesfield (both settlements being in East Cheshire). [4] The Royal Wedding became a smash success, watched by viewers across the globe (over half a billion people, all told), many of whom were attracted to the romantic story of a couple defying ancient prejudices and marrying for love. [5] (The other Royal who had married for love over tradition – Edward VIII – had passed away by this time, and his widow had not been invited to attend the ceremony.) Catholics, naturally, were particularly drawn to their union; ironically enough, those in the United States were among the most enthralled by the entire narrative, despite that country having blithely cast the British monarchy and patrimony aside some two centuries before. Reinforcing this irony was that a great many American Catholics were, in fact, of Irish extraction. Time, it seemed, truly did heal all wounds. Northern Ireland, which had been the epicentre of sectarian tension for several centuries, responded surprisingly well to this cross-confessional union, though not completely without the occasional quarrelsome rumblings from extremists on both sides of the aisle. [6] All that said, the Princess Royal and the Earl of Crewe agreed to raise their children in the Church of England, in order to ensure their place in the line of succession, though the first of their children (a son, Henry Andrew Parker-Bowles, by courtesy the Viscount Ampleforth) was not born until 1977.

    The Commonwealth, meanwhile, found itself tested in entirely unforeseen ways, wholly unrelated to the succession. Attempts by the United Kingdom to enter the European Economic Community, twice stymied in the past by the since-deceased French President Charles de Gaulle, were at an impasse, due to the inability of the two sides to reach a workable compromise; eventually, both the UK and Ireland (which would not be able to enter the EEC unless Britain did the same, due to the inextricable trade ties between them), were left out of the enlargement of the organization in 1973, with only Denmark entering (and not without some resistance from its populace). The EEC then closed itself off from further overtures after the Oil Crisis forced it to take stock of its infrastructure, leading the UK to re-evaluate their own trade links: with Ireland, with the other states in the European Free Trade Association, and with nations in the Commonwealth, primarily Australia and New Zealand. Canada, the eighth-largest economy in the world in the early 1970s (Australia was tenth), had been drifting away from the United Kingdom for the better part of the 1960s, but by 1973, the Dominion had a leader who was more pro-British than any Canadian PM in the last half-century. He had already proved his mettle with his attempts at shepherding an Ottawa Accord, and despite its failure, he was more than willing to negotiate more favourable trade ties with his fellow Commonwealth Realms, not least of all because it allowed Canada to mitigate the immense influence that their southern neighbour, and largest trading partner, the United States of America, had over their imports and exports. Thus were a number of multilateral treaties signed in the ensuing years, which established the Commonwealth Trade Agreement, formally recognized at the Heads of Government Meeting in 1975, in Kingston, Jamaica.

    Initially, membership was to be open only to the Commonwealth Realms (those which recognized Elizabeth II as their Sovereign); the United Kingdom, along with Canada, Australia, and New Zealand – three of the five pre-WWI Dominions – were charter members. Of the remaining two, Newfoundland had joined Canada in 1949, and South Africa (along with Rhodesia, which was still a de jure British colony) had been excluded due to sanctions against that Apartheid Backwards Bloc regime. [7] Although the lowering of trade barriers was the primary objective of the CTA, other, more nebulous concepts (such as facilitating migration, and greater investment into sporting and cultural events) were also discussed. However, the British government (power having been assumed by the more Europhilic Tories) continued to see this new organization as strictly a temporary measure until such time as they could join the EEC. Yet, as was often the case, it endured, even as their chances (and eventually, their willingness) to integrate with the other major powers of the Continent evaporated. An alternative solution was eventually proposed – bringing together the CTA and the EFTA, two looser, more permissive associations than the restrictive EEC, and including Ireland (which had reluctantly joined the EFTA as an associate member after being forced to withdraw its bid to join the EEC in 1973). [8] The Republic was an “observer nation” to the CTA, as it had not been a member of the Commonwealth since 1949, and was not likely to rejoin (even though many other republics, including India, had remained despite abolishing their monarchies). Upon the collapse of the Backwards Bloc in 1977, it seemed likely that Portugal (an erstwhile member of the EFTA) would direct its energies into joining the EEC and other “inner” organizations, alongside Spain (and, later, Greece), driving home the need for the EFTA and the CTA to consolidate. [9] However, the vague, uncertain commitments would only crystallize after the major recession of the late 1970s took hold of the global economy.

    But the backdrop of financial uncertainty which gripped the 1970s did not diminish the popularity of the Royal Family. The monarchical revival which was taking place in much of Europe had indeed spread across the Channel, and the year of Her Majesty’s Silver Jubilee, 1977, was one of great celebration throughout the Commonwealth, and especially in the United Kingdom, which the Queen toured extensively over a three-month period, visiting over three dozen different counties. This followed a brief trip to New Zealand and a nearly month-long visit to Australia in March. In late September, the Queen proceeded to Canada, where in addition to her husband, she was joined by the Prince of Wales, and they toured the length and breadth of that geographically massive Dominion for several weeks. [10] This was the second major visit of the Queen to the Great White North in as many years, following the Olympic Games in Montreal in 1976. She returned to that city to observe the progress being made on the Montreal-to-Mirabel Rocket line, noting in so doing that Canada was ahead of even the rail-dominated United Kingdom on the high-speed curve. Just as she had started her tour of Australia with a State Opening of Parliament in Canberra, so too did she end her tour of Canada with the same, in Ottawa, before proceeding to the Caribbean. The Jubilee year, which also saw yet another Commonwealth Heads of Government Meeting, this time in the capital at London (hosted by Prime Minister Willie Whitelaw), also (as previously noted) was blessed with the birth of the Queen’s first grandchild, The Hon. Henry Andrew Parker-Bowles, Viscount Ampleforth. Her Majesty came to regard 1977 as an annus mirabilis of her reign.

    The question of when the Prince of Wales would finally settle down was one which dominated the headlines of the era; Prince Charles had reached the age of 30 in late 1978, at which time he still had not married, and had no serious attachments. Many in the Royal Family were uneasy. The previous Prince of Wales, the man who would one day become Edward VIII, had enjoyed a lengthy bachelorhood, acceding to the crown unmarried and… the rest, unfortunately, was history. It was the Earl Mountbatten, the younger brother of the Prince’s paternal grandmother, who had a suggestion for the ideal royal bride: his very own granddaughter, the Hon. Amanda Knatchbull. His matchmaking credentials were impeccable; some years before, the Earl had arranged a meeting between his nephew and the Princess Elizabeth, which had resulted in (among other things) the birth of Prince Charles. Amanda had been born in 1957, making her nine years younger than the Prince of Wales; their romance, therefore, did not begin in earnest until she was 21, in 1978. Their courtship, though amicable, was certainly not inflamed with passion, but he was a royal and she was an aristocrat; they were both well-accustomed to that state of affairs. In September 1979, Prince Charles proposed marriage to Amanda, and she accepted. [11] The couple were second cousins, both descended from their mutual great-grandfather, Prince Louis of Battenberg. They were to be married in the spring of 1980, on the 30th of April, which was proclaimed a national holiday.

    Royals and heads of state from all over Europe (and the world) came to bear witness to the union. Constantine II, King of the Hellenes, attended the wedding in his first foreign visit since being restored to the Greek throne in the previous year. The architects of the Iberian Sunrise, Juan Carlos I of Spain and Duarte III of Portugal, were also among the foreign monarchs who observed the nuptials, thus completing the rehabilitation of the three former Backwards Bloc states into vibrant, active members of the First World. [12] All three Kings had, of course, seen their monarchies restored in the previous decade – much to the envy of the many rulers-in-exile who attended, such as the Kings of Romania and Bulgaria, and the Crown Prince of Yugoslavia, not to mention all the minor German princes who constituted the extended family of the groom. The President of Ireland had agreed to attend, though not without some misgivings and isolated protests from certain corners of the Republic. [13] Also present was the one-time Hollywood starlet, Grace Kelly, in her capacity as Princess of Monaco. The principal supporter of Prince Charles was his first cousin once removed, Prince William, Duke of Gloucester, who served in that capacity alongside the groom’s brothers, Prince Andrew and Prince Edward. [14] Numerous children and adolescents, relatives of both the bride and the groom, served as attendants. The pair were married by the Archbishop of Canterbury at St. Paul’s Cathedral (as opposed to Westminster Abbey, as the cathedral was much larger and could therefore seat many more guests). The ceremony began shortly after 11 o’clock in the morning, and was conducted largely in the traditional style. [15] An estimated billion people worldwide viewed the event on television, making it the most-watched broadcast since the Apollo 11 landing in 1969. The Hon. Amanda Knatchbull, at the conclusion of the ceremony, became Her Royal Highness The Princess of Wales, though the press (particularly outside of the UK) often (incorrectly) described her as “Princess Amanda”.

    The entire affair was certainly a most auspicious debut to the new decade, as far as the Royal Family were concerned, in particular Her Majesty The Queen. Elizabeth II was hopeful that soon, her eldest son would have a child of his own, further cementing the future succession. This child would likely become Sovereign, whilst being born as a grandchild of the present Sovereign – identical circumstances to those under which Her Majesty had herself been born, 54 years before. Monarchy, after all, was tradition…

    ---

    [1] Major Parker-Bowles and Princess Anne did indeed date briefly in 1970 IOTL, after which time he reconciled with an old girlfriend (Miss Camilla Shand, who in the interim had dated the Prince of Wales) and married her, raising their children together in the Catholic faith. Princess Anne, meanwhile, married Captain Mark Phillips, whom she met through their mutual interest in equestrianism. Both marriages ended in divorce, with the second marriages of Princess Anne, now-Brigadier Parker-Bowles, and (of course) the former Mrs Parker-Bowles all proving a good deal more successful than their firsts. Brigadier Parker-Bowles and Princess Anne remain close friends to this day, IOTL.

    [2] Discussions to amend the succession did not begin in earnest IOTL until after the marriage of Prince William of Wales and Miss Catherine Middleton, when lawmakers became aware that their first child stood an excellent (about 50 percent) chance of being born female, and given the preponderance of absolute primogeniture succession having been implemented in the various other European monarchies (starting with Sweden, in 1980). The Act of Settlement 1701 (barring from the succession those who married Catholics) and the Royal Marriages Act 1772 (preventing any descendant of George II who lives in Britain from marrying legitimately, without permission from the Sovereign) were also amended or repealed at this time. The various bills (one must be passed by each Commonwealth realm) usually take the name Succession to the Crown Act, or similar, and were drafted as a result of the Perth Agreement, which was made at the Commonwealth Heads of Government Meeting in 2011, in the eponymous city. As of this writing, Royal Assent has been granted to those Acts in the United Kingdom, as well as Canada, with other legislation pending (a bill has been tabled in New Zealand, which has yet to pass through Parliament).

    [3] The marriage between Princess Margaret and Antony Armstrong-Jones, the Earl of Snowdon, did not end until 1978 IOTL. She never remarried in the remaining quarter-century of her life, though he did (almost immediately after his divorce, in fact); his second marriage would also end in divorce.

    [4] Princess Anne was not granted the title of Princess Royal until 1986 IOTL, at which time her marriage to Captain Phillips was rapidly falling apart. The Queen could have granted her daughter the title (which is held for life) at any time after the death of its previous holder (her aunt, George V’s daughter, Princess Mary), in 1965. It’s very likely that she did not receive the title upon or soon after her wedding because her husband chose to remain a commoner – Parker-Bowles, on the other hand, is as blue-blood as they come, descended on both sides from the aristocracy and the landed gentry, going back for generations. I think he would accept a title – Crewe is the nearest town to Macclesfield, and it’s been used before (though not by royals). Ampleforth is a reference to the prominent (Catholic) school which he attended in his youth.

    [5] The wedding of Princess Anne and Captain Mark Phillips was said to attract approximately 500 million viewers, IOTL. The more symbolically significant nuptials of the Princess Royal and the Earl of Crewe attract a commensurately larger audience ITTL, particularly in the United States and, yes, in Ireland (as in, the entire island).

    [6] There are no Troubles ITTL, and therefore all involved parties are willing to go ahead with a wedding. IOTL, in the early 1970s (the very height of the Troubles) it is difficult to imagine a Protestant Princess being married to a “Papist” going over well at all in Ulster. But ITTL, although not everyone is thrilled, nothing goes too far beyond words. Many in Northern Ireland are quite moved by this crossing of sectarian lines, and believe that it represents hope for the future.

    [7] The symbolism of a “vacant seat” represents attempts by the Commonwealth to shame the South African apartheid regime, in addition to the obligatory trade sanctions (of the Commonwealth Trade Agreement, in addition to other sanctions imposed by other bodies).

    [8] Ireland was not a party to the EFTA prior to joining the EEC in 1973 IOTL.

    [9] IOTL, Greece joined the EEC in 1981. Spain and Portugal both followed in 1986. Of course, the UK and Ireland had already joined.

    [10] The Queen’s visit to Canada lasted for only five days IOTL, very likely because she had just been to Canada for the Summer Olympics the year before. Nonetheless, PM Stanfield requests that she devote as much of her time to touring Canada as possible – she had never ventured west of the Ottawa River during her 1976 tour, remaining largely in Montreal, with occasional sojourns to Quebec City and to the Maritime provinces. ITTL, she arrives in Canada on September 21, remaining until October 19, for a stay of exactly four weeks; IOTL, she arrived on October 14. (In both cases, she conducted the State Opening of Parliament on October 18 – a Wednesday).

    [11] IOTL, on August 27, 1979, Earl Mountbatten was assassinated in a bombing by the IRA, which also killed and maimed several members of his family (the Hon. Amanda Knatchbull – who, upon the accession of her mother to the Earldom, became Lady Amanda Knatchbull – was not among them). Against this backdrop, Prince Charles departed for India, proposing to Lady Amanda upon his return. Devastated at the loss of so many family members (including her younger brother), she turned him down, understandably wary of becoming attached to the Royal Family. ITTL, on the other hand, with Earl Mountbatten alive and well, she accepts his proposal. By no means is it a love match, but at least Prince Charles isn’t fixated on anyone else during their marriage, as he never did meet his brother-in-law’s ex-girlfriend ITTL.

    [12] The King and Queen of the Hellenes (and their children) attended the wedding IOTL, as well, though (obviously) in exile. Though Duarte Pio (as he was known IOTL) was largely uncontested as the pretender to the throne of Portugal, he strangely did not attend the royal wedding. And the only one of our three ex-Backwards Bloc monarchs who was also King IOTL, Juan Carlos I, did not attend because the couple was planning to stop over in disputed Gibraltar en route to their Mediterranean honeymoon. ITTL, Charles and Amanda will be honeymooning in the Caribbean instead. This wedding can certainly be regarded as the very apex of the monarchical revival ITTL.

    [13] The President of Ireland did not attend IOTL because of – you guessed it! – the Troubles.

    [14] Prince William of Gloucester (who died in 1972 IOTL) does not die in a plane crash ITTL, thus becoming the Duke of Gloucester (as opposed to his younger brother, the ominously named Richard, who remains, simply, HRH Prince Richard of Gloucester). Though he’s diagnosed with porphyria as IOTL, it is kept under control with relative ease.

    [15] Which, yes, includes Amanda vowing to “obey” Charles, contrary to his OTL wife deciding against doing so.

    ---

    Thanks to Thande for his helpful advice in the making of this update!

    And so, we have our first two pairings of the royal children ITTL! Princess Anne and Major Andrew Parker-Bowles! And Prince Charles and the Hon. Amanda Knatchbull! Some of you may be asking: what will become of the OTL bride of the Prince of Wales? I might just take the “overseas quagmire” approach with her, considering the oppressive and incessant overexposure with which we’ve all been inundated for the past 15 years or so. Much like with the quagmire, a certain portion of the collective psyche seems utterly unable to move on from this individual. This is why I knew I wasn’t going to marry them ITTL. For those of you who are curious, this is what the future Queen looks like in the present day IOTL – on the attractiveness scale, definitely somewhere between his first and second wives. This is claimed to be an image of her as a younger woman, though it’s undated (it’s labelled with her married name, however, which means it’s likely from after 1987). I hope you all enjoyed this glimpse at monarchical machinations! I found it great fun to write, and you can consider it my tribute to the Before 1900 section of this forum, where these kinds of updates are very much par for the course.


    British Royal Family, 1968.jpeg
     
    Two Small Steps Forward, One Giant Leap Back
  • Two Small Steps Forward, One Giant Leap Back

    It was no surprise that the space program, being so integral to the legacy of Camelot, suffered considerable setbacks in the 1970s; just as President Humphrey seemed intent on running his prestigious and ill-fated predecessor’s rarefied reputation into the ground metaphorically, it often appeared that President Reagan would be happier to do the job rather more literally. The cuts to the bloated government expenditures that had been allocated during the Great Society years had to come from somewhere, and with Moonshot Lunacy having faded considerably since its days in the limelight, NASA was a prime target. It didn’t help that so many of the personnel attached to that agency had military backgrounds; Reagan felt they would be put to better use bolstering US defences, which had atrophied during the détente. In all, funding for NASA would be slashed by well over a third through the first term of his Presidency – from approximately 2% of the federal budget in 1975 to a mere 1.25% thereof by 1980. [1]

    The largest single expenditure cut by the Reagan administration was the plan for a successor series of lunar missions which were to follow the Apollo Program (the last of which had discovered water ice on the Lunar South Pole in 1974). Development plans for longer-term, semi-permanent lunar bases (which were to be assembled on-site through the use of robotic cargo landers and remotely-operated rovers) were nearly complete when Reagan pulled the plug on the project – codenamed Artemis, after the twin sister of the god Apollo, from Greek mythology – in the budget for FY 1977. These bases would have involved multiple launches of lunar modules, similar to those used in the Apollo missions, but modified to serve as taxis or cargo landers. In addition, mobile pressurized habitats, with a great many more creature comforts than the cramped and short-term “crash pads” used by the Apollo astronauts, would have been constructed to allow for an extended roving range, and more thorough surveys of the specific base sites. Missions could therefore be far lengthier in duration than before. The eventual objective of the Artemis program was to introduce even larger, and more advanced, landers which could support crews on the lunar surface for months at a time, as permanent “moon bases”. Indeed, the water ice discovered by Apollo 20 would be able to sustain life, provide oxygen, and perhaps even fuel for reusable spacecraft (which could then take off from lunar launch sites), enabling the potential base to become (at least partially) self-sustaining. The name for the program would live on in the popular Star Trek: The Next Voyage miniseries, released in 1978, which saw the newly-promoted Captain Sulu commanding a vessel named the USS Artemis, NCC-1966 (which, perhaps fittingly, would prove the instrument of his demise). This gesture by President Reagan (scuttling the possibility of an immediate follow-up to what was perhaps the crowning achievement of the succession of Democratic administrations which had preceded his own rise to power) carried potent symbolism, yet it was, surprisingly enough, the only major blow suffered by the NASA coffers in the late 1970s, though it would obviously deal a devastating blow to morale within the agency, and among enthusiasts of space exploration and travel. It didn’t help that there proved to be a decided lack of new blood during these lean years; the 1975 cohort of astronauts selected by NASA (eight people all told, including the first black female astronaut, Dr. Julia Plymouth) would be the last to join the organization until the 1980s. [2] However, minority involvement and interest in space exploration were at an all-time high, thanks largely to the work of Nichelle Nichols, who had played an active role in the candidate selection for this latest batch of astronauts (an apocryphal urban legend often cited Plymouth as having been personally chosen by Nichols herself).

    The two Viking probes were launched in 1973, rushed out the door in the wake of the crushing blow to morale caused by the Soviets winning the Race to Mars in 1971. [3] This added to the plethora of launches that year, which might have contributed to the exhaustion of Moonshot Lunacy due to intense over-saturation. It didn’t help that, even under President Johnson, plans for Mars beyond the Mariner probes had been dramatically scaled back. Originally, multiple probes would be launched on a single Saturn V rocket under the name Voyager; subsequently, a different program came to be known as such, and the next phase of the Martian exploration became known as Viking. However, the question of whether life existed on Mars – or if the planet was even viable – had dominated exobiological inquiry in the early 1970s, as the Soviet Mars 3, upon deploying its landing equipment onto the Martian surface, returned inconclusive results with regard to the critical question of life. Yet the very nature of the Race to Mars and the expectation of one-upmanship on the part of NASA prevented complex biological instrumentation from being included on the payload of either probe – both of them would conduct primarily geological research, which, granted, could hint at whether Mars had once been viable, through analysis of past atmospheric and chemical composition in soil samples that might imply the presence of water or even more direct evidence of past life-forms. The landers would build on past (and present, when including the work of their mother probes) orbital reconnaissance which had worked to measure the atmosphere and extensively photograph the planet’s surface. This allowed for the selection of (mostly) ideal landing places. The Viking 1 probe, upon arriving at Mars, completed its mission with aplomb, though as noted, it could not categorically rule out that Mars had once been viable (though the odds of life currently existing on the present-day world were infinitesimal). Viking 2 was not so fortunate, with the measuring equipment being partially damaged by a rougher-than-expected landing on the planet. Given the other problems that plagued NASA in this era, the mishap would prove a fitting capstone to the agency’s broader woes.


    The Skylab orbital space stations continued to operate under Reagan much as they would have under Humphrey; to the Gipper’s eternal chagrin, Skylab B launched in late 1977, with little that he could have done to prevent it (having expended his political capital on terminating the Artemis program). Regular flights by the quartet of space shuttles (the refit Enterprise, Columbia, Discovery, and Atlantis) to service each station allowed for their smooth and efficient operation – though the stations themselves were rather short-lived, with Skylab A seeing only two years of useful service before it was replaced by Skylab B. It did not help that Skylab A (then known simply as “Skylab”) had been heavily damaged during launch, one entire section of solar paneling being effectively destroyed and giving the station a curiously asymmetrical appearance in all photographs. [4] It seemed a potential triumph for Soviet propaganda of the era, despite Soviet spacecraft appearing notoriously haphazard and jury-rigged in comparison to the American product. However, Yankee ingenuity would ultimately win out, giving NASA a very badly-needed reprieve in the face of backlash from the American public. The first crew (who arrived at Skylab aboard the Space Shuttle Columbia) were able to effect repairs from outside the station, in what observers would describe as the world’s most expensive and remote salvage job. Although the original Skylab was never the same after that, it was able to accomplish everything it had set out to do (with the help of its crews) despite the profound impairment with which it had begun its tenure; a variety of experiments on the effects of long-term spaceflight on the human body were conducted, and the Earth and the Sun were both extensively observed from the unique vantage point of this semi-permanent orbital station. In a blatant attempt to further improve NASA PR, a “classroom in space” was established, in which minor experiments – suggested by actual students – were carried out by astronauts during their downtime from regular missions in order to demonstrate the unique environment of space and engage young learners (a continual challenge through the “generation gap” that defined the 1970s). Skylab B launched without any problems, and it would remain in space for much longer than originally anticipated, as President Reagan refused to authorize any replacement stations in any of his budgets. Skylab B built on the lessons of the first Skylab, integrating that data into more sophisticated and complex modules. The Apollo Telescope Module which had enabled the solar observations made by Skylab A had been eliminated from the design for Skylab B to make room for an expanded laboratory module consuming some of the nearly 50 tons of the Saturn V’s 120-ton capacity that had gone unused on Skylab A. [5] To supply the additional power this required, the station’s solar arrays were extensively upgraded, and the maximum crew complement could be doubled from three to six. The station would also continue to observe the Earth, and serve as a hub for experimentation with X-rays, zero-g materials processing, and the effects of extended exposure to spaceflight and microgravity on humans and all manner of flora and fauna. The importance of the aforementioned space shuttles in assuring the success of these missions could not be overemphasized; they ferried rotating crews, along with the critical cargoes of consumables and experiments needed to support the station to its fullest potential, back and forth in short order and with optimal efficiency. [6] They were a rousing success, and served as a bittersweet reminder of the lost potential of what might have been a system of regular Earth-to-Moon transportation along similar (though obviously larger-scale) lines.

    Similarly, despite their devastation at having lost the chance for Artemis, most of the scientists and engineers within NASA (particularly at JPL) held firm that the sacrifice had been worth making in order to protect the opportunity to launch as many probes as possible, given the once-in-a-lifetime chance to send interplanetary probes on the “Grand Tour”, which would allow them to take advantage of the precise arrangement of the massive gas giant planets to visit all of them on a single set of routes toward the outer reaches of the solar system; in addition to trajectory alterations, the gravity of these Jovian spheres would also provide multiple boosts to their velocities. This would also, eventually, allow these interplanetary probes to become interstellar, as they would be able to traverse the very edges of the solar system (and become the first man-made objects to ever do so) while much of their internal sensory and measurement equipment was still capable of surveying their surroundings. Needless to say, from a financial perspective, creating probes that could perform multiple tasks over the course of their lifetimes was a prospect which appealed even to the more lukewarm elements of the bureaucracy. Most estimates had the six originally planned Voyager probes, which were to launch in the late 1970s, due to leave the solar system in the first decade of the next century. With any luck, their equipment would continue to operate and receive instructions from Earth for several years beyond that point. However, despite attempts by NASA to retain all six probes, they were eventually forced to cut out the non-essentials and produce only four. [7] The first two Voyager probes were launched in 1977, along the Jupiter-Saturn-Pluto grand tour route, by which the two largest of the gas giants would provide gravitational assists that would increase the velocity of the probes and reduce the time needed for them to reach their respective destinations. Both probes reached Jupiter before the decade was out, and proceeded to Saturn in 1980. The first of the Voyager probes to reach Saturn, Voyager 1, was diverted to investigate Titan, by far the largest of that planet’s moons, about which curious findings had been recorded by terrestrial instruments. Voyager 1 confirmed the presence of a dense atmosphere – Titan was the only known moon in possession of such. Indeed, the atmosphere was so dense that it was impossible to get readings of the surface. From Titan, Voyager 1 followed a trajectory that would lead it out of the Solar System with no further flybys; Voyager 2, however, skipped the moon entirely – there being little value in more obscured sensor readings – and proceeded as originally planned, directly from Saturn to Pluto, the ninth and last planet in the Solar System (though, perhaps appropriately enough, Pluto would in fact be closer to the Sun than Neptune by the time the probe would arrive due to the planet’s highly elliptical orbit). The next two probes were launched in 1979, following a route which would (once again) see a gravity assist from Jupiter upon arriving in the early 1980s before exploring the outermost Jovian planets, Uranus and Neptune. Plans for the ultimate trajectories of this quartet of probes, upon completing their respective planetary flybys, had not been determined at the time they were launched; one technician at NASA was said to be lobbying to aim at least one of them in the direction of a nearby star.
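
    The mechanics of the gravity assist bear a brief aside here. In the flyby planet’s own frame of reference, the encounter is effectively elastic – the probe leaves with the same speed at which it arrived – but because the planet is itself moving around the Sun, the probe’s Sun-relative velocity can change considerably. The relation below is the standard textbook one, not anything specific to the Voyager trajectories described above:

    \[
    \left|\vec{v}_{\mathrm{out}}\right|_{\text{planet frame}} = \left|\vec{v}_{\mathrm{in}}\right|_{\text{planet frame}},
    \qquad
    \Delta v_{\text{heliocentric}} \leq 2\,v_{\text{planet}}
    \]

    Jupiter orbits the Sun at roughly 13 km/s, so a single well-aimed flyby can, in the idealized limit, add up to twice that figure to a probe’s heliocentric speed (in practice, a good deal less) – which is why a Jupiter assist formed the first leg of every Grand Tour route.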

    The media, naturally, played a part throughout the rise and fall of the popular phenomenon. The Final Frontier, which had been available only to Canadians and those cross-border Americans who had access to the CBC feed over antenna in 1972, became a co-production of the CBC and two PBS stations – WNED Buffalo and WGBH Boston – the following year. [8] The 1973 batch of episodes, all of which focused on specific events taking place on the NASA calendar that year (a veritable embarrassment of riches), aired on both the CBC and PBS beginning on October 7, 1973. James Doohan returned as host, and the show continued to be taped at the CBMT studios in Montreal. Professors Bob Davidoff and Ian Mitchell at McGill University remained the primary consultants for the series, though others would be brought aboard on an ad hoc basis. Among these was a Professor of Astronomy at Cornell University named Carl Sagan, who was a passionate advocate for space exploration and had worked closely with NASA since its inception. Sagan’s contributions to the series were limited, however – he admired what the show was trying to accomplish, but was more interested in the “wonders of the universe” which the show often merely glossed over. “The Final Frontier always made me think about taking a glass-bottom boat on a tour of the coral reefs. And instead of spending any time actually looking out at those reefs, and answering questions about how they formed, and how they interact with their ecosystem, our tour guides spent the entire trip talking about how the boat was built and how it’s able to go on these tours,” Sagan would later explain. “The boat might be a marvel of engineering and human ingenuity, and I can certainly respect that, but they’re still missing the bigger picture.” Nevertheless, Sagan maintained amicable relations with the show’s producers, even touring the set in 1974. By this time, the presentation of The Final Frontier had changed a great deal from the rather jury-rigged first season; full video was now a presentation option, and Doohan would even interview guests, usually ones with some relation to the episode at hand. The season premiere, “Lunar Isolation”, which talked about the engineering feats necessary to sustain habitation in a totally hostile environment, as well as exploring the delicate nature of landing and then safely returning home again, featured Buzz Aldrin, the second man on the moon, as the show’s first guest. (Aldrin, a gregarious personality, had often “represented” the Apollo program within the media from 1969 forward, in contrast to the more withdrawn and reclusive Neil Armstrong). Doohan himself relished interviewing guests, being very much a people person. This added a particular verve to those segments, making the show all the more engaging for audiences. That, coupled with a redesign of the set (making it slightly larger, and replacing the darker colour scheme of the first season with something brighter and livelier – a reversal of the trend towards earth tones and muted colours common to the early 1970s), made the show more visually appealing, as well. Despite being quite possibly the straw that broke the camel’s back in terms of Moonshot Lunacy overkill, The Final Frontier was a hit, and a landmark in co-production between the CBC and PBS. Educational television appeared to be at its zenith in this era.

    But the good times were not bound to last forever. The very notion of PBS was tied to the Great Society championed by Presidents Johnson and then Humphrey, and their stock had plummeted by the mid-1970s. When President Reagan was elected in 1976, budget cuts were the order of the day, and not only to the space program, but also to PBS – partly as an extension of his “private enterprise for the media” platform, which had also resulted in the elimination of the Fairness Doctrine and the Family Viewing Hour. Only libertarians were truly ecstatic with these sweeping reforms; liberals and conservatives were both ambivalent (though obviously they had reservations about very different things). The PBS budget was slashed for FY 1977, putting several shows in danger. The Electric Company, always the red-headed stepchild of the Children’s Television Workshop stable compared to the venerable fortunate son that was Sesame Street, was very nearly cancelled, but at the last minute the show was saved, largely due to changes in cast (the male lead, Morgan Freeman, departed at this time, having finally let his ego get the better of him) and appeals to the CTW from parents (many of whose children were Mini-Boomers, now just getting old enough to transition from Sesame Street to The Electric Company). [9] The advantage of Sesame Street was that the show could quite literally pay for itself through extensive merchandising (and licensing fees from foreign versions, such as Canadian Sesame Street or the UK’s Sesame Square), an advantage not shared by The Electric Company (or, indeed, any other show on PBS). It was therefore agreed that Sesame Street would take major cuts to its underwriter funding, becoming (largely) self-sustaining, and that the savings would be transferred to The Electric Company, which could therefore keep bringing viewers the power. Rita Moreno, by far the best-known of the cast, remained with the show, though she only filmed two days a week with them due to her simultaneous commitments on Broadway. However, other shows, such as The Final Frontier, were not so lucky; the show ended its run in 1977, with 65 episodes produced by the CBC-PBS tandem over five seasons, on top of the original 13 by the CBC alone, for a total of 78. [10] By this time, however, James Doohan had agreed to appear in Star Trek: The Next Voyage, making him one of the very few members of the Star Trek cast to find steady work during the seven-year gap between the original series finale and the mini-series that followed. (Notably, none of them had done so as actors – he had been a presenter, Leonard Nimoy a director, and George Takei a politician).

    Another science series, NOVA, which had premiered in 1974, had (barely) survived the pruning at PBS. But that show had a much broader focus than space (and, indeed, often went out of its way to avoid space, given the presence of The Final Frontier and, later, the backlash against Moonshot Lunacy), and Carl Sagan, despite his relative indifference toward The Final Frontier, was irate at that show’s cancellation. He immediately began lobbying for the production of a new space-related series (though one which would focus on cosmology). Despite his reputation and prestige, his proposals went nowhere, until Mr. Fred Rogers, longtime host of Mister Rogers’ Neighborhood, joined forces with him. [11] The two unlikely allies – with diametrically opposed metaphysical perspectives – nonetheless proved an effective tag team, working to set an example of the kind of pluralistic educational offerings they both supported for public television. In the end, a limited series was indeed green-lit, and was to be named Cosmos. Mr. Rogers would be given “special thanks” in each episode; WQED Pittsburgh would produce alongside KCET Los Angeles. Sagan and his co-producers were inspired by British science documentary programs of the late 1970s in establishing the themes and topics for discussion. [12] The thirteen-episode series was budgeted at $6 million, very lavish for public television of the era. And, indeed, Cosmos lived up to the hype for “event television”, attracting ratings unprecedented for any program not airing on one of the Big Three networks.

    Earmarked as the largest expenditure for NASA was the Solar-Power Satellite prototype, better known to the general public as the test bed for “microwave power”. After considerable time allotted to research and development, sufficient experimentation had been carried out, engineering review conducted, and data collected by 1979 to allow the project to proceed to flight. Whether the project would survive the whims of President Reagan was the driving question that would put an exclamation mark on energy policy, one of the defining political problems of the 1970s. The Gipper had his own ideas on the matter, but then, so did most everyone else…

    ---

    [1] Obviously, these numbers were far more grave IOTL, with NASA already receiving less than 1% of the federal budget by 1975 (as President Nixon was certainly no friend of the organization either, the precipitous drop in funding having occurred in the early 1970s under his administration); nominal funding increased in the Carter years, though the rate continued to decline, due to the rampant stagflation of the era, settling at 0.84% of the federal budget in 1980 (and it would continue to decline from there).

    [2] IOTL, Astronaut Group 8 joined NASA in 1978, with a frankly massive cohort of 35; it was the first since 1969. From that point forward, a new astronaut group has been chosen on a biennial basis. ITTL, eight individuals joined Group 8 in 1973, and a further eight joined Group 9 in 1975. These sixteen astronauts include: the first African-American man and woman in space; the first Asian-American in space; the first Jewish-American in space; the first mother in space; and the first Army astronaut.

    [3] NASA being edged out in the Race to Mars accelerated the schedule for the Viking program, to the detriment of the potential for discoveries by those probes.

    [4] The damage done to Skylab A ITTL is very similar to what befell Skylab IOTL. Simply because we can’t be having every NASA launch going off without a hitch :p

    [5] Recall that President Humphrey ordered additional Saturn V rockets in 1969, shortly after taking office.

    [6] It should be noted that the space shuttle ITTL does not resemble that of OTL, and instead looks far more like this.

    [7] Of course, just two Voyagers were launched IOTL, one on each Grand Tour route. Sadly, the budget simply would not allow for six, even ITTL.

    [8] Note that WNED serves a market on the Canadian border. IOTL, at present, with the reliance of PBS stations on viewer pledges, it caters extensively to viewers in Southern Ontario (to the point of identifying on-air as serving Buffalo and Toronto), who provide the majority of pledge dollars. WGBH, for its part, serves as a bulwark station for PBS.

    [9] The Electric Company was indeed cancelled in 1977 IOTL, largely because it, unlike Sesame Street, could not be sustained by merchandising revenues (as the show was very lightly merchandised). CTW (since renamed Sesame Workshop IOTL, after its meal ticket) is a not-for-profit corporation, and therefore they have no reason not to use their revenues from Sesame Street to help sustain their expenses. ITTL, since the money coming in through PBS is still higher in absolute terms than IOTL (coupled, of course, with the much larger cohort of children who are the exact target demographic for the show), the show is saved, however narrowly.

    [10] As The Final Frontier fulfills CanCon obligations (and for other reasons which will be made clear in future updates), the CBC will continue to air these 78 episodes (mostly in weekend afternoon and late-night timeslots) for many years to come after its cancellation in 1977. (This is obviously something of a pattern for Doohan ;))

    [11] Yes, Mr. Rogers saved Cosmos ITTL. His activism needed some outlet, given his passionate advocacy for public television. WQED Pittsburgh, his home station, thus participates in the production of Cosmos along with the station primarily responsible for it IOTL: KCET Los Angeles. The co-operative working relationship between Dr. Sagan and Mr. Rogers was, at least in part, consciously formed by both men as a rebuttal to the occasional “educational challenges” facing American Party-governed states in this era.

    [12] For various bureaucratic reasons, the BBC has no direct involvement in the making of Cosmos ITTL. The $6 million budget is actually slightly lower than IOTL, though it should be noted that this is still an extremely exorbitant figure, considering. In addition, ratings are roughly on par with viewership levels IOTL.

    ---

    Thanks to e of pi, truth is life, and Dan1988 for their guidance and suggestions in the making of this update! Additional thanks to e of pi for helping to revise and edit as well. I hope you all enjoyed another look at the space program in That Wacky Redhead! This marks something of a nadir for the final frontier… but will it last, or even worsen? :eek:
     
    Fight the Power
  • Fight the Power


    The ubiquitous “Microwave for a Brighter Future” poster, first published in 1974. [1]

    The Oil Crisis of 1973 had served as a very rude awakening for the drafters of American energy policy. US oil reserves were rapidly dwindling, and the vast (and, thanks to the Manufacturing Miracle, steadily recovering) industrial sector stateside was utterly dependent on black gold, not only as a power source for its factories and machines, but also for the many direct applications of that particular resource. An increasing amount of crude was derived from foreign sources, which were controlled by powers that were increasingly hostile toward the United States and its geopolitical interests (and which could no longer be brought into line with the consummate ease of decades past). Never before had the need for new alternatives to oil been so apparent or urgent.

    In the wake of the Oil Crisis, President Hubert H. Humphrey, eager to build on his space-borne legacy, had authorized the Solar Power Satellite (SPS) prototype, which would collect energy in orbit using solar panels, through the photovoltaic conversion process. These would operate in peak, brilliant sunlight at all times, unencumbered by atmospheric or weather conditions, not to mention the day/night cycle – all of which would, and did, dramatically reduce the potential power output and efficiency of ground-based solar panels. The solar energy collected by the satellite would then be beamed to the Earth’s surface through the use of microwaves (hence the ubiquitous media nickname of “Microwave Power”). A “rectenna” – which is to say, a rectifying antenna – on Earth would collect these beams of microwave energy and convert them back into electricity, to be fed directly into the power grid. Given the popular understanding of microwave technology (as microwave ovens had been in use for two decades by this point), the inevitable question of what would happen if the beam somehow missed the rectenna and instead struck the surrounding area (which could, theoretically, be populated by plants, animals, or even people) soon arose. The notion of these innocent bystanders being “cooked” by the microwave energy was understandable, and insidious. [2] And it didn’t stop there; some went one step further and imagined radiation poisoning as a result of wayward beams – a fear that was certainly unfounded, but easily explained by conflation with another popular-but-controversial energy source which became prominent in the 1970s.
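
    For those who want to see the arithmetic behind the concept, the sketch below (in Python, purely for illustration) traces sunlight through the chain just described – panels, orbital microwave transmitter, beam, rectenna, grid. The array size and every conversion efficiency other than the roughly 17% panel figure quoted below are assumptions of mine, not numbers from the prototype program itself:

        # Back-of-envelope model of the SPS power chain described above.
        # All efficiencies except the ~17% panel figure quoted in the text
        # are illustrative assumptions, not figures from the timeline or
        # from any real SPS study.
        SOLAR_CONSTANT_W_PER_M2 = 1360.0  # sunlight intensity in Earth orbit

        def delivered_power_mw(array_m2,
                               eff_panels=0.17,    # photovoltaic conversion (per the text)
                               eff_dc_to_rf=0.80,  # assumed: DC-to-microwave conversion on orbit
                               eff_beam=0.90,      # assumed: fraction of the beam caught by the rectenna
                               eff_rectenna=0.80): # assumed: microwave-to-DC conversion on the ground
            """Electrical power fed into the grid, in megawatts."""
            collected = SOLAR_CONSTANT_W_PER_M2 * array_m2 * eff_panels
            return collected * eff_dc_to_rf * eff_beam * eff_rectenna / 1e6

        # A 6,000 m^2 array (the prototype scale mentioned below) nets well
        # under a megawatt on the ground -- hence the cost problem.
        print(f"{delivered_power_mw(6_000):.2f} MW delivered")

    Even before the (very considerable) cost of launching and maintaining the hardware is counted, the losses at each stage compound – which goes a long way toward explaining why the ground-delivered figure looked so poor next to conventional generation.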

    From beginning to end, the development process of the solar power satellite prototype had lasted for five years – from the appropriation of funds in the FY 1974 budget, to the conclusion of the final battery of tests, conducted by an actual satellite, launched to geosynchronous orbit (by the same Saturn rocket which also launched the space shuttles) in early 1979. Upon receiving the data collected from these run-throughs, President Reagan was satisfied that his initial suspicions – that microwave was simply untenable as a system for power generation – had been confirmed; the total costs of the entire process of bringing the solar energy to the rectennae on the ground were far, far in excess of any other practical source of electricity at the time, including even ground-based solar and (needless to say) nuclear, his own pet choice. [3] The number of nuclear generators had been rising steadily through the late 1970s, and the President had covered his bases by deciding to continue another research program in alternative energy that actually predated the Oil Crisis altogether: thorium-based nuclear power, known among its proponents as “clean nuclear power”, which alleviated some (though certainly not all) of the concerns held by the vociferous anti-nuclear lobby with regard to the risks involved. [4] So, as far as the government was concerned, microwave was dead; long live the nuke. However, there were some peripheral benefits to be derived from the SPS prototype; the solar panels created for use on the satellite might yet have applications for Earth-based solar power. Each of the two sets of panels, measuring some 6,000 square metres, was capable of converting incident sunlight into 1.5 megawatts of electrical power, which translated to approximately 17% efficiency – very high, by the standards of the era. [5]
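
    As a sanity check on those figures – assuming each array really is on the order of 6,000 square metres, and taking the solar intensity in Earth orbit as roughly 1.36 kW/m² – the implied conversion efficiency works out to:

    \[
    \eta \approx \frac{1.5\ \text{MW}}{1.36\ \text{kW/m}^2 \times 6000\ \text{m}^2} \approx 0.18
    \]

    or about 18%, broadly consistent with the “approximately 17%” quoted above; the small discrepancy vanishes if the panel area is taken to be a few hundred square metres larger.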

    Though Reagan had never been a friend of microwave power, his attempt to discredit it was surprisingly facilitated by the media; specifically, by the famed disaster movie, The Greenpoint Dilemma. Having been written and filmed in 1978, and released the following year, it assumed that microwave would be found viable and then put into commercial operation through the 1980s. [6] Therefore, the setting was near-future; this technically qualified the film as science-fiction, though this was strongly de-emphasized, to implicitly remind audiences that “this very thing could happen here and now”. The plot entailed a commercial SPS firm, Sunburst Industries, which had launched the very first commercial microwave satellite over a dozen years prior to the start of the film, and said orbiter – Sunburst Platform Alpha, known as “Platform Alpha” or just “Alpha” – was by that point approaching the end of its operational life. Alpha provided power to where else but New York City, and its gargantuan size – 4 miles long by 2 miles across – meant that it could be seen, like a star, in the night sky from throughout the tri-state area, despite being over 22,000 miles above the Earth’s surface. (Fortunately, the microwave beams that it fired were invisible.) The rectenna was located in Great Neck, Long Island, just across the bay from Queens, among the most residential (and populous) of the Five Boroughs. [7] However, most of the action took place aboard the platform, where the crew complement of 120 were tasked with ensuring a smooth operation.

    Sunburst Industries, as was the case for most for-profit corporations depicted in fiction, was not exactly a valued contributor to this future society. Maintenance and repair costs were kept to strict minimum standards, or perhaps even allowed to fall below them – after all, it was very difficult for government inspectors to tour the facilities, given their extreme remoteness; the station itself was chronically understaffed at any rate. The harsh realities on the ground (or, rather, in orbit) were contrasted against the sunny, optimistic advertising from Sunburst which depicted Microwave as the “wave of the future”, and touted the “clean, safe, and practically infinite” source of energy that space-based solar power could provide. [8] In fact, the film quite famously opened with an in-universe commercial for Sunburst microwave power, in one of the more avant-garde touches that dotted the otherwise fairly conventional potboiler thriller (again, to emphasize the here and now aspects of the disaster plot).



    The logo used in The Greenpoint Dilemma for Sunburst Industries. Note the allusions to the “Microwave for a Brighter Future” poster (including the Sigma standing in for the pro-microwave Mu, and use of the famous “1970s font”, Cooper Black), along with visual similarities to the iconography of Soviet Russia and the Empire of Japan.

    The beleaguered crew of Platform Alpha fought continuing battles to keep the hardware and exteriors in good working order, and this was depicted in part through a famous (though largely gratuitous) “spacewalk” sequence out on the station’s massive array of panels – one made all the more extravagant in comparison to the more cramped interiors of the station, and which many reviewers noted was reminiscent of 2001: A Space Odyssey. As was expected, the venerable Desilu Post-Production handled most of the extensive visual effects work, which included construction of the massive model satellite, and the compositing and editing. The incredible amount of money spent on effects equated to a very steep production cost, which could only have been permitted by the studios in the spendthrift New Hollywood era; sure enough, the film was directed by William Friedkin, who had helmed The French Connection at the beginning of the decade (and little else of note since). [9] Like many auteurs, he was a notorious control freak, which made filming quite a bit more difficult and problematic than it otherwise might have been. Had the movie not been a hit, it would have spelled the end of his career.

    The on-set troubles actually seemed to echo the course of the plot. We were introduced to Platform Alpha at the beginning of a typical overworked, understaffed shift, and to the central characters, a motley crew of lowly technicians (played by Richard “Meathead” Dreyfuss, Jill Clayburgh, and newcomer John Lithgow), overseen by their quintessentially middle-management supervisor (played by Rip Torn). [10] The “face” of Sunburst was provided by Jack Nicholson, playing against type (and for scale) as the slimy public relations officer of the company, Raymond “Call me Ray” Delsol, the highest-ranking executive seen onscreen. The situation on Platform Alpha – never more than barely adequate even at the very best of times – quickly deteriorated over the course of the first two acts, becoming too much for the crew to bear. The technicians were forced to repair external damage and keep the internal systems intact, facing long hours, labour shortages, increasingly high production demands from the Sunburst headquarters on the ground despite these issues, and a decided lack of creature comforts to boot. The manager grew increasingly apathetic toward the concerns of his employees, who in turn saw tensions rise amongst their own ranks. The formerly close friendship between the Dreyfuss and Lithgow characters disintegrated as Lithgow grew increasingly unhinged; the low-key romance between Dreyfuss and Clayburgh soon evaporated. Something had to give – and indeed, something did; in the climax of the film, the beam of microwaves, having been successfully converted from the solar energy collected by the massive network of panels attached to the platform, was sent down to Earth, but went wildly off course from Great Neck, instead cutting through Queens, quite literally cooking thousands of people. (Those who had lived through the catastrophe were depicted with shockingly realistic makeup jobs, which lent the film something of a post-apocalyptic flavour.) Sunburst Industries, which had up to this point willfully ignored rumours of strain and underfunding affecting its platforms, was now exposed. Meanwhile, on Platform Alpha itself, the situation was even worse…

    The name of the film, The Greenpoint Dilemma, was an allusion to what was then the recently-discovered Greenpoint Oil Spill in Brooklyn, in which, over a period of several decades, millions of gallons of oil seeped into the ground and destroyed the environment of that neighbourhood. [11] In the film itself, Lithgow’s character (at the very height of his hysterical ravings) drew this comparison: the people in the area around the rectenna had expected to live their lives unmolested, before being irrevocably violated by the terror of the very thing which had theoretically been intended to help them. Shortly after his speech, Lithgow’s character committed suicide, clearly despondent at having failed to prevent this horrific disaster. In a less subtle example of just deserts, Torn’s character was killed in his attempt to escape from the riots that ensued on the platform once the consequences of the misfire were revealed. The combination of disaster film and message movie produced an obvious moral: the insatiable thirst for energy to accommodate the growth of industry and technology, followed by negligence and disregard for the health and safety of others to maximize profits, would inevitably produce catastrophic results. The preachiness and pretentiousness that characterized The Greenpoint Dilemma made it a critical favourite, and proved to spur ample discussion on talk shows, and around water coolers and coffee tables all through 1979. In fact, the timing couldn’t have been more ideal; Greenpoint was released in the late spring, just weeks before the SPS project was officially cancelled by the Reagan administration. Rumours would persist from then on that Greenpoint had something to do with the Gipper’s decision. [12] The film finished in the #10 slot for the year, grossing over $60 million in the United States. However, because of the very high budget (estimated at $30 million), profits were minimal. The film did very well come awards season, receiving several Academy Award nominations, including for Best Picture, Best Director, and Best Supporting Actor for Lithgow. In the end, it won only two Oscars, both in technical categories: Best Makeup and Best Visual Effects. It was a vindication for Desilu – whose technicians received the latter award – as the studio which had continued to employ Marcia Lucas (though only for in-house television projects) despite her having been blackballed by the motion picture industry. [13] But Greenpoint had another, more curious effect; an anti-nuclear film released later the same year, based on a number of incidents taking place at plants throughout the earlier 1970s, was a flop, largely because it was deemed a “ripoff” of The Greenpoint Dilemma (and it did not help its case that it contained the word Syndrome in its title). [14] Lightning never did seem to strike the same place twice. The message was clear as the 1970s drew to a close: microwave bad, and nukes, if not good, were at least better. It helped that a sizeable contingent of members of Congress (in both Houses), drawn from the Democrats, moderate Republicans, and the more pork-minded Americans, did their best to force through extremely strict safety regulations for new nuclear generators; each faction had its own reasons for doing so, but it was certainly enough, when coupled with the promise of “clean” nuclear research through investment in thorium, to keep fission on the table and allow it to dominate energy policy going forward.

    Another lasting impact of the film was on the not-insignificant environmentalist movement of the time – many of whose members were former (or particularly stubborn current) Moonie Loonies, and thus firm space advocates, and (in general) supporters of microwave power over nuclear (the perpetual bête noire of virtually all environmentalists). Greenpoint would precipitate a schism between what would emerge as the pro-SPS and anti-SPS factions of the broader movement, and those who were anti-SPS soon found themselves adrift without a remotely viable option in the near-term. Even the advances in ground-based solar power that could be derived from the failed SPS experiment were years, if not decades, away from competitiveness with nuclear in terms of efficiency and cost. New, alternative energy sources also seemed a pipe-dream, although science-fiction certainly came one step closer to reality when a team at the University of Sheffield, led by the wunderkind Dr. Thomas W. Anderson (only 32 years old), made the landmark discovery of the buckminsterfullerene molecule, also in 1979. [15] “Buckyballs”, as they quickly came to be known, were named for architect Buckminster Fuller, whose famed geodesic domes resemble the structure of the molecule. Buckyballs possessed a number of intriguing properties: for one, they could be used to demonstrate wave-particle duality. But more importantly, for the sake of energy policy, Dr. Anderson revealed that derivatives of buckyballs might prove useful in facilitating the productive economic use of hydrogen to provide electrical energy – the holy grail of energy sources, given that hydrogen could easily be derived from water (H2O), which covers 70% of the Earth’s surface. The media referred to this as “fuel cell” technology, though hydrogen was not technically a “fuel” in the traditional sense of the word. Hydrogen fuel cell technology in the United States dated back to the Apollo program, where it had been remarkably successful. This new discovery, coupled with that crucial past connection, galvanized environmentalists, and tied their fates even more strongly to that of the pro-space lobby; both groups became well-accustomed to rooting for long shots (both literally and metaphorically, as the case might be).

    The fates of energy policy and the space program seemed inextricably intertwined, both in practical terms as a result of the (failed) SPS prototype, and in metaphorical ones, given the media connection, and the impact on environmentalists and Moonie Loonies that each successive revelation and decision brought with it. But “microwave” was dead, though solar might yet have caught a second wind from what it had left behind. In the future, beyond the pragmatic and immediate realities of nuclear (generator) proliferation, and the continuing (though waning) reliance on coal, oil, and natural gas, whole new avenues were available for exploration and investment: hydrogen fuel cells, through the use of the newly-discovered buckyballs; and nuclear fission, giving way to fusion, which (once viable) would be a dramatic improvement thereupon, by virtually any conceivable metric. But these technologies were decades into the future; certainly, far beyond even the newly-dawning 1980s which beckoned…

    ---

    [1] This image was created by e of pi, based (very) loosely on a (terrible) rough drawing by myself, modeled on a (far more professional) instructive diagram.

    [2] Being “cooked” by microwave beams (in much the same way that a microwave oven “cooks” food) becomes more plausible as the intensity of the beam in question rises. Higher beam intensity generally allows a smaller receiver dish and slightly more efficient transmission – both beneficial. Presumably, then, Sunburst is using extremely high-intensity microwaves. Of course, given the great distances involved, accuracy is a perennial concern.
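
    As a rough back-of-envelope illustration of that trade-off – every figure below is an assumption chosen for the sake of example, not a specification from the film or from any real SPS design – the following sketch shows how beam power and spot size translate into power density at the receiving end, and how quickly a tightly-focused beam can exceed the occupational exposure guidelines of the era:

    Code:
    import math

    # All inputs are illustrative assumptions, not Sunburst's (or anyone's) specs.
    beam_power_watts = 5e9      # assumed: a 5 GW delivered beam
    spot_radius_m = 1_000       # assumed: a 1 km beam footprint on the ground
    spot_area_m2 = math.pi * spot_radius_m ** 2

    density_w_per_m2 = beam_power_watts / spot_area_m2
    density_mw_per_cm2 = density_w_per_m2 / 10.0   # 1 W/m^2 = 0.1 mW/cm^2

    # A commonly cited occupational exposure guideline of the era was on the
    # order of 10 mW/cm^2; compare the beam's average density against it.
    print(f"Average density: {density_mw_per_cm2:.0f} mW/cm^2")
    print(f"Versus a ~10 mW/cm^2 guideline: {density_mw_per_cm2 / 10:.0f}x over")

    With those (assumed) numbers, the beam averages roughly 160 mW/cm^2 – an order of magnitude past the guideline – which is the whole dramatic point: tighten the beam for efficiency, and a misfire becomes genuinely dangerous.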

    [3] Even today, IOTL, microwave power is a prohibitively expensive prospect, as can be seen by the increasing prominence (and use) of simple ground-based solar power (despite the many restrictions imposed upon it which are already listed above). The greatest strike against microwave is the prohibitive initial cost of launching the equipment into geostationary orbit – 22,000 miles up is not exactly a trip to the corner store, after all.
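
    To see why the launch bill dominates, here is a deliberately crude sketch – every input is an invented, illustrative assumption, not a real SPS or launch-vehicle figure – that amortizes the cost of lifting a platform to geostationary orbit against the electricity it could sell:

    Code:
    # All inputs are assumptions for illustration only.
    platform_mass_kg = 30_000_000     # assumed: a 30,000-tonne platform
    launch_cost_per_kg = 10_000       # assumed: $10,000 per kg delivered to GEO
    launch_bill = platform_mass_kg * launch_cost_per_kg

    delivered_power_w = 5e9           # assumed: 5 GW delivered to the grid
    kwh_per_year = delivered_power_w / 1_000 * 24 * 365

    # Spread the launch bill alone over 30 years of output, ignoring the cost
    # of the hardware itself, operations, interest, and transmission losses.
    cents_per_kwh = launch_bill / (kwh_per_year * 30) * 100
    print(f"Launch bill: ${launch_bill / 1e9:.0f}B")
    print(f"Launch cost alone: {cents_per_kwh:.1f} cents/kWh")

    Under those (assumed) numbers, merely getting the hardware up there already costs more per kilowatt-hour than conventional generation charged in total at the time – before a single panel, transmitter, or technician is paid for.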

    [4] IOTL, thorium-based nuclear research was cancelled in 1973, under the auspices of (who else?) President Nixon.

    [5] Estimates for the efficiency of solar panels in the early 1970s – under ideal conditions – are 10 to 15 percent.
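
    For a sense of scale, here is a short sketch of what that efficiency range implies for collector area; the inputs are illustrative assumptions rather than figures from the text:

    Code:
    # Assumed, illustrative inputs.
    solar_flux_w_m2 = 1_360   # approximate solar flux above the atmosphere
    efficiency = 0.12         # assumed: mid-range early-1970s panel
    target_output_w = 5e9     # assumed: 5 GW of collected electrical power

    area_m2 = target_output_w / (solar_flux_w_m2 * efficiency)
    print(f"Collector area needed: ~{area_m2 / 1e6:.0f} km^2")

    With those assumptions, a platform would need on the order of 30 square kilometres of panels – which is why a platform like the film's would have to be titanic, and why every point of efficiency mattered.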

    [6] This film has no OTL analogue, largely because microwave remained a pipe-dream IOTL. Given the 2001 influences in the piece, the film probably would be set in around 2001, which would give Sunburst Industries one whole decade to design, assemble, and launch their titanic platform before the 12 years of operation are due to begin.

    [7] Yes, Staten Island is even more residential than Queens (if you don’t count the garbage dumps for which Richmond County was best-known in this era), but it doesn’t have the same prestige as Queens (well, by non-Manhattan standards, anyway). Recall that Archie Bunker lived in Astoria, Queens, along with his son-in-law, Meathead…

    [8] Most of these “commercials” and “press releases” are thinly-veiled satires of actual commercials and press releases from the pro-microwave lobby in that era ITTL.

    [9] Recall that Peter Bogdanovich directed The Exorcist ITTL – which means that Friedkin never acquired the necessary cachet to direct The Sorcerer.

    [10] This is the first film in which Dreyfuss appears after the conclusion of Those Were the Days, it having been filmed in 1978.

    [11] IOTL, the Greenpoint oil spill was discovered by the Coast Guard in September, 1978; ITTL, it is discovered in 1976 in preparation for the bicentennial celebrations.

    [12] Greenpoint had nothing to do with Reagan’s decision, although it certainly provided very handy cover for him to enact it.

    [13] Desilu Post-Production has become so venerable an effects house by this point that even the continued employment of Marcia Lucas by that studio has not prevented them from continuing to get steady work – though, of course, she has nothing to do with it.

    [14] This was, of course, The China Syndrome IOTL, the anti-nuclear film that preceded (by mere days) the Three Mile Island incident (which does not take place ITTL).

    [15] Buckminsterfullerene was discovered in 1985, IOTL, though also by a team at the University of Sheffield (what a shocking coincidence, wouldn’t you say?).

    ---

    Special thanks to e of pi, who is effectively the co-writer of this update, having actively contributed to the development of every topic covered herein to some degree or another: going back well over a year in the case of the microwave prototype; and less than 48 hours, in the case of The Greenpoint Dilemma – in which case he proved himself the Lawrence Kasdan to my George Lucas, taking my kernel of an idea for a “microwave China Syndrome” and helping to flesh out the titular Dilemma far better than I could ever have done on my own. For those of you familiar with his prodigious penchant for puns, it will not surprise you to learn that the title was his doing as well.

    Yes, alas, microwave is dead; and what’s worse (in the eyes of some) is that nuclear seems very much here to stay. And I’ve butterflied Three Mile Island, too – will something much worse be unleashed down the line, or will these tighter safety regulations stick, and truly work to prevent catastrophe on the scale of Chernobyl or Fukushima?
     
    You've Come A Long Way, Baby
  • You’ve Come A Long Way, Baby

    Ever since the Sexual Revolution of the late 1960s, which sparked (among many other things) the Women’s Liberation Movement, the place for women in society, and in popular culture, had been in constant flux, their depiction in the media experiencing seismic shifts in an attempt to keep up with the times, despite widespread uncertainty about what “womanhood” looked like in a very chaotic era. This began as early as the mid-1960s, with the prototypical “single young working woman” show, That Girl, bearing the torch for newly-liberated women everywhere. In a keen example of ideology making for strange bedfellows, it did so alongside even the more fantastic action-adventure programming popular at the time, as shows like Star Trek, Mission: Impossible, and The Avengers all featured female characters who were competent, work-oriented professionals, defined by their place within their organizational hierarchy, as opposed to their relationship to a husband or father – and who were not afraid to be “sexy” in the performance of their duties, a far cry from the demure (some would say “puritanical”) demeanour of those teachers, nurses, and secretaries featured in most shows from the 1950s and early 1960s. However, these groundbreaking shows stood in stark contrast to certain others, such as Bewitched and I Dream of Jeannie – in which magical women with exceptional powers were entirely subservient to completely normal (and rather domineering) men. This was demonstrative of the rapidity with which these changes were taking place, and of the uncertainty on all sides as to whether they would last. That the era was one of great confusion about woman’s place in society was perhaps best (and most succinctly) illustrated by the Helen Reddy dichotomy: that popular singer had performed the feminist anthem “I Am Woman” in 1972, with the single reaching #1 on the pop charts at the end of the year. However, she also reached #1 in 1974 with “You and Me Against the World”, a song about a mother’s devotion to her child. [1] They were the two biggest hits of her career.

    It wasn’t until The Mary Tyler Moore Show in 1970 that what became the iconic “working woman” of the new decade finally seemed to “stick” in popular culture; it was perhaps no coincidence that, by this time, Jeannie was off the air and Bewitched was reduced to recycling scripts from earlier seasons. Ironically, though, the seemingly definitive status quo that had emerged on the small screen was not an entirely faithful reflection of the society which Mary Tyler Moore was attempting to represent. At the conclusion of the overseas quagmire in early 1969, and as young men returned home to settle down with their loved ones (timed perfectly with the maturing of the oldest of the Baby Boomers), the birth rate (in decline for the last several years) began to rise again. The Manufacturing Miracle and the overall prosperous economy seemed to indicate a return to “The Best Years of Our Lives”, as had been the case for the previous generation. That said, this time around there was not nearly so strong a stigma for women seeking employment as there had been in 1945. In fact, a not-insignificant number of women sought work in the factories, warehouses, and loading docks of America, though women on television usually sought more white-collar, service-industry jobs instead. [2] Mary Richards worked as an associate producer at a television station; Gloria Higgins on Those Were the Days was a clerk at a department store. Gloria, however, did represent the lot of many young, married working women at the time, supporting their husbands or boyfriends through school (often on the G.I. Bill) rather than bettering themselves strictly for their own sakes. Such employment was therefore utilitarian and pragmatic.

    Television, like most media, did not tend to dramatize the mundane unless doing so was the whole point – and in the muted atmosphere of the 1970s, it had surely become so. The contrast between That Girl (one of the sunniest-ever depictions of “glamorous” New York City, which by that time faced rampant crime and net emigration) and Mary Tyler Moore (a cautiously optimistic show set in a typical Midwestern city) was obvious. Mary Richards was sweet, friendly, and completely non-threatening, but she was also unmarried and childless, and this did not change at any point throughout the show’s run. In fact, the (all-male) production team stubbornly refused to even consider such an option, though in the face of mounting criticism against the “anti-family” Paramount Television – which, prior to the mid-1970s, had starred only singletons, divorcés, or childless couples in all of its shows, save for the anomalous Room 222 – they relented and scrapped the planned divorce between Lou Grant and his wife, Edie. In fact, Edie Grant even carried over onto the spinoff, Lou Grant, though since it (like Mary Tyler Moore) was a work-oriented sitcom, she seldom appeared in the flesh, and was infrequently mentioned. [3]

    Inevitably, the strong reaction against Paramount skewed the depiction of women on its shows relative to the changing reality on the ground. By 1977, when Mary Tyler Moore had concluded, more and more women were seeking employment, as the birth rate once again declined, and the economy began fluctuating, making stay-at-home motherhood a less attractive proposition; the average age at first marriage was also on the rise. But Paramount, suitably chastised, had decided to tread carefully from then on; in its final season, the formerly childless couple in Barefoot in the Park had a son, and Rhoda almost immediately became pregnant upon getting married in her eponymous sitcom spinoff (which naturally earned it a reputation as the anti-Mary Tyler Moore), giving birth to a daughter. This spawned a famous in-joke amongst the higher-ups at Paramount, as the son had been named Grant and the daughter Mary, after the married couple who formed the backbone of the studio; surely, those two babies would someday be destined to wed and have children of their own. In fact, executives delighted in suggesting hypothetical names for those future offspring, drawn from any number of the more odious “family values” critics who had denounced their programming. Those Were the Days, one of the hyper-realistic Tandem shows, avoided the bouncing between extremes of its rival studio: Gloria Bunker had married Richard Higgins soon after high school, getting a job to put him through school as her father (reluctantly) put a roof over their heads. They had one child together before both halves of the couple decided to focus more on their careers, eventually resulting in their departure from New York City for sunnier pastures elsewhere.

    But the decline in marriages and birth rates as the decade wore on resulted in shows like Police Woman (noted to be a personal favourite of the Speaker of the House, Gerald Ford, who quite famously once put an early end to proceedings in order to get home in time to watch a new episode of the show [4]), and its spiritual sister, The Alley Cats – which was both more absurd and more escapist than Police Woman, reflecting a move away from the grounded, realistic shows that dominated in the decade’s earlier years. Notably, both shows depicted their women protagonists as subservient to men, but not in any way denigrated by their superiors on account of their gender. In fact, the “feminine wiles” of the characters on Alley Cats were essential to their success, as had been the case on the earlier Mission: Impossible. However, grittier, less glamorous fare endured; Penny Marshall followed up Those Were the Days with an equally envelope-pushing sitcom, Inside Straight. Created with her producing partner Linda Bloodworth, it cast Marshall as a thirty-something divorcée whose husband, fed up with her gambling addiction, had walked away. With few other options and armed with only her associate degree in interior design, she chose to start her own business – sublimating the thrill of risk from gambling into entrepreneurship, especially in the trying economic times that marked the era. [5] The depiction of Marshall’s character as a divorcée was an explicit callback to the original plan for Mary Richards to have been depicted as one, before CBS executives insisted that audiences would assume that the character had divorced from Rob Petrie (on The Dick Van Dyke Show). Here, the equally potent assumption that Gloria Higgins had finally ditched the Meathead was left unchallenged – either viewers were more sophisticated, or (more likely) this show’s producers were more stubborn. Over the course of Inside Straight’s run, both Richard Dreyfuss and “Daddy”, Carroll O’Connor himself, would appear in guest roles. [6] Even more demonstrative of the enduring “grittiness” was The Birds of Baltimore, the American adaptation of the British Liver Birds program, which starred two single women dockworkers living and working together in Baltimore, considered the most apt analogue to Liverpool. The title referenced not only the original version (as “bird” was UK slang for a young woman), but also the Baltimore Orioles baseball team.

    The importance of sex appeal as a marker of the liberated woman could hardly be overstated. This was the era of “bra-burning” (which never actually happened in a literal sense, though the symbolism of such an event was readily embraced for metaphorical purposes). Pride in one’s own, natural self was a recurring theme of the civil rights struggles from the late 1960s onward – “Black is beautiful”, “Gay is good” – and this naturally extended to womanhood as well. Miniskirts were in, as Star Trek so famously demonstrated (in fact, early episodes had women wearing uniform pants, just like the men, but these were later discarded). But even more important was what was out: brassieres. This helped to cement the “bra-burning” legend (women didn’t wear bras because they had burned them, or so the logic went), and it certainly contributed to how fashions of the era were remembered. If anything, it seemed a foundational principle of how women were dressed in television and film at the time: from the very outset, costumers took great pains to ensure that titillation and liberation went hand-in-hand. In fact, this ideal was codified, so to speak, in the “Theiss Titillation Theory”, named for Trek costumer William Ware Theiss: “the degree to which a costume is considered sexy is directly proportional to how accident-prone it appears to be”. [7] It certainly explained the fundamental appeal of The Alley Cats, not to mention Three’s Company. Those Boomers – male and female – who had not yet married tended to be wealthier (and less in need of financial support from a spouse) and more educated (putting off marriage and children until able to support themselves financially) – in other words, the demographically ideal viewers. [8] Even the oldest Boomers had not yet reached the age of 30 by the mid-1970s. And they were legion – the largest cohort in history. Appealing to the crème de la crème of such a massive crop was irresistible to programmers, and this informed their choices of which shows to put – and keep – on the air.

    Marcia Lucas – who, along with her husband, George, was in the process of suing Paramount Pictures on behalf of Lucasfilm for that company’s rightful share of the profits from The Journey of the Force – found herself the primary breadwinner of her family when the pair were blackballed from Hollywood. Her employer, Lucille Ball, had enough pull that Marcia’s position as staff editor for the studio’s venerable post-production house was secured – though Desilu was given an ultimatum by the collective major studio chiefs: Marcia would not be allowed to edit any movies, on threat of Desilu Post-Production being dealt an industry-wide boycott. So she was left to work solely on the in-house productions for the studio, primarily Three’s Company. The characters on that show – a slapstick farce very much in the vein of I Love Lucy but, once again, with added sexuality, befitting the era – were definite types: Janice, played by Susan Anton, was sexy but aloof and totally oblivious to her effect on men; Chrissy, played by Pam Dawber, was goofier and earthier, basically a toned-down “Lucy” type; Mrs. Roper, played by Betty Garrett, was assertive and man-hungry, trading barbs with her cold-fish husband. The central character was Robby Tripper, played by John Ritter, but the woman characters were each given their own plots and scenes without Robby (or their ornery landlord, Mr. Roper), and often discussed topics other than them, such as their jobs, or their desire to make the rent. Lucille Ball loved Three’s Company, easily her favourite of the shows that Desilu produced; and in her way, appointing Marcia to supervise the editing of that show was a distinct honour. Nonetheless, in absolute terms, it could not be perceived as anything but a career setback for a two-time Academy Award-winning film editor. Still, Marcia provided another interpretation of the working woman of the 1970s: unlike her, George did not have steady work to fall back on once the lawsuit rendered them both personae non gratae, leaving the household income squarely on her shoulders. Being the higher income-earner within the couple was something else that Marcia and her employer had in common, as the decade came to a tumultuous close. In more ways than one, Marcia Lucas would prove a new model for the new woman of the new decade…

    ---

    [1] “You and Me Against the World” only reached #9 IOTL, failing to become one of her three chart-toppers. ITTL, there are a lot more children and mothers who would appreciate this song at the time of its release, contributing to its success. Now, many people would claim that “I Am Woman” and “You and Me Against the World” are not songs with mutually exclusive themes, which is certainly true; Reddy herself obviously never thought of them that way. However, wags can’t help but note the irony of a singer hitting #1 with the defining anthem of Women’s Lib (which is to say, liberation from being identified, valued, and judged solely as a wife and/or mother) and then reaching that same plateau with one of the great maternal love songs, not two years later.

    [2] Many working women on early-1970s television, IOTL and ITTL, were in “pink-collar” jobs. However, ITTL, that term does not exist, for the simple reason that the proportion of women working blue-collar jobs is considerably larger. IOTL, one of the first hit shows to depict women working blue-collar was, ironically, the 1950s throwback Laverne & Shirley, in which the eponymous duo worked as bottle cappers at a Milwaukee brewery. However, that series does not exist ITTL.

    [3] Edie Grant appeared less often, and had less impact on the plot, than Liz Miller did in the later seasons of Barney Miller IOTL.

    [4] Based on an OTL story about (President) Ford re-scheduling a press conference so as not to miss an episode of Police Woman.

    [5] Bloodworth (as Linda Bloodworth-Thomason) co-created a series with the premise of women running an interior design firm IOTL as well: Designing Women.

    [6] Many of O’Connor’s preferred Those Were the Days writers also got gigs working on Inside Straight, in contrast to the OTL situation behind the Archie Bunker’s Place spinoff Gloria, wherein his people were forced out of the production by the network, resulting in his (rather obstinate) decision to have no further involvement with that series.

    [7] The Theiss Titillation Theory, a cornerstone of the costume design principles behind Star Trek, was widely disseminated while the show was in first-run.

    [8] The definition of the “ideal” consumer has remained remarkably static over time. Generally, the younger you are, the less likely you are to be set in your ways, which means you’re more willing to try new products or services; the more affluent you are, the greater your disposable income. Indicators of either age (younger people tend to live in more urban markets) or wealth (level of education correlates highly with annual income) tend to strongly influence the nature of the products or services being advertised.

    ---

    Thanks to e of pi for his assistance in the editing of this update, and for his terrific pun of a title suggestion as the title of Marshall’s star vehicle sitcom!

    I thought I would post this retrospective on the depiction of women in popular culture in the 1970s, as we enter this new decade. Along with the additions, I suggest that you take note of a deliberate omission: Maude, which was cancelled several seasons early ITTL, and has no second life in syndication. In all, there’s less of a cultural backlash against Women’s Lib ITTL, because the steps it takes are more tentative, less of a united front than IOTL. However, the degree to which progress has been made can’t really be compared qualitatively to OTL, because (of course) such a metric is highly subjective, and wholly dependent on individual goals and values.

    Thank you all for your patience and understanding in waiting for this latest update! Coming up next time…
    THE TRIAL OF THE CENTURY!!!
     
    Appendix C, Part IV: The Trial of the Century
  • Appendix C, Part IV: The Trial of the Century

    A long time ago, in a courthouse

    far, far away (from the Eastern Seaboard)


    [Image: The United States Court House, Los Angeles]
    The United States Court House at 312 North Spring Street, Los Angeles, California, which houses the United States District Court for the Central District of California, Western Division. The “Trial of the Century” was argued and decided here in early 1980.

    Thursday, April 6, 1978. A day that would live in infamy, a red-letter date in the history of Hollywood. For it was the day that George and Marcia Lucas, on behalf of Lucasfilm Limited, filed suit against Paramount Pictures Corporation, controlled by Gulf+Western Industries, owned and operated by Charles Bluhdorn. The battle lines were drawn through Tinseltown swiftly, and brutally. Just days after their unqualified triumph at the Academy Awards, in which both halves of the creator couple went home with Oscars, the Lucases found themselves blackballed by an entire industry, at least for the most part. Marcia, who had worked as an editor for Desilu Post-Production since it commenced operations in 1971, remained with that outfit, as it was owned not by any film studio but by the television-oriented Desilu Productions, which (though it, too, engaged in the creative accounting practices which drove the industry’s profit margins) stood to lose far less from the precedent set by a successful lawsuit, especially given their plentiful legitimate revenue streams from syndication and merchandising, along with their deals with RCA and Syzygy. Far more importantly, Lucille Ball liked Marcia Lucas, and always had; she no doubt saw something of herself in this younger woman’s character, and something of her ex-husband (and former creative partner) Desi Arnaz, still the great love of her life (and a close friend) despite their acrimonious and very public divorce, in Marcia’s husband George. But Desilu itself, though that studio had a great deal more clout and prestige than the Lucases did, would not be allowed to keep Marcia in its employ without consequences. Her Editing Unit B, which had focused largely on movies since American Graffiti, was “demoted” back to television – however, as none of the shows produced by the television divisions of the major studios would allow Marcia to work on them, Unit B became the de facto in-house unit, working on Rock Around the Clock, Three’s Company, The Muppet Show, Eunice, and Deep Space, among others. Though this reduced the burden on Unit A – the original television unit, headed by the multiple Emmy-Award-winning Donald R. Rode – it necessitated the creation of a Unit C to allow Desilu Post-Production to continue to work on motion pictures without tying Marcia Lucas to them. Ball herself took this in stride; Desilu Post-Production (like most every division of the studio) was thriving; why not expand further? Granted, one of their three units now functioned well below capacity, but Brandon Tartikoff, the studio’s VP of Production, was a man positively brimming with ideas, many of them good ones. No doubt he could get more shows off the ground for Marcia and her team to work on in the coming seasons.

    ---

    George Lucas, for his part, decided to throw all of his energies into winning the lawsuit against Paramount. He had a great deal more pride than Marcia; theoretically, he could still get work on independent films and television commercials, but he felt them beneath his stature. Even prior to filing the lawsuit, still in the afterglow from the massive success of The Journey of the Force (not that he was personally seeing any of the dividends therefrom, of course), he had spent every spare moment searching for even a remotely credible litigator who had both the guts and the talent to take on one of the largest conglomerates in the world, and quite possibly emerge victorious. The ambulance-chasers had lined up to take on Paramount, no doubt hoping that the resultant publicity would bring them plenty of business, but George had received a valuable piece of advice from Marcia: “Never listen to anyone if all they do is just agree with you”, and it had guided his decision-making process. This had resulted in a great many rejections on his part. Another problem that George was facing was that most reputable lawyers demanded a massive upfront retainer and an exorbitant hourly wage ($50 per hour was not uncommon), and nearly everyone he had wanted to represent him had held firm on those two points. [1] Eventually, he resorted to the Yellow Pages in hopes of finding the right man for the job. He was more than halfway through the alphabet when he stumbled on the first prominent listing under T: “Taylor & Associates, specialists in contract law, contingent and alternative fees considered”. He couldn’t believe his eyes when he saw who was listed as the managing partner: Andy Taylor. The same name as the Sheriff of Mayberry from The Andy Griffith Show. He wasn’t sure if this was some kind of a sign, but figuring he had nothing to lose, he decided to call on Mr. Taylor, Esq.

    Andy Taylor was, in fact, a “simple country lawyer”, from rural Maryland (which also qualified him – like his television namesake, who hailed from the fictional town of Mayberry, North Carolina – as being from the South). [2] But he was also a smart cookie: he had moved to Los Angeles to attend the USC Law School on a scholarship, and had in fact been a roommate of his fellow law student – and future Congressman – Marlin DeAngelo, with whom he remained close friends. Taylor was intrigued about the possibilities of taking on Paramount, but he was not sanguine about the chances of this lawsuit being successful. “You have a case, Mr. Lucas, there’s no doubt about that,” he remarked, upon studying the original contract with Paramount. “But the odds of you beating the army of attorneys Mr. Bluhdorn will unleash on you if you bring this to court are… well, I don’t even think there’s a word for how small they are. Even ‘negligible’ or ‘infinitesimal’ are probably highballing it, really.” Actually, he was lowballing it, but that was the custom in the legal profession. He knew that he very well could win the case, though it would certainly be a very steep uphill climb.

    George sighed, this not having been the first time that someone had attempted to dissuade him from proceeding. “Look, Andy, I know what they’ll be throwing at me. I’ve built my entire career on beating the odds. They told me Graffiti would never work. Then they told me Journey of the Force would never work either. Called it ‘Lucas’s Folly’, even. Laughed at me before it even came out. Never gave it a chance. And now every halfway-decent lawyer in the entire Southland is telling me there’s no possible way I can beat Charlie Bluhdorn because he’s got more money – my money! – and his lawyers are better than anyone I could possibly afford. Please, just do me this one favour – never tell me the odds, all right? I’ve heard it all before. You said you think there’s a case here. Are you willing to represent Lucasfilm? That’s all I want to know.” It was a vanishingly rare moment for George Lucas – a moment of lost composure, of the impeccably professional, workaholic veneer cracking, and his baser instincts finally emerging after lying dormant for so long. Perhaps only his wife had ever seen his emotions laid bare like this, though no doubt even she would be positively shocked at this display, had she been present.

    And to his credit, Taylor was impressed. “Well, Mr. Lucas,” he said, “I run a pretty small firm here, I make a fairly modest living – for a big-city lawyer, anyway – and I try to help the little guy. But I guess the little guy can come in all shapes and sizes, just as long as the other guy is bigger. And they surely don’t come much bigger than Gulf+Western.”

    “Does that mean you’ll do it?”

    “We’ve got enough of a case that they won’t impose Rule 11 sanctions on me if I try and bring it before a judge,” Taylor replied. [3] “And you may have come to the right place after all, because I think I have an ace up my sleeve. I happen to know a forensic accountant who really has a chip on his shoulder about how the Hollywood studios report their profits and losses. As a matter of fact, he’ll talk your ear off about it if you give him half the chance. He could be our star witness.”

    George wasn’t sure what to make of the notion of an accountant being his star witness, but he knew beggars couldn’t be choosers. “Well, Andy, I look forward to working with you,” he said, after having gathered his thoughts. “I can’t make it official until I confer with my wife and business partner, Marcia, but I have a feeling she’ll like you when she comes back to meet you herself. I hope she does, anyway – she’s the best judge of character I know.”

    Taylor smiled indulgently. “I look forward to meeting Mrs. Lucas,” he said, “and the three of us drawing up a contract of our own. One which won’t end in heartaches.”

    As far as George was concerned, none of them had anything to lose.

    ---

    As it turned out, Marcia did like Andy Taylor, confiding to George that she found him to be “a real straight-shooter”. And with that, plans to bring the lawsuit forward in federal court commenced in earnest. On Thursday, April 6, 1978, a complaint was filed with the United States District Court for the Central District of California, Western Division, as Case No. 1:78-CV-00328-WJF, or more formally Lucasfilm Limited v. Paramount Pictures Corporation (just Lucasfilm v. Paramount for short, and the media universally referred to the case by this name), and served on Paramount that very same day; this allowed the story to lead the entertainment news and trade papers the following day. The Hon. Warren J. Ferguson would preside over the proceedings; in an odd coincidence, Ferguson shared his name with a (minor) character on The Andy Griffith Show, just as the chief counsel for the plaintiff did. [4] The complaint alleged breach of contract, fraud, negligent misrepresentation, and civil conspiracy on the part of Paramount; upon the deadline 30 days later, on the 6th of May, Paramount filed an answer to the complaint denying any and all allegations contained therein, and taking the opportunity to file a counterclaim of their own against Lucasfilm, for breach of contract. The stakes, once again, had clearly been raised. By this time, the media outlets had fallen into camps, depending (unsurprisingly) largely on their respective relationships with both companies, and on their economic ideologies. The Wall Street Journal vociferously supported Paramount, as did the Chicago Tribune and (naturally) the Los Angeles Times. (No major newspaper on the West Coast dared oppose Paramount – and therefore Hollywood – for fear of being blackballed). The New York Times and The Washington Post claimed neutrality; only populist tabloids (such as the New York Post) were unabashedly pro-Lucasfilm, which fit the David-and-Goliath tenor of the case. All of the Hollywood trade papers lambasted George and Marcia Lucas so viciously that Taylor threatened libel suits more than once; fortunately for them, he and his firm had their hands full with their present workload. Paramount’s chosen law firm – Gibson, Dunn & Crutcher, one of the largest, most prestigious (and most exorbitantly priced) in the Southland – did not have this problem. [5] This massive disparity of manpower would provide the defense with a very powerful advantage for the duration of the case; economies of scale applied to the judicial process as much as to anything else.

    Lucasfilm gave their answer on the counterclaim after another month had passed; just like Paramount, they had waited until the day of the deadline. There would be no backing down; the suit would be moving forward. It was on the twenty-sixth of June, a Monday, that a scheduling order was entered which set a nine-month period for “discovery” (in other words, evidence collection, through fact-finding and expert testimony) effective starting on the 3rd of July, 1978 (also a Monday). Though Judge Ferguson had a judicial record indicating that he might have been more sympathetic to the plaintiff, the relatively narrow window was in this case seen as highly preferential to the defendant; on the whole, though, the Judge seemed to be taking great pains to appear as neutral and unbiased as possible. “Discovery”, in legal terms, referred to three key components for the purposes of this case: interrogatories, or written questions which the other side is theoretically obligated to answer; requests for admission, or written demands that the other side admit or deny specific facts (despite their popular association with the back-and-forth verbal jousting of trial theatrics); and document requests, which were self-explanatory. The last of these three would prove far and away the most time-consuming aspect of the prep work leading up to the trial for Taylor & Associates. On July 13th, Paramount issued their interrogatories, requests for admission, and document requests – Lucasfilm made their reciprocal requests on the 17th, after that weekend.

    It was in August of 1978, several months after the suit had been filed, that the wheels were firmly set in motion. It marked the beginning of a long and contentious legal tug-of-war between both sides. What Taylor and his team wanted were “hot documents” from Paramount, which would prove their complicity in the willing fraud of Lucasfilm (resulting in the breach of contract), perhaps as part of a civil conspiracy. Paramount, naturally, was extremely reluctant to share any of its sensitive internal documents, being part of a highly competitive industry. But by the end of the month, Taylor was finally able to file a Motion to Compel, which would force Paramount to provide all the information that Lucasfilm needed to make their case. Two weeks later, the studio answered it, and the document flow finally began in earnest. But on Monday, September 25, 1978, at 4:30 PM, what employees at Taylor & Associates would forever after remember as “the deluge” arrived: a rental truck pulled up to the firm’s office, offloading hundreds of bankers’ boxes, full to bursting with papers, file folders, and stuffed envelopes, among other things. All told, over a million pages were left behind when the truck pulled out of the Taylor & Associates parking lot. Every inch of space within their office was crammed full of boxes; Andy Taylor was forced to hire additional staff, needing the manpower to work through as many documents as possible before the end of the week. [6] Needless to say, this was not a reciprocal exchange; even if Lucasfilm did have as many documents for the other side to examine (and they didn’t – not even close), the attorneys for Paramount (and their staff) would have been able to sort through them far more quickly.

    That Friday, a scant four days later (though effectively just three, in fact), Taylor was forced to report to Judge Ferguson (and the defense) in a hearing to confirm whether or not the documents produced from the Motion to Compel had any relation to those items he had requested. Obviously, what with re-arranging their office to find space for the excess material, and bringing on new workers to decipher what they had, his staff had barely begun to scratch the surface of the massive mound of paperwork – derisively nicknamed “the Tower of Babel”, in keeping with the Old Testament flavour of “the deluge”, and also because virtually every document in it was useless (though technically provided under the terms of the Motion to Compel, most were obviously irrelevant to the case at hand, and had been chosen for precisely that reason). Only a few thousand documents had been gleaned for information, and none were anywhere near “hot” enough for Taylor to properly assess whether Paramount had complied with the motion. Sheepishly, he admitted this to Judge Ferguson. “Your honour, I regret that we’re still reviewing the documents Paramount were good enough to send to us,” he said, right after the chief counsel for the defense had gloatingly informed the judge that Paramount did indeed send over one million distinct pieces of information to Lucasfilm’s counsel. “I honestly have no idea whether or not they complied with the Motion filed.” George Lucas, who was in attendance at the hearing, grimaced at this, as did Taylor himself.

    Judge Ferguson, meanwhile, was remarkably stone-faced and impassive. “Then I have no choice but to find for the defendant. Lucasfilm will compel no further information from Paramount for the duration of this case. Be satisfied with what you have now, Mr. Taylor, there will be no more forthcoming.” And for all Taylor knew, Paramount had given them one million pages worth of nothing. Even with killer testimony from his star witness, there were still ample grounds for a summary judgment against the plaintiff.

    And as the weeks wore on, it indeed looked very much like Paramount had sent them a whole bunch of nothing. As Taylor was reading through weekly reports of the ticket sales of Journey of the Force at each one of the couple of thousand screens on which it had played over the course of its theatrical release, he ruefully remarked, “Well, what do you know, it looks like that movie really did play in Peoria.” That was about the best news he had to report to the Lucases by Columbus Day. George and Marcia weren’t thrilled, naturally (they had already heard about how much their film had appealed to Middle America), and for the first time, George had serious doubts about whether the lawsuit stood even the faintest hope of succeeding. Had Taylor hoodwinked him? Was that “simple country lawyer” act, in fact, not an act at all? Was he in way over his head? Taylor often wondered that himself; for the first time, he was a complete no-show for the entire campaign run of his old friend Rep. DeAngelo, though the Congressman still managed to win handily without him. (“I think you would probably need my help way more than I need yours anyway,” the Congressman had said – and he was right.)

    But October and November marked the deposition period – in which each side would interview the key witnesses of the other (excluding outside experts). This was done outside of the courtroom, and away from the presence of Judge Ferguson. Despite aggressive questioning by both sides (as they were going on what was commonly known as a “fishing expedition”, searching for weaknesses), little could be gleaned from any of the witnesses that could be perceived as devastating to the case of either side, until Alan J. Ladd, Jr. gave his sworn deposition. He was the Paramount executive who had green-lit The Journey of the Force in the first place, having brought Lucas to Paramount in exchange for the promise to make his film, and (accordingly) had been made privy to every aspect of production, including all matters financial. Under intense questioning, Ladd revealed that he had engaged in meetings with other Paramount executives during which the question had been raised of whether Lucas was aware that what he perceived as “profits” and what Paramount recognized as profits were not, in fact, one and the same. Taylor seized on this. “And did you turn these minutes over when Paramount was handed the Motion to Compel?” he asked, trying his best to couch his desperation in a practiced, conversational drawl.

    “Nobody ever asked me for them,” Ladd replied, bemused.

    And with that, Lucasfilm was back in the game again. As a result of Ladd’s sworn deposition, Taylor was now entitled to file additional interrogatories and document requests, this time with specific regard to the minutes referred to by the witness for the defense. Paramount, who were for the first time legitimately on the defensive, stalled and eventually produced the relevant documents, though heavily redacted; at the same time, they floated the offer of a settlement, which would entail a moderate-sized lump sum followed by a very large number of smaller instalment payments. Taylor was insulted on behalf of the Lucases, but reluctantly brought the offer to their attention. Marcia, wishing to return to editing for the movies (“I liked working on Rock Around the Clock much better when it was called American Graffiti,” as she said to her husband), considered proposing a counter-offer, but she knew that George would hear none of it. She was right.

    “Let it ride,” was all he said on the matter. And so they did.

    Taylor and Paramount’s attorneys met once again on December 11, this time with Paramount reluctantly producing the minutes in a meaningful (and fully incriminating) format, thus effectively acknowledging that they had screwed Lucas out of his rightful earnings by any honest (and ethical) definition of the term. As far as the Lucases were concerned, it made for a delightful early Christmas present. However, Paramount formed part of an industry where profit had an entirely different meaning, one which was standard and universally accepted, and they would argue that when they went to trial… after the Lucases rejected their second offer to settle. Unfortunately for Paramount, the notoriously stingy Bluhdorn didn’t even raise the potential payout high enough to match Marcia’s originally planned counter-offer; she became convinced that they’d never get what they deserved unless they took the case to trial. Fact discovery closed with the end of the year; as of New Year’s Day, 1979, Taylor could make no further requests of Paramount for any additional documents. He had weakened their case, to be sure; but by no means had he hobbled it. Things could have been far worse.

    For example, the “ace in the hole” on the part of Lucasfilm, the star expert witness, was obliged to submit to questioning by the defense in a deposition, which took place on February 15, 1979. This deprived the plaintiff of the element of surprise, but fortunately the witness acquitted himself admirably, proving himself unwilling to be shaken, rattled, or rolled by the high-priced army of lawyers at Paramount’s disposal. Paramount itself could produce no expert witnesses who possessed the same level of righteous indignation, which was a definite preemptive win for Lucasfilm. Indeed, Taylor was surprised that Paramount did not make yet another attempt to settle.

    Nevertheless, after the close of expert discovery (and therefore of the entire discovery process) at the end of March, Paramount filed a motion for summary judgment at the beginning of May, hoping to skip the ordeal of bringing the case to trial, and believing Lucasfilm’s case (though stronger than originally perceived) to be weak enough not to withstand the judge’s scrutiny. All through the summer, the two sides exchanged procedural volleys until Judge Ferguson, walking the same fine line he had walked for the duration of the pre-trial proceedings, granted Paramount’s summary judgment motion in part (dismissing the Lucasfilm claim of civil conspiracy – one reason that Bluhdorn’s studio was thrilled to have dodged the bullet of surrendering further documents in discovery), but also denied it in part, allowing the Lucases to proceed to trial on the breach of contract, fraud, and negligent misrepresentation claims. He made this decision on October 12, 1979; the trial was scheduled to begin three months later.

    ---

    Finally, after nearly two years of legal maneuvering and wrangling, of sorting through mountains of documents, and of dozens of depositions, the trial proper was scheduled to begin on Monday, January 14, 1980. Taylor, aware that the David-and-Goliath factor of the case would attract popular support, insisted that the civil case be heard by a jury, as opposed to a “bench” trial in which the verdict would be rendered by Judge Ferguson. Therefore, despite his inestimable influence, Ferguson would not be directly responsible for the outcome of the case. That the case would be decided by a jury had been known from very early on; however, Ferguson’s presence, and the impact of his judicial decisions, had been so closely scrutinized by the media that much of the public had been under the impression that he would also render the verdict.

    Taylor had an obvious advantage over his more polished competition: he possessed a natural, seemingly effortless charisma and the ability to lead his opponents to underestimate his abilities. His rural, Southern heritage was disarming, and proved invaluable for the “little guy takes on the big bad machine” bent of his case; he simply but powerfully urged the jury in his opening argument to ensure that justice be done, irrespective of the massive financial advantage enjoyed by Paramount, Gulf+Western, and indeed Charlie Bluhdorn himself, over the downtrodden creator couple of George and Marcia Lucas and their tiny company, who had earned (through dint of hard work) the recognition of their peers and the adoration of filmgoing audiences, and who had been cruelly deprived of their well-deserved financial compensation by the ruthlessness of an already too-powerful film studio – one which had learned nothing from the changes forced upon the industry ever since the Miracle Decision and the Antitrust case of thirty years prior (in which, fittingly enough, Paramount itself had also been the defendant). This appeal to justice and fair play, though rigorously grounded in the spirit of government statute and in precedent set by carefully-chosen examples of case law, was decidedly more emotional in its orientation than the counter-argument provided by the chief counsel for the defense, who (naturally) stuck with a very “letter of the law” interpretation, emphasizing that the contract was reflective of “clearly established and universally recognized” precedent, all but stating outright that George Lucas was an idiot if he could not understand this. The jury was obliged, so the defense argued, to punish George and Marcia Lucas and their attorney for their patently absurd lawsuit (by finding them liable for damages in the Paramount counter-suit). The wildly divergent strategies employed by the two sides emphasized their differing stations: an observer of the proceedings would later describe their opening arguments as “something out of The Devil and Daniel Webster”.

    Taylor built his case on his key piece of evidence, and on the testimony of his expert. The evidence – those “hot documents” that he was finally able to locate after slogging through boxes and boxes of irrelevant scraps of paper, thanks to the revelation from the deposition of Paramount executive Alan Ladd, Jr. – consisted of the minutes of a meeting between the officers at the studio reviewing new contracts made for their latest batch of upcoming productions, conducted in late 1976 (shortly after the Lucasfilm-Paramount deal had been signed). Multiple executives confirmed during this meeting that Lucas had been under the impression that “profits” represented revenues provided by ticket sales, less expenses incurred through production costs for Journey of the Force (the standard definition); whereas Paramount operated according to what was known as “Hollywood accounting” (also known as “creative accounting”), which calculated profit rather differently. These minutes were corroborated by inter-office memos, though they did not spell out this revelation quite as clearly or as brazenly as the minutes had done. Though the minutes might have seemed devastating to Paramount’s case, the studio planned to argue that even if Lucas were unaware, that was his fault, not theirs; their entire industry operated according to the “rules” of Hollywood accounting, and it was an accepted maxim of accounting principles that an industry could operate under standards that were different from what was generally accepted, so long as these were universally practiced. It was, in its way, a bold strategy, and one favoured by children the world over in dealing with their parents: “But everybody else does it!” That was where the star witness came in; he was the man who would ask the obvious rejoinder: “And if all the other movie studios jumped off a bridge, would you do that too?” Fortunately for Taylor, and for the Lucases, he was rather more eloquent and passionate than that playground rejoinder might suggest. His name was C.A. Baxter, and he was a forensic accountant.

    An old acquaintance of Taylor’s, Baxter originally hailed from Buffalo, but moved to the West Coast to attend the USC School of Business, where he graduated magna cum laude, receiving his CPA before accepting a position at Price Waterhouse as a staff auditor. [7] There he had his only direct experience working in the entertainment industry when he joined the team that was responsible for auditing the results for the Academy Awards [8], doing so for a number of years as he put himself through graduate school, receiving an MBA and shortly thereafter departing the firm (though remaining on good terms with his former employers) to start his own private consulting and forensic accounting business. In the years since, he had become known as an opponent of the Hollywood accounting system, irate at how the studios effectively exempted themselves from the rising income taxes that all other corporations and individuals were at least nominally obliged to pay. He had written a book on the subject, Hollywood Can’t Make Money, which had been published in 1976; it was roundly dismissed by insiders as a tawdry exposé and fell out of print. Fuelled by the publicity of the Trial of the Century and his high-profile role therein, his book would be re-published (with added material referring to Lucasfilm v. Paramount and his part in the case) before the end of 1980, reaching the New York Times bestseller list.

    During the round of depositions, Baxter had made his meaning plain, unwavering in the face of intimidation by Paramount’s attorneys. Much of his testimony was drawn from Hollywood Can’t Make Money, a tactic which he would repeat in the trial proper. Despite this, it was difficult for the defense to challenge his assumptions or get him to yield. “Who knew Mr. Smith could be an accountant?”, a Paramount executive idly wondered after the deposition had concluded. [9]

    ---

    After Baxter had been sworn in to testify, Taylor asked his expert witness to introduce himself to the court, and then proceeded to launch into the first of his many definitional questions.

    “Mr. Baxter, could you please explain revenue to the court?” Taylor asked.

    “Revenue represents the gain of assets from the sales of goods or services,” Baxter said.

    “Could you give the court an example of a good or service within the context of the motion picture industry?”

    “Well, movie tickets sold would be classified as a good, although alternatively you could describe the opportunity to see the movie itself as a service provided by the venue.”

    “And what are expenses?”

    “They represent the loss of assets or the gain of debts incurred while in the process of manufacturing, purchasing, storing, displaying, or selling the goods and services that generate revenue.” [10]

    “And could you please give the court an example of an expense?”

    “Well, usually when a film is in theatres you see commercials for it on television, or hear them on the radio. Those would be advertising expenses, because their goal is to sell more tickets to see that movie.”

    “So how are revenues and expenses linked, exactly?”

    “Well, the relationship between them is fundamental to the accounting discipline,” Baxter said. “Expenses incurred have to be matched to the revenues generated within the same accounting period, usually one year, for tax purposes. That is what enables direct links and comparisons between them, the simplest of which is that revenues minus expenses are equal to profit.”

    “And what is profit, Mr. Baxter?”

    “Profit is any revenue remaining once all costs related to the selling of goods and services have been covered. It is the net benefit of doing business. Virtually all businesses define their success or failure in relation to their profitability, and have done so throughout the history of commerce.”

    “Are these your own definitions, Mr. Baxter?”

    “No, although I do agree with them.”

    “Do you have a recognized source for them?”

    “I do. They are derived from the Generally Accepted Accounting Principles provided by the Financial Accounting Standards Board, a not-for-profit organization responsible for defining these terms in the public interest.”

    “Let the record show that the witness is referring to the Generally Accepted Accounting Principles, or GAAP, a copy of which has already been entered into evidence. Mr. Baxter, under whose authority does the FASB operate?”

    “The Securities and Exchange Commission, an agency of the United States federal government.”

    “In your estimation, as a trained and certified member of your profession, would you say that means GAAP reflects the laws and policies of the United States government?”

    “I would, yes.”

    “Would you consider it unusual for a company to not pursue profit, Mr. Baxter?”

    “By definition, all corporations seek profit; any that do not are explicitly called not-for-profit corporations, and are required to apply for tax-exempt status with the United States government. Profit is so fundamental to the operations of an economic entity that it is effectively doomed without it. In my years of experience, including during my time working at one of the largest public accounting firms in the world, I have never known any company, in any industry other than motion pictures, which has been unprofitable, even in the short-term – or even less profitable than in previous periods – and has not radically overhauled its business plan, or overturned its board of directors and replaced most, if not all, of its officers. And all of them have faced a catastrophic decline in stock prices as a result.”

    “Could you explain to the court how the motion picture industry operates, based on your expertise, and on the research conducted for your book, Hollywood Can’t Make Money, which has been entered into evidence?”

    “Yes, I most certainly can. For the last several decades of the motion picture industry’s existence, no film produced by any studio has ever shown a profit, in any year. Despite continuing to operate for decades, despite having been purchased by industrial conglomerates like Gulf+Western, which has dramatically improved their fiscal health, and despite continuing to pay out dividends – which are shares of these supposedly non-existent profits, mind you – to their owners, we would be led to believe by these studios that the Hollywood motion picture industry, as a business, has been a complete and total failure. For example, despite generating over one-half billion dollars in ticket sales since its initial release, Journey of the Force has apparently resulted in a substantial net loss for Paramount Pictures. In any other industry, such woeful mismanagement would result in immediate termination of everyone involved in that project – yet despite this so-called “New Hollywood” era, we see the same producers, executives, and officers working in each and every studio. However you choose to interpret the facts on the ground, the motion picture industry in the United States of America is fundamentally corrupt.”

    “Objection!” came the inevitable cry from opposing counsel. [11]

    “Sustained. The jury will disregard the last sentence uttered by the witness,” Judge Ferguson said. But, needless to say, juries were notoriously awful at pretending to have never heard that which they were supposed to disregard.

    “Mr. Baxter, please explain to the court, based on your research and understanding of accounting principles: how would it be possible for corporations in the motion picture industry to continue doing business without making a profit on any of the films they make, even the ones with record-breaking grosses?” Taylor continued, doing his best not to smirk.

    “They overstate their expenses.”

    “And how do they go about doing that?”

    “In any number of ways – whatever it takes to make them exceed revenues. Generally accepted accounting principles are based on an accruals system, and most of these accruals allow for the use of estimates. So any expenses that cannot be directly traced are overestimated – and then not corrected or adjusted when the actual figures come in. If that isn’t enough, then the costs of services rendered by company subsidiaries or affiliates are over-inflated – this is the advantage of increasing conglomeration in the entertainment industry; most goods and services are provided “in-house”. And if even that isn’t enough, then the unallocated costs incurred from different film projects are treated as expenses for the ones that generate more revenue. Ironically, this means that losses on movies that are genuinely unprofitable are therefore shown as much smaller than they really are, because they’ve been moved elsewhere. But in all cases, a studio will find the expenses they need to exceed revenues.”

    “And what are the negative consequences of this?”

    “Well, for one thing, companies don’t pay income tax on losses, only on profits. And the revenues generated by the film industry speak for themselves.”

    “Let the record show that a report of industry-wide box-office grosses for the last decade has already been entered into evidence. Please continue, Mr. Baxter.”

    “If a company is misleading about which of its goods or services is unprofitable, that will impact the decisions made by shareholders. It will also mislead creditors, who might be inclined to make decisions to loan money based on faulty risk assessment profiles. And it prevents potential investors from getting a clear picture of a company’s finances. The proper flow of economic resources is entirely dependent on transparency, and I can’t think of a single word less apt to describe the accounting practices that are prevalent throughout the motion picture industry.”

    With that came another objection from the defense, though more perfunctory and less indignant than the one before. “Sustained,” Judge Ferguson repeated, this time eying Baxter. “Don’t make me do that again, Mr. Taylor. The jury will disregard the last sentence uttered by Mr. Baxter.”

    “My apologies, Your Honour, but I have no further questions for Mr. Baxter.” Judge Ferguson raised his brow, but nodded, and with that, Taylor returned to his seat. He studiously avoided the gaze of the chief counsel for the defense; Baxter, however, glared at them, knowing that they would not be able to rattle him. Indeed, as their entire case depended on acknowledging that what he said was true but then pointing out that it didn’t matter, their round of questioning largely consisted of highlighting his relative lack of experience with the entertainment industry, as if it were deserving of special treatment beyond outsider comprehension – a tactic with major potential to backfire.
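
    (For those who would like to see the shell game Baxter describes laid out on paper, here is a toy illustration of how a film with very healthy ticket sales can be made to show a paper loss under Hollywood accounting. Every figure and charge below is invented for the purpose – this is a rough sketch of the general technique, not a reconstruction of Paramount’s actual books.)

```python
# Toy illustration of "Hollywood accounting" as Baxter describes it above.
# All figures are invented; this is not a reconstruction of any real ledger.

rentals = 250_000_000                    # the studio's share of ticket sales (revenue)

# Expenses under the standard definition Lucas assumed:
production_cost = 11_000_000
prints_and_advertising = 25_000_000
standard_profit = rentals - (production_cost + prints_and_advertising)

# Hollywood accounting keeps finding expenses until they exceed revenue:
inflated_advertising = 60_000_000        # in-house ad agency bills at a premium
distribution_fee = 0.30 * rentals        # charged by the studio's own distribution arm
cross_allocated_costs = 90_000_000       # unallocated costs shifted over from weaker films
subtotal = (production_cost + inflated_advertising
            + distribution_fee + cross_allocated_costs)
overhead_and_interest = 0.25 * subtotal  # flat "studio overhead" plus interest charges
creative_expenses = subtotal + overhead_and_interest
creative_profit = rentals - creative_expenses

print(f"Profit, standard definition:    ${standard_profit / 1e6:,.1f} million")
print(f"'Profit', Hollywood accounting: ${creative_profit / 1e6:,.1f} million")
```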

    ---

    The rest of the trial proceeded largely without incident; the verdict arrived on Friday, February 29, 1980, Leap Day, after three days of deliberation (both sides having delivered their closing statements and arguments on Tuesday, the 26th). In the end, by a margin of nine-to-three, the absolute minimum majority (75%) needed for a verdict to be reached in civil court in the state of California, the jury found the defendant, Paramount Pictures, liable for breach of contract, fraud, and negligent misrepresentation (in other words, on all counts), and awarded massive – and unprecedented – damages to Lucasfilm Limited.

    The damages which the jury decided were owed to Lucasfilm were based on Lucasfilm’s share of the profits generated by the film in the United States, retroactive to the original breach (in 1976), and then with the interest compounded monthly to the end of the most recent period (January 31, 1980). Under the terms of the original contract, Lucasfilm was entitled to one-half of the net profits, estimated at nearly $150 million before interest. The punitive damages levied upon Paramount (for their blatant fraud) multiplied that total by approximately a factor of six, though there was some rounding involved, as there was no doubt that the jury liked the sound of the final figure at which they arrived: one billion dollars. Needless to say, that particular figure led the headlines around the world the next day. [12] The New York Post famously described the result as “LITTLE LUCASFILM WINS BIG BUCKS”; other papers (and tabloids) were even more crass in their headlines, including a multitude of “force”-related puns. (The National Enquirer set the standard with “JURY USES ‘FORCE’ ON PARAMOUNT”.) For the first time since Bluhdorn had purchased Paramount in 1966, the other investors (not to mention the creditors) were irate. It didn’t help that, ever since the trial had started, many protesters had gathered at the gates to the main Paramount studio on Melrose Avenue, armed with a battery of raw vegetables and plenty of signs, picketing the “corporate greed” and “criminal racket” that they believed endemic to the industry. Marcia Lucas herself, who worked next door at Desilu, did her best to dissuade these protests, urging the assembled malcontents to trust in the judicial system (or at least, to contain themselves, as far away from Gower Street as possible). The crowd rapidly disbanded upon learning that Paramount would be paying through the nose for their… error in judgement.
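
    (For the number-crunchers: here is a rough back-of-the-envelope sketch of how the jury’s arithmetic might have run. The interest rate and the exact month count are assumptions made purely for illustration – the verdict as described above specifies only the roughly $150 million base figure, a punitive multiplier of “approximately” six, and a healthy dose of rounding.)

```python
# Back-of-the-envelope sketch of the Billion-Dollar Verdict arithmetic.
# The interest rate and month count below are assumptions for illustration only.

def compound_monthly(principal: float, annual_rate: float, months: int) -> float:
    """Grow a principal at a nominal annual rate, compounded monthly."""
    return principal * (1 + annual_rate / 12) ** months

base_award = 150_000_000   # Lucasfilm's half of net profits, before interest
months = 40                # roughly late 1976 through January 31, 1980 (assumed)
annual_rate = 0.035        # assumed nominal annual interest rate

compensatory = compound_monthly(base_award, annual_rate, months)
punitive_multiplier = 6    # "approximately a factor of six", per the verdict
total = compensatory * punitive_multiplier

print(f"Compensatory damages with interest: ${compensatory / 1e6:,.0f} million")
print(f"After the punitive multiplier:      ${total / 1e9:,.2f} billion")
# ...which the jury then rounded to the evocative figure of one billion dollars.
```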

    The $1 billion which Paramount was due to pay Lucasfilm was a greater sum than the annual GDP of several dozen third-world countries. Stateside, by contrast, it took the US government a grand total of one day to spend that much money in their budget for FY 1979 – but the movie studios, Paramount included, were lucky to pull in that much revenue in a whole year. Their cash-on-hand was dreadfully low – every major studio had most of their money tied up in either fixed assets, such as studio space (located on prime land in Hollywood and other central neighbourhoods), soundstages, and filming equipment; or intangibles, such as the copyrights for their film archives, and patents for new filming technologies. Although most of the Billion-Dollar Verdict would not be paid out by Paramount pending appeal (which the studio filed almost immediately, with the Court of Appeals for the Ninth Circuit), Paramount was still required to post a bond for the damages awarded, set at ten cents on the dollar, or $100 million – the largest bond ever posted in the history of the United States. Where Paramount would be able to find the money to post such a bond was one of the burning questions to emerge at this point in the saga, which would no doubt continue down the long road of appeals, heading for the Supreme Court or Bust. More pressing for Paramount, and indeed, for every studio in Hollywood, was that all the details of the case were now a matter of public record – and in an election year, the gory details of Hollywood accounting being laid bare to the voting public had dangerous, and previously unforeseen, consequences…

    ---

    [1] $50 per hour in 1978 dollars translates to roughly $185 per hour in 2013 dollars – this would break the bank for the Lucases, who had sunk all of the money they had made from Graffiti into Journey of the Force (along with the costs of raising their daughter, Amber). The upfront retainer would have cost about $5,000 ($18,500 today).

    [2] Maryland, a border state, has in recent decades been increasingly regarded as a de facto part of the Northeast, due to the rapid growth of the Baltimore (and Washington, D.C.) suburbs. This was certainly far less true during the era that Mr. Taylor would have been born and raised. Even today, many Marylanders identify with Southern culture.

    [3] Rule 11 refers to the obligation of lawyers to bring cases based in the law or in a “good faith extension” thereof; this theoretically serves to mitigate the threat of frivolous lawsuits, though obviously it has not eliminated them entirely.

    [4] Judge Ferguson achieved considerable prominence as a jurist IOTL, having ruled on Haywood v. NBA, and later Sony v. Universal. In 1979, he was elevated to the Court of Appeals for the Ninth Circuit by President Carter; obviously, ITTL, that does not happen.

    [5] Gibson, Dunn & Crutcher were (and remain, under the abbreviated name Gibson Dunn) one of the largest and most successful law firms in the world; their OTL existence stands in contrast to the fictional Taylor & Associates. Their representation here as chief counsel for the defense is not meant to imply anything about the ethics or character of the organization (though I would be tickled if anyone connected with the firm ever did read this).

    [6] Roughly one million documents would fit into approximately 350 bankers’ boxes, each of which is fifteen inches long, twelve inches wide, and ten inches high. The combined volume of all these boxes works out to roughly 365 cubic feet – a floor footprint of about 40 square feet, assuming they are stacked to a nine-foot ceiling. They would find room for all of them at the offices of Taylor & Associates, but it would be a very cozy fit – especially given all the excess bodies they have to cram into those confined quarters to help read it all.
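
    (If you would like to check that arithmetic yourself, the computation is straightforward – the box count and dimensions are as given above, and the rest is just unit conversion.)

```python
# Quick check of the bankers'-box arithmetic in footnote [6].
boxes = 350
length_in, width_in, height_in = 15, 12, 10      # inches, per the footnote

volume_cubic_inches = boxes * length_in * width_in * height_in
volume_cubic_feet = volume_cubic_inches / (12 ** 3)
footprint_square_feet = volume_cubic_feet / 9    # stacked to a nine-foot ceiling

print(f"Total volume:    {volume_cubic_feet:,.0f} cubic feet")
print(f"Floor footprint: {footprint_square_feet:,.1f} square feet (nine-foot stacks)")
```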

    [7] Price Waterhouse – which, IOTL, merged with another company, Coopers & Lybrand, in 1998, and is now known as PricewaterhouseCoopers – was, and remains, one of the largest public accounting firms in the world. It has audited the results of voting for the Academy Awards since 1934 (not coincidentally, at the ceremony immediately following the “tie” between Wallace Beery and Fredric March which was not actually a tie at all). And yes, this is the sole reason why Mr. Baxter worked at that firm; their presence in this timeline does not in any way reflect upon their relationship with the entertainment industry, nor with my (or their) stance on Hollywood accounting practices.

    [8] Yes, this means that Baxter was made privy to the Oscar results prior to the opening of the envelopes at each ceremony. The final tallies in each category being proprietary information, he sadly could not divulge just how narrowly some of the victors had taken home their respective trophies.

    [9] This is a reference to Mr. Smith Goes to Washington, which starred Jimmy Stewart, and was released in 1939 (the annus mirabilis of Hollywood).

    [10] Some additional definitions: assets represent resources (tangible or otherwise) owned or claimed by the economic entity (company). These are almost always used to generate revenue, and the costs of utilizing or consuming these assets are one of the two main ways to define expenses. Debts (or liabilities) represent the claim upon assets by creditors; increasing these debts (through direct borrowing or interest) is the other main way to define expenses. All assets less all liabilities are equal to the equity or claim upon assets by investors into the company (this formula is called the accounting equation), and any profits are either retained as equity, or paid out (in dividends) to investors.
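
    (A compact way of restating the relationships described in this footnote – the numbers are purely illustrative.)

```python
# The accounting equation and the profit relationships from footnote [10],
# restated with purely illustrative numbers.

assets = 500.0        # resources owned or claimed by the company
liabilities = 300.0   # creditors' claims upon those assets
equity = assets - liabilities           # investors' claim: the accounting equation

revenues = 120.0
expenses = 90.0
profit = revenues - expenses            # the simplest revenue/expense relationship

dividends = 10.0
retained_earnings = profit - dividends  # profit not paid out is retained as equity
equity += retained_earnings
assets += retained_earnings             # the equation must still balance...
assert assets - liabilities == equity   # ...and it does
```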

    [11] This objection is due to the fact that “an expert witness may not opine as to the ultimate conclusion to the case”; Baxter crossed the line so blatantly that the chief counsel for the defense does not even have to provide grounds for his objection. But as that truism that litigators know so well goes: you can’t unring the bell.

    [12] This amount is being reckoned on the short scale; that is, the number one followed by nine zeroes (ten to the ninth power). Today, it would be three billion dollars.

    ---

    And thus concludes the Trial of the Century! Many of you may be wondering why this instalment of Appendix C is labelled “Part IV”. Well, as everyone knows, the fourth part of a saga always comes first! :p No, actually I’m employing another classic George Lucas tactic: the retronym (which has also been brandished upon a certain other television series IOTL, but emphatically not ITTL). You can consider “Another Night at the Movies” to have been Appendix C, Part I; “Marcia, Marcia, Marcia” to have been Appendix C, Part II; and “Brand New Hollywood, Same Old Industry” to have been Appendix C, Part III. Appendix C will have six parts in all; two more will be forthcoming.

    I hope you all enjoyed this update! It would not have been possible (nor anywhere near as good) without editing assistance (and encouragement) from e of pi, nor without the specialized advice of my legal consultant, Andrew T, with whom I collaborated closely in an effort to accurately reflect the proceedings of a civil trial. Any remaining errors or misconceptions to be found in this update, however, are solely the fault of my own, and I apologize profusely for their presence.

    And Happy Canada Day (still Dominion Day, ITTL), everyone! I realize that coverage of the Trial of the Century, which takes place in American federal court, may not seem patriotic, but I can assure you that Canadian media outlets covered the trial with just as much zeal as the Americans did. That’s how the maple sugar cookie crumbles!

    Thus concludes the 1979-80 cycle, and (for all intents and purposes) the 1970s! Everybody get ready for the totally tubular decade which lies ahead;)

    [Image: U.S. Courthouse, Los Angeles]
     
    1980-81: Triumph and Tragedy
  • Triumph and Tragedy (1980-81)

    “Punch my time card, boss! We’re going on a looong skiing trip in the Middle East this weekend!”

    Robin Williams, upon leaving the set of The Richard Pryor Show, on the evening of December 5, 1980

    1980 was an extraordinarily eventful year – so much so, it was indeed most fortunate that it had 366 days instead of the usual 365. [1] As was the case with each successive quadrennial, it marked the US Presidential election, along with the games of both the Summer and the Winter Olympiads. The regular season programming often had to yield to these “special” concerns, which displeased those who were into neither sports nor politics. Others, no doubt, were ecstatic. CBS, which had been the last-place network for a number of years by this point, was definitely part of the latter group. Because the network had been struggling to stay afloat for so long, heads were rolling, and those fronting the new regime in place at the network were in agreement only that they needed a steady infusion of cash to be able to compete with ABC and NBC going forward. Fortunately, they had a valuable asset which could be traded for such an infusion. In fact, CBS had something that Lucille Ball, the proprietor of Desilu Productions, had coveted for a very long time – and, indeed, had never really wanted to give up in the first place. But her then-husband, Desi Arnaz, had wanted to build an empire, and to be able to afford to purchase the remains of RKO back in 1957, they had needed seed money, which could only be raised by selling what was, at the time, by far their most valuable asset: the syndication rights to I Love Lucy. [2] Those rights had thus been the property of CBS ever since. And though her dream vision from Carole Lombard had given her inspiration to remain in charge of Desilu, and she had genuinely learned to love running the place in the years since she had very nearly sold it to Gulf+Western, Ball needed her own personal stake, a driving ambition. She found it in resolving to bring Lucy back home, sooner or later. At last, she saw a golden opportunity, which she promptly seized.

    Ball decided to invite Arnaz to purchase the rights to I Love Lucy back with her – for it had always been their show, not just hers – but Arnaz reckoned that, had Desilu managed to retain the rights to I Love Lucy, she would have owned 100% of the show once she had bought out his share of the studio in 1962 – not to mention that, ultimately, his rights would devolve onto their children together, just as Desilu itself would, and that splitting the ownership of the show would only be delaying the inevitable anyway. Certainly, for tax purposes, having Desilu own 100% of the rights to I Love Lucy would also keep Arnaz from having to pay exorbitant income taxes on his share of the syndication revenues – and the same would hold true, after a fashion, for their children as well. [3] What clinched the decision by Ball to have Desilu buy the rights in their entirety was Arnaz gently reminding his ex-wife that their son, Desi, could well find himself divorced from his own wife, Patty Duke. [4] Ball had never cared for her daughter-in-law, and the thought of her directly owning one-eighth of their show – that was enough for her to waive any further objections on the matter. Later, she would reflect that her silver-tongued and fiendishly clever ex-husband still always knew what to say and how to say it. Oddly enough, it was under the auspices of Arnaz that all of the other Desilu shows made in the 1950s had been sold to CBS (in 1960); as a result, the network made several counter-offers to Desilu which constituted “package deals”, asking for a few million more in exchange for some nearly-forgotten one-time stablemate to I Love Lucy. [5] None of these shows, however, had shared its singular staying power, and Ball was happy to let CBS keep them.

    CBS agreed to sell the syndication rights to I Love Lucy back to Desilu in exchange for a generous sum, to be paid in instalments over several years (or rather, seasons), and for the return of the right-of-first-refusal agreement to CBS once the existing deal with ABC expired. [6] Fred Silverman was rather vocally displeased at this, but neither Herbert F. Solow nor Brandon Tartikoff were particularly fond of the head of programming at the Alphabet Network, whose tremendous success (including, once upon a time, at CBS itself) had clearly gone to his head (“You’d think he’d been the lead on Star Trek,” as Solow had quipped). As another condition of the deal with CBS, Ball was obliged to produce and headline a 30th Anniversary Special for I Love Lucy – the silver anniversary special in 1976 had gone over so well, and the network was desperate enough for ratings that they hoped very much for lightning to strike twice. It would air on October 15, 1981 (a Thursday) – effective on that very same day, at 11:00 PM Pacific (the end of primetime in the United States), the rights to the show would revert to Desilu Productions, ending a nearly quarter-century sojourn away from the welcoming bosom of the studio which had produced it. Having buried the hatchet with her ex-husband, Ball invited Desi Arnaz to participate in the special, just as in 1976 – with Vivian Vance having passed away in 1979, they were the two survivors from the core quartet. Supporting cast members were also invited; it was decided to give their recollections (and those of the surviving producers and crew) greater prominence than had been the case for the previous special. (A memorial segment for Vance was also planned.) Ball was delighted that her show would be coming back to her studio, crowing that her triumph had made her feel thirty years younger; she decided to give all her staff, at all her facilities, the following day (Friday, October 16th) off work, so that she could throw a lavish celebratory party to which everyone who had ever worked at Desilu – past and present – would be invited.

    But as far as present-day offerings by the studio were concerned, Deep Space became the third Gene Roddenberry series (in four attempts – even he tended to downplay his underwhelming early-1970s flop Re-Genesis [7]) to reach the Top 30, doing so (like The Questor Tapes) in its inaugural season. The generic title belied the very strong Western-style themes in the program, even more obvious than those of Star Trek. It served as a tenuous connection to the “House that Paladin Built” era at Desilu even as old warhorses such as Rock Around the Clock and Three’s Company continued to keep the locomotive running at the studio. Although Deep Space shared a setting with Star Trek, it was in essence a very different program – the Systems Commonwealth which governed the “core sectors” of the galaxy (those nearest to Earth, naturally) did not exercise even nominal control of the far-flung region in which Eagle’s Nest Station (properly Commonwealth Outpost Iota) was based, explicitly describing it as spatium nullius – no-man’s space – in an early episode. For this reason, Eagle’s Nest was established primarily for trade and commerce purposes; Commander William Boone, the Commonwealth Space Forces officer who was in charge of the station, functioned more like a marshal in a town in the Wild West than like the commander of a naval base. [8] Most action on the station was set in the Market Quarter, particularly at the main watering-hole there, run by a quirky alien who was actually called Quirk (“you hum-mins couldn’t pronounce my real name” was a common catchphrase of his). Quirk was portrayed by a puppet (operated by Frank Oz, using a combination of his voices for Bert and Grover, both from Sesame Street), and despite this functioned very much as an unwitting straight-man to the many alien traders (akin to Harry Mudd or Cyrano Jones from Star Trek) who found themselves hawking their wares at the station. [9] Many of the natives who lived on the Planet which the station orbited were common Western types, right down to the mature woman who owned the watering hole, and enjoyed sexual tension with Commander Boone. Hilariously, she quite resembled a female Spock (dark hair, green-ish hue to her skin, with pointed elfin ears), as did other natives of Planet. However, personality-wise, she and her kind were much closer to Scotty: boisterous and unpretentious. Deep Space frequently ventured down to Planet for various reasons, as Kirk’s Rock made for a singularly convenient shooting location; when towns were needed, the Desilu Forty Acres backlot in Culver City (whose Western façades had stood for decades) was used instead. It was an excellent way to make productive use of that very expensive property.

    The Richard Pryor Show had entered its fourth season with no expectation that it would be coming down from the tremendous highs (in more ways than one) that it had enjoyed in the waning years of the 1970s. However, it was an established fact that megahit variety shows tended to lose their lustre after a mere couple of seasons at the top; so it had been with Laugh-In, and so it was with Flip Wilson, both of which had also aired on NBC. Pryor was the first to outlast the exception that proved this rule – The Carol Burnett Show on CBS – but now it had competition from programming which was attracting even more buzz: Texas. The “Who Shot T.R.?” cliffhanger was the hot topic of the 1980 summer hiatus. Even Pryor was forced to acknowledge this through the series of “Who Shot Robin” sketches that featured in the season opener. All of these “starred” the show’s breakout performer, Robin Williams, as his own corpse, slumped over in a chair and – for the one and only time in the history of the series – totally silent. Pryor played the police detective who was investigating his murder, making inquiries of the long-suffering supporting players so as to determine their motives. And sure enough, in each sketch, one-by-one, every member of the Pryor repertory delivered impassioned, lengthy monologues about the many times that Williams had upstaged them, or stepped on one of their punchlines with one (more often several) of his own, or completely ruined the flow of a scene by interjecting with his stream-of-consciousness ramblings… the complaints were myriad, and seemingly endless. This episode quickly became the most infamous in the show’s history; it was plainly obvious that the rantings of the various “suspects” were firmly grounded in truth. Naturally, it was eventually revealed that everyone in the supporting cast was the culprit, each using a different murder weapon (in a reference to the popular board game, Clue). This “shocking revelation”, with Pryor playing detective and exposing each of the culprits in turn, stood in for his usual closing monologue; as he bid his audience good night, Williams attempted to rouse from his “death” and launch into his usual manic persona, but the rest of the cast physically restrained him from doing so.

    The “Who Shot Robin” sketches would take on a whole new, much darker meaning after filming on The Richard Pryor Show had ended for the Christmas hiatus on December 5, 1980 (a Friday). In celebration of another job well done, Pryor and Williams headed to their favourite haunt, the Medina nightclub in Century City, to pursue a weekend-long bender in the private backrooms. Two would enter the glittering, Arabian Nights-style façade that night, but only one would leave. Robin Williams died of a cocaine overdose early in the morning of December 8, 1980; the funeral was held shortly before Christmas, with a visibly shaken Pryor delivering the eulogy. [10] The first episode back of The Richard Pryor Show (in January of 1981, after the holidays had ended) was a clip show, in which Pryor introduced the various best-of sketches which had starred Williams. The show continued through to the end of the season rather in the manner of a chicken whose head had been cut off. For the first time, the supporting players were asked to carry sketches; Pryor, who had been devastated by the death of his friend and fellow binger, drastically reduced his active involvement in the program. The bravura ratings for the Robin Williams tribute show (the second-highest-rated telecast of the 1980-81 season) were barely enough to keep Pryor in the Top 10 for the season; NBC renewed it for a fifth season, which would be the last in which Pryor himself was contractually bound to appear, and demanded that a replacement for Williams be found for September.

    It was no surprise that commentators would regard 1980 as the year that the variety genre had “died” once and for all, both figuratively with the Carlin disaster in March, and then literally with Robin Williams in December. The Muppet Show continued into 1981, but it had already been planned to end at the conclusion of that season, and these harbingers of death only served to reinforce creator, producer, and star Jim Henson’s decision. Lucille Ball herself was the final guest star of The Muppet Show, appearing in the series finale. Throughout the duration of the program, “Miss Ball” had often been referred to (though usually not by name) as the “boss lady” or owner of the Muppet Playhouse (named in reference to the famous “Desilu Playhouse” at which I Love Lucy was originally filmed). The conceit of the episode involved Ball being furious that she had never been invited to perform as a guest of The Muppet Show, which allowed the Muppets to send up her original assessment of the kind of show that Henson had pitched to her studio, all those years ago. “But Miss Ball,” Kermit the Frog had protested, “We’d heard you were a perfectionist, and didn’t want anything to do with crazy madcap unrehearsed variety shenanigans.” It also allowed for Ball to send up both her own modern image as a hypercompetent professional, and the classic image of her old “Lucy” character, simultaneously. The episode, naturally, ended with the Muppet Playhouse in a wreck, and Ball livid to the point of incomprehensible babbling. Kermit, meanwhile, pledged to take his show on the road, instead. It was the third-highest-rated telecast of the 1980-81 season, surprising even Henson and Ball with its success.

    The highest-rated telecast of the season, naturally, was the resolution to the cliffhanger which posed the famous question: “Who Shot T.R.?” Just as pressing: had the bullet killed him? Audiences had to wait to find out for much longer than anticipated; the 1980 SAG strike delayed the start of the season, as did salary negotiations with series star Larry Hagman, who played the T.R. so named in the famous question. During that time, reruns scored terrific ratings, and older episodes had time to air overseas, which turned Texas into a worldwide sensation. By the time the question was met with an answer in October, the whole world was watching. As it turned out, the shooter did not kill T.R. Walsh, though the show’s producers strongly considered making the assassination attempt succeed when Hagman held out for a massive raise – which he deemed commensurate with his newfound appeal. The man who pulled the trigger was revealed to have been Rusty Bartlett, the paramour of T.R.’s own wife, Sue Ellen, acting in a fit of jealousy (earlier in the previous season, he had staged his own disappearance to throw the scent off his trail). [11] Bartlett, upon finally being fingered as the culprit, was promptly arrested and sent to jail on the charge of attempted murder; however, the long-suffering Sue Ellen, moved by her paramour’s would-be act of “heroism”, filed for divorce from her husband, demanding half his fortune in what would emerge as a long and convoluted trial – the writers openly admitted to having been inspired by the proceedings of the “Trial of the Century”, though with the obvious twist that the couple at the centre of this trial, rather than presenting a united front, were in fact creating the drama by becoming schismatic. [12] The shooter’s identity was revealed in the episode “Who Shot T.R.?”, which received spectacular ratings – however, they fell just short of the threshold attained by Roots and then Star Trek: The Next Voyage in the late 1970s. [13] Still, the message was clear: variety had experienced its last hurrah as a genre, and the primetime soap opera made its first, triumphant thrust into the heart of popular culture. Imitators quickly entered development, often on the backs of proven hitmakers such as Aaron Spelling, producer of The Alley Cats.

    What remained of variety programming as a genre was forced to evolve with the times, and oddly one of the pioneers in this field hailed from Canada, a country known for being several years behind the United States when it came to cultural trends (most infamously with the cheaply-made Trouble with Tracy sitcom in the early 1970s, which was in turn based on quarter-century-old radio scripts). However, in this case, there was a cross-border connection which may have served to invigorate the mostly-Canadian cast and crew, through Second City. In fact, the show took its name from this connection: Second City Television, or SCTV. This variety show had the conceit of depicting the daily, locally-produced programming schedule of a small-town television station (the titular SCTV) in “Melonville”, in the time-honoured let’s-put-on-a-show tradition.

    SCTV began its run on the small Canadian network, Global (ironically, based only in Ontario at the time). The cast consisted mostly of Canadian Second City veterans (mined from either Toronto or as far afield as the Chicago branch). Headlining the SCTV cast was Dan Aykroyd, a gifted character actor and impressionist, who brought the house down with his Hubert Humphrey and, later, his Ronald Reagan (despite being filmed in Canada by a mostly Canadian cast and crew, Melonville was seen as an Anytown, U.S.A.). However, Aykroyd’s prominence was not nearly as overwhelming as that of Williams on Pryor. The cast (all of whom were also writers) were egalitarian in their assignment of roles – many of them played important “townspeople” in Melonville, and all did impressions of celebrities or characters on other television series. The other cast members included John Candy, Joe Flaherty, Eugene Levy, Andrea Martin, Catherine O’Hara, and Gilda Radner (the only American in the cast, though a veteran of Second City Toronto). [14] Initially just half-an-hour long, it became an hour-long show in its third season of 1979-80. [15] The reason for this was clear: SCTV had become a smash in Canada – and, surprisingly, a cult hit in the United States. The content was more sophisticated than most sketch comedy on American television, but simultaneously very “clean” – sex, violence, and profanity were virtually nonexistent. In the 1970s style (which had extended to the stage as well as to the small screen), emphasis was placed firmly on mining the humour from the characters and their situations. It became to American audiences in the later part of the decade what Monty Python’s Flying Circus had been in the earlier part (and several Pythons gave SCTV their ringing endorsement – including Eric Idle, who even appeared in an episode), though PBS did not air it. SCTV was destined for a larger audience…

    1980 marked not only the quadrennial for politics and sports, but also the quinquennial contract negotiations with Johnny Carson – who wanted to work for one hour instead of ninety minutes. The problem was that NBC was doing rather better than they had been in 1975, and so was the competition – ABC was doing well enough that their lackluster late-night lineup could be conveniently ignored, and Merv Griffin was one of the very few bright spots for CBS in this era. NBC agreed to raise Carson’s salary, to end the “Best of Carson” broadcasts on Saturday nights, and to grant him additional days off (Carson wanted 30% of his schedule off work; the network insisted on 25%). The problem then became how to fill that Saturday 11:30 slot. Dick Ebersol, an NBC programming executive, suggested creating a new show, but higher-ups preferred the cheaper alternative of buying an existing one. SCTV was popular with “hip” audiences, and PBS continued to do well on Monty Python reruns (not to mention those of The Final Frontier – like SCTV, a Canadian-made series). The show would begin airing on the Peacock Network in a 90-minute format starting in the 1981-82 season – its fifth overall. [16]

    CBS, meanwhile, still did not see any improvements – in fact, as proof that they had hit a nadir, one of their last dependable hits, Rhoda, had fallen out of the Top 30 during this season, and was cancelled at the end of it. This was largely because of Valerie Harper, the show’s star, having made a woefully ill-timed demand for a bump in salary herself; she had been the highest-paid person, let alone woman, on television until Larry Hagman had stolen her thunder. Tentative discussions about a Rhoda spinoff ultimately went nowhere, with network executives sensing that the planned Brenda series (about Rhoda’s younger sister, a single thirty-something living in 1980s New York) would be nothing more than a warmed-over Mary Tyler Moore. More to the point was who would be producing the show, and who in fact produced Rhoda: Paramount Television. All three networks were extremely wary about ordering more shows from a studio which stood a more than reasonable chance of entering into bankruptcy. For once, television was bucking the trend by proving remarkably stable amidst the great upheaval within the entertainment industry, and far beyond it. This stability would not last, but it was curiously refreshing.

    Texas had emerged as the #1 show on the air, in another major boon for ABC; despite reclaiming the top spot, however, their overall position declined from the previous season, with “only” seven slots in the Top 10, and sixteen – a bare majority – in the Top 30 overall. NBC, for their part, had improved their position considerably, scoring two Top 10 hits and eleven – more than their fair share – in the Top 30. CBS, meanwhile, maintained a mere three shows in the Top 30, their lowest-ever proportion; fortunately for the individuals at that network, the eventfulness of 1980 ensured the continued success of 60 Minutes, which remained firmly ensconced in the Top 10. [17]

    Though the federal court system, and audiences in general, seemed to be deserting Paramount, that studio continued to enjoy empathy – or perhaps pity – from within the industry, allowing Taxi Drivers to win the Outstanding Comedy Series award. However, John Ritter on Three’s Company won Outstanding Lead Actor in a Comedy Series, his first-ever win in any category, even though his show was an established hit. As the enthusiasm over the Jessica/Benson romance on Soap had ebbed (and the storyline itself had ended when Jessica was possessed by the spirit of a vengeful ghost whose burial ground was disturbed), neither Katharine Helmond nor Robert Guillaume repeated in their respective categories; Cathryn Damon, also of Soap, and Howard Hesseman of WMTM in Cincinnati, another overlooked, underloved Paramount Television mainstay, won instead (though Helmond and Guillaume didn’t seem too disappointed). [18] The Muppet Show surprisingly edged out The Richard Pryor Show for the Outstanding Variety Series Emmy, with Lucille Ball winning her fifth Emmy as a performer (in the category Outstanding Performance in a Variety Series or Special); Robin Williams received a posthumous special award, accepted by his close friend Richard Pryor, in addition to his leading the In Memoriam segment. Without a doubt, it was a time for reflection…

    ---

    [1] Of course, that extra day – February 29, 1980 (a Friday) – was the single most eventful day of them all.

    [2] The rights to I Love Lucy were sold by Desilu to CBS for $4 million in 1957 dollars – which is $12.5 million in 1981 dollars, and $33.3 million today. Desilu then purchased RKO from General Tire that same year for $6 million ($18.75 million in 1981, and $50 million today) – yes, even back then, the rights to I Love Lucy were worth two-thirds of one of the Big Five Golden Age movie studios (which, granted, had been through the wringer thanks to Howard Hughes – basically the Kirk Kerkorian of his day – but still).

    [3] Corporations are taxed directly on income generated by their assets prior to the payment of dividends (which is why they are drawn from retained earnings, also called earnings after taxes). Depending on the jurisdiction, income from dividends (classified as property income, as it is derived from the ownership of stocks, or shares in a corporation) is taxed differently (usually at a lower rate, or with a partial credit applied against it, as the income has already been taxed) than direct property income.
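
    (A stylized illustration of the double layer of taxation described here – all of the rates are invented, and real dividend-tax mechanics vary considerably by jurisdiction and era.)

```python
# Stylized illustration of footnote [3]: corporate income is taxed first, and
# dividends paid out of the after-tax earnings are then taxed again in the
# shareholder's hands, usually at a reduced rate or with a partial credit.
# All rates and amounts below are invented for illustration.

corporate_income = 1_000_000
corporate_tax_rate = 0.40
earnings_after_taxes = corporate_income * (1 - corporate_tax_rate)  # retained earnings pool

dividend_paid = 100_000          # drawn from earnings after taxes
dividend_tax_rate = 0.25         # reduced personal rate on dividend (property) income
personal_tax = dividend_paid * dividend_tax_rate

print(f"Corporate tax paid:        ${corporate_income * corporate_tax_rate:,.0f}")
print(f"Shareholder tax on payout: ${personal_tax:,.0f}")
print(f"Shareholder keeps:         ${dividend_paid - personal_tax:,.0f} of a ${dividend_paid:,} dividend")
```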

    [4] Duke was already twice-divorced when she married Desi Arnaz ITTL; she would also divorce her OTL third husband, John Astin, in 1985, before remarrying (for good, at least as of this writing). The first marriage of Desi Arnaz, Jr., IOTL (to Linda Purl, of all people) also ended in divorce, before he too remarried (which has also lasted to this day).

    [5] The terms of the second deal covered all shows which had premiered in the 1950s and ended their run by 1960. This meant that shows which were currently running at the time (such as The Untouchables, which had premiered in 1959) would remain the property of Desilu Productions. However, the direct follow-up to I Love Lucy, which became known as The Lucy-Desi Comedy Hour and ran from 1957 to 1960, was sold to CBS under these terms; That Wacky Redhead is indeed buying these monthly specials (13 in all) back, on top of the 180 aired episodes of I Love Lucy. The potential future inflows from these episodes are much lower, but That Wacky Redhead is a completionist.

    [6] As noted, adjusting for inflation, the original $4 million purchase price would be worth $12.5 million in 1981; That Wacky Redhead offers CBS considerably more than that (even in terms of present value on the instalment payments) for the rights back – about enough to fund an entire half-hour program singlehandedly – in what is known internally at Desilu as the “Make Your Own Lucy” stipend. Under normal circumstances, the eight-figure sum offered by Desilu to CBS would be extraordinary – were it not dwarfed by the nine-figure bond which Paramount was obliged to post (as part of the ten-figure Billion-Dollar Verdict). This helps to keep the deal relatively low-profile.

    [7] If you recall, Re-Genesis ran from 1972 to 1974 ITTL, though that beats OTL, in which Genesis II (as it was known) never got beyond the pilot movie stage.

    [8] The name “William Boone” was used IOTL in the Earth: Final Conflict series, conceived by Roddenberry (and developed by his widow, from his notes).

    [9] In terms of appearance, imagine Quirk resembling a character from an OTL Star Trek series with a very similar name, only in Muppet form.

    [10] Williams obviously remains with us to this very day IOTL; instead, John Lennon died on this date. ITTL, on the other hand, he did not.

    [11] The character of Bartlett is based on the OTL character of Dusty Farlow, who filled the same role on Dallas, but did not shoot J.R. (that was his mistress, Kristin).

    [12] The divorce storyline happened later on IOTL, under different circumstances.

    [13] The episode which contains the reveal scores a 52.9 rating and a 74 share, slightly lower than the 53.3 rating and 76 share enjoyed by the equivalent OTL episode, “Who Done It?”, due to the wider variety of programming available to American audiences at this juncture ITTL (for reasons which will be explored in coming updates).

    [14] Aykroyd and Radner, of course, were among the Not Ready for Prime-Time Players on Saturday Night Live, a show that does not exist ITTL. They are subbed in for Harold Ramis and Dave Thomas within the original cast, with the overall balance of SCTV compared to how it is IOTL obviously being a matter of personal taste.

    [15] IOTL, at this time, the show did indeed go from half-an-hour to an hour-long… alongside other changes, such as filming moving from Toronto to Edmonton, and broadcasting from Global to CBC. Neither of these two latter changes happen ITTL, because SCTV is more successful earlier on, thus tying it more firmly to Global.

    [16] Just as SCTV Network 90 began running on NBC in the same season – though only the fourth, not the fifth, overall IOTL.

    [17] IOTL, ABC had ten shows in the Top 30, and two in the Top 10; NBC had six shows in the Top 30, and just one (Little House on the Prairie) in the Top 10; CBS had fourteen shows in the Top 30, seven of which were in the Top 10, including Dallas (as opposed to Texas, which airs on ABC ITTL).

    [18] Yes, Taxi won for Outstanding Comedy Series IOTL, though Judd Hirsch (and not John Ritter) won for Outstanding Lead Actor. Outstanding Variety Series (or rather, Program) went to a one-time special (Lily: Sold Out, which starred Lily Tomlin), though The Muppet Show was nominated. Isabel Sanford won Outstanding Lead Actress in a Comedy Series for The Jeffersons, and Danny DeVito won Outstanding Supporting Actor in a Comedy Series for Taxi.

    ---

    Yes, Robin Williams unfortunately could not overcome his cocaine addiction ITTL, and he paid the ultimate price. I commend Williams for managing to overcome his demons IOTL (which he did partly by learning from the example set by his close friend, John Belushi, who died young due to drug abuse), and attempt to impart no particular message in marking him for an early death beyond noting his high-risk behaviours, and recognizing that this period, IOTL, was lousy with celebrity deaths (and near-misses).

    After all, and as I have said many times now, I never said I was writing a utopia!
     
    Citius, Altius, Fortius
  • Citius, Altius, Fortius

    [Image: CBC-2 Title and Broadcast Logos]

    The logo of CBC-2, also used for the equivalent French-language service, Radio-Canada Télé-2. To the left is the official logo, an extension of the established CBC logo, affectionately known as the “exploding pizza”; at right is the logo as broadcast as part of the original branding campaign, with the bilingual catchphrase.

    “You deserve the best from us!” / “Vous méritez ce qu’il y a de mieux!”

    It had been an exceedingly long drought for the Dominion of Canada. That nation, from which ice hockey had first originated over a century prior, had not seen a gold medal in that discipline at the Games of the Winter Olympiad since Oslo in 1952. Twenty years later, Robert Stanfield (a passionate enthusiast of Canadian sport) was elected Prime Minister largely on the promise of improving Canada’s international standing in general, with an emphasis on its athletic standing; this pledge had great potency with the electorate in the wake of the Summit Series, in which the Soviet Union had defeated Canada and ended once and for all their lingering pretensions to claims of dominance in the hockey arena. In the years since, Stanfield’s government had invested heavily in intramural and extracurricular athletic activities under the auspices of the newly-created Ministry for Sport, so as to give the younger generation a better shot at reclaiming glory on behalf of Canada. This paid major dividends as early as 1976, during the Olympic Games in Montreal (Summer) and Denver (Winter), both of which saw Canadian athletes place at heretofore-unprecedented levels.

    In due time, Canadian sport (as an integral component of Canadian culture) would be integrated with Canadian broadcasting (being the primary outlet for said culture). In fact, this agglomeration had an antecedent in the Canadian Content regulations, popularly known as simply “CanCon”, which were passed under the previous Trudeau government. These formally instituted a system of cultural protectionism into Canadian broadcasting, under the auspices of the Canadian Radio and Television Commission, or CRTC. [1] This forced the existing Canadian broadcasters to produce and air programming that qualified under a rather Byzantine code as having sufficiently Canadian content, hence the name. Shows which met the requirements set by the CanCon standards would be granted “points”, which would then be measured against a threshold for the entire lineup of programming; failure to clear this threshold would result in sanctions. [2] The Commonwealth Trade Agreement, presented in 1975, resulted in modifications to these standards, which were tabled by the now-majority Stanfield PCs that same year. Legislation was forced through Parliament over objections from the Opposition Liberals, ironically led by the English-born John Turner, who attacked Stanfield on the basis of “regressing back to a colonial status”. Under the newly modified system, which was forced through the Senate after Stanfield made additional appointments to the Red Chamber, programs which were produced in the United Kingdom, Australia, or (at least in theory) New Zealand [3] – with plans to introduce other Commonwealth countries to this system pending their integration into the CTA – would be given partial credit on the points system, tentatively set as having a value of 50% that of Canadian-made programming. This system was introduced largely because cultural protectionism in Canada was primarily meant to defend against intrusion by American culture, given the shared border, commercial, and familial ties between the two countries for centuries past.
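
    (To make the mechanics a little more concrete, here is a toy version of how such a points threshold might be tallied. The credit weights, the threshold, and the sample schedule are entirely invented for illustration – the actual CanCon formula was, as noted, rather more Byzantine.)

```python
# Toy model of the modified CanCon points system described above.
# Credit weights, the threshold, and the sample schedule are all invented;
# the real regulations were (and are) considerably more Byzantine.

CREDIT = {"canadian": 1.0, "commonwealth": 0.5, "foreign": 0.0}

def cancon_score(schedule: list[tuple[str, float]]) -> float:
    """Return the share of weekly broadcast hours earning CanCon credit."""
    credited = sum(hours * CREDIT[origin] for origin, hours in schedule)
    total = sum(hours for _, hours in schedule)
    return credited / total

weekly_schedule = [
    ("canadian", 20.0),      # e.g. news, hockey, home-grown drama
    ("commonwealth", 10.0),  # UK or Australian imports, at 50% credit under the CTA rules
    ("foreign", 18.0),       # American imports, no credit
]

THRESHOLD = 0.50             # invented threshold for this example
score = cancon_score(weekly_schedule)
print(f"CanCon score: {score:.0%} -> {'compliant' if score >= THRESHOLD else 'sanctions'}")
```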

    Canada scored a vital early coup in their quest to regain prominence in the field of sport when, in 1974, the city of Vancouver was awarded the Winter Olympics for the year 1980. [4] This would immediately follow Canada having hosted the Summer Olympics on the other side of their vast country, in 1976. Vancouver (along with the nearby, mostly uninhabited but picturesque Garibaldi region) was ideal for skiing and other mountainous sporting events; Canada had a reputation to maintain, having swept the downhill skiing event in Denver courtesy of the “Crazy Canucks”. The summer Olympics were not held in friendly territory, as they had been awarded to Moscow. Indeed, the two rival locations would provide an opportunity for these old rivals to settle scores with each other, especially with regards to the ancient enmity in ice hockey, as Team Canada would be putting their sticks back on the ice in an Olympic setting for the first time since 1968, the previous dispute about recognition of certain athletes as amateurs or professionals having ended in an uneasy truce, in which Canadian semi-professionals would, like Soviet semi-professionals, be recognized as “amateurs” for the purposes of international competition (though, ironically, Team Canada was still comprised largely of amateurs despite this).

    As part of Canada’s preparations for the Winter Games, it was suggested to the government by the management of the CBC that a “complementary service” should be introduced for the main English and French television networks. Due to perceptions of the existing networks being either Montreal-centric or (in English, to a lesser extent) Toronto-centric, this new network would be devoted to showcasing Canada’s regions by bringing them closer to a national scale. Similar to BBC2, launched a decade earlier across the Pond, this new network would focus more strongly on arts and cultural programming as opposed to strictly “entertainment” fare. As a result of this, it would be a joint venture of both the CBC and existing provincial educational broadcasters where applicable, such as TVOntario, Radio-Québec, and others. CBC-2 and Radio-Canada Télé-2 – for the English and French languages, respectively – would effectively be “separate” channels where there was one common channel broadcasting via cable and satellite, along with directly-broadcast “regional” systems in association with those provincial networks. These would loan transmission facilities to the CBC in exchange for being allowed to retain a certain percentage of the time on those channels to broadcast their own programming – this functioned very similarly to the public broadcasting system in West Germany. In addition, there would also be common timeslots in which all of the provincial partners would participate, along with their agreement to air shows that were centrally produced directly by the CBC (or Radio-Canada, where applicable). The regional systems would also be able to maintain their own schedule of programming, specific and relevant to their individual province or region, such as local news or sports. In addition, Radio-Canada Télé-2 would also broadcast programmes from throughout the Francophonie, particularly those from Metropolitan France, Belgium, and Switzerland. [5] The established CBC Television and Radio-Canada services, now titled CBC-1 and Radio-Canada Télé-1, as well as existing services from the provincial broadcasters, would remain largely unaffected, though they obviously had potential to diverge from each other in the future.

    CBC-2 was first proposed by Stanfield to the electorate during the 1978 election campaign. The decision to create a second national public television network – and one which was far more similar in format to foreign public networks than the pseudo-private CBC-1 – was something that he deemed necessary for many reasons. For one, it was considered a valuable new front in the raging culture wars of the 1970s (as a means to stave off the pervasive influence of American broadcasters), and for another, this new network would be able to utilize technological breakthroughs in the telecommunications industry (many of which had come courtesy of the aerospace industry). In addition, Canadian sports coverage was still considered haphazard at this time. Stanfield’s beloved Canadian Football League (CFL) did not see organized broadcasts on the scale of Hockey Night in Canada, despite Canadian football being easily the second-most popular sport in the country, particularly in the West. And as for hockey, ever since the two major leagues had merged in 1977, CTV was without what had been their flagship sports show: WHA Hockey Tonight. This meant that the six Canadian NHL teams, taken together, now received far less television coverage than the three old NHL teams and the three old WHA teams that played in Canada had each received before the merger. Since Hockey Night in Canada was produced on the national level by the CBC and had long shown pronounced favouritism toward the two “Original Six” Canadian teams, the Toronto Maple Leafs and the Montreal Canadiens, this sea change proved extremely displeasing to Western hockey fans (whose teams in Vancouver, Edmonton, and Winnipeg were sorely underrepresented). However, despite the emphasis on sport, it was included in the plans for CBC-2 only because “sport” fell under the purview of “culture” – and in the case of hockey, “heritage” as well. By 1978, Stanfield’s Olympic training régime was in full swing, and he felt that the best way to reap the dividends thereof was to showcase them to the fullest, doubling down on his bets. Therefore, it was his hope that CBC-2 could be on the air by the 1980 Olympic Games, the first of which were to be held in February. Upon winning another majority in August of 1978, the Tory government passed the necessary legislation (Bill C-8) through the House of Commons with ease (none of the three opposition parties opposed it), and it sailed through the Senate (which, by now, also enjoyed a Tory majority, due to further appointments by Stanfield) before being granted Royal Assent by the Governor-General on February 15, 1979. The new Act called for the creation of CBC-2 prior to the end of the decade – officially by December 31, 1979, at 11:59 PM. The celebratory New Year’s atmosphere which marked this début would become a hallmark from then on, especially as most of the other Canadian networks did little to mark the occasion. [6]

    Private television was also expanding in the 1970s, which neatly complemented what was happening in public television. By the dawn of the 1980s, there were effectively four national networks: the fourth, in addition to CBC-1, CBC-2, and the formerly lone private broadcaster CTV, was the Global Television Network, known simply as Global. Initially, its power base lay solely in Southern Ontario, which was densely populated but hardly provided anything approaching the reach that the very name of the network might have implied – a name which was actually a reference to how the network was interconnected, via satellites parked in geosynchronous orbit far above the globe of the Earth. Fortunately, Global found buzz in what rapidly became its flagship show, SCTV, originally intended as counterprogramming against the popular WHA Hockey Tonight show on CTV before the NHL/WHA merger forced that program’s cancellation at its ratings height, which drove viewers elsewhere. However, the ambitions of those in charge at Global to reach a cross-country audience, a mari usque ad mare, were stymied by the federal government (via the CRTC), and would not be realized until the network became a target of the growing CanWest media empire, based in Manitoba and under the control of politician-turned-mogul Israel “Izzy” Asper. His ambition knew no limits: he had formed and headed a consortium to purchase an American television station and bring it north to his base of operations in Winnipeg, managing to place his bid well ahead of all comers and eventually earning the reluctant go-ahead from the CRTC, which monitored the situation closely. Negotiations were long and convoluted, but eventually he prevailed, and KCND, channel 12, based in Pembina, North Dakota, was re-branded as CKND, channel 9, Winnipeg, on June 1, 1974, beginning operations with a special report on the pending federal election; the writ was dropped just two days later. [7] This audacious trans-national channel hop would prove instrumental to Asper’s strategy, and a vital launch point for his planned expansion. Two years later he purchased the Global Television Network, which was facing major financial difficulties at the time. Given that Global covered a much larger audience than his CanWest Broadcasting (and that CKND largely rebroadcast Global programming anyway), he then renamed his company CanWest Global Broadcasting to reflect the new weighting of his assets. The broadcaster would continue to expand through the 1970s into such markets as Vancouver, Calgary, and the Hamilton/Niagara region of Ontario, making it truly national, at least within English Canada. [8] At the same time, programming on the network would receive more generous budgets and be viewed by ever-growing audiences. This turned SCTV into a household name, even giving it a large and devoted American fanbase, which eventually led NBC to come calling.

    The longer-standing private network, CTV, was having growing pains of its own. The owner of flagship station CFTO in Toronto, John Bassett, had profited immensely from spearheading the WHA Hockey Tonight program, and its cancellation in 1977, upon the absorption of the WHA into the NHL, had left a massive void in the network schedule (given that the program had aired nationwide). Bassett himself, who had been reimbursed by the NHL upon the cancellation of WHAHT, decided to buy out the owner of the Toronto Maple Leafs (and his one-time partner), Harold “Pal Hal” Ballard, whose corrupt and miserly ownership style had alienated fans and enraged players – the Leafs had not won the Stanley Cup since the expansion era had begun in 1967. [9] Ballard had massively overcharged Bassett for the team, but in this case pride won out over practicality. But CTV, which for a brief, shining moment had been a leader in sports on television, now found itself adrift. It was therefore decided that it would be in the network’s best interests to co-operate with French-language media in an attempt to nurture the growth of sports besides hockey (on whose broadcasts the CBC had a stranglehold). Enter TVA.

    TVA, though it was organized on a co-operative model similar to that of CTV, had vastly different origins. It was established in the 1960s as a program-sharing arrangement between CFTM-TV (branded as Télé-Métropole), a popular private TV station in Montreal, a station in Quebec City that had just lost its affiliation with Radio-Canada, and a station in Chicoutimi that was desperate for revenue. It was not until 1971 that this informal arrangement was crystallized into the Téléviseurs associés (or TVA) co-operative. This new network became even more popular than Radio-Canada; though Radio-Canada had originally been the viewing destination for French-Canadian audiences, there was a great deal of internal tension at that network, typified by disputes between factions who wanted to take it in different directions. This was exemplified by certain programs, including news broadcasts, which sometimes mimicked Metropolitan pronunciations as opposed to the native Québécois dialect. TVA, on the other hand, was much more populist, and as a result its programming attracted larger, more desirable audiences, and therefore greater revenues. Given such success, more stations joined the network, to the point that TVA soon had coverage over most of Quebec as well as parts of Ontario and New Brunswick (other provinces with large Francophone populations). Considering that TVA was often seen as the French-Canadian version of CTV, it seemed natural that both networks should want to pool their resources. And as the CanCon restrictions continued to tighten (as a compromise for allocating partial credit points to Commonwealth programming), CTV had ample incentive to tap the French-Canadian market for programming ideas, especially since French-Canadian television was largely composed of homegrown shows which were very popular with Canadian audiences. CTV, by contrast, had faced disaster whenever it had attempted any truly ambitious homegrown productions of its own, most notably with the debacle which had been The Trouble with Tracy. CTV and TVA initially developed their nascent “alliance”, as it was unofficially known, by collaborating on programming ideas, including the adaptation of TVA series for English-Canadian audiences. CTV also took on the responsibility of broadcasting the TVA network feed outside of TVA’s core regions, including completing TVA coverage in Ontario for the benefit of the French-Canadian communities in Southern Ontario, as well as carrying it in Western Canada. This gave those viewers greater choice, reflective of the situation enjoyed by English-speaking viewers in this era, and ended the Radio-Canada monopoly for French-Canadians in Anglo territory. Usually, this meant using the base schedule of CFTM-TV specifically, despite the wide variation in scheduling between stations in the TVA core regions, adding on local news and programming as needed. The one exception to this policy was in the four Atlantic provinces of Canada, which instead used the feed of an existing station in Carleton-sur-Mer, in eastern Quebec, which already had its own rebroadcasters in New Brunswick, in part due to its proximity to the Quebec/New Brunswick border. Additional rebroadcasters were built in Nova Scotia and Prince Edward Island. This helped CTV to accrue their missing “CanCon” points and clear their minimum thresholds, while at the same time granting additional exposure to TVA. In terms of specific programming, it was surprisingly not serialized, episodic fiction, but instead a certain sport which bolstered the profile of TVA; the network’s core coverage area significantly overlapped with that of a nascent soccer league, the Atlantic Major Junior Soccer League.

    The sport known in most of the world as football, but in North America as “soccer”, was experiencing unprecedented popularity in that region after a decades-long dry spell, due in large part to the North American Soccer League, or NASL. However, that organization had fundamental problems which did not bode well for its long-term viability; these included excessively rapid over-expansion (which, granted, was common to all of the sports leagues, to varying extents, through the 1970s), and heavy rules modifications in order to make the game more “exciting” to American audiences (most infamously, a clock which counted down to zero, as opposed to one which counted up to ninety, to match the playing format of all the other North American sports) – a compromise born of the “Soccer War” which had been the death knell for the sport in its previous wave of popularity stateside. Perhaps most vitally, all of the star players were imported from countries with large soccer-playing populations, as the United States had never possessed a sufficiently large or talented native base of soccer-playing youths; those American athletes who did play in the NASL spent most of their games warming the benches. This latter problem was one which would eventually be corrected by the free market, though not without great cost for the established order. In fact, it was the collapse of the Boston Minutemen team in 1976 (ending the NASL presence in New England) which precipitated the drive for a new youth league, the Atlantic Major Junior Soccer League, or AMJSL; this had been inspired by the popularity of college soccer in the region during this era. The AMJSL would soon expand well past the traditional borders of New England, into Upstate New York and Canada (largely French Canada, but also Atlantic Canada, something of a cultural satellite of New England at any rate). In the United States, games played by the league were broadcast on PBS stations; in French Canada, TVA carried the matches played by local teams (whose league was known in French as la Ligue de soccer junior majeur de l'Atlantique, or LSJMA). There were many local teams in small-town New England and Quebec, and this informed the character of the league – despite the name, the AMJSL was effectively a minor league in virtually all respects. It was also more European in organization, formally introducing the concepts of promotion and relegation based on team performance. Because expenses were very low (player costs were virtually nonexistent, as few teams were even semi-professional, and soccer required minimal equipment and supplies beyond simple uniforms and a freshly-mown lawn – or turf, as was especially common in this era), the league was profitable from early on. In addition, it raised the profile of the previously “highbrow” PBS stations with audiences that had not been frequent viewers beforehand, including in the most highly desirable demographics. This attracted a whole new group of sponsors to the affiliates in the region, such as WGBH Boston and WNED Buffalo (which was also carried throughout Southern Ontario, in Canada); many of these sponsors would then offer their support for programming which did not necessarily mesh with their desired customer base.

    CTV/TVA and their interests in other sports remained, for the time being, in the shadow of John Bassett’s involvement in the NHL; he had bought back the Toronto Maple Leafs in the wake of the NHL/WHA merger. That merger had long-lasting and significant repercussions, but surprisingly, there was remarkable stability at the top tiers of the NHL both prior to and following the merger. It began with the back-to-back Stanley Cup wins by the Philadelphia Flyers in 1974 and 1975 – the “Broad Street Bullies”, as they were known for their rough, no-holds-barred style of play, were the first expansion team to hoist Lord Stanley’s Mug, and the first from outside the Original Six since the long-defunct Montreal Maroons had done so in 1935. However, their success did not last; a new dynasty was soon to emerge. The Boston Bruins, the oldest American team in the NHL, had up to 1976 won only five Stanley Cups in their half-century-long history, two of which dated from the Expansion Era of the last decade (one against the laughably impotent St. Louis Blues in 1970). The Bruins roared back to prominence under the guidance of Don Cherry, a perennial minor-league hockey player (in his twenty-year career, he had played one NHL game – as a Bruin, in 1955) who then went into coaching, building a mediocre record with the Rochester Americans of the American Hockey League. Cherry was nonetheless hired to join the Bruins organization as a coach in 1974 – the rapid expansion of the NHL and the emergence of the WHA had created a large surplus of job opportunities. His first season saw the Bruins perform respectably (finishing second in their four-team division), but from then on, the team struck gold, year after year. [10] In the 1976 playoffs, the Bruins defeated the Toronto Maple Leafs in the semi-finals, just after the Leafs had defeated the Flyers in the quarter-finals in a shocking upset, thus tragically dashing that city’s suddenly renewed hopes – Cherry, a devoted Leafs fan from his youth, was said to be deeply ambivalent. He was reinvigorated when his team faced off against Montreal in the finals – their sixth such face-off, with all five previous encounters having been won by Montreal. Cherry, an ardent English Canadian nationalist, famously did not care for la belle province of Quebec, and that went double for Montreal, its largest city (despite its substantial Anglophone population). Although his attitudes spat in the face of the unified Canada which Prime Minister Stanfield was trying to create in the wake of the Quiet Revolution, the FLQ, and (later that year) the election of a separatist government in Quebec, they did not prevent his team from being successful, winning the Cup in a narrow victory of four games to three.

    The final season of the pre-merger NHL saw Montreal and Boston facing off against each other yet again, for a seventh time overall. As far as Cherry was concerned, it was “lucky number seven”. He repeated this catchphrase to reporters as the finals wore on to a game seven, in which the Bruins scored the game-winning goal against Montreal… with thirty-seven seconds remaining. However, such dramatics were mooted for the future when, after the merger, Montreal and Boston were assigned to the same conference (the Prince of Wales), meaning that whichever of them won the conference final would face the winner of the Clarence Campbell Conference – in 1978, that winning team was the Winnipeg Jets, one of the old WHA teams, which had upset the New York Rangers, an Original Six team, in their conference final. [11] The Bruins ensured that the Jets’ success would go no further, sweeping them in the Cup finals, four games to none. The Jets, who had a large contingent of European players, were famously denigrated by Cherry, whose roster (despite his American venue) was overwhelmingly Canadian; he crowed that the Bruins’ victory marked the first step of Canadian reconquest in the hockey arena. The irony of the coach of an American team, having defeated a Canadian one, saying this was not lost on anyone. By this time, even though it was clear that the Bruins had formed a dynasty and that Cherry himself had played a large part in that, he was becoming far better known for his sound bites – the most notorious coach of perhaps any major league sport in this era. Many in the NHL brass disliked Cherry, who they felt had an attitude that ran contrary to the image they were trying to project; Bruins management, being American, naturally had a serious love-hate relationship with him. As long as he kept winning, they could not reasonably dismiss him – and he did. In a rematch with Winnipeg, the Bruins won their fourth consecutive Stanley Cup in 1979, capping their impressive dynasty. However, all good things must come to an end: in the 1979-80 season, the Bruins surprisingly failed to make the playoffs, and Bruins brass fired Cherry the moment that this became clear. Yet “Grapes”, as he was also known (a pun on his fruit-related surname and the fitting expression “sour grapes”), landed on his feet, though not in a way that anyone might have expected.

    Though the Bruins – who were located in a climate that provided reasonably cold winters, and therefore natural ice when the days were short – were successful, the same could not be said for other American teams, particularly those in what was becoming known as the “Sunbelt”, which enjoyed year-round warm weather. The popular WHA team, the Houston Aeros, had been prohibited from joining the NHL for fear that they would become unprofitable should their fortunes change. Already, the league had been forced to move a team from the comparatively cooler Bay Area (the California Golden Seals, who became the Cleveland Barons in 1976, and thrived thanks to instant rivalries with the Pittsburgh Penguins and, later, the Cincinnati Stingers). Only two warm-weather teams out of 24 remained: Los Angeles and Atlanta. By the dawn of the 1980s, there would be only one. The Atlanta Flames (named after the conflagration that had famously consumed the city during the U.S. Civil War) were in dire straits. Audience response was poor, and player performance was little better. Their owner, Tom Cousins, long tired of the red ink that came with operating the Flames, sold them to a consortium of owners from the Canadian city of Calgary (in the oil-rich province of Alberta) in 1980, for $20 million. [12] Calgary had also been badly damaged by fire in its early history, so the name Flames stuck. Indeed, given the importance of the oil industry to the city, it seemed even more appropriate. And hired as the new coach of the Flames for their inaugural Calgarian season in 1980-81 was none other than Don Cherry, who drew ire from his former American fanbase when he proudly announced that both he and the Flames were “right here, where hockey really belongs”. Despite the coach generating far more publicity off the ice than on it, his team made the playoffs in its first season, though they were defeated in the semi-finals by none other than the Montreal Canadiens, in what was no doubt a bittersweet series for Grapes. [13]

    1980 would indeed prove a tremendously auspicious year for Canadian hockey, of which the Flames’ move from Atlanta to Calgary was merely the culmination. Team Canada had a lot riding on the 1980 Winter Olympics in Vancouver, especially since their men’s hockey team was once again in contention. Prime Minister Stanfield was laying it all on the line – it had been eight years since he had promised to mould the street hockey-playing kids of the day into world-class athletes. Among this cohort of young men were goaltender Paul Pageau (born in 1959) and forward Ken Berry (born in 1960), who would emerge as the stars of Team Canada. But despite this road-tested talent, it was an uphill skate indeed. The qualifying round for the ice hockey championship would consist of twelve teams evenly divided into two divisions; these teams were chosen in preliminary contests which preceded the actual Olympic games. Each team would play one match against each of its five divisional opponents, and the two teams in each division with the best record would advance to the final round. (The third-best in each division would instead move on to the consolation round.) The points won during the qualifying round would carry forward into the final round, in which each team would face the two finalists that it had not met in divisional play. The medals would then be assigned to the three countries with the best overall records after the finals had concluded.

    Canada had been placed in the Red Division, which included Finland and Czechoslovakia, having the good fortune to avoid the much tougher Blue Division, in which their ancient and most bitter adversary, the Soviet Union, had been placed, along with the United States and Sweden. Rounding out the Red Division were Poland, West Germany, and the Netherlands. Canada defeated them all, along with Czechoslovakia, losing only to Finland (who entered the finals with a clean sweep). [14] Czechoslovakia, having won three games out of five, went on to the consolation round. In the Blue Division, the Soviets, naturally, also won all five of their games; Sweden won four out of five, including (crucially) a 3-2 victory over the United States; the USA, which had won three of their matches in the qualifying round, would ultimately finish fifth overall by beating Czechoslovakia in the consolation round. The “final four” were thus Canada, Finland, the Soviet Union, and Sweden. Canada would not have to face Finland again – though if the two surviving Red Division teams each won their two games against the Blue Division finalists, Finland would take the gold on the strength of their divisional win against Canada. And this was considered highly unlikely, given the Soviet juggernaut (which had won gold in this event at the last four Olympic Winter Games). Sweden, though certainly not a powerhouse on the level of Soviet Russia, nevertheless remained a strong contender which could very well play spoiler.

    The anticipation felt by both sides ahead of the reckoning that was the Canada-USSR game, the first since the Summit Series of 1974, was palpable. And the International Olympic Committee did not believe in delaying the inevitable; it was the first match scheduled out of the four, on February 22, a date which would go down in sporting history. The Canadian team, composed primarily of amateur and collegiate players, stood against the well-oiled (and semi-professional) Soviet juggernaut, the David to their Goliath. By the assessment of many, Canada was lucky to get this far upon their return to Olympic hockey, and would likely not improve upon the bronze medal they had received at Grenoble 1968. Vegas odds were so overwhelmingly in favour of the Soviets dominating Team Canada that many gamblers put money on the reverse, just because the potential payout was so enormous. Canadian commentators, naturally, had bought into the hype fed to them by the Stanfield government for the last several years, and did not waver in their support, or enthusiasm. Don Cherry, who had been invited to provide analysis on CBC-2, confidently crowed with his standard nationalist rhetoric that his team was sure to triumph over those unsporting “Russkies”. Surprisingly, many American viewers, eager to see revenge for their heartbreaking defeat by Sweden (and their inevitable one by the Soviet Union), rose in support of the scrappy Canucks; they had always rallied behind an underdog, and in this case, were even willing to cross borders in order to do so. In fact, it seemed like something out of an American sports picture: the meek, polite, and unobtrusive yet utterly determined Canadians were up against the veteran but complacent Soviets. Many Americans sympathized with the drought which Team Canada had been facing in the hockey arena, particularly those who supported certain baseball teams (one of which, in a supreme irony, was the long-suffering Boston Red Sox, who played in the same city as the Bruins). US-Canada relations were stronger than ever during this era, and many Canadians were appreciative of this groundswell of support, though perhaps slightly bemused that they had become, in their view, something of a proxy for other battles. But nevertheless, when the game began at 5:00 PM, Pacific Standard Time, an eerie hush hung over the viewers at home throughout North America. It was a high-scoring game: both sides scored a goal in the first period, and then Canada let in two more while only managing one in return during the second; the score was 3-2 for the Soviets as play entered the third and final period, before the Russians quickly scored yet again in the first minute. Canada retaliated with a single goal, and then another, managing to tie the game with just minutes to spare. The crowd went wild. Viewers at home were on the edge of their seats. (Fittingly, in the most populous sections of both Canada and the United States, it was primetime.) And then, with nineteen seconds remaining, Ken Berry scored the game-winning goal for Canada, bringing the score to 5-4, and the beyond-capacity crowd to their feet. That relentless American optimism had, yet again, paid off, as had those bets which were made on the longshot. Cherry, when asked for comment following the upset victory, made his famous declaration: “Let me tell you something, everybody’s going to say that there was a miracle. Do you believe in miracles? Because that was no miracle, that was our Canadian boys playing their hearts out and beating the Russkies like they were supposed to, like we knew they would all along”. [15] Despite his confidence, most outside observers did come to view the game as a miracle, and it thus became known as the “Miracle on Ice”; indeed, Cherry was often credited as the “inventor” of the term, despite his speech (delivered in his thick and rapid-fire Canadian accent) having claimed the exact opposite; unsurprisingly, the term actually originated with supportive American sports reporters, who attributed a meaning to Grapes and his rantings which he plainly did not intend. But despite the wave of spontaneous street parties which broke out across the Great White North (in the dead of winter, no less!), many pundits urged caution; the gold medal was not at hand just yet.

    Sweden, meanwhile, had narrowly beaten Finland in their match, which took place immediately after the “Miracle on Ice”. On February 24, 1980, they would then face Canada; the winner of that match would receive the gold medal. The loser, on the other hand, would not take the silver, but only the bronze medal; the silver would instead be awarded to whichever team won the Soviet-Finland match which was to follow (due to Canada and Sweden having won only four out of their five divisional games, compared to the sweep achieved by both Finland and the Soviets). By the time that fateful date had arrived, less than forty-eight hours after the unlikely Canadian victory, both the players and their audience were exhilarated, a marked contrast to the trepidation that had marked the prelude to the match against the Soviets. The Americans were no less enthused on their behalf, and this contributed to Canada entering the match as the heavy favourite, at least by the assessment of the pundits who were prognosticating. This rather patriotic interpretation (whether to the first or the second degree, and a given with Olympics Fever at epidemic levels) ignored the facts on the ground: both countries entered the match with an identical record – four out of five games won in the divisional round, followed by a single win thus far in the finals. However, Canada did have history: it had won the gold six times before (though most recently in Oslo 1952); Sweden, on the other hand, had never done so, only taking the silver on two occasions, most recently in Innsbruck 1964. Canada also had the home ice advantage this time – for the first time ever in the Winter Olympics, in fact – and the Vancouver audience was overwhelmingly supportive of Team Canada, perhaps even more so than it had been during the Miracle on Ice match. Perhaps most telling was who was in the audience this time: Prime Minister Stanfield had a front-row seat. The second-largest audience bloc, the Americans (many of whom had driven up from Seattle, or down from Alaska), were also strongly pro-Canadian; that single morale booster might have made all the difference.

    Either way, Canada did indeed win the game – and the gold medal – with a score of 2-1 (the game-winning goal being scored once again by Berry, who was naturally chosen as Canadian flag-bearer in the closing ceremonies). Sweden took the bronze, finishing below the Soviet Union after the Russians shut out the Finnish team later that night. After 28 years, and so many hard-fought battles, Canada was back on top. Prime Minister Stanfield – whose elated reaction was famously captured by cameras immediately after the clock had run out, and displayed in newspapers across the country the following day – was completely vindicated. [16] Certainly, 1980 enshrined hockey’s vaunted place in the North American sporting landscape, reinforcing not only the existing Canadian enthusiasm for the game, but also that of Americans, at least those living in suitable climes. Indeed, the many other wins for Canada at the Vancouver Olympics were considered mere footnotes in comparison: the Crazy Canucks once again swept the downhill skiing event, though the gold medalist in 1976 (American-born Ken Read) was not amongst the top three. In fact, only Dave Irwin (who won silver in 1976, and bronze in 1980) repeated; the gold medal went to Steve Podborski. Gaétan Boucher, who had won a bronze medal in speed skating back in 1976 (at the age of 17), took two medals: the gold in the 1,000-metre event, and the silver in the 500-metre event. Other skating medalists included female speed-skater Sylvia Burka, capping her career with a third-place finish in the 1,000-metre event, and the pair of John Dowding and Laura Wighton, who won the bronze for their performance in ice dancing. Rounding out the medals won by the Dominion of Canada in their own country was Kathy Kreiner, who had taken the gold medal for the giant slalom in Denver; oddly, this time she placed only in the downhill, winning the silver medal. This left Canada with 3 gold, 3 silver, and 4 bronze medals, for a total of ten; three more than in Denver. [17] This placed them fourth overall, an extremely solid performance.
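    For those keeping score at home, the podium described above follows directly from the carried-forward points system outlined a few paragraphs back. The sketch below is mine, not part of the original update: it assumes the conventional two points per win and a head-to-head tiebreaker, neither of which the format description actually spells out, so treat both as illustrative assumptions.

```python
# Tally the final-round standings from qualifying records carried forward plus
# final-round results, as described in the narrative above.
from functools import cmp_to_key

# Qualifying-round wins for the four finalists (out of five games each).
qualifying_wins = {"Canada": 4, "Sweden": 4, "USSR": 5, "Finland": 5}

# Final-round results as (winner, loser) pairs, per the narrative.
final_results = [("Canada", "USSR"), ("Sweden", "Finland"),
                 ("Canada", "Sweden"), ("USSR", "Finland")]

points = {team: 2 * wins for team, wins in qualifying_wins.items()}  # assumed: 2 pts per win
for winner, _ in final_results:
    points[winner] += 2

# Head-to-head results, used only to break ties on points; the Finland-Canada
# divisional game is included since the text treats it as a potential tiebreaker.
head_to_head = set(final_results) | {("Finland", "Canada")}

def compare(a, b):
    if points[a] != points[b]:
        return points[b] - points[a]            # more points ranks higher
    return -1 if (a, b) in head_to_head else 1  # otherwise, head-to-head winner first

standings = sorted(points, key=cmp_to_key(compare))
print(standings)
# ['Canada', 'USSR', 'Sweden', 'Finland'] -> gold, silver, bronze, fourth,
# matching the medal order described above.
```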

    Moscow was another story. That the capital of Russia, the Soviet Union, and the Second World in general was hosting the Games represented an opportunity for the Reds to rehabilitate their reputation after facing numerous setbacks in the late 1970s, up to and including their ill-fated decision to invade Afghanistan in order to protect their interests in that Central Asian monarchy-turned-republic following a Communist revolution there. The international reaction was overwhelmingly hostile, but President Reagan chose not to pursue heavy sanctions against the USSR, or even to agree with a proposed boycott of the Moscow Olympics, as part of a gentleman’s agreement that had emerged between the two superpowers – the Soviets would turn the other cheek to the continued American military presence in Iran, so long as the United States ceded any interests in Afghanistan, formally establishing both states in their respective spheres of interest and extending the Iron Curtain across the Caspian Sea to the Indus Valley. [18] Many American political commentators were irate, particularly those cultural conservatives whom Reagan had already alienated with his opposition to the Briggs Initiative – but Reagan’s staunchly anti-communist reputation was unimpeachable, aided by his vocal opposition to Red China and his government’s continued lack of recognition for the PRC (ironically, one of the few countries which did boycott the Moscow games). A famous expression, “Only Reagan could go to Moscow” [19] (in which the American competitors formed a metaphorical proxy), emerged as a result of this situation; most of the First World followed the American lead and attended. But mostly for naught; the Soviets utterly dominated the events, finishing at an even more lopsided first place than they had in Montreal, winning over 50 gold medals alone. [20] East Germany, naturally, finished second, and the United States came in third. Rounding out the top ten were six more Communist states – Bulgaria, Cuba, Czechoslovakia, Hungary, Romania, and Poland – along with West Germany. Canada (along with the other Western democracies) did not perform nearly as well, and it was a credit to the afterglow from the victories in Vancouver that Moscow did not prove so glaring a disappointment, despite being a major setback compared to Montreal, four years earlier.

    Given that CBC-2 and Radio-Canada Télé-2 were created more-or-less for the express purpose of broadcasting the Olympic Games, the question of how the events were to be aired, both on the original CBC and Radio-Canada networks (quietly rebranded CBC-1 and Radio-Canada Télé-1, respectively, on the morning of January 2, 1980), and on the new networks, was a critical one. Simulcasting was an elegant enough solution for the really big events – the Opening Ceremonies, for example, were aired all across the CBC family of networks (including its external service, Radio Canada International), as were the Closing Ceremonies. In general, however, CBC-1 carried only the “major events” live or as-live, where such a thing was feasible – a cinch for Vancouver, but a logistical (and scheduling) nightmare for Moscow. Fortunately, though the Soviets had allowed other aspects of their once-resplendent space program to lie fallow, the germ of it – telecommunications satellites – remained robust. Their Gorizont system, designed specifically for the Olympics, allowed them to transmit their television broadcasts across the whole length and breadth of the USSR, which just so happened to be the only country in the world even larger, geographically speaking, than Canada. [21] This allowed the CBC to use taped broadcasts from Soviet television in order to supplement their own recordings. CBC-2, therefore, broadcast all events in the Summer Games live from Moscow, while CBC-1 carried the highlights and major events on tape delay in primetime. In Vancouver, CBC-2 had likewise broadcast virtually all of the Olympic events live, but (given that the nascent network had far more airtime at its disposal, due to the lack of a time difference) it adopted a strategy of providing very comprehensive coverage of the entire Olympiad, including in-depth and historical perspectives on both the Olympics and the various sporting events which comprised its schedule. This resulted in the Olympics coverage on the CBC receiving profuse praise both within and outside of Canada, generating multiple International Emmy Awards for the Corporation, and it cemented CBC-2 as the home of “serious” sports reporting, which legitimized the decision by Stanfield to focus so heavily on that field instead of other, more “artistic” forms of culture. [22] This success would soon be replicated in future expansions into venues entirely different from traditional over-the-air broadcasting, the roots of which had already been laid down in 1976 with the launch of the Parliamentary Television Network.

    But there was more to CBC-2 than simply sporting or even high “culture” in general. One of the earliest “light entertainment” programs, in the parlance of British television, to become successful on the network was, fittingly, adapted from one of Britain’s most popular sitcoms of the era: To The Manor Born. The bilingual profile of Canada made for an excellent cultural translation of the divide between the landed gentry and the upwardly-mobile mercantile classes that existed in the original British version. However, given the many changes made to the premise to fit the new setting, a new name was chosen as well, one deemed more reflective of its new nature: Hello, et Bonjour. [23] Set in the small (and mostly-fictional) village of Ditchfield, in the Eastern Townships of Quebec, which had been settled by United Empire Loyalists in the late 18th century, it featured Mrs. Laura Spaulding, widow of a wealthy landowner, the scion of an ancient patrician family – or so it had seemed. Though the Spauldings had once belonged to the Château Clique, their fortunes (literally and figuratively) had fallen very far indeed in the many years since. Mrs. Spaulding, along with her son, Edward, was forced to sell their land to a nouveau riche French-Canadian entrepreneur from Longueuil, near Montreal, by the name of Guy Tremblay (whom she referred to, almost exclusively, as “that guy, Trem-blee”), a widower who lived with his son, Georges, who was close in age to Edward. Mrs. Spaulding and Edward continued to rent a small cottage on the property, near a small lake, residing there alongside their one remaining servant, a middle-aged, full-bodied housekeeper named Marie. M. Tremblay had built his empire on general-service goods stores, called magasins in French, hence his company Magasins Tremblay (or “Trem-blee Magazines”, according to Mrs. Spaulding, though the company did not operate newsstands). Another major character was a Franco-American supplier, M. Boisvert, from New England; he was played by a prominent Franco-American actor, Christophe Blanchard, who was well known for his Tony Award-winning performances on the Broadway stage. He was given the coveted “And” credit in the opening cast listing, though his appearances (as a confidant and business associate of M. Tremblay) were limited. The pilot was the first production to be filmed using the combined resources of the French-language Radio-Canada Télé-2 and the English-language CBC-2 Québec, in early 1980; this was shortly after the first season of To The Manor Born had been such a hit in the United Kingdom. Location scenes were shot in and around Sherbrooke, the largest city in the Eastern Townships region (and, like Ditchfield, a stronghold for the old United Empire Loyalists of two centuries before). Responses were good, and production was green-lit after the Summer Olympics, with filming done concurrently with the airing of the second season of To The Manor Born in late 1980. The show would premiere in February of 1981, running through to May with a 13-episode season, though it initially aired only on the Quebec affiliate stations of CBC-2 and Radio-Canada Télé-2 which produced it, in English and French respectively. Filming was done in English (with an eye to distribution to the rest of Canada), though the same crew produced a French dub which was used for Radio-Canada Télé-2, wherein the voices of Mrs. Spaulding and her son Edward were dubbed with upper-class Parisian accents – a treatment then applied to all “Anglo” characters, in order to retain the linguistic (or, in this case, dialectal) barrier. [24] Ratings were so strong that network executives announced that Hello, et Bonjour would be added to the national feed of CBC-2 in the fall, with a second season ordered – indeed, the Montreal newspaper Le Devoir even compared its popularity to that of La famille Plouffe, Radio-Canada’s first major televised success, dating back to the 1950s (to the point where even hockey games were rescheduled so as to not conflict with airings of the show). The 1980 and 1980-81 seasons were rousing successes for CBC-2, which overall marked a sea change in the Canadian television landscape, one which was wholly reflective of the seismic shifts taking place across the border…

    ---

    [1] In OTL, this organization has been known since 1976 as the Canadian Radio-television and Telecommunications Commission (CRTC). The name was not changed ITTL – we will assume that the Trudeau government was behind it IOTL.

    [2] You can read more about CanCon regulations right here – you might find that I was not kidding when I described them as Byzantine (although, given the Quebec legal system, perhaps “Napoleonic” might have been more apt).

    [3] New Zealand occupies a position in terms of programming content relative to Australia which is very much analogous to that of Canada relative to the United States – much of their programming is imported from there (in addition to the US and the UK), and the present day reputation of God’s Own Country as an ideal filming location for outsourced productions is well over a decade into the future from this point IOTL.

    [4] IOTL, the 1980 Winter Olympics were awarded to Lake Placid, in the United States (which had previously hosted them in 1932), but recall that ITTL the previous Winter games also took place in the United States (in Denver). Therefore, they are awarded to the only other city which put in a bid IOTL, Vancouver (which it withdrew before the IOC was to make a decision). Note that consecutive games in the same country in different seasons do not seem to have been a problem IOTL (as Lake Placid 1980 was immediately followed by Los Angeles 1984), and considering that Vancouver did indeed bid despite Montreal 1976 taking place, I’m willing to award it to them. Vancouver, of course, would eventually host the Winter Olympics in 2010, IOTL, though alongside Whistler, not Garibaldi.

    [5] This is, in fact, an elaboration of the original application submitted by the CBC to the CRTC in 1980 IOTL – and using the same exact channel names – which the CRTC denied. The original application foresaw a single channel broadcast nationwide, bringing the regions to a national audience as a basic, non-commercial, must-carry service (in their wording); ITTL, that description applies to the common channel, with the basic idea extended to the over-the-air (“rabbit ears”) regional systems as well, and with the common channel serving as the national feed.

    [6] Unless you count simulcasting Dick Clark’s New Year’s Rockin’ Eve.

    [7] IOTL, CKND launched in 1975 with the Jerry Lewis MDA Labor Day Telethon.

    [8] CHCH, sometimes described in Canada as “the Superstation”, was an independent for most of its early history (after being founded as a CBC affiliate) and would remain so for much longer IOTL, coming into the CanWest sphere of influence much later than ITTL, though it has once again re-emerged as an independent station in recent years.

    [9] Ballard would retain his ownership of the Toronto Maple Leafs until his death in 1990, IOTL. The Leafs, notoriously, have yet to win another Stanley Cup, the longest drought in league history (shared with the St. Louis Blues, who were founded in 1967 and have yet to win their first); their previous record was so strong that they still have more Cups overall than any other NHL team, except for the Montreal Canadiens and the Detroit Red Wings.

    [10] Don Cherry has achieved far more success – and certainly far more notoriety – as the colour commentator on the popular Hockey Night in Canada segment “Coach’s Corner”, a position he has held since 1981 IOTL (shortly after the end of his professional coaching career). His political views are strongly right-populist, and though he is very popular with audiences, individuals (and commentators) tend to judge him according to their own political views. He never led the Bruins to the Cup IOTL, though they made the Finals twice during his five-year tenure as their coach (both times against the Canadiens dynasty of the late 1970s), before moving to Colorado in 1979.

    [11] The Winnipeg Jets, still part of the moribund WHA, won the Avco World Cup in both 1978 and 1979 IOTL. As their roster is better-protected under the more equitable (and therefore, favourable for the former clubs of the WHA) terms of the 1977 merger ITTL, it is able to emerge victorious over the relatively weaker Clarence Campbell Conference of the era, only to run up against the brick wall that is the champion of the other conference.

    [12] IOTL, the Flames were sold for $16 million; the economy, still larger than that of OTL in absolute terms, despite the massive recession, affects the purchase price.

    [13] The Flames did indeed see their fortunes improve dramatically with the change in venue; this allows Grapes to take credit for it (and given his sterling tenure with the Bruins, he’s far more likely to be taken at his word), which keeps him employed by them as a coach.

    [14] The Red Division contained the Soviet Union, not Czechoslovakia, IOTL, and it naturally swept that division, moving onto the finals. In addition, Finland narrowly defeated Canada in the qualifying rounds, and thus moved onto the finals, where they finished fourth overall (tying Sweden and, of course, losing to the United States). Canada, meanwhile, lost the consolation game – yes, against Czechoslovakia – and finished sixth.

    [15] IOTL, of course, the term “Miracle on Ice” (coined by ABC sports reporter Al Michaels) refers to the defeat of the Soviet Union by the United States.

    [16] Canada would not receive a gold medal in (men’s) ice hockey until 2002, IOTL, in Salt Lake City, before (yes) repeating in Vancouver.

    [17] Canada won only one silver (for Boucher) and one bronze (for Podborski – the only Crazy Canuck to actually win a medal at the Olympics), IOTL.

    [18] Obviously, under President Carter (in the midst of the Iranian hostage crisis, no less!) the boycott went on as planned IOTL, perhaps in an effort to bolster his mortally-wounded foreign policy credentials; there being no such crisis ITTL says everything you need to know about how different the situation there is.

    [19] No, he didn’t go to China, but he did go to Moscow!

    [20] IOTL, with both the United States (the perpetual second- or third-place finisher) and West Germany (usually in fourth) not competing, the Soviet Union won eighty gold medals, as part of just under 200 overall. East Germany, which came in second, won very nearly as many gold medals as the Soviet Union did in Montreal 1976 (47 to 49). It was a massive drop to third-place that year (Bulgaria won “only” eight gold medals).

    [21] Of course, the Soviets also had other satellite systems in operation – including the ingenious Molniya system, which allowed remote locations of the Soviet Union – particularly areas of Siberia and the Soviet Far East where conventional satellite transmissions are difficult – to receive television broadcasts. The system lent its name to its distinctive non-geosynchronous elliptical orbital pattern, the Molniya orbit.

    [22] The influence of Roone Arledge (whose crews swamped Montreal in 1976) plays its part in informing the tone of reporting by CBC-2 in their coverage of the Olympics.

    [23] Like many television series ITTL, the name is derived in part from the program’s theme song, “The French Song” (or “Quand le soleil dit bonjour”), popularized by Franco-Manitoban singer Lucille Starr in the 1960s (which, in the tradition of the Germans Love David Hasselhoff trope, became popular in the Netherlands, of all places!).

    [24] Similar to the Canadian French dub of The Simpsons IOTL, in which the elites speak Standard French, and the rest of Springfield speak with Québécois accents.

    ---

    My many thanks to Dan1988, who served as the co-author of this update, and also to e of pi for very thoroughly proofing this very long and very dry post!

    I hope that you all enjoy this rather technical look at the nitty-gritty of the broadcasting industry. Now many of you may be wondering, where does this fit in with regard to American technology? Fortunately, that question will soon be answered. (For those of you wondering where this will fit in with regard to British technology, just assume that it’s mostly the same, only the video quality is slightly different.) And yes, to those of you Calgarians who asked, I agreed with you in moving the Flames from a doomed market to one which would appreciate them (one can only hope that the NHL will someday learn to do this with a given team that is not based in Atlanta). And, for those of you who were asking after the Miracle on Ice… well, you know what they say: be careful what you wish for, you may get it.

    [Attached image: CBC-2 Title and Broadcast Logos]
     
    For the World is Hollow and I Have Touched the Sky
  • For the World is Hollow and I Have Touched the Sky

    “I look at the example set by Miss Ball at Desilu, and I think: ‘Well gee, now there’s a great way to get started’.”

    Ted Turner, describing his nascent media empire in 1979

    Most great visionaries live well ahead of their time. One of the many examples of this phenomenon throughout history was that of the engineer and cosmologist Herman Potočnik, a Slovene who had previously served the Austro-Hungarian Empire. He had first described how satellites could be used for telecommunication and broadcasting purposes… in 1928, at a time when television itself was a very recent invention. It would be nearly a half-century before his vision of what would become known as geosynchronous satellites – those which orbited the Earth so as to perfectly match the planet’s rotation, always remaining directly above the same point on the surface, which allowed for constant communication between the two – would be realized. The technology finally came about as part of a series of innovations by two of the oldest and most dominant telecommunications companies in the United States – Western Union and RCA (which owned the NBC network) – which led to the definitive breakthrough in Earth-to-orbit communication, due largely to their respective Westar and SatCom satellite networks, launched in the mid-1970s. [1] As for the third telecommunications giant, the complacent “Ma Bell” (properly AT&T), that organization did not take nearly as active an interest in anything that lay outside their insanely lucrative monopoly – a marked contrast to the pronounced diversification of interests by RCA. And it was RCA whose flagship division produced the ideal outlet for the development of satellite-based telecommunications: the television set. The days of the so-called “rabbit-ears” antennae capturing easily-distorted and poor-quality transmissions from terrestrial broadcast towers could become a thing of the past with satellite, though that technology came with inherent environmental risks of its own. But the potential upside was very great: satellite transmissions could carry dozens of channels, as opposed to the half-dozen or so available in most terrestrial markets: the Big Three private networks of NBC, ABC, and CBS; the also-nationwide PBS; and at least one or two “independent” stations, though that term was a misnomer; many of them were actually owned by various companies which possessed small groups of stations, often widely dispersed among the over 200 media markets of the United States.
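    For readers who want to put a number on what “perfectly matching the planet’s rotation” actually requires, here is a small back-of-the-envelope sketch (mine, not the author’s) using Kepler’s third law and standard textbook constants; it is purely illustrative and not part of the original update.

```python
# A quick check of the altitude implied by a geosynchronous orbit: the one orbital
# radius at which a satellite's period matches the Earth's rotation.
import math

GM = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
T = 86_164.1               # one sidereal day (one full rotation of the Earth), in seconds
EARTH_RADIUS_KM = 6_378.1  # equatorial radius of the Earth, km

# Kepler's third law: T^2 = 4 * pi^2 * r^3 / GM  ->  r = (GM * T^2 / (4 * pi^2))^(1/3)
r_m = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = r_m / 1000 - EARTH_RADIUS_KM

print(f"Geosynchronous altitude: about {altitude_km:,.0f} km above the equator")
# Prints roughly 35,786 km -- the altitude at which geostationary birds such as
# Westar and SatCom were parked.
```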

    The smaller stations which were scattered across the country could not, by and large, produce their own programming; they lacked the economies of scale that benefitted the three major networks, as well as the attractive sponsorship deals and infusions of capital from the Corporation for Public Broadcasting which allowed PBS to compete with them. For this reason, what original programming local stations did produce was overwhelmingly local in focus – material which, in effect, could not be produced or aired by network affiliates; the exploits of local sports teams were a perennial favourite. To fill the remainder of their schedules, virtually all stations relied heavily on syndicated programming, which most of the television studios were happy to provide. However, federal rules relating to syndication exclusivity, or “syndex”, applied to every market in the United States, and served to restrict re-broadcasting of any given series to only a single channel therein. It was fortunate for the independent stations that the network affiliates, which had the largest budgets, had relatively few timeslots in which to schedule reruns of shows that were not being actively produced by their respective networks; independents, on the other hand, could devote most of their own schedules to shows which had entered the syndication market, which (coupled with a dearth of original material) earned them the popular nickname of “rerun farms”.

    It was no surprise that the oeuvre of Desilu Productions was among the dominant crops grown on these rerun farms. I Love Lucy, though it did not fall back into the possession of Desilu itself until 1981, was a standby, particularly in early weekday afternoons – perhaps as counter-programming to the wildly popular soap operas on network television, as all sorts of comedy series could often be found airing against them. Star Trek, of course, though it was predominantly associated with 7:00 PM weeknight timeslots in the 1970s (mostly on NBC affiliates), was available at all hours of the day across the USA in markets where that iconic scheduling arrangement was not in place. These “exceptions” numbered nearly 100 at the height of Star Trek’s second-run popularity in the early 1970s – mostly among the smaller markets, particularly in the South. [2] But above all it was Doctor Who which was most heavily exploited by independent stations, as (unlike the previously-mentioned series) it was less desired by the network affiliates from the outset. The Third Doctor became a mainstay by the mid-1970s, cementing his stateside reputation as the Doctor even after the Yank Years had concluded; sales of the exploits of his predecessors, however, were sluggish until after “The Three Doctors” serial aired in 1973. The Second Doctor, who had interacted frequently with the Third in the serial, to the delight of audiences, became more sought-after as a result, and would soon be found populating the off-hours on various station schedules. The First Doctor, on the other hand, remained more elusive, except on those remote (and especially redundant, in larger markets) UHF stations which were most desperate for any sort of content. [3] But those stations which programmed shrewdly were duly rewarded with good ratings – and commensurate advertising revenues. And as is so often the case with enterprising small businessmen, the most successful among them possessed the ambition necessary to expand their media “empires”. Without question, the definitive example of this was Robert Edward Turner III, who was better known by the alliterative name of “Ted”.

    Ted Turner had inherited the business interests of his father in 1963, at the age of 24; fittingly enough, these were in advertising, the very lifeblood of the television industry. His remarkable success in that field led him to expand into radio, purchasing several stations in the Southeast region, but this new venture was a mere stepping stone en route to the real prize: his very own television station. He purchased a struggling UHF station, WJRJ-TV 17 in Atlanta, in 1969. Established just two years before (and named for its founder, Jack Rice, Jr.), it was the first independent station in that fairly large market, but this novelty factor did not immediately translate into success; at least, not until after Turner made some changes. The first, which was in keeping with his… well-developed sense of self, was to rename the station WTCG, for his Turner Communications Group. However, with his typical hubris, he would often claim that the call letters stood for “Watch This Channel Grow” – and that’s exactly what it did. Oddly, one of the ways it would do so was in terms of the market it served – thanks to the use of cable technology. Cable providers throughout the Southeast would add the WTCG feed to their service.

    Cable was the earliest practical means of broadcasting other than over-the-air transmission, predating satellite television, and indeed, satellites of any kind. Physically speaking, cable lines functioned as a telegraph or telephone line for television – in an odd reversal of the development process for those much older media, wireless transmission had come first in television broadcasting. Cable allowed for clearer signals, which could not be weakened or dispersed by inclement weather or by distance from the transmitter – only by the line itself being severed. Before satellite technology became feasible, cable was the only way to convey broadcast signals to remote, far-flung areas. It became very popular in Canada, which (despite the concentration of its population along the border with the United States) had many regions which the various U.S. network affiliate signals were unable to reach. However, these remained densely populated areas, and this allowed the physical infrastructure to proliferate despite the greater distances involved. That density of population made for an important distinction, for more remote regions, particularly in the United States, were deemed by cable companies to not be worth the great expense of physically connecting them to their systems. The dearth of services available to these regions – which were numerous and, collectively, remained fairly populous – prior to the advent of satellite television would prove critical as the battle lines were drawn between the two new technologies of cable and satellite.

    Cable had the same advantage – or disadvantage, depending on the preparedness of the provider – as satellite: even the most basic service could carry dozens of signals, as opposed to the mere handful of over-the-air stations that could be received in most markets. WTCG, for its part, aired in the largest city in the region, with an even larger metropolitan population (given the massive white flight from the urban core since World War II). Atlanta was located in the densely-populated cis-Appalachian region, where cable proved both practical and highly desirable; and given the intensely protectionist culture of the South, WTCG was seen as a natural choice to be carried in many such markets on one of the myriad “spare” channels available on cable. This, more than anything else, allowed the station to flourish, and it also fed into Turner’s ambitions. In fact, the pattern with WTCG was repeated nationwide, with all of these regional hub channels becoming known as “superstations”. However, Turner, unlike most of his fellow owner-operators, was not content for his channel to merely be known as a superstation; he wanted it to form the foundation of an empire, with WTCG serving as its flagship.

    The first step in doing so was to create a nationwide network feed out of WTCG, which he would then sell to cable distributors instead of his Atlanta feed – which had, by this time, already been established throughout the United States and even in Canada, where it was eventually registered with the CRTC. [4] He called this new national feed the Turner Broadcasting System, or TBS, and the callsign of his Atlanta station was accordingly changed to WTBS as a result (after having made a large donation to the Massachusetts Institute of Technology, whose radio service had previously held the rights to those call letters, in exchange). [5] This occurred at the end of the 1970s; by this time, the station was often described on-air as the “superstation”, and Turner had secured the rights to broadcast local wrestling matches (as part of the Georgia Championship Wrestling promotion) as well as Atlanta Braves baseball games. That Major League club, thanks to the wide reach of TBS, would become known as “America’s Team”. Other mainstays on the schedule throughout the 1970s included, as was the case for so many channels across the country, I Love Lucy, Star Trek (as it did not air on the NBC affiliate in Atlanta, WSB-TV, in a fortuitous rarity of which Turner took full advantage), and Doctor Who. Other shows to be found there included Gilligan’s Island, the one and only solo success of comedy writer Sherwood Schwartz; in fact, Gilligan reruns proved so popular that Schwartz successfully pitched a revival miniseries which would finally get the seven castaways off the island. Return to Gilligan’s Island [6] aired in 1978, and was the second-highest rated miniseries of that year to be based on an established television series, after (of course) Star Trek: The Next Voyage. Given the predominance of Desilu shows on his roster, Turner came to refer to WTCG, WTBS, and finally TBS as “The House that Desilu Built”, a pun on that studio’s own self-referential nickname, “The House that Paladin Built”. Lucille Ball never offered an official reaction to Turner’s shameless flattery; insiders at Desilu revealed that this was because she found his behaviour unworthy of her time or attention.

    The market centred on Greater Boston operated differently from the rest of the country. Technically, every television station operated as an independent, which resulted in several quirks where “official” network affiliates were forced to carry alternatives to the network feeds, because true independents were already airing them. The regional superstation accordingly had a convoluted history. It began life in 1964, as WIHS-TV – created and operated by the Roman Catholic Archdiocese of Boston. It aired a “hybrid” schedule of religious and educational programming (largely for an audience consisting of the many Catholic schools in the region) each weekday morning, followed by syndicated shows and movies for the rest of the day – which, as previously noted, included programming from the various network feeds. [7] In 1966, the Archdiocese sold the station to Storer Broadcasting, who promptly changed the call sign to WSBK-TV (as the company was traded on the New York Stock Exchange under the ticker symbol SBK). All religious and educational programming was dropped, allowing the channel to broadcast syndicated and pre-empted network programming, in addition to sporting events – they gained access to Bruins games in 1967, and Red Sox games in 1975 (the year the Red Sox won the pennant). This naturally attracted an audience throughout New England, due to the contemporary successes of both the Red Sox and particularly the Bruins – who formed a Stanley Cup-winning dynasty in the late 1970s. This resulted in the station deciding to expand its reach via cable, just as all the other superstations did. By the mid-1970s, the station was carried by cable companies throughout New England, and even in New York State. It too would develop a national feed, from which it had to remove any network programming while ensuring a schedule distinct from those of TBS and the other superstations.

    These others included: WPIX, a New York City station best known for airing movies – which made for an amusing pun on the station’s call letters - and New York Yankees games (thus allowing the notorious Yankees-Red Sox rivalry to go national), though it initially became famous in the 1950s for its children’s programming (including The Three Stooges reruns and Dick Tracy cartoon shorts); WOR-TV, another New York City station, which was perennially last in the local ratings; WGN-TV, operating out of Chicago, owned by the same conglomerate as the Chicago Tribune newspaper, which allowed for a shared news division; and KTLA-TV in Los Angeles, which was a most intriguing case – it was the first commercially-licensed station on the US West Coast, it had, briefly, been affiliated with the moribund DuMont network, and for a long time it was owned by Paramount Pictures. Ironically, the station was sold by Paramount shortly before the studio was bought out by Gulf+Western, leading to the establishment of its television division; the new ownership, led by country and western star Gene Autry, focused on talk shows and (of course) Westerns – in addition to movies, sports, and other typical fare.

    Satellite networks required far greater startup costs than cable, which helped to explain why only telecommunications powerhouses like Western Union and RCA were able to establish them at first. Launching anything into geosynchronous orbit (without the aid of a Saturn V rocket, at least, and NASA was not lending those out with so few to spare) was extremely expensive, which helped to explain the very gradual growth of the telecommunications satellite industry. Still, satellites had one distinct advantage over cable: once the “bird” was in orbit, it could cover any geographical area of a certain radius, regardless of population density, for the exact same overhead. Cable, on the other hand, had to maintain all those lengths of wire manually, and in a most cumbersome fashion in remote areas – contractors were not cheap. This gave satellite an immense advantage in rural areas, allowing sleepy townships the chance to receive more than three or four channels for the first time in history. The gigantic receiver dishes – fashioned either from fiberglass or, later, from wire mesh, aluminum, or solid steel – became something of a status symbol for wealthier rural homes, prominently displayed in the front yards of most subscribers. It was this massive size which helped to impair the viability of satellite in urban areas – only rooftops could support the receivers, which were quite often considered eyesores in that context. Besides, many tall buildings had even taller radio receivers to collect the signals from terrestrial transmitters, which were almost always located in the heart of large cities. By contrast, cable lines were unobtrusive and cheaper to maintain, so that became the dominant system in urban areas. Suburbs, as was so often the case, quickly emerged as the battleground in the nascent war between the formats.

    With the exception of the aforementioned TBS and WSBK, however, the superstations faced a new dilemma upon being connected to the satellite networks: many of them carried largely identical programming – and most of them used the same national feed as the one which aired in their respective core markets. Under the syndex laws, this meant that viewers were often greeted with blank screens because the superstations were not only duplicating the schedules of each other, but also those of the local independent stations as well – upon request from the local broadcaster which possessed the syndication rights for each program in question, the local provider was obligated to block any and all signals which carried the offending programming. The situation recurred so often that splitting the feeds by market eventually became common practice for the superstations, chiefly as a way to make their channels “syndex-proof”. This resulted in each of the national feeds moving towards specialization in different programming genres: TBS placed more emphasis on Desilu programming and other shows from the Classic TV era; WPIX tightened its focus on movies; the struggling WOR-TV also tried to stand out from the crowd by bringing in programs from ITV in the UK. There was also an attempt to delineate superstations on a geographical basis, wherein KTLA became a West Coast counterpart to the East Coast WPIX. Amidst this grand realignment process, WGN struggled to find the right niche for its audience, until it decided to focus on its association with the Chicago Tribune newspaper. While it still maintained its schedule of entertainment programming (even more so on the local feed), it also decided to devote considerable amounts of time to the news and to shows intended for audiences in rural America; accordingly, it became a particular favourite on satellite. WOR-TV, on the other hand, was facing major growing pains under the auspices of RKO, a company which had emerged from the corpse of the one-time major studio following the asset liquidations of the 1950s. RKO General, as the modern-day incarnation was known, lobbied the Congressional delegation from New Jersey to draft a bill which would effectively allow WOR-TV to move to a “friendlier” market in the Garden State, while still tenuously maintaining its link to New York City proper. [8] However, this was all for naught; when the scheme was exposed to the public, WOR-TV found its situation even worse than before, given the close scrutiny it now faced from the FCC. For all intents and purposes, WOR-TV’s pretensions to “superstation” status were effectively scuttled, and RKO sold off the station shortly thereafter. Fortunately, this lightened the syndex load, and many of the other superstations, particularly WGN and WSBK, carried the previously WOR-TV-exclusive programming.

    During this era, cable television required the “cable box”, a cumbersome apparatus which adapted the signals received via cable transmissions into usable output for the television picture tube; in this device, there existed a conduit for further innovation, perhaps by some means of allowing the consumer to interact directly with what could be observed on-screen. This role was filled by the trailblazing QUBE service, which used a version of the cable box to which a large remote control was connected, allowing viewers to “communicate” with the presenters of live programming by pressing buttons on the remote. The buttons functioned in much the same way as a multiple-choice test; each row represented a choice of answers to questions which would be asked periodically by the presenters. The system was, essentially, an extremely sophisticated (though unscientific) polling service. Alongside this innovation, the QUBE service also doubled the channel capacity relative to other early cable services of the era, allowing for additional proprietary channels, most of which were intended for special-interest audiences. The QUBE gimmick of interactive media was outshone by other options, even at the time, but many of their other innovations were eagerly mimicked by the other cable (and satellite!) services; the special-interest stations, for their part, came to define what made these services different from terrestrial broadcasting. Although the QUBE service did not survive, several of the channels created for it did, including the children’s station Pinwheel, though it would come to adopt a more conventional programming format as it was picked up by the other cable and satellite providers. [9]

    Cable and satellite transmission allowed for the creation of additional “premium” subscription channels which, for various reasons, could not be aired terrestrially in North America. Most of these focused on recently-released motion pictures, which were highly desired by their subscribers; these films would appear there long before they were released on home video or appeared on network television. As the premium channels were supported by subscription fees instead of advertising revenues, this allowed their programming to air without cuts demanded by sponsors, or by the censors (who were omnipresent on terrestrial television, even after FCC regulations were relaxed by the Reagan administration). One such channel was the Home Box Office, or HBO, which developed from a local service in the Lower Manhattan area. The Time-Life company, which owned HBO, began distributing the station nationally via the Westar system in 1975. The winning combination of movies, sports (including boxing), and the original series that it provided cemented its popularity with cable companies – all the more so the following year, when HBO switched to SatCom 1. It also produced a first in cable television when it launched a “free preview” service to combat a high rate of turnover within its original service area. When HBO first became available in northern Massachusetts [10], subscribers to the local cable company were able to sample HBO for free for a month; once the trial period ended, HBO moved to a different channel and the signal there was scrambled. This concept proved popular and soon became a staple of cable and satellite television, greatly increasing the potential for revenue generation.

    Just as television was undergoing a sea change in the late-1970s, so too was radio, thanks to legislation passed under the Reagan administration. The Neutrality Act was a thing of the past, in the name of the First Amendment – broadcasters were now allowed to advocate for either side of any topical or controversial issue without having to provide equal time to the other side. The television networks, however, largely did not take advantage of this provision, for fear of alienating audiences. PBS was more willing to make definitive statements on controversial issues in the name of educating audiences, but – mindful that the bulk of their revenues came from the government-controlled Corporation for Public Broadcasting, sponsorships from large corporations, and Viewers Like You – also generally shied away from controversy, with even shows like Nova and Cosmos choosing to mostly focus on questions with empirical answers, or simply leaving the audience to answer them. Television was entrenched as the mass medium of the era; the former holder of that title, radio, ironically proved the hotbed for experimentation when it came to flouting the former Neutrality Act. After all, dramatic programming was long dead, and music formats were beginning to migrate to the higher-fidelity FM dial – there was time on the AM dial that needed to be filled with something. Given that radio tended to favour more conservative audiences (older listeners who still sat down and listened to the radio at the end of each day, along with middle-class commuters who listened in the car on the way to and from work), many of the earliest political commentators in talk radio to achieve nationwide recognition were themselves conservative. It helped that the generation which had come of age was disillusioned with the Great Society and New Deal-era politics in general; Reaganomics was “hip” with younger people. Perhaps the figure most associated with the era was former Congressman Sam Steiger of Arizona, who had run as a candidate for President in the Republican Party primaries in 1976 (to the right of the eventual President Reagan, in fact); this was after having flirted with entering that year’s race for Senate, but his would-be opponent, the future Sen. Conlan, had successfully intimidated Steiger away from that contest with ethnic and nativist rhetoric, given the latter’s Jewish heritage and birthplace in New York City. [11] Perhaps for this reason, Steiger strongly opposed racist attitudes, and vocally denounced the AIP, preferring to espouse his policies of “common-sense” or “compassionate” conservatism. By the 1980 elections, he was a national star, and spoke at the Republican National Convention that year. Steiger was but one of a great many figures who achieved fame – more often infamy – on talk radio. The era of newspaper columnists as the premier political commentators in the country had come to a definitive end, though (in a growing pattern amidst the changes facing media in this era) the establishment was fatally slow to react to this new reality.

    Standing in the shadow of these developments in satellite technology was Canada. Until the 1970s, satellite transmissions were transcontinental, which meant that Canadian and American viewers would be watching the very same channels, and even after the creation of the CRTC in 1968, the Broadcasting Act did not make provisions for the well-established transmission of broadcasts via cable services. This would change with the launch of the ANIK series of satellites, starting in 1972. For the first time, it was now possible to have a satellite network serving a single country. This allowed telecommunications firms to extend their reach to rural and remote areas – the latter being more prevalent in Canada than in the US, particularly in the North – something which had not previously been feasible for them. In the case of the CBC, it would prove a preferable alternative to ferrying videotaped programmes around by airplane, which often did not arrive until weeks or even months after their original broadcasts. [12] Even so, Canadians were also able to watch channels through “grey-market” American satellite services, and this resulted in many of those channels attempting to court Canadian viewers in addition to their core American audiences. Canadian providers, who served as the intermediaries in this process, were happy to oblige, because the vagaries of the system enabled them to receive the licensing fees from subscribers but to pay little or nothing back to the original producers. An example was the licensing of Canadian Satellite Communications, or CANCOM, to provide a “cable in the sky” service for rural Canadians, which included the so-called “3+1” package of all Big Three American networks, plus one PBS station (the standard in all American markets). HBO, for their part, pioneered the first true localization of a television channel. Their decision to do so, however, was mired in controversy. HBO had been popular with Canadian viewers ever since it moved to SatCom 1; however, the CRTC had resisted letting cable companies add HBO to their lineups. The logical step, therefore, was to have Time-Life form a joint venture with various Canadian investors, including the head of a major supermarket chain and the mutual fund division of one of the major banks, to launch a local version of HBO and thus, via Canadian satellites, legally bring HBO directly to viewers. [13] This touched off a major controversy in Canada as to the purpose of satellite television, let alone cable. Given that HBO remained, in essence, an American outfit, and that the CRTC was created to enforce cultural protectionism against American product, the agency was forced to intervene, which resulted in their landmark decision on what had become known as “pay-TV” in 1980.

    The decision involved adding a limited number of channels to the pay-TV lineup, including, at minimum: one English and one French national general-interest channel; two regional general-interest channels (one for Eastern Canada and one for Western Canada); one “specialty” performing arts channel; and one multilingual general-interest channel. [14] These general-interest and movie channels – akin to the premium channels in the US, and in Canada the core of the new system – were obliged to fill up to 45% of their schedule with Canadian content, and to provide financing for the further production of Canadian programming – the usual partial exceptions were made for programming from elsewhere in the Commonwealth. The decision aroused as much controversy as HBO’s original plan had, because both existing broadcasters and policymakers feared the resulting audience fragmentation would destroy the Canadian television market. Nevertheless, on the basis of this pay-TV decision, the CRTC allowed the “HBO Canada” joint-venture to go ahead, and it launched in late 1980. [15] Complementing these designated general-interest services were the “specialty” services – akin to the special-interest channels in the US, but in Canada these were to be restricted to specific, highly regimented and controlled categories, and were subject to the same CanCon regulations as terrestrial channels. Nonetheless, it seemed that legislators on both sides of the border were taking decisive steps to acknowledge the new situation as it regarded television broadcasting at the dawn of this new decade. Times had changed, and it was obviously time for statutes to change with them.

    These new technological breakthroughs, and the commercial success derived from them, proved threatening to the complacency of the broadcast networks, who had enjoyed uninterrupted primacy within the medium for over three decades (carrying over from their primacy within radio), and had relied exclusively on terrestrial broadcasting. Their approach to cable and satellite was largely a negative one: they chose to ignore these other options and disregard their existence as much as possible, excepting of course the willingness of network affiliates to accept advertising revenue from local cable and satellite providers. Much like AT&T, they derived great benefit from the status quo, seeing no need to embrace any changes thereto; their status as holdouts against these new innovations would have dramatic consequences as the 1980s progressed…

    ---

    [1] SatCom was launched behind schedule ITTL, given their greatly accelerated progress on the SelectaVision CED system, allowing Westar to establish more of a foothold earlier on. However, it must be said that RCA is employing excellent strategy here with their investment, ensuring the viability of the whole even though one of their largest divisions (NBC) would be threatened by the success of their newer operations. Sadly, RCA was not as decisive or efficient IOTL, and this eventually contributed to their downfall.

    [2] Star Trek was, for obvious reasons, not popular in socially conservative markets, given the progressive themes of the series. As might be expected, there was a strong inverse correlation between the availability of Star Trek and the popularity of the American Party in a given market. It could not be found in any of the five markets based in Alabama, for example (nor in three out of the four from neighbouring states which extended into it – the fourth, which served Atlanta, was the lone exception).

    [3] Although “The Three Doctors” serial does feature William Hartnell as the First Doctor actually interacting with the other two Doctors (in a single scene) ITTL, that is still not enough for him to make an impression on American audiences; this is in marked contrast to Troughton, who makes the most of his screentime (even more so than IOTL), allowing the amusing bickering between the Second and Third Doctors to strike a chord with American audiences, which leads to a boost in the desire to see Troughton among viewers.

    [4] IOTL, it was indeed the Atlanta feed (WTBS) which was permitted to be distributed throughout Canada by the CRTC, and not the national TBS feed. This was never corrected, which means that, when WTBS disaffiliated from TBS and re-branded itself “Peachtree TV” (with the new call letters WPCH), that was what Canadian stations carried. Not a single cable or satellite provider has ever expressed interest in applying to the CRTC to transfer the licence held by WPCH to TBS, nor is that organization likely to do so (as restrictions are much tighter now than they were at the time – therefore WPCH is itself grandfathered in). Indeed, many carriers have dropped WPCH from their packages, though this editor is still able to view commercials advertising the services of over half a dozen Atlanta-area ambulance chasers and private “career schools” to this day.

    [5] Turner effectively purchased the WTBS callsign from the MIT radio station now known as WMBR by donating $25,000 upfront (used by the school to purchase a new transmitter) in exchange for their agreement to apply for new call letters; a further $25,000 was donated to the college once Turner secured the rights to WTBS.

    [6] This miniseries instead aired IOTL as a two-parter entitled Rescue from Gilligan’s Island.

    [7] During the 1964-1965 season IOTL, while still under the ownership of the Archdiocese of Boston, WIHS did carry Boston Celtics games. However, the team management was worried about the limited audience on a UHF station, so it was also simulcast on WHDH, then the ABC affiliate, before moving there permanently. WIHS, meanwhile, didn’t carry sports programming again until after being purchased by Storer Broadcasting.

    [8] RKO General was successful in getting a bill passed that moved WOR-TV to Secaucus, New Jersey (technically still a part of the New York City market) two years later, IOTL, where it continues to exist as WWOR-TV. Successful implementation of their plan did not help RKO General with their legal difficulties, however.

    [9] IOTL, Pinwheel was renamed Nickelodeon – yes, that Nickelodeon – in 1979.

    [10] To be precise, IOTL and ITTL, the location of HBO’s “free preview” experiment was Lawrence, Massachusetts, a city on the border with New Hampshire.

    [11] IOTL, Steiger and Conlan were both part of the Arizona House delegation, and were bitter opponents for the Republican nomination in the race for the Senate seat vacated by fellow Republican Paul Fannin. ITTL, obviously, Steiger saw an opening that did not exist in the Ford vs. Reagan “Battle of the Titans” in OTL 1976. In fact, Steiger triumphed in the nomination fight despite similar mud-slinging against him by Conlan IOTL – however, it damaged him enough that Democrat Dennis DeConcini won decisively in the general. ITTL, the stronger coattails from Reagan’s landslide victory allowed even the more conservative and divisive Conlan to (narrowly) emerge victorious.

    [12] During the early years of CBC Television, prior to the emergence of the ANIK satellite network, the network operated the Frontier Coverage Package, which gave more remote communities (particularly in the North) a piecemeal television service. Much in the same fashion as how syndicated programming operated at the time, shows were “bicycled” from community to community (often with a delay of weeks or months at a time). The launch of ANIK put an end to the Frontier Coverage Package, allowing those areas to have the same scheduling and service as the rest of Canada.

    [13] The investors here are largely the same as those who were behind the failed First Choice/Premier Choix service in OTL: Donald Sobey (of the Sobeys supermarket chain), J. R. McCaig, Norman Keevil, Royfund Equity Ltd., AGF Management Ltd., and the Manufacturers Life Insurance Co.

    [14] When the CRTC made its decision on “Pay-TV” in 1983 IOTL, there was a very similar controversy because, in a report published five years earlier, it had recommended licensing only one pay-TV service. The setup ITTL is similar to that of OTL, but with some notable differences. IOTL, the initial proportion of CanCon in the schedule was 30%, which would gradually increase to 50%. In addition, there is also some difference in the channels – there were three regional general-interest channels (one each for Alberta, Ontario, and Atlantic Canada) and the multilingual general-interest channel was limited to British Columbia (whose largest subscriber base was the Chinese-Canadian community). Unlike IOTL – where most channels either went bust, fell into receivership, or merged with each other just to survive – this setup is more viable.

    [15] HBO Canada, of course, never happened IOTL (at least, not in this format). Instead, First Choice evolved into The Movie Network (TMN) for Eastern Canada, Superchannel (the evolution of the regional general-interest channels) became Movie Central (for Western Canada), and Premier Choix became Super Écran after a merger.

    ---

    Thanks once again to Dan1988 for co-writing this, the second update of the Technology Trilogy! And, as (almost) always, thanks to e of pi for assisting with the editing – one of the challenges of collaborative writing is to ensure that the entire narrative is delivered in a single “voice” – namely, my voice. Also, say “hello” to one of the last major characters to be introduced to this timeline, media mogul and tycoon Ted Turner! Expect to see much more of him in the not-so-distant future…
     
    Now You're Playing with Power
  • Now You’re Playing With Power

    [Image: The Texas Instruments TMS9900 microprocessor – which revolutionized home computing and video gaming forever after]

    “What can’t you do with a VCS II?”

    William Shatner, delivering the slogan for the Syzygy Video Computer System, Mark II (VCS II), in a commercial originally aired November 22, 1979

    At much the same time as cable and satellite network services were changing what was available on television, peripherals such as VDPs and VTRs, along with video game systems such as the Syzygy VCS, were also playing a key role in expanding consumer horizons beyond the original limitations of that medium. The television set had entered the 1970s as the hub for a mere three or four viewing options on average; it would leave that decade with at least that many peripherals which could be attached to it, each of which in turn proved a conduit for households to find ever more ways to make their own fun. More established, traditional markets – particularly the long-beleaguered movie houses – felt the threat, and were increasingly forced to adapt to face their new competition, with decidedly mixed results.

    First and foremost among these revolutionary new means which had been made available to the modern consumer was the ability to self-program, in much the same way as choosing which record to play on the home stereo system. The CED was a smash success, with nearly one million of the RCA SelectaVision units sold by 1980, and a million more made by licensed CED manufacturers. Every movie studio in Hollywood backed the format, save for Universal (which stubbornly held out on their LaserDisc VDP instead), and any studio they could afford to pay off (20th Century Fox and United Artists in particular). Paramount switched sides in 1978 as a result of Desilu – a major backer of SelectaVision – continuing to employ Marcia Lucas, in a bold gambit that totally backfired; Paramount would switch back to SelectaVision in 1980, in exchange for a very desperately-needed infusion of cash from RCA [1]; 20th Century Fox would soon follow, marking the effective end of the VDP front in the Format Wars, with the LaserDisc forced to concede the North American home video market in favour of other, more innovative applications. It helped that all the other studios – MGM, Columbia, Warner Bros., and (of course) Desilu – had remained with the CED format from the outset, as did most manufacturers, even including many Japanese firms. Nevertheless, the CED was perhaps the most technologically inferior home video format in the late 1970s (only VHS was likely to challenge CED for that title), and in order for RCA to consolidate their gains, enhancements to the format were deemed necessary, especially since (despite a smear campaign spearheaded by RCA itself) the VTR formats continued to loom.

    The CED had severe limitations when it was released in 1977; each side of the disc was limited to half an hour of footage, which meant that any films over two hours in length required a third disc, which usually carried only a few minutes more of footage out of the potential hour – this was deemed unacceptably wasteful and inefficient. Thus, many movies which were slightly longer than 120 minutes were selectively edited to bring them under that threshold. Many cinéastes were irate at this, which was not surprising in an age when the auteur theory was accepted as fact, and the creative integrity of filmmakers was sacrosanct. Movie critics Roger Ebert and Gene Siskel went so far as to openly denounce the practice on their film review show on PBS, in their special “home video” episode which aired in 1978 – Ebert had praised LaserDisc, to the exclusion of all others, whereas Siskel had declined to endorse any format without manufacturers taking “drastic measures” to address their outstanding deficiencies. [2] Whether RCA brass had watched that episode remained an open question, but plans did exist to continuously improve on CED technology – for both the VDPs and the videodiscs themselves. [3] SelectaVision had the advantage of a good head start, strong industry connections, and excellent marketing, but these worked in concert to hide the weak fundamentals that rather desperately needed improvement. CED length was first and foremost on the list; durability and replayability came in a close second. RCA reinvested much of their retained earnings from SelectaVision profits into determining a more ideal medium of construction for their videodiscs. By 1981, a new solution had been determined which would double the length on each side (one hour apiece – the average movie could thus fit on a single disc), as well as the replayability (from 500 estimated plays, under ideal conditions, to an even 1,000). This was achieved through the use of a new material to fabricate the videodiscs, a new stylus mechanism which was both more flexible and more durable, and the use of a special, newly-patented lubricant which mitigated wear and tear. Other innovations were less physical, and more visual or aural in nature.

    For example, the original-generation CED discs had only been available in monaural sound. Stereophonic sound, the defining breakthrough of aural technology in this era, would become standard for all releases on next-generation CED, except where the original material did not support it (primarily the case for older movies, in addition to the extensive television catalogue). Among the other newly developed refinements to the format was one which allowed for two different sound channels, which could be used for any number of purposes. [4] Foreign-language dubbing seemed an apt use of the technology; it would allow producers to sell two different versions of otherwise-identical product (one with a Spanish dub for the Southwest, and another with a French dub for the Northeast and Canada) for minimal extra cost. However, one alternative possibility was the creation of an entirely new audio track, one which would comment on the events unfolding onscreen from a post hoc perspective. There were multiple methods that could be employed to arrive at the same end result: interviews of principal cast and crew could be conducted, with clips therefrom spliced over the most relevant footage; or the more direct “live commentary” approach could be taken, in which one or more persons provided their thoughts on the footage they were watching, which was then recorded and overdubbed onto said footage. [5] This was pioneered by the 1981 release of Citizen Kane – which had been voted the greatest film of all time in the 1962 and 1972 critic polls in Sight & Sound magazine (and seemed a lock for the hat-trick in the upcoming 1982 poll) – a release which included a commentary by writer-director-star Orson Welles himself; fortunately, the granddaddy of Hollywood auteurs did not have an asking price nearly as large as his titanic ego or his colossal girth. He was interviewed by one of his most loyal devotees, the Oscar-winning director Peter Bogdanovich. [6] Kane had previously been released in 1978 (in a very straightforward, two-disc edition), as the RKO library had been secured fairly early in the lifespan of the CED; though it went out of print in 1980 (to prime the market for the far more ambitious release to come), marketers were aware that Kane would have to be properly advertised to secure the dollars of customers who had purchased the previous edition. Thus, the original negatives were carefully restored, the footage remastered, and the resultant re-release of the film – on May 1, 1981 (a Friday) – was fittingly described as the “40th Anniversary Edition”. The videodisc itself came in a blue caddy, which would serve to differentiate products with two audio channels from the basic, one-channel discs (in white caddies). Kane sold well, though so much had gone into the making of this edition that the distributor sold it at a guaranteed loss, a grand gesture which secured the “legitimacy” of the CED format. [7] Ebert and Siskel both praised the 40th Anniversary Edition of the film – they had received a preview copy – on their own anniversary tribute episode for Citizen Kane. Nonetheless, the notion of “prestige” titles which would serve as loss leaders for other products struck a chord with many high-powered executives both at RCA and at the studios – including the television studios. For 1981 did not just mark the 40th anniversary of Citizen Kane, widely regarded as the greatest film in history, but also the 30th anniversary of I Love Lucy, widely regarded as the greatest series of all time. Though many principals involved with that show had since passed, its star – and the head of the studio which would regain the ownership rights to the program late in the year – was alive and well, and had given many interviews and speeches on tours across the country, always answering questions about the show.

    After some persuasion, Lucille Ball had agreed to involve herself in the planned commentary tracks for I Love Lucy, which (like all CBS properties) had not yet been released to home video in any capacity. In agreeing to do so, however, she declined to participate in the “live response” format, an irony for a show which had been filmed before a studio audience throughout its existence (something which was also true for all subsequent Desilu sitcoms). Ball simply did not believe that she could remain spontaneous and comment on the action as it was happening; that wasn’t her style at all. She suggested using clips from her past interviews, speeches, and the 25th and 30th Anniversary Specials. [8] She also agreed to record new material specifically for the commentary tracks, though in the standard abridged interview format (her answers would be given in response to unheard questions). The first “Best of I Love Lucy” episodes (four could be aired on a single disc, and eight on two – it would take ten to carry a 39-episode season) would be released just in time for Christmas, 1981 – the first released videodisc (“Volume I”) featured “Lucy Does a TV Commercial”, “Job Switching”, “Lucy is Enceinte”, and “Lucy Goes to the Hospital”, and Ball provided commentary for all four episodes (alongside others). It became the fastest-selling videodisc in the history of the format, and (unlike Kane) proved greatly lucrative, and a triumph for the “RCA Presents Desilu” marque. Needless to say, more “Best of I Love Lucy” was planned for 1982.

    Even though the CED had proved popular and profitable for RCA, the greatest unqualified (and seemingly unrivaled) success of interactive home media in this era was the Syzygy VCS. Unlike at the arcade, where a wide array of cabinets competed directly with one another, all home games were played using the medium of the VCS – or, granted, one of its competitors, however unlikely that might have been. VCS cartridges sold almost as well as the VCS system (or “console”) itself did, particularly the many adaptations of arcade hits. The Syzygy adaptation (or “port”, to use the technical term) of the arcade smash Space Invaders was one of the best-selling games for the console in the late-1970s. [9] This despite the fact that graphics for the VCS were plainly inferior to even the simplest arcade games available at the time. Indeed, only Pong could be captured on the VCS with full graphical fidelity – because it was just two lines and one little square.

    The processing power of the VCS was limited as well, and Syzygy continued to produce and maintain arcade cabinets for more advanced game ideas based on the licences they had acquired from other media. A new version of their classic 1973 Star Trek game was released in late 1978, in the wake of The Next Voyage miniseries. The centrepiece of this new version was the addition of shields, which would slowly recover with the passage of time. In a limitation of programming which was quickly turned into an advantage for gameplay purposes, the shields would only recover when the ship was not firing weapons; this required players to “budget” their firepower and be more conservative (and accurate) in taking their shots. If shields were reduced to zero, then the hull would begin taking damage, and this was irreparable. When the hull was reduced to zero, the game was over. This would have greatly increased the length of games (and decreased the number of quarters fed into the machine), were it not for the resources devoted to diversifying the movesets and attack patterns of the various Klingon and Romulan ships, making them both harder to hit and more unpredictable in their actions. Fan response was extremely positive; the more strategic gameplay was evocative of beloved episodes like “Balance of Terror”, even though there was still a total absence of character interaction. The remake was also timely in that it followed the smash success of Space Invaders, which helped it to become an arcade mainstay. The only version of Star Trek available on the VCS was the (inferior) port of the original 1973 version, but sales of that, too, rose considerably as a result.
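
    To give a sense of how simple such a rule set really is, here is a minimal, hypothetical sketch of the shield-and-hull logic described above, written in modern Python; every constant, name, and value is invented for illustration and does not come from the actual game.

    ```python
    # Hypothetical sketch of the 1978 Star Trek arcade shield/hull rules described above.
    # All numeric constants are invented for illustration purposes.

    SHIELD_MAX = 100
    SHIELD_REGEN_PER_TICK = 1   # shields slowly recover over time...
    HULL_MAX = 100

    class PlayerShip:
        def __init__(self):
            self.shields = SHIELD_MAX
            self.hull = HULL_MAX
            self.firing = False     # set True on any tick when weapons are fired

        def tick(self, incoming_damage=0):
            """Advance the game state by one frame; returns False when the game is over."""
            # ...but only while the player is NOT firing, forcing players to
            # "budget" their firepower and take more careful shots.
            if not self.firing and self.shields < SHIELD_MAX:
                self.shields = min(SHIELD_MAX, self.shields + SHIELD_REGEN_PER_TICK)

            # Damage is absorbed by the shields first; any overflow hits the hull,
            # and hull damage is irreparable.
            if incoming_damage > 0:
                absorbed = min(self.shields, incoming_damage)
                self.shields -= absorbed
                self.hull -= incoming_damage - absorbed

            # When the hull is reduced to zero, the game is over.
            return self.hull > 0
    ```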

    The licence with Desilu had allowed for another of their shows, Mission: Impossible, to be adapted into an arcade game in 1974, the year after the show had ended; it required the player to complete a series of specialized objectives before the win condition could be achieved, in an attempt to replicate the taut suspense and high-stakes thrills of the series. [10] The game had spent a long time in development (it was originally scheduled for late 1973, in hopes of twin tentpole releases for Syzygy’s Desilu licences), but all sides considered the delays to be worth it for the gameplay. Mission: Impossible was exceptionally text-heavy for its era, and relatively light on graphics (Star Trek had been much more visually impressive); however, this allowed the player to fill in many of the blanks himself. The sense of adventure provided by the creative puzzles and the fairly evocative writing would lend its name to a burgeoning genre, leading a minority of games reviewers and historians to regard Mission: Impossible as a more historically significant game than the much bigger seller that was Star Trek. [11] Thus, when the remake of that latter game was released in 1978, fans naturally clamoured for a follow-up to Mission: Impossible as well, but the original had not brought in the quarters to nearly the same extent; indeed, many gamers were deeply frustrated by its complexity. Mission: Impossible was, however, ported to the VCS later that same year, in a game but extremely lacking attempt by Syzygy; most reports described the adaptation as literally unplayable, an unquestionable black eye for the company and, more notably, for the otherwise-sterling Desilu. The arcade game also received a spiritual successor, The Questor Tapes, which performed even more poorly than its predecessor had, perhaps because of the tepid fanbase for the series on which the game was based. On the other hand, Bruce Lee: The Way of the Warrior, one of the earliest fighting games, was a smash success for Syzygy in the arcades, one of their biggest hits of the late-1970s. It involved timed responses to moves performed by computer-controlled enemy players; pressing the correct button within the time limit would result in the execution of a martial arts move, which was depicted onscreen by the Bruce Lee facsimile (whose name and image were specifically licenced, making him the first celebrity to be depicted in a video game). It was based on the early Syzygy game Touch Me, an electronic version of the old “Simon Says” game. [12] In fact, that basic game engine was directly adapted for use in The Way of the Warrior; the game increased in complexity by requiring successful combinations (or “combos”) of moves to be completed in the correct sequence before the point was awarded. The timed-reaction component of the game allowed for an effective two-player experience: the point was awarded to whichever player pressed the right button ahead of his opponent. The Way of the Warrior was not adapted for the VCS, as its four buttons could not be mapped onto a controller which had only one. Given Lee’s involvement with the game, it became a massive hit in Asia, where it was known only as Bruce Lee, the name of his show being ignored. [13]
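
    The timed-reaction, combo-based scoring described above is equally straightforward to express. The following hypothetical Python sketch is an illustration only; the button names, time limits, and combo lengths are all invented rather than taken from the actual cabinet.

    ```python
    # Hypothetical sketch of the timed-reaction combo loop described above for
    # Bruce Lee: The Way of the Warrior. Buttons, timings, and combo lengths are invented.
    import random
    import time

    BUTTONS = ["punch", "kick", "block", "sweep"]   # stand-ins for the four-button layout

    def run_combo(combo, time_limit=1.0, get_input=input):
        """Award the point only if every move in the combo is entered
        correctly, in sequence, each within its time limit."""
        for expected in combo:
            start = time.monotonic()
            pressed = get_input(f"Press [{expected}]: ").strip().lower()
            if pressed != expected or (time.monotonic() - start) > time_limit:
                return False        # wrong button or too slow: no point awarded
        return True                 # full combo completed in the correct sequence

    def play_match(rounds=3):
        """Single-player variant; in the two-player mode described above, the point
        would instead go to whichever player pressed the right button first."""
        score = 0
        for _ in range(rounds):
            combo = random.choices(BUTTONS, k=3)    # the enemy "performs" a move sequence
            if run_combo(combo):
                score += 1
        return score
    ```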

    All of the Desilu games were designed, programmed, and manufactured by Syzygy itself, but it was not the only firm which produced games for the VCS. The late-1970s were a period in which many enterprising game designers (and programmers, many of whom had previously worked in mathematics) entered the burgeoning industry, and they were all interested in reaching the largest possible market of potential consumers for their product. Most of these designers, especially those who sought “legitimacy”, acquired the tacit approval of Syzygy to develop for the VCS, but this was not necessary – indeed, the production of unauthorized games for the machine could not effectively be prevented. Many, perhaps even most, of these games were innocuous, usually knockoffs of popular VCS titles by Syzygy, or other legitimate publishers like Tippecanoe Software (or TipSoft), which was founded by a group of faculty and alumni of Purdue University, home of the oldest computer science program in the United States [14], or Infocom, which had emerged from a collective of instructors and graduates based at the Massachusetts Institute of Technology. [15] This naturally resulted in Syzygy finding themselves facing a quality control problem, though (as such games, by their very nature, flew under the radar) it did not make a perceptible impact on their bottom line, which allowed them to maintain a veneer of plausible deniability on the matter, so long as the “rogue” designers continued to remain underground (which they did). [16] However, a particular subset of these games would eventually become notorious for altogether unsavoury reasons, which would attract attention from far beyond the core market of gamers…

    As far as processing power was concerned, microcomputers available for personal use, which first became widespread in the late 1970s, were unrivalled for the purposes of playing games electronically. Text-based adventure games and role-playing games of surprising complexity could be programmed on these machines without any graphical component whatsoever – not even up to the limited capacity of the Syzygy VCS, let alone the arcade cabinets of the era. Naturally, this attracted a very specific sort of gamer, one who was extremely conscientious, detail-oriented, and imaginative. The 1980s would prove something of a golden age for this archetype of gaming. Those games which did involve graphics tended to be somewhat simplistic and were mainly obstacle-based. Still, these were generally capable of greater complexity than even the most ambitious VCS games (or, at the very least, those that were successful enough to be in any way playable as games). Among the general public, in particular younger audiences, educational computer games were best known, and these were capable of a wide variety of tasks, including helping to teach children how to type on the keyboard, teaching them about history (such as the popular Oregon Trail game), or about mathematics and basic economics (like Lemonade Stand). These “edutainment” games, as they became known, introduced the Mini-Boomer generation to the home computer largely because, although these machines were still very expensive for the average household, well-funded school districts would purchase them as an innovative new tool to facilitate learning starting in the late-1970s and continuing into the 1980s. For all these reasons, home computers were not seen as a threat to the VCS, though a number of other home video game systems did indeed emerge in the late-1970s. These included a second-generation Magnavox console, along with offerings from Fairchild Semiconductor and Bally Manufacturing. None of these were a threat to the VCS in any respect, but that would change when the popular toy manufacturer, Mattel, created a new electronics division which announced plans to release a video game console of their own, to be called the IntelliVision and slated for release in 1979. Mattel naturally had a formidable marketing department, and reports were leaked that their machine would be powered by a 16-bit processor – superior to anything else on the market at the time – allowing for graphics and sound superior to those of the VCS, and indeed of most available home computers. These new external pressures posed a serious challenge to the complacent attitudes held by Syzygy management.

    As did internal pressures. Needless to say, Desilu executives were horrified at the disaster that was Mission: Impossible, whose title, many wags noted, could accurately describe the ordeal of getting the game to work on any VCS console, and this, even more so than the potential future threat posed by Mattel, made it clear that the system would have to be replaced. In fact, to incentivize development of this replacement, Desilu refused to allow a port of the Star Trek remake of 1978 – which, as far as Syzygy brass were concerned, was a guaranteed seller – unless and until hardware was developed that could support the advances in graphics and coding which that game represented. Even the original VCS Star Trek of 1977 was considered several steps back from the 1973 arcade original. But from as early as that year, research and development at the company aspired to develop a sufficiently advanced replacement for the archaic-from-release VCS. Relief of this strong pressure from Desilu (on multiple financial fronts, given that studio’s investment in the company) came in the form of the hardware developer Texas Instruments, or TI, which in 1976 had launched the 16-bit TMS9900 microprocessor – a product seemingly far ahead of its time; for this reason, most manufacturers preferred the more “conventional” 8-bit processors. Syzygy, though, found the TMS9900 to be a perfect fit for the drastic improvements needed for the VCS, and it would become the heart of their new machine. [17] In addition to (and sometimes, as a result of) this dramatic advance in processing power, many of the problems with the original VCS had been corrected, dramatically widening the (previously severely limited) graphical capabilities and improving overall performance. It would also be noted for being the first console to include speech synthesis capabilities, as TI was prominent in that field.

    However, the process of improving the VCS would also have useful applications for the market in personal microcomputers, which (like the original VCS) had emerged in 1977. Indeed, there were those within the Syzygy engineering team who believed that the more versatile microcomputers were to be the future of home gaming, and thus what emerged as the Syzygy Video Computer System, Mark II, or VCS II, was designed with the potential of transferring its architecture to just such a machine. It was decided very early on that the VCS II should be backwards-compatible with the existing library of games for the original VCS (which would enable production of the original model to be suspended once the newer one entered production) [18], but the dramatically improved graphical capability and processing speed of the machine made it optimal not only for its intended purpose of gaming, but for myriad other tasks as well. Syzygy later decided to divide the project in twain, with the two halves initially code-named “Cindy” and “Terri” (named after two very attractive secretaries at the company) [19], the former being the original VCS II home console, and the latter emerging as a derivative Home Computer System, or HCS. Although both “Cindy” and “Terri” had largely identical technical specs, “Terri” alone came equipped with a range of bells and whistles that would enable her use as a machine for general home computing with an emphasis on gaming, as opposed to “Cindy”, who focused on gaming to the exclusion of all else. [20]

    The VCS II was released in time for the Christmas season of 1979, famously advertised during a commercial break of the football game between the Dallas Cowboys and the Houston Oilers which aired on Thanksgiving Day (Thursday, November 22) in the United States. [21] William Shatner, the former Captain James T. Kirk himself, served as pitchman for the console in the sixty-second spot, which (fittingly) advertised the VCS II port of Star Trek: The Next Voyage, the name for the 1978 remake of the original arcade game. [22] Gameplay was shown on screen, enough to get a clear view of the graphics, which were for the first time a distinct improvement over those of the arcade version. The new, streamlined controller was lovingly detailed, as was the “sleek finish” of the console body itself. Shatner also took pains to note that “all your old VCS games” could be played on this new console, a feature properly known as backwards-compatibility, in what would soon be established as a trend for the burgeoning industry. One vital piece of information he withheld was the price of the VCS II console, only informing consumers that it could be found “at your local department store”. He also uttered the tagline that made the console famous: “What can’t you do with a VCS II?” Future advertising for their new console, when it had established itself in the market and rivals had emerged, focused on processing power – at 16 bits, the VCS II had double the speed of most competing products, allowing Shatner to make plenty of “warp speed” references. A secondary catchphrase for the VCS II, “video computer gaming for the 1980s”, was eventually shifted to their home computer model (as “the wonder computer of the 1980s”). Planned advertising campaigns by Mattel, on the other hand, had focused almost entirely on graphical and sound capabilities superior to those of the original VCS – however, the IntelliVision was, at best, on par with the VCS II in those areas, and often fell far short. It also didn’t help that the MSRP for the IntelliVision was $299 – the VCS II, meanwhile, retailed for $269, and (because of the backwards-compatibility) had a much larger library of games available. [23] In fact, the original VCS port of Mission: Impossible was found to be functionally playable on the VCS II, as the greater processing power allowed a portion of the veritable thicket of kinks to be smoothed out. However, the game was still riddled with bugs, and greatly simplified from the original arcade release, leading Desilu to demand a new port of the game, to be made expressly for the VCS II. This version was released in late 1980, to critical acclaim and – for the first time – substantive commercial success. It paved the way for adventure games for the rest of that decade.

    Yet despite the unqualified triumph of Syzygy’s rollout, there remained people who claimed that there were indeed things that could not be done with the VCS II. Perhaps the most surprising – and convincing – rival to the supremacy of the VCS II was not another “video game” system, but a home video system – the LaserDisc, which had graphical capabilities far beyond even the most advanced arcade cabinets – the exact opposite situation to the original VCS. This came about due to a surprising discovery – through the efforts of one maverick animator whose creativity, having previously been confined at his former employer, was able to go into full bloom.

    Don Bluth had been an animator at Walt Disney Productions since 1971, after having been a journeyman during his early career (including during a previous stint at Disney in the 1950s). “Uncle Walt” himself had died in 1966, and ever since then, the flagship products of his empire, the feature-length animated films, had seen a precipitous decline in quality. The Jungle Book, the last of them which had been made under his auspices, would be remembered as a touchstone in the face of the many disappointments which would follow. Disney would produce only three animated features comprised of wholly original material in the 1970s – granted, the same level of output as in the 1960s, but of much lower quality. Robin Hood, their 1973 offering, featured animation reused from sources as old as Snow White; The Rescuers, released in 1977, perhaps marked a nadir for the company in terms of animation quality. Even Ralph Bakshi was getting better-quality work out of his animators for the Lord of the Rings films at about the same time. And when it became clear that Disney’s next offering, The Fox and the Hound, was going to be more of the same visually, that was the last straw. Bluth, who was both talented and charismatic, led a revolt by a group of animators that split from Disney, which lacked not only the creative genius of Walt but also the administrative talent of his brother Roy (who had died in 1971). Bluth, to his credit, left Disney with concrete plans of his own; he had intended to adapt Mrs. Frisby and the Rats of NIMH by Robert C. O’Brien, but found himself sidelined by a far more innovative job offer. One of the myriad new game design firms, Advanced Microcomputer Systems, or AMS, was stymied by the lack of striking visuals in the nascent medium. [24] The promise of better graphics from the VCS II and IntelliVision was not nearly sufficient, as far as AMS was concerned. High fidelity, true-colour, 24-frames-per-second animation was needed, in the opinion of those working at the company, in order to match the artistic merit of established media; this opinion was very much in keeping with the aspirations of pioneers in the motion picture industry (such as D.W. Griffith and Charlie Chaplin) or, of course, in television (such as Lucille Ball and Desi Arnaz). AMS knew that the dominant technology used in video games would not provide the experience they were hoping for, so they chose to pursue an alternative.

    They found it, surprisingly enough, in LaserDisc. It was by far the largest of the four home video formats in terms of storage capacity, and was capable of holding many hours of high-quality video footage. The idea was that “scenes” could be accessed from the LaserDisc and displayed onscreen in response to a player’s actions, as a response to the correct prompt (or the incorrect prompt, as the case may be). In its most basic form, the LaserDisc technology would allow for a very simple but fundamental “interactive story” – a complete, fully developed narrative that would, with correct play, simply be “interrupted” by the actions of the player; if these actions were incorrect, then alternate scenes (showing bad endings, which would lead to a game over) were shown instead. The storytelling experience – and, therefore, the gameplay experience – would be very linear; unlike puzzle and adventure games, which were flexible enough to allow gameplay to continue well after an “incorrect” choice had been made, the very detailed and very expensive animation could not feasibly allow for branching paths in storytelling. This didn’t matter to AMS, which simply did not consider true interactivity or freedom of gameplay to be an artistic virtue, given that they tended to think more in terms of the standards set by the older media. And indeed, their game Dragonslayer (based, in part, on the classic legend of St. George and the Dragon) had perhaps the best-developed storyline of any game to date when it was released in 1981, including an array of memorable and clearly defined characters, but it involved no element of choice whatsoever on the part of the player. [25] Nonetheless, the novelty attracted paying customers, and the game made headlines, given that it was presented in a package which proved easily digestible by the press. The success of Dragonslayer inspired MCA to contract with AMS for additional such games, and AMS in turn secured the resources of Bluth and his animation studio, providing him with a much-needed and steady infusion of cash to fund his own ambitious animation projects – the money he had earned from Dragonslayer had funded production of what had become known as The Rats of NIMH, expected to be completed in 1982 as one of many major fantasy releases of the early 1980s. [26]
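
    (For the technically curious, the branch-on-failure structure described above can be illustrated with a small sketch. This is a purely hypothetical, modern reconstruction of the concept – the scene names, clips, and prompts below are invented for demonstration, and none of this is code from any actual LaserDisc title.)

```python
# Illustrative sketch only: a toy model of the linear, branch-on-failure
# "interactive movie" structure described above. All scene names, clip
# names, and prompts are hypothetical.

SCENES = {
    "bridge": {"clip": "bridge_crossing", "correct": "jump",  "next": "cavern", "fail": "fall_death"},
    "cavern": {"clip": "cavern_ambush",   "correct": "sword", "next": "dragon", "fail": "crush_death"},
    "dragon": {"clip": "dragon_lair",     "correct": "duck",  "next": None,     "fail": "flame_death"},
}

def play(clip):
    # Stand-in for seeking to a pre-rendered scene on the disc and playing it.
    print(f"[playing: {clip}]")

def run_game(inputs):
    """Step through the fixed scene order; any wrong prompt branches to a bad-ending clip."""
    scene = "bridge"
    for action in inputs:
        data = SCENES[scene]
        play(data["clip"])
        if action != data["correct"]:
            play(data["fail"])      # incorrect prompt: show the bad ending, then game over
            return "game over"
        scene = data["next"]
        if scene is None:           # the single authored ending has been reached
            return "victory"
    return "incomplete"

if __name__ == "__main__":
    print(run_game(["jump", "sword", "duck"]))  # correct play throughout: victory
    print(run_game(["jump", "run"]))            # wrong prompt in the cavern: game over
```

    Every path through the sketch ends either in the single authored ending or in one of the pre-rendered death scenes – exactly the linearity (and lack of true interactivity) described above.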

    MCA was certainly delighted that their LaserDisc technology had found an alternative use, because their DiscoVision players were selling far worse than RCA’s SelectaVision CED players were, despite the many advantages of the format. Once more advanced CED players (and discs) were introduced, with no plans by MCA to do the same for LaserDisc, the format’s death knell had sounded; this was confirmed by the loss of affiliation with all movie and television studios other than the MCA-owned Universal by 1981, and later that year the DiscoVision division itself was shut down and its assets liquidated. MCA did continue to manufacture the discs, though arcade cabinet makers would prove their most substantial clients in the North American market from then on. The first casualty in the Home Video Wars had, for all intents and purposes, been logged. At the time, the best-selling product released for DiscoVision and LaserDisc VDPs was the 1975 Universal blockbuster Jaws, as had been the case for the entire lifespan of the format.

    During the height of the Home Video Wars, there was also a third competitor to both the LaserDisc and CED formats, based on the more old-fashioned videotape that had already made substantial headway among broadcasters and production companies alike prior to the start of the Home Video Wars. Sony’s U-matic videotape format had been popular with those working in the industry ever since it was introduced in 1969, but sales among end consumers – who had originally been intended as its target market – were a non-starter. What made it so unattractive an option for the general public was that both the cassette tape and the associated VTR player were too bulky and too ungainly. [27] In addition to those working in the industry, U-matic also became popular in the educational and commercial sectors, resulting in Sony deciding to shift the focus of U-matic in that direction. However, Sony was not one to permanently cede their interest in any market, and thus they prepared themselves for another round, commencing with the 1975 launch of Betamax, which managed to eke out a far more sizeable market share than U-matic had done, though it also faced much tougher competition. In North America, for example, CEDs had a much stronger following, despite being inferior in terms of picture and sound quality to any of the competing formats save for VHS. But Beta, being a VTR format as opposed to a VDP, allowed for the legally ambiguous practice of “time shifting”, which RCA naturally campaigned against quite vigorously, as did every television and movie studio in Hollywood. [28] This greatly limited the success of Beta in the United States and Canada; indeed, many retailers refused to sell recordable videotapes in the standard VTR formats. Europe and the Middle East, by contrast, were more receptive to Beta, as was the rest of the British Commonwealth. And it was in the Latin American and Caribbean markets – the most unexpected place for videotape technology – where Beta was truly successful. This may have been due to Sony conducting a major advertising campaign throughout Central and South America and the Caribbean, showcasing popular television series (often with direct cooperation from the major broadcasters), special events like Carnival and soccer matches (including those from the World Cup), and technical demonstrations. [29] In Guyana and the rest of the Anglophone Caribbean, Sony also focused on cricket matches. However, Beta was also popular for eminently practical reasons: most videotapes, which allowed for an hour of recording time at the standard speed, proved to be ideal for capturing episodes (known as capítulos, or chapters) of the perennially popular telenovelas, or Latin American soap operas, with a playback quality which was virtually as good as that of the original broadcast. In fact, the popularity of Beta in the region was such that Sony constructed a factory in Caracas, Venezuela, lowering transportation costs and making the tapes and VTRs more affordable to the lower-income consumers who lived in that part of the world.

    Even though the notion of recording existing television footage for the purposes of time shifting met with a strong stigma and considerable resistance, the previously-established use of videotape – as a medium for the recording of exposed images – also reached end consumers during this era. Modified versions of VHS and Beta VTR chassis (with attached camera equipment) functioned as video camera recorders, or camcorders, which used videotape cassettes modified just enough to be rendered unusable in any standard VTR device. The camcorder would attach to the television set for playback in much the same way as a VTR would, bringing a definitive end to the era of slides and projectors as the means by which people watched their home movies and showed them to others. RCA, in an attempt to avoid appearing monopolistic, manufactured both Beta and VHS camcorders, and advertised them, even going so far as to describe their products as a “legitimate use of VTR technology”. Strictly speaking, this was an overstatement, as camcorders were in fact able to record televised footage in much the same fashion as kinescopes had done in the past; however, doing so would result in blurring and flickering of the image (due to incompatible frame rates), along with poorer picture quality. Even in ideal circumstances, sound reception on most camcorders was very poor compared to professional media, which resulted in the “bleeding” of sounds as heard on playback. However, with regard to ease of use, the camcorder was decidedly superior to earlier technology. It was the sole unqualified triumph for Sony in North America, with Beta camcorders far outselling VHS; for the first generation after “New Hollywood”, camcorders would also prove an invaluable resource for budding filmmakers with creative ideas of their own, some of which showed more promise than others…

    Unfortunately, all of the many advances in video game and home video technology had a heretofore unseen dark side. These new technologies very quickly found uses in ways that their makers had never intended, nor could possibly have imagined. Although the era of porno chic had petered out by this time, the genre found a whole new lease on life in the new media available to its producers, which would soon become its dominant means of dissemination. Less significantly, though certainly more notoriously, there were (unauthorized) games produced for the Syzygy VCS that contained sexually explicit material (at least, as far as could be discerned given the extremely primitive graphics of that console), some of which became infamous through sensationalistic news reporting. The biggest outrage entailed a game with an alleged historical setting (the Battle of Tippecanoe, in what was widely perceived as a slam at the TipSoft company). The game, which was actually designed by the “amusingly” named HardCore Productions, was named Tip My Canoe in a groan-worthy sexual pun on the name of the battle (and in another dig at TipSoft). Women’s rights advocates deplored its depiction of the male player character (Governor – and later President – William Henry Harrison) raping an anonymous female native, which was in fact the entire point of the game, with the score being based solely on how Harrison had… “scored”. [30] When the story broke, Lucille Ball (who, through Desilu, was a major investor in Syzygy) was said to be downright furious, and she even released a statement to the press that categorically condemned the makers of Tip My Canoe (whom she refused to address by name) for their appalling lack of moral judgment. This did not deter HardCore – which firmly believed in the adage that no publicity was bad publicity – in the least, and indeed they were even planning to create a new, higher-resolution version of Tip My Canoe for the VCS II, before Syzygy was able to put a stop to it through threats of legal action. This whole affair would be the first time that video games were decried as “corrupting our youth”, but, sadly, far from the last. It was certainly not the only such scandal of the era, given the widespread release of home videos and video games which depicted subject matter that was, if not obscene, then at least pornographic. One new genre of film which proliferated on video purported to be “found footage”, but instead contained simulated depictions of cannibalism, animal cruelty, beheadings, and other highly graphic and disturbing content. [31]

    In many countries, such as the United Kingdom and particularly Canada, this resulted in such a profound series of major scandals based around moral outrage that the movies and video games which inspired them were dubbed the “video nasties”. [32] In Canada, the video nasties were seen as such a threat to the keeping of the peace that the criminal justice system became involved, and several Royal Commissions were established. These naturally cast a long shadow over the development of cable and satellite broadcasting on television, and it was in this environment that HBO Canada was finally launched (after a parallel series of intense legislative actions which soon dovetailed with the video nasty scandals). When HBO became available to Canadians, the channel decided – due in large part to the controversy surrounding its licensing arrangement – to eschew the MPAA ratings system used in the US since 1968, and create a motion picture rating system of its own, which would also be used for television programming. This was a big gamble, as ratings systems in Canada were the purview of the provinces, each of which had chosen to employ one distinct from most of the others. The new HBO rating system was based on many of these pre-existing ratings systems, which allowed it to be far more detailed than the rather basic and limited MPAA system (which was, itself, facing considerable criticism from all directions). This, along with the HBO policy of not showing R-rated films prior to 8:00 pm, was novel in Canada, as was the channel’s broader response to the video nasty scandals. It served to mollify critics, who now believed that at least some quarters were treating the problem seriously.

    It was clear that although the opportunities provided to producers, marketers, distributors, and consumers by the proliferation of new, interactive technology seemed virtually limitless, that proliferation did not come without the potential for perversion, or otherwise particularly gruesome exploitation by the seedy underbelly of society. Such had been the lot of virtually every innovation devised by mankind since the birth of technology, but perhaps none had been so closely scrutinized, and so immediately and unreservedly blamed for the ills of society – to a degree that made past bêtes noires such as rock-and-roll or psychedelic drug use seem positively tame by comparison – though, granted, many of those who had indulged in those vices had aged enough to have children of their own, and naturally feared for their well-being. The cycle had begun anew…

    ---

    [1] Paramount made this rather dramatic about-face – “pulling a Churchill”, let’s call it, since they switched sides, and then switched back – in large part because they desperately needed an infusion of cash to help pay for their $100 million bond, and RCA (rather cunningly) took advantage of this.

    [2] Siskel and Ebert were both passionate defenders of films being presented in their original, theatrically-released format, and this incident is an example of that. Though both of them did, eventually, concede on the issue of VCRs IOTL, they were champions of film preservation (and federal legislation which would create a Film Registry, which passed in 1988). An OTL example of their advocacy in this field was a special 1986 episode of Siskel & Ebert devoted to colourization, entitled “Hollywood’s New Vandalism”.

    [3] It should, of course, be re-iterated that CED fell far behind schedule IOTL. It wasn’t even released in any form until 1981, by which time it was scarcely more advanced than the technology had been in 1977 ITTL (only a few innovations, such as longer runtimes, and the capacity for stereophonic and multiple sound channels, had been added).

    [4] As noted, the capacity for dual-channel sound existed on some (not all) CED videodiscs (which were specifically identified as such, usually by the colour of their caddy) IOTL, but neither RCA nor any of the film distributors were able to exploit that technology to nearly the extent they did ITTL.

    [5] The audio commentary was invented IOTL by the Criterion Collection (on LaserDisc, not CED) in 1984. The very first audio commentary was provided by film historian Ronald Haver, for the original 1933 RKO version of King Kong (the second release under the Criterion Collection marque).

    [6] Orson Welles did not provide an audio commentary for Kane IOTL, in large part because the Criterion release of that film preceded that of King Kong, though perhaps Welles being three years older, more ornery, and closer to death might have played a part (he died in 1985). The once-great auteur was notoriously mercenary, willing to lend his name to just about any product that would pay him enough (though perhaps that commercial for frozen peas was a bridge too far). Peter Bogdanovich, a personal admirer and friend of his, participating in the Kane project ITTL also helps to convince Welles to consent to an interview.

    [7] The distributor of this release is modeled on Janus Films (a noted importer of foreign and “arthouse” films to the US), who created the Criterion Collection marque IOTL, which in turn pioneered the “special edition”, as noted. ITTL, with the earlier presence of home video, the timetable for these releases has been commensurately accelerated.

    [8] Funnily enough, IOTL Criterion released a 40th Anniversary Collector’s Edition of I Love Lucy for LaserDisc – but by this time (1991), unfortunately, That Wacky Redhead herself had passed and could not actively contribute to the audio commentary. However, the producers were able to assemble an impressive collection of past clips in which she had discussed her experiences making I Love Lucy, partial excerpts of which can be heard right here (along with thoughts from others). The commentary will be similar ITTL, though That Wacky Redhead will obviously be more reactive to specific moments in the episode as they happen (since the interviews are being conducted with playback in mind).

    [9] Space Invaders exists in substantially the same form as IOTL, including the famous oversight in programming which made the aliens move faster as more of them were defeated (with fewer objects on the screen). This fortuitous glitch is popularly credited with introducing the concept of a difficulty curve to video games, and would ITTL have an impact in the making of, among many others, the 1978 arcade remake of Star Trek, Bruce Lee: The Way of the Warrior, and the 1980 VCS II remake of Mission: Impossible.
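
    (As an aside for the technically inclined, the mechanism behind that fortuitous glitch – a fixed per-frame movement step combined with a frame time that shrinks as sprites are removed – can be mimicked with a tiny, purely hypothetical sketch; the numbers below are invented for demonstration and bear no relation to Taito’s actual code.)

```python
# Illustrative sketch only: an emergent "difficulty curve" of the Space Invaders
# kind, where fewer sprites on screen means a shorter frame time, and therefore
# faster apparent movement, even though the per-frame step never changes.

DRAW_COST_PER_ALIEN = 0.002  # hypothetical seconds of redraw time per sprite
STEP_PER_FRAME = 2           # pixels the formation shifts each frame (constant)

def apparent_speed(aliens_remaining):
    """Pixels per second the formation appears to move at a given sprite count."""
    frame_time = DRAW_COST_PER_ALIEN * aliens_remaining
    return STEP_PER_FRAME / frame_time

for remaining in (55, 30, 10, 1):
    print(f"{remaining:2d} aliens remaining -> {apparent_speed(remaining):7.1f} px/s")
```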

    [10] Calling Mission: Impossible an adventure game in the traditional sense of the term is, perhaps, slightly misleading. The puzzles which comprised each “level” of play were more straightforward and discrete in execution, avoiding – thanks to the game’s simplicity, itself brought on by technical limitations – the labyrinthine rabbit holes which would later make the genre notorious; however, the game placed a definite emphasis on creative, non-violent solutions (perhaps the defining trait of the adventure genre). At the time of the original game’s release in 1974, it was the most complex video game ever released, contributing to a sense that the market was not yet “ready” for it.

    [11] By the reckoning of most video game historians, the first adventure game was the unimaginatively titled 1977 game Adventure (also called Colossal Cave Adventure), which lent its name to the new genre. The game, like almost all early adventure games, was text-only, which allowed for more detail and complexity in the storytelling element of the genre. Many old-school adventure gamers continue to differentiate between “text adventure” and “graphic adventure” games, the latter of which did not become predominant until the mid-1980s; even then, most early graphic adventure games used a text parser before point-and-click interfaces became standard.

    [12] Touch Me was released by Atari IOTL (and by Syzygy ITTL) for the arcades in 1974. IOTL, designer Ralph Baer had played Touch Me at a trade show and found it severely lacking, seeking to correct its flaws in a game of his own, Simon, which was released in 1977 and is much better remembered today. Atari, inspired by the success of Simon, then made a handheld version which came out the following year; ITTL, however, those same designers instead devote their energies to The Way of the Warrior.

    [13] In East Asia, of course, Bruce Lee would undoubtedly be better known by his Chinese name, 李小龍, which is pronounced Li Hsiao-lung in Mandarin (Romanized in Wade-Giles, of course) and Lei Siu-loong in his native Cantonese (using the Meyer-Wempe Romanization).

    [14] Purdue did indeed establish the first degree-granting computer science program in the United States (in 1962) IOTL, though its faculty and alumni never quite came together to form a for-profit game design company. ITTL, Tippecanoe is named for the county in which Purdue University is located - which was also the site of the famous 1811 Battle of Tippecanoe. The name also proved fitting because one of the faculty members, who was chosen as President of the company, was a Dr. Tyler.

    [15] TTL’s Infocom is based on both the real company of the same name (which was indeed formed by staff and students at MIT) as well as ActiVision – the first major legitimate third-party cartridge manufacturer IOTL (who bought out Infocom in 1986 and continues to exist today as one-half of the Activision Blizzard gaming conglomerate).

    [16] The makers of “underground” games – particularly those with adult themes – largely did not advertise their products, except by word of mouth; the games were kept behind the desk or in a back room, and customers would have to specifically request them. The sleeves (unlike those of most other games) bore no identifying markings.

    [17] IOTL, because none of the existing computer manufacturers was willing to adopt 16-bit processing technology (most personal microcomputers running on 8-bit processors, which were cheaper to manufacture), TI designed a home computer system from scratch to house the TMS9900 microprocessor. This became the TI 99/4 in 1979, later released as the improved TI 99/4A in 1981. It was discontinued in 1983, due to a price war initiated by Commodore, which soon found TI on the losing end.

    [18] The Atari 5200, released in 1982 IOTL, was (among its many other problems) not backwards-compatible with the Atari VCS (by then renamed the 2600); gamers had to wait for a peripheral device which would allow the new console to play the old games, and by then, the infamous Video Game Crash of 1983 could not be stopped.

    [19] The names “Cindy” and “Terri” come from the two successive characters after the departure of Chrissy from Three’s Company IOTL. Cindy, like her cousin Chrissy, was a dumb blonde bimbo character, with the added “quirk” of being klutzy; Terri, however, was an intelligent and competent professional, and thus the HCS is named for her.

    [20] At this point, the Syzygy VCS II and HCS are essentially the “best of both worlds” of two different OTL computer lines: the TI 99/4(A) on one hand, and the Atari 8-bit computer line on the other. During the development of Atari’s first 8-bit PCs, they were codenamed “Candy” and “Colleen”. Candy became the Atari 400, which was a basic computer marketed as a gaming machine, complete with membrane keyboard and some limitations to the software. Colleen, on the other hand, became the Atari 800, which was marketed as a computer, complete with possibilities for expansion as well as a proper keyboard. As mentioned in a previous footnote, the TI 99/4(A) was built around the TMS9900 microprocessor, making the 99/4 the first 16-bit home computer. As TI’s computer line does not exist ITTL, much of the OTL development of the TI 99/4 would be used for the VCS II and HCS. The HCS is thus a TTL equivalent to both the OTL Atari 800 and the TI 99/4(A), and the VCS II is more of a cross between the Atari 400 and the Atari 5200 console, which had, after all, been conceived as an application of Atari’s 8-bit computer technology to the video game console market.

    [21] In the Thanksgiving football game played between the Dallas Cowboys and the Houston Oilers on November 22, 1979, IOTL, the Oilers defeated the Cowboys 30-24.

    [22] Shatner was the pitchman for the Commodore VIC-20 (introduced in 1980) IOTL, the video for which can be seen here. Note how he slams the competition (Atari and IntelliVision) and mentions the price (“under $300”) in the first ten seconds of the ad. The TI line of computers, on the other hand, were endorsed by the ubiquitous Bill Cosby, whose ad can be seen right here. Note that Cosby does not mention their price. ITTL, Shatner is naturally chosen for his connection to Star Trek, given the importance of The Next Voyage as a launch title for the VCS II. His sixty-second spot was spliced into various thirty-second versions which were seen throughout 1980.

    [23] The launch price of the VCS, IOTL and ITTL, was $200 in 1977. The launch price of the VCS II ITTL and of the 5200 IOTL may seem to be the same ($269), but note that they were released three years apart (in 1979 and 1982, respectively), and the rampant inflation taking place over that period means that the $269 of 1979 was worth over a hundred dollars more just three years later. Note also that the 5200 had an 8-bit processor, not a 16-bit one, which perhaps explains why it was, in real terms, so much cheaper than its TTL counterpart (not that this prevented consumers from complaining). However, IntelliVision launched IOTL, as ITTL, at a price of $299 (over an extended rollout that continued into 1980). That console was the most successful competitor to Atari’s dominance of the Second Generation (1976-83), selling over three million units, largely because it was the only one with a 16-bit processor, and because it had launched well before the oversaturation of consoles took hold circa 1982 (leading to the Crash).

    [24] AMS also made the OTL Dragon’s Lair game, alongside many later LaserDisc games; these were all published by Cinematronics. The company gets an earlier start ITTL thanks to the earlier launch of the Home Video Wars, and this naturally affects the timetable of all products by the company as well.

    [25] IOTL, Dragon’s Lair (not Dragonslayer) had a more traditional “unlikely hero saves the damsel-in-distress” plot, unrelated to the tale of St. George and the Dragon. However, the success of The Lord of the Rings influences the writers, as well as accelerating the development of the game itself, which was completed in 1983 IOTL.

    [26] The Rats of NIMH was, of course, released IOTL as The Secret of NIMH, to critical plaudits (many still believe it to be the finest film ever made by Bluth) but lukewarm audience response, resulting in his company going bankrupt (the first of three times this fate would befall him). Regardless of the success of the NIMH film ITTL, his company will be spared that fate thanks to its deal with AMS having been made beforehand.

    [27] As a comparison of U-matic with other VTR formats, here is a photo comparing the 3/4″ U-matic tape with other formats; the U-matic tape is in the upper left.

    [28] As previously mentioned, RCA was IOTL perhaps the single biggest proponent of time shifting and of videotape recording in general; “SelectaVision” was also a trademark for their VCR line (being used for their CEDs after their introduction in 1981). Their TTL opposition, along with bringing so many producers and distributors onside, helps to keep videotape largely confined to industry and commercial applications (apart from camcorders).

    [29] The adoption of Betamax in Latin America is actually based on an OTL event in 1972, when Brazil – virtually alone in the Western Hemisphere – adopted the PAL colour standard for TV in conjunction with the existing 525-line TV standard. Most of the rest of the Americas adopted NTSC (the only other countries adopting PAL were fellow Southern Cone countries). One possible explanation is that Philips and Telefunken – along with other major European electronics manufacturers of the time – ran an advertising campaign across South America in 1972, similar to Sony’s Beta campaigning ITTL. Only Brazil adopted PAL as a result of this campaign.

    [30] Tip My Canoe is based on an OTL video game, Custer’s Revenge, with a strikingly similar “plot” (with the exception that the player character was General George Armstrong Custer, and the setting was the Battle of Little Bighorn). Both games, as you might imagine, are about equally faithful to history – as noted, HardCore (which, thankfully, exists only ITTL) chose the Tip My Canoe setting specifically as a dig at TipSoft (the designer was rejected by Purdue). Given the nature of depicting a President of the United States (however brief his tenure) as a rapist, as opposed to a “mere” General, the controversy is probably even stronger than that over Custer’s Revenge was IOTL.

    [31] An OTL example of this phenomenon was Cannibal Holocaust, one of the first well-known “found footage” films, which contained disturbing and graphic depictions of violence, sexual brutality, (simulated) onscreen murder (leading to snuff film allegations), and (unsimulated) cruelty to animals. The film was heavily censored and even banned in many countries, and its director arrested on charges of obscenity and even murder, until he was able to prove that those scenes (including a gruesome impalement shot) were staged. ITTL, though this exact film was not made, others with similar plots and onscreen depictions were.

    [32] The term “video nasty” was unique to the UK and Ireland IOTL, dating to a similar series of scandals to those that exist ITTL. As a result of the video nasty scandal, the British Board of Film Classification, or BBFC, now rates videos as well as films. This demonstrates two very different approaches between the two senior Commonwealth nations: when a major scandal arises, Canadians patiently await the results of a Royal Commission, while Britons demand that Parliament enact legislation immediately.

    ---

    And here we are with our long-awaited return to interactive media! My thanks to Dan1988 for co-authoring this, the final instalment of the Technology Trilogy, and indeed for his helping hand in all three parts thereof. Thanks also to e of pi, whose brainstorming assistance in the writing of this update was invaluable, as was his time devoted to proofreading this quite lengthy update. (Remember when my posts were only a couple thousand words long? It seems so long ago now…) Indeed, this is the longest update yet written for this timeline, and the one with the most footnotes. Revising and editing naturally proved somewhat more taxing and time-consuming than usual, but it was great fun to write. I hope that you all enjoy the latest round of shocking revelations featured herein – once you give yourself enough time to absorb all of them, of course ;)


    TMS9900.jpg
     
    Appendix B, Part VII: US Presidential Election, 1980
  • Appendix B, Part VII: US Presidential Election, 1980

    The United States was not particularly popular in the rest of the world as the 1980s dawned. This was par for the course, naturally, as it was the only superpower in the First World; however, President Reagan’s policies were, in general, less well-received on the international stage than those of President Humphrey had been. Indeed, it was difficult to say where the United States was least popular: perhaps it was in Red China, the singular target of Reagan’s anti-communist rhetoric and sanctions, given his continued (if reluctant) good relations with Soviet Russia; perhaps it was in the Islamic world, which had been burned by the continuing American military presence in Iran, and by American support for Israel (which was, among other things, also supported by Iran – firmly against the wishes of those who would overthrow the Shah). The United States had not been popular in much of Central America and the Caribbean for close to a century, and that gave no indication of changing, especially as Reagan stubbornly maintained American control of the Panama Canal Zone, against the wishes of the local government. [1] Foreign powers aside, it was on the home front – especially in areas most strongly affected by the late-1970s recession – that Reagan had particularly low approval ratings. His supply-side economic policy was often labeled by opponents as “do-nothing economics”, and this impression was most keenly felt wherever unemployment was a problem. However, though many Democrats believed that they could run just about anyone and defeat Reagan in 1980, the President had an undeniable charisma and appeal with voters which made him a far more daunting and formidable foe than would otherwise have been the case.

    Nonetheless, many candidates did emerge as potential Democratic challengers to Reagan. However, neither of the major contenders from the “Battle of the Titans” four years before – Vice-President Edmund Muskie and Sen. Henry M. “Scoop” Jackson – chose to run again. Muskie’s name was mud after his landslide loss to Reagan, though he had managed to return to the Senate in Maine. Jackson, on the other hand, was a senior citizen by 1980 (just one year Reagan’s junior), and would be over 70 by 1984, were he to emerge victorious and seek a second term. Jackson quietly accepted his status as “Yesterday’s Man”, and decided to focus on being an elder statesman and power broker within the party. He would be to the Democrats what Robert Taft had been to the Republicans a quarter-century before – a revered legislator on his party’s right wing who never got the chance to seek the presidency on its behalf, though not for lack of trying. However, another elder statesman who had been born prior to the US entry into the First World War, William Proxmire, did choose to run for President that year, in what was initially perceived as a vanity run but would rapidly emerge as something far more significant…

    Other Democratic candidates included the Governor of New York, Ed Koch, who had only been elected to that position in 1978 (after ten years in Congress), defeating the incumbent, Republican Malcolm Wilson, in a landslide. [2] He appealed to the Jewish community, which was a powerful, influential, and wealthy faction within the party, and was also able, rather uniquely, to combine this ethnic base with support from the white working classes – the onetime “Archie Bunker vote”, in what was surely a sign of the times. The good people of New York, on the other hand, were most displeased by their newly-elected governor seeking higher office so soon after arriving in Albany, and Koch was branded an opportunist by his opponents – this greatly limited his growth potential. In the end, the white working classes had many candidates behind whom they could rally, such as Senate Minority Leader Robert Byrd of West Virginia, a thoroughly coal-powered state, as well as two Midwesterners: the aforementioned Proxmire, and Sen. John Glenn.

    Glenn had easily the highest name recognition of any candidate in the Democratic field, even over Koch and his nominal “boss”, Byrd, because he had been in space; this trumped his relatively thin legislative record after a decade in the Senate. [3] Indeed, Glenn was often inaccurately remembered as the first American in space when he was actually the third, behind Alan Shepard (who had also been to the Moon as part of Apollo 14 during the very height of Moonshot Lunacy) and the late Gus Grissom, who had tragically perished in Apollo 1 and had been earmarked to be the first to land on the Moon, had he lived. Shepard was said to be so annoyed by reports of Glenn as the “first American in space” – he was merely the first American to orbit the Earth, an important distinction – that he refused the Glenn campaign’s request to appear with his fellow member of the Mercury Seven. However, he may have chosen not to appear with Glenn for political reasons: many astronauts, such as Buzz Aldrin and even two of Glenn’s fellow Senators, Jack Swigert of Colorado and Harrison Schmitt of New Mexico, were Republicans; either that, or they were avowedly apolitical, such as Neil Armstrong. Perhaps this was for the best, as though space exploration would remain a key element of his overall image, the more pressing issues of the election were, as always, rather more down to Earth. Given the difficult economic conditions which prevailed in this era, Democratic strategists flocked to each of the candidates, including Glenn, with their own ideas about recovery. The preponderance of moderates was reflective of a paradigm shift in the Democratic Party; the Great Society of Johnson and Humphrey was dead and buried, and a different approach would have to be taken to deal with the economic question. Reaganomics, more or less the polar opposite approach to the Great Society, had also proven ineffective. There had to be a middle ground somewhere, and Glenn, with help from party insiders, believed that he had found it. State subsidization of private investments into public works – or, more succinctly, encouraging firms to invest in America – could combine the benevolence of big government and the efficiency of the free market into a palatable (and easily digestible) package. “Invest in America” emerged as the slogan for Glenn’s new proposed policy, as it was a clever and simple sound bite which allowed him to stand out from the crowd and to define himself in the media and, later, with the electorate. He started off strong by promoting wind power in Iowa, and then high-speed rail connections in Maine and New Hampshire, and was able to build on that early momentum as he proposed new programs in his travels throughout the country.

    Although Glenn was one of the many candidates who did their best to stake a claim somewhere in that vast gulf between those two extremes, others, such as the unabashedly liberal Rep. Mo Udall of Arizona, did not embrace the golden mean; he eventually found himself representing the left wing of the party more-or-less unopposed, much to his own surprise; other “leftists” within the race were less uniformly so in their convictions, with the exception of Rep. Barbara Jordan of Texas. [4] An African-American woman, Jordan was forced to withdraw from the race early on when the severity of her carefully-concealed multiple sclerosis was leaked to the public. The days of a candidate successfully concealing his or her physical handicap from the public were long gone. Nonetheless, her run was the first of any significance by a minority or a woman in the Democratic Party, though the GOP had beaten them to the punch in both cases (with Edward Brooke and Margaret Chase Smith, respectively). Jordan’s base in the South did not coincide very strongly with her race; many black legislators in the South – particularly the rural South – were Republicans, not Democrats (as had been demonstrated in the election of Sen. Charles Evers of Mississippi), as that electorate had largely forgiven, if not quite forgotten, the GOP for the aberration of Goldwater in 1964. Urban and suburban blacks, on the other hand, decidedly preferred the Democrats, particularly in the North. The “Great White Hope” for the Democrats in the formerly Solid South was indeed a white hope – Senator (and former Governor) John Connally, by far the most conservative candidate in the race, who represented a blatant attempt to recapture the voters lost to the American Party over the course of the previous decade. [5] Connally did succeed in attracting some of the old-school Dixiecrat vote, largely because the AIP itself was moving in a new direction. However, the Democrats would never nominate John Connally for President. He may have been a Texan, and close with President Kennedy, but he was no Lyndon B. Johnson – and indeed, even Johnson had been unable to clinch the nomination without already being the incumbent, and even then… In the end, it was Glenn – who had the best name recognition, the most appealing (or rather, least unappealing) economic policy, and who tried his best to be everything to everyone – who cleared the primary field.

    The selection of Glenn – an astronaut who endeavoured to take on a leadership role by presenting a third alternative to the polar options of the Great Society and Reaganomics – reminded many commentators of the iconic James T. Kirk, a comparison made even more appropriate in that Glenn had been a Colonel in the United States Marine Corps, equivalent to the naval rank of Captain. [6] Many of the Star Trek cast fittingly endorsed Glenn for the Presidency and even appeared alongside him, including Nichelle Nichols, Leonard Nimoy – who, being a television director, also filmed a number of advertising spots for the Glenn campaign – and, most prominently, the only career politician aboard the Enterprise, George Takei, who had in fact endorsed Glenn as the Democratic candidate for President early on, partly for strategic purposes. His close friend, Rep. Marlin C. DeAngelo, who represented CA-24 in North Central Los Angeles (which partly overlapped Takei’s old city council district), had joined the fray in seeking a seat in the US Senate, held by the far-right incumbent H.L. Richardson, perhaps the epitomic beneficiary of the 1974 Republican Revolution. DeAngelo, like Glenn, had served with the USMC (reaching the rank of 1st Lieutenant by the end of the overseas quagmire), and the two had formed a fast friendship in Congress, despite serving in different houses. This relationship would pay dividends for DeAngelo, as would his friendship with Takei and, most intriguingly, that with the celebrated Chief Counsel for the Plaintiff in the Trial of the Century, Andy Taylor. [7] The two East Coast transplants – Taylor from Maryland, and DeAngelo from New Jersey – had been roommates during their time together at USC Law School, and Taylor was certainly the man of the hour in early 1980, with his critical endorsement of DeAngelo being enough to win him the crowded nomination fight for the Senate seat. This created an opening in CA-24, a reliably Democratic district, and Takei (having represented parts of it as recently as 1979) decided to throw his hat into the ring.

    In making the jump from municipal to federal politics, Takei had set a number of objectives for himself. First and foremost, he wanted the US government to be held accountable for its shameful internment of Japanese-American citizens during World War II, even though not one of them was ever found to be engaging in treasonous activities (in fact, Takei himself had played such a traitor in a highly controversial episode of The Twilight Zone, which he defended by reminding viewers that the show was set in the twilight zone). [8] In fact, Takei had made headlines in 1976 by endorsing Scoop Jackson (who had supported the internment as a Congressman in the 1940s) after the Senator had apologized for his past actions and vowed to support suitable redress. [9] His pet issue of mass transit could easily be extrapolated onto the national scene, as the past decade of highway revolts had so vividly demonstrated; Takei was also suitably intrigued by the high-speed “Rocket” rail line in Montreal, believing that such a project could and should happen stateside; in fact, it was he who had coaxed the Glenn campaign into building initiatives for HSR into the “Invest in America” platform – his fawning claim that the Senator would be remembered as “the Eisenhower of mass transit” was likely what did the trick (Takei had a proven record of influencing Presidents on the issue, given his experiences with Humphrey). Finally, as a longtime supporter of gay rights, Takei also felt that preventing another Briggs Initiative was a priority, and that this was more easily done pre-emptively from Congress. [10] Further, Takei, like his friend DeAngelo, was a supporter of Glenn’s proposed policies (many of which had been repackaged from the failed Jackson campaign of four years earlier – in fact, Glenn had even hired his campaign manager), and wanted in on the ground floor. Takei’s celebrity and his six years of municipal experience, coupled with endorsements from his predecessor DeAngelo, were enough for him to consolidate the votes needed to win the Democratic nomination for the congressional district – tantamount to election in a seat which DeAngelo had won by 20 points even in 1974, as a fresh-faced nobody in a vacant seat during the Republican Revolution. Fittingly, CA-24 contained within its boundaries the headquarters of Desilu Productions, where Takei had first become famous. Lucille Ball, who consistently refused to get involved in politics, had declined to participate in his campaign, but was willing to take part in a photo-op with Rep. Takei after he had been elected and seated.

    The course at the federal level, however, did not run nearly so smoothly for Glenn, who, despite winning his party’s nomination, proved far less able to consolidate support after the fact. Sen. William Proxmire, his longtime arch-rival as well as his opponent in the primaries, was so incensed at the Democratic Party choosing, in his words, “a space cowboy to run against the cowboy actor”, that he chose to mount his own independent run for President – becoming the first major candidate from outside the South to do so since, fittingly enough, another Wisconsinite, Robert M. La Follette, in 1924. The “cowboy” comparison would come back to haunt Proxmire, who represented a state known for its dairy industry, leading political cartoonists to famously depict him as “the milkman” – the passiveness of such a role compared to the cowboy was not lost on observers. To complete the roster of caricatures, the AIP candidate, Jesse Helms, was variously depicted as an Indian counterpart to the cowboys (an ironic reference to his party’s nativism), as an enraged bull being corralled by the two cowboys, or simply as a turkey, to match his party’s emblem. Despite his initial misgivings, Proxmire would eventually resign himself to embracing the “milkman” moniker, claiming that the profession represented the common-sense ideals which he was advocating. The metaphor was strengthened with the addition of Rep. Shirley Chisholm to the ticket; “chocolate milk” and “coffee with milk” jokes abounded. [11] She joined his ticket largely due to her dismay with the Democrats working to accommodate the old “Dixiecrat” electorate. Proxmire and Chisholm named the party which formed around their independent run the Earth Party, a reference to environmentalism as well as to “putting our planet ahead of a whole lot of nothing”, in Proxmire’s words. His loathing of the space program extended to Cosmos, the wildly popular PBS series airing at the time, which (being the most expensive program ever aired on that network) he naturally viewed as a gargantuan boondoggle. This attracted the ire of Dr. Carl Sagan, who wrote, produced, and hosted the program; the cosmologist repeatedly attempted to arrange a sit-down with Proxmire, but to no avail. [12]

    The Reagan campaign largely ignored Proxmire, proving surprisingly aggressive in attacking their main challenger – perhaps because polls showed Glenn performing well along charisma and likeability metrics, which were otherwise the definitive strengths of the incumbent. Their offensives gained some traction, famously posing the question “What on Earth has John Glenn done?” in pointing out his undistinguished tenure in the Senate. [13] Glenn was ten years younger than Reagan, turning 59 during the campaign, but had more experience – a decade in the Senate – than Reagan had had (just eight years as Governor) when he first sought the Presidency; granted, none of this was executive experience. He sought to correct this weakness in choosing his running-mate, who would also, hopefully, consolidate some of the potent – and perhaps decisive – “Connally Democrats” who had been discovered during the primaries. The candidate who seemed to best fit these qualifications was Senator Jimmy Carter of Georgia, who had previously served as Governor of the Peach State from 1971 to 1975. Carter had briefly flirted with an abortive Presidential campaign in 1976, before, like most of his fellow Democrats, standing aside for the Battle of the Titans – in which he had endorsed Scoop Jackson, which proved a harbinger for the Muskie campaign’s fortunes in the Old Confederacy. [14] Carter, who had become a national star within his party upon defeating his own arch-rival, Lester Maddox, for Senate two years later, also flirted with campaigning in 1980, but Connally and Jordan between them hogged his natural constituency. Carter had eagerly accepted Glenn’s invitation to serve as his running-mate; born in 1924, he would still be younger running for President in 1988 (after two hypothetical terms as VP) than Ronald Reagan had been running (and winning) in 1976. [15] Many observers commented on the unusual vocations of the Glenn-Carter ticket: though both men had served in the military (Glenn as a test pilot, and Carter as a submarine engineer), the two respectively became best known as an astronaut and a peanut farmer. This resulted in the campaign’s second-most famous slogan: “From under the ground to above the skies” (which, inevitably, was eventually combined with “Invest in America”). Carter campaigned largely in the South, with operatives joking that he was to be treated like a general at the head of a Confederate army if he attempted to “invade” Northern territory. His folksy manner and his adroit rebuttals against the religious wing of the AIP (as Carter was, himself, a devout Baptist) won him plaudits from the chattering class and the sympathy and admiration of many Southern voters. In fact, he focused almost entirely on social issues, leaving the economy in the hands of Glenn, who toured the United States (though mostly the manufacturing centres of the Midwest and the Northeast) focusing on job creation through novel means. One of his more famous stops was in Cleveland, in his home state of Ohio, where he noted that much of the industrial infrastructure was already in place to produce solar cells for power generation – the technology to build them came courtesy of what both the Republicans and Proxmire viewed as the “failed” SPS project (for which advanced panels had been developed and put into use for energy collection). The solar cells would provide power with absolutely no pollution – and in addition to the equipment being built in Ohio, it would be operated in dry, sunny Arizona, benefiting two completely different regions of the United States. In keeping with the “Invest in America” theme, private corporations would design, fabricate, place, and maintain these solar cells and collect the revenues from selling the power which they would produce. These operations would be funded by development loans, to be paid back with interest – and this would allow the government itself to turn a profit in the long term.

    Just because Carter was left to handle social issues did not mean that they had little significance in the campaign. Sen. Jesse Helms of North Carolina had won the American Party nomination for President with the support of the religious right. Alienated by the relatively moderate course that President Reagan had forged on social issues, in particular with regard to the Briggs Initiative in 1978, many religious leaders – particularly Southern evangelicals such as Jerry Falwell and charismatics like Pat Robertson – allied with the one major party that supported most of their views, the AIP. Helms was sufficiently inspired by their fervour to effectively deem segregation a lost cause, urging his supporters to focus on standing their ground with regard to other issues; of the comprehensive desegregation system which had been instituted since the mid-1960s, only busing remained on the table by the end of the 1970s. Although the AIP uniformly opposed it, so too did many Republicans and Democrats. Therefore, the time was right for this nascent force, which had so strongly asserted itself in 1978, to emerge as a cohesive organization with the backing of Helms, but it faced powerful opposition from the outset. [16] Catholic leaders vehemently spoke out against them, even as they opposed many gay rights initiatives (at least in principle). Moderate Protestant groups, especially in the North, rallied against their discriminatory rhetoric. Black churches, for obvious reasons, did not buy into the organization, and some were rather vocal about the “trickery” to which its leaders resorted in attempting to deceive others, with Biblical comparisons abounding – Falwell, Robertson, and Helms were all frequently described as “snakes”. The religious right notably failed to win the support of the Rev. Billy Graham, perhaps the most popular and well-regarded of televangelists, who presented himself as apolitical. Many Christian celebrities weighed in as well, to varying extents – Elvis Presley, “The King” himself, and an avid recorder of gospel music during this phase of his career, spoke out against prejudice and discrimination (though in vague terms – Elvis viewed himself as “just an entertainer”, and did not want to get too involved). [17] Jimmy Carter, meanwhile, was the foremost of many Christian politicians who opposed the religious right, making him a valuable asset in appealing to “Connally Democrats” without inviting accusations of racism. This precarious dispute may have inspired Helms to seek a running-mate from outside the South (as George Wallace had done in 1972), and one who was not particularly known for his religious convictions: the arch-conservative Republican Sen. John Bertrand Conlan of Arizona, elected in 1976 on Reagan’s coattails, but having grown increasingly disillusioned with the President in the years since. His defection prevented the GOP from rallying over apparent disunity amongst the Democrats; indeed, many pundits would later remark that it was something of a miracle that liberal Republican Charles Mathias had agreed to run with Reagan once again in 1980.

    TWR US 1980.png
    Map of Presidential election results. Red denotes states won by Glenn/Carter; Blue denotes those won by Reagan/Mathias; Gold denotes that won by Helms/Conlan.

    Turnout for the election was over 60%, or just short of 100 million votes, maintaining the threshold attained in 1976. [18] Glenn and Carter, in winning the election, carried just twenty-two states (and the District of Columbia) out of 50; though this translated to 313 electoral votes (and one more for Glenn alone), their popular vote tally of 45.5% was more indicative of the tight margins which defined the whole campaign. Though Reagan and Mathias lost their bid for re-election, they won twenty-seven states and 215 electoral votes, as well as 43.1% of the popular vote; this was better than the GOP had managed in either 1968 or 1972, against Hubert H. Humphrey. The AIP (technically, the ADP) won only Alabama and its nine electoral votes, with Jesse Helms falling below 5% nationally for the first time in the history of the party. Indeed, 5.5% of the vote, good for third place, went to Proxmire and Chisholm, though this did not translate to any legitimate electoral votes, not even in Wisconsin. A faithless elector in New York who had been pledged to Glenn and Carter instead voted for Glenn and Chisholm, which was the first electoral vote ever cast for a woman and minority candidate in American history. [19] In the House of Representatives, the GOP lost 27 seats, bringing them down to 204, below the majority line of 218. The Democrats, on the other hand, gained only 31, for a total of 213. The AIP/ADP held the remaining 18 seats, and the balance of power, down four from 22. Meanwhile, in the Senate, the GOP retained 51 seats – a loss of five from the prior standings, but still enough to control the upper chamber outright. The Democrats gained six for a total of 44, and the AIP lost a single seat, with four left.

    In the end, despite another close race in the vein of 1960, 1968, and 1972, the Democrats were able to win due largely to their retaining right-wing voters better than the Republicans did – Reagan’s “betrayal” through Briggs stung bitterly for many social conservatives who had so eagerly supported him in 1976. The Republicans, oddly enough, retained liberals better than the Democrats had done, as the Earth Party (in a mirror image of the AIP) thrived on the votes of disillusioned left-wing Democrats – notwithstanding the old Moonie Loonie cadre, whose loyalty to Glenn seemed unshakable. This lost several close states for the Democrats, including Washington (which, along with Muskie’s home state of Maine, was one of just two that the GOP won in 1980 but lost in 1976) and Proxmire’s home state of Wisconsin. The results of the election were a perfect storm against Proxmire, however, even notwithstanding that he had failed to win any electoral votes – Glenn had won the Presidency and his party a plurality in the House, but the Senate remained Republican. This meant that the Democrats didn’t need Proxmire’s support in the upper chamber, and the Ranking Member of the Committee on Banking, Housing, and Urban Affairs was duly stripped of that post at the beginning of the 97th Congress. Proxmire continued to serve with the Democratic caucus; unlike George Wallace in 1972, he had built no infrastructure to elect any of his allies to Congress, nor to spur any defections. The Earth Party was very much a one-man show, or so it had seemed at the time.

    President Reagan was very nearly 70 years old upon his defeat, and decided to retire from politics and enjoy his golden years as a private citizen. Though his approval ratings had been underwater for most of his term, he remained quite popular on a personal level. His successor, John Glenn, was inaugurated as the 39th President of the United States on January 20, 1981, becoming the highest-ranking astronaut in the federal government, and (needless to say) the first President who had left the Earth. His victory, though narrower than expected, would set the tone for the rest of the decade, the election of 1980 quickly coming to be described as a realigning election. Glenn had promised during his campaign to revitalize the space program, and had defeated a splinter candidate whose campaign was based largely on opposition thereto; it was clear that NASA would come roaring back. The cover of TIME Magazine in a late November issue said it all: “RETURN OF THE MOONIE LOONIES?” More tellingly, TIME named John Glenn their Person of the Year for 1980, though this was widely derided as a “safe choice” over any number of those involved in the Trial of the Century; the editorial board acknowledged that George and Marcia Lucas, Andy Taylor, and C.A. Baxter had all been considered finalists for the position, implying without stating outright that they had split the vote and allowed Glenn (who, having won the Presidential election, was a natural candidate) to come up the middle.

    In legislative terms, the Democrats had won the plurality of House seats, but could not form a majority, therefore throwing the balance of power to the American Party; this finally achieved the long-held aspirations of the AIP/ADP head honcho, George Wallace, at least in some form. But the Wallace of 1980 was not quite the same man that he had been in 1968; segregation was a dead issue even in the Heart of Dixie. Taking their cues from Jesse Helms and his new coalition of supporters on the Presidential level, many AIP candidates and even incumbents chose to campaign on socially conservative issues (which, theoretically, could appeal to African-American voters, though of course they did not in practice) rather than reactionary, openly racist ones. But the AIP, after the 1980 election, were ensconced; they had formed the longest-lasting third-party bloc in Congress since the Populists. Many within the party saw this golden opportunity to hold the House hostage as their long-awaited, hard-fought reward. But Wallace disagreed; always on fairly cordial terms with the Democratic brass even during the tensest moments of his campaigning against them, he negotiated plum committee seats for himself and his protégé Flowers in the Senate, as well as his ADP congressmen (who controlled six seats out of seven in Alabama – and fully one-third of the combined American Party caucus), in exchange for their support (and the de-recognition of the NDP within Alabama, restoring the ADP as the “official” state Democratic Party). [20] This deal, which notionally brought the Democratic caucus to 219 House seats, and a razor-thin majority, was announced on December 19, 1980, with the reconciliation between Wallace and his old party famously described as the “Christmas Miracle” in the next morning’s headline in the Washington Post. (A miracle more different from the other big miracle of 1980, the Miracle on Ice, could scarcely be imagined.) The 97th Congress would be the first in which the House was controlled by a “coalition” since the 65th Congress of 1917-19, though in that case the Republicans had won more seats than the Democrats, who held the lower chamber anyway with support from the Progressives.

    However, others within the rump AIP, including Helms, were irate. Unfortunately for him (though to the delight of many others), there was nothing that he personally could do about it. Though North Carolina had allowed him to run for re-election to his Senate seat concurrently with heading the Presidential ballot for his party, he had lost his seat to the Democratic star candidate, Andy Griffith. [21] Of the twelve remaining AIP Congressmen, not one would join the Wallace faction; in fact, the independence of the ADP within the American Party structure had, in the eyes of many political scientists, made such a schism seemingly inevitable. None of the remaining AIP Senators joined the Democrats, either; and indeed that caucus would grow when their VP candidate, the nominally Republican Sen. Conlan of Arizona (who had been booted from the party at the end of the 96th Congress) joined them shortly thereafter. Ironically, despite that defection, and the appointment of several new Senators to replace those who had departed for the Glenn administration (including, in addition to Glenn and Carter themselves, Dale Bumpers and John Connally), the Senate composition in mid-1981 was exactly the same as that which the people had elected in November of 1980; the GOP still held the upper chamber outright, preventing the Democrats from acquiring the trifecta enjoyed by the Republicans throughout Reagan’s term. The former Democratic Majority Leader, Tip O’Neill of Massachusetts, enjoyed majority support from the newly realigned (and re-brokered) House, and therefore replaced Gerald R. Ford as Speaker. No Republican had enjoyed a tenure in the Speaker’s chair as lengthy as that of Ford – six years – since Joseph Gurney Cannon in the early 1900s. Ford was 67 when he was defeated for the Speakership and – reaching that post having been his life’s ambition – chose to follow the example of President Reagan and retire, to enjoy his golden years with his beloved wife Betty, standing down from the leadership and announcing that he would retire from politics at the end of the 97th Congress. At the end of that Congress – January 3, 1983 – he would have served his district of MI-5, based in Grand Rapids, for a total of 34 years, already as closely identified with the region as his predecessor in the Speaker’s chair, Carl Albert, was with Little Dixie (OK-3). Another Midwesterner, Robert H. Michel of Illinois, who had formerly been the Majority Whip, became House Minority Leader, beating out conservatives John Jacob Rhodes of Arizona and Donald Rumsfeld – also of Illinois – for the position with support from Ford and the party brass. Rumsfeld was chosen as Minority Whip, with Rhodes deciding to follow in Ford’s footsteps and retire at the end of the present Congress. Among the freshman Congressmen, meanwhile, Democratic Rep. George Takei of CA-24 gained the distinction, upon taking his seat on January 3, 1981, of being the first Buddhist in the history of the Congress. [22] True to his word, he appeared in a photo-op with Lucille Ball at the gates of Desilu Gower on January 24, 1981 (a Saturday), shortly after the Glenn inaugural – Takei had decided to put off his public appearances as a Congressman until after Reagan had ceased to be the President.

    The final frontier was, metaphorically speaking, where Glenn’s long road to the White House had begun, and it was fitting that it would emerge as one of the cornerstones of his administration. It seemed only natural to him – and to his loyal Moonie Loonies, who were more robust than they had been at any time since their zenith at the beginning of the previous decade – that investing in America would include investing in the space program. The golden mean between government funding and private firms was something that would have to be determined through trial and error, though the system that had been in place during the 1960s seemed as good a place to start as any. The nadir of funding that NASA had endured under Reagan had definitively ended; although funding would not fully recover to Apollo-era levels, it would also never again plunge to its previous depths. In fact, as a symbol of this commitment, Glenn promised in his inaugural address to replace the aging, overburdened Skylab-B with a “permanent, robust, multi-purpose space station that will serve as a platform for further exploration, to be planned and constructed before the end of this decade”. It was, naturally, symbolic of Glenn’s optimistic plans for the United States at large – with the help of properly-built infrastructure, enabling enterprise to launch upward into the stratosphere, and even further beyond…

    ---

    [1] IOTL, the Panama Canal Treaty was signed by the Carter administration in 1977, and was followed by the dissolution of the Canal Zone two years later. This ended a prolonged period of tensions between the United States and Panama which had been precipitated by the Martyrs’ Day riot in early 1964. ITTL, Humphrey, having “sold out” the American troops in the overseas quagmire, lacked the gravitas to do the same in Panama; Reagan, for his part, was disinclined to do so, especially with a hostile Congress.

    [2] Koch ran for and won the office of Mayor of New York City in 1978 IOTL, and not Governor of New York State. In both cases, his primary opponent for the job was the same: Mario Cuomo, as was a slogan used by the latter’s supporters: “Vote for Cuomo, not the homo.” It backfired just as strongly ITTL as IOTL.

    [3] Sen. John Glenn ran for President in 1984 IOTL, having been elected to the Senate in 1974. He therefore enters this campaign four years younger but with the same amount of legislative experience. Glenn won in 1970 ITTL as an early beneficiary of Moonshot Lunacy, as you may recall from this update.

    [4] Rep. Jordan retired from Congress in 1978, IOTL. She continues serving through 1980 because so many of her fallen comrades have yielded their seniority to her, and because her oratorical skills made her more valuable as a bigger fish in a smaller caucus. Jordan spoke at the DNC in 1976 ITTL, as she did IOTL, though she did not give the keynote speech. Also of note about Jordan is that, in addition to being a black woman, she was also a lesbian, which she kept private throughout her life. One reason that her multiple sclerosis was revealed ITTL was that it was considered a trade-off for keeping her lesbianism, and her active relationship with another woman, secret.

    [5] By this time IOTL, Connally had switched to the Republican Party, and did indeed run for the nomination in 1980… on that ticket, instead (where he finished sixth).

    [6] Glenn, of course, was also a test pilot, and Star Trek canon (especially ITTL) consistently describes Kirk as a superlative command pilot.

    [7] The star witness, forensic accountant C.A. Baxter, declined to endorse any candidate who did not promise to overhaul the Hollywood Accounting system. None did.

    [8] The episode in question, “The Encounter”, originally aired on May 1, 1964, and does not appear to have been seen in American syndication since, due to complaints from Japanese-American advocacy groups, though it has since aired in most other worldwide markets (including Canada, which also interned its residents of Japanese extraction in much the same way). Takei, who is a prominent lobbyist for the rights of Japanese-Americans, and was himself interned as a child, discusses the episode in this video clip.

    [9] As far as I know, Jackson never apologized for supporting the internment IOTL. Consider it an early – and impressive – demonstration of Takei’s political cachet.

    [10] No, Takei does not come out at any point during this campaign, and has no intention to anytime soon after it.

    [11] Rep. Chisholm, of course, did not run for President in 1976 ITTL, supporting Vice-President Muskie. In joining the Earth Party ticket she is sometimes reckoned as the first woman on a “major-party” ticket, with political scientists naturally quibbling over whether the 1980 Earth Party run can be considered “major”. With regards to her nickname, though “chocolate” certainly makes far more sense than “coffee” (especially when paired with milk, as opposed to cream – except in Rhode Island), and is used by the majority of commentators, some prefer “coffee” as a reference to the Coffy film trilogy, with some political cartoonists depicting Chisholm as, essentially, Pam Grier.

    [12] Sagan, obviously a highly intelligent and gifted orator, was IOTL able to convince Proxmire to drop his objections to the Search for Extra-Terrestrial Intelligence, or SETI, program, to which the Senator had once awarded one of his Golden Fleece Awards – awards he handed out to anything he found unworthy of appropriations.

    [13] This slogan was used IOTL by then-Lieutenant-Governor of Ohio, Mike DeWine, in his (failed) bid to defeat Glenn in 1992; however, this was a variation of a slogan used by Jeff Bingaman against Harrison Schmitt in 1982 – “What on Earth has he done for you lately?” – and that one was successful, so it may have inspired DeWine.

    [14] Carter, of course, sought and won the Democratic nomination for the Presidency, and then the Presidency itself, in 1976 IOTL.

    [15] It should be noted that Carter is being wildly optimistic here; after all, no sitting Vice-President had been elected President since Martin Van Buren in 1836, and the two most recent attempts (Edmund Muskie in 1976, and Richard Nixon in 1960) had ended in failure, however narrowly so in the latter case. Even IOTL, only one Vice-President (George H.W. Bush in 1988) has been elected President since Van Buren, though Al Gore did come extremely close to doing so (even winning the popular vote) in 2000.

    [16] This group was (and is) known IOTL as the Moral Majority, and its membership is more inclusive than that of its TTL counterpart (which remains known as the religious right largely because the religious “left” that opposes it is larger and more conspicuous – obviously at least partly at the expense of the OTL Moral Majority).

    [17] Elvis famously used this phrase to describe himself whenever he was asked about topical issues, most notably the overseas quagmire.

    [18] Turnout was a rather anemic 52.6% IOTL, with 86.5 million voters.

    [19] The first electoral vote cast for a woman in American history IOTL was for Tonie Nathan, the Libertarian candidate for Vice-President, in 1972, by a faithless elector (Roger MacBride, a Virginian pledged to Nixon and Agnew). The first electoral votes cast for a racial minority candidate were, of course, those pledged to Barack Obama in 2008.

    [20] Recall that the AIP (properly the American Party – that acronym is also an archaism, and has been since 1972) existed in a coalition with the ADP, or Alabama Democratic Party, which was formerly the state wing of the national party before it was commandeered by Wallace (IOTL, only for the 1968 election; but ITTL, from that point forward). He also returned to the Democratic fold IOTL, though much earlier (he ran for and very nearly won their nomination for President in 1972).

    [21] Yes, that Andy Griffith. Now you see why I milked the Matlock references in the Trial of the Century post; that show, sadly, will not exist ITTL.

    [22] No Buddhist sat in Congress until 2007 IOTL, though to be fair there were two firsts: Mazie Hirono of Hawaii and Hank Johnson of Georgia, both Democrats.

    ---

    My many and profuse thanks to vultan and e of pi for their help and advice in the making of this update, and to Dan1988 for assisting with the editing.

    Finally, many of your questions have been answered! Did Reagan win his second term? Who would emerge to challenge him for the Presidency? And who would succeed Reagan? Not to mention many other, smaller questions about what the United States is like at the dawn of the 1980s, and all on account of That Wacky Redhead! If you have any more questions, please do not hesitate to ask them. As always, the elections infoboxes will soon follow, but until then, thank you all for your patience and understanding.


    Accounting for Changes
  • Accounting for Changes

    tumblr_mtih60jvtn1qlz9dno1_250.png

    Summary of the Twentieth United States Census. [1]

    On April 1, 1980 (a Tuesday), the twentieth decennial United States Census was conducted, reflecting the demographic changes that had taken place in the republic through the tumult of the 1970s. The objective of the census was to apportion congressional representation according to population – loosely speaking, as the absolute size of the House of Representatives had been fixed at 435 since 1913, after having grown fairly consistently over the previous century. This allowed smaller states to enjoy proportionally larger representation in the House than larger states – every state was guaranteed at least one representative in the House, per the Constitution. This disproportionality – which was tied to the size of the smallest states, given that they were always entitled to one Congressman even if their populations fell below the average number of constituents per representative – was disconcerting to many, given that the upper chamber, the Senate, was disproportional by design, assigning two seats to each state regardless of population (or any other factor, for that matter). Perhaps fittingly, a Constitutional amendment was passed at approximately the same time that the size of the House was permanently fixed, allowing the direct election of Senators (as, previously, they had been chosen by the state legislatures). Questions of electoral fairness in the American system (which were certainly not limited to the legislature, given continued protest against the Electoral College used for Presidential elections) aside, the census also functioned increasingly as a gauge for public policy initiatives, given that it compiled, and then reported, sensitive socioeconomic information from the population. Over $100 billion of federal budget allocations were wholly dependent on the findings of the Census Bureau (through both the census proper and ancillary surveys), in concert with immeasurable state- and local-level financing and investments as well. Under the “Invest in America” policy promoted by the incoming President Glenn, the private sector would also become more involved in public allocations than ever before, and many large corporations were keenly interested in the metrics which were to be unveiled by the census; these would be extrapolated upon for future projections by demographers and statisticians in the employ of the government, but private research and development could also be conducted which might arrive at different conclusions from those reached by public agencies. Individual census responses, naturally, were kept confidential – dissociated from the names with which they were collected – for roughly the length of the average lifespan, to allay the many privacy concerns which were often raised in the face of such probing questions.

    tumblr_m8q4lsZaOJ1qlz9dno1_1280.png

    Map of the Congressional Apportionment created by the 20th US Census, effective January 3, 1983. (BLACK indicates a state which lost more seats ITTL than IOTL; RED indicates a state which lost the same number of seats as IOTL; ORANGE indicates a state which lost fewer seats ITTL than IOTL; GOLD indicates a state which lost no seats ITTL, but did IOTL; GREEN indicates a state which gained no seats ITTL, but did IOTL; TEAL indicates a state which gained fewer seats ITTL than IOTL; BLUE indicates a state which gained as many seats ITTL as IOTL. GREY indicates a state which neither gained nor lost seats, ITTL or IOTL.) [2]

    Several demographic trends which prevailed through the 1970s were reflected in the census. The Manufacturing Miracle was perhaps the most all-encompassing of these; the decline of heavy industry in the Northeast and Midwest had finally been mitigated, and this “recovery” held even despite the recurring periods of recession. This had a cumulative effect with the “Mini-Boom” of the early 1970s, resulting in states such as Pennsylvania, Illinois, Ohio, and Michigan gaining over a million new residents in the ensuing decade, with New Jersey, Indiana, and Missouri not far behind. Although all of these states either maintained the same level of Congressional representation they had achieved in 1970 or lost a single seat despite their robust growth, it was a much better fate for them than was postulated by projections from ten years before. After all, the overriding trend of the American population moving in a southwesterly direction continued through the 1970s: only two states east of the Mississippi gained seats in the House of Representatives, and both of them – Tennessee and Florida – were south of the Mason-Dixon line. However, both states also bordered Alabama and may have benefited from the Black Exodus, as it came to be known: African-Americans fleeing the state most tightly controlled by the American Party apparatus (despite the frequent intervention of federal troops to enforce the Civil Rights Act of 1964). Alabama was one of only two states to lose population from 1970, being reduced to six seats. The “promised lands” sought by former Alabamians included Memphis, Tennessee; Atlanta, Georgia; and the rural Mississippi Delta in the state of the same name. All three were already home to a substantial black population, and indeed many political scientists believed that the “Exodus” to the Magnolia State had given Charles Evers the edge he needed to win the Republican nomination for Senate in 1976, and then the election. African-American voters in Mississippi were also the most Republican in the nation, after those of Alabama; no doubt the Exodus of oppressed voters fleeing from a state whose leadership continued to identify as Democratic (much to the ire of the national party) also helped to explain this correlation.

    tumblr_mtih60jvtn1qlz9dno2_1280.png

    The population of all fifty states in 1980, plus the District of Columbia (not represented in Congress). [3]

    In contrast to the Black Exodus was the continued White Flight from urban centres, particularly New York City, although in that case whites were certainly not alone in leaving that city in droves. New York City lost over a million people through the course of the 1970s – despite the Mini-Boom, and despite continuing high immigration – falling below a population of seven million for the first time since the Great Depression. New York was the only state other than Alabama to lose population in the 1980 census, though the population of the “upstate” region remained steady from 1970. This was largely because the city itself sat at the heart of a “tri-state” area, in close proximity to the highly affluent state of Connecticut and the industrial state of New Jersey. Other emigrants from New York, naturally, moved west to California, or south to Florida, though movement patterns did not show an exceptionally strong trend in any one direction. It didn’t help that California and Florida were both undergoing their own identity crises in the 1970s: the former was associated with the Briggs Initiative, and the latter with the success of the American Party, particularly in the northern parts of the state (far more culturally Southern, ironically enough, than the southern peninsular region). Internal migration was best reflected in the movement of the centre of population for the United States, which had been drifting ever-westward since the census of 1790 and, more recently, in a southerly direction as well. It was no coincidence that the fastest growing states were in the southwestern region of the country. Many demographers had predicted that the centre of population would finally traverse the Mississippi River in 1980, but these projections were premature; the centre moved to a point just southwest of the village of Valmeyer, in Monroe County, Illinois, near the unincorporated community of Harrisonville; the marker itself was placed several hundred yards east of the river. This followed two consecutive censuses (in 1960 and 1970) which placed the marker, for the first time in American history, in the same county: Clinton County, Illinois (though it moved from the far east to the extreme southwest thereof, in the latter case just across the road from two other counties) – which had acquired the nickname of “the Central County” as a result.

    tumblr_mtht3wuTLD1qlz9dno1_r1_1280.png

    Progression of the mean centre of population for the United States of America with each census; “from the Mason-Dixon to the Mississippi”. [4]

    So many changes had taken place over the course of the 1970s, and in so many ways that were different from what had been expected, that it was fascinating to see the 1980 census reflect a snapshot of a certain time and place in American history – in many ways, the “balance sheet” of the national well-being. This was also relevant to business, for what was census-taking if not a form of accounting? It was also the primary tool which the Glenn administration would use to implement its grand strategy…

    ---

    [1] The OTL infobox from which this image is derived can be found here.
    Note that the population ITTL is about five million higher than IOTL (231,582,000 to 226,545,805) which translates to a difference in growth rate of approximately 2.5%; this is primarily due to the Mini-Boom of the early 1970s.

    [2] IOTL, Alabama had 7 seats, California had 45 seats, Florida had 19 seats, Illinois had 22 seats, Indiana had 10 seats, Michigan had 18 seats, Missouri had 9 seats, New Jersey had 14 seats, New Mexico had 3 seats, New York had 34 seats, Ohio had 21 seats, Pennsylvania had 23 seats, and Texas had 27 seats. All other totals were the same as ITTL. Note, of course, that even with the changes, there are still a total of 435 seats, due to the absolute number imposed by Congress; there is no “overhang”.

    [3] The OTL chart from which this image is derived can be found here.

    [4] This map is historical to 1960; the centre of population in 1970 was IOTL several hundred yards to the west, just across the county line in St. Clair County, Illinois. In 1980, the centre of population was in southwest Jefferson County, Missouri, well across the Mississippi. The map is modeled after an OTL map which can be found here.

    ---

    Thus concludes the 1980-81 cycle, with a look at a “snapshot” of the United States of America in 1980! I hope you all enjoyed the visual aids; this was definitely an update which I felt would be greatly enhanced by them. Thank you also, as always, for your continued patience and understanding in awaiting new updates!

    (As an additional note, please feel free to indicate if you find the colour-coding of the Congressional Apportionment map unclear. An alternate scheme has been devised.)
     
    1981-82: You Can't Go Home Again
  • You Can’t Go Home Again (1981-82)

    LUCY.
    Hello, Herbie
    Well hello, Brandie
    It’s so nice to be back home at Desilu

    You’re looking swell, Donny
    I can tell, Bobby
    You’re still whining
    You’re still crying
    You’re still staying true.

    I feel the room swaying
    For the tube’s playing
    One of my old favourite shows from way back when, so
    Speed it up, fellas
    Pour me another cup, fellas
    Lucy’ll never go away again

    CHORUS.
    Hello, Lucy
    Well hello, Lucy
    It’s so nice to have you back at Desilu

    You’re looking swell, Lucy
    We can tell, Lucy
    You’re still whining
    You’re still crying
    You’re still staying true.

    We feel the room swaying
    For the tube’s playing
    One of your old favourite shows from way back when, so

    LUCY.
    Stuff those eggs, fellas
    Stomp those grapes to the dregs, fellas

    CHORUS.
    Promise you’ll never go away again!

    LUCY.
    I went away from my home at Desilu
    To spend some time at CBS
    But now that I’m back in my home at Desilu
    Tomorrow will be funnier than the good old days

    CHORUS.
    Those good old days!

    [beat]

    Hello (well, hello), Lucy
    Well hello (hey, look! There’s), Lucy

    LUCY.
    Let’s make a new start
    My heart is all aglow

    CHORUS.
    She’s all aglow!

    LUCY.
    You’re lookin’ great, Gary
    Lose some weight, Gary?
    Maybe now Ricky will finally let me in the show

    CHORUS.
    I hear the theme playing
    Title card saying
    And I still get great ratings in the Nielsens, so

    LUCY.
    Where’s the Queen, fellas?
    I’m craving some sardines, fellas

    CHORUS.
    Lucy’ll never go away again!

    [pause for dance number]

    Well, well, hello, Lucy
    Well, hello, Lucy
    It’s so nice to have you back at Desilu

    You’re looking swell, Lucy!
    We can tell, Lucy!
    You’re still whining
    You’re still crying
    You’re still staying true.

    ALL.
    I hear the theme playing
    Title card saying
    And I still get great ratings in the Nielsens, so

    LUCY.
    Wow-oh-wow, fellas!
    Look at the old girl now, fellas!

    ALL.
    Lucy’ll never go away again!

    – The lyrics to “Hello, Lucy!” (set to the tune of “Hello, Dolly!”), as performed by Carol Channing at Desilu Cahuenga, on the evening of October 16, 1981 [1]

    It had been thirty years since I Love Lucy had first aired on television, ushering in what cultural historians reckoned as the period of Classic Television – an era which had itself been dead and buried for a decade by 1981. But retro nostalgia continued to reign, and people looked back fondly at the 1950s, though naturally the focus of these recollections varied widely depending on who was doing the remembering. But the networks had to remain focused on the present. CBS, which remained mired in last place, had announced a dramatic shakeup in their line positions in desperate hopes of reclaiming lost ground. Grant Tinker, one-time programming executive for the rival NBC, and longtime head of Paramount Television, had accepted the position of network President, giving CBS strong, decisive leadership for the first time since the departure of Fred Silverman for ABC several years earlier. [2] Tinker immediately approached Desilu, with which his network enjoyed a right-of-first-refusal agreement, in hopes of bringing television into the 1980s.

    But contrary to its public image, Desilu was increasingly facing an internal malaise, a fear that the studio might be dwelling too heavily in the past. Much of the studio’s energy in recent years had been focused on the re-acquisition of I Love Lucy, and the production of the 25th, and then the 30th, anniversary specials, though these had all been very successful. [3] Each show on their current production roster was a throwback: Rock Around the Clock, which remained the flagship program offered by the studio even after several seasons, focused entirely on events taking place two decades in the past; Three’s Company had been sold as an “I Love Lucy for the Seventies”, which didn’t seem as timely with Annie Glenn redecorating the White House; Eunice chronicled the lives of people imprisoned by their resentment over past events; Deep Space seemed uncertain as to whether it wanted to revive the Western or re-bottle the lightning that was Star Trek. And even that venerable, groundbreaking series, which had helped to establish “The House that Paladin Built” as the most progressive and innovative studio in Hollywood, even amidst the backdrop of the “new freedom of the screen” in the cinema, had gone off the air an entire decade before. Well, out of first-run… that it was still ubiquitous in syndication cast a long shadow over Desilu’s efforts to move beyond its legacy.

    It didn’t help that Lucille Ball turned 70 in 1981 [4] – she was well past retirement age, though she showed no signs of slowing down. Already she had outlasted every other studio chief in Hollywood – though, granted, given that the years since 1962 had been exceptionally tumultuous for the industry, that was perhaps not the accomplishment it might have otherwise been. But the Baby Boomers were nothing if not fickle and antiestablishmentarian, and though they may have been entertained by Lucy as tots and by Captain Kirk as teenagers, they were far less willing to die on the hill for Mash or Janice. Rumblings within the industry were that the once-brilliant entertainer and savvy businesswoman was losing her touch. It didn’t help that the longtime ace up her sleeve, Herbert F. Solow, was on the wrong side of fifty years old. Fortunately, she still had her wildcard: Brandon Tartikoff. Barely in his thirties, he considered Deep Space mere target practice for the other ideas he had in mind.

    Gene Roddenberry, for his part, had a massive chip on his shoulder. [5] He was the creator of Star Trek, but Desilu owned the show outright and it alone profited (quite substantially) from its continued success. Roddenberry may have had offices at the Gower studio since 1965, but he increasingly resented the marginalization of his contributions to that series – particularly when juxtaposed against those of his longtime collaborator, Gene L. Coon, whom he snidely came to refer to as “Gene the Martyr”; in the opinions of many, including those at Desilu, Coon (who had been the effective showrunner since the middle of the first season) had done more to define the tone and style of Star Trek than anyone else. Naturally, since Coon’s name was also attached to The Questor Tapes (against Roddenberry’s will, once again, by higher-ups at Desilu), this created a halo effect: the show, while popular and not without its fans, was widely considered the creative inferior of Star Trek, and surely, many reasoned, it would have been better if only Coon were still alive to executive produce, as opposed to Roddenberry (who in fact took a more hands-on approach with that later series in hopes of restoring his previous reputation, only for it to backfire). [6] It was partly for this reason that Roddenberry had returned to space opera with Deep Space, but he frequently clashed with the Executive in Charge of Production, Brandon Tartikoff, regarding the creative direction for the show. Roddenberry wanted to exercise the clout he felt was his due and make the show he wanted to make – a repudiation of the many compromises that shaped both Star Trek and, later, Questor. Marcia Lucas, who edited Deep Space given her experience with Journey of the Force, was heard to remark “This auteur theory bug is becoming a real epidemic”, upon reading one of the veritable mountains of notes that Roddenberry would send along with the footage. [7] But he was fighting a losing battle; Deep Space was attracting a cult audience (as most science-fiction programs did) but hardly the ratings numbers of even Questor, let alone Star Trek. The show fell out of the Top 30 in its second season, and was only barely renewed for a third. Desilu brass were firmly behind Tartikoff in any and all of the power struggles between him and Roddenberry, and it didn’t help that the wunderkind had bigger fish to fry anyway.

    Tartikoff was becoming increasingly convinced that the way to bring Desilu out of its rut was to reinvent the wheel – or, rather, a moribund genre. The venerable old police procedural seemed as good a choice as any – Hawaii Five-O had ended in 1980, after a twelve-season run, which made it the longest-running crime drama in television history. In his mind, the genre had been refined as far as its existing conventions would allow, so it was time to break the mould. And Tartikoff found himself with a surprising new ally at the network to which he was first obliged to bring his ideas – CBS – in the form of Grant Tinker. Since the network also had additional cash flows coming in from Desilu through the buyout of I Love Lucy, Tinker cannily suggested a “reinvestment plan” – that money would effectively remain at Desilu to be spent on the production of a new show for CBS. Tinker liked the “whole new cop show” idea and the two jointly decided on Stephen J. Cannell, a veteran crime show writer who nonetheless had a certain rough edge to him, as the ideal creative mind behind the show. [8] He was commissioned by Desilu to write the pilot, under the instruction to “break all the rules” – and so he did, turning in a draft script that impressed Tartikoff and his superior, Herb Solow, not to mention Tinker. Robert Butler, already known for directing pilots (including that of Star Trek in 1964), was chosen to helm Hill Avenue Beat – so named for Hill Avenue in Pasadena, where Cannell lived, as well as punning off “Park Avenue Beat”, the theme song to Perry Mason. [9]

    In directing the pilot for Hill Avenue Beat, Butler was inspired by cinéma vérité filmmaking styles, particularly police documentaries, choosing to implement “guerrilla filmmaking” tactics such as the use of handheld cameras, eschewing detailed composition and even, during high-energy sequences, consistent focus. The higher-ups at Desilu permitted these audacious departures from conventional cinematographic techniques so long as the show was shot on film as opposed to videotape; Butler would have preferred the grittiness of video, but every show ever produced under the Desilu name was filmed as opposed to taped, per an edict imposed by Lucille Ball herself. Despite this, the studio was so singularly impressed with Butler’s contributions to the show that he was granted greater recognition; though a producer credit could not be arranged (because of labyrinthine WGA regulations, only writers were usually credited as producers), Butler was given prominent billing in most advertising for the series, particularly in the trade papers. [10] He was quickly signed to direct every episode of the series, a vanishingly rare arrangement for an American television program.

    There was every indication of shakeup at the other networks, as well. Richard Pryor still had a year left on his contract with NBC, ensuring that his formerly-unstoppable juggernaut, The Richard Pryor Show, would continue, despite the tragic loss of breakout star Robin Williams. Pryor knew better than to try breaking his contract – he agreed to continue with his show, so long as he was given greater creative control. NBC made a counter-offer: Pryor’s leash would be loosened if he agreed to add “a new Robin” to his cast. It was a demand as inevitable as it was insane; even those who accepted the notion that Robin Williams was solely, or at least overwhelmingly, responsible for the success of Pryor, were forced to acknowledge that such success had carried profound risks – which had indeed come to pass. But television executives were about as well known for their foresight as they were for their tact, and so a friend of Williams himself, John Belushi (formerly of the Second City in Chicago), joined the cast of Richard Pryor for its final season. [11] For NBC, it was ideal synergy: their Saturday night offering, SCTV, was staffed entirely by Second City alumni. Belushi, like Williams, was known as an off-set party animal; in happier days, the two were often spotted together at the infamous Medina nightclub (which had, perhaps fittingly, become a dead zone since the Williams overdose).

    Also like Williams, Belushi was anarchic, though more physical than verbal in his comedy, and sketches were written to take advantage of this difference in style. But it was no surprise that Belushi could not compete with a ghost, and he became the scapegoat for any and all changes which took place on the show, even though much of the shift in tone was the doing of Pryor himself, who was hoping to push the envelope to its limits while he still had his weekly platform to do so. One observer who was bemused by the situation was George Carlin, already branded “the man who killed the variety show” (to which his inevitable retort was “euthanasia”), who was rumoured to have attempted to dissuade Pryor from “cribbing from my failed variety show to try and screw your own over” [12]. Perhaps his own personal experience had allowed Carlin to presage critical and audience reactions to the fifth, and final, season of Pryor – reviews were vicious, and ratings sank like a stone. The show that had been the bedrock of the NBC primetime lineup for three seasons fell out of the Top 10, and the exception that had seemed to prove the rule – the increasing consensus that variety was dead – itself died an ignoble death, with NBC issuing a press release in March of 1982 announcing that the network’s professional relationship with Richard Pryor had come to an end. The comedian, for his part, decided to pursue a film career, even cracking jokes to that effect in the series finale: “Maybe I can play Superman” was the last line of his episode-ending monologue. [13]

    But even though NBC had lost a cornerstone of their lineup, they were still treading water compared to CBS. Grant Tinker was doing his best to get the network back on its feet, but he faced red tape at every turn – William Paley, the founder of CBS, continued to have a substantial controlling interest in his baby, and – despite recognizing the need for change – was not without a considerable ego. The CBS daytime lineup was performing strongly – from sunrise to sunset, they were only consistently outperformed by the Today Show on NBC – Baba Wawa may have seen her career stall after being scapegoated as “the other woman” (while, by contrast, her fellow adulterer Sen. Edward Brooke had not only comfortably won reelection in 1978 but also had the cachet to briefly mount a challenge to then-incumbent President Reagan from the left in 1980), but she remained ensconced with early morning audiences. By contrast, CBS so dominated the evening news race with their anchor, Walter Cronkite, that upon hearing of the CBS mandatory retirement policy which would have put Cronkite out of work by his 65th birthday (in late 1981), Tinker immediately rescinded it, reasoning that getting rid of Cronkite would be “tossing the last lifebuoy off a sinking ship”. [14] Cronkite, who enjoyed incredible approval and trust ratings with audiences, was very grateful; less so was his heir-apparent, Washington correspondent Roger Mudd, who resigned from CBS to accept the news anchor position at NBC (replacing the retiring John Chancellor, whose newscast had fallen to third place behind both Cronkite and Max Robinson at ABC). [15] Late night was more of a mixed bag. Although Johnny Carson remained solidly in the lead, Merv Griffin on CBS was a clear second, well ahead of Dick Cavett on ABC – in fact, the Alphabet network chose not to renew his contract, which was set to expire at the end of the 1981-82 season. [16] As far as Tinker was concerned, that was all the more reason to keep Merv Griffin at CBS. More personable and friendly than the notoriously reclusive Carson, Griffin was generally regarded as the superior interviewer (and Carson, naturally, as the finer comedian).

    As the season came to an end, it finally seemed that CBS had the potential to come roaring back and dominate the ratings as it had not done since the early 1970s. But neither NBC nor ABC were willing to cede the gains they had made in the intervening years without a fight. Still, ABC fell back below the majority line, with only fourteen shows in the Top 30, though they dominated the Top 10 with eight shows on the list; one of these was Texas, which repeated as the #1 show on the air. NBC managed only one Top 10 entry, though they maintained their proportionally respectable presence of ten shows in the Top 30. The remaining six, including Hill Avenue Beat, aired on CBS, which saw 60 Minutes as their lone entry in the Top 10, a welcome extension of their sterling news division’s reputation into primetime. [17]

    Outstanding Drama Series was awarded to Hill Avenue Beat, in an example of the Emmy Awards attempting to cultivate creativity and innovation by rewarding pioneers – as they had done for Those Were the Days a decade before. Robert Butler won the Emmy for Outstanding Directing in a Drama Series, just one of a whopping nine awards won by the series at the ceremony on September 12, 1982 – a new record. [18] It also became the first and only series to win the “Big Seven” Emmys – for series, directing, writing, Lead and Supporting Actor, and Lead and Supporting Actress. Outstanding Comedy Series was awarded to Captain Miller, for its final season – creator Danny Arnold had decided that the storytelling possibilities had been exhausted and felt no need to continue. Its star, Hal Linden, won his second trophy for Outstanding Lead Actor in a Comedy Series, with his joke, “I’m just so glad nobody from Hill Avenue Beat was eligible in this category”, getting the biggest laughs of the night. [19] It was a moment of levity that served as a distraction from the quiet vindication enjoyed by the contingents from both Desilu and CBS – that those left for dead should never be underestimated.

    October 16, 1981

    As waves of applause filled the makeshift “auditorium” (actually a re-purposed soundstage at the Cahuenga lot), Carol Channing took a bow with her fellow performers. She beckoned to Lucille Ball, sitting in the front row with most of her fellow Desilu brass (who were name-checked in the song), to come down to the stage, which she did.

    “Carol, that was so beautiful,” she said as the two embraced. “Thank you so much for doing this.”

    “Honey, all you have to thank me for is remembering all these new lines. I’ve done the old ones so many times now I was sure I would slip up.”

    “You? Not a chance. You’re a consummate professional.”

    “Coming from you, sweetie, that means a lot.”

    “Well, hey, I run a television studio – I can appreciate having to learn so many new lines in such a short time. Besides, back when you did the movie you didn’t have to learn any new lines, so consider this due compensation.” At this, Ball let out a throaty laugh.

    Channing guffawed in return. “No, but Gene Kelly sure made me learn enough new dance numbers to make up for it!” [20]

    “But you were wonderful!”

    “Thank you, sweetie, you’re too kind. Especially since tonight is about you!”

    “No, tonight is about Lucy – there’s a big difference.”

    At this, Channing nodded knowingly. “Do you ever miss playing her?”

    “Sometimes. But I think that time in my life has passed. Only trouble is I’m not sure when this time in my life will pass.”

    “Well, honey, you just let me know when you find out.”

    “Don’t worry, I plan on letting everybody know.”

    ---

    [1] Carol Channing has played the role of Dolly Gallagher Levi since originating the part on the Broadway stage in 1964, and for an idea of how her performance here would have looked and sounded, here are two videos for viewer (and listener) reference: this command performance at the White House in the mid-1960s, during the original Broadway run (with the sound dubbed over from the original cast recording); and a rendition at the Theatre Royal, Drury Lane, for the Royal Variety Performance 1979 (much more contemporary with her performance in the update); imagine Channing’s performance from the later rendition (though with her hair dyed red, obviously) combined with the staging of the more intimate (some might say cramped) White House setting.

    [2] Tinker ended his active management role at the company he owned, MTM Enterprises, at about the same time IOTL, though instead he returned to NBC (which was also the third-place network at the time) in hopes of rebuilding its fortune. To that end, he hired wunderkind programming executive… Brandon Tartikoff.

    [3] The 30th Anniversary Special of I Love Lucy aired on October 15, 1981 ITTL, finishing as the #1 televised event of the night (and of the week), though in the opinions of most critics it was inferior to the 25th Anniversary special (and would lead Ball to refuse any further specials until the 50th, which would not come until 2001).

    [4] That Wacky Redhead was born on August 6, 1911 – though she pretended for many years to have been born in 1914.

    [5] Gene Roddenberry not acquiring a chip on his shoulder – in any timeline – would simply be ASB.

    [6] This is consistent with Roddenberry’s OTL tenure as showrunner for another series which he attempted to frame as a successor to Star Trek (albeit with considerably more cooperation from Paramount than Desilu is giving him for Deep Space), which also showed him to be well past his glory days. Note, of course, that by this time IOTL he had already crashed and burned with Star Trek: The Motion Picture.

    [7] Marcia Lucas, like most film editors, was not a fan of the auteur theory. However, her husband George, like most directors, was (and is) a fan.

    [8] IOTL, it was Steven Bochco, not Stephen J. Cannell, who was given this writing assignment. Those of you who are familiar with the bodies of work of these two esteemed writers may therefore be able to divine some of the differences that Hill Avenue Beat will have from Hill Street Blues ITTL.

    [9] Hill Street Blues was IOTL set in an unnamed American city, usually implied to be Midwestern and heavily based on Chicago. ITTL, as Cannell is based in Pasadena and visits Pasadena stations in his research for the show, he decides to base it in that Southland city (which does indeed contain a Hill Avenue, though I’m not aware if there’s a precinct station there). Also, IOTL, the series first aired midseason in 1980-81; ITTL it isn’t ready until the late spring of 1981 and has to be picked up for the autumn.

    [10] Butler left Hill Street Blues after the first five episodes (including the pilot) IOTL, entirely because he felt he wasn’t receiving due credit for his contributions to the show. As noted, Butler is perhaps the finest (and certainly the most accomplished) pilot director in American television, and committing fully to a series is something of an aberration for him anyway (and I wonder if that might have been a factor).

    [11] Williams and Belushi’s close friendship is per OTL – Belushi’s devastating death by speedball overdose (very similar to Williams’ death ITTL) has been cited by Williams as a major factor in his own decision to quit cocaine and alcohol. Whether Belushi will learn from Williams in the same fashion ITTL is an entirely different question.

    [12] I’m paraphrasing. Those of you familiar with Carlin’s language can probably guess what he really said.

    [13] Yes, an obvious reference to Pryor having appeared in 1983’s Superman III IOTL, though he did not play Superman but his wacky (and sadly, not Wacky) sidekick; this kickstarted his film career, for better and (mostly) for worse.

    [14] Cronkite was forced into retirement IOTL, and was replaced by Dan Rather, whose rise to prominence has been butterflied ITTL; the ratings and prestige once held by the CBS Evening News steadily declined throughout the quarter-century that Rather occupied the anchor desk – to this day the newscast remains in third place.

    [15] Mudd also departed for NBC News after having been usurped by Rather IOTL; he was one of several interim successors to John Chancellor before Tom Brokaw took the position. Likewise, Max Robinson (the first African-American to host a nightly newscast) was one of several rotating successors to Howard K. Smith prior to Peter Jennings getting the job outright. ITTL, however, it is Robinson who emerges triumphant instead.

    [16] ABC dumped Cavett in 1975 IOTL; by 1977 none of the three private networks wanted anything to do with him and he moved to PBS, where his show also ended in 1982. During this time, Griffin’s show was still airing in first-run syndication, and would until 1986.

    [17] IOTL, ABC had fourteen shows in the Top 30 and three in the Top 10; NBC had just four shows in the Top 30 (and none in the Top 10), and CBS had twelve shows in the Top 30, but seven in the Top 10, including the #1 series on the air, Dallas. (60 Minutes was #2). Hill Street Blues, which aired on NBC, finished at #27 in 1981-82.

    [18] As previously noted, Hill Street Blues was first in contention IOTL during the 33rd Primetime Emmy Awards of 1980-81, in which it won “only” eight Emmys for its inaugural season, losing Outstanding Supporting Actress to Nancy Marchand for Lou Grant. (Marchand won in both 1980-81 and 1981-82, which constitute two of her four wins in this category). Barney Miller did indeed win Outstanding Comedy Series for its final season IOTL, though Alan Alda won Outstanding Lead Actor in a Comedy Series for M*A*S*H.

    [19] These sorts of jokes abound whenever one individual body of work sweeps an awards show, so I felt it was only right that I include one of them here.

    [20] In the 1969 film version of Hello, Dolly! IOTL, the role of Dolly Gallagher Levi, originated on the stage by Channing, was instead played by Barbra Streisand. ITTL, however, for several reasons (including Streisand’s failure to win an Oscar for Funny Girl the year before), Channing (a finalist for the part IOTL) was cast instead, and the film went on to modest success partly because Channing was far less of a diva than Streisand and did not interfere with the production as Streisand had.

    ---

    Happy Halloween, everyone! I (just barely) managed to honour my informal covenant of posting an update on or before the end of the month, and I can’t thank you all enough for your seemingly infinite patience in awaiting this update. Welcome to the beginning of the end, sadly – we’re in the home stretch, or what Frank Sinatra (or William Shatner) might refer to as “the autumn of the year”. (Sorry – that song has been stuck in my head ever since I recalled that “dregs” rhymes with “eggs”.)
     
    Appendix A, Part IX: Star Trek Deuterocanonicity
  • Appendix A, Part IX: Star Trek Deuterocanonicity

    For better or for worse, the broadcast of The Next Voyage had changed everything about the ardent Star Trek fandom, which devoted immeasurable time and energy to dissecting its ramifications. The years before had been nebulous, an era of unfettered creativity and unrestrained imagination. There was the television series, and as far as just about everyone was concerned, that was all she wrote; the few other “official” media attached to Star Trek were well-constructed trifles at best (the Gold Key comic), and tasteless irrelevancies at worst (the advertising fluff for some of the more… questionable action-figure releases, for example). This was the atmosphere that incubated the most notorious subculture of the Trekkie fandom, the Puritans, who made themselves known starting in 1978, once Desilu had breached the “integrity” of the so-called gentlemen’s agreement in place between the studio and the fandom. Most Trekkies were more sanguine about both The Next Voyage and Desilu as the caretaker of Star Trek [1] – and they would react to changes in the marketing of supplementary material across a far wider spectrum than the categorical rejection employed by the Puritans.

    But one aspect of Star Trek continuity which demanded consensus was canon. Desilu was surprisingly mum on the subject; as far as most at the studio were concerned, anything produced or licensed by Desilu and bearing the Star Trek name was exactly what it purported to be: Star Trek. David Gerrold, always the “Studio Ambassador to the Trekkies”, acknowledged that this was a problem – even the show itself had difficulty maintaining internal consistency (despite having an impressively coherent continuity for its medium, by the standards of its time), let alone the ancillary material, much of which was contradictory. But Gerrold, having been in charge of the comic since 1971, was hardly a neutral arbiter, and in the ad hoc system which he devised in the mid-1970s, the comic was unsurprisingly assigned greater canonicity than any other Star Trek media bar the series proper; the official short-story episode adaptations, written by James Blish, occupied the next step down, and this became an early point of contention.

    The Star Trek comic, during the “classic” Gerrold era (from 1971 to 1978), published mostly “original” stories set during the course of the five-year mission, but occasionally, when the well of ideas ran dry, Gerrold would resort to adapting episodes of the series proper into comic form. [2] “The Trouble with Tribbles”, which he himself had written, was the first episode to get this treatment, in 1973; as the comic wore on, this easy out to avoid writing original stories (which, granted, was entirely predictable given that Gerrold had been part of the writing staff that had exhausted viable story ideas in the five-year mission setting in 1970-71) was employed more and more often as a means of getting the issue out on time. In the year leading up to the airdate of The Next Voyage (co-written and produced by Gerrold), more than half of the published issues were adaptations of episodes – in all, over two dozen episodes (all written, co-written, or heavily edited by Gerrold) would see print. But Gerrold was not the first to adapt the television series for the page – nor were his efforts the most comprehensive. Both of those distinctions were held by James Blish, a science-fiction author of some renown.

    Blish had been commissioned in 1966 by Desilu to adapt several of the show’s earliest batch of scripts into short-story format; a collection of these formed the first of many books published under the Star Trek marque, in early 1967. In all, Blish would publish thirteen short-story collections, the last of which was released posthumously in late 1975. [3] Of the 133 episodes in the Star Trek syndication package (counting “These Were the Voyages” as a single episode, along with “The Menagerie”, as it – or rather, “The Cage” – was adapted as a single story), Blish adapted 89, with several notable omissions – none of the Harry Mudd stories were included (as Blish had attempted to release all five in a single volume, along with an original framing story), nor was “The Borderland” (as Niven had expressed an interest in adapting the story on his own), nor “These Were the Voyages” (which Blish had quite logically decided to save for last). In fact, very few fifth-season episodes were among the “Classic 89” adaptations. [4] Quite a number of episodes had not been adapted by either Gerrold or Blish; the problem, however, lay with the overlapping adaptations. By any objective measure, Gerrold’s adaptations had the better claim to canonicity: every doubly adapted work was one which Gerrold himself had played a large part in writing, he had worked on the show as a producer, and he had based the comics on finished episodes, whereas Blish worked from shooting scripts. However, the hypothetical question of whether the comics would trump the short stories if they ever adapted episodes not written by Gerrold continued to be raised, though it was never fully resolved.

    The five-year mission was officially declared over by Desilu (at Gerrold’s behest) after The Next Voyage had aired; the “classic era” of the Star Trek comic ended in the spring of 1978 with a partial adaptation of “These Were the Voyages”, though it focused far more on the aftermath of the episode, with the crew saying their final goodbyes and (some of them) leaving the Enterprise, seemingly for good; notably, the Excelsior was shown in full (looking as it did in the miniseries) and the Enterprise (having been badly damaged in the recent conflict) was sent in for repairs. Thus began the “Lost Years” era of the comic, which was split into two different lines: Enterprise and Excelsior. It depicted the five-year missions in between the end of the series proper (2170) and the miniseries (2176). [5] The Enterprise series, which starred Spock, was slower, more cerebral and character-based; the Excelsior series, which starred Kirk, was more action-oriented and artistically inclined. The decision to focus on the “Lost Years” was a way to maintain the edict imposed by Gene Roddenberry not to expand upon the established televised canon, one that Desilu itself never saw fit to contradict; most everyone at the studio believed that television was the property’s first, best destiny, and that ancillary media were a sideshow. David Gerrold was “promoted” to Editor-in-Chief Emeritus of the Star Trek line, having neither the time nor the inclination to directly oversee two separate issues per month. Both of the comics were well-received, however, and in completely different ways. Star Trek: Enterprise was praised as reviving the best cerebral, allegorical detective-story aspects of the series proper, and seemed to swing more for the fences in its approach to continuing storylines, including the depiction of the romance between Captain Spock and his CMO, Dr. Christine Chapel. Enterprise sold better than Star Trek: Excelsior, though that line had the advantage of more diverse characters, including aliens who could never have been depicted in live-action, such as the many-armed Lt. Arex, and the seductive felinoid Lt. M’Ress [6]. It was certainly more “edgy” in its content than Enterprise, taking full advantage of Gold Key Comics being one of the few publishers not to adhere to the Comics Code (which, to be fair, had followed suit with the MPAA and had been relaxed considerably in recent years). [7]

    Beyond the comic line, Star Trek books other than the Blish direct adaptations – which had been relatively few and far between prior to 1978 – also flourished. Unofficial Desilu policy was to encourage writers to focus on the “Lost Years”; manuscripts depicting the official five-year mission continued to be accepted (in fact, adaptations of original episodes resumed after 1978, though they were generally regarded as inferior to the Blish stories), as were stories from the Captain Pike era, and those depicting the earlier career of James T. Kirk (aboard the Republic, the Farragut, and during his years studying and teaching at Starfleet Academy). A precious few short stories (never novels) were able to evade the prohibition against outpacing the canon, though never by much – the farthest-flung vignette was “Logical Inaugural”, depicting the swearing-in of Federation President Sarek in early 2177 following his successful election. This story, published in a 1981 collection by prominent Star Trek writer Diane Duane, and obviously inspired by John Glenn’s successful campaign for the Presidency, “confirmed” what had already been widely suspected by that point in naming his sensible silver-haired predecessor as Lucille Carter. [8] The comic, meanwhile, occasionally featured President Carter in cameo appearances, always taking pains to avoid mentioning her first name – demonstrating the need for hierarchical canonicity. Duane herself suggested a modified version of Gerrold’s original hierarchy (fittingly, as, like D.C. Fontana before her, she had started out working as the assistant and secretary to “the boss” before emerging from his shadow): the series proper and the subsequent miniseries would come first, followed by the comics which were adaptations of the series, then by Blish’s novelizations, and then by the comics in general; all other books, which included “embellishments” by Blish that were explicitly contradicted in the comics, occupied the bottom rung of the canonicity ladder. Duane, who (like Gerrold) was very aware of the fandom, tried to leave room for “below the ladder” material – from as far back as the late-1960s, certain conventions and customs that had no direct onscreen evidence to support them had been widely accepted among fans, and this “fan canon”, or simply “fanon”, was used to “fill in your own blanks”, as Duane had put it in endorsing the practice. Her actions were well-intentioned, but at the same time they were rather akin to throwing a lit match atop a pile of explosives.

    Even though it had been less than seven years between the grand finale of the series proper (in July, 1971) and the broadcast of the miniseries (in February, 1978), the entrenched fandom which had existed since the 1960s had plenty of time in the interim to develop their own ideas about the universe in which Star Trek was set. And though the notorious Puritan subculture would not make themselves known as a distinct group until after the fallout from the miniseries divided the fandom, many accepted “principles”, for lack of a better term, emerged during this period which could be described as “proto-Puritan”. The most controversial issue was the nature of the Doctor Who crossover which had opened the fifth and final season of Star Trek; in a real-world context, it was purely a mercenary matter that served to introduce the concept of Doctor Who to American audiences in preparation for NBC importing the series the following season. David Gerrold had considered featuring another appearance by the Doctor in the Star Trek comic, but eventually decided against it; as a result, neither property ever formally referred to the other again. It didn’t help that Star Trek, ubiquitous in syndication through the 1970s, did not include in its package of episodes the two-part crossover, leaving an “out-of-sight, out-of-mind” impression on many viewers.

    In the United States, no small number of Trekkies were somewhere between dismissive of and hostile to Doctor Who, which was being hyped as the successor to Star Trek, and was naturally deemed to be unworthy of such a legacy; it didn’t help that Doctor Who, unlike Star Trek, faded quickly after riding a brief “fad” period (personified by Linda Johnson, the popular American companion who lasted only two seasons). The much smaller cadre of American Who devotees (who, by analogy with Trekkies, became known as “Whovians”, though this term was not used elsewhere) [9] could not possibly counteract the more mainstream opinion held within the much larger Trekkie fandom. In the United Kingdom, on the other hand, Star Trek was perceived as the upstart attempting to piggyback off the success of the older, more established Doctor Who, and the overlap between both fandoms was much larger – a majority of Trekkies were also fans of Doctor Who, and indeed many Doctor Who fans were also fans of Star Trek, even though the initial crossover between the two would come to be perceived as the “original sin” which would result in the wretched excesses of the Yank Years. [10] For this reason (along with the more diffuse nature of “reality” in Doctor Who in general), the continuity of Star Trek was commonly considered fully intertwined with (if not subsumed within) that of Doctor Who – as a potential future from the vantage point of the UNIT years – and references to the past in Star Trek were likewise to be considered part of the same timeline. This notion held sway only within the fandom, and no further; the writers would never lock themselves into being forced to send the Doctor to the 1990s to fight Khan Noonien Singh in the Eugenics Wars, however tantalizing the idea might have proven to certain contingents within the fanbase – at least, the British fanbase.

    Across the pond, many American Trekkies shuddered at the very thought; from their discontent arose an alternative theory, which was at once more ambitious and had much farther-reaching repercussions for the concepts of fandom and continuity: the parallelism theory. [11] This held that all fictional universes (later extended to all universes, fictional or otherwise, for hypothetical purposes) each formed a single, distinct reality which operated in constant parallel to every other universe, all moving (by default) forward in time at a constant speed. If any two (or more) universes were to cross over, this would represent an intersection of the two parallel universes, creating a single, merged reality at that point in time (for the duration of the crossover). When the crossover came to an end, the universes would again diverge and resume their parallel course, ending the merged reality and once again establishing two separate realities. The implications that followed in terms of canonicity were that Star Trek and Doctor Who formed a shared reality only for the duration of the crossover, and never before nor after. The concept of a unified canon comprising both properties was therefore in contradiction of the parallelism theory of canonicity (which, granted, had been developed largely in response to the crossover and was therefore built around invalidating its implications). One more ideologically neutral advantage to the parallelism theory, however, was that it helped to neatly reconcile the parallel reality which had been featured in the episode “Mirror, Mirror”; the universe which contained the Terran Empire was just as different as the one which contained the Doctor, despite the much stronger superficial similarities, although in this case the intersection took a different form (a “transposition”, which entailed solely characters crossing over as opposed to the settings themselves coming together). [12] Again, the reality of the “mirror” universe was only valid within the context of Star Trek for as long as the characters were transposed. Prior and subsequent events within that universe, according to the parallelism theory, had no significance within the canon.

    The complexity of the parallelism principle was, at first, a limiting factor in its gaining acceptance, but it acquired widespread currency upon being distilled into the maxim “What happens in crossover, stays in crossover”. [13] The opposing viewpoint, which held that Star Trek and Doctor Who were part of a single shared universe and separated only by the passage of time, came to be known as the concordance theory, though many “concordant” fans refused to dignify what they saw as the a priori viewpoint with a term which might imply that their views deviated from the norm. [14] But the parallelism view came to be accepted far beyond the Trekkie fandom, as it helped to address the crossovers (and visits to alternate realities) that were already commonplace in comics and animation, as well as between various universes which had fallen into the public domain. The overwhelming acceptance of parallelism within the Trekkie fandom made for strange bedfellows against the backdrop of the greater conflict that emerged as a result of The Next Voyage in 1978: the ideological divide was so intense that it came to be known as the “Star Trek Wars”. [15] Given that they could be traced to a single precipitating event that drove a wedge between the formerly (if nominally) united Trekkie fandom, Trekkies themselves often described it as the “Great Schism”. The conflict was waged between the mainstream (which, unlike the concordants, was sufficiently large to not require an identifying label) and the Puritan faction – perhaps the largest, probably the most notorious, and certainly the most vocal subculture in the fandom. The Puritanical view of canon was simple, and arbitrary only in that it brooked no compromise: the series proper comprised the entire canon. Naturally, this involved showing a special affinity to those personages who had been involved with only the series and nothing else: Gene Roddenberry, the “Great Bird of the Galaxy” and, extending the “Puritan” metaphor, the John Calvin; and Gene L. Coon, likewise the John Knox. [16] Many Puritans did not care for upstarts like David Gerrold (who, to be fair, considered himself no supporter of the Puritan mentality either, having coined the name for their mindset in derision), though the obvious question of whether they might have been more amenable to The Next Voyage if Gene L. Coon had lived to write it was one of the great hypotheticals of the fandom. [17]

    The Puritans were so vociferous in their attempts to invalidate what was canon that, to paraphrase Captain Kirk himself, it was easier to apply “reverse logic” and agree on what was not canon. Fanon was recognized as merely convenient for use as a storytelling device, as opposed to meaningful information. This was to say nothing of certain genres or devices which were not widely accepted, such as the infamous “slash” fandom. Perhaps it was the slashers who took the plot developments of The Next Voyage the worst – Spock, one-half of their sacred couple, had been wedded to Nurse Chapel, whom they had long dismissed as utterly unworthy of the Vulcan’s affections. As a result, they found themselves in agreement with the Puritan stance on canonicity despite there being no love lost between the two factions otherwise. The slashers were, in essence, revisionists – viewing the interactions between the Captain and his First Officer through the lens of a “subtext” which most other Trekkies apparently did not notice. The slashers were prolific producers of fan fiction and fan art, and contributed heavily to the fanon; their interpretation of the Vulcan pon farr and its implications was a particular point of pride for the Kirk/Spock community. However, the marriage of Spock and Chapel, and the birth of their son Selek, were implied to have resulted from Spock entering pon farr on the occasion subsequent to his doing so during the events of “Amok Time” – that is, seven years later. D.C. Fontana, who had written the Spock/Chapel subplot, confirmed this during a convention appearance in 1978, saying “we double-checked the math on that one”. [18] Fontana never revealed whether this was written in response to the common slash fiction storyline of Spock entering pon farr and seeking solace from Kirk (as the writing staff delicately avoided acknowledging fan fiction in order to maintain plausible deniability of its existence), but the slashers very vocally reacted as if she had. [19] Stories of Nurse Chapel figurines being burned in effigy abounded, though these were likely apocryphal (the earliest mention was in a parody article from a 1979 fanzine). However, after the miniseries aired it did become a pastime of many slashfic writers to find novel ways to… “dispose” of “Mrs. Spock”, clearing the way for Commodore Kirk and Ambassador Spock to come together romantically.

    Speaking of figurines, it was direct merchandising which occupied the lowest rung of canonicity, by universal agreement. This included fluff in the product descriptions of toys and action figures, “trivia” in the board games and factbooks with no corroborating sources, and the plot summaries for the video games – though that last relegation would attract some controversy in later years. The figurines themselves often bore little resemblance to the characters on whom they were based – it was not uncommon for Spock to be depicted with chartreuse skin, for example, despite having only a faint greenish tinge on television. Authorized images of Spock in “cartoon” form often followed this convention as well. The “official” excuse was that, given the small size of many Spock toys, the skin colour made it easier to tell him apart from other blueshirt characters, as the pointed ears and eyebrows were less apparent. By contrast, most of the human characters in each line were given identical skin colours despite the wide variation in complexions of the actors who played them – with the obvious exception of Uhura (though an unfortunate manufacturing error had once resulted in a batch of “White Uhuras” [20]; most of these had been destroyed once the mistake was spotted by quality control, but a few were anonymously misappropriated and became valuable collectors’ items).

    Although the battle lines had been drawn along multiple fronts, the Trekkie fandom in the early-1980s nonetheless seemed to be entering something of an autumnal period; the years of callow enthusiasm, followed by those of resplendence and rejuvenation, were well behind them. Debates continued to rage about whether Star Trek should be continued in some “official” (televised) capacity or not – the revamp of the comics had been well-received, and the offshoot novel and short-story lines sold very well, with the constant reiterations of the Star Trek video game bettering even those. But there seemed to be a palpable need throughout the fandom for active and creative involvement in the property they loved so much. Fan fiction and fan art clearly sated a need that wasn’t met elsewhere. Desilu staffers, despite turning a blind eye to fanworks, were very much aware of this untapped outlet for Trekkie creativity. Just as they had staked a claim in other emerging media of the day, from home video to video games, those at the studio decided to explore aligning their interest with an altogether different, yet equally nascent, form of self-expression…

    ---

    [1] In a letter to the editor of the April, 1980, issue of the fanzine Voyages, one “Betty from Boston” made the apt remark that “Desilu may not be perfect, but at least Star Trek isn’t in the hands of a greedy, corrupt studio like Paramount – or an upstart in over his head like George Lucas”.

    [2] During the “pre-Gerrold” era, as IOTL, the Star Trek comic told exclusively original stories – though it had a far more erratic release schedule than the once-a-month pace at which Gerrold had been exhorted to churn out the comic. Gerrold, though a higher-calibre writer than those who had worked on the comic before him, and having the advantage of access to rejected story ideas for the show to mine for material, still faced considerable difficulty meeting deadlines, even after bringing in other writers.

    [3] Blish adapted 75 of the 79 OTL episodes in twelve short-story collections before his death in 1975, omitting only “Mudd’s Women” and “I, Mudd” (which he planned to adapt into a Harry Mudd novel, which was published posthumously as Mudd’s Angels in 1978, having been completed by his wife), along with “Shore Leave” and “And the Children Shall Lead”. ITTL, even though more books are published and contain more stories per book, he cannot possibly adapt all 133 episodes before his death, at which point the series is abandoned – at about the same time that Gerrold asserts the primacy of the comics over the short stories (which is no coincidence).

    [4] “Classic 89” was a retronym applied to distinguish the Blish adaptations from other short-story and novel lines bearing the Star Trek name (including the later adaptations to complete the series proper), given that the books in which they first appeared were named, simply, Star Trek (followed by the numbers 1-13).

    [5] The mission of the Artemis, which began very late into the “Lost Years”, was not depicted or even alluded to in the comics, with Cdr. Sulu and Lt. Cdr. Kyle instead serving aboard the Excelsior (as First Officer and Chief Engineer respectively), to add some familiar faces to Kirk’s roster.

    [6] Arex and M’Ress, of course, both appeared in TAS IOTL, a series for which Gerrold wrote multiple episodes. They also serve a “political” purpose, in allowing non-human “minorities” to have a more visible presence aboard the Excelsior (since Spock, the token non-human of the series proper, is the Captain aboard the Enterprise).

    [7] Yes, the potency of the Comics Code Authority survives ITTL, because there is no Nixon Administration to open a new front in the War on Drugs by requesting that Stan Lee write an anti-drug storyline for The Amazing Spider-Man. This gives the Code enough time to properly adapt to changing societal mores. Ironically, the greater success of the non-Code Star Trek line(s) still serves to weaken the authority of the Code… only from without, instead of from within.

    [8] Diane Duane wrote many stories for Star Trek IOTL as well, and despite her relative youth at the POD (she was born in 1952), she did indeed get her start as David Gerrold’s assistant when she moved to LA in 1976 IOTL (at which time Gerrold had little to do with Star Trek anymore) before moving on to work extensively with the franchise. All those coincidences (and the parallels between her career and that of D.C. Fontana) struck me as too profound to butterfly away. Whether Duane will go on to write the Young Wizards series as she did IOTL is another question entirely.

    [9] As IOTL, where the term “Whovian” is also a creation of the American fandom, dating to the 1980s. “Whovian” is far more typical of American nomenclature than British, and the term has never held much currency in those old islands.

    [10] As a result, many British Trekkies are, on the whole, far less approving of Desilu than American Trekkies, because of their memories of the studio “meddling” in the production of Doctor Who throughout the Yank Years (along with NBC, who, to be fair, receive the brunt of the vitriol on that score).

    [11] So named because the word “parallelism” was so ambiguous that it could (and does) refer to one of any number of incredibly diverse concepts – why not try for one more?

    [12] “Parallel universe” and “transposition” are both terms which were used in the teleplay of “Mirror, Mirror” (ITTL and IOTL), and this helped to inform the parallelism theory.

    [13] Thanks to e of pi for this succinct summarization of the parallelism concept. ITTL, it is derived from the popular expression “what happens on tour, stays on tour” (sometimes rendered as “what happens on the road, stays on the road”), used by sports teams and rock bands since approximately the era of the POD. (The popular modern-day derivation IOTL, “What happens in Vegas, stays in Vegas”, dates only to 2003.)

    [14] The term “concordance” (a synonym for agreement) was used as part of the popular fan reference, The Star Trek Concordance, which was first written by the legendary Bjo Trimble in the late-1960s. It never achieved notoriety ITTL, allowing the term to be available for use in opposition to parallelism.

    [15] Borrowed rather shamelessly from the quasi-religious conflict which took place in the backstory of Futurama (not to be confused with the Star Wars Trek, the mass migration of Star Wars fans). ITTL, obviously, the term is used without reference to what is known as Journey of the Force.

    [16] Herbert F. Solow, the other major figure who had active involvement only with the original series, was usually disregarded, despite having wielded considerable creative authority on behalf of Desilu during the show’s development, because by 1978 he was known almost entirely for his role as a top studio executive.

    [17] Not to mention the Questor fandom (Questies? Or perhaps Questorians?)

    [18] This is contrary to Fontana’s OTL belief, expressed in her novel Vulcan’s Glory, that Vulcans could engage in sexual activity at any time, and were not bound to their seven-year mating cycle – however, the other writers (primarily Gerrold) talked her into “going with the flow” ITTL.

    [19] Shippers in general (not necessarily just slashers, though they are certainly not exempt) tend to be very petty about… “obstacles”.

    [20] Sometimes called “Friedas” (or “Fredas”, “Freidas”, or even “Freedas”) because Uhura was named in reference to the book Black Uhuru, a copy of which Nichelle Nichols brought with her to audition for Star Trek. Uhuru, of course, is Swahili for “freedom” or “independence”, and therefore the “cognate” of “Black Uhuru” would be “White Freedom”. Since Uhura is a “feminized” version of Uhuru (except not really, because Swahili doesn’t work that way), so too is Frieda equivalent to Freedom.

    ---

    Thanks to e of pi for assisting with the editing of this latest update, and for serving as the sounding board to my ideas!

    At long last, this is the update that sheds some light on the question of continuity within Star Trek, especially with regard to Doctor Who. To make a long post short, many British/Whovian (for lack of a better term) Trekkies say it happened, but they’re just about the only ones. The majority hold firm to the notion of parallelism, which becomes a core TTL concept of canonicity and continuity, spreading far beyond both original fandoms. Such arcane concepts have caught fire IOTL… remember little Tommy Westphall?
     
    Appendix C, Part V: The Studio Strikes Back
  • Appendix C, Part V: The Studio Strikes Back

    The surest way to head for the brim is starting at the dregs.

    Attributed to Ted Turner, but likely apocryphal

    Ted Turner was, at the beginning of the 1980s, a big – even burgeoning – fish in the infinitesimal pond that was a local media market, even a fairly populous and strategically-located one such as Atlanta. To take his WTBS national, and to create a true media empire – a superstation of superstations – would require certain intangibles which would allow him to stand far above the competition. So he went to Hollywood, hoping to make the right kind of transaction to put his nascent network on the map.

    Desilu Productions was his first port of call – the smallest of the Hollywood studios, but in practice a minor only because it didn’t have a dedicated movie-making arm – as their television properties had already been a boon for his station. What was then known as WTCG had reaped the benefits of a very costly mistake by the programming manager at WSB-TV, the NBC affiliate in the Atlanta market, in declining to purchase the rights to Star Trek when it had first entered syndication in 1971. This marked the beginning of a very lucrative relationship, with Turner then buying Doctor Who from Desilu lock, stock, and barrel – including even the adventures of the First and Second Doctors. Granted, these were shunted into late-night/early-morning timeslots, with the television reporter for the Atlanta Journal deriding this choice of scheduling as “the Cure for Insomnia”. [1] Turner entered negotiations with Desilu expecting them to embrace this new business opportunity, but he had miscalculated. The studio was already overstretched due to their deal with RCA and their investment in Syzygy, and it declined to entertain Turner’s ideas about building his “superstation” on their shoulders.

    Turner had wanted a personal audience with Lucille Ball, counting on his personal charm to win her over, but he never got the chance; he instead dealt with her right hand, the man in charge of day-to-day operations, Herbert F. Solow. [2] Turner had hoped to speak with Ball because, in his words, “She and I are of a kind” – using a famous line from Star Trek to express his belief in their shared entrepreneurial spirit. But Solow was unmoved, and certainly didn’t see the benefit in hitching the Desilu wagon to one horse in the free-for-all that was the syndication market. Turner, being an entrepreneur, was also a keen businessman, and he eventually realized the disadvantages of negotiating from a position of weakness. In their dealings with RCA and Syzygy, Desilu had little ground to lose when agreeing to their requests, but it had already staked out a formidable position in the second-run syndication market. And it was a market that Desilu itself had helped to create; Solow had famously said of his studio at an industry event that “We don’t re-invent things – we just invent them.” Turner wasn’t pleased with having to fold so easily, but at least Desilu had a neighbour in far more dire straits.


    Paramount had suffered what might delicately be described as a series of unfortunate setbacks over the previous few years. Their television division, which had (after a rough start) become known for providing a marque of quality through the 1970s, had seemingly collapsed; this was due in large part to the architect of the division, Grant Tinker, choosing to depart the studio for CBS once the time had come for belt-tightening – and, on a more personal note, during this difficult period in his life, he and his wife, Mary Tyler Moore, had divorced. [3] Paramount wasn’t big enough for the both of them anymore, and in the end neither would remain, with Moore creating her own studio, Hat Toss Productions (named for the iconic shot in the opening credits of her sitcom), inspired by the successes enjoyed by Lucille Ball and Penny Marshall, among others. [4] Charles Bluhdorn had been dissatisfied with Paramount Television for quite some time and, ever aware of how valuable the Desilu properties had proven in syndication, cursed his failure to acquire the studio when he had the chance… not for the first time. His outlook was unfair to Mary Tyler Moore, Bob Newhart, and Rhoda, all of which were performing very well, and this gave Ted Turner a valuable edge. Sure enough, when he approached Bluhdorn, the Austrian mogul was more eager to make a deal than he might otherwise have been.

    Television remained a sideshow as far as Paramount was concerned, despite the headaches it might have brought on for Bluhdorn; movies were still the big-ticket medium for all of the major studios, but Paramount had seen a major dearth of box-office hits since Journey of the Force in 1977. [5] In an era when “blockbusters” (as big-budget, high-grossing movies were becoming known) were re-defining success in the motion picture industry, Paramount was increasingly forced to relive their past glories by re-releasing their older films; this was a losing proposition as classic films grew increasingly ubiquitous on the networks, local stations, and even pay-television, not to mention that which would ultimately prove the death knell for theatrical re-releases: home video. Paramount had been a slow adopter of the format, backing the losing horse in Laserdisc before being lured (through mercenary means) over to CED. But it had a very large film library, the product of nearly 70 years [6] of continuous operation, and it was relatively underexposed. TBS, Turner had promised, would bring their old movies to the masses, and would pay top dollar for the opportunity to do so. In fact, Turner had ideas about the things he could do with those old movies, and he was sure to pay for the rights to modify all of the properties he was acquiring for broadcast, even beyond the customary permissions to re-format the picture for television viewing. This unusual request did not go without notice from Paramount executives. [7]

    But, in the end, they had little alternative. Paramount had debts to pay, thanks to certain legal proceedings. That said, despite winning the Billion-Dollar Verdict, Lucasfilm knew better than to demand such an exorbitant sum from Paramount, which would likely drive the studio into bankruptcy and render their damages unrecoverable. Both sides therefore arranged an accord which would see the value of the bond which was to be posted on the damages ($100 million, or ten cents on the dollar) placed in escrow, where it would remain until all potential appeals to the verdict were exhausted. In so doing, both sides expressed their willingness to fight to the bitter end – and the Supreme Court. Therefore, the bulk of the $100 million payment on the bond would be financed by the sale to TBS of the exclusive broadcast rights to all shows produced by Paramount Television, and to all films produced by Paramount Pictures and its antecedent companies up to 1977 – Journey of the Force was excluded, given its importance as a bargaining chip in the ongoing legal proceedings. In fact, Paramount Television even allowed TBS the exclusive syndication rights to their current productions, including the struggling WMTM in Cincinnati – which provided an incentive for them to reach the “magic number” of 100 episodes, as very few shows had ever been successful in syndication otherwise. [8] Naturally, the syndication agreements which were already in place between Paramount and the various stations across the United States were allowed to continue until they expired.

    Lucasfilm v. Paramount, meanwhile, reached the Ninth Circuit of the United States Court of Appeals, before which it was argued in early 1982, nearly two years after it had been decided by the California District Court. However, the verdict was not reached by a jury, but by a panel of three judges, who voted 2-1 to overturn that ruling. It was certainly a relief for Paramount, and for Charles Bluhdorn, who was now off the hook for $900 million – or so it seemed. Certainly, it seemed that the dispassionate judges, stubbornly unimpressed with provocateurs such as the “rogue accountant” C.A. Baxter, were far more willing to back the status quo, and the major studios, than the populist juries. The $100 million bond payment stubbornly remained in escrow, per the agreement previously reached by both sides, as George and Marcia Lucas (along with their lawyer, Andy Taylor) swiftly announced their intention to appeal the verdict reached by the Court of the Ninth Circuit.

    There was only one place to go from there: the Supreme Court of the United States…


    ---

    [1] In many markets, the 1963-69 run of Doctor Who airs in the middle of the night, if it airs at all, because there’s nowhere else for it to go. For this reason, coupled with the… unfortunate quality of the visual effects and rather slow pace of the narrative, it does not have a reputation for fully engaging its audience.

    [2] Solow is definitely more conservative and risk-averse than Lucille Ball – the perfect manager, as opposed to a leader. He also makes for a fine gatekeeper.

    [3] Moore and Tinker also divorced IOTL, in 1981. ITTL they do so a year earlier, the strain of the Trial of the Century proving too much.

    [4] Hat Toss was not formed IOTL, though Moore did continue acting in television and film after parting ways with MTM.


    [5] ITTL, Paramount is without the following blockbusters, which kept it afloat during a perilous time for the American motion picture industry: Star Trek: The Motion Picture, the #4 film of 1979, with an $82 million gross; The Empire Strikes Back, far and away the most successful film of 1980 (grossing $209 million on its original release, and a further $13 million in 1982); Raiders of the Lost Ark, which was once again easily the most successful film of 1981, earning $212 million (and then $21 million in a re-release the following year, and $11 million in another re-release the year after that); and Star Trek II: The Wrath of Khan, the #6 film of 1982, which received $80 million in grosses; not to mention the 1979 and 1981 re-releases of the original Star Wars. Granted, this was an exceptionally inflationary period IOTL, but all that still adds up to over $600 million in 1980 dollars – enough revenue to cover their bond payment six times over (and indeed, more than half the Billion-Dollar Verdict in its entirety).

    [6] Paramount (as the Famous Players Film Company) was established in 1912, making it the second-oldest Hollywood film studio (Universal was founded earlier that same year).

    [7] Yes, Ted Turner has exactly the same big ideas which earned him no small amount of notoriety IOTL…

    [8] Remember, Star Trek was one of the shows IOTL which really “proved” that audiences would accept fewer than 100 episodes in reruns.

    ---

    And thus, I give you the last update for 2013! As might be expected, it sets the scene for some major battles to be fought in the future. Thanks again to Andrew T for his legal advice. I would also like to take this opportunity to officially welcome nixonshead aboard as the official 3D model artist for That Wacky Redhead! Some of you may already be familiar with his exceptional work on Eyes Turned Skyward, where he is the artist in residence – and if not, I would strongly recommend that you become so; you won’t regret it. I feel extremely fortunate to have him working with me, and the Enterprise model he posted today is hopefully the very tip of the iceberg.

    Have a Happy New Year, everyone, and I’ll see you all in 2014! Perhaps, perhaps, the year in which That Wacky Redhead will come to an end…
     