Prevent the Nov. 9, 1979 Training Tape Incident (missile false alarm)

http://www.pbs.org/wgbh/nova/military/nuclear-false-alarms.html

The training tape incident

Shortly before 9 a.m. on November 9, 1979, the computers at North American Aerospace Defense Command's Cheyenne Mountain site, the Pentagon's National Military Command Center, and the Alternate National Military Command Center in Fort Ritchie, Maryland, all showed what the United States feared most—a massive Soviet nuclear strike aimed at destroying the U.S. command system and nuclear forces. A threat assessment conference, involving senior officers at all three command posts, was convened immediately. Launch control centers for Minuteman missiles, buried deep below the prairie grass in the American West, received preliminary warning that the United States was under a massive nuclear attack.

The alert did not stop with the U.S. ICBM force. The entire continental air defense interceptor force was put on alert, and at least 10 fighters took off. Furthermore, the National Emergency Airborne Command Post, the president's "doomsday plane," was also launched, but without the president on board. It was later determined that a realistic training tape had been inadvertently inserted into the computer running the nation's early-warning programs.

However, within minutes of the original alert, the officers had reviewed the raw data from the DSP satellites and checked with the early-warning radars ringing the country. The radars were capable of spotting missiles launched from submarines close to the U.S. shores and ICBM warheads that had traveled far enough along their trajectories to rise above the curvature of the Earth. The DSP satellites were capable of detecting the launches of Soviet missiles almost anywhere on the Earth's surface. Neither system showed any signs that the country was under attack, so the alert was canceled.
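What saved the day in that paragraph is cross-checking two independent sensor systems before treating the alert as real, the idea usually called "dual phenomenology." Here is a minimal sketch of that logic in Python; the `SensorReport` class, sensor names, and return strings are all illustrative assumptions, not NORAD's actual interfaces.

```python
# Sketch of dual phenomenology: an attack warning counts as credible
# only if at least two independent sensor systems both report it.
from dataclasses import dataclass

@dataclass
class SensorReport:
    system: str          # e.g. "DSP satellite" or "early-warning radar"
    detects_attack: bool

def assess_threat(reports: list[SensorReport]) -> str:
    confirming = {r.system for r in reports if r.detects_attack}
    if len(confirming) >= 2:
        return "CONFIRMED"      # independent systems agree: escalate
    if len(confirming) == 1:
        return "UNCONFIRMED"    # a single source: keep checking, do not act
    return "NO_THREAT"

# November 1979: the display claimed an attack, but the raw sensors did not.
reports = [
    SensorReport("DSP satellite", False),
    SensorReport("early-warning radar", False),
]
print(assess_threat(reports))  # -> NO_THREAT, so the alert was canceled
```

The point of the sketch is that the computer display itself is never one of the confirming sources; only the raw sensor feeds count.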
One thing that doesn't work is verbally telling people, "Ask a question any time you have any doubt." People quickly learn, in pretty much any institution, that if you ask an 'unnecessary' question you will be deprecated, doghoused, embarrassed, viewed as on the outs, viewed as less capable, and so on.

Okay, so how do you prevent the Nov. 9, 1979 Training Tape Incident?
 

Puzzle

Put a lock on the training tape slot with the keys given to the officers. Simple solutions are best.
 
[image: WOPR-SideView.jpg]
 
Okay, so the brigadier general in charge of the training exercise knows the de-brief is as important as any other part.

And if he or she sends nine training tapes out, he or she is going to make sure nine training tapes come back.
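The "nine tapes out, nine tapes back" discipline is just a check-out/check-in ledger that refuses to close the exercise until every tape is accounted for. A minimal sketch in Python; the `TapeLedger` class and tape names are hypothetical, invented here for illustration.

```python
# Sketch of tape accountability: every training tape checked out must be
# checked back in, and the debrief cannot be signed off until the ledger
# balances.
class TapeLedger:
    def __init__(self):
        self.out = set()

    def check_out(self, tape_id: str):
        self.out.add(tape_id)

    def check_in(self, tape_id: str):
        self.out.discard(tape_id)

    def debrief_complete(self) -> bool:
        # Nine tapes out means nine tapes back before sign-off.
        return not self.out

ledger = TapeLedger()
for i in range(9):
    ledger.check_out(f"training-tape-{i}")
for i in range(8):
    ledger.check_in(f"training-tape-{i}")
print(ledger.debrief_complete())  # -> False: one tape unaccounted for
```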
 
http://www.pbs.org/wgbh/nova/military/nuclear-false-alarms.html

' . . the officers had reviewed the raw data from the DSP satellites and checked with the early-warning radars ringing the country . . '
But the computer people don't really feel comfortable giving raw data.

That's pretty much any institution: it doesn't really trust the recipient or user of the information. It wants to make everything 'fool-proof,' to buff and polish the information, and in doing so it just gunks it up.
 
One thing that doesn't work is verbally telling people, "Ask a question any time you have any doubt." People quickly learn, in pretty much any institution, that if you ask an 'unnecessary' question you will be deprecated, doghoused, embarrassed, viewed as on the outs, viewed as less capable, and so on.

Okay, so how do you prevent the Nov. 9, 1979 Training Tape Incident?

Hire literate people who can read.
 
Most procedural manuals are just terribly written, and they're written on the assumption that the user is less intelligent than the writer, which is a kind of poison and just makes matters worse.

I think the thing which saved the day was that they had access to the raw satellite data.

And this would probably be fought, with someone worrying: oh, I don't know if we want young officers having access to the raw data before they get experience . . . Yes, we probably do, including for independent study during slow times.

And the computer systems they were using in 1979 were different, as is the norm for what counts as 'useful' information. For example, look at how Microsoft Word fights you on different things, and I know someone will say it's not a word processing program, it's powerful desktop publishing software. Be that as it may, the whole norm is different, and a more-is-better approach is in style.
 
Hire literate people who can read.


It's not that simple. I've authored procedural documentation for years, and writing an SOP that captures all possibilities in a clear and totally unambiguous manner can be tricky - it's certainly not as easy as it first appears.
 
It's not that simple. I've authored procedural documentation for years, and writing an SOP that captures all possibilities in a clear and totally unambiguous manner can be tricky - it's certainly not as easy as it first appears.

I was under the impression that the person picked up the wrong tape and used it; the two should not even have been stored close together. I have no idea how the case was marked, but I am willing to bet it said 'training tape.'
 
I was under the impression that the person picked up the wrong tape and used it; the two should not even have been stored close together. I have no idea how the case was marked, but I am willing to bet it said 'training tape.'

That's kind of a secondary issue, as even well trained operators make mistakes. The bigger issue is the possibility of that mistake occuring in the first place - it should be engineered out of the system, both procedurally and physically if possible. The guy putting the training tape into a live system and having it appear as live should not have been able to happen.
 
Not sure why you'd WANT to prevent this as folks seem to be missing the main point the SYSTEM worked and worked well. We went on alert, but we didn't fire and within minutes had determined that the "threat" wasn't real and then found the cause.

It's often overlooked (or ignored for the sake of the story/plot), but there are a huge number of checks and balances inherent in the system to prevent things from getting too hot, and they tend to work BECAUSE the military is built the way it is.

Having stuff like this happen is awkward and scary (especially for those involved), and it helps keep the system honest and working.

Randy
 
an example from aviation
The Human Factor, Vanity Fair, William Langewiesche, Oct. 2014.

http://www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash

One of the more common questions asked in cockpits today is “What’s it doing now?” Robert’s “We don’t understand anything!” was an extreme version of the same. Sarter said, “We now have this systemic problem with complexity, and it does not involve just one manufacturer. I could easily list 10 or more incidents from either manufacturer where the problem was related to automation and confusion. Complexity means you have a large number of subcomponents and they interact in sometimes unexpected ways. Pilots don’t know, because they haven’t experienced the fringe conditions that are built into the system. I was once in a room with five engineers who had been involved in building a particular airplane, and I started asking, ‘Well, how does this or that work?’ And they could not agree on the answers. So I was thinking, If these five engineers cannot agree, the poor pilot, if he ever encounters that particular situation . . . well, good luck.”

In the privacy of the cockpit and beyond public view, pilots have been relegated to mundane roles as system managers, expected to monitor the computers and sometimes to enter data via keyboards, but to keep their hands off the controls, and to intervene only in the rare event of a failure. . . . . Since the 1980s, when the shift began, the safety record has improved fivefold, to the current one fatal accident for every five million departures. No one can rationally advocate a return to the glamour of the past.
You can't argue with that on aviation: we'll take a fivefold increase in safety every day of the week. But with missile defense and early warning the situation is a little different, in that the unexpected and offbeat is exactly what we need to be most concerned about.

Most computer systems assume the user is a dumb bunny.

The designers will swear up and down that they don't. But in point of fact, they do. They rather assume the user is less intelligent, less capable, less aware, less connected, less everything. They 'perfectionize' and laboriously go over the easy stuff, and don't quite know how to handle the hard stuff. They don't really trust the user.

So, key question: Do modern systems give users access to the raw satellite data?
 
Here's a different account of the facts:

PDF --> http://web.mit.edu/stgs/pdfs/white paper-- A Multinational Missile Launch Surveillance Network.pdf

' . . . The so-called "training tape incident" on 9 November 1979 is illustrative of the others. Early that morning, the night shift at the US NORAD command center--an underground bunker that houses the headquarters responsible for launching America's nuclear forces--decided to run a training exercise. In preparation for that exercise, they inserted a computer tape that would cause the screens above the operators' heads to display all the signs of a massive nuclear strike from the Soviet Union. However, the night shift ended before they could run the simulation. Unfortunately, no one thought to remove the computer tape or tell the morning shift that the tape was inserted. The result was that shortly after 8 a.m. the new operators saw every indication that the United States was being attacked by a massive first strike from the Soviet Union. . . '
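In this account the failure is a shift-handover gap: the night shift loaded the tape and nobody ejected it or told the morning shift. One procedural fix is to bind any loaded training tape to the shift that loaded it and force it out at shift change. A minimal sketch in Python; the `TapeDrive` class, method names, and tape name are hypothetical, invented for illustration.

```python
# Sketch of a handover safeguard: any training tape is bound to the
# shift that loaded it and is automatically ejected when that shift
# ends, so it cannot silently carry over to the next crew.
class TapeDrive:
    def __init__(self):
        self.tape = None
        self.owning_shift = None

    def load_training_tape(self, name: str, shift: str):
        self.tape, self.owning_shift = name, shift

    def end_shift(self, shift: str):
        """Called at shift change; ejects any tape the outgoing shift loaded."""
        if self.tape and self.owning_shift == shift:
            ejected = self.tape
            self.tape, self.owning_shift = None, None
            return ejected   # logged and handed back, not left in the drive
        return None

drive = TapeDrive()
drive.load_training_tape("massive-strike-exercise", shift="night")
print(drive.end_shift("night"))  # -> massive-strike-exercise
print(drive.tape)                # -> None: nothing carries over to morning
```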

an example from aviation

You can't argue with that on aviation: we'll take a fivefold increase in safety every day of the week. But with missile defense and early warning the situation is a little different, in that the unexpected and offbeat is exactly what we need to be most concerned about.

Actually I think you can argue against the pilot as mere system manager concept, for at least two reasons - AF447, and US Airways 1549.
 
But what if a transition to self-driving and partially self-driving cars, where some cars on the road are self-driving and some aren't, leads to a drop in U.S. traffic fatalities from about 35,000 to around 7,000?

And you get the feeling that the companies involved have not just resorted to defensive behavior, but are still working to actually improve things.
 