When asked by the teacher to summarize WW1, one of my college history classmates, with a completely straight face, told the class that WW1 was the war during which the US gained its independence from France. The teacher cut her off before she could go any further, but later I asked her for more details about her version of WW1. She then elaborated that "in the late 1800s, I think" the Americans declared independence from France. (Although after the teacher had stopped her so abruptly, she was beginning to question whether she was right about that: "It could have been from Mexico, I always get those two mixed up.") The American cause was aided by Nazi Germany (she said she wasn't quite sure if Hitler was in charge yet, but she knew the Nazi party definitely was). Of course, after helping them gain their independence, the Nazis turned on their former allies and sank the USS Maine off the coast of Texas, thus starting WW2.