alternatehistory.com

Generally: How would US politics change if the nazis won WW2?

Of course, we'll have to distinguish between the various possibilities:

- Hitler declares war on the US, as OTL
- or he doesn't, and the US doesn't declare war on Germany either

- Nazis control "only" the European continent (maybe after an armistice with Stalin not even all of Russia to the Urals)
- or their empire includes Britain, and/or a big colonial empire in Africa and the Middle East

- Nazis try to interfere, Cold War-like, in American states
- They don't, and expect the same from the US

- Will US politics nazify ("they defeated Communism, and they were generally very successful, so they can't be all wrong"): eugenics and oppression of non-WASPs at home, and even stronger domination of Latin America abroad?
- Or will all anti-Nazi forces unite against them, making the US more of a social democracy?
- Or would the US isolate itself and do whatever it thinks is best for itself?

What would the émigrés in the US do?