American Character without World War 1 happening?

The United States likely continues its isolationist attitude, though "isolationist" is a bit of a misnomer: late 19th century American isolationism still involved several colonial wars. I doubt the US would expand much beyond its Pacific islands and the Western Hemisphere, but it would likely pursue a degree of economic imperialism in Asia.