And by this I mean the U.S. doesn't turn the (Latin) Americas into its own backyard, leaves the Spanish colonies well alone, and lets Europe handle its own problems.
But the U.S. still expands westward.
How would the U.S. look if it did not seek to expand its influence abroad? I believe our economy would be fine and our military would be weak, but I'm curious how we would act on the international stage.
Bonus points if you guys can think of a way this would change American culture.