Modern Imperialism
I often notice that American views of Europe were much more hostile or isolationist until WWI. With the exception of some immigrant groups, many Americans seemed to view Europe the way modern Russia views Western Europe or the EU. It only seems to be the world wars that made Americans rekindle their ties with Britain and a European identity. Without them, does the US still move toward a Eurocentric outlook, or toward a more regionally based one? Could a US hostile toward “Western or European influence and culture” become a thing?