
One of the vague things I know about the decline of the British Empire is that the US somehow pressured European powers into giving up their colonies (unless, as in Vietnam, a colony looked likely to go communist).

But what exactly did the US do, and how did it avoid ruining US-European relations in the process?