I recently came across an interesting article which stated that, contrary to what I had previously thought, American entry into WWI on the side of the Allies wasn't a foregone conclusion. Quite the opposite, in fact: American opinion only really began to shift around Wilson's 1916 election campaign.
So, what if America had decided to go to war against Britain, France, and Russia instead of alongside them?