An Imperial German victory in WWI.
Not only have Nazi victory scenarios been done to death, but beyond that, what's left to explore? It's a dystopia either way. Whether it's a U.S./Nazi Cold War a la Fatherland or a Japan/Nazi Cold War a la The Man in the High Castle, the result is going to be brutal and bleak, and the worst part is that since WW2 is what settled the world into its current order, it's going to stay that way.
Now, with WW1, the interesting thing is that everything was in flux. When the balance of power in old, imperial Europe finally blew up, there were huge possibilities for the different directions its aftermath could take. WW1 was the death of the old 19th-century European order, which was replaced by the radical new ideologies of nationalism and communism. So what would've happened if the Central Powers had won: if Germany and Austria-Hungary had enjoyed the spoils of victory despite the problems within their own borders, while Russia, Britain, and France had been defeated and fallen prey to radicalism?
That scenario seems to promise a lot more potential for things to turn out differently.