To lay down the scenario: Britain's Royal Navy has been largely destroyed, and Britain is now a vassal state of Germany. Germany won this conflict by allying with Russia, annexing Austria-Bohemia, and so on, and focusing its efforts on Western Europe. After one war in which it takes out France, and another in which it takes down Britain, it is now the undisputed master of Western Europe. America is still isolationist and has no interest in those parts of the world. Meanwhile, what do South Africa, Australia, and New Zealand do? Let's say this all happens sometime in the early 20th century.

I would prefer that nobody focus on how Britain gets beaten, as that is the least important detail here. All that matters about Britain and Germany's war is that it left Britain unable to conduct its own independent foreign policy, and that its former colonies are now doing their own thing, if they haven't been taken by Germany. Germany's path to power matters less than the effects of its victory.

So what do the British settler colonies do, besides Canada, which would simply draw closer to America? Would they try to form friendly relations with Germany, or would they remain hostile? What would their foreign and economic policies be like? Their immigration policies?