Was the rise of Europe inevitable?

Was the rise of the West inevitable?

  • Yes
    Votes: 11 (25.0%)
  • No
    Votes: 33 (75.0%)
  • Total voters: 44
Over the past 500 years, Europe has conquered and colonized much of the world. But was the rise and dominance of the West over the rest of the world inevitable? Why or why not?
 
Not inevitable, because there have been many times when Europe as we know it could have ceased to exist. Europe as we know it is built on the back of colonialism and global trade, which in turn grew out of the Renaissance. An event like the Black Death could very easily have devastated all of Europe; yet without the plague, Europe probably would never have moved away from feudalism and would have remained more or less medieval.
 
It was not inevitable. Hell, if the Mongols had never conquered China, the Song dynasty might have started the industrial revolution instead of Britain.
 
Yes. Whoever gets to America first is basically poised to become extremely powerful, and West Africa and Europe are the only places well placed to reach the Americas. While West Africa did support large urban societies, none were quite so large as Europe's, and more importantly, Europe (and Morocco) had a greater impetus toward maritime culture due to its large number of peninsulas, islands, and inland seas.

The discovery of America and the subsequent virgin-field epidemics created a snowball effect, and here we are.
 
While Europe dominating the world wasn't inevitable (there is a raft of PODs to stop that), once industrialization got going, Europe was bound to be a powerful place... there are just too many of the right resources all right there together: water power, the best coal, iron deposits, etc.
 