We've all read in our history books that the European powers colonized Africa in hopes of expanding their empires, or to tap into the continent's huge resources. But colonization also triggered a huge amount of harm, like the slave trade or the destruction of native African cultures.
But it could also be said that when the European powers let go of their "colonies" during the 20th century, what was left was a continent in shambles. I doubt much of Africa, except for Egypt or South Africa, is modernized at all. On top of that, the countries on the continent do not reflect the native inhabitants or tribes; the map of Africa is still largely the way the European powers drew it.
The bottom line is this: should Africa have been decolonized at all, or should the European powers have kept control? I can see viewpoints from both angles, but what do others think?