Imperialism
What is imperialism? Merriam-Webster's Online Dictionary defines imperialism as "the policy, practice, or advocacy of extending the power and dominion of a nation especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas; broadly: the extension or imposition of power, authority, or influence" (Imperialism). Read on to learn about the lasting effects of imperialism on the diverse regions of the African continent.