Western influence on Africa
Historically, Africa's wildlife, natural resources, and cultures have made the continent highly valuable to the Western world. It has attracted Western tourists, explorers, and imperialists, and as a result has been heavily shaped by Western interests over time.