Thursday, November 5, 2015

What's Changed?

When you Google images of Africa, you immediately see men and women in tribal jewelry and clothing, naked or starving, and children with skinny arms and legs and swollen bellies, as if the continent were swarming with disease. Set beside the novel Heart of Darkness, it seems that not much has changed between the Imperialism Era of the late 1870s and early 1900s and the leading headlines about Africa today. Daily news outlets such as The New York Times run articles about the violence and disease rooted in Africa. Africa, then and now, is still portrayed as a dangerous, uncivilized continent full of unknowns. Yet with so much negativity floating around the news, it is hard to believe that Africa can be as dreadful as it is made out to be by the mass media, which has its own definitions of what is clean and attractive and what is dirty and hideous. On the other hand, what must be taken into consideration is that the destruction of Africa by the European nations during the Imperialism Era may have been too great a wound for the continent to ever recover from.

1 comment:

  1. To people outside of the continent, the view of Africa hasn't changed much over the past hundred years. I think our perception is distorted because all we ever hear about is the negative. We never hear about good being done in these countries. People in Africa do the same everyday things we do. These things aren't recognized.