Greetings, why do you think western media is obsessed with portraying Africa the way it does?

Cuz it validates Capitalism and White Domination.  

The corporate Western media seeks to show Africans as unfit for or incapable of self-governance, and to show African nations as impoverished and unable to move beyond petty/tribal conflicts, which absolves the West of its past and ongoing crimes against Africa and Africans.

If Africans were not portrayed as inherently dysfunctional, then we'd have to look elsewhere to explain the conditions in much of Africa, and we'd have to employ solutions beyond NGO and foreign-nation intervention in Africa's affairs.

It's a classic victim-blaming scenario.

Most Western media portrayals of Africa are Anti-African Propaganda.

Oh, and the West wants to sustain the rift between Africa and the African Diaspora, while reinforcing the supposed inferiority of the African Diaspora by promoting the false inferiority of African nations.