I'm going to be blunt here. What I'm going to say is a personal observation, as I see it. Feel free to rebut and change my mind.
Western nations no longer have any interest in Africa. The way I see it, France, Belgium, the Dutch, et al. colonized Africa.
Did they treat the natives as equals? Of course not. Did they start to industrialize, bring mass farming, hydro power, and common-market trade? Did they move the continent into the modern era? Damn right.
Things 'seemed' to progress in Africa for around a hundred years. Life was getting better and more modern, to the point where the continent could compete with the rest of the world.
Then revolution took over: the native Africans ran the colonials out, by political force or by violence. It is still happening today, although most non-Africans who owned anything have left.
In a generation, Africa has moved back into tribal groups and tribal activity, torn down the marketable businesses it had, and put itself back where it was a hundred years ago.
All they have left is tribal governance and despots. These same despots let countries like China set up shop and strip their resources in exchange for bribes.
No one wants to say it, because 'it's not politically correct', but the basic world feeling is that they can stew in their own juices. They made their bed; they can lie in it. The modern world tried to convert and change the 'dark continent', but they'd rather keep their tribal lives and culture.
Sorry about their loss. It sounds crude and insensitive, but every time we try to help, we get our ass bitten.
The world has stopped banging its head against that :brickwall:, to the detriment of thousands of innocent people.