• ☆ Yσɠƚԋσʂ ☆OP · 11 months ago

    While nothing is ever black and white, it’s undeniable that the relationship between the west and Africa has been deeply exploitative. It’s also pretty well understood what people mean by the west: vassal states of the US that are politically subservient to it and rely on it for military protection, countries aligned around the failing liberal capitalist ideology that the US promotes.

    Kicking out western neocolonial regimes is a prerequisite for any positive change in the countries of Africa. Only once these countries are under the control of the people living there can they chart their own course. As long as they remain under the yoke of western hegemony, the interests of the empire will always be put above the interests of the people living in these countries.