Western cultural imperialism – a sample

Western imperialism is attempting to force its worldview on Africa and the Third World once again. http://www.standforfamiliesworldwide.org/sffww/documentary/

Imperialism, as defined by The Dictionary of Human Geography, is "an unequal human and territorial relationship, usually in the form of an empire, based on ideas of superiority and practices of dominance, and involving the extension of authority and control of one state or people over another."[2]
