Imperialism: What Does the Word Mean?
- November 16, 2013
- Posted by: admin
- Category: Uncategorized
Imperialism, as defined by the Dictionary of Human Geography, is “an unequal human and territorial relationship, usually in the form of an empire, based on ideas of superiority and practices of dominance, and involving the extension of authority and control of one state or people over another.”
It is often viewed in a negative light, as little more than the exploitation of native peoples to enrich a small minority.
Lewis Samuel Feuer identifies two major subtypes of imperialism. The first, "regressive imperialism," is identified with pure conquest, unequivocal exploitation, the extermination or reduction of undesired peoples, and the settlement of desired peoples into those territories; the examples given include Nazi Germany, the British Empire and the French Colonial Empire. The second, "progressive imperialism," is founded on a cosmopolitan view of humanity and the claim of spreading civilization to allegedly backward societies.
The term has primarily been applied to Western political and economic dominance in the 19th and 20th centuries. In this sense, it describes how European (as well as American and Japanese) capitalists were driven by the internal logic of their competitive system to look abroad, in less developed countries, for opportunities to control raw materials, find markets, and secure profitable fields of investment.
Read more about the age of imperialism