Colonize – what is the meaning of the word “colonize”?
- May 23, 2012
- Posted by: admin
- Category: Uncategorized
The word “colonize” means: to take away someone’s traditions, cultural beliefs, and spiritual ideas, so that you may replace them with Western culture.
To make a colony, or to create an environment through scientific discovery, academia, Western political policy, and social structures, designed to take people away from their traditional beliefs, religion, and spiritual ideas.
The act of transforming a colonized nation into a state.
To educate somebody with the aim or motive of taking him away from his own culture, traditions, or spiritual beliefs, so that you may make him adopt Western European culture.
(Of people, plants, and animals) to become established in a new environment.
But this same word, to colonize, also means: to replace someone’s religious beliefs, traditions, and spiritual ideas with the Western academic mindset.
Through the art of colonizing, Europe expanded its academic and philosophical knowledge into the continents of Asia, Africa, North America, South America, and Australia.
The colonizing methods of Europe always focused on transforming the social structure of every culture (nation) that it colonized.
The science of pharmacology, which Europe (Western culture) used to produce pharmaceutical drugs, became the supplying source of medical treatment for the colonies of Europe.
Every tradition and culture that made its medicine from herbs (plants) began to neglect its own ways of making medicine, so that it might take on board the pharmaceutical drugs produced by Europe.
Article written by AP Ngabo Alex Flimpoman for conscious lifestyle.