How Colonialism Destroyed Cultures


Colonizers believed that everything, including the earth, was meant to be bought and sold.
TEENVOGUE.COM