Western colonialism

Western colonialism, a political-economic phenomenon whereby various European nations explored, conquered, settled, and exploited large areas of the world.
