The term racism refers to the belief that one race is inherently superior to another, a belief often justified by the color of people's skin. The concept took shape when enslaved Africans were forcibly transported to different parts of the Western world and sold to white families at high prices (Eliav-Feldon et al.). The moment the Englishmen set eyes on these darker-skinned people, they labeled the natives filthy and ignorant while calling themselves virtuous and educated. The arrival of enslaved Black people in the United States of America led many to exert power and control over the new arrivals by stripping them of their fundamental human rights. Enslaved people were treated like animals and punished for failing to abide by the laws of American society.
Studies indicate that white people considered themselves superior by virtue of their color, and that whiteness allowed them to exert force over non-whites. Slavery and segregation worsened the situation for African Americans, who were denied both the right to live freely and the right to an education. Black men and women were barred from working alongside white people, who did not want them as equals. Interracial marriage was strictly banned, and anyone who dared to break the law was severely punished. Such treatment continued year after year until activists like Martin Luther King Jr. launched a campaign for the rights of people of color.
However, racism continues to exist in present-day American society. That society has ingrained in the minds of its people the idea that Black people are inferior from birth and can never climb the ladder of social strata or be equal to white people. People of color continue to fight for their rights in America, yet little to nothing has changed, as white people still mistreat them.
Eliav-Feldon, Miriam, et al., editors. The Origins of Racism in the West. Cambridge University Press, 2009.