Since I started transitioning to natural hair, I’ve been doing a lot of reading and research on the subject.
I’ve run across everything from hair tips to the history of why black women feel the need to straighten their hair.
But in the grand scheme of things, it seems like black women occupy the lowest rung of the ladder in most, if not all, societies. Women are already seen as inferior to men the world over, but being black drops you to the very bottom of society.
Even though this is unjust, wrong, and completely sickening, it seems to me like a pretty undeniable fact.