What Southern Black Women Can Teach Us (And The Country) About Ourselves
It was Angela Davis who reminded us that “when Black women win victories, it’s a boost for virtually every segment of society.” It says a lot that Dr. Davis had the foresight to build a politic that benefits everyone while centering the most marginalized. In my opinion, it is her Southern roots that prepared her to so readily diagnose the problems with American society while envisioning something better.
I’ve observed this same clarity and prowess in other Black daughters of the South. It’s Women’s History Month, a time to celebrate and remember the feminist contributions of women in this country, and it’s Black Southern women who continue to teach us what true feminism is. So that raises the question: what is it that they know that the rest of us don’t? What can Southern Black women teach this nation about the pursuit of equity?