Dec 28, 2020
The American movie industry has promoted white supremacy since its inception; so much for "liberal" Hollywood.
Alejandro Diaz writes about culture, politics, and the workplace. His new business, D & E Management Solutions, embraces humanity at work: https://dandems.com