Dec 28, 2020
The American movie industry has promoted white supremacy since its inception; so much for "liberal" Hollywood.
Alejandro Diaz is a workplace management consultant. His new book Enlightened Management is available on Amazon https://bit.ly/social_bookpromo