The American movie industry has promoted white supremacy since its inception; so much for "liberal" Hollywood.
--
Alejandro Diaz writes extensively on culture, politics, and the workplace.