America Was Never White
History News Network

Radical rightists deliberately conflate "heritage" with "history," rhetorically pining for a once-proud "white" America. But history proves that America was never white.