The Story Christian Nationalists Don’t Want You to Know
Black spiritual traditions helped build America—and they’ve always defied a single, state-sanctioned faith.
A recent national poll found that more than a quarter of Americans mostly or completely agree that the U.S. government should declare America a Christian nation. One in five believes that God has called Christians to exercise dominion over all areas of American society. This pervasive myth and the political movement …