How Government Schools De-Christianized America
Alex Newman | Aug 15, 2021
Government indoctrination masquerading as “education” has deliberately waged war on biblical religion, and the fruits of that war are devastating America and its children.