How Government Schools De-Christianized America
Government indoctrination masquerading as “education” has deliberately waged war on biblical religion, and the fruits are devastating America and its children.