A Christian Nation?


Over the years I have often heard Christians of various political stripes assert that the United States is a Christian nation. More recently, Christian evangelicals who supported Trump and his campaign slogan of “Make America Great Again” seemed nostalgic for a white Christian America. One might be tempted to call the belief that the U.S. is a Christian nation a myth, the seeds of which were sown in 1630 when John Winthrop challenged his community to establish a “city upon a hill” reflecting the covenant of God and Christian charity. Many myths contain a grain or two of truth. Nevertheless, the belief in a Christian nation is more illusion than truth.

For many people this may be a provocative claim, and it requires justification. Let me begin by acknowledging that most of the people who immigrated to America, taking native peoples’ lands, belonged to various Christian denominations. Some saw this country as the new Promised Land, overlooking the fact that by occupying the land they removed any possibility of promise for the non-Christian people who had lived here for millennia. So, I am willing to concede that white European settlers were mainly Christian. This was also true after the War of Independence, and in this sense one might say this was a Christian nation, in that most of the settlers called themselves Christian. I will come back to this, but for now let me say that this new “Christian nation” was clearly neither a…
