If America is a Christian Nation…

Paul Derengowski, ThM

I keep hearing that America is a Christian nation or at least was founded as a Christian nation; it’s all right there in the Declaration of Independence and Constitution. Can’t you see it?

What I frequently see are borrowed terms that fit comfortably within a Deist way of thinking about God, not a Christian one. For example,

We hold these Truths to be self-evident, that all Men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the Pursuit of Happiness.

Nevertheless, the more I thought about it, the less sense it made to argue that America is a Christian nation.

Because if America is a Christian nation, then why is the Bible not the anchor and foundation for all of its decision-making, from the passing of legislation at the highest levels of government to the rearing of America’s children from the crib?

If America is a Christian nation, then why is God’s name used more often as a curse word, and why is He treated more as a mystical whatever from wherever than as the person who has revealed Himself in both His book (the Bible) and His Son (Jesus Christ), and who desires our attention in all matters, whether good or evil?

If America is a Christian nation, then why is there such a lack of love between neighbors, regardless of skin color, race, or lot in life?

Jesus said, in answer to the question about the greatest commandment in the Law, “You shall love your neighbor as yourself” (Matt. 22:39).

It followed on the heels of loving God with all of a person’s heart, soul, and mind.

If America is a Christian nation, then why do so many Americans show such disobedience to either command?

If America is a Christian nation, then why are our college and university campuses filled with non- or anti-Christian instructors, professors, and administrators, whose whole agenda is anything but informing our young people that absolute truth and knowledge about whatever subject they are studying begin with a reverential respect of God (Prov. 1:7, 29)?

Instead, they teach that if a person really wishes to be informed about anything, such knowledge comes only by first presupposing that God is a myth and man is completely autonomous.

If America is a Christian nation, then why do fewer than half of its so-called “Christian” pastors, as well as those claiming to be Christians, have a biblical worldview?

If America is a Christian nation, then why is sexual perversion so rampant, pervasive, and accepted as normal, except on rare occasions when it is frowned upon for arbitrary reasons, only to be treated as normal again once those occasions have passed?

If America is a Christian nation, then why are babies murdered by the millions out of convenience, while the murderers’ lives are preserved because it would be too inconvenient to execute them?

If America is a Christian nation, then why do all of the above, along with so many other issues and contradictions that literally fill our various news feeds, seem to point to the conclusion that America is not a Christian nation, nor has it ever been one?

Don’t get me wrong. The Christian worldview is the only way to interpret correctly the events and experiences we all encounter on a daily basis. It is seeing and living life from God’s perspective.

But a vast majority of Americans are not doing that. So how can America rightly be called a Christian nation?

About the Author

Paul Derengowski, Ph.D.
Founder of the Christian Apologetics Project
PhD, Theology with Dogmatics, North-West University (2018); MA, Apologetics with Honors, BIOLA University (2007); ThM, Southwestern Baptist Theological Seminary (2003); MDiv, Southwestern Baptist Theological Seminary (2000); BA, Pastoral Ministry & Bible, Baptist Bible College (1992)