Was America Ever a Christian Nation?

“American Deism
8/24/2008 – Sunday
Many argue that America has always been a Christian nation and that the founding fathers of our nation were Bible-believing Christians. But is this really the case? On this edition of the program the hosts explore the subject of Deism, particularly in the writings of America’s founding fathers, but also in the life and practice of contemporary American Christianity.”

The above quote was taken from http://www.oneplace.com/ministries/The_White_Horse_Inn/archives.asp?bcd=8/24/2008

I just listened to this episode, about 35 minutes of discussion, and found it very interesting. Don't let people mess with our history; let's be real. When we understand how things really were, we can better understand what is going on now.
