AFAIK, and from everything I can recall from my history books, America seems to be a secular country. Apparently, though, a lot of people claim that America is a Christian nation.
So, I’m trying to figure out what that means. Would people ever vote a non-Christian into office?
Should the laws be based on the Bible?
There are so many different flavours of Christianity. Which one of them defines America’s Christian values?
What do you think the founding fathers would have to say if someone told them that America is a Christian country?
Edit: Misspelled America in the title with my fat fingers. Sorry about that, American friends!