America was founded as a Christian nation. It is only recently that anti-Christian bigots like you are trying to erase God from the public square and classroom, and teach a FALSE history based on revisionist lies. Don't even go there, pal - I have a degree in U.S. History. I'll paste you to the wall.
When you say "America was founded as a Christian nation," what exactly do you mean? Surely you do not think that America was founded as a Christian nation in the same way that Saudi Arabia is an Islamic nation?