No, it doesn't. America is not a Christian nation.
America is a Christian nation in the sense that it was founded by Christians who did not want a government-mandated religion. Did you get your history lessons out of a Cap'n Crunch cereal box?