This nation was founded on Christian principles by deeply religious people. That is not the same thing as saying this is a Christian nation. If I am not a Christian, does that mean I am not part of this nation?
Of course, everyone has religious freedom and is part of the U.S. as an American, but there is also a basic reality to be dealt with: non-Christians have to remember where they are, and most of the people around them are going to be living out their Christian faith. Those who are in the minority and follow a different path have to allow for that and grow a thicker skin.
I'm not a nudist, but if I happen to walk into a nudist colony, it's not exactly logical for me to act shocked at the naked people all around me. I'm in their environment. If I'm going to be there, I have to change my expectations and not be so sensitive.
The tensions and controversy we see arising from this situation aren't really about religion; they're about the "society owes me" attitude fostered by the Left in this country over the years, the idea that various minorities have to be specially accommodated and that everyone has to walk on eggshells.