Is God the only reason you do the right thing?
When it comes to morality, what is the point if you don't have God in your life? Society doesn't care; in fact, it encourages immoral behavior.
I'm just advancing the notion that morals descend from God. Not from conservatives, not from liberals, not from independents, not from the Constitution.
When it comes to my sex-life, yes.