It hasn't been our country for a long time now. It's the politicians' country, and they'll happily sell it out from under us for more power.
Didn't the Founding Fathers have something to say on that subject?
Something about, "...That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed. That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness...."