It seems that the Republicans have bought into the Democratic argument that healthcare is a responsibility of the federal government. It is not! Historically, healthcare policy has been left to the states (and to the individual). Things really got screwed up when the feds got involved. We need to return these programs to the states and remove all federal mandates and regulations governing them. The states are the laboratories of democracy...let them have a go at it!
It may seem that way to you, but how did you come to that conclusion...the liberal media?