Insurance is not health care. Why don't those imbeciles get that? A health care provider has to accept the insurance, otherwise it's no good.
And perhaps even more importantly, it does not reduce the cost of health care and may even increase it.
It is one of the biggest lies ever foisted on the American public, and too many of us keep falling for it.
The whole thing is a boondoggle for insurers. You are exactly correct: health insurance is not health care. Not only does the doctor have to accept the insurance, the insurance also has to cover the treatments the doctor prescribes.