IMO, multiple factors have dragged the medical profession into disrepute...
1. The Covid pandemic. Many doctors mindlessly spouted untruths not only about the safety and efficacy of the vaxxes but also about topics such as natural immunity, masking, herd immunity, etc. People don’t have amnesia. They remember that doctors, with great confidence and certainty, told them things that turned out to be untrue.
2. Transgenderism. People hear about the medical establishment going all in on this madness, supporting the mutilation of children with sex changes, etc.
3. Abortion. People see young doctors clamoring in the media for abortion, protesting Dobbs, etc., and question... hey, do these people want to save lives or end them?