Hardly a doctor in our hospitals around here is American. The great majority are from India and the Middle East. Pakistan, Iraq, Turkey, Egypt, etc. The entire cardiology department of a small hospital near here is made up of Middle Eastern doctors. Not an American in the bunch. I even wonder if we have many American-trained doctors anymore. Most are from "out of state" or "out of the nation." I'm not saying they don't know what they are doing; in fact, most of the ones I have dealt with seem to know their jobs very well. It is just that it does not seem like many American kids want to try to be doctors any longer. Why? Poor schooling, poor grades, attitudes, costs, what? Why are we not training enough doctors any longer?
Search Twitter for “unmatched American doctors.”
My daughter's boyfriend had a 4.0 in chemistry, great references from doctors on staff at a major research hospital, shadowing of several doctors, etc. He did not get accepted to med school. He is a white American. He's now weighing his options: try again, or go for a master's/PhD and academic research instead. Maybe there's no connection, maybe it's anecdotal, but we have heard that med schools have gone as woke as all the other corporations in their admissions policies, that they do not want white men (and white women are low on the list as well). I imagine hospitals are the same, since hospitals are pretty much owned by big corporations now.
Those kinds of stories tend to seep out, and they would deter qualified young people from trying.