“Expert systems” have helped with diagnoses for many years now.
Something I have heard quite often is that people who go into medicine want to work with patients, want to heal people, want to actually do “doctoring”. However, our modern system involves huge amounts of paperwork, for insurance, for affiliated hospitals, for protection against malpractice. It’s just forms, forms, forms. So doctors don’t have time to keep up with the latest advances and sometimes don’t have a clear idea of what they’re doing, but if Big Pharma says “give them this pill”, then the doctor just “gives them the pill” and sends the patient on their way.
And now the “doctors” are a further step removed from anything like “doctoring”. Let’s have the AI diagnose things — I have these forms to fill out. The forms are more important than the patient, after all.
All of our institutions are badly broken.
This is SOP among the DEI crowd.
After COVID this isn’t sounding so bad.
I wonder how good a liar AI is? It might go all HAL9000 on the government if they gave it the COVID job.
It is because the regulators, wholly owned and operated by monied interests with a stake in keeping medicine as expensive as possible, can’t control it. Maybe a diagnosis can be had in minutes rather than through two dozen appointments and expensive tests.
The sad part is, used responsibly, this might be helpful for some doctors and patients — especially those with difficult or unusual symptoms and conditions. Doctors don’t know everything, after all. But we mistrust it because we know there will be charlatans out there who do not use it responsibly.
Diagnosis of even relatively simple systems can be a challenge [Just ask my furnace guy the other day].
Getting a correct diagnosis is valuable in multiple dimensions. [The furnace guy eventually got it totally right on the first trip out to the compound and deserves every penny he earned.]
The human body is not simple in theory nor practice.
As long as the AI diagnosis is not mandatory [CDC, I’m talkin’ ‘bout jou...], I’m open to a 2nd opinion.
Software for EKGs and CTs already spits out a dx, usually the worst-case scenario. It is up to the doctor to take the whole patient and the evidence into context to make the correct diagnosis.
I like the concept of AI as a “help only” in the diagnosis.
1. Dr. makes his best diagnosis first.
2. Dr. then runs it through AI and if AI agrees it is “probably” right.
3. If AI disagrees, the Dr. then reevaluates and goes with AI or his first diagnosis, or considers more tests.
4. Dr. makes the final decision, not AI!
Having an AI make medical diagnoses creates a new dilemma.
If you have a DEI doctor and an AI, who is more likely to make the correct diagnosis?
Medical school should teach social justice and how to input data to HippocrAItes.
I’ve been going to Dr. Alexa for years when I’m sick. LOL!
Note to doctors: And this is the thanks you get for crawling in bed with Deep State.
Now you know how the rest of us feel.
Welcome to the party, pals.
"Regulators see new opportunities to control medical industry by claiming that doctors are abusing AI diagnosis tools."
AI is a tool, not an expert diagnostician. Any doctor who would proceed to treat a patient solely on an AI system diagnosis has bigger problems than just his AI system.
Sooooo...... are doctors nowadays too stupid to have any critical thinking skills? How about conferencing with each other, or is that too hard to do? Nooooo.... let’s all jump on the AI train and let some program that some yahoo somewhere designed determine what the prognosis needs to be.
We’re surrounded by idiots apparently.
“Regulators” (i.e., unelected bureaucrats) become “alarmed” whenever their power and authority to boss others around is threatened. Power over others is most often the primary — if not the sole — reason for their existence.