Posted on 05/25/2022 2:36:23 PM PDT by Roman_War_Criminal
According to new research, deep learning models based on artificial intelligence can identify someone's race merely by looking at their X-rays, something a human doctor examining the same images cannot do.
The findings raise serious questions about AI's role in medical diagnosis, assessment, and treatment: Could computer algorithms mistakenly apply racial bias when analyzing images like these?
An international team of health researchers from the United States, Canada, and Taiwan trained their AI on hundreds of thousands of existing X-ray images annotated with the patient's race, then tested it on X-ray images the program had never seen before.
Even when the scans came from patients of the same age and sex, the AI was able to predict the patient's self-reported racial identity from these images with striking accuracy. On some groups of images, the system reached 90% accuracy.
(Excerpt) Read more at strangesounds.org ...
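The excerpt describes a standard evaluation protocol: fit a classifier on labeled images, then measure accuracy on held-out images the model has never seen. As a minimal sketch of that protocol only (not the researchers' actual pipeline), here is a toy version using synthetic feature vectors in place of X-rays and a simple nearest-centroid classifier; all data and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for image feature vectors: two label groups,
# with a small class-dependent shift so the task is learnable.
n, d = 1000, 64
X = rng.normal(size=(n, d))
y = rng.integers(0, 2, size=n)
X[y == 1] += 0.5  # injected signal, purely illustrative

# Held-out split: the test images are never seen during training.
split = int(0.8 * n)
X_train, y_train = X[:split], y[:split]
X_test, y_test = X[split:], y[split:]

# Nearest-centroid classifier: label each test point by whichever
# class mean (computed on training data only) it is closer to.
c0 = X_train[y_train == 0].mean(axis=0)
c1 = X_train[y_train == 1].mean(axis=0)
d0 = np.linalg.norm(X_test - c0, axis=1)
d1 = np.linalg.norm(X_test - c1, axis=1)
pred = (d1 < d0).astype(int)

acc = (pred == y_test).mean()
print(f"held-out accuracy: {acc:.2f}")
```

The point of the held-out split is that a high accuracy figure, like the 90% quoted above, only means something when measured on data the model was never trained on.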
Could computer algorithms mistakenly apply racial bias when analyzing photographs like these?
These monsters have to make everything about race.
I'm not too concerned about this AI ability.
I would have expected such a degree of sensitivity at some point. Something could make me change my mind and get worried, but not at this time.
I think the racist X-Rays are caused by Global Warming.
CLEARLY X-rays are racist!
It can also be determined from brain scans (fMRI) who is a conservative and who is not, even by human analysts, with a fair degree of accuracy and reliability. Would you be comfortable with an advanced AI scanner at, say, the entrance to a voting site?
“which would be impossible for a human doctor looking at the same photos”
Nonsense. If an anthropologist or forensic pathologist can look at bones and tell the race, sex, and age, and if a dentist can look at only teeth and tell the race, sex, and age, then I’m sure any doctor could learn to tell the same things from an x-ray. They just haven’t bothered to try to learn such a skill, since they have no need of it.
“Could computer algorithms mistakenly apply racial bias when analyzing photographs like these?”
It’s already happened:
https://algorithmwatch.org/en/google-vision-racism/
wait, i thought Anthropologists have been doing it for decades?
programming their criteria into AI shouldn't be all that difficult
especially with known reference imagery...
>> the patient’s claimed racial identification
Good grief
I wrote the algorithm that's going to be used to detect domestic terrorists.
1 Check subject's political party
2 Is the person a Democrat?
3 Yes = Not a terrorist
4 No = Is a terrorist.
I just saved millions of dollars.
Oh brother, science has been doing this forever. It's how they know the race of remains. Race, when I was young, had zip to do with color and everything to do with bone structure. If you're not Asian, Native American, or Black, then you're Caucasian.
-PJ
AI phrenology?
X-Ray Spex were an English punk rock band formed in London in 1976.
During their first incarnation (1976–1979), X-Ray Spex were “deliberate underachievers”[1] and only released five singles and one album.[2] Nevertheless, their first single, “Oh Bondage Up Yours!”, is now acknowledged as a classic punk rock single[3][4][5][6] and the album Germfree Adolescents is widely acclaimed as a classic album of the punk rock genre.[7][8][9][10][11]
https://en.wikipedia.org/wiki/X-Ray_Spex
Initially, the band featured singer Poly Styrene (born Marion[12] Joan Elliott-Said) (alternatively spelled Marian[13] or Marianne[14]) on vocals, Jak Airport (Jack Stafford) on guitars, Paul Dean on bass, Paul ‘B. P.’ Hurding on drums, and Lora Logic (born Susan Whitby) on saxophone. This last instrument was an atypical addition to the standard punk instrumental line-up,[15] and became one of the group’s most distinctive features. Logic played on only one of the band’s records. As she was only fifteen, playing saxophone was a hobby and she left the band to complete her education.[16]
Can AI determine that I identify as a Bornean Elephant type of Republican and am an endangered species? Do they know that I can trample them and that they are prohibited from harming me in any way?
Mistakenly???
How is it a mistake if the results are accurate?
If AI can predict race, can predicting rac-ism be far behind?
No. I would not.
That's placing people in social classes based on statistical assumptions, vs. describing physical bone structures most common to various racial groups.
See post #11 and replace “terrorist” with “racist”.