Posted on 02/10/2018 8:38:58 PM PST by Theoria
Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph.
When the person in the photo is a white man, the software is right 99 percent of the time.
But the darker the skin, the more errors arise, up to nearly 35 percent for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology works on people of different races and genders.
These disparate results, calculated by Joy Buolamwini, a researcher at the M.I.T. Media Lab, show how some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition.
In modern artificial intelligence, data rules. A.I. software is only as smart as the data used to train it. If there are many more white men than black women in the system, it will be worse at identifying the black women.
One widely used facial-recognition data set was estimated to be more than 75 percent male and more than 80 percent white, according to another research study.
(Excerpt) Read more at nytimes.com ...
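The excerpt's "data rules" point can be seen in miniature with a toy experiment. Below is a minimal sketch on entirely synthetic data (nothing here touches real faces; the group names, sizes, and numbers are made up for illustration): one classifier is trained on a 9-to-1 mix of two groups whose feature distributions differ, then scored on each group separately. The under-represented group reliably comes out worse.

    # A toy sketch, not the study's method: train on imbalanced synthetic
    # groups, then measure accuracy per group. All data is fabricated.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_group(n, shift):
        # Each group's features sit around a different center ('shift').
        X = rng.normal(shift, 1.0, size=(n, 5))
        y = (X[:, 0] + rng.normal(0, 1.0, n) > shift).astype(int)
        return X, y

    # 900 training samples from group A, only 100 from group B.
    Xa, ya = make_group(900, shift=0.0)
    Xb, yb = make_group(100, shift=2.0)
    model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

    # Score each group on fresh samples; B typically lags A.
    for name, shift in [("A (majority)", 0.0), ("B (minority)", 2.0)]:
        Xt, yt = make_group(1000, shift)
        print(name, "accuracy:", round(model.score(Xt, yt), 3))

Note that the model never sees a group label; the gap emerges purely from how much of each group is in the training mix.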
It isn't dark skin, per se, that confuses the algorithms in the software. So, there isn't a line of code that says, in essence, "If subject's complexion is swarthy, then crash!"
Rather, as the excerpt pointed out, more White men than, e.g., dark-skinned women were used to "train" the software.
So, to balance out the software, they could simply train it on more dark-skinned and female subjects (a rough sketch of that idea follows this post).
Regards,
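For what it's worth, the rebalancing idea in the post above can be sketched in code. This is a hedged illustration on the same kind of synthetic data, using sample reweighting rather than new data collection; with a single shared decision boundary, balancing the weights evens out the two groups' error rates rather than eliminating errors.

    # A sketch of the rebalancing idea on synthetic data: up-weight the
    # under-represented group so the classifier no longer fits mostly to
    # the majority. Groups, sizes, and features are all fabricated.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    def make_group(n, shift):
        X = rng.normal(shift, 1.0, size=(n, 5))
        y = (X[:, 0] + rng.normal(0, 1.0, n) > shift).astype(int)
        return X, y

    Xa, ya = make_group(900, shift=0.0)   # over-represented group
    Xb, yb = make_group(100, shift=2.0)   # under-represented group
    X, y = np.vstack([Xa, Xb]), np.hstack([ya, yb])

    # Weight each group B sample 9x so both groups carry equal total weight.
    w = np.hstack([np.ones(len(ya)), np.full(len(yb), len(ya) / len(yb))])
    model = LogisticRegression().fit(X, y, sample_weight=w)

    for name, shift in [("A", 0.0), ("B", 2.0)]:
        Xt, yt = make_group(1000, shift)
        print(name, "accuracy with reweighting:", round(model.score(Xt, yt), 3))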
My kidlet’s in A.I. He spent 2 yrs. teaching a computer Elizabethan English.
OOOkay. I guess these people have other more important things to do. Lol.
In order to better understand my son’s career choice, I spent several hours last week reading “Machine Learning for Dummies”. It quickly became apparent I need to start with “Machine Learning for Morons”.
It basically boils down to statistics. Data in, data out. And a whole bunch of math.
It’s an extremely straightforward matter of contrast. The lighter the face, the easier it is to distinguish lines and shadows and thus determine shape.
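Whether contrast is really the cause is debatable (the excerpt blames the training mix), but the quantity this post describes is easy to measure. A toy sketch, assuming grayscale images scaled to [0, 1]: mean gradient magnitude as a crude stand-in for how sharply lines and shadows stand out.

    # A crude contrast measure: mean gradient magnitude of a grayscale
    # image. This only quantifies the poster's claim; it doesn't test it.
    import numpy as np

    def mean_gradient_magnitude(gray):
        # gray: 2-D array of pixel intensities in [0, 1]
        gy, gx = np.gradient(gray.astype(float))
        return float(np.hypot(gx, gy).mean())

    # Compressing the same image's dynamic range lowers the measure.
    img = np.random.default_rng(2).random((64, 64))
    print(mean_gradient_magnitude(img))        # original
    print(mean_gradient_magnitude(img * 0.3))  # darker, lower-contrast copy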
It’s not gender, it’s sex, and the software is not racist. This proves, with computer programs, that they all “really do look alike.”
Soul Man comes to mind.
hell no that never happens.
there will be protests in the street because the software ONLY works on WHITE MEN and not blacky.
they will demand restitution be made and the CBC will hold hearings on why blacky was discriminated against.
bills will be introduced on the floor banning black and white pics and only having black and black pics by Rep. Jackson Lee and just passes with kick backs discovered from HP concerning black cartridges and new black paper being shoe horned into the bill.
all of this because white men are recognizable
yeah before killing yo homey yo just need some shinola
Because it is only politically acceptable to test it on white men.
There is far less sexual dimorphism in the faces of Blacks - many of them look far more masculine than the other races.
Hence the need for over-sized posteriors, etc.
“They all look alike”
Ain’t that the truth! I’ve been saying that for years.
And it’s really a problem if facial recognition decides that you’re a gorilla.
“A major flaw in Google’s algorithm allegedly tagged two black people’s faces with the word ‘gorillas’”
http://www.businessinsider.com/google-tags-black-people-as-gorillas-2015-7
Wow! Looking at the photos I can see how a set of algorithms could make that mistake. Getting too close to the lens like that and at that angle creates an overly prognathic image.
They’re not saying face recognition is racist. They’re saying that the algorithms have to be improved because they are largely based on white male features. They’re not looking to abandon face recognition, but to make it more accurate.
Me too.
What exactly is the task?
Is it...
(a) determining the identity of someone based on their picture?
(b) determining the sex of someone based on their picture?
If it’s (a), I would think they would have the same amount of training data per subject, but maybe not...
If it’s (b), is a picture of someone not in the training set given as a test, or are the test pictures confined to the training subjects?
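The distinction matters for (b): if the test photos come from the same subjects used in training, the model can lean on remembered identities instead of generalizing to new faces. A minimal sketch of a subject-disjoint evaluation, on synthetic stand-in arrays, using scikit-learn's GroupShuffleSplit keyed on a subject ID:

    # A sketch of a subject-disjoint split for task (b). Arrays are
    # synthetic stand-ins; only the splitting protocol is the point.
    import numpy as np
    from sklearn.model_selection import GroupShuffleSplit

    rng = np.random.default_rng(3)
    n_photos = 1000
    subject_id = rng.integers(0, 100, size=n_photos)  # ~10 photos per subject
    X = rng.normal(size=(n_photos, 16))               # stand-in face features
    y = rng.integers(0, 2, size=n_photos)             # stand-in sex label

    splitter = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
    train_idx, test_idx = next(splitter.split(X, y, groups=subject_id))

    # No subject appears on both sides, so the test measures generalization
    # to new people rather than recall of people seen in training.
    assert not set(subject_id[train_idx]) & set(subject_id[test_idx])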