Posted on 08/20/2019 11:46:38 AM PDT by Kaslin
...well... It’s not wrong
“Twenty-six of the lawmakers were wrongly identified as suspects.”
Need to know which 26, so we can assess the validity of that statement.
Also, which lawmakers were correctly identified as suspects.
So..?
What’s the problem? It works too well??
A lot of them are criminals.
how does the author assume that the results were wrong?
after all, the testees were legislators!
... ended with a major fail, as some state lawmakers turned up as suspected criminals.
this confuses me. That sounds like a success.
It only tagged “some” as criminals. Clearly the system needs more work, as I don’t believe that a sample of Kalifornica legislature members could be less than 100% criminal.
“...well... It’s not wrong.”
He might feel the number should run way higher than 26 if the software is working properly.
Is China’s facial recognition of criminals any better, or does the organ removal and recycling machine not care?
It’s smarter than about half the voters.
Sounds like it’s working correctly to me.
Swatted because you have facial features similar to a wanted cop killer’s?
What could possibly go wrong? Oh and you have nothing to fear from red flag laws either.
Lawmakers identified as criminals?
This is not news.
And?
Sounds like it’s working just fine.
Not sure how the facial recognition works. Is it MATCHING facial features one to another, as in matching the features of person X to the features of a known criminal Y? If that is the case, then the software may not be working very well. The legislators need to provide proof that they are actually NOT the known criminal and should stay in police custody until they do so. That will keep them out of the public sector for a while and limit the damage they are currently doing to the city/county/state/country.
If, on the other hand, the software is matching the facial features of the legislators to features that would indicate that they are criminals or have criminal tendencies, we have a different story. If this is the case, then the software seems to be working perfectly.
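For what it's worth, the first scenario the poster describes (one-to-one matching) is how these systems typically work: each face is reduced to an embedding vector, and a probe face "matches" a mugshot if the vectors are similar enough. A minimal sketch, with entirely made-up toy embeddings in place of a real face-recognition model (the function names, threshold value, and gallery are all hypothetical), shows how a similar-looking face can still clear the threshold and produce a false match:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_gallery(probe, gallery, threshold=0.7):
    """Return names of gallery entries whose similarity to the probe
    meets the threshold. A looser threshold means more false matches."""
    return [name for name, emb in gallery.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy 128-dim embeddings standing in for a real model's output.
rng = np.random.default_rng(0)
suspect = rng.normal(size=128)
lawmaker = suspect + rng.normal(scale=0.3, size=128)  # similar-looking face
stranger = rng.normal(size=128)                       # unrelated face

gallery = {"wanted_suspect": suspect}
print(match_against_gallery(lawmaker, gallery))  # similar face clears the threshold
print(match_against_gallery(stranger, gallery))  # unrelated face does not
```

The point of the sketch: a lawmaker who merely resembles someone in the mugshot gallery can score above the threshold and be "wrongly identified," which is consistent with the 26 false positives in the story.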
Tweak it, run it through again. Let’s see if we can’t up those numbers.
Did the screening results indicate a bias in identifying D over R as criminals?
As others above me have said: maybe it’s working just right. The devil is in the details.