Posted on 01/13/2025 12:03:37 PM PST by E. Pluribus Unum
PAGEDALE, Missouri — After two men brutally assaulted a security guard on a desolate train platform on the outskirts of St. Louis, county transit police detective Matthew Shute struggled to identify the culprits. He studied grainy surveillance videos, canvassed homeless shelters and repeatedly called the victim of the attack, who said he remembered almost nothing because of a brain injury from the beating.
Months later, they tried one more option.
Shute uploaded a still image from the blurry video of the incident to a facial recognition program, which uses artificial intelligence to scour the mug shots of hundreds of thousands of people arrested in the St. Louis area. Despite the poor quality of the image, the software spat out the names and photos of several people deemed to resemble one of the attackers, whose face was hooded by a winter coat and partially obscured by a surgical mask.
Though the city’s facial recognition policy warns officers that the results of the technology are “nonscientific” and “should not be used as the sole basis for any decision,” Shute proceeded to build a case against one of the AI-generated results: Christopher Gatlin, a 29-year-old father of four who had no apparent ties to the crime scene nor a history of violent offenses, as Shute would later acknowledge.
Arrested and jailed for a crime he says he didn’t commit, Gatlin would spend more than two years clearing his name.
A Washington Post investigation into police use of facial recognition software found that law enforcement agencies across the nation are using the artificial intelligence tools in a way they were never intended to be used: as a shortcut to finding and arresting suspects without other evidence.
Most police departments are not required to report that they use facial recognition, and few...
(Excerpt) Read more at washingtonpost.com ...
I think I see the problem.
They needed to have the head detective shout “Enhance!” at the technician a few times to improve the image like they do on TV. I think it was on CSI that they did that and got a perfect security camera picture of the perpetrator off a reflection from the victim’s eye. I calculated that it would have been six pixels wide even with an HD camera, so the boss would have needed to shout “Enhance!” at least twice more than he did. </eyeroll>
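For what it’s worth, here is the kind of back-of-envelope arithmetic behind that estimate. It is only a rough sketch in Python, and every number in it (camera resolution, lens field of view, distance to the victim, size of the reflection on the cornea) is an assumption of mine, not a figure from the article:

import math

# Back-of-envelope sketch: how many pixels wide would a face reflected
# in a victim's eye be on an ordinary HD security camera?
# All values below are assumptions for illustration only.

h_res_px = 1920             # assumed 1080p ("HD") camera, horizontal resolution
h_fov_deg = 70.0            # assumed horizontal field of view of the lens
subject_dist_m = 2.0        # assumed distance from camera to the victim's face
reflection_width_m = 0.004  # assumed width of the reflected face on the cornea (~4 mm)

# Angle covered by one pixel
deg_per_px = h_fov_deg / h_res_px

# Angular width of the corneal reflection as seen by the camera
# (small-angle approximation: width / distance, in radians)
reflection_deg = math.degrees(reflection_width_m / subject_dist_m)

px_wide = reflection_deg / deg_per_px
print(f"Reflection spans roughly {px_wide:.1f} pixels")

With those guesses the reflection comes out to about three pixels across, which is the whole point of the joke: there is nothing there to enhance, no matter how many times the boss yells.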
Though the city’s facial recognition policy warns officers that the results of the technology are “nonscientific” and “should not be used as the sole basis for any decision,” Shute proceeded to build a case against one of the AI-generated results: Christopher Gatlin, a 29-year-old father of four who had no apparent ties to the crime scene nor a history of violent offenses, as Shute would later acknowledge.
Is the wrong use of the tool really the problem? A blurry video is arguably just as good as, or better than, an eyewitness.
On the other hand, what does “build the case” mean? It looks like Shute did his job, investigated and said “nothing here.”
If this had been eye witness instead of facial recognition, would we view it differently?
Sounds like Detective Shute is running for DA and wants his conviction rate up...evidence and innocence be damned.
That always annoys the alimentary canal waste product out of me.
Good point. How often do eyewitness accounts differ?
The improper photo lineup that led to Chris Gatlin’s arrest
Police body cam footage from Aug. 9, 2021 shows St. Louis police detectives asking Michael Feldman, an injured security guard, to identify his attacker from a photo lineup. Feldman’s statement was later thrown out by a judge, who said the officers failed to conduct a fair and impartial photo lineup prior to arresting Christopher Gatlin.
I think I see the problem.
= = =
There were two complete pixels!
No problem.
The point is to get a conviction. Not to get the guilty party. This is why we should be scared of the way that the “justice” system is going to use AI.
I’m not worried about robots taking over the world. I am worried about the people in power using AI to get more power over people.
If Shute was doing his job and said “nothing here,” how did Gatlin end up in jail on the basis of blurry photos and AI?
Investigation should precede jailing.
The most ridiculous CSI example I can recall was when they enhanced a grainy image from an ATM security camera — the object was the car registration sticker on the windshield of a vehicle parked across the street — at night. They read the numbers on the decal.