Posted on 09/22/2019 4:35:42 PM PDT by ransomnote
UK local councils and police forces are feeding the personal data they hold into commercially bought algorithms to pre-empt crimes against children, but much could go wrong with such a system.
New research by Cardiff University and Sky News shows that at least 53 UK local councils and 45 of the country's police forces rely heavily on computer algorithms to assess the risk of crimes against children, as well as to detect benefit fraud. The findings have raised eyebrows over both the method's ethical implications and its effectiveness, with references to Philip K. Dick's concept of precrime inevitable.
The algorithms, which the authorities sourced from IT companies, use the personal data already in their possession to train an AI system to predict how likely a child in a given social environment is to become the victim of a crime. Each child is given a score between 1 and 100, which is then classified as high, medium, or low risk. The results are used to flag cases to social workers so they can intervene before a crime is committed. This does not sound too dissimilar to the famous Social Credit system that China is building on a national scale, though without the rewards for good behaviour, such as faster housing loans or good schools for the kids.
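The article gives only the outline of these systems: a score from 1 to 100, banded into high, medium, or low risk, which is then used to flag cases to social workers. As a rough illustration, here is a minimal Python sketch of that scoring-and-banding step. The feature names, weights, and the 50/80 cut-offs are all assumptions for illustration; the vendors' actual models and thresholds are not public.

```python
# Minimal sketch of the kind of risk-banding pipeline the article describes.
# The real systems' models, features, and thresholds are not disclosed; the
# feature names and the 50/80 cut-offs below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ChildRecord:
    school_absences: int        # days absent this year (assumed feature)
    prior_referrals: int        # prior social-services referrals (assumed feature)
    household_on_benefits: bool # assumed feature

def risk_score(record: ChildRecord) -> int:
    """Toy stand-in for a vendor model: returns a score from 1 to 100."""
    score = 1
    score += min(record.school_absences, 40)       # cap each signal's weight
    score += min(record.prior_referrals * 15, 45)
    score += 14 if record.household_on_benefits else 0
    return min(score, 100)

def risk_band(score: int) -> str:
    """Map the 1-100 score to the high/medium/low bands the article mentions.
    The cut-off values are assumptions, not the systems' real thresholds."""
    if score >= 80:
        return "high"
    if score >= 50:
        return "medium"
    return "low"

if __name__ == "__main__":
    child = ChildRecord(school_absences=25, prior_referrals=2,
                        household_on_benefits=True)
    s = risk_score(child)
    print(s, risk_band(s))  # 70 medium: a flag for review, not a verdict
```

Even this toy version makes the critics' point visible: the cut-offs decide who gets flagged, and nothing in the score itself says how they were chosen or validated.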
The Guardian reported last year that data from more than 377,000 people had been used to train algorithms for similar purposes, which may be a significant underestimate of the scope. The Cardiff University research disclosed that in Bristol alone, data on 54,000 families, covering benefits, school attendance, crime, homelessness, teenage pregnancy, and mental health, is being used in the computer tools to predict which children are more susceptible to domestic violence, sexual abuse, or going missing.
On the benefits assessment side, the IT system supporting the Universal Credit scheme has failed to win much praise. A few days ago, computer-generated warning letters were sent to many residents in certain boroughs, telling them their benefits would be withdrawn because they had been found cheating. Almost all of the warnings turned out to be wrong.
MORE AT LINK
Police trial AI software to help process mobile phone evidence
Does it identify all the girls being groomed by Muslims for sex trade use or just white British guys?
What could go wrong?
Some environments really are red flags and need evaluation. But who watches the watchmen?
“Family has unacceptable religious beliefs.”
I.e., the family consists of devout Christians who are frequently seen attending church.
Arrest them!
_Minority Report_ come to life; just wait until Google's new "Skynet" comes online.
“Guilty! Read the charges.” — No.2, “Dance of the Dead,” _The Prisoner_
The people in charge there thought _1984_ was a manual.
Aw, so there is a silver lining to being on the downhill side of life.