UK government probes algorithm bias in crime, recruitment, and finance
Yahoo Finance UK ^ | March 20, 2019 | Oscar Williams-Grut

Posted on 03/20/2019 7:08:19 AM PDT by Reeses

The UK government is launching an inquiry into whether algorithms used in areas such as criminal justice, recruitment, and finance are biased against people based on gender or race.

The Department for Digital, Culture, Media and Sport (DCMS) announced the new inquiry on Wednesday. It will be led by the Centre for Data Ethics and Innovation, an independent watchdog set up by the department in 2018 to come up with best practice policy on how to police artificial intelligence.

The Cabinet Office’s Race Disparity Unit, which highlights discrimination in public services, will also work on the new review.

The inquiry comes amid the rapid development of artificial intelligence. The technologies surrounding AI have provided big advances, but also thrown up challenges around the fairness and transparency of decision making. Tech giants such as Amazon (AMZN) have already run into issues of algorithmic bias in recruitment and Google (GOOGL) established a team last year to review the ethics of its AI.

“Technology is a force for good and continues to improve people’s lives, but we must make sure it is developed in a safe and secure way,” digital secretary Jeremy Wright said in a statement.

Highlighting examples of potential pitfalls, DCMS said algorithms used to scan CVs and shortlist candidates could discriminate against certain groups of people due to the unconscious, or conscious, bias of programmers who wrote the code.

It also warned that the growing use of algorithms in finance could lead to a lack of transparency when applied to loans.

(Excerpt) Read more at finance.yahoo.com ...


TOPICS: Business/Economy
KEYWORDS: ai
A big problem with neural networks is that they are inherently racist, sexist, and politically incorrect. Worse, there is no source code to examine where they might be "corrected"; the pattern recognition is everywhere and cannot be removed. Laws will soon be passed to ban them from loan approval, hiring, crime solving, really anything to do with human evaluation.
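The "no source code" point can be made concrete. A minimal sketch (Python with numpy, entirely synthetic data): after training, the decision logic of even a tiny network lives in its weight matrices, with no inspectable rule to locate and edit.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic training data: 4 made-up applicant features, and a
    # hidden rule the network is supposed to learn.
    X = rng.normal(size=(200, 4))
    y = ((X @ np.array([1.0, -2.0, 0.5, 0.0])) > 0).astype(float)

    W1 = rng.normal(size=(4, 8)) * 0.1   # the entire "program" is
    W2 = rng.normal(size=(8, 1)) * 0.1   # these two blobs of numbers

    for _ in range(2000):                # plain gradient descent
        h = np.tanh(X @ W1)
        p = 1 / (1 + np.exp(-(h @ W2)))
        g = (p - y[:, None]) / len(X)    # gradient of the log loss
        W2 -= 0.5 * (h.T @ g)
        W1 -= 0.5 * (X.T @ ((g @ W2.T) * (1 - h ** 2)))

    print(W1.round(2))   # rows of numbers, not an editable rule

There is no line reading "if gender == ..." to find and delete; whatever the network picked up is smeared across every weight.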
1 posted on 03/20/2019 7:08:19 AM PDT by Reeses

To: Reeses

AlGore has rhythm?....................


2 posted on 03/20/2019 7:15:34 AM PDT by Red Badger (We are headed for a Civil War. It won't be nice like the last one....................)

To: Reeses

Facts can’t be documented if the left doesn’t like the implications.


3 posted on 03/20/2019 7:15:59 AM PDT by G Larry (There is no great virtue in bargaining with the Devil)

To: Reeses

“The Department for Digital, Culture, Media and Sport (DCMS)”

Well, that’s as Orwellian as it gets. England is so dead. I am sick to death of this race crap.


4 posted on 03/20/2019 7:17:26 AM PDT by gibsonguy

To: Reeses

We know what the results must be. After all, crime percentages are obviously equal for all cultures, so if some culture (we know which one) demonstrates a much much much higher crime rate, the algorithm must be flawed.


5 posted on 03/20/2019 7:20:01 AM PDT by Da Coyote

To: Reeses

Well, if one uses historical data and bases actions on that historical data, then one is more likely to get the same results as before.

The trick is to find the “real” variables to act upon. Not skin color or race, but rather whether the person is a scumbag or not.
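A minimal sketch of that feedback loop, using entirely synthetic numbers (Python): if patrols are allocated in proportion to past arrests, and arrests can only occur where patrols are, the historical allocation reproduces itself regardless of the true offence rates.

    import numpy as np

    # Two synthetic districts with the SAME underlying offence rate.
    offence_rate = np.array([0.10, 0.10])
    patrols = np.array([0.80, 0.20])   # skewed historical allocation

    for year in range(5):
        # An arrest needs both an offence and a patrol present.
        arrests = offence_rate * patrols
        # Next year's patrols follow last year's arrests.
        patrols = arrests / arrests.sum()
        print(f"year {year}: patrol share = {patrols.round(3)}")

The 80/20 split persists indefinitely, "confirmed" every year by the arrest data it generated itself.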


6 posted on 03/20/2019 7:21:50 AM PDT by glorgau

To: glorgau
Not skin color or race, but rather whether the person is a scumbag or not.

You might look at a person's record to determine this, but you may also find that the correlation between whether or not someone is a scumbag and their ethnicity will get you in trouble anyway, even though ethnicity was never an input to the algorithm.
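That is easy to demonstrate. A minimal sketch with synthetic data (Python): the protected attribute z is never an input, but a correlated proxy x (think postcode) carries it into the model anyway.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    z = rng.integers(0, 2, n)              # protected attribute, never a model input
    x = z + rng.normal(0, 0.5, n)          # proxy feature correlated with z
    y = ((x + rng.normal(0, 0.5, n)) > 0.5).astype(float)   # outcome driven by x

    # "Fair" model: fit on x alone, z deliberately excluded.
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept

    print("corr(pred, z) =", round(np.corrcoef(pred, z)[0, 1], 2))   # ~0.7

Dropping the protected column removes it from the code, not from the predictions.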

7 posted on 03/20/2019 7:46:21 AM PDT by 17th Miss Regt

To: Reeses

“an independent watchdog set up by the department in 2018 to come up with best practice policy on how to police artificial intelligence”

Great, now we’re going to have to worry about PC robots. On the upside, if they turn the robots into pajama boys, we might be able to actually defeat our new robot overlords.


8 posted on 03/20/2019 7:56:03 AM PDT by Boogieman

To: Boogieman

When self-driving cars are in crash mode and have to choose between running over a white man or a minority, they will be required by law to run over the white man, except if the man is wearing a dress.


9 posted on 03/20/2019 8:09:04 AM PDT by Reeses (A journey of a thousand miles begins with a government pat down.)

To: Reeses

So they will modify their algorithm to randomly pick white males as suspects to balance things out.

Can’t wait till they start combining that with the ancestry DNA search.

We need to kick in your door and shoot your dog at 3am in the name of fairness.


10 posted on 03/20/2019 8:12:08 AM PDT by fruser1

To: Reeses

Men are more likely to commit crimes, especially violent crimes, than women.

Blacks, whether you blame culture, biology (lower average IQ), or current politics, are more likely to commit crimes than whites and East Asians.

Muslims, due to culture and religion, are more likely to commit crimes like terrorism and rape of non-Muslims.

If the algorithm says these groups are more likely to commit crimes, it is based on DATA and intelligence.

Any algorithm that says an old white woman is as likely to rape as a Muslim man is flawed beyond reason.


11 posted on 03/20/2019 8:27:16 AM PDT by tbw2

To: Red Badger

The problem with using algorithms to correct for so-called implicit bias is that implicit bias itself isn’t a sound concept. Where it may exist, there is NO proof it affects real-world behavior.

Psychology’s Favorite Tool for Measuring Racism Isn’t Up to the Job

Almost two decades after its introduction, the implicit association test has failed to deliver on its lofty promises.
http://nymag.com/scienceofus/2017/01/psychologys-racism-measuring-tool-isnt-up-to-the-job.html


12 posted on 03/20/2019 8:28:45 AM PDT by tbw2

To: Boogieman

It is a new form of oppression by algorithm. You’re white and male? Your resume gets downgraded in the short list because of your demographics, regardless of your qualifications.


13 posted on 03/20/2019 8:29:32 AM PDT by tbw2

To: tbw2

As the old saying goes, “Even observing an experiment introduces bias.”................


14 posted on 03/20/2019 8:30:08 AM PDT by Red Badger (We are headed for a Civil War. It won't be nice like the last one....................)

To: Reeses

Life by Algorithm ... a Progre$$ive Solution for what ails ye? The data is still out, however.

Centre for Data Ethics and Innovation, an independent watchdog set up by the department in 2018

Centre for Data Ethics and Innovation Consultation

https://www.gov.uk/government/consultations/consultation-on-the-centre-for-data-ethics-and-innovation/centre-for-data-ethics-and-innovation-consultation


15 posted on 03/20/2019 9:17:34 AM PDT by NormsRevenge (Semper Fi - Monthly Donors Rock!!!)
