
Researchers go after the biggest problem with self-driving cars (database decides who lives & dies)
Axios ^ | 11/01/2017 | Steve LeVine

Posted on 11/16/2017 9:05:21 AM PST by MarchonDC09122009

Researchers go after the biggest problem with self-driving cars

By Steve LeVine November 01, 2017

The biggest difficulty in self-driving cars is not batteries, fearful drivers, or expensive sensors, but what's known as the "trolley problem," a debate over who is to die and who is to be saved should an autonomously driven vehicle end up facing such a horrible choice on the road. And short of that, how will robotic vehicles navigate the countless other ethical decisions, small and large, executed by drivers as a matter of course?

In a paper, researchers at Carnegie Mellon and MIT propose a model that uses artificial intelligence and crowd sourcing to automate ethical decisions in self-driving cars. "In an emergency, how do you prioritize?" Ariel Procaccia, a professor at Carnegie Mellon, tells Axios.

The bottom line: The CMU-MIT model is only a prototype at this stage. But it or something like it will have to be mastered if fully autonomous cars are to become a reality.

"We are not saying that the system is ready for deployment. But it is a proof of concept, showing that democracy can help address the grand challenge of ethical decision making in AI," Procaccia said.

How they created the system: Procaccia's team used a model at MIT called the Moral Machine, in which 1.3 million people gave their ethical vote on around 13 difficult, either-or choices in trolley-like driving scenarios. In all, participants provided 18.2 million answers. The researchers used artificial intelligence to teach their system the preferences of each voter, then aggregated them, creating a "distribution of societal preferences," in effect the rules of ethical behavior in a car. The researchers could now ask the system any driving question that came to mind; it was as though they were asking the original 1.3 million participants to vote again.
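
To make the learning step concrete, here is a minimal sketch (hypothetical code, not the researchers' implementation) of how per-voter preferences could be fit from Moral Machine-style pairwise choices: each scenario is reduced to a small feature vector, and a voter's answers train a simple Bradley-Terry / logistic model whose weights capture that voter's tradeoffs. The feature choices and function names below are assumptions for illustration.

import numpy as np

def fit_voter_model(left_feats, right_feats, chose_left, lr=0.1, epochs=200):
    # Fit one voter's weight vector from pairwise choices.
    # left_feats, right_feats: (n_choices, n_features) arrays describing the two options.
    # chose_left: 1.0 if the voter preferred the left option, else 0.0.
    diffs = left_feats - right_feats               # utility difference per dilemma
    w = np.zeros(diffs.shape[1])
    for _ in range(epochs):
        p_left = 1.0 / (1.0 + np.exp(-diffs @ w))  # P(this voter picks the left option)
        grad = diffs.T @ (chose_left - p_left) / len(chose_left)
        w += lr * grad                             # gradient ascent on the log-likelihood
    return w

# Toy example: features = [pedestrians spared, passengers spared]
left = np.array([[2.0, 0.0], [1.0, 0.0], [3.0, 0.0]])
right = np.array([[0.0, 1.0], [0.0, 2.0], [0.0, 1.0]])
choices = np.array([1.0, 0.0, 1.0])                # this voter usually spares pedestrians
print("learned tradeoff weights:", fit_voter_model(left, right, choices))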

A robot election: "When the system encounters a dilemma, it essentially holds an election, by deducing the votes of the 1.3 million voters, and applying a voting rule," Procaccia said. He said, "This allows us to give the following strong guarantee: the decision the system takes is likely to be the same as if we could go to each of the 1.3 million voters, ask for their opinions, and then aggregate their opinions into a choice that satisfies mathematical notions of social justice."
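
The "election" itself can be sketched in the same spirit (again hypothetical, assuming the per-voter weight vectors learned above): for a new dilemma, the system predicts how each voter would choose and applies a voting rule. Plurality is used below for simplicity; the actual system applies a rule chosen to satisfy the mathematical notions of social justice Procaccia describes.

import numpy as np

def virtual_election(voter_weights, option_features):
    # voter_weights: (n_voters, n_features), one learned model per voter.
    # option_features: (n_options, n_features), one row per possible action.
    utilities = voter_weights @ option_features.T  # predicted utility of each option for each voter
    votes = utilities.argmax(axis=1)               # each voter's predicted choice
    tally = np.bincount(votes, minlength=option_features.shape[0])
    return tally.argmax(), tally                   # winning option and the vote counts

# Toy dilemma: option 0 spares two pedestrians, option 1 spares one passenger.
voters = np.array([[1.0, 0.2], [0.3, 1.5], [0.8, 0.4]])  # three learned voter models
options = np.array([[2.0, 0.0], [0.0, 1.0]])
winner, tally = virtual_election(voters, options)
print("winning option:", winner, "vote tally:", tally)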


TOPICS: Business/Economy; Culture/Society; News/Current Events
KEYWORDS: autonomous; car; safety
To: MarchonDC09122009
Researchers go after the biggest problem with self-driving cars

___

............ no human drivers

41 posted on 11/16/2017 10:13:21 AM PST by a little elbow grease (...... Ralph Cindrich lives .....and can still wrestle......STICKLY)

To: TexasGunLover

A ‘special privilege on the road’ is not inherent in the vehicle you buy.

It’s the privilege of going faster than everyone else, it’s the privilege of shunting all of the other cars out of your way when it suits you, it’s the privilege of being better than everyone else in the eyes of the law.


42 posted on 11/16/2017 10:16:39 AM PST by MeganC (Democrat by birth, Republican by default, Conservative by principle.)

To: Responsibility2nd

“But to fear an AI car over the millions of idiot drivers on the road today is painfully stupid thinking.”

I don’t mind if others have them.

I just don’t want one. And will not have one as long as I have a choice.

Which will probably be as long as I live.


43 posted on 11/16/2017 10:17:08 AM PST by Mariner (War Criminal #18)

To: Mozzafiato

The AI is based on the opinions of 1.3 million people. Presumably, at least half of them are women. So perhaps the result will be to throw the train full of passengers off the cliff.


I doubt that even if they were ALL women it would. The reason is simple. When they answered the questionnaire, they were not in the actual “emotional response mode” they would be in if it really happened. This actually illustrates the beauty of this approach.


44 posted on 11/16/2017 10:20:21 AM PST by robroys woman (So you're not confused, I'm male.)

To: MarchonDC09122009

I knew this was going to happen. Fast forward to 2050. The naacpmah (national association of colored people, m*slims, and America-haters) protests because the algorithm unfairly targets non-Whites. Since Whites in 2050 constitute 37.2% of the population, they are still considered the oppressors, the majority, the rich, the haves. It seems they are not dying in the right proportion. Road deaths come out to 21% for Whites, and 79% for non-Whites. Congress goes into session, and passes a law to adjust the algorithm so that 60% of deaths in accidents are Whites, and 40% non-Whites. Transsexuals are to be spared, and breeders, the binary kind, are to be given the thumbs down, where possible. Automatic reparations are to be added to EBT cards for all non-Whites and all non-binary breeders.

Databases don’t decide. Programs do.


45 posted on 11/16/2017 10:21:29 AM PST by I want the USA back (It's Ok To Be White. White Lives Matter. White Guilt is Socially Constructed)

To: robroys woman

If you don’t think the systems will be written to privilege the privileged, you are kidding yourself.


46 posted on 11/16/2017 10:31:11 AM PST by 9YearLurker

To: 9YearLurker

Anything’s possible.

Maybe, just as you can get a performance chip for your car's computer to soup up your engine, you could get a "protect me at all costs" chip. Amazon Prime, $99. I'd buy it. It's the way I already drive. :)


47 posted on 11/16/2017 10:40:03 AM PST by robroys woman (So you're not confused, I'm male.)

To: D Rider
Would kids do this? You betcha.

Your comment reminds me of this: After racist tweets, Microsoft muzzles teen chat bot Tay

48 posted on 11/16/2017 10:41:06 AM PST by kosciusko51

To: TexasGunLover

Maybe we need to discuss just how far a private entity's intellectual property and privacy privileges should be respected when it's been determined they pose a threat to public well-being, i.e., Constitutional rights?

Ubiquitous, monopolistic, anti-competitive social media giants' (Google) censorship of politically incorrect content impedes First Amendment protection.

Voting machine company tabulation algorithms affect the will of the people’s political governance.

Autonomous vehicle manufacturer’s “Moral Machine” live or die decision software affects public transportation safety.

Beware of high-stakes, opaque algorithms that are the result of a company executive's, data scientist's, or mathematician's subjective prejudice and self-interest.
Not everyone in Silicon Valley, Wall Street and D.C. shares the highest ethics, same beliefs, values and interests.


49 posted on 11/16/2017 10:49:04 AM PST by MarchonDC09122009 (When is our next march on DC? When have we had enough?)

To: MarchonDC09122009
Not everyone in Silicon Valley, Wall Street and D.C. shares the highest ethics, same beliefs, values and interests.

I'm not concerned with their ethics. The free market provides a method for handling it, rather than depending on government intrusion.
50 posted on 11/16/2017 10:53:10 AM PST by TexasGunLover ("Either you're with us or you're with the terrorists."-- President George W. Bush)

To: discostu

I agree this is more about mental gymnastics than anything to really get concerned with. Deaths and injuries will plummet in general; these are fringe use cases, to say the least.


51 posted on 11/16/2017 11:04:31 AM PST by fuzzylogic (welfare state = sharing consequences of poor moral choices among everybody)

To: TexasGunLover

RE:
“I’m not concerned with their ethics. The free market provides a method for handling it, rather than depending on government intrusion.”

In an ideal world...
However, what happens when "free-market" companies bribe our government's political parties with campaign cash in order to write legislation that further ensures their market dominance as quasi-media wielding ever more power over our culture and political body?
Should Google, Facebook, and Twitter be permitted to be left/Marxist media organs via selective ("truthiness") censorship?
Do you advocate unlimited power for Google, Facebook, and Twitter?
Or maybe it's possible to consider that incorporation in the USA is a privilege that requires respecting our Constitution and social responsibility...


52 posted on 11/16/2017 11:32:48 AM PST by MarchonDC09122009 (When is our next march on DC? When have we had enough?)

To: discostu

LOL, actually, most of us avoid the object that is closest, not steer towards the one furthest away...

And usually when we do that, we run smack into the one that is second closest!


53 posted on 11/16/2017 11:35:56 AM PST by rlmorel (Liberals: American Liberty is the egg that requires breaking to make their Utopian omelette.)

To: I want the USA back
"...Databases don’t decide. Programs do..."

And just think, with Terabit WAN infrastructure, nanocomputers and HUUUUUUUUUUUUUGE databases that contain everything from your brain scan results, your browsing history, your grades from high school, your driving (in this case non-driving) history, every facebook post and Tweet you have ever made, your lifetime history of tax returns, legal history, voting history, and your IQ...all accessible via GoogleNet...why waste a chance for computers and their programmers to make choices good for society, the environment, and humanity?

Never let a good opportunity go to waste!

While I sleep on my way to work, unaware of the brain tumor diagnosis (didn't get the notice from the hospital AI Patient Informer, which is programmed with human compassion in its voice to tell people they have a tumor) which is curable but will cost approximately half a million dollars in care to get me through it...the algorithm decides that I'm not worth it, and hurtles me towards the rent in the road caused by a washout that wasn't detected, logged, and fed into the system in time to make the car stop.

Everyone wins! (Except me, of course!)

54 posted on 11/16/2017 11:56:39 AM PST by rlmorel (Liberals: American Liberty is the egg that requires breaking to make their Utopian omelette.)

To: RossA

“Ah! Brave new world!”

I don’t want to be part of it. This and sex robots and apostate popes and stuff.


55 posted on 11/16/2017 12:07:59 PM PST by steve86 (Prophecies of Maelmhaedhoc O'Morgair (Latin form: Malachy))

To: D Rider
So a group of teens run into the street forcing the car to make a decision about the greater number of lives saved, and so goes off the side of the road. Would kids do this? You betcha.

teens = "teens" (or "youths/yutes")

Autonomous car users in urban areas, beware!

56 posted on 11/16/2017 12:11:43 PM PST by Disambiguator (Keepin' it analog.)

To: Responsibility2nd
The fear that the State will decide when and where you go.

There is also the fact that any computer system is hackable.

Just imagine a black-hat hacker who wants to cause 'Social Justice', hacks the AI system, and installs malware that will, at a given time, cause all of the autonomous vehicles to crash head-on into another vehicle or stationary object at maximum speed.

I don’t want a self-driving car for that reason alone.

57 posted on 11/16/2017 12:41:23 PM PST by Pontiac (The welfare state must fail because it is contrary to human nature and diminishes the human spirit.)

To: MarchonDC09122009
Given the moral/ethical choice between the following:

1) Running over two blind nuns on the sidewalk

2) Running over five toddlers playing in the street, or

3) Avoiding #1 and #2, and instead seeking out Al Franken and mowing him down wherever he may be;


...then I'd say if they could program the autonomous vehicle to choose #3 every time, we'd have a winner. ;-)

Ham-fisted, Al Franken-type humor intentionally included above for comedic effect.
58 posted on 11/16/2017 12:41:23 PM PST by Milton Miteybad (I am Jim Thompson. {Really.})

To: robroys woman

Nope. Not for the little people.


59 posted on 11/16/2017 12:42:28 PM PST by 9YearLurker

