Free Republic


Moral Machines: New Decision-making Approach Based on Computational Logic
Scientific Computing ^ | 9/8/09

Posted on 09/08/2009 1:12:36 PM PDT by null and void

Researchers from Portugal and Indonesia describe an approach to decision making based on computational logic that might one day give machines a sense of morality. Science fiction authors often use the concept of "evil" machines that attempt to take control of their world and dominate humanity. Skynet in the "Terminator" films and Arthur C. Clarke's HAL 9000 from "2001: A Space Odyssey" are two of the most often cited examples.

However, for malicious intent to emerge in artificial intelligence systems, such systems would first need an understanding of how people make moral decisions. Luís Moniz Pereira of the Universidade Nova de Lisboa, in Portugal, and Ari Saptawijaya of the Universitas Indonesia, in Depok, are both interested in artificial intelligence and the application of computational logic.

"Morality no longer belongs only to the realm of philosophers. Recently, there has been a growing interest in understanding morality from the scientific point of view," the researchers say.

They have turned to a system known as prospective logic to help them begin the process of programming morality into a computer. Put simply, prospective logic can model a moral dilemma and then determine the logical outcomes of the possible decisions. The approach could herald the emergence of machine ethics.
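
In code, such a prospective model can be sketched as a toy rule base that maps each candidate decision to the consequences it logically entails. This is a hypothetical illustration only; the researchers' actual system is built on logic programming, and the decision and consequence names below are invented for the sketch.

```python
# Toy "prospective logic" model of the trolley dilemma (illustration only).
# Each decision maps to the set of consequences it logically entails.
RULES = {
    "flip_switch": ["five_survive", "one_dies"],
    "do_nothing":  ["five_die", "one_survives"],
}

def prospect(decision):
    """Return the consequences entailed by a candidate decision."""
    return RULES[decision]

# Enumerate every decision and its logical outcome, as prospective
# logic does before any moral judgement is applied.
for decision in RULES:
    print(decision, "->", prospect(decision))
```

The point of the sketch is that the logic layer only derives consequences; it says nothing yet about which outcome is morally preferable.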

The development of machine ethics will allow us to build fully autonomous machines that can be programmed to make judgements based on a human moral foundation. "Equipping agents with the capability to compute moral decisions is an indispensable requirement," the researchers say. "This is particularly true when the agents are operating in domains where moral dilemmas occur, e.g., in healthcare or medical fields."

The researchers also point out that machine ethics could help psychologists and cognitive scientists find a new way to understand moral reasoning in people and perhaps extract fundamental moral principles from complex situations that help people decide what is right and what is wrong. Such understanding might then help in the development of intelligent tutoring systems for teaching children morality.

The team has developed their program to help solve the so-called "trolley problem." This is an ethical thought experiment first introduced by British philosopher Philippa Foot in the 1960s. The problem involves a trolley running out-of-control down a track. Five people are tied to the track in its path. Fortunately, you can flip a switch, which will send the trolley down a different track to safety. But, there is a single person tied to that track. Should you flip the switch?

The prospective logic program can consider each possible outcome based on different versions of the trolley problem and demonstrate logically what the consequences of the decisions made in each might be. The next step would be to endow each outcome with a moral weight, so that the prototype might be further developed to make the best judgement as to whether to flip the switch.
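
As a hypothetical illustration of that next step, one could assign each consequence a harm weight and pick the decision that minimizes total harm. The weights below are invented for illustration; choosing and justifying them is the hard, open problem the article alludes to.

```python
# Hypothetical moral weighting over trolley-problem outcomes.
# Harm weights are invented for illustration, not from the research.
HARM = {
    "five_die": 5.0,
    "one_dies": 1.0,
    "five_survive": 0.0,
    "one_survives": 0.0,
}

# Same toy decision -> consequences mapping as the dilemma describes.
OUTCOMES = {
    "flip_switch": ["five_survive", "one_dies"],
    "do_nothing":  ["five_die", "one_survives"],
}

def total_harm(decision):
    """Sum the harm weights of every consequence a decision entails."""
    return sum(HARM[c] for c in OUTCOMES[decision])

# Choose the decision with the least total harm.
best = min(OUTCOMES, key=total_harm)
print(best)  # under these weights, flipping the switch minimizes harm
```

Note that a purely utilitarian sum like this is only one candidate weighting scheme; encoding deontological constraints (e.g. "do not actively kill") would require a different structure.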

The research is published in the International Journal of Reasoning-based Intelligent Systems.


TOPICS: Culture/Society
The solution to the trolley problem is obvious...
1 posted on 09/08/2009 1:12:37 PM PDT by null and void
[ Post Reply | Private Reply | View Replies]

To: null and void
I guess nobody ever saw "Colossus: The Forbin Project" or any of the Terminator movies.
2 posted on 09/08/2009 1:15:15 PM PDT by Perdogg (Sarah Palin-Jim DeMint 2012 - Liz Cheney for Sec of State - Duncan Hunter SecDef)
[ Post Reply | Private Reply | To 1 | View Replies]

To: null and void

Garbage in...garbage out!


3 posted on 09/08/2009 1:16:17 PM PDT by mort56 (He who would sacrifice freedom for security deserves neither. - Ben Franklin)
[ Post Reply | Private Reply | To 1 | View Replies]

To: null and void

Evil machines? Anyone been to an NEA convention?


4 posted on 09/08/2009 1:17:34 PM PDT by Seruzawa (Obamalama lied, the republic died.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: null and void

The solution to the trolley problem is obvious...

Precisely! Because the one-person track has a liberal tied to it.


5 posted on 09/08/2009 1:18:09 PM PDT by Cletus.D.Yokel (FreepMail me if you want on the Bourbon ping list!)
[ Post Reply | Private Reply | To 1 | View Replies]

To: null and void
A moral computer from an Islamic country??? Hmmmm, a suicide bomber has a pizza restaurant, day care, or girls' school in front of him; which one does he bomb?
6 posted on 09/08/2009 1:18:25 PM PDT by 2banana (My common ground with terrorists - they want to die for islam and we want to kill them)
[ Post Reply | Private Reply | To 1 | View Replies]

To: null and void

Interesting. If a machine can have the morality of the robot hero in the movie "I, Robot," then the leftists who seek to enslave are in deep doo-doo.

Then again Obama has the morality of the master computer of the same movie...


7 posted on 09/08/2009 1:27:29 PM PDT by longtermmemmory (VOTE! http://www.senate.gov and http://www.house.gov)
[ Post Reply | Private Reply | To 1 | View Replies]

To: null and void
An out-of-control trolley is the result of immoral work ethics to begin with. So... a silk purse cannot be made out of a sow's ear.

8 posted on 09/08/2009 1:30:49 PM PDT by I see my hands (_8(|)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Cletus.D.Yokel

Nope. There is a zero fatality option.


9 posted on 09/08/2009 1:31:30 PM PDT by null and void (We are now in day 230 of our national holiday from reality. - 0bama really isn't one of US.)
[ Post Reply | Private Reply | To 5 | View Replies]

To: All

Queensryche: NM156.....


10 posted on 09/08/2009 1:32:02 PM PDT by Maverick68 (w)
[ Post Reply | Private Reply | To 8 | View Replies]

To: null and void

This situation would have busted any of Isaac Asimov's positronic-brained robots.


11 posted on 09/08/2009 1:33:09 PM PDT by HiTech RedNeck (Proud Sarah-Bot.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: HiTech RedNeck

See post #9.

(Hint: Think outside the implied limits of the scenario)

It would be even simpler for a robot than a human...


12 posted on 09/08/2009 1:36:03 PM PDT by null and void (We are now in day 230 of our national holiday from reality. - 0bama really isn't one of US.)
[ Post Reply | Private Reply | To 11 | View Replies]

To: null and void

The problem, if you want to call it that, is that our morality depends on our empathic proximity to those involved in the moral decisions. We normally care more about friends and family than strangers and more about strangers we can see than strangers we can’t. Further, the disgust that we feel for morally repugnant acts is emotional rather than rational. Any program that doesn’t take that into account won’t work properly. And with regard to killer computers, it’s important to understand that much of what separates psychopaths from normal people is that they lack the empathy and visceral disgust over immoral acts often called a conscience.


13 posted on 09/08/2009 1:41:05 PM PDT by Question_Assumptions
[ Post Reply | Private Reply | To 1 | View Replies]

To: 2banana
Girls' school.

Nothing more frightening to a devout Muslim than girls, especially educated girls.

14 posted on 09/08/2009 1:43:12 PM PDT by null and void (We are now in day 230 of our national holiday from reality. - 0bama really isn't one of US.)
[ Post Reply | Private Reply | To 6 | View Replies]

To: null and void

15 posted on 09/08/2009 1:45:06 PM PDT by The Comedian (Evil can only succeed if good men don't point at it and laugh.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Question_Assumptions

Exactly.

OTOH, if we can get them to fake sincerity, we're halfway there...


16 posted on 09/08/2009 1:45:28 PM PDT by null and void (We are now in day 230 of our national holiday from reality. - 0bama really isn't one of US.)
[ Post Reply | Private Reply | To 13 | View Replies]

To: Question_Assumptions

Yes, formation of conscience requires the capacity for love and compassion.

A logic-only base is doomed to either big mistakes or simple calculations.

Or as Chesterton put it: A madman is not someone who has lost his reason; he is someone who has lost everything but reason.


17 posted on 09/08/2009 2:27:06 PM PDT by D-fendr (Deus non alligatur sacramentis sed nos alligamur.)
[ Post Reply | Private Reply | To 13 | View Replies]

To: null and void

Psychopaths fake sincerity. That’s not a solution.


18 posted on 09/08/2009 7:57:38 PM PDT by Question_Assumptions
[ Post Reply | Private Reply | To 16 | View Replies]

To: Question_Assumptions

It’s a step on the way to humanity...


19 posted on 09/08/2009 8:30:41 PM PDT by null and void (We are now in day 230 of our national holiday from reality. - 0bama really isn't one of US.)
[ Post Reply | Private Reply | To 18 | View Replies]

To: null and void

I think faking it is a step on the way to humanity the way building "airplanes" shaped like birds with flapping wings was a step on the way to powered flight. Making something that simply looks like something else that works doesn't necessarily work.


20 posted on 09/08/2009 8:48:12 PM PDT by Question_Assumptions
[ Post Reply | Private Reply | To 19 | View Replies]



Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.


FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson