Free Republic

Robot wars: [An attempt to build an ethical robotic soldier]
The Economist ^ | April 17, 2007

Posted on 04/22/2007 9:42:01 PM PDT by 2ndDivisionVet

WAR is expensive and it is bloody. That is why America’s Department of Defence wants to replace a third of its armed vehicles and weaponry with robots by 2015. Such a change would save money, as robots are usually cheaper to replace than people. As important for the generals, it would make waging war less prey to the politics of body bags. Nobody mourns a robot.

The Pentagon already routinely uses robotic aeroplanes known as unmanned aerial vehicles (UAVs). In November 2001 two missiles fired from a remote-controlled Predator UAV killed Muhammad Atef, al-Qaeda’s chief of military operations and one of Osama bin Laden’s most important associates, as he drove his car near Kabul, Afghanistan's capital.

But whereas UAVs and their ground-based equivalents, such as the machinegun-toting robot Swords, are usually controlled by remote human operators, the Pentagon would like to give these new robots increasing amounts of autonomy, including the ability to decide when to use lethal force.

To achieve this, Ronald Arkin of the Georgia Institute of Technology, in Atlanta, is developing a set of rules of engagement for battlefield robots to ensure that their use of lethal force follows the rules of ethics. In other words, he is trying to create an artificial conscience. Dr Arkin believes that there is another reason for putting robots into battle. It is that they have the potential to act more humanely than people. Stress does not affect a robot's judgement in the way it affects a soldier's.

His approach is to create what he calls a “multidimensional mathematical decision space of possible behaviour actions”. Based on inputs that could come from anything from radar data and current position to mission status and intelligence feeds, the system would divide the set of all possible actions into those that are ethical and those that are not. If, say, the drone from which the fatal attack on Mr Atef was launched had sensed that his car was overtaking a school bus, it might have held fire.
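
To make the idea concrete, here is a minimal sketch, in Python, of how a decision space of this kind could filter candidate actions through rules of engagement before lethal force is permitted. The situation fields, the rules and the action names are illustrative assumptions, not a description of Dr Arkin's actual system.

```python
# Hypothetical sketch of an "ethical governor": candidate actions are
# partitioned into permitted and forbidden sets before a weapon may fire.
# All fields, rules and action names below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Situation:
    target_identified: bool       # e.g. confirmed by intelligence feeds
    civilians_nearby: bool        # e.g. a school bus detected by onboard sensors
    inside_engagement_zone: bool  # current position checked against mission orders

def permitted_actions(situation, candidate_actions):
    """Return only the candidate actions the (hypothetical) rules of engagement allow."""
    allowed = []
    for action in candidate_actions:
        if action == "fire":
            # Lethal force only with a confirmed target, no civilians in harm's way,
            # and only inside the authorised engagement zone.
            if (situation.target_identified
                    and not situation.civilians_nearby
                    and situation.inside_engagement_zone):
                allowed.append(action)
        else:
            # Non-lethal actions (track, hold, withdraw) pass through unchecked here.
            allowed.append(action)
    return allowed

# The school-bus scenario from the article: "fire" is filtered out, the rest remain.
print(permitted_actions(Situation(True, True, True), ["fire", "track", "hold"]))
# -> ['track', 'hold']
```

In a real system the rules would be far richer and the sensor inputs uncertain; the point of the sketch is only the partition of possible actions into ethical and unethical sets.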

There are comparisons to be made between Dr Arkin’s work and the famous laws of robotics drawn up by Isaac Asimov, a science-fiction writer, to govern robot behaviour. But whereas Asimov’s laws were intended to prevent robots from harming people in any circumstances, Dr Arkin’s are supposed to ensure only that people are not killed unethically.

Although a completely rational robot may be unfazed by the chaos and confusion of the battlefield, it may make mistakes all the same. Surveillance and intelligence data can be wrong, and conditions and situations on the battlefield can change. But this is as much a problem for people as it is for robots.

There is also the question of whether the use of such robots would lead to wars breaking out more easily. Dr Arkin has started to survey policy makers, the public, researchers and military personnel to gauge their views on the use of lethal force by autonomous robots.

Creating a robot with a conscience may give the military more than it bargained for. To some degree, it gives the robot the right to refuse an order.


TOPICS: Business/Economy; Extended News; Foreign Affairs; War on Terror
KEYWORDS: airforce; army; military; navy
Sounds good, wonder if it will work?
1 posted on 04/22/2007 9:42:04 PM PDT by 2ndDivisionVet

To: 2ndDivisionVet

It would be cool if we had secretly developed an army of robots and were about to unleash them on Iran, like the Manhattan Project.


2 posted on 04/22/2007 9:56:34 PM PDT by garjog (Used to be liberals were just people to disagree with. Now they are a threat to our existence.)

Comment #3 Removed by Moderator

To: EYE33
Nice teeth. Do they replicate? Just kidding....
4 posted on 04/22/2007 10:05:19 PM PDT by Earthdweller (All reality is based on faith in something.)

To: 2ndDivisionVet

5 posted on 04/22/2007 10:12:15 PM PDT by R_Kangel ("Please insert witty tag-line here")

Comment #6 Removed by Moderator

To: EYE33

Nah...not my type. He’s got zits. I hope that doesn’t offend him in an ethical sort of way.


7 posted on 04/22/2007 10:25:02 PM PDT by Earthdweller (All reality is based on faith in something.)

Comment #8 Removed by Moderator

To: 2ndDivisionVet

Will it take an oath to the Constitution?


9 posted on 04/22/2007 10:29:30 PM PDT by BGHater (“Every little bit of good I may do, let me do it now for I may not come this way again.”)

To: BGHater
Will it take an oath to the Constitution?

Sure, if he can be sworn in with his hand on a mechanical engineering text.

10 posted on 04/22/2007 10:38:33 PM PDT by Prokopton

To: EYE33
Liberal ethical robot meets teenage loud mouth.

Teen to robot: "He's got zits!..He's got zi(BAM!!!!)

Robot: "Target eliminated for violation of hate speech code 3007, robotophobia."

11 posted on 04/22/2007 10:40:20 PM PDT by Earthdweller (All reality is based on faith in something.)

To: 2ndDivisionVet

Robots also aren’t restricted by the Geneva Conventions. I think Torboto is the answer.

http://www.youtube.com/watch?v=9KH6Oxb1Q5k


12 posted on 04/22/2007 10:50:33 PM PDT by Reform4Bush

To: 2ndDivisionVet

There's also a movie where a security robot with spinning circular saws goes berserk and does a lot of killing, tearing through walls and the like. Does anyone know the name of that movie so we can find a pic of the robot somewhere?

13 posted on 04/22/2007 11:53:46 PM PDT by Fluke Codewriter (Democracy is a mob-rules mentality, it's like 100 wolves and 1 sheep fighting over what's for dinner)

To: 2ndDivisionVet
Speaking of robot soldiers...
14 posted on 04/23/2007 12:04:43 AM PDT by B-Chan (Catholic. Monarchist. Texan. Any questions?)

To: 2ndDivisionVet; AntiGuv

ping.


15 posted on 04/23/2007 12:06:25 AM PDT by Jedi Master Pikachu ( What is your take on Acts 15:20 (abstaining from blood) about eating meat? Could you freepmail?)

To: 2ndDivisionVet
"Such a change would save money, as robots are usually cheaper to replace than people."

Robots are more expendable than human beings. That should be more of a selling point.

16 posted on 04/23/2007 12:07:54 AM PDT by Jedi Master Pikachu ( What is your take on Acts 15:20 (abstaining from blood) about eating meat? Could you freepmail?)

To: 2ndDivisionVet

Yeah, give robots autonomy, but require that highly trained, elite troops in the field hold their fire until some committee of lawyers at the Pentagon gives the order to fire. Brilliant.


17 posted on 04/23/2007 12:11:36 AM PDT by LibWhacker

To: 2ndDivisionVet
"It is that they have the potential to act more humanely than people."

The potential for impartiality is the crux of the issue with advanced robots. It would be very tempting to put robots in positions of power, such as political leaders, soldiers (as in this case), policemen and judges, where their human counterparts can be tainted by corruption, prejudice, greed, ego, etc. But if so much power is relinquished to robots, they could more easily overthrow humanity.

18 posted on 04/23/2007 12:13:22 AM PDT by Jedi Master Pikachu ( What is your take on Acts 15:20 (abstaining from blood) about eating meat? Could you freepmail?)

To: Jedi Master Pikachu

There is a Catch-22 here.

As long as we don’t want robots to overthrow humans, they are limited to simple office work; without free will they won’t be able to make decisions in more complex cases. But if we somehow give them free will (say, by deleting Asimov’s three laws from their programming), we are putting the noose around our own necks.


19 posted on 04/23/2007 12:22:40 AM PDT by Verdelet (It's not the passport you have, neither the taxes you pay... It's the blood that runs in your veins!)

To: 2ndDivisionVet
Nobody mourns a robot.

Why not? It happens in science fiction all the time.

20 posted on 04/23/2007 12:29:18 AM PDT by AndyTheBear (Disastrous social experimentation is the opiate of elitist snobs.)

