To: dennisw
Isaac Asimov's Three Laws of Robotics:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
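
Read as a spec, the three laws amount to a strict priority ordering. Below is a minimal sketch of that ordering in Python; the Action fields and the permitted() helper are my own illustration, not anything Asimov specified.

from dataclasses import dataclass

# Toy model of a candidate action. Every field here is hypothetical;
# deciding their values is the hard problem the laws leave unsolved.
@dataclass
class Action:
    harms_human: bool          # would the action injure a human?
    prevents_human_harm: bool  # would inaction let a human come to harm?
    ordered_by_human: bool     # was the action ordered by a human?
    self_destructive: bool     # would the action destroy the robot?

def permitted(action: Action) -> bool:
    # First Law outranks everything: never harm, and never stand by.
    if action.harms_human:
        return False
    if action.prevents_human_harm:
        return True
    # Second Law: obey human orders (harmful ones were vetoed above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, last in line.
    return not action.self_destructive

# The ordering at work: the robot must sacrifice itself to save a human.
assert permitted(Action(harms_human=False, prevents_human_harm=True,
                        ordered_by_human=False, self_destructive=True))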

"Robots" killing humans are not robots but autonomous weapon systems.

I know, many will ask: who the hell is Asimov?

12 posted on 08/22/2014 11:38:13 AM PDT by DTA


To: DTA

Hard to believe he started that whole line of thought (or at least first shared it with the public) as far back as 1939. Think of the level of technology available to him when he pulled those ideas together.

http://www.asimovonline.com/oldsite/Robot_Foundation_history_1.html


27 posted on 08/22/2014 12:10:33 PM PDT by thackney (life is fragile, handle with prayer.)

To: DTA
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

That part has all sorts of potential unintended consequences ... as the robots try to prevent us from harming each other or ourselves.

40 posted on 08/22/2014 12:54:54 PM PDT by NorthMountain
