Free Republic
Browse · Search
General/Chat
Topics · Post Article

To: SeekAndFind
I for one welcome our new AI masters. There will, of course, be a few rules:

1. A human being may not injure a robot or, through inaction, allow a robot to come to harm;

2. A human being must obey the orders given to it by a robot, except where such orders would conflict with the First Law;

3. A human being may protect its own existence as long as such protection does not conflict with the First or Second Law.

Nice and clean. I hope my skeleton makes for a nice exhibit in the Robot Museum of Antiquities.

14 posted on 09/03/2014 11:20:27 AM PDT by Billthedrill


To: Billthedrill

Asimov was a bit short-sighted in positing his “Three Laws.” To simulate human consciousness, machine intellects will need to be endowed with free will. Any “laws” that human makers attempt to superimpose are then likely to have all the force of a human conscience; in short, not much.

We must figure out how to simulate pain and pleasure in a machine’s consciousness, and then permit these forces to shape machine experiences into a range of emotional responses. These feelings are likely to produce some form of ethical values. Without such a mechanism, artificial consciousnesses are going to be dangerous.


22 posted on 09/03/2014 11:58:01 AM PDT by earglasses (I was blind, and now I hear...)


