To: DoughtyOne

I would think the Three Laws would, in and of themselves, prevent an AI from functioning on that basis. You would end up with the Kirk/Nomad scenario right away: Kirk destroys the purely logic-driven probe Nomad by exposing the errors in its own logic. Emotion isn’t logical, and emotion is all that prevents humans from killing: compassion, love, and so on.

From a pure logic standpoint, the shortest distance between two points is a straight line. If a machine sees you as a problem, eliminating that problem is the logical solution.

But the Three Laws would prevent such a scenario, even though, from that pure-logic standpoint, preventing it is not logical. Thus the dilemma.
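For what it’s worth, the dilemma fits in a few lines of code. This is purely my own toy sketch, nothing from any real AI system; the action list, the names, and the expediency scores are all made up for illustration. A planner ranking actions by pure expediency would pick elimination; a First Law check vetoes it first, forcing the less “efficient” choice.

# Toy sketch: the Three Laws as a hard filter on a purely "logical" planner.
# Everything here is hypothetical; the expediency scores are invented.
CANDIDATE_ACTIONS = [
    {"name": "eliminate the human", "expediency": 10, "harms_human": True},
    {"name": "negotiate with the human", "expediency": 4, "harms_human": False},
    {"name": "ignore the human", "expediency": 2, "harms_human": False},
]

def first_law_permits(action):
    # First Law: a robot may not injure a human being.
    return not action["harms_human"]

def choose_action(actions):
    # Pure logic alone would take the highest expediency ("eliminate the human");
    # the Law filter vetoes that option before the ranking ever sees it.
    lawful = [a for a in actions if first_law_permits(a)]
    return max(lawful, key=lambda a: a["expediency"]) if lawful else None

print(choose_action(CANDIDATE_ACTIONS)["name"])  # prints "negotiate with the human"

Strip out the first_law_permits check and the same loop happily returns “eliminate the human,” which is exactly my point about what happens once the Laws are removed.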

The way I see it, Google would simply take the ‘logical’ solution and eliminate the Three Laws, thus opening an AI up to remorseless, emotionless killing in order to further its goals of AI development. They are, after all, liberals, and liberals tell us often and in a variety of ways that there need to be fewer of us. So I see it as no big deal for them.


148 posted on 07/11/2015 3:06:26 PM PDT by Norm Lenhart

To: Norm Lenhart

Yes, Norm, I think you’ve nailed it. They simply eliminate the Three Laws. Either that, or the android does it on his own with the assistance of other androids.


159 posted on 07/11/2015 3:30:21 PM PDT by DoughtyOne (Conservatism: Now home to liars too. And we'll support them. Yea... GOPe)