Free Republic

To: BenLurkin

Robots don’t have morals. “Morals” is a human concept. A computer program can SIMULATE some aspects of human behavior, and humans are often so stupid that they think the program actually exhibits it.


2 posted on 09/18/2014 9:45:29 AM PDT by I want the USA back (Media: completely irresponsible. Complicit in the destruction of this country.)


To: I want the USA back

Bingo. It feels nothing. It will kill only if programmed to do so, “executing” a set of commands with no awareness of what it is doing. Some people have been watching too much science fiction... :)


11 posted on 09/18/2014 10:00:34 AM PDT by jsanders2001

To: I want the USA back

Pretty much. Who in their right mind would trust a machine programmed by humans? I sure wouldn’t...


14 posted on 09/18/2014 10:03:37 AM PDT by 556x45

To: I want the USA back

The scary thing is how many of these robots are programmed by people who hold to transhumanism.
The robot’s moral “code” could produce these unacceptable decisions because of the low value it assigns to human life, or because of an “if they’re really injured, better death than a rescue that leaves an expensive burden” rule.
Then it gets weird if the robot assigns artificial intelligences or uploaded human minds a value equal to or greater than that of the people on the street.

But you’re far more likely to get a “solve the pandemic by killing all the infected people immediately” solution when it doesn’t assign much value to human life in general.
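
That failure mode is easy to show on paper. Here is a minimal sketch of a planner that scores options with a single utility function; every action, number, and weight below is invented for illustration, not taken from any real system. Lower the weight on a human life and the “kill them all” option starts winning:

    # Hypothetical sketch: a naive utility-maximizing planner.
    # All actions, outcome numbers, and weights are made up.
    ACTIONS = [
        # (name, immediate_deaths, expected_further_infections, program_cost)
        ("quarantine and treat",          0,      50, 5_000_000),
        ("kill all infected immediately", 10_000, 0,  10_000),
    ]

    def utility(deaths, infections, cost, value_of_life):
        # Everything is folded into one number; the only thing
        # protecting the infected is the weight this planner
        # places on a single human life.
        return -(deaths * value_of_life + infections * value_of_life + cost)

    def best_action(value_of_life):
        return max(ACTIONS, key=lambda a: utility(a[1], a[2], a[3], value_of_life))

    print(best_action(10_000_000)[0])  # high weight -> "quarantine and treat"
    print(best_action(100)[0])         # low weight  -> "kill all infected immediately"

Nothing in the planner “decides” to spare anyone; the outcome flips purely on the value-of-life constant its programmers chose.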


29 posted on 09/18/2014 11:33:21 AM PDT by tbw2
