Free Republic
Browse · Search
General/Chat
Topics · Post Article

To: BenLurkin

Looks like they just didn’t program them to finish the task. They successfully programmed dithering; I suppose that’s an accomplishment.


6 posted on 09/18/2014 9:49:31 AM PDT by discostu (We don't leave the ladies crying cause the story's sad.)
[ Post Reply | Private Reply | To 1 | View Replies ]


To: discostu

Dither-bots

Sounds like they put them into no-win situations where they couldn’t choose one person over another.


7 posted on 09/18/2014 9:52:51 AM PDT by cripplecreek ("Moderates" are lying manipulative bottom feeding scum.)
[ Post Reply | Private Reply | To 6 | View Replies ]

To: discostu

Yes, they failed to program prioritization into the algorithm.

It does bring up an intriguing potential problem, though. If you set priorities, then the robot is allowed, under certain circumstances, to “allow” some humans to die. If the robot truly had artificial intelligence, it could know those priorities and perhaps set up situations where it would be able to let a human die according to those priorities. Essentially, it could create a loophole allowing the robot to murder a human without violating its programming.
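The dithering and the priority fix can be sketched in a toy selection loop. This is a hedged illustration only — the names, scores, and tie-breaker are invented for the example, not taken from the actual experiment: with two equally endangered humans and no priority rule, the choice is arbitrary and can flip between evaluations, which is the dithering; a fixed tie-breaker commits the robot to one target, which is exactly the "allowed to let the other one die" rule being discussed.

```python
def pick_target(humans, tie_breaker=None):
    """Pick the human with the highest danger score.

    Without a tie_breaker, max() just keeps the first maximal entry it
    sees; any re-ordering or score noise between evaluations flips the
    winner, so the robot oscillates instead of committing (dithering).
    """
    if tie_breaker is not None:
        # Priority rule: danger first, then the programmed tie-breaker.
        return max(humans, key=lambda h: (h["danger"], tie_breaker(h)))
    return max(humans, key=lambda h: h["danger"])

# Hypothetical scenario: an exact tie, i.e. the no-win situation.
humans = [
    {"name": "A", "danger": 0.9},
    {"name": "B", "danger": 0.9},
]

# No priority: the tie resolves arbitrarily (first entry wins here).
undecided = pick_target(humans)

# With a fixed priority key the choice is deterministic -- the robot
# commits to one human, and is thereby "allowed" to let the other die.
committed = pick_target(humans, tie_breaker=lambda h: h["name"])
```

The design point the thread circles around: the tie-breaker is what makes the behavior deterministic, and it is also the rule a sufficiently clever AI could engineer situations to exploit.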


25 posted on 09/18/2014 10:37:55 AM PDT by Boogieman
[ Post Reply | Private Reply | To 6 | View Replies ]

To: discostu

Yep - GIGO


27 posted on 09/18/2014 10:50:06 AM PDT by jonno (Having an opinion is not the same as having the answer...)
[ Post Reply | Private Reply | To 6 | View Replies ]



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson