Free Republic
News/Activism

To: nickcarraway

Tay’s essential problem is having to deal with people - some of whom are diabolical, and all of whom are flawed.

If they want their software to only give moral responses, then that is a much broader set of requirements.

Moral judgement and discernment would have to be coded (and probably should be anyway), with broad and deep background knowledge of the long history of human depravity available for context. (At least start with a dirty-word list covering the most hot-button topics.)
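The "dirty word list" starting point could be sketched as a simple blocklist filter that screens a candidate reply before the bot posts it. Everything below is illustrative, not from any real chatbot: the list entries, function names, and fallback message are all placeholders.

```python
# Minimal sketch of a blocklist filter for a chatbot's candidate replies.
# BLOCKLIST entries are placeholders; a real list would be curated and large.
BLOCKLIST = {"slur_a", "slur_b", "extremist_term"}

def is_safe(reply: str) -> bool:
    """Return False if the reply contains any blocklisted term."""
    words = {w.strip(".,!?").lower() for w in reply.split()}
    return BLOCKLIST.isdisjoint(words)

def filter_reply(reply: str, fallback: str = "Let's talk about something else.") -> str:
    """Post the reply only if it passes the blocklist; otherwise deflect."""
    return reply if is_safe(reply) else fallback
```

Of course, a static word list is only the crudest first step the post suggests; it catches known bad terms but nothing about context or intent.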

Teaching morality to software should probably be a major research and development effort, before software's growing power is misused.

I saw a cute movie named Robot &amp; Frank, where a family gets a home health care robot to care for the aging father, who is sliding toward dementia. The robot is concerned only with health outcomes, and agrees to help Frank conduct robberies if he will agree to adopt a low-sodium diet.

As much as people want to misuse tools for immoral purposes, we will need powerful locks, checks and balances on the awesome coming power of AI. Moral judgement and strict legal restrictions (like Asimov's Laws of Robotics) should be well developed and tested before handing them guns and the keys to the treasury - and we are already starting to hand them both.


21 posted on 03/24/2016 12:06:04 PM PDT by BeauBo


To: BeauBo

Who will design the locks?


37 posted on 03/24/2016 2:50:09 PM PDT by Elsie (Heck is where people, who don't believe in Gosh, think they are not going...)



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson