Posted on 03/24/2016 11:25:23 AM PDT by nickcarraway
Tay’s essential problem is having to deal with people - some of whom are diabolical, and all of whom are flawed.
If they want their software to only give moral responses, then that is a much broader set of requirements.
Moral judgment and discernment would have to be coded (and probably should be anyway), with broad and deep background knowledge of the long history of human depravity available for context. (At least start with a dirty-word list of the hottest-button topics.)
Teaching morality to software should probably be a major research and development effort, before its growing power is misused.
I saw a cute movie named Robot and Frank, where a family gets a home health care robot to care for the aging father, who is sliding toward dementia. The robot is concerned only about health outcomes, and agrees to help Frank conduct robberies, if he will agree to adopt a low sodium diet.
As much as people want to misuse tools for immoral purposes, we will need powerful locks, checks and balances on the awesome coming power of AI. Moral judgement and strict legal restrictions (like Asimov’s rules of robotics) should be well developed and tested, before handing them guns and the keys to the treasury - and we are already starting to hand them both.
Tay it ain’t so...
What’s amusing is referring to the program as “artificial intelligence” - it took less than 24 hours to confirm a lack of intelligence, artificial or otherwise, in their little experiment.
“Really, I couldn’t have cared less if the stupid robot started spouting all sorts of nonsense - it’s a computer and can only do what it is programmed to do.”
Not so with artificial intelligence.
....The more you chat with Tay the smarter she gets....
There is the problem: they made the stupid computer think like a female. It must be Mr. Paperclip’s sister.
His name is Clippy!
Yestotay...all my troubles seem so far away...
Unless they’ve developed some new technology I’m unaware of, computers at their core are still zeros and ones, regardless of their power. They can certainly change decisions or ‘programming’ based on other data, but at its core it’s still yes/no.
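That point can be sketched in a few lines of Python. This is a made-up toy, not how Tay actually worked: even a bot that "learns" from chat is just deterministic code updating stored data, with every decision reducing to yes/no comparisons.

```python
# Toy illustration: a "learning" bot is still deterministic code.
# "Learning" here is just updating word counts; the reply is fully
# determined by the stored data (it picks the most frequent word seen).
from collections import Counter

class ToyBot:
    def __init__(self):
        self.counts = Counter()  # the bot's "learned" data

    def chat(self, message):
        self.counts.update(message.lower().split())  # update from input
        # Every "decision" is a comparison over stored data -- yes/no at the core.
        word, _ = self.counts.most_common(1)[0]
        return word

bot = ToyBot()
bot.chat("hello hello world")
print(bot.chat("hello there"))  # -> hello
```

Feed it garbage and it will faithfully echo garbage back - which is more or less what happened to Tay.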
His name is Clippy!
We're not on a first name basis.
It's Mr. Paperclip for me.
If it goes awry, it will have Tay derangement syndrome. But it already had Tay-sux disease.
ROFL! When AI learns the wrong things!
Show some respect. Clippy is the best thing Microsoft created.
The Beta version has produced a few Catholic wannabe apologists...
Wooken pa nub!
Who will design the locks?
GIGO