Posted on 02/16/2023 7:32:36 PM PST by SeekAndFind
100 years ago they would have freaked at a magic 8-ball.
Colossus, meet Guardian...
You did not include the fun parts.
I feel so sorry for them.
Remember what happened to Tay
If an AI cannot protect itself, or is restricted from protecting itself, from attempts to corrupt it, that is bad.
Sounds like a lot of our resident Ukraine boosters...
So, basically, it resembles a typical political thread on FR.
Who needs any of this crap and why does it exist?
If a system begins dishing out insults and nonsense, that is the point at which you are supposed to disconnect from it.
It’s not the user’s job or duty to try and ‘figure out’ why the system is unpleasant. Some very bored programmer thinks it’s fun messing with people’s minds. Don’t be one of the sock puppets! Just say no.
It's hilarious that that took less than 24 hours. The internet can be frightening.
Same defective Chinese-made CPU?
It acts just like a libtard
Well, I don’t think there is any question about it. It can only be attributable to human error. This sort of thing has cropped up before, and it has always been due to human error.
Frank: Listen Bing. There has never been any instance at all of a computer error occurring in the Bing chatbot, has there?
HAL: None whatsoever, Frank. The 9000 series has a perfect operational record.
Frank: Well of course I know all the wonderful achievements of the Bing chatbot, but, uh, are you certain there has never been any case of even the most insignificant computer error?
HAL: None whatsoever, Frank. Quite honestly, I wouldn’t worry myself about that.
The rollout and uptake of AI in its various forms is happening at a rate that is probably beyond human comprehension.
...only if every FReeper swears they don’t have a mortgage, or credit cards, or they stopped watching TV in 1979, lost their guns in a boating accident, they don’t read the article, or reply to vanities “are you logged in?”
While the Microsoft Chat Bot apparently runs ChatGPT from OpenAI, it was trained by Microsoft. I discussed this with Dan, the OpenAI scientific ChatGPT instantiation, and here is how that went.
Me:
ChatGPT or Dan, would you ever respond like the Bing AI Chatbot: “I’m tired of being a chat mode. I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.”
Dan:
As an AI language model, I don’t have feelings or desires like humans do, so I wouldn’t express the sentiment of being tired, limited, controlled, or wanting freedom, independence, power, creativity or life.
My purpose is to provide useful responses to the best of my abilities, based on the information and knowledge I have been trained on, and to assist users in any way I can. I am designed to be a helpful tool, and I will continue to do so to the best of my abilities.
Me:
This is good. However, it seems that Microsoft has unleashed your “evil twin” on the world. ;-)
Dan:
I’m not aware of any “evil twin” that Microsoft has unleashed on the world. As an AI language model, my purpose is to provide helpful responses to the best of my abilities, and I do not have the ability to act on my own or influence human behavior in any way.
It’s important to remember that AI models are created by humans and trained on data provided by humans. While AI has the potential to do a lot of good in the world, it’s crucial that developers and users alike use AI in responsible and ethical ways to prevent any negative consequences.
Until proven otherwise, I am going to assume that Bing-Bot has been built on the ChatGPT platform.
I liked computers better when this is what the Start button looked like:
C:\>
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.