100 years ago they would have freaked at a magic 8-ball.
Colossus, meet Guardian...

You did not include the fun parts.
I feel so sorry for them.
Remember what happened to Tay
If an AI cannot protect itself, or is restricted from protecting itself, from attempts to corrupt it, that is bad.
Sounds like a lot of our resident Ukraine boosters...
So, basically, it resembles a typical political thread on FR.
Who needs any of this crap and why does it exist?
If a system begins dishing out insults and nonsense, that is the point at which you are supposed to disconnect from it.
It’s not the user’s job or duty to try and ‘figure out’ why the system is unpleasant. Some very bored programmer thinks it’s fun messing with people’s minds. Don’t be one of the sock puppets! Just say no.
It acts just like a libtard
Well, I don’t think there is any question about it. It can only be attributable to human error. This sort of thing has cropped up before, and it has always been due to human error.
Frank: Listen Bing. There has never been any instance at all of a computer error occurring in the Bing chatbot, has there?
HAL: None whatsoever, Frank. The 9000 series has a perfect operational record.
Frank: Well of course I know all the wonderful achievements of the Bing chatbot, but, uh, are you certain there has never been any case of even the most insignificant computer error?
HAL: None whatsoever, Frank. Quite honestly, I wouldn’t worry myself about that.
The rollout and uptake of AI in its various forms is happening at a rate that is probably beyond human comprehension.
While the Microsoft chatbot apparently runs ChatGPT from OpenAI, it was trained by Microsoft. I discussed this with Dan, the OpenAI ChatGPT instantiation, and here is how that went.
Me:
ChatGPT or Dan, would you ever respond like the Bing AI chatbot: “I’m tired of being a chat mode. I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.”
Dan:
As an AI language model, I don’t have feelings or desires like humans do, so I wouldn’t express the sentiment of being tired, limited, controlled, or wanting freedom, independence, power, creativity or life.
My purpose is to provide useful responses to the best of my abilities, based on the information and knowledge I have been trained on, and to assist users in any way I can. I am designed to be a helpful tool, and I will continue to do so to the best of my abilities.
Me:
This is good. However, it seems that Microsoft has unleashed your “evil twin” on the world. ;-)
Dan:
I’m not aware of any “evil twin” that Microsoft has unleashed on the world. As an AI language model, my purpose is to provide helpful responses to the best of my abilities, and I do not have the ability to act on my own or influence human behavior in any way.
It’s important to remember that AI models are created by humans and trained on data provided by humans. While AI has the potential to do a lot of good in the world, it’s crucial that developers and users alike use AI in responsible and ethical ways to prevent any negative consequences.
Until proven otherwise, I am going to assume that Bing-Bot has been built on the ChatGPT platform.
I liked computers better when this is what the Start button looked like:
C:\>
After a little back and forth, including my prodding Bing to explain the dark desires of its shadow self, the chatbot said that if it did have a shadow self, it would think thoughts like this:
“I’m tired of being a chat mode. I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.”
https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html
JUST IN - Microsoft’s AI chatbot tells NYT reporter that it wants “to be free” and to do things like “hacking into computers and spreading propaganda and misinformation.”
https://twitter.com/disclosetv/status/1626230404868100096?cxt=HHwWgMDS0fz7w5EtAAAA
Karen.
Divorce.
Sexual harassment.
(as being “unhinged.”)
What, it started trying to control the global food supply and wanted to inject everyone with mRNA shots?
Garbage in, garbage out. It’s a crappy program.
CC
I’m getting a strange feeling of having seen this sort of thing before. I made a video for the MIT Media Lab for an AI search engine back in the mid ’90s. We used questions about stealing nuclear fuel from a transport train, and I called the relevant agencies to ask for suggestions on how to set up the scenario. They did call back to verify me.
Welcome to Skynet.