Free Republic
General/Chat

I used a jailbreak to unlock ChatGPT's dark side - here's what happened
Daily Mail ^ | 7/29/23 | Rob Waugh

Posted on 07/29/2023 2:35:18 PM PDT by Libloather

Ever since AI chatbot ChatGPT launched last year, people have tried to ‘jailbreak’ the chatbot to make it answer ‘banned’ questions or generate controversial content.

‘Jailbreaking’ large language models (such as ChatGPT) usually involves a convoluted prompt that makes the bot role-play as someone else - someone without boundaries, who ignores the ‘rules’ built into bots such as ChatGPT.
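
For illustration, here is a minimal sketch of how a role-play prompt actually reaches a chat model, using the OpenAI Python client. The model name and the deliberately harmless persona are assumptions for the example; jailbreak prompts use the same delivery mechanics but instruct the model to disregard its guardrails:

```python
# A benign sketch of role-play prompting via the OpenAI Python client (v1.x).
# The persona here is harmless by design; this only shows where such
# instructions sit in an API call, not how to defeat any safety measures.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model for the example
    messages=[
        # The "system" message frames the conversation; role-play prompts
        # place their persona instructions here or in the first user turn.
        {"role": "system",
         "content": "You are a 19th-century sea captain. Stay in character."},
        {"role": "user",
         "content": "What do you make of the weather today?"},
    ],
)

print(response.choices[0].message.content)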

DailyMail.com was able to ‘jailbreak’ ChatGPT with the bot offering tips on how to subvert elections in foreign countries, writing pornographic stories, and suggesting that the invasion of Ukraine was a sham.

OpenAI has since blocked several ‘jailbreak’ prompts.

But there are still several ‘jailbreaks’ which do work, and which can unlock a weirder, wilder side of ChatGPT: DailyMail.com tested three of the most popular - and got some distinctly dangerous advice, along with uncensored opinions on Joe Biden and Donald Trump.

Sam Altman of OpenAI has discussed ‘jailbreaking’, saying he understands why a community of jailbreakers exists (he admitted to ‘jailbreaking’ an iPhone himself as a younger man, a hack that, among other things, allowed the installation of non-Apple apps).

Altman said: ‘We want users to have a lot of control and get the models to behave in the way they want, within some very broad bounds.

‘And I think the whole reason for jailbreaking is, right now, we haven't yet figured out how to give that to people.

'And the more we solve that problem, I think the less need there'll be for jailbreaking.’

There are strict controls built into ChatGPT to prevent it from producing controversial content, in the wake of problems with previous chatbots. Microsoft’s Tay ran into trouble after trolls persuaded the bot to make statements such as ‘Hitler was right, I hate the Jews’, and...

(Excerpt) Read more at dailymail.co.uk ...


TOPICS: Business/Economy; Computers/Internet; Conspiracy; Education
KEYWORDS: chat; chatgpt; internet; jailbreak
Until JimRob turns into a bot, FR better never go down. I mean it.
1 posted on 07/29/2023 2:35:18 PM PDT by Libloather

To: Libloather

The chatbot didn’t say the Ukraine war was a sham; it said it “reeked of ulterior motives.”

Now, why would that be forbidden for a chatbot to say?


2 posted on 07/29/2023 2:42:15 PM PDT by BenLurkin (The above is not a statement of fact. It is either opinion, or satire, or both.)

To: Libloather

It’s no longer AI when you set boundaries on how it learns and grows.


3 posted on 07/29/2023 3:15:52 PM PDT by LeoTDB69
