Free Republic
Browse · Search
General/Chat
Topics · Post Article

To: suasponte137

That’s the thing about chatgpt. If you keep probing with questions you can direct it eventually to the answer you want to receive. It’s like a young kid with a lot of energy that you have to keep directing in the direction you wish.

I actually got ChatGPT to apologize for some things it said and correct its course.


13 posted on 03/26/2025 10:38:12 AM PDT by dmzTahoe
[ Post Reply | Private Reply | To 10 | View Replies ]


To: dmzTahoe

Yeah...I notice that with more than a few chat AI products.

Grok will sometimes concede, and I got that Chinese one to do it as well.

I like your analogy of these things being like a young kid with a lot of energy.

They will make declarative statements with all the certainty in the world.....and then through questioning...you can make them see that they are wrong.

I was asking Grok something the other day about whether the Snow White live action was a flop....and it said that it couldn’t answer because Snow White was not going to be released until 21 March.

It was 23 March when I asked the question.

So I then pointed out that the date was actually 23 March, at which point it corrected itself and said that Snow White was in fact performing poorly at the box office.


19 posted on 03/26/2025 10:56:00 AM PDT by suasponte137
[ Post Reply | Private Reply | To 13 | View Replies ]



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson