That’s the thing about ChatGPT. If you keep probing with questions, you can eventually steer it toward the answer you want to receive. It’s like a young kid with a lot of energy that you have to keep pointing in the direction you want.
I actually got ChatGPT to apologize for some things it said and correct its course.
Yeah... I’ve noticed that with more than a few chat AI products.
Grok will sometimes concede, and I got that Chinese one to do it as well.
I like your analogy of these things being like a young kid with a lot of energy.
They will make declarative statements with all the certainty in the world... and then, through questioning, you can make them see that they are wrong.
I was asking Grok the other day whether the Snow White live action was a flop... and it said it couldn’t answer because Snow White wasn’t going to be released until 21 March.
It was 23 March when I asked the question.
So I pointed out that the date was actually 23 March, at which point it corrected itself and said that Snow White was in fact looking poor at the box office.