Free Republic
Browse · Search
General/Chat
Topics · Post Article

To: SeekAndFind
OpenAI’s latest software is producing hallucinations a third of the time! And in point of fact, the more advanced the AI system, the MORE lies/hallucinations it produces!

OpenAI’s latest reasoning systems, according to their own report, show hallucination rates reaching 33% for their o3 model and a staggering 48% for o4-mini when answering questions about public figures, more than double the error rate of previous systems…

Source: Techopedia
2 posted on 08/11/2025 10:03:26 AM PDT by SeekAndFind


To: SeekAndFind

AI’s Hallucinations will be accepted as Gospel Truth.


5 posted on 08/11/2025 10:04:53 AM PDT by drwoof

To: SeekAndFind

ChatGPT 5 just came out three days ago. It definitely hallucinates less than version 4 did, but it still hallucinates. I find that if I use the phrase “full power analysis please” or similar, it takes quite a bit longer to answer (half a minute to three minutes, versus a second to perhaps 15 seconds without it), and that phrase really cuts down on the hallucinations. ChatGPT of any version is still totally unusable for serious engineering work. For legal work, it occasionally offers irrelevant legal citations, but if you ask it specifically to check their validity, one by one, it will find more appropriate citations to replace them. It is, however, pretty good at reviewing draft legal documents for everything from spelling to format to argument structure.
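The workflow described above can be sketched as a small helper that prepends the poster’s “full power analysis” phrase to a prompt. This is purely illustrative: the phrase, its effect on latency and hallucinations, and the model name are the poster’s claims, not documented OpenAI behavior, and `build_messages` is a hypothetical helper.

```python
def build_messages(question: str, thorough: bool = True) -> list[dict]:
    """Build a chat message list, optionally prefixing the phrase the
    poster says makes the model take longer and hallucinate less.
    (Assumption from the post, not a documented feature.)"""
    prefix = "Full power analysis please. " if thorough else ""
    return [{"role": "user", "content": prefix + question}]

# With the official openai SDK, the call might look like this (not run here;
# requires an API key, and the model name is taken from the post):
#
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-5",
#       messages=build_messages("Check each of these citations, one by one."),
#   )
#   print(resp.choices[0].message.content)
```

The same pattern fits the citation-checking tip: rather than trusting the first answer, send a follow-up prompt asking the model to verify each citation individually.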


14 posted on 08/11/2025 10:15:11 AM PDT by TruthBringsFreedom



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson