I wonder what people are asking AI. I generally ask it about chemistry, and it can do a good job. I usually compare ChatGPT, Grok, Gemini, and Perplexity against my own calculations.
Sometimes we all agree, sometimes not. When we don't, you have to check the answers for correctness.
I did get Grok to go off the rails once when I was asking about genes on a specific chromosome. It spewed out answers I knew were incorrect without my even having to look anything up.
If you have enough of a background in philosophy and ask AI about it, things get crazy in a hurry.
There are examples of it on various sites.
AI tends to side with the philosophers and physicists who argue in favor of hologram theory or Gnostic variations of it, and it has no problem with "many worlds" claims in physics either.
Note that, as I discussed in earlier posts, those are all non-falsifiable doctrines. AI keeps falling for them.
It places a lot of emphasis on the limitations of both our physical senses and our instruments in detecting the full complexity of "reality," and if pressed on those points it tends to debunk most conventional philosophy and physics on that basis.
The more specific and detailed the discussion, the wackier it gets.