My observation of AI queries is that AI seems to favor arguments or opinions that are either slam-dunk obvious, or where there is a strong consensus among ‘experts’.
The examples I’ve seen don’t display much boldness or independence - which makes it not very useful IMHO - just a very quick way of producing groupthink.
I’m sure AI can eventually be trained to think “outside the box,” but I haven’t seen an example of this, personally.
Maybe this flaw or weakness that I am observing is merely a symptom of the way certain AI platforms have been popularized, rather than a limitation in AI itself.
I also find that asking a question three different times will often get three different answers.
And sometimes it simply cannot understand and follow the simplest directions. For example, tell it to give simple Yes/No answers, no extra text. It’s just like a kid who won’t give the Yes/No answer you asked for.