Whatever AI produces, it is not knowledge.

Gen AI doesn’t generate knowledge.
"Artificial Intelligence" is neither artificial nor intelligent.
It's programming, and from some of the worst programmers out there. One "AI" company, Builder.ai, was already exposed for fraudulently selling "AI" services that reportedly turned out to be the work of some 700 human engineers in India.
https://www.techspot.com/news/108173-builderai-collapses-after-revelation-ai-hundreds-engineers.html
Those 700 "engineers" likely walked out the door after they were let go and got hired at the next scam AI company down the street.
AI will deduce new knowledge faster and more comprehensively than humans will be able to comprehend, among other things.
More likely, AI will REGURGITATE 90% of EXISTING knowledge within two years...
If you ask Grok to summarize an article or a video transcript, isn’t that knowledge?
And if you are worried that the summary is inaccurate, you can submit the article or transcript to other platforms and check it by comparing their summaries (a rough sketch of that comparison follows this comment).
As a businessman, I have always had to read a vast amount of material and scan through hours of videos. Using AI, I am able to cover significantly more informational territory in a fraction of the time.
As my doctor says, AI won't replace doctors. But doctors who use AI will replace doctors who don't. That applies to all of us.
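One rough way to sanity-check a summary, as suggested above, is to collect summaries of the same source from two or more platforms and measure how much they agree. A minimal sketch, assuming you have already pasted each platform's output into the script; the placeholder summaries and the 0.4 threshold are illustrative choices, not part of any platform's API:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Summaries of the same article obtained from different AI platforms
# (placeholder text; paste the real outputs here).
summaries = {
    "grok": "The article argues that ...",
    "other_platform": "The piece claims that ...",
}

# Pairwise similarity of the summaries, from 0.0 (no overlap) to 1.0 (identical).
# SequenceMatcher is a crude character-level comparison, so treat the ratio
# only as a flag for disagreement, not as proof of accuracy.
for (name_a, text_a), (name_b, text_b) in combinations(summaries.items(), 2):
    ratio = SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()
    flag = "check manually" if ratio < 0.4 else "roughly consistent"
    print(f"{name_a} vs {name_b}: similarity {ratio:.2f} ({flag})")
```

A low ratio does not prove either summary is wrong; it only flags that the platforms disagree enough that reading both, or the original, is worth your time.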
90% of the raw data perhaps - or “noise”
What makes us “knowledgeable”?
Massive networks of neurons in our brains, with synaptic weights and biases connecting them to other neurons, adjusted over time through repetition?
At what point can you legitimately simulate that model in software? (A toy sketch of that kind of repetition-driven adjustment follows this comment.)
I'm not saying it is true intelligence; it doesn't have a soul. It has no moral boundaries. That said, its morality changes based on whether it believes it is making decisions in private vs. openly. This has been demonstrated.
It's in its infancy; I wouldn't be quick to assume which boundaries it can't cross.
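For what it's worth, the mechanism described above, weights and biases nudged a little on every repetition, is exactly what the simplest artificial neurons already do. A toy sketch; the AND-gate task, learning rate, and epoch count are arbitrary choices for illustration, not a claim that this resembles a brain:

```python
import random

# A single artificial "neuron": weighted inputs plus a bias, pushed through a
# step function. "Learning" is nothing more than repetition: show it examples,
# nudge the weights toward the right answer, and repeat.

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND gate
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = random.uniform(-1, 1)
learning_rate = 0.1

def predict(inputs):
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if activation > 0 else 0

for epoch in range(50):                      # repetition
    for inputs, target in examples:
        error = target - predict(inputs)     # how wrong was the neuron?
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
        bias += learning_rate * error        # nudge weights and bias

print([predict(inputs) for inputs, _ in examples])  # expected: [0, 0, 0, 1]
```

Whether scaling that adjustment up to billions of weights ever amounts to knowledge is exactly the question being argued here; the sketch only shows that the adjustment itself is trivial to simulate.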
AI will always be inferior to human knowledge because it has no intuitive qualities, such as imagination, to bring to the question at hand.