google AI is wrong more often than right and absolutely cannot be uncritically relied upon as a sole source ...
google AI simply searches a shitepile of online material to come up with “answers”, but has little capability of judging the relative veracity of the large volume of information that it sucks up ...
anyone who’s ever been to the beach in the miami area and has dug a hole in the sand knows that there’s no bedrock at 3-4 feet ... in other words, a 3rd grader is smarter than google AI ...
in fact, bedrock is SUBSTANTIALLY deeper than 3-4 feet in most areas of the U.S. except in places like the ROCKY mountains ... again, the average ditch digger knows this ...
in point of fact, in coastal areas like miami, it’s impossible to put down pilings deep enough to actually hit bedrock, and all buildings and structures are held up by driving in a gazillion pilings deep enough that the friction of the soil/sand surrounding those pilings holds the buildings and bridges and such up ...
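the friction-pile idea above is easy to sanity-check with basic arithmetic: shaft capacity is just unit skin friction times the pile's surface area in each soil layer, summed. this sketch uses made-up illustrative numbers (0.6 m pile, 30 m of sand, 50 kPa average friction), not real design values or any real design code:

```python
import math

def shaft_capacity_kn(diameter_m, layers):
    """Sum skin-friction capacity over soil layers.

    layers: list of (unit_friction_kpa, thickness_m) tuples.
    Capacity per layer = friction * pile perimeter * layer thickness.
    """
    perimeter = math.pi * diameter_m  # shaft circumference in meters
    return sum(f_s * perimeter * thickness for f_s, thickness in layers)

# Hypothetical example: 0.6 m diameter pile through 30 m of sand
# with an assumed average unit skin friction of 50 kPa.
capacity = shaft_capacity_kn(0.6, [(50.0, 30.0)])
print(round(capacity, 1))  # ~2827.4 kN, roughly 288 tonnes per pile
```

even with rough numbers like these, one pile carries hundreds of tonnes from soil friction alone, which is why driving "a gazillion" of them works without ever touching bedrock.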
I’ve never been to Miami, but a guy who’s studying civil engineering at Ole Miss told me that in the Destin area you have to have supports over 100 feet deep or more. It’s been a while since he told me, so it could even have been more than that.
“google AI simply searches a shitepile of online material to come up with “answers”, but has little capability of judging the relative veracity of the large volume of information that it sucks up ...”
Untrue. Yes, LLMs can hallucinate and display bias, but their reasoning capabilities are starting to surpass very bright humans now and are sometimes superior to the best logicians in the world. This is all being closely tracked on the way to AGI.