I’ve learned to skip the AI summaries search engines provide because they are so routinely inaccurate and unhelpful. I wonder what’s to become of humanity if people actually take such garbage output seriously. What is the point of using it to provide even a first draft if you have to spend time checking every line of it and fixing half of it?
But it also comes down to how the prompts are composed. As they say, “Garbage In, Garbage Out.”
Absolutely.
so routinely inaccurate
Being routinely wrong on analysis and opinions is one thing. But why on objective facts? For example, they routinely place a city in the wrong county... a county a hundred miles away in the same state, and sometimes a county that only exists in an adjacent state. As a non-AI IT person, that tells me their reference tables are simply wrong. Almost certainly they loaded the USPS database and other “official” sources and made a major coding error in the load process.
How else can it be explained? In the articles I’ve read on AI, its proponents seem totally focused on subjective truth and appear to regard objective truth as unimportant.