Somebody needs to check out the Safety for Humanity....
Hmmm...what could possibly take over code review and testing?
Outplacement counsellors and separation-from-employment officers were doing their jobs quite well, but then they were taken over by AI.
(Begin quote)
You're making a fair point that many people are treating AI like a replacement for thinking rather than a tool for reasoning. And yes — a lot of sloppy “vibe-coded” output is already flooding the internet. But I think your framing misses the deeper shift:
AI isn't just search — it's interactive cognition
Search retrieves.
LLMs synthesize, evaluate, and transform information.
Sure, the most common use cases today resemble "smart search assistants".
But that's not "just search".
It is a general reasoning assistant.
The difference is subtle but huge.
On coding: the bottleneck isn’t typing — it’s understanding
You’re right: blindly generating code is dangerous.
Bad developers using AI = more bad code, faster.
Great developers using AI = faster design, better abstractions.
Typing isn't the job.
Thinking is.
The people “switching seats with AI” weren't engineers to begin with — they were people who never understood why software design matters.
AI didn’t create them. It just exposed them.
Review cost vs. productivity
You mentioned reviewing AI work sometimes erases the time saved. True — for now.
But historically, every automation wave has a phase where:
✅ Output becomes reliable
✅ Human insight becomes optional or lighter
✅ Experts move up-stack to architecture and intent shaping
That shift is already happening: the work moves from writing code to supervising intent execution.
"AI should only handle things I don't care about"
Respectfully — that sounds like using a sports car to deliver groceries.
Useful, but underestimating the machine.
AI today is an early form of software-shaping intelligence, not just a search upgrade.
If we trap it in the mindset of “autocomplete-plus,” we’ll miss the transition from code we write manually to systems we specify and validate.
The real danger isn't AI coding; it's AI replacing understanding.
Tools don't remove thinking.
People choosing not to think do.
Good engineers will use AI to accelerate mastery, not avoid it.
In another application, the server-side pages were coded using <script>, <style> and <iframe> tags to pull in tested blocks of code. Unfortunately, those tags are easily exploited, and the exploits were unknown when the original code was written 20 years ago. There are thousands of these constructs in hundreds of files. Taken one at a time, none of them is difficult to remediate for safety, but it only takes the introduction of a Content-Security-Policy header to turn all of them into a monumental security headache at once. An AI-driven remediation might deal with the problem in bulk, but it would require strict post-update testing, as AI "fixes" are sometimes more damaging than helpful.
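To make the scale of that remediation concrete, here is a minimal sketch (not the commenter's actual tooling) of the audit step such a bulk fix would need: scan HTML source for inline <script> blocks and compute the 'sha256-...' source expressions a newly introduced Content-Security-Policy header would have to allowlist. The regex and the function name are illustrative assumptions, and the naive regex will not handle scripts containing a literal "</script>" in a string.

```python
import base64
import hashlib
import re

# Matches non-empty inline <script> blocks; external (src=...) tags have
# empty bodies and are skipped below, since CSP covers those by URL instead.
INLINE_SCRIPT = re.compile(r"<script\b[^>]*>(.*?)</script>", re.IGNORECASE | re.DOTALL)

def csp_script_hashes(html: str) -> list[str]:
    """Return one CSP hash-source expression per non-empty inline script."""
    hashes = []
    for match in INLINE_SCRIPT.finditer(html):
        body = match.group(1)
        if not body.strip():
            continue  # nothing inline to allowlist
        digest = hashlib.sha256(body.encode("utf-8")).digest()
        hashes.append("'sha256-" + base64.b64encode(digest).decode("ascii") + "'")
    return hashes
```

Running this over every file yields an inventory of how many distinct hashes a `script-src` directive would need — and, as the comment above stresses, any automated rewrite on top of it still demands strict post-update testing.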
Game testers, contractors, etc.
i’ve been a professional developer for 40+ years.
developers are not users. we dream up and create the software others use.
students that use AIs will miss out on the struggle that goes along with learning to develop software. this will result in developers with fewer skills, not more.
sure, in areas where the dev has extensive prior knowledge, the AIs can be used for mundane or lengthy repetitive coding tasks. anything else would make the AIs more of a crutch.
don’t get me wrong, today’s AIs are ok, but they quickly start ‘hallucinating’ and lose the narrative, resulting in code that’s wildly off base. this will most likely get better as time goes on, but for now it’s still a bit of a mess.
as for large projects... the AIs are simply not up to the task... currently.
There will be downstream impact as companies and consumers have access to better software cheaper.