Posted on 07/25/2025 8:04:35 PM PDT by SeekAndFind
This article spins the same superficial, overhyped narrative on repeat with its claims that AI can already replace entire teams of content creators and programmers.
AI is incredibly useful as a force multiplier for productivity, but the idea of a solo creator building a “full-stack content pipeline” that outranks Forbes is max hyperbole.
The buzzwords—“vibe coding,” “content machine,” “full-stack,” “agentic systems”—and the lack of technical detail make this article read like an AI-generated marketing pitch or the work of a human serving up thin gruel.
If they are talking about English majors, most of them are liberal.
I've been writing software professionally since 1972, and never once considered myself a "coder". Software engineer, hardware engineer, embedded engineer; but never a "coder". A "coder" is just this side of a keypunch operator, a copy-and-paste artist at best.
Pitiful, derogatory term.
So who's the creative here, them or the program?
I don't know much about AI, but I have a lurking suspicion that it's going to be involved in some huge tragedy. It isn't, after all, actually intelligent.
You can actually run some LLMs on your local machine using something called "quantization." Even though it weakens the model somewhat, in most cases it's still good enough to do most of the things you want to do.
If there's a certain domain expertise you need from the LLM, you can go to Hugging Face and find a quantized LLM suited for it that won't take up a lot of resources. You can also use a locally stored vector database to add any data that isn't already in the LLM to your prompts. A rough sketch of that setup is below.
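For anyone curious, here is a minimal sketch of that setup. It assumes llama-cpp-python and chromadb are installed and that a quantized GGUF model has already been downloaded from Hugging Face; the model path, the example documents, and the question are all placeholders.

```python
# Minimal sketch: run a quantized LLM locally and splice results from a
# local vector store into the prompt. Assumes:
#   pip install llama-cpp-python chromadb
# and a quantized GGUF model already downloaded from Hugging Face.
from llama_cpp import Llama
import chromadb

# Load a quantized model (e.g. a Q4_K_M GGUF) entirely on the local machine.
llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

# Store domain-specific text the base model doesn't know about.
client = chromadb.Client()  # in-memory store; PersistentClient keeps it on disk
docs = client.create_collection("domain_notes")
docs.add(
    ids=["note-1", "note-2"],
    documents=[
        "Our billing API rejects invoices over 90 days old.",
        "Refunds must be approved by a supervisor before posting.",
    ],
)

def ask(question: str) -> str:
    # Pull the most relevant notes and add them to the prompt as context.
    hits = docs.query(query_texts=[question], n_results=2)
    context = "\n".join(hits["documents"][0])
    prompt = (
        "Use the context to answer.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    out = llm(prompt, max_tokens=256, stop=["\n\n"])
    return out["choices"][0]["text"].strip()

print(ask("How old can an invoice be before the billing API rejects it?"))
```

Chroma's in-memory client embeds the documents with its default embedding model, so nothing here needs to leave the machine.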
I would say, have a solid foundation in the basics: HTML, CSS, JavaScript, HTTP/S, SQL, and mastery of one language, be it Java or Python. If you know one language inside and out, it's much easier to deal with any other one, because you can simply tell the AI to do the equivalent in the other language, and do it the right way (see the small example below).
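For what it's worth, a tiny illustration of that point with a made-up task: the same function written as a literal, Java-style translation and then as idiomatic Python. Knowing one language deeply is what lets you judge whether the "equivalent" you get back was actually done the right way.

```python
# Literal, Java-style translation of a loop that collects squares of even numbers:
def squares_of_evens_literal(numbers):
    result = []
    i = 0
    while i < len(numbers):
        if numbers[i] % 2 == 0:
            result.append(numbers[i] * numbers[i])
        i += 1
    return result

# Idiomatic Python equivalent: a list comprehension says the same thing directly.
def squares_of_evens(numbers):
    return [n * n for n in numbers if n % 2 == 0]

assert squares_of_evens_literal([1, 2, 3, 4]) == squares_of_evens([1, 2, 3, 4]) == [4, 16]
```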
One of the best things about using Claude Code, for me, is that I can easily generate documentation for what my code does, and it will even point out potentially better ways to do it. That's why I prefer it over Copilot.
...been coding since '83, as a kid, leading to a career in all sorts of software: embedded systems, web front and back ends, custom database apps, safety-critical systems, graphics rendering engines. That has evolved toward automotive vehicle architecture, where it all comes together.
I agree with many aspects of your post. Things are evolving more rapidly. AI is disrupting. Capabilities are more available to more people, without traditional backgrounds.
That said, they're not 'software engineers'. Similarly, when I hear "today's kids know how all this stuff works"... no, they don't; they know how to USE technology that has been made easy to use for non-technical people, e.g. the smartphone.
There aren't enough software engineers, especially in industries where safety-critical applications are involved... 'software is eating the world'. AI enables better productivity, but the appetite for more productivity will only grow because of it.
Yes, the workforce will shift; it always does. I just don't believe it'll radically reduce the need for developers; it'll just spread out the spectrum of skills needed for a greater range of needs. Even for those who do lower-level software or safety-critical systems, AI can help with productivity, but it cannot replace them (liability). It allows a team to be more productive, and we've got decades of work to do. I still need more people, not fewer.
Things like web applications, especially those just for presentation, marketing, and e-commerce, are easier targets. Complex custom systems (especially safety-critical ones) require complete requirements, which has always been the problem: the customer never provides 100% requirement completeness (nowhere close). This is where real skill comes in, deeply understanding the problem. This is where AI falls short, at least today.
Nobody knows how this will all play out. AI is accelerating because of AI. I'm not sure where we'll be in just 5 years. Maybe I'll be proven wrong. What we can't do is stop educating each generation in how all this stuff actually works: real computer science.
Those aren't "the basics".
Totally agree! It seems more than a few of the "coders" out there actually do not have a clue.
Yes, I'm beginning to think this narrative is being pushed by the AI companies to entice large companies to buy their products. It's also being used as an excuse for layoffs that would otherwise cause people to question how well a company is doing.