We are right on the cusp of the Artificial Intelligence Revolution. Tech giants and smaller companies will be laying hundreds of thousands of people in the near future.
Example: IBM lays off 10,000 workers yet invests 10 billion in OpenAI (makers of ChatGPT).
That’s a crock.
The AI is good at two things:
1) generating code according to known, common, strict rules.
E.g. something based on a mathematical formula.
2) Something for which there is a truckload of known, well-characterized, examples (abstracting a picture).
The problem I foresee is trying to get AI to write general purpose code, or complex modules. Merely an “average” of all kinds of code from GitHub, will either skip important internal business rules, get the syntax wrong, or screw up when looking at various successive versions of software or libraries.
The tech mavens are getting high on their own propaganda supply; they envisage a world without beginner coders.
But — with the pipeline of beginning jobs gone, who will be able to maintain, update, and bugfix the code (which no human wrote, so no human will “remember” what the various variable names and conventions refer to)?
And — this will make quality control, versioning, etc. potentially very dangerous: with a change to the AI engine, or the existing codebase from which the AI draws its examples, even repeating the exact same instructions (...and that’s assuming the curation of spoken phrases to get the code you really want works ok), might generate vastly different code from one version to the next.
And I’d like to see the security in place to allow a wide latitude of phrases, and people writing viruses, trojans, white rabbits, etc. to allow backdoors, hacking, what have you, on their own, and innocuously saying “include the my_checksum library” which they’ve trained the machine means something *else*...
The fantastic ease with which the AI draws pictures, is due to the abstractive ability, not-well-understood analog neural net of the human mind: we can guess what the computer ‘MEANT’. But in writing text, a compiler won’t know what the AI system *meant* to code. Compilers still require strict adherence to syntax: and a kernel that any human can see is “performing a join on tables A and B with conditional where clauses pulled from parameters defined elsewhere”...might not be good enough for the compiler to run with it.