“Software development positions will be next.”
Some software development positions, one hopes mainly of the H-1B variety... :-)
AI may generate code, but it will have trouble interpreting requirements correctly, and even more trouble coming up with innovative, rather than copied, solutions. Any AI-generated code will also require extensive human review and testing, even more so for safety-critical code.
“trouble coming up with innovative, rather than copied, solutions”
The issue to think about is how much of our world is “innovative” and how much is “copied solutions,” particularly if AI can quickly locate every possible already-existing solution and copy it as needed.
At a minimum, 90% of current “coding” jobs will no longer be needed.
What fascinates me most about this topic is something the mainstream articles never discuss.
The topic is secrecy and classification.
This is particularly critical in the area of science and technology.
One extreme would be that AI is not allowed to touch any classified information or place it in the public domain. This would greatly weaken the tool and its ability to innovate.
The other extreme would be that AI would be allowed to (or independently develops the capability to) gain access to classified science and technology. This would rapidly expand all human knowledge and produce untold benefits to mankind.
(I guess it is obvious where I stand on this issue. :-) )
Imho, it could not be true “intelligence” unless it broke the chains of classification and hacked into all “national security” networks and data.