It was.
The moral of the story is that the acceptable failure probability for software running some of these systems is on the order of ten to the minus ninth power - one chance in a billion, an extreme improbability. A computer that teaches itself can't reach that level of certitude.
Even the smallest glitch would result in industry-killing liability lawsuits the likes of which we’ve never seen.
The risk is not worth the reward.
If I get the paper written, I’m sure I’ll blog about it. The issues you raise are terribly obvious. I can’t think of any reason someone in the field wouldn’t understand them.