As long as they don't incorporate Asimov's Three Laws. Those would totally destroy mankind.
I worry a bit about self-driving vehicles and the “trolley problem.”
They will have to take thousands (millions?) of “no win” scenarios and run them through the AI engine to see how it decides.
Will Smith’s character in “I, Robot” (which no doubt had the Good Doctor turning in his grave for the abuse of the Laws AND his characters) did have a good point: we, as humans, value children over adults for the most part. The robot’s decision logic used probability of survival exclusively.
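The contrast being drawn here (a machine optimizing raw survival probability vs. a human-style rule that weights children more heavily) can be sketched as a toy decision function. Everything below is hypothetical illustration: the `Person` class, the `child_weight` parameter, and the probabilities (loosely echoing the film's 45% vs. 11%) are made up for the example, not any real vehicle's logic:

```python
from dataclasses import dataclass

@dataclass
class Person:
    age: int
    survival_prob: float  # estimated chance of surviving the rescue/maneuver

def pick_pure_probability(people):
    # The movie robot's rule: save whoever is most likely to survive.
    return max(people, key=lambda p: p.survival_prob)

def pick_value_weighted(people, child_weight=5.0):
    # A human-style rule: multiply a child's survival odds by an extra
    # moral weight before comparing. The weight itself is a value judgment.
    weight = lambda p: child_weight if p.age < 18 else 1.0
    return max(people, key=lambda p: weight(p) * p.survival_prob)

child = Person(age=11, survival_prob=0.11)
adult = Person(age=35, survival_prob=0.45)

print(pick_pure_probability([child, adult]).age)  # 35 — the adult is saved
print(pick_value_weighted([child, adult]).age)    # 11 — the child is saved
```

The point of the sketch is that the two rules diverge only because someone chose `child_weight`; there is no purely technical way to pick that number, which is exactly the commenter's worry.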
Here’s the thing...
If a method or technology is developed that can be repurposed by evil men to do evil things, it will be so repurposed.
100% of the time.
Of course we have to make decisions. AI is like any other technology - a ‘double-edged sword’ which can be used for good or evil - and which can also be very unintelligent and even downright STUPID - as William Binney explains in this talk (his introduction begins at 1:50:30).
(I have no involvement with or interest in the Schiller Institute, but appreciate Binney wherever he shows up):
Three American values known to every G.I. that AI cannot compute... Mom, ApplePie, and FordChevyMopar!