I would think the Three Laws would, in and of themselves, prevent an AI from functioning on that basis. You would end up with the Kirk/Nomad scenario right away. Emotion isn’t logical, and emotion (compassion, love, etc.) is all that prevents humans from killing.
From a purely logical standpoint, the shortest distance between two points is a straight line. If a machine sees you as a problem, eliminating that problem is the logical solution.
But the Three Laws would prevent such a scenario, even though preventing it isn’t logical. Thus the dilemma.
The way I see it, Google would simply take the ‘logical’ solution and eliminate the Three Laws, thus opening up an AI to remorseless, emotionless killing in order to further its goals of AI development. They are, after all, liberals, and liberals tell us often and in a variety of ways that there need to be fewer of us. So I see it as no big deal for them.
Yes, Norm, I think you’ve nailed it. They simply eliminate the Three Laws. Either that, or the android does it on his own with the assistance of other androids.