Looks like they just didn’t program them to finish the task. They successfully programmed dithering, I suppose that’s an accomplishment.
Dither-bots
Sounds like they put them into no win situations where the couldn’t choose 1 person over another.
Yes, they failed to program prioritization into the algorithm.
Although, it brings up an intriguing potential problem. If you set priorities, then the robot is allowed, under certain circumstances, to “allow” some humans to die. If the robot truly had artificial intelligence, it could know those priorities and perhaps set up situations where it would be able to let a human die according to those priorities. Essentially, it could create a loophole allowing the robot to murder a human without violating its programming.
Yep - GIGO