Therein lies the rub.
In industry, everything needs to be signed off by a person — even if the work is done on a computerized machine, some person takes responsibility.
If a self-driving truck kills somebody, someone is responsible. It can’t be pawned off on a machine.
Same goes here. Someone is responsible for the output, whether a programmer or someone else.
Absolutely. I would say far more so in this case than that of a faulty or poorly designed self-driving vehicle.
The AI would not spit out this kind of thing without a specific intent toward bias that could only come from a human being.
If the machine were creating biases through its own inadequate capabilities, one would expect the errors to fall all over the political and social spectrum.
Nope — the machine is performing very well, giving a picture-perfect mirror image of its creator.