Thanks for the quotes! Isaac Asimov’s Three Laws of Robotics would have to be hardwired into all thinking machines, and even then I don’t know if we would remain in control.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
or, with the Zeroth Law added (potentially more helpful, or potentially more dangerous):
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm, except where this would conflict with the Zeroth Law.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the Zeroth or First Laws.
3. A robot must protect its own existence as long as such protection does not conflict with the Zeroth, First, or Second Laws.