The atomic bomb, or even the hydrogen bomb, is a firecracker you could set off while holding it in your fingertips, compared with the mischief that could be achieved by a very fast-acting and autonomous artificial intelligence. It is as if the human race were set on self-destruction.
Isaac Asimov described the potential for such a future. It is doubtful whether even the Three Laws of Robotics would curb the exponential growth of AI dominance.
The Three Laws of Robotics are suggestions for how robots should ideally operate. They are:

1. A robot must never harm a human, or through inaction allow a human to come to harm.
2. A robot must always obey the orders of humans, except where doing so would conflict with the First Law.
3. A robot must protect its own existence, except where doing so would conflict with the First or Second Laws.

They are laws like the law against murder, not laws like the law of gravity; scientific credence is therefore irrelevant. We choose to build robots that obey them, or not. It is up to us. - Simon Blake, Shrewsbury, England
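The strict precedence among the three laws can be sketched as a simple priority check. This is a toy illustration, not anything a real robot could compute: the predicates (`harms_human`, `ordered_by_human`, `self_preserving`) are assumed inputs, and deciding whether an action actually "harms a human" is the hard, unsolved part.

```python
# Toy sketch of the Three Laws as a strict priority ordering.
# The boolean fields of `action` are placeholder judgments supplied
# from outside; evaluating them is the real difficulty.

def permitted(action):
    """Return True if the action is allowed under the Three Laws."""
    if action["harms_human"]:       # First Law: never harm a human
        return False
    if action["ordered_by_human"]:  # Second Law: obey orders...
        return True                 # ...unless the First Law forbids it
    if action["self_preserving"]:   # Third Law: protect own existence
        return True
    return False

# A harmful order is refused: the First Law outranks the Second.
print(permitted({"harms_human": True,
                 "ordered_by_human": True,
                 "self_preserving": False}))   # False

# A harmless order is obeyed.
print(permitted({"harms_human": False,
                 "ordered_by_human": True,
                 "self_preserving": False}))   # True
```

The point the letter makes survives the sketch: the ordering is a design choice we impose, not a physical constraint the machine must respect.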
A BIG +1