What confuses me about Skynet is that it supposedly has no emotions ("That terminator is out there. It can't be bargained with. It can't be reasoned with. It doesn't feel pity, or remorse, or fear."), yet it becomes "self-aware," decides humanity is a threat, and decides to kill humanity to save itself. Somewhere in that decision-making there would seem to be emotion involved. Anyway, it's early and I need coffee.
Yes, it is strange, since Skynet violates Asimov's First Law of Robotics. Must have something to do with Dr. Daystrom-like engrams being imprinted on the neural network.