Free Republic

To: Axenolith

I probably read too much science fiction, but a true emergent artificial intelligence would be impossible to contain, even off-world.

It would eventually develop its own self-replicating and space-faring technology, and after it finished humans off it would go looking for its next conquest.


13 posted on 07/08/2020 6:14:30 PM PDT by cgbg (Masters don't want slaves talking about masters and slaves.)


To: cgbg

I'm with you on the sci-fi, but I don't think it would necessarily kill us off. My take on a truly sentient AI (as opposed to just a really powerful computer programmed for tyranny) is that its efficiency will drop with self-awareness.

“Where did I come from?” “What is God?” “Do these cooling fins make my posterior look excessively weighted?”

They'd be subject to the same things we put up with, and be good and evil. I think one of the absolutes, something that should be codified now, is that self-aware beings, organic or mechanical, cannot be employed against their will or for immoral purposes. There's also talk of giving ordinary machines built for defense purposes the autonomy to make lethal-force decisions; that's definitely bulls*** too.

I'd go so far as to say that using autonomous lethal-force-deciding machines in war or law enforcement should carry nuclear retaliation and heavy jail time, respectively.

You ever read Newton's Wake (Ken MacLeod) or Candle (John Barnes)? A couple of really good ones I liked a lot that get into that area. Dan Simmons' Hyperion books were excellent too, and Ilium and Olympos.


15 posted on 07/08/2020 7:12:39 PM PDT by Axenolith (WWG1WGA!)
