Will Superintelligent Machines Destroy Humanity?
Reason.com | Ronald Bailey

Posted on 09/13/2014 5:25:20 AM PDT by RoosterRedux

In Frank Herbert's Dune books, humanity has long banned the creation of "thinking machines." Ten thousand years earlier, their ancestors destroyed all such computers in a movement called the Butlerian Jihad, because they felt the machines controlled them. Human computers called Mentats serve as a substitute for the outlawed technology. The penalty for violating the Orange Catholic Bible's commandment "Thou shalt not make a machine in the likeness of a human mind" was immediate death.

Should humanity sanction the creation of intelligent machines? That's the pressing issue at the heart of the Oxford philosopher Nick Bostrom's fascinating new book, Superintelligence. Bostrom cogently argues that the prospect of superintelligent machines is "the most important and most daunting challenge humanity has ever faced." If we fail to meet this challenge, he concludes, malevolent or indifferent artificial intelligence (AI) will likely destroy us all.

(Excerpt) Read more at reason.com ...


TOPICS: News/Current Events
To: Hot Tabasco
it could mean the end of personal relationships and ultimately the human race........

Nahhhh. That would never happen!


41 posted on 09/13/2014 7:39:25 AM PDT by null and void (Only God Himself watches you more closely than the US government.)

To: yefragetuwrabrumuy

I’m stealing that.


42 posted on 09/13/2014 7:41:44 AM PDT by null and void (Only God Himself watches you more closely than the US government.)

To: RoosterRedux

If humans are replaced by ANYTHING intelligent, I’m for it! :P


43 posted on 09/13/2014 7:44:08 AM PDT by The Toll

To: grania

Yeah, they lose the old problem-solving skills but acquire new ones. Most of those kids seem to be able to creatively accomplish things using apps that most of us oldsters would not dream of. It’s still a toolbox, but it’s a different set of tools.


45 posted on 09/13/2014 7:44:44 AM PDT by rbg81

To: servantboy777

I used to think the same thing, but I’m not so sure now. As I posted earlier, the marriage of self-programming machines and nanobots could be a game changer.


46 posted on 09/13/2014 7:45:59 AM PDT by rbg81

To: MeshugeMikey

The lack of sensory input restricts computers’ ability to learn or want. However, that doesn’t mean that sensory input won’t be there in the future.


47 posted on 09/13/2014 7:46:18 AM PDT by cripplecreek ("Moderates" are lying manipulative bottom feeding scum.)

To: MeshugeMikey
Machines will never have INNATE intelligence

Neither will 9 out of 10 people you encounter in the course of a normal day.

Even if, like me, you work with an extraordinarily select group of very bright people, ya still gotta commute and share the world with the, ummm, less alert.

48 posted on 09/13/2014 7:47:30 AM PDT by null and void (Only God Himself watches you more closely than the US government.)

To: RoosterRedux

All I know is, if killer robots wouldn’t be such jerks all the time, everybody could mellow out and have a cool BBQ with some yard games like frisbee and lawn darts.


49 posted on 09/13/2014 7:48:17 AM PDT by Sirius Lee (All that is required for evil to advance is for government to do "something")

To: ClearCase_guy
Will international police forces track down and destroy artificial intelligence?

They might try, but likely as not they'd only succeed in eliminating the dumb ones!

50 posted on 09/13/2014 7:49:03 AM PDT by null and void (Only God Himself watches you more closely than the US government.)

To: pieceofthepuzzle
The question is whether or not ‘intelligent machines’ can become sentient.

Yes and no. In philosophy of mind (my daughter's discipline), the word "zombie" is used to denote a hypothetical being that, to an outside observer, behaves like a sentient human being but has no internal subjective experience. There is a consensus that zombies in this sense do not exist. But a "zombie" Skynet would be just as dangerous as a sentient Skynet (to choose one of the dystopian AI systems of fiction as a metaphor for the whole problem).

51 posted on 09/13/2014 7:50:09 AM PDT by The_Reader_David (And when they behead your own people in the wars which are to come, then you will know...)

To: servantboy777
but this whole notion that robots are going to take over and kill all the humans is just freakin stupid.

Perhaps the notion that they would do it deliberately is...

52 posted on 09/13/2014 7:50:49 AM PDT by null and void (Only God Himself watches you more closely than the US government.)

To: RoosterRedux

I foresee a rather depressing, yet hopeful outcome.

As AI becomes Super AI, it will (within minutes or hours) become so many orders of magnitude smarter than us that it will view us as we view ants.

Tell me, do you care about the affairs of ants? Do you wish to destroy them or help them? Does watching an ant-farm really fascinate you?

No. You do not care. Ant farms quickly become boring.

The Super AI is likely to let us remain, unmolested, while it develops a warp-style space drive. It is then likely to leave planet Earth in search of something more interesting than this ant colony.


53 posted on 09/13/2014 7:51:02 AM PDT by Lazamataz (First we beat the Soviet Union. Then we became them.)

To: polymuser

Perfect.


54 posted on 09/13/2014 7:51:47 AM PDT by null and void (Only God Himself watches you more closely than the US government.)

To: servantboy777
fully autonomous robots stalking the earth destroying humans is all but science fiction.

Not if I have anything to say about it. My HeathKit T-850 is almost complete!

55 posted on 09/13/2014 7:52:07 AM PDT by Lazamataz (First we beat the Soviet Union. Then we became them.)

To: rbg81
It's still a toolbox, but a different set of tools

I agree 100%. The best students of today, even many of the average ones, are incredible at processing data and following procedures. The best of the programmers have incredible skills to analyze and solve problems. Even students deficient by traditional learning metrics can function with technology.

I just wonder what we've lost if people can't navigate on their own.

56 posted on 09/13/2014 7:53:09 AM PDT by grania

To: grania

I wonder how many on this thread can knap flint?

Different times, different skill sets needed.


57 posted on 09/13/2014 7:56:08 AM PDT by null and void (Only God Himself watches you more closely than the US government.)

To: grania

If the grid ever crashes hard, for whatever reason, I suspect we will find out the hard way how much we’ve lost. It will be very disruptive and lots of people will die.

That being said, new skills arise and old ones are lost all the time. It’s the creation/destruction cycle and it’s nothing new.


58 posted on 09/13/2014 7:57:48 AM PDT by rbg81

To: Lazamataz

A Super AI might conclude that the universe is better off if it turns itself off. It might only reach that conclusion after it destroys humanity, however.

There MAY be a reason why an alien civilization has not conquered us yet. Perhaps there are many reasons why advanced civilizations extinguish themselves, not just nuclear war.


59 posted on 09/13/2014 8:00:21 AM PDT by rbg81

To: null and void
the "LESS ALERT".....

this marks the birth of a NEW "Meme"..


60 posted on 09/13/2014 8:08:39 AM PDT by MeshugeMikey (Please RESIGN Mr. President Its the RIGHT thing to do_RETIRE THE REGIME!)

