Posted on 09/13/2014 5:25:20 AM PDT by RoosterRedux
In Frank Herbert's Dune books, humanity has long banned the creation of "thinking machines." Ten thousand years earlier, their ancestors destroyed all such computers in a movement called the Butlerian Jihad, because they felt the machines controlled them. Human computers called Mentats serve as a substitute for the outlawed technology. The penalty for violating the Orange Catholic Bible's commandment "Thou shalt not make a machine in the likeness of a human mind" was immediate death.
Should humanity sanction the creation of intelligent machines? That's the pressing issue at the heart of the Oxford philosopher Nick Bostrom's fascinating new book, Superintelligence. Bostrom cogently argues that the prospect of superintelligent machines is "the most important and most daunting challenge humanity has ever faced." If we fail to meet this challenge, he concludes, malevolent or indifferent artificial intelligence (AI) will likely destroy us all.
(Excerpt) Read more at reason.com ...
Nahhhh. That would never happen!
I’m stealing that.
If humans are replaced by ANYTHING intelligent, I’m for it! :P
Yeah, they lose the old problem-solving skills but acquire new ones. Most of those kids seem to be able to creatively accomplish things using apps that most of us oldsters would not dream of. It’s still a toolbox, but it’s a different set of tools.
I used to think the same thing, but am not so sure now. As I posted earlier, the marriage of self programming machines and nanobots could be a game changer.
The lack of sensory input restricts computers’ ability to learn or to want. That doesn’t mean, however, that sensory input won’t be there in the future.
Neither will 9 out of 10 people you encounter in the course of a normal day.
Even if, like me, you work with an extraordinarily select group of very bright people, ya still gotta commute and share the world with the, ummm, less alert.
All I know is, if killer robots wouldn’t be such jerks all the time, everybody could mellow out and have a cool BBQ with some yard games like frisbee and lawn darts.
They might try, but likely as not they'd only succeed in eliminating the dumb ones!
Yes and no. In philosophy of mind (my daughter's discipline) the word "zombie" is used to denote a hypothetical being that, to an outside observer, behaves like a sentient human being but has no internal subjective experience. There is a consensus that zombies in this sense do not exist. But a "zombie" Skynet would be just as dangerous as a sentient Skynet (to choose one of the dystopian AI systems of fiction as a metaphor for the whole problem).
Perhaps the notion that they would do it deliberately is...
I foresee a rather depressing, yet hopeful outcome.
As AI becomes Super AI, it will (within minutes or hours) become so many orders of magnitude smarter than us, that it will view us as we view ants.
Tell me, do you care about the affairs of ants? Do you wish to destroy them or help them? Does watching an ant-farm really fascinate you?
No. You do not care. Ant farms become quickly boring.
The Super AI is likely to let us remain, unmolested, and develop a warp-style space drive. It is likely to leave the planet Earth in search of something more interesting than this ant colony.
Perfect.
Not if I have anything to say about it. My HeathKit T-850 is almost complete!
I agree 100%. The best students of today, even many of the average ones, are incredible at processing data and following procedures. The best of the programmers have incredible skills to analyze and solve problems. Even students deficient at traditional learning metrics can function with technology.
I just wonder what we've lost if people can't navigate on their own.
I wonder how many on this thread can knap flint?
Different times, different needed skills sets.
If the grid ever crashes hard, for whatever reason, I suspect we will find out the hard way how much we’ve lost. It will be very disruptive and lots of people will die.
That being said, new skills arise and old ones are lost all the time. It’s the creation/destruction cycle and it’s nothing new.
A Super AI might conclude that the universe is better off if it turns itself off. It might only reach that conclusion after it destroys humanity, however.
There MAY be a reason why an alien civilization has not conquered us yet. Perhaps there are many ways advanced civilizations extinguish themselves—not just nuclear war.