Posted on 10/07/2009 8:23:30 AM PDT by Alex Murphy
Technology futurists love to talk about the Singularity as the point in time when technology starts to progress so rapidly that machine intelligence melds with and surpasses human intelligence. It is to futurists what the Rapture is to fundamentalist Christians.
Those who welcome or fear this eventuality are gathering this weekend in New York City for the fourth annual Singularity Summit. Speaking at the summit are some of the better-known tech soothsayers, including author and inventor Ray Kurzweil; Stephen Wolfram, the founder of the novel search engine Wolfram Alpha; and Aubrey de Grey, an expert on anti-aging science. Also giving talks are Australian philosopher David Chalmers, whose ideas inspired the Matrix film series, and PayPal co-founder Peter Thiel, who has donated in the six figures to the Singularity Institute for Artificial Intelligence, the organization putting on the event. Last year, the summit drew 1,000 curious academics and entrepreneurs in San Jose, Calif. (See our story on the 2007 Summit here.)
Michael Vassar, the president of the institute, gives the Singularity just under a 25% chance of happening by 2040 and a 70% chance by 2060. When we do cross that line, Vassar says nothing will be the same. "Humans living in the post-Singularity world will be as powerless as jellyfish are in today's world," he says. His odds don't take into account the chances of the world plunging into rapid technological decline due to a nuclear war or a worldwide collapse into barbarism.
Vassar's six staffers at the Singularity Institute, including Kurzweil, publish papers with titles such as "Uncertain Future Project," "Global Catastrophic Risk Project" and "Economics and Machine Intelligence," and have developed software that supposedly predicts technology's trajectories and generates odds on the occurrence of global catastrophes like nuclear war and global warming.
Singularists fall into optimist and pessimist camps.
(Excerpt) Read more at forbes.com ...
The pessimists, and Vassar is one of them, see threats to humanity from the rise of an unfriendly machine intelligence that will want to enslave humans (think The Matrix) and use our brain matter for endless computation, much as we've used computers in the past 60 years.
Related thread:
Exploring the New God Argument by the Mormon Transhumanist Association
Why enslave us when they could just kill us off and eliminate any threat? But really, this is a silly throwback to the pre-computer age. I think the last few generations of effort at machine intelligence have taught us that open-ended intellectual growth in machines is a fantasy. You don't get more out of them than you put in, in terms of intellectual capacity. They can store huge amounts of data, and you can craft careful algorithms for problem-solving in a narrowly defined context, but open-ended intelligence is still a pipe dream.
Well, this boosts my hopes for becoming a cyborg overlord one day.