Free Republic

Singularities and Nightmares
KurzweilAI.net ^ | 3/30/2006 | David Brin

Posted on 03/30/2006 4:52:09 AM PST by Neville72

Options for a coming singularity include self-destruction of civilization, a positive singularity, a negative singularity (machines take over), and retreat into tradition. Our urgent goal: find (and avoid) failure modes, using anticipation (thought experiments) and resiliency -- establishing robust systems that can deal with almost any problem as it arises.

In order to give you pleasant dreams tonight, let me offer a few possibilities about the days that lie ahead—changes that may occur within the next twenty or so years, roughly a single human generation. Possibilities that are taken seriously by some of today's best minds. Potential transformations of human life on Earth and, perhaps, even what it means to be human.

For example, what if biologists and organic chemists manage to do to their laboratories the same thing that cyberneticists did to computers? Shrinking their vast biochemical labs from building-sized behemoths down to units that are utterly compact, making them smaller, cheaper, and more powerful than anyone imagined. Isn't that what happened to those gigantic computers of yesteryear? Until, today, your pocket cell phone contains as much processing power and sophistication as NASA owned during the moon shots. People who foresaw this change were able to ride this technological wave. Some of them made a lot of money.

(Excerpt) Read more at kurzweilai.net ...


TOPICS: Culture/Society
KEYWORDS: ai; computer; cyborg; evolution; evolutionary; exponentialgrowth; future; futurist; genetics; gnr; humanity; intelligence; knowledge; kurzweil; longevity; luddite; machine; mind; nanotechnology; nonbiological; raykurzweil; robot; robotics; science; singularity; singularityisnear; spike; technology; thesingularityisnear; transhuman; transhumanism; trend; virtualreality

1 posted on 03/30/2006 4:52:10 AM PST by Neville72

To: AntiGuv

Ping!


2 posted on 03/30/2006 5:01:21 AM PST by Momaw Nadon ("...with the ultimate goal of ending tyranny in our world.")

To: Neville72
"For example, what if biologists and organic chemists manage to do to their laboratories the same thing that cyberneticists did to computers? Shrinking their vast biochemical labs from building-sized behemoths down to units that are utterly compact, making them smaller, cheaper, and more powerful than anyone imagined."

Already happening.

3 posted on 03/30/2006 6:25:31 AM PST by Wonder Warthog (The Hog of Steel)

To: Wonder Warthog

A long, rambling article about an important idea; here are the guts of it:

The options before us appear to fall into four broad categories:


1. Self-destruction. Immolation or desolation or mass-death. Or ecological suicide. Or social collapse. Name your favorite poison. Followed by a long era when our few successors (if any) look back upon us with envy. For a wonderfully depressing and informative look at this option, see Jared Diamond's Collapse: How Societies Choose to Fail or Succeed. (Note that Diamond restricts himself to ecological disasters that resonate with civilization-failures of the past; thus he only touches on the range of possible catastrophe modes.) We are used to imagining self-destruction happening as a result of mistakes by ruling elites. But in this article we have explored how it also could happen if society enters an age of universal democratization of the means of destruction—or, as Thomas Friedman puts it, "the super-empowerment of the angry young man"—without accompanying advances in social maturity and general wisdom.


2. Achieve some form of 'Positive Singularity'—or at least a phase shift to a higher and more knowledgeable society (one that may have problems of its own that we can't imagine). Positive singularities would, in general, offer normal human beings every opportunity to participate in spectacular advances, experiencing voluntary, dramatic self-improvement, without anything being compulsory… or too much of a betrayal to the core values of decency we share.


3. Then there is the 'Negative Singularity'—a version of self-destruction in which a skyrocket of technological progress does occur, but in ways that members of our generation would find unpalatable. Specific scenarios that fall into this category might include being abused by new, super-intelligent successors (as in Terminator or The Matrix), or simply being "left behind" by super entities that pat us on the head and move on to great things that we can never understand. Even the softest and most benign version of such a 'Negative Singularity' is perceived as loathsome by some perceptive renunciators, like Bill Joy, who take a dour view of the prospect that humans may become a less-than-pinnacle form of life on Planet Earth.


4. Finally, there is the ultimate outcome that is implicit in every renunciation scenario: Retreat into some more traditional form of human society, like those that maintained static sameness under pyramidal hierarchies of control for at least four millennia. One that quashes the technologies that might lead to results 1 or 2 or 3. With four thousand years of experience at this process, hyper-conservative hierarchies could probably manage this agreeable task, if we give them the power. That is, they could do it for a while.



When the various paths are laid out in this way, it seems to be a daunting future that we face. Perhaps an era when all of human destiny will be decided. Certainly not one that's devoid of "history." For a somewhat similar, though more detailed, examination of these paths, the reader might pick up Joel Garreau's fine book, Radical Evolution. It takes a good look at two extreme scenarios for the future—"Heaven" and "Hell"—then posits a third—"Prevail"—as the one that rings most true.


So, which of these outcomes seem plausible?


4 posted on 03/30/2006 8:09:03 AM PST by Jack Black

To: Physicist
self-ping
5 posted on 03/30/2006 10:06:51 AM PST by Physicist

To: Jack Black

The only scenario that I don't find plausible is #4. Repressive governments may be able to slow technological advances, but they can't stop them.


6 posted on 03/30/2006 10:11:49 AM PST by ThinkDifferent (Chloe rocks)

To: ThinkDifferent
Agreed. Even the Christian semi-theocracy of the Dark Ages was only able to persecute people for a few hundred years.

A Singularity event is still the most likely. I'd lean towards a negative scenario, though. The hyper-capable leaving behind those too hidebound to advance. A conflict then ensuing. It hits all the classical societal stress points for just about every metric you care to imagine.

Extropians and transhumanists have been knocking around ideas for failsafe tech limits, strategies for paradigm shifts, ways of dealing with religious zealotry and objectionism, etc.

Fascinating topic...

7 posted on 03/30/2006 10:22:41 AM PST by Dead Corpse (I believe that all government is evil, and that trying to improve it is largely a waste of time.)

To: Dead Corpse
The hyper-capable leaving behind those too hidebound to advance.

I see this, too.

A conflict then ensuing.

Why? I think the extropians will just leave the planet, probably with technology that humans can't even understand. They may stick around to maintain the earth as a human zoo; if so, I expect they would take their zoo-keeper role as seriously as most other zoo-keepers do.

Alternately they might just disengage.

8 posted on 03/30/2006 10:34:14 AM PST by Jack Black

To: Jack Black
More than likely, that is exactly what will happen: an enhanced-human Diaspora. Of course, this leaves the rest of wretched humanity wailing that it "isn't fair". Envy and bigotry are huge social forces, more than enough to start a dust-up. Anything that is a perceived threat would get the "it's for the children" treatment.

Advances I'm seeing coming to fruition fairly soon are computer/net interfaces, immune system revamp, re-engineering of entire gene sequences in fully developed biological systems. Any one of these gives a huge advantage over those who, for one reason or another, don't want to be "improved".

Frankly, the idea of extended life span, increased damage resistance/repair, and a direct neural feed to computing systems would be my "buy-in" criteria. Computer-enhanced eidetic memory alone would almost be worth it.

9 posted on 03/30/2006 10:47:59 AM PST by Dead Corpse (I believe that all government is evil, and that trying to improve it is largely a waste of time.)

To: Wonder Warthog
Already happening.

Lab-on-a-chip technology is commonplace.

10 posted on 03/30/2006 10:49:42 AM PST by TC Rider (The United States Constitution © 1791. All Rights Reserved.)

To: Dead Corpse
A Singularity event is still the most likely. I'd lean towards a negative scenario, though. The hyper-capable leaving behind those too hidebound to advance.

Well, I wouldn't consider that too negative, but then I know which group I'd plan to be in :)

A conflict then ensuing.

I don't think that's inevitable. The "transhumans" would likely be so wealthy and powerful that they could easily afford to let the "luddites" live in peace, perhaps even assisting them. I do see large potential for conflict in the process of getting to that point, for example from radical egalitarians determined that if they're not going to be enhanced, nobody else should be. (In a way, Islamic terrorism is a form of this.)

Fascinating topic...

Indeed.

11 posted on 03/30/2006 10:56:22 AM PST by ThinkDifferent (Chloe rocks)

To: Dead Corpse

"Advances I'm seeing coming to fruition fairly soon are computer/net interfaces, immune system revamp, re-engineering of entire gene sequences in fully developed biological systems. Any one of these gives a huge advantage over those who, for one reason or another, don't want to be "improved"."


I hear a lot of big talk on this site and others, from those who claim they wouldn't want to have their lifetimes radically extended even if it were in a healthy state. I find that utterly ludicrous and don't believe it for a minute.

In essence, what we're talking about is that long-sought-after "fountain of youth." How many could resist? Very, very few, I'd bet.


12 posted on 03/30/2006 11:07:18 AM PST by Neville72 (uist)

To: Neville72

My wife is one. Philosophical reasons. If it does come down to that though, I'll miss her. Terribly.


13 posted on 03/30/2006 11:11:21 AM PST by Dead Corpse (I believe that all government is evil, and that trying to improve it is largely a waste of time.)

To: Neville72
I hear a lot of big talk on this site and others, from those who claim they wouldn't want to have their lifetimes radically extended even if it were in a healthy state. I find that utterly ludicrous and don't believe it for a minute.

Me neither. To be fair, lots of people hear "live for 500 years" and think that means 400 years in a wheelchair and on dialysis, and I can understand not looking forward to that. Once people understand the full implications of effectively curing aging, I expect to see very few holdouts.

14 posted on 03/30/2006 11:30:55 AM PST by ThinkDifferent (Chloe rocks)

To: Dead Corpse
My wife is one. Philosophical reasons. If it does come down to that though, I'll miss her. Terribly.

That sucks. Perhaps she'll change her mind if it becomes more than a theoretical issue.

15 posted on 03/30/2006 11:32:19 AM PST by ThinkDifferent (Chloe rocks)

To: ThinkDifferent
Tell me about it. She is literally my best friend. As for changing her mind, I'm not saying it would be impossible to do. This is just where she is now.

I like the idea of living with her for a few hundred years, as odd as that may sound in this day and age, when any given marriage has a 50% chance of success.

16 posted on 03/30/2006 11:34:34 AM PST by Dead Corpse (I believe that all government is evil, and that trying to improve it is largely a waste of time.)

To: Neville72
I hear a lot of big talk on this site and others, from those who claim they wouldn't want to have their lifetimes radically extended even if it were in a healthy state. I find that utterly ludicrous and don't believe it for a minute.

Not me. I'll take ten thousand years, please. There's always something new out there to see and do. By the time you got done with "everything," a whole bunch of new things would have come around.

17 posted on 03/30/2006 11:43:55 AM PST by RogueIsland (.)

To: Jack Black
Human nature has not changed in thousands of years. Man is subject to original sin, meaning that there is no technological salvation. We are more affluent today than ever before, and becoming more spiritually impoverished by the minute. That means a slow, downward spiral is the most likely scenario: increasing violence and anarchy on the fringes moving slowly to the center of our civilization, and increasing infringement and attack by barbarian cultures. With the development of bio- and nano-superweapons over the next twenty years, I can easily see a more rapid descent into "immolation or desolation or mass-death".
18 posted on 03/30/2006 11:46:05 AM PST by ZeitgeistSurfer (Visit the Iran Crater in 2008)

To: ThinkDifferent
The "transhumans" would likely be so wealthy and powerful that they could easily afford to let the "luddites" live in peace, perhaps even assisting them.

It's like the world's biggest game of "Risk". George Soros and his ilk are playing it today, in fact, sans the extended lifespans. ;)

19 posted on 03/30/2006 11:49:59 AM PST by Mr. Jeeves ("When the government is invasive, the people are wanting." -- Tao Te Ching)

To: Neville72

There may be some who resist, but I foresee larger numbers who would like to join the transhumans but can't due to cost. This is where I see the conflict arising: "haves," who can afford the enhancements, versus "have-nots," who can't. It's difficult to envision a scenario under which this would be offered to all comers gratis. I suspect that most of us who are looking forward to this future will find ourselves on the outside looking in as the billionaires move on to the next level--luddites by circumstance, not choice.


20 posted on 03/30/2006 12:10:12 PM PST by kms61



