Free Republic
News/Activism

Who Do You Trust More: G.I. Joe or A.I. Joe?
New York Times ^ | 2/20/05 | George Johnson

Posted on 02/21/2005 1:35:10 PM PST by LibWhacker

In a story by Isaac Asimov, three technocrats are sitting in an underground cavern stuffed with electronics discussing how, with a computer named Multivac, they won the war.

The attack had come from an enemy that seemed inscrutable - not Shiite fundamentalists or Al Qaeda terrorists but beings from the star system Deneb, a shining light in the constellation Cygnus, who had threatened Earth with weapons of mass destruction.

But the earthlings, relying on the help of an artificial, dispassionate intelligence - this sprawling subterranean computer - had ultimately prevailed.

Recent reports that the Pentagon is planning to spend tens of billions of dollars over the next decade to perfect computerized warfare sound like science fiction. In fact, the plan, Future Combat Systems, was first dreamed up years ago. Its designers envisioned a 21st-century fighting force of automated tanks, helicopters and planes, remote missile launchers and even troops of robot soldiers - all coordinated by a self-configuring network of satellites, sensors and supercomputers. A way to get the human out of the loop.

Already schematized into PowerPoint presentations with tangled colored arrows, talking points and an Orwellian lexicon - "soldier systems," "networked lethality," "war fighter machine interfaces" - the rationale for the effort is laid out in an Army promotional video: "Either you create your future or you become the victim of the future someone creates for you."

Ever since the catapult, warfare has been technology's driving force. Computers were first developed to calculate missile trajectories and break enemy codes. But so far it's been only in science fiction that anyone has dared to turn over decision making to machines.

The Pentagon planners - they've probably all seen the movie "Fail-Safe" - emphasize that in their own futuristic vision, people will remain firmly in charge. The promise is that this won't be like Skynet in "Terminator 2," the self-evolving military electronics that develops a rebellious attitude of its own.

But that leads to a conundrum: The whole point of automation is to rise above human fallibility - knee-jerk decisions, misunderstood orders, cowardly retreats. Machines are faster, more focused, impermeable to propaganda and, at least for now, they don't talk back.

As the thinking machinery continues to evolve, the strategists will keep asking themselves the same question: Is there still a good reason to trust ourselves or should we defer to a computer's calculations?

In the Asimov story, "The Machine That Won the War," published in 1961 in The Magazine of Fantasy and Science Fiction, one programmer admits that as the Denebian battles dragged on, he came to distrust the system. The information the politicians and the generals were providing, the input for Multivac, struck him as so self-serving that he felt compelled to make corrections. So he juggled the data, relying on his own intuition.

Another of Multivac's keepers, charged with analyzing the results of its computations, makes his own confession. Because of bad parts and a shortage of good technicians, he suspected the hardware had become unreliable. He gave the numbers his own spin.

It didn't really matter, as the director of the program makes clear in the end. He more than anyone had been unable to put his trust in a computer. Nor did he entirely trust himself. When it came time to make the ultimate decision - attack here or there, advance or retreat - he threw out reason altogether. He flipped a coin.

Multivac would seem pretty lame stacked next to the machinery the Pentagon has put out to bid. But no matter how advanced the system becomes, how do you know when to believe it? Microprocessors and disk drives malfunction and programs can be buggy or catch a virus, but all that is true of human brains. If the computer says something funny, maybe the fault is with you. So who, or what, makes the diagnosis?

The Pentagon's promotional material doesn't mention whether the contract for the Future Combat Systems, said to be the biggest in American military history, includes a line item for philosophers. But they may be best equipped to judge whether computers, despite their faster speeds, greater bandwidth and bigger memories, are inherently different - less trustworthy than the gut feelings and hunches of their keepers.

John Searle, the philosopher at the University of California at Berkeley whose most recent book, "Mind: A Brief Introduction," came out last year, has argued for decades that the brain is not just a computer strung together from neurons. Whatever is happening in the head - and nobody really knows - it is not computation. Confuse reason with calculation, he argues, and disaster lies ahead.

But that has become the minority view. Attend a conference of the Society for Neuroscience or the Cognitive Science Society and you would be hard-pressed to find anyone who doesn't assume deep down that the brain is some kind of information processor. For the time being, people excel at certain tasks, like recognizing faces and making sense of ambiguous data, but that may be only because of wiring details and variations in the algorithms - things that eventually could be simulated electronically. The result would be machinery that can do anything a person can, but faster and better.

If that comes to pass, doubting some future incarnation of Multivac might be an act of mutiny. Yet there would always be a nagging suspicion: The machine will have been designed by the imperfect species called homo sapiens. What if we got something wrong?


TOPICS: News/Current Events; Technical
KEYWORDS: artificial; combat; computers; future; intelligence; neuroscience; pentagon; technology; war; warfare

1 posted on 02/21/2005 1:35:11 PM PST by LibWhacker

To: LibWhacker

"Its designers envisioned a 21st-century fighting force of automated tanks, helicopters and planes, remote missile launchers and even troops of robot soldiers - all coordinated by a self-configuring network of satellites, sensors and supercomputers. A way to get the human out of the loop."

This idea didn't work out too well in the Terminator movies... hmmm.


2 posted on 02/21/2005 1:38:48 PM PST by WOSG (Liberating Iraq - http://freedomstruth.blogspot.com)

To: LibWhacker

I wasn't aware it's an either/or question.


3 posted on 02/21/2005 1:40:26 PM PST by Celtjew Libertarian (Shake Hands with the Serpent: Poetry by Charles Lipsig aka Celtjew http://books.lulu.com/lipsig)

To: LibWhacker

We're all just stupid humans. Why do we even go on living?

Denote sarcasm.


4 posted on 02/21/2005 1:40:33 PM PST by writer33 ("In Defense of Liberty," a political thriller, being released in March)

To: writer33
We're all just stupid humans. Why do we even go on living?

That's what I tell liberals every day. Someday, I'm hoping the message will stick.

5 posted on 02/21/2005 1:42:32 PM PST by tortoise (All these moments lost in time, like tears in the rain.)

To: LibWhacker

BOOO!!! Skynet's coming to get you.

The New York Times in action once again ...


6 posted on 02/21/2005 1:45:22 PM PST by Edward Watson

To: LibWhacker
They should also see the movie The Forbin Project.........
7 posted on 02/21/2005 1:45:58 PM PST by Red Badger (I call her GODZILLARY because she went to NYC and made her nest there, too.........)

To: tortoise

Let's hope it works. :)


8 posted on 02/21/2005 1:47:49 PM PST by writer33 ("In Defense of Liberty," a political thriller, being released in March)

To: writer33

So we can all be STOCKHOLDERS!


9 posted on 02/21/2005 1:52:19 PM PST by longtermmemmory (VOTE!)

To: longtermmemmory

:)


10 posted on 02/21/2005 1:58:43 PM PST by writer33 ("In Defense of Liberty," a political thriller, being released in March)

To: LibWhacker
My dad told me a story about an A.I. computer; I'm not sure if it's the same one. I know it was Asimov, but that means nothing.

Anyway, in the story an A.I. computer is developed, and then scientists hem and haw over who should be first to ask it a question and what that question should be.

Finally it is settled that an old woman gets to ask the question. She asks, "Is there a God?"

After a moment the computer answers, "There is now."

11 posted on 02/21/2005 1:59:58 PM PST by infidel29 (America is GREAT because she is GOOD, the moment she ceases to be GOOD, she ceases to be GREAT- B.F.)

To: LibWhacker

> The machine will have been designed by the imperfect
> species called homo sapiens. What if we got something
> wrong?

And the NYT will obsess over the Frankenstein question
until the day their doors close for good, just as
Hollywood has obsessed over this question (since before
Frank' actually, "Modern Times", "Metropolis", etc.).

One of the reasons for NYT bias is that they spend too
much time at the movies, and not enough time in the real
world.


12 posted on 02/21/2005 2:01:53 PM PST by Boundless

To: LibWhacker
In the movie "WarGames," it is the runaway W.O.P.R. computer that nearly starts WW III when the main character hacks in and tries to play "Global Thermonuclear War."

Runaway military computers taking over the world is a pretty common storyline, apparently.

13 posted on 02/21/2005 2:35:35 PM PST by Yo-Yo

To: LibWhacker

I suspect this is the work of "Queer Eye For The G.I." !!! ;-))


14 posted on 02/21/2005 2:37:29 PM PST by GeekDejure ( LOL = Liberals Obey Lucifer !!!)

To: LibWhacker
Why doesn't the Times just get past the BS and spit out the real reason they don't like this?

If you take humans out of the loop, you also take casualties out of the equation, and the Times fears that, since America is inherently evil, there would be no mitigating factor to prevent the US from going to war at will and winning consistently.

It's part of the same reason the Times opposes SDI: they believe something needs to keep America vulnerable and in check, and casualties and the possible destruction and loss of American lives do that.

It's sad that the Times has been reduced to using movie storylines as proxies in its propaganda war.

15 posted on 02/21/2005 2:56:19 PM PST by Sonny M ("oderint dum metuant")

To: LibWhacker

Already schematized into PowerPoint presentations with tangled colored arrows, talking points and an Orwellian lexicon - "soldier systems," "networked lethality," "war fighter machine interfaces" - the rationale for the effort is laid out in an Army promotional video: "Either you create your future or you become the victim of the future someone creates for you."

It's amazing how many people mistake PowerPoint presentations and gee-whiz video vignettes for real progress. FCS is certainly a grand vision, but it will be decades before we have the technology to make autonomous robots work in the real world. There were DARPA pilot projects on autonomous vehicles in the mid-80s that came to naught. There was also a recent DARPA challenge in the desert (last year), where the best entry didn't even get ten miles. Also, Boeing is the lead "integrator," so it's doomed to failure.


16 posted on 02/21/2005 3:02:07 PM PST by rbg81

To: rbg81; All

Many people do not challenge the fundamental premise of this article: is AI like that probable, or even possible, in the near-term future?

Yes, we already have what are essentially radio-controlled heavy machine guns, but they are of little utility since they can navigate only certain terrain. They have their purpose in a static defense that a human mind would be wasted on.

I would imagine UAVs will become quite common for reconnaissance and perhaps for a very limited strike capacity, but why bother if you have more powerful missiles that can be sighted?

As for robotic soldiers, they do not seem imminent as drones and seem quite improbable as automatons. Read some of my past threads under my profile and you will see what I mean. Also, the reasons cited by the author, insubordination and retreat, are both very rare in the US military.

What some people don't seem to get is that while AI is a rather small community making progress toward better robots, not AI entities, biotechnology is blooming and commands much greater resources. I think biological modifications, temporary or permanent, will become more commonplace. Some examples are high-energy supplements, sleep inhibitors (meaning you don't need sleep), immunity to bio/chem warfare and to some degree nuclear weapons, and strength enhancers. Add the human brain and enhanced brawn to some technological additions, like the exoskeleton that helps soldiers/Marines carry heavy equipment on humps, and then you see a more probable future.

But of course, a more realistic version of tomorrow does not quite fit the nihilism of this liberal author.


17 posted on 02/21/2005 3:17:24 PM PST by jdhighness

To: Sonny M
Its sad, that the Times has been reduced to using movie storylines as proxy in its propaganda war.

Yup, I agree with you and everyone else who has made the point: the article is pretty silly, a lot of hysterical, ignorant liberal propaganda.

It's been pointed out a million times -- we already trust our lives to machines all the time, machines that are way more intelligent than the machines of yesteryear. A modern commercial jet's autopilot comes to mind, a machine that must make dozens of decisions per second that could result in death for a whole lot of people.

Machines are going to continue to get smarter and smarter as the years go by, and they're not suddenly going to jump up and decide to turn on us. Why? Because we're not going to build that capability into them, on purpose.

Lol . . . Hollywood . . . They have the minds of children.

18 posted on 02/21/2005 3:46:25 PM PST by LibWhacker

To: Sonny M

Such systems, even ideally, wouldn't "take casualties out of the equation"...only perhaps AMERICAN casualties. Which might lead to an interesting effect on enemy MORALE, eh?

There would be something pathetic [and perhaps demoralizing?] about realizing you are considered so poor a soldier by the enemy that you don't even rate getting killed by a HUMAN; instead they will just slaughter you and your brothers-in-arms by sending in a platoon of weapon-bearing 'R2D2s', and no matter what you do you CAN'T kill your foe.


19 posted on 02/23/2005 12:43:36 AM PST by FYREDEUS

To: FYREDEUS
There would be something pathetic [and perhaps demoralizing?] about realizing you are considered so poor a soldier by the enemy that you don't even rate getting killed by a HUMAN; instead they will just slaughter you and your brothers-in-arms by sending in a platoon of weapon-bearing 'R2D2s', and no matter what you do you CAN'T kill your foe.

If you're an American liberal, this is a bad thing.

They want American casualties to be the price of any military action, even one they believe in (see anything that doesn't help America).

Demoralizing the other side's troops will only add to them ripping off movie plots or whatever arcane theories they can come up with to be resistant to high-tech soldiers.

Keep in mind, recently I was talking to someone who is liberal, and he was dancing around it, but at one point he did say he thought America would be "cheating" in wars and "rigging it" whenever they fight.

I nicely and politely said that this is not a game, war is not a sport, and lives are not a commodity that always has to be traded for a cause.

20 posted on 02/23/2005 1:22:41 PM PST by Sonny M ("oderint dum metuant")

