Free Republic
News/Activism

To: Ichneumon
And they base this conclusion on *what*, exactly?

It's not a conclusion. It's a hypothesis.

And why do you feel that there's something "special" about neurons that would allow them to "enable" thought that an electronic equivalent would not? What about circuits that exactly replicate the biochemical responses of equivalent neurons, signal-wise?

For one, a brain is living tissue; silicon is not, nor will it ever be. Secondly, every single possible state a computer could ever be in can be written on a state transition diagram, and under no circumstances may the program itself alter that diagram. Human beings and other living things, however, are quite capable of changing or ignoring their own internal 'code'.
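
To make the state-transition point concrete, here's a rough sketch in Python (the turnstile states and inputs are just made-up examples, not anything from a real system): a machine whose whole behavior is spelled out in one fixed table, which the program itself never rewrites.

# A finite-state machine whose entire behavior is a fixed lookup table.
TRANSITIONS = {
    # (current state, input) -> next state
    ("locked",   "coin"): "unlocked",
    ("locked",   "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def run(inputs, state="locked"):
    # Every state the machine can ever reach is already written down in
    # TRANSITIONS; nothing in this function alters that table.
    for symbol in inputs:
        state = TRANSITIONS[(state, symbol)]
    return state

print(run(["coin", "push", "push"]))  # prints "locked"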

Frankly, the highest level of intelligence computer software has ever demonstrated is that of a virus. A virus lives only as a parasite to replicate itself. Computer viruses do this. However, even computer viruses are not as capable as nature's version. Nature's viruses can mutate into new forms that are harder to kill or better able to spread to other organisms. Computer viruses don't even do this. When new anti-virus programs are released and new security software is put into place, virus writers have to write new viruses because the old ones stop being effective.

Self-awareness in regular electronic computers is as ridiculous a concept as the Earth being flat. Sophisticated software systems will eventually get good enough to do a convincing imitation of thinking or even of human behavior, but they will never equal human behavior, and they will never think for themselves. No computer software or hardware will ever 'think' outside of its own programming the way people do.

I've written enough lines of code to know how a computer functions. As I've said before, things like Commander Data, The Matrix, or I, Robot aren't going to happen, at least not with solid-state electronic technology.
238 posted on 12/22/2005 11:24:09 AM PST by JamesP81


To: JamesP81
Self-awareness in regular electronic computers is as ridiculous a concept as the Earth being flat. Sophisticated software systems will eventually get good enough to do a convincing imitation of thinking or even of human behavior, but they will never equal human behavior, and they will never think for themselves. No computer software or hardware will ever 'think' outside of its own programming the way people do.

Do people think outside their own programming? Or is the programming so complex that we just can't see the patterns?

Free will vs. determinism is one of the oldest debates in philosophy. Computer technology and the notion of "thinking machines" just give us another metaphor to frame the debate.

So to reverse your question, are computers a primitive, limited analog for the human brain, or is the human brain a vastly more sophisticated computer than we can, at least now, comprehend?

250 posted on 12/22/2005 11:43:00 AM PST by ReignOfError
