Free Republic
News/Activism

To: yendu bwam
Well, Tortoise - as this is not my specialty, I have trouble following all of the verbiage. If you should be right in your assertion - that the human mind can be expressible by a universal Turing machine - I will be amazed. Again, time will tell who is right.

The ultimate question boils down to whether or not the human mind is running on finite state machinery. If it is, then real AI must be possible in silicon. If it isn't, then it isn't possible on conventional machinery. There is a mathematical way to test "black boxed" machinery for "finite state-ness", which, while not proof in a mathematically rigorous sense, can make a determination one way or the other with extremely high statistical confidence. All such tests applied to the human mind that I am aware of indicate that the human mind is in fact a piece of complex finite state machinery to a high degree of statistical confidence.
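[The post doesn't name the test it has in mind, so the following is an editorial illustration only. One classical criterion is Myhill-Nerode: a system is finite-state exactly when its input histories induce only finitely many distinct future behaviors. A black-box version fingerprints each input prefix by the outputs it produces on a set of probe suffixes, then watches whether the number of distinct fingerprints plateaus as prefixes grow. The `parity` and `majority` toy black boxes below are hypothetical stand-ins, not anything from the thread:]

```python
from itertools import product

def behavior_signature(system, prefix, probes):
    # Fingerprint the "state" reached by `prefix` via the outputs
    # the black box produces on every probe suffix.
    return tuple(system(prefix + p) for p in probes)

def estimate_state_count(system, alphabet, max_len, probes):
    # Count distinct residual behaviors over all prefixes up to max_len.
    # A plateau as max_len grows is evidence of finite-state behavior;
    # steady growth is evidence against it (to the resolution of `probes`).
    signatures = set()
    for n in range(max_len + 1):
        for prefix in product(alphabet, repeat=n):
            signatures.add(behavior_signature(system, prefix, probes))
    return len(signatures)

alphabet = ('0', '1')
# All probe suffixes of length 0..3 (tuples, so prefix + p concatenates).
probes = [p for n in range(4) for p in product(alphabet, repeat=n)]

# Parity of 1s is finite-state (a 2-state machine) ...
parity = lambda s: s.count('1') % 2
# ... while "more 1s than 0s so far" requires unbounded counting.
majority = lambda s: int(s.count('1') > s.count('0'))

print(estimate_state_count(parity, alphabet, 6, probes))    # 2: plateaus at two states
print(estimate_state_count(majority, alphabet, 6, probes))  # more than 2; longer probes expose ever more
```

[This only ever gives statistical rather than rigorous evidence, matching the post's caveat: a deeper counter can always hide beyond the probe depth you tried.]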

Again, while this doesn't "prove" anything in a rigorous sense, few people are willing to bet, against extremely minuscule odds, that the human mind isn't running on finite state machinery at this point. This doesn't speak as to the construction of the machinery, only its mathematical nature. So there is a theoretical chance you could be correct, but I wouldn't bet on it. As you said, time will tell.

523 posted on 06/03/2002 2:12:31 PM PDT by tortoise


To: tortoise
Your discussion of AI is fascinating to the extent that I can follow it, which is not very far. Are you, or rather, is the field assuming that material brain is equivalent to mind? If so, my bet would be that the effort will not succeed in any significant way. As I say, though, I will always hew to the facts and evidence.
528 posted on 06/03/2002 8:27:45 PM PDT by Phaedrus

To: tortoise; yendu bwam
tortoise, I have a question about your nomenclature.

I think we discussed this subject a little bit on another thread a while back, but here you first frame the question as:
...whether or not the human mind is running on finite state machinery... [emphasis mine].

Then you state your belief (based on tests that you have seen) that the human mind is in fact a piece of complex finite state machinery.

In your latter phrasing you have, as a matter of identity, the mind as being the machine, whereas in your former you have the mind as running on top, so to speak, of the machine. Those seem to be two different things. Do you see the mind as the machine, or do you see the human mind as a property of the machine?

In my opinion, Machine and Person are two different classes, and I think it is an error to conflate the two. A machine is an impersonal thing. A person, by definition, is not an impersonal thing, but personal. A machine cannot be a person. Everything that a machine does is always and only the effect of a prior physical cause. If the mind of a human being is nothing but the effect, or the emergent property, of physical forces in a brain, what is it that causes those physical forces to produce different effects/thoughts? If man is nothing but machine, then the "it" MUST always be a prior physical cause.

The logical conclusion of this view can only be that there is no real human personhood. If we are nothing but machine, always just an effect riding along on top of those physical forces, then there is no real free will. Consequently there is no real morality, because moral obligation assumes various attributes of personhood that machines do not possess, including such things as personal volition and the personal nature of the authority that commands the moral action. But every person reading this knows immediately and intuitively that machines do not possess personal volition, and so we would think it idiotic, for example, to PUNISH a machine for doing what it ought not to do.

As we know that machines always operate by coercive physical force, so also we know that machines do not have moral obligation. If your computer keyboard malfunctions and somehow manages to produce some random words on a page in your Microsoft Word document, and those words happen to resemble a sentence that contains a command, you are not going to feel any obligation to obey what the words tell you to do. You know intuitively that machines do not have moral authority. If a universal silicon Turing machine replies to my post here and issues some command to me, I have no moral obligation to obey it.

All things human depend on the notion that we are personal beings and not machines. And that's a stab at explaining why I believe that Persons are not just Machines.

Cordially,

531 posted on 06/04/2002 8:56:33 AM PDT by Diamond



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson