Free Republic
News/Activism

To: tortoise; yendu bwam
tortoise, I have a question about your nomenclature.

I think we discussed this subject a little bit on another thread a while back, but here you first frame the question as:
...whether or not the human mind is running on finite state machinery... [emphasis mine].

Then you state your belief (based on tests that you have seen) that the human mind is in fact a piece of complex finite state machinery.

In your latter phrasing you have, as a matter of identity, the mind as being the machine, whereas in your former you have the mind as running on top, so to speak, of the machine. Those seem to be two different things. Do you see the mind as the machine, or do you see the human mind as a property of the machine?

In my opinion, Machine and Person are two different classes, and I think it is an error to conflate the two. A machine is an impersonal thing. A person, by definition, is not an impersonal thing, but personal. A machine cannot be a person. Everything that a machine does is always and only an effect of a prior physical cause. If the mind of a human being is nothing but the effect, or the emergent property, of physical forces in a brain, what is it that causes those physical forces to produce different effects/thoughts? If man is nothing but machine, then the "it" MUST always be a prior physical cause.

The logical conclusion of this view can only be that there is no real human personhood. If we are nothing but machine, always just an effect, riding along on top of those physical forces, then there is no real free will. Consequently there is no real morality, because moral obligation assumes various attributes of personhood that machines do not possess, including such things as personal volition, the personal nature of the authority that commands the moral action, and so on.

But every person reading this knows immediately and intuitively that machines do not possess personal volition, and so we would think it idiotic, for example, to PUNISH a machine for doing what it ought not to do. As we know that machines always operate by coercive physical force, so also we know that machines do not have moral obligation. If your computer keyboard malfunctions and somehow manages to produce some random words on a page in your Microsoft Word document, and those words happen to resemble a sentence that contains a command, you are not going to feel any obligation to obey what the words tell you to do. You know intuitively that machines do not have moral authority. If a universal silicon Turing machine replies to my post here and issues some command to me, I have no moral obligation to obey it.

All things human depend on the notion that we are personal beings and not machines. And that's a stab at explaining why I believe that Persons are not just Machines.

Cordially,

531 posted on 06/04/2002 8:56:33 AM PDT by Diamond


To: Diamond; Tortoise
The logical conclusion of this view can only be that there is no real human personhood. If we are nothing but machine, always just an effect, riding along on top of those physical forces, then there is no real free will.

I also believe that it's very, very difficult to explain free will as emanating from an algorithmic machine (at least not without considering the realm of the quantum). One possibility of course is that free will as people experience it is just an illusion. But everything in our minds rebels against that possibility. Another possibility that's been bandied about is that a conflux of self-referential algorithmic loops and such in the brain brings about consciousness and free will. For many reasons, I doubt that possibility as well. I'm not totally sure what Tortoise means by a 'finite state machine,' and admit I need to do some reading up on this. Finally, on the question of free will, it becomes hard to say, in very general terms, that humans are only reacting to their environment in a prescribed way, when a great part of human existence is given over to moral questions and to resisting the temptation to do evil or bad things.

537 posted on 06/04/2002 12:34:20 PM PDT by yendu bwam

To: Diamond
In your latter phrasing you have, as a matter of identity, the mind as being the machine, whereas in your former you have the mind as running on top, so to speak, of the machine. Those seem to be two different things. Do you see the mind as the machine, or do you see the human mind as a property of the machine?

I don't have time to address all the points brought up, but I want to make a couple of quick comments about some important foundational concepts.

First, there is no real mathematical distinction between "hardware" and "software", and no clear break between the two in practice. They are the same thing and interchangeable, i.e., everything that exists as hardware can be implemented in software and vice versa. I often see people treat them as different in these discussions, but that is largely a consequence of the peculiar structure of the computers that most humans are familiar with (i.e. the von Neumann architecture). This isn't a necessity, just a convenient way of working with silicon substrates.
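
To make the hardware/software point concrete, here is a rough Python sketch (mine, purely illustrative; the gate names are just made up for the example): a single hardware primitive, the NAND gate, emulated in software and then composed into other logic. This shows one direction of the interchangeability; the other direction amounts to baking a program into a circuit (an ASIC or FPGA, say).

    # Software emulation of a hardware primitive: the NAND gate (1 = high, 0 = low).
    def nand(a: int, b: int) -> int:
        return 0 if (a and b) else 1

    # Any Boolean function can be built from NAND alone, just as in silicon.
    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

    def xor_(a, b):
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    # Print the truth tables of the composed "circuits".
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", and_(a, b), or_(a, b), xor_(a, b))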

Second, "free will" is an illusion that all self-aware FSMs will have, as a consequence of what could be described as Gödel's Incompleteness Theorem applied to state machinery. For a self-aware finite state machine, being perfectly aware of its own state is mathematically impossible; you can make good approximations as to what you will do next, but you cannot have perfect prior knowledge of what you'll do next until you do it. In other words, what you might view as "free will" could be correctly viewed by a much more powerful outside observer as you following a deterministically predictable trajectory.

What "free will" really means is that you can't know exactly what you'll do next until you actually do it. "Choices" have predeterminable (to some entity, at least) outcomes, but you won't be aware of the outcome until you make the choice. This may not be easy to digest, but it is the logical mathematical outcome of self-awareness on finite state machines. It should be noted that non-FSMs are even worse in this regard, so positing a different underlying reality doesn't help much here. And in the end, the existence or not of "free will" doesn't really matter a whole lot; it might change the excuses, but not the reality.
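
To illustrate the "deterministically predictable trajectory" point, here is a rough Python sketch (mine, illustrative only; the states and function names are invented for the example): a tiny deterministic FSM whose entire future an outside observer can compute in advance from the transition table. The other half of the claim, that the machine cannot have perfect prior knowledge of its own next state, is not something a toy like this can demonstrate; it is the Gödel-style limit described above.

    # (state, input) -> next state: a complete, deterministic transition table.
    TRANSITIONS = {
        ("idle", "tick"): "working",
        ("working", "tick"): "done",
        ("done", "tick"): "idle",
    }

    def step(state: str, symbol: str) -> str:
        # One transition; no choice is involved anywhere.
        return TRANSITIONS[(state, symbol)]

    def outside_observer(start: str, inputs: list) -> list:
        # Holding the table and the inputs, the whole trajectory is computable in advance.
        trajectory, state = [start], start
        for sym in inputs:
            state = step(state, sym)
            trajectory.append(state)
        return trajectory

    # The "prediction" and the machine's actual run are, of course, identical.
    print(outside_observer("idle", ["tick"] * 5))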

538 posted on 06/04/2002 1:27:13 PM PDT by tortoise

To: Diamond
Nice exposition at #531, Diamond.
539 posted on 06/04/2002 1:28:21 PM PDT by Phaedrus
