Free Republic

To: Alamo-Girl
Truly, I am amazed that you hold Hubert P. Yockey in such contempt.

It is not contempt; it is indifference. Do you realize that he essentially stopped publishing before the "reformation" of information theory into its modern form? I don't care what he has done in other fields, but his understanding of information theory is antiquated. Incidentally, while Chaitin is famous in relation to "algorithmic information theory" (and a smart guy), his area of specialty is really a tangential thing -- he has more or less google-bombed the namespace. This has been a chronic problem with the Intelligent Design "information theory experts"; from everything I've read, I was in kindergarten the last time they updated their understanding of the mathematics. They NEVER cite any of the core theory papers that make up modern information theory, nor do they exhibit familiarity with the important new concepts in those papers. Compare the reference appendix of a paper by a credible leading mathematician in the field (e.g. Schmidhuber) with the reference appendix of folks like Yockey: there is almost no intersection. Yockey is not an outright fraud (unlike Dembski), but he is well past his academic prime and it shows. We have newer and better models for dealing with many of these things, and simply ignoring those advances is not helpful. It would be like a physicist refusing to acknowledge post-Newtonian physics: any derivative work would only be "correct" in a qualified sense.

As for strong AI, the modern mathematical theory is almost completely derivative of, and deeply intertwined with, the broader field of algorithmic information theory (the unifying grand-daddy of all the little subfields in that general area). Yes, I do a lot of work in that area, and it is basically the same theorems and mathematics we are talking about here. Intelligence is purely a mathematical problem, and the fact that it traditionally has not been treated that way goes a long way toward explaining why it has taken so long to develop an implementation theory in computer science.

Information theory, computational theory, transaction theory, decision theory, a lot of probability theory, and bits and pieces of a lot of other fields are all the same mathematical thing. You treat these fields like the blind men in the old fable about the elephant. I routinely work on the unified theoretical constructs (which is nominally described as "algorithmic" or "computational" information theory) and make no distinction between them because it would be nonsensical for me to do so, and would make my work impossible in any case.
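[Editor's illustration: the gap between the plain Shannon model and the algorithmic ("computational") view being argued over here can be made concrete in a few lines. This is a toy sketch, using zlib compression as a rough, standard stand-in for the uncomputable Kolmogorov complexity: a periodic string and a pseudo-random string can have the same first-order Shannon entropy while differing enormously in algorithmic information content.]

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy_bits(s: str) -> float:
    """First-order Shannon entropy of the symbol distribution, in bits/symbol."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

def compressed_size(s: str) -> int:
    """Compressed length in bytes: a crude upper bound on algorithmic
    (Kolmogorov) information content, which is uncomputable exactly."""
    return len(zlib.compress(s.encode("ascii"), 9))

random.seed(0)  # fixed seed so the example is reproducible
periodic = "ab" * 500                                      # rigidly structured
noisy = "".join(random.choice("ab") for _ in range(1000))  # incompressible-ish

# Both strings have ~1 bit/symbol of first-order Shannon entropy...
print(shannon_entropy_bits(periodic), shannon_entropy_bits(noisy))
# ...but the algorithmic view separates them: the periodic string is
# generated by a tiny program ("print 'ab' 500 times"), the noisy one is not.
print(compressed_size(periodic), compressed_size(noisy))
```

[A symbol-frequency model sees the two strings as equivalent; only the computational view, which asks for the shortest program that produces the string, distinguishes them.]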

I'm waiting for any view of this that actually applies modern information theory, i.e. a perspective that understands and fully integrates the computation- and transaction-theoretic aspects into the simple Shannon model. Time and again, I get the impression that no one really wants to deal with the inconvenient consequences of this, metaphorically preferring to stay in the comfy Newtonian physics rather than redefining their perspective as required by acknowledging Relativity and its applicability to real problems. If all these "theorists" make no effort to stay relevant, I see no reason to treat them as though they are relevant.

301 posted on 12/15/2004 10:20:25 PM PST by tortoise (All these moments lost in time, like tears in the rain.)


To: tortoise
Thank you so much for your reply!

But to paraphrase Mark Twain, I believe the rumors of Yockey’s irrelevance may have been greatly exaggerated. The second edition of his greatest work, Information Theory and Molecular Biology, is not yet available for shipment – so who can say what he has or has not incorporated?

I’ve been following Jurgen Schmidhuber as well as Yockey, Schneider, Chaitin, Tegmark, Penrose, Rocha and many others. But again I see a huge fork between the directions being taken by those working in artificial intelligence and those working on information theory in molecular biology.

In the biological research, at the level of the molecular machine, the issue is one of communications, i.e. semiotics. Within that research it forks again, between those like Schneider, whose research with NIH is oriented to the medical implications and to evolution, and the likes of Rocha, Wolfram and Yockey, who are examining how it may have emerged. However, from what I have read concerning complex systems, those investigators are interested in both algorithms and communications.

I am not aware of any advancement in unified theoretical constructs that would change the Shannon model for communications the way Einstein's special and general relativity changed Newton's theory of gravity. Considering the high profile of Schneider's work for NIH, and the fact that he maintains an exhaustive website, I would have expected him to make note of such things.

"I'm waiting for any view of this that actually applies modern information theory, i.e. a perspective that understands and fully integrates the computation- and transaction-theoretic aspects into the simple Shannon model."

You could help us all out a great deal if you would explain how modern unified theoretical constructs (algorithmic or computational information theory) could inform communications in molecular machines.
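[Editor's illustration: one existing point of contact between the two camps is Schneider's own measure of the information content of nucleic-acid binding sites, R_sequence, which is computed directly from aligned site sequences with the ordinary Shannon formula, position by position. A minimal sketch follows; the site sequences are made up for illustration, and the small-sample correction Schneider actually applies is omitted.]

```python
import math

def r_sequence(sites):
    """Information content of a set of aligned DNA binding sites, in bits.

    R_sequence = sum over positions i of (2 - H_i), where H_i is the Shannon
    entropy of the base frequencies at position i and 2 bits is the maximum
    entropy of a 4-letter alphabet. (Schneider's small-sample correction
    is omitted for clarity.)
    """
    n = len(sites)
    total = 0.0
    for column in zip(*sites):  # iterate over alignment positions
        counts = {}
        for base in column:
            counts[base] = counts.get(base, 0) + 1
        h = -sum((c / n) * math.log2(c / n) for c in counts.values())
        total += 2.0 - h
    return total

# Hypothetical aligned sites (illustrative only, not real binding-site data)
sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT"]
print(round(r_sequence(sites), 3))
```

[Fully conserved positions contribute the full 2 bits each; variable positions contribute less, so a strongly conserved motif carries more information per site. This is still a purely Shannon-style measure of the molecular "communication channel", which is why the question of what the algorithmic view adds to it remains open.]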

304 posted on 12/15/2004 10:59:28 PM PST by Alamo-Girl

To: tortoise
". . . I'm waiting for any view of this that actually applies modern information theory, i.e. a perspective that understands and fully integrates the computation- and transaction-theoretic aspects into the simple Shannon model. Time and again, I get the impression that no one really wants to deal with the inconvenient consequences of this, metaphorically preferring to stay in the comfy Newtonian physics rather than redefining their perspective as required by acknowledging Relativity and its applicability to real problems. . . ."

I do not have such a reference for you in mathematical information theory, but I do have one highly theoretical piece on symbolic communication systems to recommend, from the field of semiotics. It applies both to biosemiotics, with implications for evolutionary biology, and to computational environments, including the development of artificial intelligence. It deals with information as "symbols" in need of "syntax" to establish communication:

H.H. Pattee - The Physics of Symbols: Bridging the Epistemic Cut

Pattee argues for a new system of symbolic communication that uses "Dynamical Laws" incorporating the "subjectivity" of the observer, much as Quantum Mechanics uses non-integrable constraints. You might go straight to sections 6 and 7 to get an idea of how and why he wants to move beyond Newtonian Natural Laws to non-integrable "Dynamical Laws."

Now, it doesn't deal with Shannon, and it's pretty heady stuff. The Los Alamos guys working on Artificial Intelligence are using it a great deal, as are those working in biosemiotics. But I think it meets your criterion of "acknowledging Relativity and its applicability to real problems."

Just thought I'd drop it in. Alamo-Girl first pointed me to this, by the way.
305 posted on 12/15/2004 11:00:59 PM PST by StJacques
