Posted on 04/02/2009 7:05:41 PM PDT by GodGunsGuts
What exactly do you mean when you speak of information?
Theorem 1: The fundamental quantity information is a non-material (mental) entity. It is not a property of matter, so that purely material processes are fundamentally precluded as sources of information.
Theorem 2: Information only arises through an intentional, volitional act.
Theorem 3: Information comprises the nonmaterial foundation for all technological systems and for all works of art.
Etc., etc. Read Chapters 1-6!
The definition used in the article you cited is adequate: basically, the information in a system is the number of bits needed to describe it.
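For concreteness (this is my own illustration, not something from the article): in that counting, a system with N equally likely states needs log2(N) bits to describe. A minimal sketch in Python:

import math

def bits_to_describe(states: int) -> float:
    # Bits needed to single out one of `states` equally likely possibilities.
    return math.log2(states)

print(bits_to_describe(2))  # 1.0  (a coin flip is one bit)
print(bits_to_describe(8))  # 3.0  (eight equally likely states take three bits)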
I am asking how you define it.
The article says that information is carried by a material medium, but the information itself is non-material.
Learn what theorem means. Here's a link.
"show me how each successive theorem does not follow from the logic presented therein."
Theorems require proof, not strings of illogical, or unsubstantiated, or unrelated statements.
Let's grab the first pile of text and look at it.
"It should now be clear that information, being a fundamental entity, cannot be a property of matter, and its origin cannot be explained in terms of material processes.
This erroneous statement follows from a string of similar claims. It is neither a conclusion, nor is it axiomatic. It is simply a false assertion.
Now...
"We therefore formulate the following fundamental theorem:
Ridiculous! The word "therefore" is unwarranted, because no logical operations whatsoever were performed.
Let's see...
" Theorem 1: The fundamental quantity information is a non-material (mental) entity. It is not a property of matter, so that purely material processes are fundamentally precluded as sources of information."
The first sentence is a statement attaching a quantitative attribute to the word information; in other words, it is part of the definition of the word information. The second sentence is false. No complete definition of the word information has been given, so the first claim in that sentence ("It is not a property of matter") is indeterminate, and the second part of the sentence is false because it relies on that indeterminate value to arrive at its stated conclusion.
Thanks for the ping!
Have you by any chance been following this series? It touches on Shannon’s theory, and I think it explains why Alex Williams feels that Shannon’s theory is relatively minor to the overall concept of information.
Information gets into tree rings and ice cores by natural means. Information also gets into DNA by natural means.
Argue with Wikipedia, then. See the article at http://en.wikipedia.org/wiki/Quantum_mechanics, which states (in the section on Quantum mechanics and Classical Physics):
“Quantum mechanics provides probabilistic results because the physical universe is itself probabilistic rather than deterministic”.
No, Virginia: at the deepest level at which the universe operates, there are no discrete, small, measurable states... kinda makes it seem like we are all living in a seamless version of “the Matrix,” right?
A pseudoscientific paper authored by an engineer.
Has he submitted this for peer review?
How would you falsify his theorem?
In order for any hypothesis to have any type of scientific merit, it must stand up to peer review.
That is how science works.
Also, abiogenesis has nothing to do with the theory of evolution.
Evolutionary theory deals mainly with how life changed after its origin. Science does try to investigate how life started (e.g., whether or not it happened near a deep-sea vent, which organic molecules came first, etc.), but these considerations are not the central focus of evolutionary theory. Regardless of how life started, afterwards it branched and diversified, and most studies of evolution are focused on those processes.
http://evolution.berkeley.edu/evolibrary/misconceptions_faq.php#a1
Symbolic code is used to store and transmit information. In your model, what symbols are used? And how are those symbols generated?
The trail-blazing discoveries about the nature of energy in the 19th century caused the first technological revolution, when manual labor was replaced on a large scale by technological appliances (machines which could convert energy). In the same way, knowledge concerning the nature of information in our time initiated the second technological revolution, where mental labor is saved through the use of technological appliances, namely data processing machines. The concept of information is not only of prime importance for informatics theories and communication techniques, but it is a fundamental quantity in such wide-ranging sciences as cybernetics, linguistics, biology, history, and theology. Many scientists, therefore, justly regard information as the third fundamental entity alongside matter and energy.
Claude E. Shannon was the first researcher who tried to define information mathematically. The theory based on his findings had the advantages that different methods of communication could be compared and that their performance could be evaluated. In addition, the introduction of the bit as a unit of information made it possible to describe the storage requirements of information quantitatively. The main disadvantage of Shannon's definition of information is that the actual contents and impact of messages were not investigated. Shannon's theory of information, which describes information from a statistical viewpoint only, is discussed fully in the appendix (chapter A1).
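(For reference, and not part of the excerpt itself: the measure being described is the standard Shannon entropy. For a source emitting symbols with probabilities p_1, ..., p_n, the information content is H = -(p_1 log2 p_1 + ... + p_n log2 p_n) bits per symbol; eight equally likely symbols, for example, give log2 8 = 3 bits each, which is why the bit works as a quantitative unit of storage.)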
The true nature of information will be discussed in detail in the following chapters, and statements will be made about information and the laws of nature. After a thorough analysis of the information concept, it will be shown that the fundamental theorems can be applied to all technological and biological systems and also to all communication systems, including such diverse forms as the gyrations of bees and the message of the Bible. There is only one prerequisite, namely, that the information must be in coded form.
Since the concept of information is so complex that it cannot be defined in one statement (see Figure 12), we will proceed as follows: We will formulate various special theorems which will gradually reveal more information about the nature of information, until we eventually arrive at a precise definition (compare chapter 5). Any repetition found in the contents of some theorems (redundancy) is intentional, and the possibility of having various different formulations, according to theorem N8 (paragraph 2.3), is also employed.
We have indicated that Shannon's definition of information encompasses only a very minor aspect of information. Several authors have repeatedly pointed out this defect, as the following quotations show:
Karl Steinbuch, a German information scientist [S11]: "The classical theory of information can be compared to the statement that one kilogram of gold has the same value as one kilogram of sand."
Warren Weaver, an American information scientist [S7]: "Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent . . . as regards information."
Ernst von Weizsäcker [W3]: "The reason for the uselessness of Shannon's theory in the different sciences is frankly that no science can limit itself to its syntactic level."
The essential aspect of each and every piece of information is its mental content, and not the number of letters used. If one disregards the contents, then Jean Cocteau's facetious remark is relevant: "The greatest literary work of art is basically nothing but a scrambled alphabet."
At this stage we want to point out a fundamental fallacy that has already caused many misunderstandings and has led to seriously erroneous conclusions, namely the assumption that information is a material phenomenon. The philosophy of materialism is fundamentally predisposed to relegate information to the material domain, as is apparent from philosophical articles emanating from the former DDR (East Germany) [S8, for example]. Even so, the former East German scientist J. Peil [P2] writes: "Even the biology based on a materialistic philosophy, which discarded all vitalistic and metaphysical components, did not readily accept the reduction of biology to physics. . . . Information is neither a physical nor a chemical principle like energy and matter, even though the latter are required as carriers."
Also, according to a frequently quoted statement by the American mathematician Norbert Wiener (1894-1964), information cannot be a physical entity [W5]: "Information is information, neither matter nor energy. Any materialism which disregards this, will not survive one day."
Werner Strombach, a German information scientist of Dortmund [S12], emphasizes the nonmaterial nature of information by defining it as "an enfolding of order at the level of contemplative cognition."
The German biologist G. Osche [O3] sketches the unsuitability of Shannon's theory from a biological viewpoint, and also emphasizes the nonmaterial nature of information: "While matter and energy are the concerns of physics, the description of biological phenomena typically involves information in a functional capacity. In cybernetics, the general information concept quantitatively expresses the information content of a given set of symbols by employing the probability distribution of all possible permutations of the symbols. But the information content of biological systems (genetic information) is concerned with its value and its functional meaning, and thus with the semantic aspect of information, with its quality."
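To make the point in that last quotation concrete, here is a small sketch of my own (Python; the example sentence is arbitrary): Shannon's measure is computed from the probability distribution of the symbols alone, so a meaningful sentence and a random rearrangement of the same letters receive exactly the same score.

import math
import random
from collections import Counter

def bits_per_symbol(text: str) -> float:
    # Shannon's measure: bits per symbol, from the empirical symbol frequencies.
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in sorted(counts.values()))

sentence = "in the beginning was information"
scrambled = "".join(random.sample(sentence, len(sentence)))  # same letters, meaning destroyed

# The two values are identical, because only the letter frequencies enter the calculation.
print(bits_per_symbol(sentence))
print(bits_per_symbol(scrambled))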
Hans-Joachim Flechtner, a German cyberneticist, referred to the fact that information is of a mental nature, both because of its contents and because of the encoding process. This aspect is, however, frequently underrated [F3]: "When a message is composed, it involves the coding of its mental content, but the message itself is not concerned about whether the contents are important or unimportant, valuable, useful, or meaningless. Only the recipient can evaluate the message after decoding it."
It should now be clear that information, being a fundamental entity, cannot be a property of matter, and its origin cannot be explained in terms of material processes. We therefore formulate the following fundamental theorem:
Theorem 1: The fundamental quantity information is a non-material (mental) entity. It is not a property of matter, so that purely material processes are fundamentally precluded as sources of information.
Please, I mean no disrespect, but I have an observation. You know so much about theorems but seem to have no idea how to correctly spell the word... “theorm, theorems, theorm”: one out of three. I can see the word “that” being spelled “htat” because of a missed keystroke, but to spell the single word that your point refers to wrong not once, but twice, is strange... odd.
Are you informed on the subject you are attempting to instruct others about?
Information “Got” into tree ring cores by a natural, rhythmic, cyclical process (the change of seasons). Random events did not “create” the information.
There is always a non-random process behind the creation of information.
You might be right if you said that the weather is a completely random event, but we know it's not. Therefore the “information” is not created; rather, the tree is just recording the environment, no more than a book “writes” or “creates” the words recorded in its pages.
Moreover, tree rings will “never” be decoded to provide proofs, or poems, or record any information or knowledge not related directly to the growth of the tree. Compare this to a book, which can contain any idea, thought, formula, or whatever. Clearly it was *not* just a process that produced the information content in the book.
Even.
Exactly, natural, and that's the same way information gets into DNA.
Evidence cannot render any statement proved. Conclusions follow from evidence; theorems do not. The use of the word theorem in these instances is fraud.