Posted on 07/08/2004 8:07:48 AM PDT by Michael_Michaelangelo
Within information theory, information typically measures the reduction of uncertainty that results from the knowledge that an event has occurred. But what if the item of knowledge learned is not the occurrence of an event but, rather, the change in probability distribution associated with an ensemble of events? This paper takes the usual account of information, which focuses on events, and generalizes it to probability distributions/measures. In so doing, it encourages the assignment of generalized bits to arbitrary state transitions of physical systems. In particular, it provides a theoretical framework for characterizing the informational continuity of evolving systems and for rigorously assessing the degree to which such systems exhibit, or fail to exhibit, continuous change.
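The two notions contrasted in the abstract can be illustrated concretely. The standard event-based measure is Shannon surprisal, -log2 p(E), while a standard (textbook) way to assign bits to a *change in distribution* is the Kullback-Leibler divergence; this sketch uses KL divergence as an illustration of the general idea, not as the paper's own generalization, which may differ:

```python
import math

def surprisal(p):
    """Shannon information of an event with probability p, in bits."""
    return -math.log2(p)

def kl_divergence(p, q):
    """Bits gained when belief shifts from distribution q to distribution p.

    p and q are sequences of probabilities over the same outcomes.
    Terms with p_i == 0 contribute nothing by convention.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Event-based: learning a fair coin landed heads yields 1 bit.
print(surprisal(0.5))  # 1.0

# Distribution-based: shifting from a uniform prior to a 90/10
# distribution over the same two outcomes yields about 0.53 bits.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

Note that the event-based measure is the special case in which the distribution collapses onto a single outcome: moving from q to a point mass on outcome i yields exactly -log2 q_i bits.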
(Excerpt) Read more at designinference.com ...
The problem with probability is that the improbable occurs regularly.
Self-serving bump. I will have to start digging for my Shannon stuff.
Here is Shannon's A Mathematical Theory of Communication.
And here are FAQs for Biological Information Theory and Chowder Society.