To: StJacques
The discrete versus continuous cases make no real difference for information theory. The discrete case uses sums of terms and the continuous case uses integrals; the two can be unified by treating sums as integrals using "distributions" (the Dirac delta function being the relevant example).
A Gaussian supported on the entire real line is no different conceptually from a die roll supported on {1, 2, 3, 4, 5, 6}.
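A quick sketch of that parallel, using nothing beyond the standard definitions: the Shannon entropy of a fair die is a sum over six outcomes, while the differential entropy of a Gaussian is the corresponding integral (approximated here by a Riemann sum; the 8-sigma cutoff and 0.001 step are arbitrary choices for illustration, not anything from the post):

```python
import math

# Discrete case: Shannon entropy of a fair six-sided die, a sum of terms.
p = [1.0 / 6] * 6
H_discrete = -sum(pi * math.log2(pi) for pi in p)  # equals log2(6) bits

# Continuous case: differential entropy of a standard Gaussian, an integral.
# Closed form is 0.5 * log2(2*pi*e); here we approximate the integral
# numerically with a Riemann sum over [-8, 8] (tails beyond are negligible).
sigma = 1.0
dx = 0.001
def pdf(x):
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

H_continuous = -sum(pdf(i * dx) * math.log2(pdf(i * dx)) * dx
                    for i in range(-8000, 8001))

print(H_discrete)    # about 2.585 bits
print(H_continuous)  # about 2.047 bits, matching 0.5 * log2(2*pi*e)
```

The only structural difference between the two computations is the sum versus the integral, which is the point being made above.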
488 posted on
01/05/2005 10:32:40 PM PST by
Doctor Stochastic
(Vegetabilisch = chaotisch is der Charakter der Modernen. - Friedrich Schlegel)
To: Doctor Stochastic
". . . The discrete versus continuous cases makes no real difference for information theory. The discrete case uses sums of terms and the continuous case uses integrals; these can be unified by treating sums as integrals using "distributions" (the Dirac delta function being the relevant example.) . . ."
I'm just going to make a quick comment in response here, because I'll have to go looking for a link to support my statement, and it's very late for me; I'm only coming in to shut down for the night.
I understand that information theory can treat both discrete and continuous cases. But based on the reading I have done (and I'll have to go looking for the sources to back this up), when applied to molecular biology, Shannon Information Theory requires that "Information" be "continuous", either over the range of probabilities or over "probability" itself; I need to check the exact terminology from my sources. I cannot remember exactly how it is stated, but the "continuous" requirement is there.
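If the requirement being recalled is Shannon's continuity axiom (that H(p1, ..., pn) be a continuous function of the probabilities, one of the conditions Shannon used to derive the entropy formula), a tiny numerical check illustrates what it says; the specific probabilities below are arbitrary illustration, not anything from the post:

```python
import math

def H(ps):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Continuity: a small perturbation of the distribution changes H only slightly.
base = [0.5, 0.5]
nudged = [0.5001, 0.4999]
print(H(base))                   # 1.0 bit for a fair coin
print(abs(H(base) - H(nudged)))  # a very small number
```

This is the sense in which the entropy function itself is continuous in the probabilities; it is a separate question from whether the underlying random variable is discrete or continuous.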
I'll try to get back to this tomorrow, Doc. Got to go to bed now.
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson