Free Republic
Browse · Search
Smoky Backroom
Topics · Post Article

To: Doctor Stochastic
Thank you for the links!

It's been a while since I read through Shannon's papers, but I will do it again (though probably not tonight).

As I recall, his work in the 1940s was oriented toward communications, specifically the mathematics of communication, which was the "state of the art" at the time. The term "information" was coined for success and "entropy" for failure. The second term (as I recall) was coined with his permission and has led to much confusion because of its use in other contexts.

677 posted on 07/07/2004 9:32:47 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 674 | View Replies ]


To: Alamo-Girl

No, "entropy" was used because the formula for information was the negative of the formula derived by Boltzmann for thermodynamic entropy. Someone (I think Hamming told me that it was von Neumann) suggested jokingly that Shannon should call information "entropy" because no one would understand what he meant.


680 posted on 07/07/2004 9:41:15 PM PDT by Doctor Stochastic (Vegetative = chaotic is the character of the moderns. - Friedrich Schlegel)
[ Post Reply | Private Reply | To 677 | View Replies ]


