I'm truly impressed! I've been reading a book called "Spikes: Exploring the Neural Code" on exactly that: information theory applied to how the nervous system encodes sensory signals. Cool stuff!
By coincidence, I put out a short paper today on the encoding of audio sensory data for use in induction networks, a problem closely analogous to the way sensory data is encoded for the CNS. This was tangential work to my normal area, though.
A lot of people don't realize that mathematical descriptions of learning and intelligence also come out of information theory, with several important developments on that front in the last couple of years; it's among the many things that make the field interesting. Most people aren't aware of the extent to which information theory lets you prove things that are commonly assumed to be unprovable. Its mathematical toolset is immensely powerful, and after you've been immersed in it for a while it really starts to make you see the world in a different way. Definitely good stuff.
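To make the toolset concrete: a minimal toy sketch (my own illustration, not from the book or the paper above) of the kind of calculation neural coding work leans on. It treats a "neuron" as a noisy binary channel from stimulus to response and computes the mutual information I(S;R), i.e. how many bits of stimulus information the response carries. The joint distribution here is made up for the example.

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution p(stimulus, response): the "neuron"
# reports the stimulus correctly 90% of the time.
joint = {
    ("s0", "r0"): 0.45, ("s0", "r1"): 0.05,
    ("s1", "r0"): 0.05, ("s1", "r1"): 0.45,
}

def marginal(joint, index):
    """Sum out the other variable to get a marginal distribution."""
    m = Counter()
    for pair, p in joint.items():
        m[pair[index]] += p
    return m

p_s = marginal(joint, 0)  # p(stimulus)
p_r = marginal(joint, 1)  # p(response)

# I(S;R) = H(S) + H(R) - H(S,R): bits of stimulus
# information recoverable from the response.
mi = entropy(p_s.values()) + entropy(p_r.values()) - entropy(joint.values())
print(round(mi, 3))  # → 0.531
```

Roughly half a bit per symbol survives the 10% noise, and the same three-line identity scales to real spike-train data once you estimate the joint distribution empirically.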