I haven't found this definition used by Shannon anywhere. I did find a reference from Berlinski making the claim. If Berlinski's horrible book on algorithms is an example of his work, his re-definition of information isn't very accurate or useful.
According to Shannon and Weaver, information is defined as a measure of one's freedom of choice when one selects a message. In information theory, information and uncertainty are closely related: information refers to the degree of uncertainty present in a situation.
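For reference, the quantity Shannon defines in his 1948 paper as that measure of choice and uncertainty is the entropy of the source. A minimal sketch in standard notation:

$$H = -\sum_i p_i \log_2 p_i$$

where $p_i$ is the probability that message $i$ is selected. $H$ is largest when every message is equally likely (maximum freedom of choice) and zero when only one message is possible.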
It's better to go to Shannon (or Shannon and Weaver, an easily accessible booklet) to find out what is actually meant rather than rely on a (seemingly unreliable) third party. Shannon's papers are available online.
It's been a while since I read through Shannon's papers, but I will do it again (though probably not tonight).
As I recall, his work in the 1940s was oriented toward communications, the mathematics of communication. That was the "state of the art". The term "information" was coined for success and "entropy" for failure. The second term (as I recall) was coined with his permission and has led to much confusion because of its usage in other contexts.
Actually, information is the reduction of that uncertainty.
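A minimal sketch of that reading (information received = uncertainty before minus uncertainty after), using the entropy formula above; the coin example is mine, not Shannon's:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Before a fair coin is flipped there is 1 bit of uncertainty.
h_before = entropy([0.5, 0.5])   # 1.0 bit
# Once the outcome is reported, no uncertainty remains.
h_after = entropy([1.0])         # 0.0 bits
# The message carries the difference: 1 bit of information.
print(h_before - h_after)        # prints 1.0
```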