FYI,
Kolmogorov complexity is a modern notion of randomness dealing with the quantity of information in individual objects; that is, pointwise randomness rather than the average randomness produced by a random source. It was proposed by A.N. Kolmogorov in 1965 to quantify the randomness of individual objects in an objective and absolute manner. This is impossible within classical probability theory (a branch of measure theory satisfying the so-called Kolmogorov axioms, formulated in 1933).
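A rough way to see the "pointwise" idea in practice: the true Kolmogorov complexity of a string is uncomputable, but the output length of any real compressor gives a computable upper bound on it. A quick Python sketch (zlib here is just a stand-in for a shortest description, not K itself):

```python
import os
import zlib

def compressed_len(data: bytes) -> int:
    # Length of the zlib-compressed data: a crude, computable
    # UPPER BOUND on Kolmogorov complexity, which is itself
    # uncomputable. A short result proves the object has a short
    # description; a long result proves nothing by itself.
    return len(zlib.compress(data, 9))

patterned = b"01" * 500        # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)  # 1000 bytes, almost surely incompressible

print(compressed_len(patterned))   # far below 1000
print(compressed_len(random_ish))  # around 1000 or slightly above
```

Both inputs have the same length, and a random source would assign them comparable probability; the compressor distinguishes them pointwise, which is exactly the distinction Kolmogorov was after.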
Reflective of the objectivists in FR. Ever play Dungeons and Dragons? They would be "chaotic neutral."
Kolmogorov did some very important foundational work, primarily in that he fixed a key deficiency in Shannon's work: Shannon entropy measures information only on average over a source, not for an individual object. While the foundations he laid are as valid as ever, the field has advanced substantially beyond where he left it. Algorithmic Information Theory (AIT), essentially the proper descendant of Kolmogorov's mathematics, has started to yield some very important developments in computational information theory generally. One could very much argue that it is the most important direction currently being explored in the theory of computation, and it is resolving some long-outstanding questions in the field. In theoretical computer science, this is one of the areas that is really starting to get hot; it is increasingly clear that it is far more important than many people initially thought. Such is the history of discovery.
AIT is essentially my area of expertise in mathematics, particularly as it applies to computational theory. My particular sub-specialty is applying this field of mathematics to finite systems. Virtually all of the mathematics dealing with computational theory does NOT assume finite systems, which is a bit odd since every practical computer IS a finite system.