www.arn.org - he posts there all the time. Before you do that, though, you may want to have a look at this:
Information as a Measure of Variation and determine whether or not you have the necessary credentials to understand what he is saying, let alone the maths he uses.
I am well-qualified to evaluate that paper, and the math is easily within my grasp.
I'd never seen that paper before, but it just reinforces my opinion. As with everything else he writes, he tries to equate the vague notion of "meaning" with mathematical information, ignores relevant theorems, and makes some unsupportable assertions. Most of his schtick involves swapping the strict mathematical definition of "information" for the looser pedestrian one so that he can shoehorn it into supporting his conclusions. I doubt he has ever written a strict, rigorous paper on information theory, particularly since he is still attempting to discover something novel in the basics of the field. He is like the cranks who think they have a special insight into rudimentary physics, one that has never occurred to anyone else, that will let them design a perpetual motion machine.
BTW, the paper is actually an essay dressed up as a math paper, in case you couldn't tell. Apparently you don't understand the paper...?
I will add that Dembski is in love with the idea that information theory only deals with "average information", and makes that a major premise of his argument.
Apparently he is entirely unaware that Kolmogorov extended the field to include absolute and pointwise information (i.e., the information of a literal instance, not an "average") circa 1965. Only a world-class ignoramus could claim to be knowledgeable about information theory while premising his entire argument on a conception of the field that was obsolete by the 1960s. Everything he raises has been addressed in the mathematics developed in the 40 years since he stopped paying attention to the field.
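To make the distinction concrete: Shannon entropy is an *average* over a whole distribution, while the surprisal of a single outcome is a *pointwise* quantity. (Kolmogorov's absolute/algorithmic information of an individual string is the deeper counterpart, but it isn't computable, so here's a minimal sketch using surprisal as the simplest pointwise notion; the function names and the biased-coin numbers are mine, just for illustration.)

```python
import math

def surprisal(p: float) -> float:
    """Pointwise information of one outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    """Shannon entropy: the *average* surprisal over the distribution, in bits."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# A biased coin: heads with probability 1/8, tails with probability 7/8.
dist = [1/8, 7/8]

print(surprisal(1/8))  # information in one observation of heads: 3.0 bits
print(entropy(dist))   # average information per flip: about 0.544 bits
```

The point of the example is that the two numbers differ: seeing the rare outcome carries 3 bits, even though the source averages only about half a bit per flip, so "information" is perfectly well defined for individual events, not just for ensembles.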