It doesn't just seem that way; that is indeed what they claim.
If I'm not mistaken, W. Dembski came up with this "law of conservation of information". IMHO it doesn't make any sense at all, but that's just my opinion.
Information isn't conserved in the real universe: the information content of a system increases with time. Information creation is a fundamental characteristic of the system.
In fact, the 2nd Law of Thermodynamics mandates a trend towards increased information over time, since entropy is, more or less directly, a measure of information content. If the entropy of a system increases, the amount of information in the system also has to increase, because that is what entropy is measuring.
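To make the "entropy measures information" point concrete, here's a minimal sketch in plain Python (the function name and the example distributions are mine) of Shannon's entropy formula, which is the usual way that claim is cashed out:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply peaked (low-entropy, low-information) distribution...
peaked = [0.97, 0.01, 0.01, 0.01]
# ...versus a uniform (maximum-entropy) distribution over the same 4 outcomes.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # ~0.24 bits
print(shannon_entropy(uniform))  # exactly log2(4) = 2 bits
```

The more spread out (higher-entropy) the distribution, the more bits it takes to describe an outcome drawn from it; that's the sense in which rising entropy means rising information.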
If you look at the universe as a transaction-theoretic system (i.e. with the "copy" law I mentioned earlier), the lower the entropy, the more likely it is that a particular instance of a pattern will be destroyed or altered by the dynamics of the system. An analogy would be a register in a CPU, except that every memory location is potentially a register: when an operation is performed between two patterns, the state of some register is modified, except in this case that register is some memory location which previously contained another pattern. (This is a computational-model variation theoretically similar to, but not the same as, things like LISP machines.) Over many iterations, the distribution of patterns becomes random, i.e. approaches maximum entropy.
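A toy version of that memory-as-registers picture can be simulated. This is a sketch under my own assumptions (patterns as integers mod 16, "combining" as modular addition), not any standard model: start every location holding the same pattern, repeatedly combine two random locations and overwrite one with the result, and watch the entropy of the pattern distribution climb.

```python
import math
import random
from collections import Counter

def entropy_bits(cells):
    """Shannon entropy (bits) of the empirical distribution of patterns."""
    counts = Counter(cells)
    n = len(cells)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
M = 16                  # patterns are values mod M (an arbitrary choice here)
cells = [1] * 64        # low-entropy start: every location holds the same pattern

initial = entropy_bits(cells)   # 0.0 bits: only one pattern exists
for _ in range(5000):
    # An "operation between two patterns": combine location j into location i,
    # destroying whatever pattern location i previously held.
    i, j = random.randrange(len(cells)), random.randrange(len(cells))
    cells[i] = (cells[i] + cells[j]) % M

final = entropy_bits(cells)
print(initial, final)   # entropy rises from 0 toward the log2(16) = 4 bit maximum
```

The specific combining rule doesn't matter much; the point is that generic dynamics which overwrite old pattern instances scramble the distribution toward maximum entropy, exactly as the paragraph above describes.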