Without knowing the meaning of the newly coined term "bit pair of DNA code," the above isn't very meaningful. On the other hand, "some 3 billion bit pairs" and "a few gigabits" are about the same number.
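For what it's worth, the back-of-the-envelope arithmetic behind "about the same number" can be sketched out (Python here just as a calculator; the 2-bits-per-base-pair figure follows from there being four possible letters):

```python
# Rough arithmetic behind "3 billion bit pairs ~ a few gigabits":
# each DNA base pair selects one of four letters (A, C, G, T),
# which takes 2 bits, so ~3e9 base pairs carry ~6e9 bits.
base_pairs = 3_000_000_000
bits = base_pairs * 2        # 2 bits per base pair
gigabits = bits / 1e9
print(gigabits)              # ~6 gigabits, i.e. "a few"
```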
Doctor Stochastic: On the other hand, "some 3 billion bit pairs" and "a few gigabits" are about the same number.
To the best of my knowledge, application coding has not been done in binary since the board-wiring days (when I started). We were thrilled with machine code and downright giddy over Assembly. Then came an avalanche of higher-level languages; from the field, it seemed as if every Computer Science postdoc was driven to best his predecessors.
In the example you gave earlier, an entire macro is board-wired for a single code instruction to invoke. And the Iota example tortoise gave likewise equates to an entire phrase of Scheme, which is itself a higher-level language sitting above an underlying implementation in Algol, Lisp, or the like.
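To make that point concrete: Iota is a language with a single combinator, yet that one symbol only "works" because it expands into the full S/K combinator machinery underneath. A sketch (Python used purely for illustration; this is not tortoise's actual example, just the standard definition of Iota):

```python
# The classic S and K combinators, written as curried functions.
S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x

# Iota's single combinator: iota(x) = x(S)(K).
# One "instruction" - but note it presupposes S and K.
iota = lambda x: x(S)(K)

# The identity combinator I emerges from iota applied to itself:
I = iota(iota)
print(I(42))        # behaves as identity
```

So a one-symbol Iota program is shorthand for an entire tower of combinator definitions, exactly as a single macro invocation is shorthand for the wiring behind it.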
In the end, the power of the interpreter, compiler, or assembler determines which instructions will be executed on the board, which may already be wired for numerous macros. Gore3000 was doing as I had asked and normalizing to the binary, so we wouldn't be talking apples and oranges.
In other words, I read his statement to say that the Microsoft programmers, using whatever language of choice - usually much lower-level than Lisp, e.g. C variants or Assembly - would use a comparable volume of higher-level language to accomplish what we see in binary in the genetic code.
Newly coined or not, it is apt to the discussion. We were discussing computers, and the base pairs of DNA (the individual molecules that code the A, C, G, T, not the codons, which code in threes) are the smallest unit of code in DNA, just as the yes/no or on/off state of a binary bit is the smallest coding unit in a computer, from which all other code is derived.

What I was pointing out was that, just as computers need a multitude of code to do anything with those 0/1 bits, an organism needs a multitude of code to make anything of the A, C, G, T bits. In computers you need programs to make the coding easier; that is why we are able, with one instruction, to copy millions of locations from one place to another. But that convenience itself takes coding. When someone says all you need is a few instructions to accomplish something, they are usually speaking of instructions in a high-level language, and between those instructions and the actual computer sit numerous lines of code. So what some in a facile way call "a few rules" are not, because before those rules can be implemented you need an interpreter (which is a program) with lots of rules to translate them into the natural language of the computer or the organism - the 0/1s or the A, T, C, G's.
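A minimal sketch of the "one instruction hiding many" point (Python, purely illustrative): the one-line copy and the explicit loop below do the same work, and the one-liner is only short because the interpreter and runtime beneath it supply the loop - and much more - in many lines of lower-level code.

```python
# High level: one statement copies a whole block of data.
src = list(range(1_000_000))
dst = src.copy()             # one "instruction" at the language level

# What that one statement stands for underneath: an explicit
# element-by-element copy, which the interpreter/compiler and
# runtime must themselves implement in lower-level code.
dst2 = [None] * len(src)
for i in range(len(src)):
    dst2[i] = src[i]

print(dst == dst2)           # same result either way
```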
Now that you understand what I was saying (if you did not already), I am sure you will agree with me that this 5-6 rules stuff of Wolfram is very unrealistic.
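To illustrate with a sketch (Python; Wolfram's showcase systems are elementary cellular automata, and I'm using the well-known Rule 30 as a stand-in for "a few simple rules"): the rule itself is only eight table entries, but actually applying it requires the loop below plus the entire interpreter and machine beneath it - the rules alone do nothing.

```python
# Rule 30, an elementary cellular automaton: the "rule" is just
# 8 table entries mapping each 3-cell neighborhood to a new cell.
RULE = 30
table = {(a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}

def step(cells):
    # Applying the rule needs this whole interpreter loop
    # (wrap-around at the edges), never mind Python itself.
    n = len(cells)
    return [table[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

row = [0] * 7 + [1] + [0] * 7    # single live cell in the middle
for _ in range(4):
    print(''.join('#' if c else '.' for c in row))
    row = step(row)
```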