Information is the reduction of uncertainty (Shannon entropy) in the receiver or molecular machine in going from a before state to an after state. It is, however, offset by dispersing heat into the local surroundings, thus paying the thermodynamic tab.
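As a rough numerical illustration (a minimal sketch; the before/after distributions below are hypothetical, not taken from post 205), the information gained is the Shannon entropy of the before state minus that of the after state:

    import math

    def shannon_entropy(probs):
        """Shannon entropy, in bits, of a discrete probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical receiver states: maximal uncertainty over four symbols
    # before, near-certainty about one symbol after.
    before = [0.25, 0.25, 0.25, 0.25]
    after = [0.97, 0.01, 0.01, 0.01]

    # Information gained = H(before) - H(after); per the point above,
    # that reduction is paid for as heat dispersed into the surroundings.
    print(f"{shannon_entropy(before) - shannon_entropy(after):.3f} bits")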
You ask which of the below is the most (1) correlated, (2) ordered, (3) complex, (4) random, or (5) entropied, and under which definition of complexity. The complexity measures from post 205 are: (a) Kolmogorov, (b) Cellular Automata, (c) Self-Organizing Complexity, (d) Functional Complexity, (e) Time Complexity (the algorithmic complexity of a discrete function), (f) Specified Complexity, and (g) Metatransition.
Previously you chose Kolmogorov complexity, and in this reply you choose c). I presume you are choosing c) as the most complex, not the most random, correlated, ordered, or entropied.
Regardless of complexity measure, for most correlated I would choose a); e) has potential as a second, though I do not have the time or inclination to determine a minimum program that arrives at the binary string.
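As an aside, the true minimum program length (Kolmogorov complexity) is uncomputable in general, but compressed size gives a quick upper-bound proxy; a minimal sketch, using hypothetical stand-in strings rather than the actual candidates from post 205:

    import zlib

    def kolmogorov_upper_bound(s: str) -> int:
        """Crude upper bound on Kolmogorov complexity: zlib-compressed length in bytes."""
        return len(zlib.compress(s.encode("ascii"), level=9))

    # Hypothetical stand-ins: a repetitive string typically compresses
    # far below an irregular string of the same length.
    print(kolmogorov_upper_bound("01" * 32))
    print(kolmogorov_upper_bound("1001110100010110011101001011000110100111010001011001110100101100"))

Compression only bounds the true value from above, so it can separate strings coarsely at best.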
For most ordered, I would choose d) because of the encoding/decoding (semiosis) and because the phrase has meaning.
For most random, I would only make a choice if we agreed that physical reality consisted solely of printable characters. In that case, I would choose b).
For most entropied, I would choose a) if we agreed physical reality consisted only of A's. In terms of Shannon entropy, the most uncertain would be b).
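To make that comparison concrete, the empirical per-character Shannon entropy of a string can be computed directly; the two strings below are hypothetical stand-ins (an all-A string has zero entropy per character, while a string of varied printable characters approaches the maximum for its alphabet):

    from collections import Counter
    import math

    def empirical_entropy(s: str) -> float:
        """Shannon entropy, in bits per character, of a string's empirical symbol distribution."""
        n = len(s)
        return sum(-(c / n) * math.log2(c / n) for c in Counter(s).values())

    print(empirical_entropy("A" * 32))            # 0.0 bits/char: no uncertainty
    print(empirical_entropy("kQ7!fZ2pW9#xLm4s"))  # 4.0 bits/char: 16 distinct characters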
Concerning complexity measures, I would choose as follows:
(b) Cellular Automata - b, because of the number of cells and the size of the string
(c) Self-Organizing Complexity - d, because of the emergent properties required to give the phrase its meaning
(d) Functional Complexity - d, because of the encoding/decoding (semiosis)
(e) Time Complexity - b, because of the length of the string and the time to build it
(f) Specified Complexity - d, because of the semiosis
(g) Metatransition - d, because of the semiosis