Whether or not it does descends into nitpicking about the strict semantics and definition of "machine", but that is not really relevant here. Moreover, "8-bit machine" does not mean much on its own; it could refer to native dynamic range, minimum dynamic range, addressable memory, or some other parameter -- the term is as ambiguous in industry as it is here. If one makes no assumptions about the specific nature of the machine, only that it is fully general within the limits of its finiteness, such a machine could bootstrap zero-order complexity the size of our universe from a couple hundred bytes' worth of algorithmic information. One can bootstrap an organism from much less, particularly if a suitable control function is built into the universe (it is).
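To make the "few bytes of algorithmic information" claim concrete, here is an illustrative sketch. Python stands in for the abstract machine, and Rule 110 (a one-dimensional cellular automaton known to be Turing-complete) is my choice of example, not anything specific from the discussion: the entire "program" is a one-byte rule table plus a trivial driver, yet the structure it generates grows without bound.

```python
# A one-byte rule table driving a cellular automaton whose long-run
# behavior is Turing-complete (Rule 110). The description is a few
# hundred bytes of source; the generated pattern is unbounded.

def step(cells, rule=110):
    """Apply one update of an elementary cellular automaton (wraparound)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=64, steps=32, rule=110):
    """Grow a pattern from a single live cell; nothing is stored, only generated."""
    row = [0] * width
    row[-1] = 1
    history = [row]
    for _ in range(steps):
        history.append(step(history[-1], rule))
    return history

history = run()
```

The point of the sketch is only the size asymmetry: the algorithmic description stays constant while the generated structure scales with the number of steps, which is the sense in which a couple hundred bytes can "bootstrap" arbitrarily large zero-order complexity.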
The point is that a handful of bytes on a simple machine can bootstrap everything you need, provided you do not make naive assumptions about the control function of the machine. You can get all the "complexity" you want from algorithms of a size that can be brute-forced in real molecular systems in a reasonable amount of time. Whether or not this is how it actually happened is another question, but it is certainly plausible. The universe is under no obligation to make the theoretical mathematics required to understand all this easy or intuitive. Indeed, the theoretical reality is almost the exact opposite of most people's intuition, which is why we are still arguing about it.
I believe that boiled down to a 'yes'.
One follow-up question: what difference would several hundred bytes vs. zero bytes of pre-existing algorithms in the machine make to its ability to bootstrap, per the original question?