In the early days of computers, there was a proposal that the human race switch from the decimal system to octal. The reasoning was that it takes computers less time and energy to convert between native binary and octal than between native binary and decimal. If you add up the savings for every computer in the world, for all time, the total savings would be astronomical; so the quicker we switched over, the better.
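To make the reasoning concrete, here is a minimal sketch (not from the original anecdote) of why the conversion costs differ: each octal digit is just a 3-bit group read off with a shift and a mask, whereas each decimal digit requires a genuine divide and modulo.

```c
#include <stdio.h>

/* Print an unsigned value in octal: peel off 3 bits at a time. */
static void print_octal(unsigned v)
{
    char buf[12];
    int i = 0;
    do {
        buf[i++] = '0' + (v & 7);  /* low 3 bits = one octal digit */
        v >>= 3;                   /* just a shift, no division    */
    } while (v);
    while (i--) putchar(buf[i]);
    putchar('\n');
}

/* Print the same value in decimal: each digit costs a divide/modulo. */
static void print_decimal(unsigned v)
{
    char buf[12];
    int i = 0;
    do {
        buf[i++] = '0' + (v % 10); /* remainder = one decimal digit */
        v /= 10;                   /* genuine division each time    */
    } while (v);
    while (i--) putchar(buf[i]);
    putchar('\n');
}

int main(void)
{
    print_octal(1000);   /* prints 1750 */
    print_decimal(1000); /* prints 1000 */
    return 0;
}
```

The same hardware value (one thousand) comes out as 1750 in octal and 1000 in decimal; the octal path never divides at all.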
Of course, they didn't foresee that in only a couple of years, octal would be largely replaced by hexadecimal for displaying raw bits.
The early Motorola chips (e.g., the 6800) used octal in their development work, though I've forgotten why they opted for that numbering system.
Octal and hex have coexisted since the dark ages of computing. There were two schools of thought: Digital Equipment Corporation was fond of octal, and IBM was fond of hex.
There were also two character sets. IBM used EBCDIC while most everyone else used ASCII.