Free Republic

To: econjack
In the early days of computers, there was a proposal that the human race switch from the decimal system to octal. The reasoning was that it takes computers less time and energy to convert between native binary and octal than between native binary and decimal. If you add up that savings for every computer in the world, for all time, the total would be astronomical; so the quicker we switched over, the better.
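The arithmetic behind that claim is easy to see in code. Here is a minimal sketch in Python (purely illustrative, not anything from the era's proposal): each octal digit is exactly three bits, so binary-to-octal conversion is nothing but shifting and masking, while every decimal digit costs an integer divide-and-remainder by 10.

def to_octal(n: int) -> str:
    """Peel off three bits at a time -- no division needed."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append(str(n & 0o7))   # low three bits form one octal digit
        n >>= 3
    return "".join(reversed(digits))

def to_decimal(n: int) -> str:
    """Each decimal digit needs a divide-and-remainder by 10."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, r = divmod(n, 10)
        digits.append(str(r))
    return "".join(reversed(digits))

print(to_octal(0b101_110_011))    # '563' -- read straight off the bit groups
print(to_decimal(0b101_110_011))  # '371' -- took three divisions

On machines without a hardware divide instruction, that per-digit division was the genuinely expensive part, which is what the proposal was counting on.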

Of course, they didn't foresee that in only a couple of years, octal would be largely replaced by hexadecimal for displaying raw bits.

15 posted on 10/08/2015 10:31:43 AM PDT by snarkpup ("No matter how paranoid you are, you're not paranoid enough." - Susan Modesky)


To: snarkpup

The early Motorola chips (e.g., the 6800) used octal in their development work, but I forgot why they opted for that numbering system.


20 posted on 10/08/2015 10:37:51 AM PDT by econjack (I'm not bossy...I just know what you should be doing.)

To: snarkpup
Of course, they didn't foresee that in only a couple of years, octal would be largely replaced by hexadecimal for displaying raw bits.

Octal and hex coexisted from the dark ages of computing. There were two schools of thought. Digital Equipment Corporation was fond of octal. IBM was fond of hex.
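Part of the split came down to word size: DEC's 12-, 18-, and 36-bit machines divide evenly into 3-bit octal digits, while IBM's 8-bit bytes divide evenly into 4-bit hex digits. A quick Python illustration of where the digit boundaries fall (nothing machine-specific, just the same 16-bit value both ways):

word = 0b10111010_01011100        # a 16-bit value, two bytes: 0xBA and 0x5C

print(f"hex:   {word:04X}")       # BA5C   -- each byte reads off directly
print(f"octal: {word:06o}")       # 135134 -- octal digits straddle the byte split
print(f"bytes: {word >> 8:02X} {word & 0xFF:02X}")   # BA 5C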

There were also two character sets. IBM used EBCDIC while almost everyone else used ASCII.
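For anyone curious, the difference is easy to see with Python's bundled codecs; "cp037" here is one common EBCDIC code page, and the same text comes out as entirely different bytes:

text = "ABC 123"

ascii_bytes  = text.encode("ascii")
ebcdic_bytes = text.encode("cp037")   # cp037: EBCDIC, US/Canada variant

print(" ".join(f"{b:02x}" for b in ascii_bytes))    # 41 42 43 20 31 32 33
print(" ".join(f"{b:02x}" for b in ebcdic_bytes))   # c1 c2 c3 40 f1 f2 f3

Note that the EBCDIC letters aren't even contiguous (A-I, J-R, and S-Z sit in separate blocks), which made range checks that were trivial in ASCII a little messier.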

80 posted on 10/08/2015 1:48:03 PM PDT by GingisK
