Free Republic
Browse · Search
General/Chat
Topics · Post Article

To: jwparkerjr

Oh, wow. There is so much wrong with this that I don’t know where to start - and I’ll only be able to touch on a few points.

For starters - Apple used the 68000 and later the rest of the 68K (short for 68000) family.

Second, the Intel limitation was *640*K, not 64K.

In 1982 terms, the Apple II had, and STILL has, the largest library of software titles ever written for one machine. Getting information on it was stupidly easy: the documentation came with the machine, and the machine itself booted into BASIC from ROM.

IBM’s documentation on the original PC was questionable at best; you had to pay extra for documentation that meant anything.

IBM PC development did not really take off until Windows arrived. Until then, the Apple II was still selling better AND was a better platform to write for.

IBM’s PC was well behind the curve until about 1985-1986, and it didn’t actually pass the Apple II until about 1987. Apple’s missteps with the original Mac (which, to be honest, were mostly Steve Jobs’ fault, but were later extended and compounded by some real stupidity on the part of Apple’s board) let IBM and the compatibles in to take over.

Hm... stupidity on the part of the market leader creating openings for other companies... Where have we heard that recently?

What we’re seeing now is turnabout. IBM managed to catch up, pass Apple with the help of Microsoft, and then fell off the curve. Microsoft has continued, but it looks like they’re now about to fall off the curve and get passed by Apple.


12 posted on 04/12/2008 6:28:23 AM PDT by Spktyr (Overwhelmingly superior firepower and the willingness to use it is the only proven peace solution.)


To: Spktyr

Thanks Spktyr. :’)


62 posted on 04/12/2008 9:52:08 AM PDT by SunkenCiv (https://secure.freerepublic.com/donate/_____________________Profile updated Saturday, March 29, 2008)

To: Spktyr

I think you have a better understanding of those early years than I do! Thank you for the corrections, and especially for making them in such a nice way! Sorry for the misinformation. As I said, that was simply how I remembered it from my job at a ComputerLand. I was wrong about several things. What can I say? Brain fade; it’s my cross to bear. I ended up programming for the AlphaMicro family of multi-user, multi-tasking machines. Boy, is my face red. Part of it’s due to the fact that I spent the entire day outside at Sun ’n Fun shooting pictures for the people who sponsor it, but much of the red is from being so wrong. Thanks again for the interesting information. It’s slowly coming back to me.


155 posted on 04/12/2008 5:26:13 PM PDT by jwparkerjr (Sigh . . .)

To: Spktyr
Hold on just a minute.

Regarding the 64K memory limitation:

Segment:Offset addressing was introduced at a time when the largest register in a CPU was only 16 bits long, which meant it could directly address only 65,536 bytes (64 KB) of memory. But everyone was hungry for a way to run much larger programs! Rather than create a CPU with larger registers (as some CPU manufacturers had done), the designers at Intel kept the 16-bit registers for their new 8086 CPU and added a different way to access more memory: they expanded the instruction set so programs could tell the CPU to combine two 16-bit registers whenever they needed to refer to an absolute memory location beyond 64 KB.

If the designers had allowed the CPU to combine two registers into a 32-bit high/low pair, it could have referenced up to 4 GB of memory in a linear fashion! Keep in mind, however, this was at a time when many never dreamed we’d need a PC with more than 640 KB of memory for user applications and data. So, instead of dealing with whatever problems a 32-bit linear addressing scheme would have produced, they created the Segment:Offset scheme, which lets the CPU effectively address about 1 MB of memory.

167 posted on 04/12/2008 9:08:57 PM PDT by Kegger

To: Spktyr
Second, the Intel limitation was *640*K, not 64K.

Actually, for programmers, it WAS 64K. You had to deal with the way the Intel 808x (and later the 80286 and 80386, in real mode) dealt with memory in 64KB segments. It was a huge pain, and a major hassle for those of us doing intensive searches on data structures larger than 64KB in DBMS programming (in C and assembler).

Mark

199 posted on 04/13/2008 12:24:34 PM PDT by MarkL



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson