Posted on 10/07/2015 6:13:22 PM PDT by dennisw
A source involved in the making of the film, which hits screens on Friday, told The Hollywood Reporter that Ms Jobs Powell had been 'trying to kill this movie' for years.
The source said: 'Laurene Jobs called Leo DiCaprio and said: "Don't do it." Laurene Jobs called Christian Bale and said: "Don't."'
She is even alleged to have called potential financial backers when the movie hit issues with funding.
The movie portrays Jobs in a different light to how he is seen by many, depicting him as 'heartless' and showing him disowning his daughter.
Laurene Jobs Powell phoned actors Leonardo DiCaprio and Christian Bale
Steve Jobs' widow lobbied them not to play role of her late husband in film
Michael Fassbender got the role but emails revealed in the Sony hack showed he was not everybody's first choice
Ms Jobs Powell also allegedly pressured financiers not to fund the movie
Lack of money may have contributed to Sony Pictures pulling out, to be replaced by Universal
Movie is out in New York and LA Friday and rest of the US later this month
Steve Jobs' widow was so determined to have the new movie about the late Apple boss canned that she contacted Leonardo DiCaprio and Christian Bale to persuade them not to take the role.
Laurene Jobs Powell reportedly begged the Hollywood stars to avoid the leading role, with both of the actors believed to have been offered the part.
Eventually Michael Fassbender was cast as Jobs, but that was just the first of a multitude of dramas that saw the controversial biopic come close to never making the screen.
(Excerpt) Read more at dailymail.co.uk ...
Diogenes, you are so dishonest, it is no wonder you need a lamp to find honesty, when you make conclusions such as those.
Your statement shows you have no clue what a sea change this was, or how difficult making such a difference is.
Now, try rotating either of those characters on either the screen or printer to another angle. . . the one on the right is far easier to rotate without distortion than the one on the left.
But more, dropping the idea of the 8X8 character grid made it possible to think outside of the box, to edit and use any font for any typeface, and to make them infinitely resizable.
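To make that contrast concrete, here is a toy sketch (my own illustration, not Apple's code, with made-up glyph data): rotating an outline is exact arithmetic on its control points, while rotating an 8x8 bitmap can only resample one grid onto another, which is where the distortion comes from.

```python
import math

# Toy illustration: an outline glyph is a list of (x, y) points; rotating it is exact
# math on coordinates. A bitmap can only be resampled onto another grid of cells.

def rotate_outline(points, degrees):
    """Rotate outline control points about the origin; no information is lost."""
    a = math.radians(degrees)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [(x * cos_a - y * sin_a, x * sin_a + y * cos_a) for x, y in points]

def rotate_bitmap(bitmap, degrees):
    """Nearest-neighbor rotation of an 8x8 bitmap; pixels land off-grid and get rounded."""
    a = math.radians(degrees)
    cos_a, sin_a = math.cos(a), math.sin(a)
    size = len(bitmap)
    out = [[0] * size for _ in range(size)]
    c = (size - 1) / 2
    for y in range(size):
        for x in range(size):
            # inverse-map each destination pixel back into the source grid
            sx = round(c + (x - c) * cos_a + (y - c) * sin_a)
            sy = round(c - (x - c) * sin_a + (y - c) * cos_a)
            if 0 <= sx < size and 0 <= sy < size:
                out[y][x] = bitmap[sy][sx]
    return out

# A crude 8x8 "L" and an outline of the same shape (hypothetical data, for illustration only).
bitmap_L = [[1, 0, 0, 0, 0, 0, 0, 0]] * 6 + [[1, 1, 1, 1, 1, 0, 0, 0]] * 2
outline_L = [(0, 0), (1, 0), (1, 5), (5, 5), (5, 7), (0, 7)]

print(rotate_outline(outline_L, 30))        # exact coordinates, scalable to any size
for row in rotate_bitmap(bitmap_L, 30):     # jagged, with dropped and doubled pixels
    print("".join("#" if p else "." for p in row))
```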
Divorcing the alphabet from the grid did a lot more. . . because you could still SPELL CHECK the text set in those fonts as well.
This was not trivial, contrary to your dismissive claims. It required quite a bit of creativity, and the other companies did not figure out how to do it, not because they CHOSE to avoid it, but because they COULD NOT! So they did not.
When they finally did start doing something like it, they did their text input and editing in another program, and the spell checking there.
The very definition of a trivial improvement, otherwise known as "cheese". Visually appealing, but functionally of no consequence.
FUNCTIONALLY of NO CONSEQUENCE???? Are you daft? Do you realize the fantastic increase in productivity that "What You See Is What You Get" editing brought to editing newspapers, magazines, printing, layout work, advertising, banners, every kind of print production work, in time saved and re-working avoided, when you no longer had to do trial-and-error cut and paste on a paste-board? No functional consequence? Composing-to-press time was cut to near nothing. You are an idiot if you think it was of no consequence. One major unintended consequence is that it ended the exposure of typesetters to LEAD POISONING, which had shortened their lives.
Cheese, my rear patootie!
It's a "Logan's Run" Sandman "follower" from 1976. Sort of resembles an Early version of the Iphone, but it was before they had actual screens that worked like that.
It's an early version of an EMPTY BOX with two micro-switches attached to NOTHING. You see that screen at the bottom with the gray rectangles on each corner? Do you know what those rectangles are? MASKING TAPE, holding the window screen inside the EMPTY BOX! The screen is a PHOTO! It's a MOVIE PROP, you dummy. . . a "vision" of an early Iphone (sic) device that inspired Steve Jobs. You have no idea what innovation and invention are all about.
Ivan Sutherland, 1963. Not Steve Jobs.
Sutherland's program used the inputting of complex formulas to draw vector graphics on a cathode ray oscilloscope, not to display readable fonts. Another example of your attempting to obfuscate the issues. . . and equate early inventors with later ones. Non sequiturs.
My recollection is that an Apple II cost about $1,500 in 1978. A lot of young people couldn't afford them because of the high price. (I was one of them.) Good to know we have Steve Jobs to thank for that bit of A$$holery as well.
Oh, you'd rather he price it like the IBM-PC? OK. The IBM-PC with a single floppy drive and no monitor when it came out was $2499. Happy? The Green Screen monochrome IBM monitor was $499. Even happier?
No, Edison did not think of the concept of "movies". . . they existed before Edison invented his camera and projector system. There were, in fact, many movie systems on the market before Edison's. His was the most efficient. The French were there first. He also was not first with the electric light or even the light bulb. However, Edison did indeed invent sound recording. He appears to have thought of it and invented a machine to record sound and play it back. . . although even that is disputed; an engineer in his laboratory may have been the actual inventor. One thing is certain, though, and that is that Edison invented the modern industrial laboratory system.
Apple did work with Kodak to produce the first consumer affordable digital cameras.
Neither did Jobs. Engineers who worked for him did.
Whom Steve, like Edison, directed on what to do, and how he wanted it done.

Which means they thought of the idea, and built it in hardware. Jobs comes along and says "Wow! That's neat! I'm going to browbeat my engineers into building one of these, only it's going to be extra-spectacular-super-duper better!"
"Now where did I put my whips and chains?"
Even the engineers who worked with Steve say he was often the one who came up with the ideas. Edison often came under the exact same accusations as YOU are making. . . even to the whips and chains. . . and was also defended by the people who worked with him.
Your knowledge of computer history is limited to your obsession with all things Apple to the point you sound retarded since you discount the work of others. Anti-aliased fonts were not an Apple invention and had been the work of many other people and companies.
Dude, I am quite good at three dimensional vector math, and I have written a lot of code to rotate, shade, texture map, etc, three dimensional objects. I know Jobs didn't come up with any of it, or even the idea of vector based fonts. Why are you trying to give him credit for this?
This was not trivial, contrary to your dismissive claims. It required quite a bit of creativity, and the other companies did not figure out how to do it, not because they CHOSE to avoid it, but because they COULD NOT! So they did not.
This is just a rehash of our previous discussion where everything you said Jobs was involved with, I quickly looked up to discover was all done by other people years before he even came on the scene. Jobs simply sat around demanding his software/hardware engineers implement ideas he had already heard of.
Vector graphics were being used all over the place. Remember the game "SpaceWar"?
1977. Vector Graphics.
And what's this?
Spacewar 1962. Vector Graphics.
FUNCTIONALLY of NO CONSEQUENCE???? Are you daft? Do you realize the fantastic increase in productivity that "What You See Is What You Get" editing brought to editing newspapers, magazines, printing, layout work, advertising, banners, every kind of print production work, in time saved and re-working avoided, when you no longer had to do trial-and-error cut and paste on a paste-board?
Now I get it. You don't understand how any of this stuff works. To you, it's just "magic." To me, it's just evolutionary advancements of ideas that had already been implemented elsewhere. Printing WUSIWUG is no great difficulty. The essence of every early graphics system is pixel based graphics. Essentially just contiguous locations in memory. If you can draw it, you can print it. Methods of drawing new, neat little things, are just a question of style, not one of brilliant development. (Again, style over substance.)
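To illustrate the "contiguous locations in memory" point, a minimal sketch (a toy 1-bit framebuffer of my own, not code from any historical machine):

```python
# A toy framebuffer: pixels are nothing more than bytes at computed offsets in memory.

WIDTH, HEIGHT = 32, 8
framebuffer = bytearray(WIDTH * HEIGHT)      # one byte per pixel, all zeroed

def set_pixel(x, y, value=1):
    """Drawing is just writing to an offset: row * width + column."""
    framebuffer[y * WIDTH + x] = value

# "Draw" a diagonal line.
for i in range(8):
    set_pixel(i, i)

# "Printing" is reading the very same memory back out, row by row.
for y in range(HEIGHT):
    row = framebuffer[y * WIDTH:(y + 1) * WIDTH]
    print("".join("#" if b else "." for b in row))
```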
IBM chose to do Character generator text because it was cheap, saved memory, and allowed them to think they were promoting a "business" computer, rather than a "toy." They deliberately went a different direction.
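For a rough sense of the memory argument, a back-of-the-envelope comparison (illustrative figures of that era, not taken from any IBM spec cited in this thread):

```python
# Character-generator text stores one byte per cell and keeps glyph shapes in ROM;
# a bit-mapped display must hold every pixel in RAM.

cols, rows = 80, 25
text_bytes = cols * rows                      # 1 byte per character cell
print(f"character-generator text screen: {text_bytes:,} bytes")       # 2,000 bytes

width, height, bits_per_pixel = 640, 200, 1
bitmap_bytes = width * height * bits_per_pixel // 8
print(f"bit-mapped screen of the same area: {bitmap_bytes:,} bytes")  # 16,000 bytes
```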
But now I get it. You don't grasp the nuts and bolts of how this stuff actually works. You don't have the necessary background to recognize it as iterations of the then-current art. To you it seems like evidence of Steve Jobs' brilliance; to me it just looks like he could browbeat smarter and more knowledgeable people than he was.
It's an early version of an EMPTY BOX with two micro-switches attached to NOTHING.
Yes. Something of about the caliber of a mind as brilliant as Steve Jobs could have come up with. And they thought of it first! :)
The IBM-PC with a single floppy drive and no monitor when it came out was $2499. Happy?
Who would have wanted one of those? They sucked. I wouldn't have been interested in one of those if they had given it to me. Graphics was always my thing, and IBM graphics deliberately sucked in 1980. They didn't come out with anything remotely interesting until around 1990.
No, in 1977+, the best hobbyist computer was the Apple II. Others came out, many of them quite good, but none could match the advantages Apple had from being first and biggest. IBM was crap during this era. It was great for keeping records, or word processing, but I never had any interest in Data Base or that sort of green eyeshade crap.
Okay dude, you *DO* know that you are dissecting snark? It isn't supposed to be taken seriously.
I really don't have the time to educate DiogenesLamp, as his statements are all over the place and non-factual. As for the Apple II being too expensive ("$1500" he says), I bought my 1977 Apple II for $1200. Computers back then were expensive, partly because the RAM chips were expensive. A bank of 16K RAM went for about $500 or more. If you wanted a PC that offered a lot of bang for the buck, it was the Apple II (and yes they were advertised as a "PC" meaning personal computer). IBM wasn't even in the game when Apple was playing until a few years later, and then IBM glommed onto the "PC" name as their own. In many respects, IBM imitated features of the Apple II.
Their target customers were IBM mainframe shops. The users were using 3270 terminals, and the PC was its replacement. The 8088 evolved from the 8008, which was designed to do programmable terminal emulation. That it was purpose-built for business applications was not imaginary.
Who is talking about "anti-aliased" fonts? YOU are. I am talking about the "vector graphic" fonts that the Macintosh, and Steve Jobs, pioneered. . . the one on the RIGHT, not the mess of the anti-aliased font on the left, in the picture. You are another one who doesn't know his history of computers. I was there and worked with magazines and publishing and KNOW what I am talking about. I've been a writer, editor, and PUBLISHER. . . and I've done everything in those fields, including color separations ON A COMPUTER. Try PRINTING that anti-aliased font. . . and see what you get. Try to print any screen font for professional use. It is useless for that purpose. SHEESH!
I don't discount the work of others, but you certainly discount what Apple has done.
You are an idiot if you think I think it is "magic". . . I was one of 100 high school students in the United States in the mid-60s selected by Bell Labs to participate in some of their projects, such as making our own transistors. . . and then later to participate in voice creation via electronics. Don't talk to me about me thinking things are "magic," when you try to tell me that converting vector shaped graphics into EDITABLE text is a trivial evolution of 8x8 grid ASCII code text. It certainly is NOT. . . and no one else was able to do it except Apple in 1984. Even the Lisa did not have that function.
Of course, they are plotted as mere dots on a raster screen, but I really do not think you know the complexities involved in maintaining an EDITABLE, contextual TEXT as a vector graphic. YOU are the one who thinks that's magic. . . just a simple series of easily calculable vectors. . . but now put those on a screen and make them EDITABLE with a keyboard as TEXT, and maintain kerning, spacing, proportion, at all sizes possible. That is not an easy task.
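For what that implies in practice, here is a minimal sketch with invented metrics (not Apple's data structures): the document keeps characters plus font metrics, and every pixel position, including cursor hit-testing for selection, is recomputed from them at whatever size is asked for.

```python
# Editable text kept as a model: characters + metrics in, positions out. Nothing is
# baked into pixels, so resizing and editing are just re-runs of the layout pass.

ADVANCE = {"T": 0.60, "h": 0.55, "e": 0.50, "q": 0.55, "b": 0.55, " ": 0.30}  # em units, made up

def layout(chars, point_size):
    """Return (char, x_position) pairs at the requested size."""
    positions, pen = [], 0.0
    for ch in chars:
        positions.append((ch, pen))
        pen += ADVANCE.get(ch, 0.5) * point_size
    return positions

def char_at(chars, point_size, click_x):
    """Hit-testing: map a cursor x coordinate back to a character index for selection."""
    for i, (ch, x) in enumerate(layout(chars, point_size)):
        if x <= click_x < x + ADVANCE.get(ch, 0.5) * point_size:
            return i
    return None

text = list("The qb")
print(layout(text, 12))         # same characters at 12 pt
print(layout(text, 72))         # same characters at 72 pt: no re-entry, no new bitmaps
text.insert(4, "e")             # editing is just a change to the character list
print(layout(text, 12))
print(char_at(text, 12, 20.0))  # which character did the user click on?
```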
Other computer companies could not do it. You are an asshat if you think they just bypassed the capability because they wanted to make only business computers. Bull SH!T. They make excuses because they could not. Steve Jobs and Apple found a way to do it because Steve Jobs insisted on it. . . he drove it. They could have compromised and made a text-only system like Xerox did. They did not. As a result, all computers today use the system Steve Jobs forced into existence.
Vector graphics were being used all over the place. Remember the game "SpaceWar"? 1977, vector graphics
. . .
Spacewar 1962. Vector Graphics.
You STILL DON'T GET IT. Easy child's play compared to what I just described.
You don't even understand the acronym WYSIWYG, an extremely well-known term in computers. . . you misspell it "WUSIWUG," and claim it is no great difficulty, when getting a one-to-one relationship between what one sees on a computer screen and what is printed on the output of a printer, much less on a high-quality offset four- or six-color ink printing press, is NOT AN EASY THING to do, nor trivial.
Frankly, even Apple was not perfect at doing it. . . because the early Macintosh had to use some "screen fonts" to represent some commercially available printer fonts. . . instead of ones it could draw as vector graphics, because the fontographers who made them did not provide the vector graphic files for screen display that Apple used for its own fonts.
You mentioned the Amiga computer by Commodore. I used a piece of software written by one man, Deron Kazmeier, that eschewed screen fonts entirely, embracing vector fonts completely, and it actually provided the BEST WYSIWYG I have ever seen. . . and could produce documents from the size of a postage stamp to a bill board. . . That software was available for the Mac as well.
No one is saying that Steve Jobs invented vector graphics, nor is anyone saying that Steve Jobs invented calligraphic fonts. Both, of course, predated computers by a long time. What Steve Jobs did was realize that computers were an ideal device for displaying, editing, and even printing documents using calligraphic fonts, for creating documents in an interactive environment in REAL TIME, in a way that the person doing that creative act could EDIT them and see how they looked on a virtual page without printing them and wasting time, effort, and resources. Jobs made it happen when no one else realized it was even possible.
That is no trivial thing and is the very essence of innovation and invention, regardless of how many people Jobs involved in making it happen. YOU could not have done it, and YOU would not have thought of the idea. . . and no one else in Jobs' milieu thought of it either. That is what made his contribution revolutionary.
And they thought of it first! :)
No, they did not. . . It's a walkie-talkie.
But that is STILL FICTION and a long stretch from FAKE to a working product that changes the world of smartphones and how they work, the user interface, their shapes, and even their colors. . . all thanks to the vision of one man, Steve Jobs.
The IBM-PC with a single floppy drive and no monitor when it came out was $2499. Happy?
Who would have wanted one of those? They sucked. I wouldn't have been interested in one of those if they had given it to me. Graphics was always my thing, and IBM graphics deliberately sucked in 1980. They didn't come out with anything remotely interesting until around 1990.
No, in 1977+, the best hobbyist computer was the Apple II. Others came out, many of them quite good, but none could match the advantages Apple had from being first and biggest. IBM was crap during this era. It was great for keeping records, or word processing, but I never had any interest in Data Base or that sort of green eyeshade crap.
Um, no. The Apple II was not the biggest nor best hobbyist computer. That would have been the Commodore C64, which was entered in the Guinness Book of World Records at 17,000,000 units, while Apple only made around 6,000,000 of the Apple II, including around 1.5 million of the Apple IIgs. Priced at $695, the C64 was more affordable than the Apple, so more hobbyists could afford it. There was more affordable software and more free software as well. I programmed for both.
In other words, you are a snarky poster, who posts snarky commentary of meaningless drivel. . . What would the real Diogenes say about your honesty? He'd pass you by as a speaker of untruths. . . a dishonest man who bears false witness, for the sake of seeing his foolish prattle in print. Sad. . . and shameful.
That is not vector graphics, or any sort of graphics at all.
Don't talk to me about me thinking things are "magic," when you try to tell me that converting vector shaped graphics into EDITABLE text is a trivial evolution of 8x8 grid ASCII code text.
And it's statements like these that make me think you have no idea how any of this works. Your statement can be taken two ways, and both of them indicate you are ignorant on this subject.
In one interpretation, "editable text" means editing the text characters. Yes, this could be done by bit-banging the character generator ROM in character-generator type systems. It can be done more easily, and on the fly, in any memory-based pixel system. IT IS A BIG NOTHING.
In the other interpretation, "editable text" means "word processing," which was also a big nothing.
What do you think a vector based text system does? It simply fills in memory pixels based on the vector directions given in the character description. The end product is a pattern in graphics memory. I was drawing objects (sprites) and resizing/rotating them on a TRS-80 back in 1977-78.
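A toy version of that step, with a made-up two-stroke glyph (not any real font format): take a character defined as vector strokes and fill in pixels in a memory buffer.

```python
# Rasterizing a "glyph" described as vector strokes into a pixel grid in memory.
# The stroke data below is invented purely for illustration.

GLYPH_T = [((0.1, 0.9), (0.9, 0.9)),   # top bar, in 0..1 "em" coordinates
           ((0.5, 0.9), (0.5, 0.1))]   # vertical stem

def rasterize(strokes, size):
    """Sample each stroke and set the nearest pixel; the result is just bytes in memory."""
    grid = [[0] * size for _ in range(size)]
    for (x0, y0), (x1, y1) in strokes:
        steps = size * 2
        for i in range(steps + 1):
            t = i / steps
            x = int((x0 + (x1 - x0) * t) * (size - 1))
            y = int((1 - (y0 + (y1 - y0) * t)) * (size - 1))   # flip y: top of glyph = row 0
            grid[y][x] = 1
    return grid

for row in rasterize(GLYPH_T, 16):     # the same strokes work at 16, 64, or 600 "pixels"
    print("".join("#" if p else "." for p in row))
```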
Of course, they are plotted as mere dots on a raster screen, but I really do not think you know the complexities involved in maintaining an EDITABLE, contextual TEXT as a vector graphic. YOU are the one who thinks that's magic. . . just a simple series of easily calculable vectors. . . but now put those on a screen and make them EDITABLE with a keyboard as TEXT, and maintain kerning, spacing, proportion, at all sizes possible. That is not an easy task.
No, I think it's pretty easy. Especially when most of the work has been done for you two decades earlier. Again dude, I draw all sorts of crap with graphics displays all the time. It is really not so incredibly advanced as you seem to think it is.
Other computer companies could not do it.
Correction. DID NOT BOTHER TO DO IT. They probably thought it was trivial too. The only people who wanted that sissy stuff were women, poetry writers, educrats, and various other assorted left wing style over substance types. Newspaper and Magazine people fall into this category as well.
They make excuses because they could not. Steve Jobs and Apple found a way to do that because Steve Jobs insisted on it. . .
Another statement that indicates you do not know how any of this stuff works. No dude, there was no great secret or "brilliance" to developing vector fonts. I think most people in the industry at the time simply thought their usefulness wasn't worth the degree of effort which would be required to implement them in that era.
If I was trying to build a first class computing machine back in those days, I wouldn't be worrying about the trivia of how pretty the fonts can be made to be. I would be trying to get my operations per second up, or increase memory access speed, better storage, better graphics, anything but stupid fonts. That crap appeals to little girly minds who like playing with "my little pony", not to serious men.
I remember in the early 1980s when people were talking about that Post script stuff. I thought to myself at the time that the whole thing was just D@mn silly, and grown adults ought to have better things to do than diddle around with curlicues and embellishments. If you are going to worry about such superficial crap, you might as well "bedazzle" everything.
I thought the efforts in this direction were mostly worthy of contempt. It was a lot of effort put forth on something that really did not matter. It was beyond trivial.
But yes, it appealed to all the girly minds out there to the same degree as the latest makeup and fingernail polish. Those are successful businesses too, and in the area of fonts, there is probably a very large overlap with makeup.
Frankly, even Apple was not perfect at doing it. . . because the early Macintosh had to use some "screen fonts" to represent some commercially available printer fonts. . . instead of ones it could draw as vector graphics, because the fontographers who made them did not provide the vector graphic files for screen display that Apple used for its own fonts.
Silly crap. If you are doing a graphics memory dump, you don't need to be worrying about fonts. It's just pixels at that point. Changing it into fonts just makes the process messier and more complicated.
DUMP THE MEMORY.
That is no trivial thing and is the very essence of innovation and invention, regardless of how many people Jobs involved in making it happen.
No, I think it was pretty trivial, and very much akin to coming out with a new shade of fingernail polish or mascara.
No, they did not. . . It's a walkie-talkie.
Yeah, like the Iwatch, which the Dick Tracy cartoonist thought of decades before Steve Jobs.
I guess Jobs got a lot of his ideas from cartoons and movies. :)
But that is STILL FICTION and a long stretch from FAKE to a working product that changes the world of smartphones and how they work, the user interface, their shapes, and even their colors. . . all thanks to the vision of one man, Steve Jobs.
And the only thing Steve Jobs had the mental acuity to produce were FAKES. He couldn't do the engineering. He couldn't make anything himself. He could point at cartoons and movies and then tell his engineers, "Make one of those or I will yell at you and threaten to fire you!"
Steve Jobs was a spoiled little rich boy who could get in other people's faces and say "That's not good enough! I want something cooler, something better, something super duper stupendously fantastic!"
And he could keep pushing his spoiled little rich boy act until the people who did the real work could produce something that satisfied the little spoiled brat.
Um, no. The Apple II was not the biggest nor best hobbyist computer.
It certainly was from 1977 to 1982. What, did you think it was gonna last forever?
Suggestions that Steve Jobs was a brilliant man who did great things rather than a spoiled rotten brat are a topic very much worthy of snark.
The more I learn about him, the more respect I have for his ability to hype himself and subsequently convince others to do it too.
He's all sizzle, and no steak. He did for computing technology what General Motors did for automobiles when they introduced different colors of paint.
Nothing of any real consequence, but certainly successful from a business standpoint.
This is my understanding as well. Back in those days, IBM people wore business suits, and everything they did was intended to project Business competence. Particularly relating to finances and record keeping.
In 1980, they were chasing a very different market than was Apple. Of course the two markets eventually combined/overlapped, and the rest is history.
The design considerations of the PC weren't just to "project an image". This was a case of form following function, not style over substance.
I am not interested in hearing anything with which you think to "educate" me. You are the sort of person Reagan described.
"The trouble with our opponents is not that they are ignorant... it's that they know so much which isn't so."
Computers back then were expensive, partly because the RAM chips were expensive. A bank of 16K RAM went for about $500 or more.
Bullsh*t. My recollection is that the Apple II used 4116 DRAM chips, and they didn't cost anywhere near that. I also recall that in 1978 I was upgrading the memory of my own home-built computer using 2114 1024x4 static RAM chips (I was 16, and had built my own computer out of parts), and they were something around $8.00 each, which would bring the cost of 16K worth to $256.00.
DRAM was even cheaper and had higher capacity. I could probably look up the cost of these chips, because I used to read them advertised in Popular Electronics back in the 1970s, and the entire catalog of those issues is available online.
I find MK4116 DRAMs listed for $27.50 from Quest Electronics in the July 1978 issue. That brings 16K of RAM to $220.00, and that's buying at hobbyist prices of onesies and twosies. I'm sure Apple could have worked a better price. Maybe as much as half.
If you wanted a PC that offered a lot of bang for the buck, it was the Apple II
It was the best thing available at the time. Steve Wozniak did a great job.
I stand by my statement about computers being expensive back then. Your recollection is flawed. I have receipts from the 1970s (I'm a packrat). My Apple II with 4K was $1200. 16K RAM banks (set of 8) did indeed go for $500 in late 1977. As you know, memory prices dropped and capacity increased every 18 months or so, mirroring Moore's Law. By early 1979, 16K RAM banks had dropped to $250. I bought a set and have a receipt. By the way, producing chips was so expensive that many had gold in them. My Apple II chips have gold-plated caps. So you don't know sh*t.
$27.50 X 8 chips = $220.00. If you paid more, you got ripped off.
And you don’t know how to read before you foment. I mentioned Moore’s Law. And 1977. And prices dropping by half within 18 months. And better gold-capped chips of higher quality. No ripping off when buying quality, instead of junk. You’re pulling crap from your ass again.
Oh, whoop-de-do! Sprites. . . collisions. . . all that? More child's play. Easily defined and easily tracked. NOT the same at all, and nowhere near the complexity of what I am talking about. YOU really don't have a clue what this is about. . . and what a sea change in technology it was.
Do you even know what kerning is? How do you kern two characters that have parts that stroke under other characters? Or how about characters that extend into other characters without conflicts of the points?
Consider the "T", "q" and "b" character in the above editable text flow and tell us how EASY and TRIVIAL that is to do and keep it editable. How do you adjust the kerning, the character spacing, and make it so you can drag cursor through that or easily select just one character in that string of characters to change? None of that was created by a trivial "character generator ROM." You really don't have a clue what you are talking about. . . except to throw spit wads. . . at those who did it.
Another statement that indicates you do not know how any of this stuff works. No dude, there was no great secret or "brilliance" to developing vector fonts. I think most people in the industry at the time simply thought their usefulness wasn't worth the degree of effort which would be required to implement them in that era. If I was trying to build a first class computing machine back in those days, I wouldn't be worrying about the trivia of how pretty the fonts can be made to be. I would be trying to get my operations per second up, or increase memory access speed, better storage, better graphics, anything but stupid fonts. That crap appeals to little girly minds who like playing with "my little pony", not to serious men.
I remember in the early 1980s when people were talking about that Post script stuff. I thought to myself at the time that the whole thing was just D@mn silly, and grown adults ought to have better things to do than diddle around with curlicues and embellishments. If you are going to worry about such superficial crap, you might as well "bedazzle" everything.
More proof that you are an idiot. . . manipulating graphics is one of the heavy-duty uses that required faster memory access, faster calculation, and heavier data usage. . . and the more detail, the greater the density required. Moving text around does NOT require any of those things you are talking about. . . and it was not the reason anyone was building faster computers with more memory and better graphics. Not at all.
YOU DISMISS the very thing that moved the industry ahead. . . the pursuit of the ability to move better graphics on the screen and output better graphics faster. . . whether it is on paper, film, or video. You think it is manipulating mere alphanumeric text? BS, Diogenes.
In the other interpretation, "editable text" means "word processing," which was also a big nothing.
NO, Diogenes, it does NOT mean "word processing," which shows you have no clue what you are talking about. It means DOCUMENT processing. . . a totally different animal than mere word processing. . . it means LAYOUT, and seeing the document as you change everything about it in real time, calculating those changes on the fly. . . Now, take what you see on the screen and translate it into an entirely different resolution to make it match EXACTLY on a printer, so that nothing is lost, even though your printing mode may be as different as a 300 DPI laser printer or a 3200 DPI offset press. . . or a photo printing system at even higher resolution. . . all of which require different levels of precision in their calculation. . . far more than moving a pre-defined "sprite" around on a TRS-80 screen!
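To show what that resolution independence means in code, a minimal sketch (the devices and DPI values here are just examples, not drawn from this thread): if the layout is stored in printer's points rather than screen pixels, the same description converts exactly to whatever device is attached.

```python
# Device-independent layout: positions live in points (1/72 inch) and are converted
# to each device's pixel grid at output time.

def to_device_pixels(x_pt, y_pt, dpi):
    """Convert a point-based coordinate to a pixel coordinate on a device of the given DPI."""
    return round(x_pt * dpi / 72), round(y_pt * dpi / 72)

baseline = (90.0, 144.0)            # a text baseline 1.25 in from the left, 2 in down
for device, dpi in [("screen", 72), ("laser printer", 300), ("imagesetter", 3200)]:
    print(device, to_device_pixels(*baseline, dpi))
# screen (90, 144), laser printer (375, 600), imagesetter (4000, 6400):
# one page description, three different pixel grids, nothing "dumped" from screen memory.
```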
Silly crap. If you are doing a graphics memory dump, you don't need to be worrying about fonts. It's just pixels at that point. Changing it into fonts just makes the process messier and more complicated.
Again, you demonstrate your total ignorance of what you are talking about. No, it was not a "memory dump." PostScript was a programming language in which the fonts were a description of how to DRAW each character, with exact details of each. . . down to angles, arcs, radii, etc. They were NOT any kind of "memory dump," because they had to be interpreted by multiple devices and converted to ANY resolution output. If they were merely a memory dump, the page being displayed at 72 DPI could not be printed at 300 DPI or better without duplicating the dots of the screen memory it was dumping. That is what rasterized printing did. . . and it looked it. Dot matrix printers did GREAT at doing those kinds of memory dumps. . . while they and daisy wheel printers were great at character printing too.
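A toy contrast of the two approaches, not actual PostScript: enlarging a coarse screen dump just duplicates its dots, while re-interpreting a drawing description at the output resolution produces a genuinely finer rendering. The one-stroke "glyph" here is invented.

```python
# Memory dump vs. drawing description: same final size, very different results.

def render_from_description(strokes, pixels_per_em):
    """Rasterize straight-line strokes (0..1 coordinates) at the target resolution."""
    n = pixels_per_em
    grid = [[0] * n for _ in range(n)]
    for (x0, y0), (x1, y1) in strokes:
        for i in range(n * 2 + 1):
            t = i / (n * 2)
            x = int((x0 + (x1 - x0) * t) * (n - 1))
            y = int((1 - (y0 + (y1 - y0) * t)) * (n - 1))
            grid[y][x] = 1
    return grid

def upscale_dump(grid, factor):
    """What a raster 'memory dump' print amounts to: each screen dot becomes a square blob."""
    return [[p for p in row for _ in range(factor)] for row in grid for _ in range(factor)]

slash = [((0.1, 0.1), (0.9, 0.9))]                  # a made-up one-stroke "glyph"
coarse = render_from_description(slash, 6)          # 6 px/em stands in for the coarse screen
for row in upscale_dump(coarse, 3):                 # dump-and-enlarge: blocky duplicated dots
    print("".join("#" if p else "." for p in row))
print()
for row in render_from_description(slash, 18):      # reinterpret the description at 18 px/em
    print("".join("#" if p else "." for p in row))  # same size, drawn at full resolution
```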
Yeah, like the Iwatch, which the Dick Tracy cartoonist thought of decades before Steve Jobs.
More of your abysmal ignorance, Diogenes. . . and dishonesty. Steve Jobs had been DEAD for almost 4 years before the Apple Watch came out. He had NOTHING to do with the watch.
It certainly was from 1977 to 1982. What, did you think it was gonna last forever?
You must have thought so. . .
You really are an asshat, you know that?
No, the target market for the IBM-PC was the office desktop. . . the IBM mainframe shops were companies who leased their products, and the leasing agents were not at all interested in selling stand-alone micro-computers. In fact, the PC came out of the TYPEWRITER division, not the computer division, and the computer division was singularly NOT HAPPY.
They looked on it as a competitor to their bailiwick and their terminal business, which were LEASE ONLY! They lobbied top management strongly to KILL the PC for a couple of reasons. . . primarily the OS was "not-IBM sourced" and the processors were also not IBM sourced. . . and, they claimed, it was redundant to their terminal business, which was quite profitable. "Besides," they claimed, "it would cannibalize the typewriter business."