Posted on 06/10/2008 6:23:21 AM PDT by conservatism_IS_compassion
SAN FRANCISCO--June 9, 2008--Apple® today previewed Mac OS® X Snow Leopard, which builds on the incredible success of OS X Leopard and is the next major version of the world's most advanced operating system. Rather than focusing primarily on new features, Snow Leopard will enhance the performance of OS X, set a new standard for quality and lay the foundation for future OS X innovation. Snow Leopard is optimized for multi-core processors, taps into the vast computing power of graphics processing units (GPUs), enables breakthrough amounts of RAM and features a new, modern media platform with QuickTime® X. Snow Leopard includes out-of-the-box support for Microsoft Exchange 2007 and is scheduled to ship in about a year.
"We have delivered more than a thousand new features to OS X in just seven years and Snow Leopard lays the foundation for thousands more," said Bertrand Serlet, Apple's senior vice president of Software Engineering. "In our continued effort to deliver the best user experience, we hit the pause button on new features to focus on perfecting the world's most advanced operating system."
Snow Leopard delivers unrivaled support for multi-core processors with a new technology code-named Grand Central, making it easy for developers to create programs that take full advantage of the power of multi-core Macs. Snow Leopard further extends support for modern hardware with Open Computing Language (OpenCL), which lets any application tap into the vast gigaflops of GPU computing power previously available only to graphics applications. OpenCL is based on the C programming language and has been proposed as an open standard. Furthering OS X's lead in 64-bit technology, Snow Leopard raises the software limit on system memory up to a theoretical 16TB of RAM.
Using media technology pioneered in OS X iPhone, Snow Leopard introduces QuickTime X, which optimizes support for modern audio and video formats resulting in extremely efficient media playback. Snow Leopard also includes Safari® with the fastest implementation of JavaScript ever, increasing performance by 53 percent, making Web 2.0 applications feel more responsive.*
For the first time, OS X includes native support for Microsoft Exchange 2007 in OS X applications Mail, iCal® and Address Book, making it even easier to integrate Macs into organizations of any size.
*Performance will vary based on system configuration, network connection and other factors. Benchmark based on the SunSpider JavaScript Performance test on an iMac® 2.8 GHz Intel Core 2 Duo system running Mac OS X Snow Leopard, with 2GB of RAM.
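The announcement doesn't show what coding against Grand Central actually looks like. As a rough sketch only: the snippet below uses the dispatch-queue C API (dispatch_get_global_queue, dispatch_apply, and block syntax) that Grand Central Dispatch shipped with in Snow Leopard, to spread the iterations of a loop across however many cores the machine has.

/* A minimal sketch of the Grand Central Dispatch queue model as it
 * shipped in Snow Leopard: dispatch_apply() hands each loop iteration
 * to the global concurrent queue, which schedules the work across all
 * available cores. Build on Mac OS X 10.6 or later: clang gcd_demo.c */
#include <dispatch/dispatch.h>
#include <stdio.h>

#define N 1000000

static float v[N];

int main(void)
{
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    /* Each iteration runs as a block; GCD decides how to spread the
     * blocks over the machine's cores. */
    dispatch_apply(N, q, ^(size_t i) {
        v[i] = (float)i * 2.0f;
    });

    printf("v[42] = %f\n", v[42]);
    return 0;
}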
Technically speaking, I would think so. From a business standpoint? I don't know what's involved in activating that 8th core. Is it possible to do so after testing? What's the cost involved?
I'm not familiar with the Cell--does it really only have 7 cores? That seems like an odd number (pun semi-intended). To me, 8 cores would be more efficient from a scheduling and resource handling standpoint. I know HPL would probably run more efficiently on 8-core chips.
As NVDave notes, such details do matter, down at the hardware level, when one actually has to spend transistors and (even worse) leads coming off the die for each such bit. Intel has to size such processors for the biggest case need; and the operating system people have to write to that hardware, making sure to manage all those 44 bits correctly.
And then the marketing people get to skim over the internal design docs, pick off some nice big number, and brag incoherently ;).
From my perspective, that 44 bits wasn't enough. I'm using that same chip in a system measured in petabytes of RAM, not terabytes. It takes some serious operating system and external hardware magic to add bits that aren't there.
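For anyone wondering where the press release's "theoretical 16TB" comes from, it's just those 44 physical address bits spelled out: 2^44 bytes is 16 TiB. A trivial check:

/* 44 physical address bits -> 2^44 addressable bytes, i.e. the
 * "theoretical 16TB" figure from the announcement. */
#include <stdio.h>

int main(void)
{
    unsigned long long bytes = 1ULL << 44;                 /* 2^44 */
    printf("2^44 bytes = %llu (%llu TiB)\n", bytes, bytes >> 40);
    return 0;
}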
I'm sorry, I thought it was you who I was previously discussing the Cell with.
From what I heard, in a usual batch some chips will be useless (errors in the PPC CPU, in multiple SPE cores, or elsewhere), some will have an error on just one SPE core (the SPEs make up about half of the chip area) and some will be pristine. After testing they use all the chips in the middle case and blow one core in the last case to keep everything consistent. The alternative would be to set the PS3 standard to all eight cores, and they'd have to throw away all those chips with just one defective core.
I'm not familiar with the Cell--does it really only have 7 cores?
Eight cores, seven active in the PS3 due to yield considerations as discussed above, six available to developers (the last reserved for the PS3's OS). I was just thinking to use the pristine ones for these applications and the seven-core ones for the PS3 market.
That center bus between the elements is over 200 GB/sec, the memory interface on the left is over 25 GB/sec, the I/O on the right is over 60 GB/sec. Pretty spiffy, huh? I like how the SPEs are lined up mirror-image.
I know HPL would probably run more efficiently on 8-core chips.
You can go one of two ways: Parallelize the task and have all six SPEs working on equal chunks of the larger problem, or serialize the task and have each SPE work on a part of the larger problem then pass the problem down for further processing while it receives its chunk of the next problem. Each SPE has IIRC 256K of fast local SRAM, so you can probably store a smaller program and fit more of a working dataset in there when serializing without having to go out to main memory as much.
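As a generic illustration of the first option (equal chunks, one per worker), here's a host-CPU sketch using pthreads instead of real SPE programs and local-store DMA; NUM_WORKERS is set to six only to mirror the SPEs a PS3 developer gets, and the array sum is just a stand-in for whatever the actual kernel would be.

/* Data-parallel split of one large array across six workers, each
 * summing its own chunk. On a real Cell you would run SPE programs and
 * DMA data into each SPE's 256K local store; this only illustrates the
 * partitioning idea. Build with: gcc -std=c99 -pthread split.c */
#include <pthread.h>
#include <stdio.h>

#define NUM_WORKERS 6          /* the six SPEs available to PS3 developers */
#define N (6 * 100000)

static double data[N];

struct chunk { int start, end; double partial; };

static void *sum_chunk(void *arg)
{
    struct chunk *c = arg;
    double s = 0.0;
    for (int i = c->start; i < c->end; i++)
        s += data[i];
    c->partial = s;
    return NULL;
}

int main(void)
{
    pthread_t workers[NUM_WORKERS];
    struct chunk chunks[NUM_WORKERS];

    for (int i = 0; i < N; i++)
        data[i] = 1.0;         /* dummy data */

    int per = N / NUM_WORKERS;
    for (int w = 0; w < NUM_WORKERS; w++) {
        chunks[w].start = w * per;
        chunks[w].end   = (w == NUM_WORKERS - 1) ? N : (w + 1) * per;
        pthread_create(&workers[w], NULL, sum_chunk, &chunks[w]);
    }

    double total = 0.0;
    for (int w = 0; w < NUM_WORKERS; w++) {
        pthread_join(workers[w], NULL);
        total += chunks[w].partial;
    }
    printf("total = %f\n", total);
    return 0;
}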
What program, if I may ask? I've been using Delicious Library, and it's got some pretty good features, including using an iSight to scan UPC codes. But I'd like something that does a better job with books too old to have a UPC (of which I have a lot), especially if it can do ISBN lookups and check against the Library of Congress.
Make some of that RAM flash memory, and you're talking about all but eradicating the line between RAM and mass storage. That could be fairly revolutionary, though we're a non-trivial number of years away from it being practical, let alone mainstream.
We're a ways away from seeing how it plays out, but I can't blame Apple for taking one upgrade cycle to focus on tightening things up rather than release a raft of new features. It'd be nice if more companies took the time to do that every few revisions (I'm looking at you, Microsoft).
What does the Cell processor have to do with Apple’s upcoming operating system? Nothing, Apple already dumped IBM for their processors, and the Cell isn’t x86 or PPC compatible either. I’m sure Sony and IBM would rather you buy a PS3 and try to get some obscure version of Linux working on it, but I’d take a Mac Mini over a kludge like that any day.
"A keyboard. How quaint."
I really don't see voice becoming the primary input mechanism for most computer users unless there are a whole lot of advances in AI under the hood. Speech between two humans is an efficient mode of communication only because humans are able to infer what should fill in the gaps. Even then, it's easily misunderstood; without miscommunication and wrong conclusions, we would have no basis for sitcoms.
If speech control of computers is based on crisp, sharply articulated commands issued in a consistent and logical temporal order every time, I don't see it replacing the keyboard (or mouse, or even handwriting) without a change in the programming philosophy behind it, not just the application of more computing horsepower under the hood, however impressive that horsepower may be.
Didn't Leopard already drop the G3? Dropping support for the G4 and G5 at the same time would be unusual for Apple, but not unprecedented. The first release of OS X dropped support for the PPC601, 603 and 604 at the same time, IIRC.
I'm not a programmer or an engineer, so please forgive me if any of this is misinformed, over-simplistic, or just plain stupid.
The difference between pre-Y2K and today is that a programmer is less likely to hard-code limitations into critical software -- they'd be in subroutines or at the OS level, making it much easier to update or port a program than it was with the old, patched-to-the-Nth-degree, mostly undocumented COBOL and FORTRAN code.
Another difference is that we might have actually learned from Y2K (stranger things have happened), and databases have come a long way. If the critical data is stored in some standardized form, it would be a lot easier to move to another program or platform, even running the old and new systems in parallel to make the switch smoother.
And finally, virtualization is a pretty mature technology. It's easier now than before to run old software in its own little sandbox while making a transition to the new hotness. Bringing it back to Apple, this is something they're old hands at -- 680x0 emulation on PPC, PPC emulation on Intel, and Classic on OS X all made those transitions shockingly smooth.
What it means is that your son could very easily see the time when an unnecessary software limitation creates a crisis in the operating system. Which, looked at in that way, is pretty optimistic after all. Why would OS X necessarily last two human generations?
I guess my point is that it's a lot more modular than it used to be. OS X might not be around in two generations, just like few modern-day admins have even seen the big iron the Internet was built on. But TCP/IP survives, and if you get a couple of beers in a cranky old-timer, he'll start ranting about how "Web 2.0" is really just telnet 5.0, or gopher with pictures. Or, for that matter, that it's all just an extension of the telegraph, which was a packet-switched digital network before the telephone gummed things up with all that analog stuff.
Others noticed too. Apple PR says he’s had “a common bug” for the last few weeks.
You can download and install Safari 3 on your old G4. It has the expandable text windows. Unless you have a screen stretcher (Pat. Pend.), I can't help you with the width of the view...
Maybe not even that big. The PC I used in the '80s, all of its software, all of my data, and even the PC itself -- in an emulated form -- can fit on a microSD card that could be hidden under the stamp on a post card.
Follow the thread: from harnessing the GPU for regular programming tasks, to what is a GPU (matrix/vector processor), to another example of that kind of processor and the massive speed you can get with it, the Cell.
Nothing, Apple already dumped IBM for their processors
Apple wouldn't have adopted the Cell had they stuck with IBM, because it's not good for general-purpose computing. The Cell is far more advanced than what was made for Apple, and IBM is putting a lot of work into it because it has a far larger market (and thus much more ROI) than Apple ever provided. The current customers also don't constantly whine for improvements or demand special treatment.
and the Cell isn't x86 or PPC compatible either
The central processor of a Cell is a Power core much like a simplified G4 PowerPC. The versions of Linux that run on the PS3 are PowerPC builds. You really need to do your research before you write.
I'm sure Sony and IBM would rather you buy a PS3 and try to get some obscure version of Linux working on it
They don't care much about Linux on this, since it won't make them much money. IMHO, the PS3 allows a Linux installation under a hypervisor so hackers don't have the "I had to hack it to run Linux on it" excuse. Of course, the limits the hypervisor puts on it are making them whine anyway.
but I'd take a Mac Mini over a kludge like that any day.
Good system for light general-purpose computing. Poor system for the high-speed tasks a Cell is designed for. To give you an example, in the weighted Folding@Home stats, 53,560 PS3s are putting out 1,510 teraflops while 198,529 PCs are putting out 189 teraflops. That's an average of .9 gigaflops per PC, 28 gigaflops per PS3. Even if you count that there are a lot of old PCs in the mix, giving them an order of magnitude benefit of the doubt still leaves them in the dust.
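Those per-machine averages are just the totals divided by the client counts; a quick check with the numbers quoted above:

/* Sanity check of the per-machine averages: total teraflops divided by
 * machine count, converted to gigaflops. Figures are the Folding@Home
 * numbers quoted in the post, not current stats. */
#include <stdio.h>

int main(void)
{
    double ps3_tflops = 1510.0, ps3_count = 53560.0;
    double pc_tflops  = 189.0,  pc_count  = 198529.0;

    printf("PS3: %.2f gigaflops each\n", ps3_tflops * 1000.0 / ps3_count); /* ~28.2 */
    printf("PC:  %.2f gigaflops each\n", pc_tflops  * 1000.0 / pc_count);  /* ~0.95 */
    return 0;
}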
To give you another example, a new Mini can process a 1080p HD video stream. A Cell can easily process six simultaneously, including doing transformations on each stream.
Real kludge there, one that's going to be the first petaflop supercomputer, used to help maintain our nuclear arsenal.
That's the one. I've also found the Amazon-only lookup limiting, but I do like that it checks Amazon Germany for my German books. It also can't look up paperbacks well using the scanner. Still, it's the best thing I've found so far: 367 books entered (probably 3/4 successfully with the scanner) and I've barely started.
The PS3 does not have a Cell with 7 SPUs. The Cell in the PS3 has 8 SPUs, with 1 reserved to the firmware/OS, leaving 7 for developers to utilize.
Whoops. You guys already knew that. Sorry.
The Cell has 8 on the die, the PS3 uses 7 (one lost to manufacturing error or purposely deactivated) with one reserved for the OS, leaving 6 left for developers.
They do it all the time. Take, for example, using a signed integer (roughly -2 billion to +2 billion) as the index for records, which is very common. Then you find out your application is much more popular than expected and you're about to run out of numbers for the index. You then have to change them all to unsigned integers to get 4 billion records, or to 64-bit unsigned longs. With the latter you get about 18 quintillion, effectively unlimited, but you never thought to do that in the beginning because it takes up twice the memory when processing and twice the space in the database.
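The ranges being juggled there, straight from limits.h (the 18-quintillion figure corresponds to a 64-bit unsigned integer, so unsigned long long is used below to keep it unambiguous):

/* The three index ranges from the example above: a signed 32-bit int
 * tops out around 2.1 billion, an unsigned one around 4.3 billion, and
 * a 64-bit unsigned integer at roughly 18 quintillion. */
#include <stdio.h>
#include <limits.h>

int main(void)
{
    printf("signed int max:         %d\n",   INT_MAX);    /* ~2.1 billion */
    printf("unsigned int max:       %u\n",   UINT_MAX);   /* ~4.3 billion */
    printf("unsigned long long max: %llu\n", ULLONG_MAX); /* ~1.8e19      */
    return 0;
}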