Posted on 08/11/2005 9:25:47 PM PDT by HAL9000
SAN JOSE, California -- For the first time in five years, Intel Corp. will make a major change in the plumbing of its chips by switching to a new design that promises better performance and lower power consumption than today's Pentium 4.
The world's largest chip maker will announce the architecture this month at a conference in San Francisco, spokesman Bill Calder said Thursday. Chips based on the new architecture are expected to debut in the second half of 2006.
The technology will replace the Netburst architecture that appeared in late 2000 with the Pentium 4 and enabled a path to higher processing frequencies. At the time, Intel hoped that it could boost performance by ratcheting up the speed of the chips.
But Netburst hit a roadblock last year as jumps in frequency failed to produce the expected improvements in overall performance. In addition, the chips required more power and thus generated considerably more heat.
"The original theory was Netburst would show increasing performance benefits with increasing frequency," said Nathan Brookwood, an analyst at the research firm Insight 64. "It didn't work quite the way Intel had anticipated."
Meanwhile, Intel's main rival, Advanced Micro Devices Inc., dropped out of the frequency race and engineered chips that do more work per clock tick rather than running at a faster pace. In most cases, AMD's technology bested Intel's chips.
Intel's new architecture is expected to be based in part on the Pentium M, which was developed to deliver performance and power savings in notebook computers. It also has roots in the Pentium III processor that Intel launched in 1999.
Like AMD's chips, the Pentium M runs at a lower top clock speed than the Pentium 4, which currently tops out at 3.8 gigahertz.
Brookwood said the change represents a bigger shift for Intel. In the past, Intel launched architectures on chips for desktop PCs and then carried the technology to other platforms, such as servers and notebooks.
"This next-generation part was originally designed and targeted for mobile," he said. "Now it's going to be proliferated onto desktops and servers. I think that in some ways is the bigger news than just the microarchitecture change."
Like the top-of-the-line Pentium 4, the next-generation processors are also expected to have multiple computing engines on a single chip, security features and manageability functions.
"AMD will face tougher competition once Intel moves to the new architecture," Brookwood said. "But it's far too soon to be able to predict who's going to be ahead 18 months from now."
This looks good. We need to keep moving forward. But for goodness' sake, how about we don't sell it to our enemies, at least until the next upgrade.
The article mentions AMD's higher IPC, but fails to mention if Intel expects improvements on that front.
The shift to the Pentium-M core has been predicted by CPU enthusiasts for what, over a year now? Is there any real news here?
> "AMD will face tougher competition once Intel moves
> to the new architecture," Brookwood said.
Sure, if they stand still.
Oh, good--we won't have to buy bottles of liquid nitrogen to cool our PCs.
Looks like AMD made some smart moves and now leads Intel in the proc race. Good for them. I put an Athlon in Mrs. randog's 'puter last year and we're both pleased with the performance.
If they could only switch to the correct byte ordering.
Did you know that if you pour liquid nitrogen on an open text book, and then turn the page, the page cracks off, sort of like breaking a cracker? (Don't ask how I know..)
Not a single Intel chip in any of the computers in this house, all AMD, thank you very much....
Mac ping?
This statement is basically untrue. The 486, Pentium, Pentium II, Pentium III, and P4, as well as the Xeon, were ALL initially targeted at the server market, with Intel's announcements and initial advertising in each case virtually pooh-poohing the need for these chips on the desktop.
Of course the market thought otherwise, and Intel was glad to take the money.
Only the Pentium (4) M and Celerons were originally clearly intended for desktop machines.
Whew! I was worried that, following Apple's switch to Intel, Intel would switch to PowerPCs.
Oh gawd, if they did that, half the world's software would need a rewrite, not to mention half the internet standards.
I'm afraid that boneheaded move is going to have a LONG legacy, and we will be fighting that for decades.
Mac Ping?
About this article reporting Intel switching their processor architecture after Apple switched their Mac architecture to Intel ... and I thought, "what the heck" ... so PING!
If you want on or off the Mac Ping List, Freepmail me.
Microsoft did. ;)
I've stuck with the PIII because it runs cool, even without a fan. There's no need to use one of the newer waffle irons unless you play games.
It's early. Steve Jobs could decide to stick with the PowerPC family. ;')
Although in most situations nowadays there aren't really advantages to either byte ordering, there are some advantages to little-endian when it comes to machines' ability to read and process data efficiently, and no advantages at all that I'm aware of to big-endian except for making hex dumps look nice.
BTW, I wonder if the decision to use mod-65535 checksums in IP packet standards has anything to do with the fact that they are independent of byte ordering (meaning that byte-swapping all the data in a packet will simply byte-swap the checksum).
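(The property the poster describes is real: the Internet checksum is a ones'-complement sum of 16-bit words, so byte-swapping every word just byte-swaps the result. Below is a minimal C sketch, not from the original thread, that demonstrates it; the sample header words are made up for illustration.)

#include <stdio.h>
#include <stdint.h>

/* RFC 1071-style ones'-complement checksum over a buffer of 16-bit words. */
static uint16_t inet_checksum(const uint16_t *words, size_t count)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < count; i++)
        sum += words[i];
    while (sum >> 16)                       /* fold the end-around carries */
        sum = (sum & 0xFFFF) + (sum >> 16);
    return (uint16_t)~sum;
}

static uint16_t bswap16(uint16_t x) { return (uint16_t)((x << 8) | (x >> 8)); }

int main(void)
{
    /* Hypothetical header words, for illustration only. */
    uint16_t data[4] = { 0x4500, 0x0073, 0x0000, 0x4000 };
    uint16_t swapped[4];
    for (int i = 0; i < 4; i++)
        swapped[i] = bswap16(data[i]);

    uint16_t c1 = inet_checksum(data, 4);
    uint16_t c2 = inet_checksum(swapped, 4);

    printf("checksum of original data:     0x%04X\n", c1);
    printf("checksum of byte-swapped data: 0x%04X\n", c2);
    printf("swapping the data swapped the checksum: %s\n",
           (bswap16(c1) == c2) ? "yes" : "no");
    return 0;
}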
Early Athlons forever ruined me for AMD. They'd burn to a crisp in seconds if the heat sink wasn't seated exactly right. The first sign of trouble was that burning-plastic smell. The core of the CPU literally liquefied and poured out all over the place.
No thanks, Intel for me. Even if that was one AMD line that was bad... 'Fool me once, you can't fool me again'.
The current AMD chips are great, but I will admit that the Intel chips are more tolerant of user idiocy.
When I got my first PIII chip, I couldn't get the heatsink on, so I figured, "I'll turn it on for a minute and make sure I have it in correctly."
The computer booted and immediately shut off with no damage. The chip still powers my mom's computer several years later.
AMD now has the same kind of technology, but they did lose a lot of people to burnt CPUs.
Some data are processed more efficiently with big-endian storage.
The x86 little-endian architecture is a relic of Intel's calculator chip designs of the 1970s. Even Intel's engineers admit that little-endian was a bad decision for x86. I wish that Intel would add big-endian access to their next generation of chips.
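(For readers who haven't run into the distinction being argued here, a small hypothetical C sketch, not from the original thread, showing what the two byte orders mean in practice: the same four bytes in memory decode to different 32-bit values depending on which ordering is assumed. The byte values are made up for illustration.)

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    const uint8_t bytes[4] = { 0x12, 0x34, 0x56, 0x78 };

    /* Assemble both interpretations explicitly so the result does not
       depend on the byte order of the machine running the program. */
    uint32_t little = (uint32_t)bytes[0]
                    | (uint32_t)bytes[1] << 8
                    | (uint32_t)bytes[2] << 16
                    | (uint32_t)bytes[3] << 24;

    uint32_t big    = (uint32_t)bytes[0] << 24
                    | (uint32_t)bytes[1] << 16
                    | (uint32_t)bytes[2] << 8
                    | (uint32_t)bytes[3];

    printf("little-endian read: 0x%08X\n", little);   /* prints 0x78563412 */
    printf("big-endian read:    0x%08X\n", big);      /* prints 0x12345678 */
    return 0;
}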