Posted on 04/04/2005 9:08:38 AM PDT by infocats
Forty years ago, Electronics Magazine asked Intel co-founder Gordon Moore to write an article summarizing the state of the electronics industry.
The article outlined what became known as Moore's Law, the observation that the number of transistors--tiny on/off switches that churn out electrical signals that get represented as 1s and 0s--on a chip can be doubled in a short period of time. Adopted as a yardstick by the tech industry, the concept is one of the reasons the industry evolved into a high-growth, but high-risk, affair.
This FAQ explains the impact and consequences of the principles set down in the April 19, 1965, article.
What is Moore's Law?
When writing the article, Moore noted that the number of devices (which then included transistors and resistors) inside chips was doubling every year, largely because engineers could shrink the size of transistors. That meant that the performance and capabilities of semiconductors were growing exponentially and would continue to do so. In 1975, Moore amended the law to state that the number of transistors doubled about every 24 months.
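As a back-of-envelope sketch (assuming yearly doubling from 1965 to 1975 and doubling every 24 months thereafter, per the figures quoted in this thread), the compounding can be checked in a few lines of Python:

```python
# Rough Moore's Law extrapolation from the numbers cited in the article:
# ~60 devices on a chip in 1965, doubling yearly until the 1975 revision,
# then doubling every 24 months through 2005.
devices_1965 = 60
doublings_1965_to_1975 = 10   # one doubling per year for 10 years
doublings_1975_to_2005 = 15   # one doubling per 24 months for 30 years

projected_2005 = devices_1965 * 2 ** (doublings_1965_to_1975 + doublings_1975_to_2005)
print(f"Projected 2005 device count: {projected_2005:,}")
```

The projection works out to roughly 2 billion devices, the same order of magnitude as the 1.7 billion transistors cited for Intel's Itanium, which is a rough illustration of why the observation held up as a yardstick.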
When the paper first came out, chips sported about 60 distinct devices. By contrast, Intel's latest Itanium chip comes with 1.7 billion silicon transistors.
As monumental as the article has become, it wasn't a big deal then. It started on page 114 of the magazine.
"It wasn't something you expected to join the archives," Moore said in a recent gathering with reporters. "I didn't think it would be especially accurate."
(Excerpt) Read more at news.zdnet.com ...
And possibly longer than that when looking at information storage and retrieval as a whole.
The longevity of Moore's Law will someday be understood for what it is: the first evidence that the human race is now working for the computers, not the other way around.
Whatever is necessary to sustain the growth of cybernetic culture will be provided.
(steely)
Here is Dr. Moore's original paper:
ftp://download.intel.com/research/silicon/moorespaper.pdf
Hitachi To Produce 1 Terabyte Desktop Drives ...expected available late 2005.
I heard a professor speculate that Moore's law may apply across all of civilization as a whole. Look at how long it took to advance over the centuries and how this pace kept increasing with time.
It would be interesting to do some research to see if this has basically been the rule for thousands of years.
I have been looking for a picture of an OLD IBM 1405 Disk Drive...had BIG Platters and an arm that went in and out and up and down to access the rotating platter....
I actually installed a replacement for one of those at the Fleming Warehouse in Topeka , Kansas shortly after I started working for Big Blue....it was something to see.....
How far we have come....
Yes. This makes sense.
What humans do is sometimes described as a "search for truth." It may sound like a quibble, but I hold that a more precise description would be "a sort for truth."
Over time, the difference between n^2 and n*log(n) really adds up!
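To put rough numbers on that quip (a hypothetical illustration, with n and the workloads chosen only for the sake of example): an n^2 process on a million items does about 10^12 steps, while an n*log2(n) process does about 2x10^7, a factor of roughly 50,000.

```python
import math

# Compare the growth of n^2 (e.g., a naive pairwise comparison) versus
# n * log2(n) (e.g., an efficient comparison sort) as n increases.
for n in (1_000, 1_000_000):
    quadratic = n ** 2
    n_log_n = n * math.log2(n)
    print(f"n={n:>9,}: n^2 = {quadratic:.2e}, "
          f"n*log2(n) = {n_log_n:.2e}, "
          f"ratio ~ {quadratic / n_log_n:,.0f}x")
```

The gap widens without bound as n grows, which is the poster's point: over time, the difference really does add up.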
(steely)
Actually a whole lot longer than that, if you view Moore's law as a limited application of the larger principle of the evolution of information, starting right from the big bang, through the physical evolution of the universe, biological evolution and cultural evolution.
When wetware wouldn't run any faster, information evolution just jumped to silicon technologies. Of course, I think you would need to find a different time scale to explain all this, but in terms of the amount of information in the universe, there might be some argument that it has always proceeded at a geometrical pace.
It was in 1906 that G.W. Pickard of Amesbury, Massachusetts, perfected the crystal detector, and in November of that year he took out a patent for the use of silicon in detectors. Arguably this was the start of the silicon revolution, and it did not take long before experimenters achieved amplification using crystal devices, long before the term transistor was devised.
Solid-state electronics were born even earlier, when Ferdinand Braun invented a solid-state rectifier using a point contact based on lead sulphide in 1874. But it's to Pickard that the credit goes for discovering that the point contact between a fine metallic wire (the so-called cat's whisker) and the surface of certain crystalline materials (notably silicon) could rectify and demodulate high-frequency alternating currents, such as those produced by radio waves in a receiving antenna (what Pickard called a wave-interceptor). His crystal detector (point-contact rectifier) was the basis of countless crystal set radio receivers, a form of radio receiver that remained popular until the crystal detector was superseded by the thermionic triode valve.
By its nature the crystal rectifier was a passive device, with no signal gain. But radio historian Lawrence A. Pizzella WR6K notes anecdotal stories of shipboard wireless operators in the second decade of the 20th century achieving amplification using a silicon carbide (carborundum) crystal and two cat's whiskers. He cites a taped interview made in 1975 with Russell Ohl at his home in Vista, California, in which claims of signal gain were made. This is an excerpt from Ohl's testimony:
As a parenthetical aside, I didn't pay too much attention to the transistor in '48 (although being only 5 at the time might have had something to do with it), considering it more chemistry than electronics.
Not until 1956, when Sony (no baloney) first introduced it commercially in a small portable radio, did I start to take it seriously.
Interesting.
Talk about universal. I never looked at it this way. Never looked at the big bang, an expanding universe, symmetry breaking, formation of matter, and the formation of large structures in the universe as an evolution of information.
I wonder what the next five substrates will be.
A few years later, we were astonished by the breathtaking technological advances that led to the company buying a 20 MEGAbyte Winchester fixed disk drive that was about the size of a VCR. Now, we have 3.5" disk drives hitting 1 TeraByte and memory cards the size of a postage stamp that hold over 1 GigaByte.
It's been a wonderful thing to witness such amazing growth in computer technology.
The key to "information evolution" as you call it (not quite the right phrase, but I get your meaning) is the power of symbols, of symbolic information, which is so powerful because it can be copied with an error rate very close to zero. This, in turn, is the result of the separation (with symbols) of information from the energy used to transmit that information; by making information separate from energy, the randomness of thermodynamics is eliminated. Maxwell's Demon exists in the world of symbols.
DNA encodes symbolic information; just look at the chart relating triple base-pair codons to the amino acids they code. Life itself depends on the copying of symbolic information through endless generations with exceedingly low error rates.
Human intelligence operates in the world of symbols also. Symbols thinking about symbols.
(steely)
Even if Moore's law ceased to function today, it wouldn't be so bad (from a computing standpoint, not from a business point of view). We have more computing power today than we know what to do with, and we certainly don't make the best use of what we have. If we were still squeezing every last drop of performance out of the available hardware, as we did back in the days of limited, expensive computing resources, the performance would be incredible. Instead we have bloatware and inefficient operating systems that sap much of the gains that have been made in hardware performance. In terms of interactive desktop performance, the 2.4 GHz machine with a gig of RAM that I'm using today is only marginally better than the 486DX33 systems I was using 12 or 13 years ago.
Got a link to his paper by chance?