Posted on 09/22/2022 5:15:24 AM PDT by FarCenter
...
To critics, Huang said he feels the higher price is justified, especially since the cutting-edge Lovelace architecture is necessary to support Nvidia’s expansion into the so-called metaverse.
“A 12-inch [silicon] wafer is a lot more expensive today than it was yesterday, and it’s not a little bit more expensive, it is a ton more expensive,” Huang said.
“Moore’s Law’s dead,” Huang said, referring to the standard that the number of transistors on a chip doubles every two years. “And the ability for Moore’s Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It’s completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past.”
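As a back-of-the-envelope illustration of the doubling Huang is describing (an illustrative sketch, not from the article; the baseline count and year are made up):

```python
# Illustrative Moore's-law arithmetic: transistor count doubling every
# two years from a hypothetical baseline of 1 billion transistors.
def transistors(years_elapsed, base=1_000_000_000, doubling_period=2.0):
    """Transistor count after `years_elapsed` years of steady doubling."""
    return base * 2 ** (years_elapsed / doubling_period)

# Ten years at a two-year doubling period is 5 doublings, i.e. 32x.
print(transistors(10) / transistors(0))  # -> 32.0
```

Huang's point is that this curve no longer tracks cost: even if density keeps climbing, the price per wafer is rising fast enough that cost per transistor is no longer falling with it.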
“Computing is not a chip problem, it’s a software and chip problem,” Huang said.
(Excerpt) Read more at marketwatch.com ...
Most ICs will be produced using DUV processes for the indefinite future until the costs of EUV can be drastically reduced and throughput increased.
Unless we have developed Star Wars holograms for common use, computers are pretty much topping out on anything that we need to do with them.
Yep. Moore’s law was never going to go on forever. Every kind of technology hits a practical peak, doesn’t improve much from there, and then we move on to other things. Are our record players much better today than they were in 1980? How about cassette decks? Is toothpaste much better than it was in 1957?
At my first job in semiconductors we used 2” wafers.
Not even close. Remember, it’s not just about performance... it’s also about cost and size.
You ain’t seen nothing yet.
Once chiplets are worked out for gaming GPUs the prices will stabilize and quit going through the roof.
>>Unless we have developed Star Wars holograms for common use, computers are pretty much topping out on anything that we need to do with them.
I mostly agree with this. I used to upgrade my machine every 18-24 months for most of my career and saw huge gains in performance each time, which made it worth it. Now, for 99% of what I do, I don’t notice any improvement in my day-to-day experience as a software developer.
My employer sent me a brand new top-of-the-line laptop a few months ago to replace my three-year-old model (I didn’t even request it), and it sat unopened on my desk for three months before I finally bothered to set it up, knowing it would function exactly like my old one.
How about music? Can you tell anymore if a song was from today or from say, 2005?
That will probably continue, but most people don’t need, nor necessarily want, computers the size of cell phones. There is a minimal size factor needed as well.
I still use a 2010 Macbook, and it’s fine, performance wise.
Has there been any good music made since 2005?
>>My employer sent me a brand new top-of-the-line laptop a few months ago to replace my three-year-old model (I didn’t even request it), and it sat unopened on my desk for three months before I finally bothered to set it up, knowing it would function exactly like my old one.<<
Your company didn’t send you a new laptop because they want to give you a faster computer. They sent you one because disk drives, even solid-state drives, tend to start failing after 3-5 years.
How would you be affected if your laptop’s disk died right now, taking your data with it? (I hope you back up important work to your company’s network)
https://www.newegg.com/insider/how-long-do-hard-drives-and-ssds-last/
You're kind of reinforcing the original point. The issue you describe is one of physical breakdown and not necessarily a technological improvement.
A few months ago I transferred the last useful files off a Dell computer I had purchased in 2011. Most of them had already been moved over to the 2016 model when I went through an upgrade. And now I've got one that I purchased in 2021.
I got the 2011 files moved just in time; the computer broke down completely several weeks later and now it won't even boot up.
Interestingly, while opening boxes I had packed two years ago when I began the process of moving my office, I came across an old Dell computer that dates back to 2005 (if I remember correctly). It still works fine, but no modern software will run on it.
Quantum computing might not double the number of transistors, but at some point it might let computing power catch back up to the pace Moore’s law used to deliver.
Curious if this takes into consideration the cost of inflation and other cost increases due to global demand for materials, supply chain issues, and the whole host of other global issues?
Heat dissipation is a current limiting factor on computing.
In Israel, there’s a company that prints entire PCs on essentially a plastic sheet for pennies. You can roll it up and put it into a tube.
That tech will mature, and iPhone-equivalent tech will become part of clothing, whatever. For basically no cost.