Unless we have developed Star Wars holograms for common use, computers are pretty much topping out on anything that we need to do with them.
Yep. Moore’s law was never going to go on forever. Every kind of technology hits a practical peak, doesn’t improve much from there, and then we move on to other things. Are record players much better today than they were in 1980? How about cassette decks? Is toothpaste much better than it was in 1957?
Not even close. Remember, it’s not just about performance... it’s also about cost and size.
You ain’t seen nothing yet.
>>Unless we have developed Star Wars holograms for common use, computers are pretty much topping out on anything that we need to do with them.
I mostly agree with this. For most of my career I upgraded my machine every 18-24 months and saw huge performance gains each time, which made it worth it. Now, for 99% of what I do, I don’t notice any improvement in my day-to-day experience as a software developer.
My employer sent me a brand new top-of-the-line laptop a few months ago to replace my three-year-old model (I didn’t even request it), and it sat unopened on my desk for three months before I finally bothered to set it up, knowing it would function exactly like my old one.
Completely disagree. Two examples:
First, the latest iPhone has a 48 megapixel camera. Fifteen years ago the original iPhone had a 2 megapixel camera, and everyone thought the pictures were plenty "good enough". Improvements are going to just keep on coming, and the computing power needed to handle those pictures keeps increasing with them.
Secondly, computing power has increased enough that computer-generated movies are almost indistinguishable from reality. With more power they will be completely indistinguishable. Do you really think no one will pay for that upgrade?
And the STEM uses of computers will always be able to absorb more and more power.