A problem that has gotten bad and is only getting worse is web page content.
More and more scripting languages are embedded in web pages to run advertising and other behind-the-scenes activity, and they can slow loading and performance to a snail's pace. Even with 512 MB of RAM, web pages alone are running my system so low on memory that the virtual swap file has to be enlarged. This is a web page - excuse me, you memory-hog programmers.
See, most people don't consider the "big picture" of what they are doing, whether it is a C++/C# application, a web page, or something else. I come from the days when the home computer was a simple creature, capable of running only a single program at a time. Programmers had to be frugal with resources and sharp with their algorithms. On top of that, it was usually one programmer with full control - he didn't need to worry much about shared interfaces and the like. Code could get pretty sloppy compared to today, but the execution was far more efficient.
This is no longer the case, and I've long believed that the "sloppiness" (in execution terms) of the big picture - with DLLs, COM, and the rest each allocating more resources than they need, and doing it on the fly - has degraded performance faster than Moore's law can keep up!
With 30,000x the CPU power of 30 years ago, computers today should boot and start programs in the blink of an eye - even if OSes and programs are 1,000x more complex. Trouble is, they are 100,000x as complex, and they don't need to be.