Course XP blows that away.
This isn't more fodder. This is one of the reasons I am an MS hater. The other is that, up until Windows 2000, application software could crash and lock up the OS. That is inexcusable.
A long time ago we had text-based BASIC, which allowed anyone to write great, fast, and useful programs to solve all types of problems, including astronomical, engineering, and technical problems of every sort.
HP BASIC in particular was extremely rich in commands that made anything very easy to program and output.
Where did it go?
With today's machines, that interpreted language would be incredibly fast. And useful.
Why is there no current version?
(That I know of...)
Frankly, there's darn little performance difference I can detect between the two machines. In fact, I suspect I could do a blind test with a dozen random users and have a 50-50 distribution of accurate guesses as to which was the "fast" machine and which was the "dog".
I used to think my 8 MHz "Turbo" XT w/640 KB RAM was the bee's knees (compared to the 4 MHz (and slower!) 8-bit, 64 KB iron I'd cut my teeth on). Then I got my 10 (or was it 12?) MHz 286, and I was pickin' bugs from my teeth every time I got up from the keyboard. That sucker was fast.
Now, it seems that the iron has gotten so much faster than the apps that it's a "paper competition" with little real-world meaning for the vast majority of users.
Who needs the super iron? I see two classes of users, and for one class, the term "need" is applied in the loosest of all possible senses. The two classes are "gamers", and "network admins".
The only time I start feeling "cramped" on my machine is when I'm running multiple concurrent major apps, i.e., one or two instances of Visual Studio (running an app or two), SQL Server, IIS, and IE. IOW, when I'm doing that, I'm essentially running a whole network in one cramped little box. Most people don't do that.
To chime in on the author's theme, it wasn't that long ago (at least not at the rate that the years seem to keep peeling by at my age) that a 10-16 MHz 286/386-class machine with 1-3 MB of XMS memory was a high-end network server, and cost a pile of money. Nowadays, we've got secretaries using machines that would have literally cost millions (and occupied rooms) a few years ago -- as glorified typewriters.
So, my two cents is that the "bloatware" thing is overblown. When 128 megs of RAM costs less than fifty bucks, and a 60 gig hard drive costs a buck a gig (I remember paying $275 for a 20 megabyte drive -- wholesale!), and no mix of OS and apps comes anywhere near taxing the capabilities for 99% of the users, "bloatware" is a non-issue.
For more than 10 years I have watched with amusement as PCs that are much faster than the old ones take LONGER to boot up. Typically, a 4.77 MHz PC XT, vintage 1986, would be fully ready to go in a few seconds; the current models are more than 1000 times as fast and take nearly 10 times as long, because they are doing roughly 10,000 times as much computational work!
But the REAL problem with software bloat is not the slowness; it is the complexity, which makes applications almost impossible to properly debug. NOBODY I know, and I know a LOT of computer types, makes any attempt to fix Microsoft-related errors themselves as they would with Unix or Linux, nor do they bother trying to get Microsoft to fix them, because it just won't happen; instead they just shrug, reboot, and work around the problem. A certain level of "Your program has performed an illegal operation and will be shut down," and a (lower) level of total freeze-ups and blue screens of death, are simply accepted as a tolerable inconvenience.
But every single time this happens, there are one or more theoretically identifiable HUMANS who made specific MISTAKES that could be tracked down and blamed on them. The practical difficulties of this are sufficient that most of us are willing to simply let them be condemned to hand-simulate the infinite loops of their own programs in programmers' hell after they pass on.
This is an interesting "discovery." I guess the author is saying that the latest hardware resources allow an older OS to provide better-than-ever performance, because older OSes don't have the unneeded overhead.
I hate unneeded overhead anyway.
Russ
But there NEEDS to be another Hardware solution...
Simultaneous calls and virtual multiple clocks or something...
Then it'll all work!
Something big, yes sir... that's it!
Ahh, the good ole days...
Everyone wants behemoth-sized, gadget-packed goods rather than efficient, simple wares.
Seems like more of a status symbol than practicality. Get XP and brag in your hood today!
Storage was a tape recorder.
It had many useful (at the time) programs, and using BASIC I was able to write my own, well within the limits of the machine.
However, I wanted more. Each new machine gave me more hard drive space. When I had my first computer with a 4 MB hard drive, I wondered how I would ever fill the space. Well, I did. My previous computer had a 9 GB hard drive, and when I got it I did not think it would be possible to ever fill the space. Well, I did.
My current computer has 80 GB, and I no longer wonder if I will ever fill the space, but rather how long it will be before I have to start deleting files.
I am looking forward to my first "80 Terabyte" hard drive.
The point is, just as any project will expand to fill the time allotted to it, any hard drive, no matter how big, will get filled. As long as the price of hard drives falls as their size increases, I no longer worry about bloatware.
:-}
#include "iostream.h"
int main(int argc,char **argv) { cout << "Hello world"; }
The executable is 8800 bytes in Solaris 2.6. However, most of that is part of the necessary overhead to communicate with the OS via the linked-in library. If you wrote all the library code yourself, you'd be wasting a lot of time and introducing unnecessary bugs.
The whole point of OO is really to optimize the efficiency of writing code and the likelihood that it will work correctly. If you write clean C++, the executables might be somewhat larger, but they will run very fast.
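To get a feel for how much of that is library overhead, here's a minimal sketch, assuming a POSIX system (sizes are illustrative only and vary by compiler, flags, and platform), that produces the same output through the raw write() system call instead of iostream:

// Sketch: "Hello world" via the POSIX write() syscall, skipping
// the iostream machinery. Assumes a POSIX system; binary sizes
// are illustrative and depend entirely on the toolchain.
#include <unistd.h>   // write()

int main() {
    const char msg[] = "Hello world";
    write(1, msg, sizeof(msg) - 1);   // fd 1 is standard output
    return 0;
}

Comparing the stripped binary sizes of the two versions gives a rough measure of what the linked-in iostream library and C++ runtime add.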
Streaming video.
Quake III.
Video conferencing.
MP3s.
DVDs.
CD burning.
Video editing.
Pause/rewind live TV.
...and plenty more, all doable within minutes of unboxing a new PC.
Oh sure, there's plenty of bloat. You don't need a large percentage of what disk space gets allocated for. Pounding out a quick memo doesn't need more than the first word processor that ran on an Apple II. But... compare the size proportions between commonly used applications and the data files they handle. The music you're listening to is likely a 5 MB .MP3 (out of a collection spanning gigabytes); the LotR trailer you'll watch during lunch is about 50 MB; the Unreal Tournament session you'll play this evening chews up 500 MB in maps and related data files... and none of these apps could reasonably or conveniently be run on the relatively svelte machines of the past.
Complaints of "bloat" have appeared with every upgrade. Sure it all gets bigger...yet there's no question that there's more functionality in the straining load in my 700MHz, 20GB laptop than the PS/2 collecting dust in my brother's basement - that's WHY the older machines are mostly collecting dust: the "bloatware" machines actually do more.
Many people might not remember this, but one of the main reasons argued for graphical interfaces was that by building all that code into the OS, programs would be able to share & re-use code so effectively that programs would GET SMALLER!
Hmmm. Didn't seem to happen.
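For what it's worth, the sharing mechanism itself does exist. Here's a minimal sketch, assuming a Linux-style system (the library name "libm.so.6" and linking with -ldl are platform assumptions), of a program borrowing code at runtime from a shared library that every other program on the box can reuse:

// Sketch of runtime code sharing via a shared library.
// Assumes Linux; the soname "libm.so.6" is a platform assumption.
#include <dlfcn.h>    // dlopen, dlsym, dlclose
#include <iostream>

int main() {
    // One system-wide copy of the math code, shared by all programs.
    void *lib = dlopen("libm.so.6", RTLD_LAZY);
    if (!lib) { std::cerr << dlerror() << '\n'; return 1; }

    // Look up cos() inside the shared image and call it.
    typedef double (*cos_fn)(double);
    cos_fn my_cos = reinterpret_cast<cos_fn>(dlsym(lib, "cos"));
    if (my_cos) std::cout << my_cos(0.0) << '\n';   // prints 1

    dlclose(lib);
    return 0;
}

Shared code like this was supposed to make every binary smaller; evidently the savings got spent on new features instead.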
Mark W.