Free Republic

Rethinking software bloat.
InformationWeek.com ^ | 12/17/01 | Fred Langa

Posted on 12/17/2001 4:33:52 AM PST by damnlimey

Rethinking 'Software Bloat'

Fred Langa takes a trip into his software archives and finds some surprises--at two orders of magnitude.
By Fred Langa

 
Reader Randy King recently performed an unusual experiment that provided some really good end-of-the-year food for thought:
I have an old Gateway here (120 MHz, 32 Mbytes RAM) that I "beefed up" to 128 Mbytes and loaded with--get ready--Win 95 OSR2. OMIGOD! This thing screams. I was in tears laughing at how darn fast that old operating system is. When you really look at it, there's not a whole lot missing from later operating systems that you can't add through some free or low-cost tools (such as an Advanced Launcher toolbar). Of course, Win95 is years before all the slop and bloat was added. I am saddened that more engineering for good solutions isn't performed in Redmond. Instead, it seems to be "code fast, make it work, hardware will catch up with anything we do" mentality.
It was interesting to read about Randy's experiment, but it started an itch somewhere in the back of my mind. Something about it nagged at me, and I concluded there might be more to this than meets the eye. So, in search of an answer, I went digging in the closet where I store old software.

Factors Of 100
It took some rummaging, but there in a dusty 5.25" floppy tray was my set of install floppies for the first truly successful version of Windows--Windows 3.0--from more than a decade ago.

When Windows 3.0 shipped, systems typically operated at around 25 MHz. Consider that today's top-of-the-line systems run at about 2 GHz. That's two orders of magnitude--100 times--faster.

But today's software doesn't feel 100 times faster. Some things are faster than I remember in Windows 3.0, yes, but little (if anything) in the routine operations seems to echo the speed gains of the underlying hardware. Why?

The answer--on the surface, no surprise--is in the size and complexity of the software. The complete Windows 3.0 operating system was a little less than 5 Mbytes total; it fit on four 1.2-Mbyte floppies. Compare that to current software. Today's Windows XP Professional comes on a setup CD filled with roughly 100 times as much code, a little less than 500 Mbytes total.

That's an amazing symmetry. Today, we have a new operating system with roughly 100 times as much code as a decade ago, running on systems roughly 100 times as fast as a decade ago.

By themselves, those "factors of 100" are worthy of note, but they raise the obvious questions: Are we 100 times more productive than a decade ago? Are our systems 100 times more stable? Are we 100 times better off?

While I believe that today's software is indeed better than that of a decade ago, I can't see how it's anywhere near 100 times better. Mostly, that two-orders-of-magnitude increase in code quantity is not matched by anything close to an equal increase in code quality. And software growth without obvious benefit is the very definition of "code bloat."

What's Behind Today's Bloated Code?
Some of the bloat we commonly see in today's software is, no doubt, due to the tools used to create it. For example, a decade ago, low-level assembly-language programming was far more common. Assembly-language code is compact and blazingly fast, but it's hard to produce, tightly tied to specific platforms, difficult to debug, and poorly suited to very large projects. All those factors help explain why assembly-language programs--and programmers--are relatively scarce these days.

Instead, most of today's software is produced with high-level programming languages that often include code-automation tools, debugging routines, the ability to support projects of arbitrary scale, and so on. These tools can add an astonishing amount of baggage to the final code.

This real-life example from the Association for Computing Machinery clearly shows the effects of bloat: A simple "Hello, World" program written in assembly comprises just 408 bytes. But the same "Hello, World" program written in Visual C++ takes fully 10,369 bytes--that's 25 times as much code! (For many more examples, see http://www.latech.edu/~acm/HelloWorld.shtml. Or, for a more humorous but less accurate look at the same phenomenon, see http://www.infiltec.com/j-h-wrld.htm. And, if you want to dive into assembly-language programming in any depth, you'll find this list of links helpful.)
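
You can see the same effect for yourself with a small C++ sketch of my own devising--this is not the ACM benchmark above, just an illustration of the same idea. It builds one trivial program two ways, and the size difference between the resulting binaries comes mostly from how much runtime and library support code gets linked in. The file name, the g++ commands, and the relative sizes are illustrative assumptions; exact numbers vary by compiler, flags, and platform.

// hello.cpp -- one trivial program, two builds. The source is nearly
// identical either way; the on-disk difference is the linked-in baggage.
//
// Example build-and-compare commands (sizes will vary by toolchain):
//   g++ -Os -s -DUSE_CSTDIO hello.cpp -o hello_c    && ls -l hello_c
//   g++ -Os -s              hello.cpp -o hello_cpp  && ls -l hello_cpp
// The iostream build is typically the larger of the two.

#ifdef USE_CSTDIO
#include <cstdio>
int main() {
    std::puts("Hello, World");      // a thin wrapper over the C runtime
}
#else
#include <iostream>
int main() {
    std::cout << "Hello, World\n";  // pulls in the C++ iostream machinery
}
#endif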

Human skill also affects bloat. Programming is wonderfully open-ended, with a multitude of ways to accomplish any given task. All the programming solutions may work, but some are far more efficient than others. A true master programmer may be able to accomplish in a couple lines of Zen-pure code what a less-skillful programmer might take dozens of lines to do. But true master programmers are also few and far between. The result is that code libraries get loaded with routines that work, but are less than optimal. The software produced with these libraries then institutionalizes and propagates these inefficiencies.
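
To make that skill gap concrete, here is a contrived C++ sketch (mine, not Fred's). Both functions are correct, and both count the values above a threshold, but one states the intent in a single standard-library call while the other manufactures an extra copy, an extra pass, and extra state to maintain:

#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// The terse version: one pass, no intermediate storage.
int count_above(const std::vector<int>& v, int threshold) {
    return static_cast<int>(std::count_if(v.begin(), v.end(),
        [threshold](int x) { return x > threshold; }));
}

// The long way around: builds a filtered copy nobody needs,
// then walks that copy a second time just to count it.
int count_above_verbose(const std::vector<int>& v, int threshold) {
    std::vector<int> matches;
    for (std::size_t i = 0; i < v.size(); ++i) {
        if (v[i] > threshold) {
            matches.push_back(v[i]);
        }
    }
    int total = 0;
    for (std::size_t i = 0; i < matches.size(); ++i) {
        total = total + 1;
    }
    return total;
}

int main() {
    const std::vector<int> v{3, 9, 1, 12, 7};
    assert(count_above(v, 5) == count_above_verbose(v, 5));  // both find 3
}

Multiply that second style across a shared code library and you get exactly the institutionalized inefficiency described above.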

You And I Are To Blame, Too!
All the above reasons matter, but I suspect that "featuritis"--the tendency to add feature after feature with each new software release--probably has more to do with code bloat than any other single factor. And it's hard to pin the blame for this entirely on the software vendors.

Take Windows. That lean 5-Mbyte version of Windows 3.0 was small, all right, but it couldn't even play a CD without add-on third-party software. Today's Windows can play data and music CDs, and even burn new ones. Windows 3.0 could only make primitive noises (bleeps and bloops) through the system speaker; today's Windows handles all manner of audio and video with relative ease. Early Windows had no built-in networking support; today's version natively supports a wide range of networking types and protocols. These--and many more built-in tools and capabilities we've come to expect--all help bulk up the operating system.

What's more, as each version of Windows gained new features, we insisted that it also retain compatibility with most of the hardware and software that had gone before. This never-ending aggregation of new code atop old eventually resulted in Windows 98, by far the most generally compatible operating system ever--able to run a huge range of software on a vast array of hardware. But what Windows 98 delivered in utility and compatibility came at the expense of simplicity, efficiency, and stability.

It's not just Windows. No operating system is immune to this kind of featuritis. Take Linux, for example. Although Linux can do more with less hardware than can Windows, a full-blown, general-purpose Linux workstation installation (complete with graphical interface and an array of the same kinds of tools and features that we've come to expect on our desktops) is hardly what you'd call "svelte." The current mainstream Red Hat 7.2 distribution, for example, calls for 64 Mbytes of RAM and 1.5-2 Gbytes of disk space, which also happens to be the rock-bottom minimum requirement for Windows XP. Other Linux distributions ship on as many as seven CDs. That's right: Seven! If that's not rampant featuritis, I don't know what is.

Is The Future Fat Or Lean?
So: Some of what we see in today's huge software packages is indeed simple code bloat, and some of it also is the bundling of the features that we want on our desktops. I don't see the latter changing any time soon. We want the features and conveniences to which we've become accustomed.

But there are signs that we may have reached some kind of plateau with the simpler forms of code bloat. For example, with Windows XP, Microsoft has abandoned portions of its legacy support. With fewer variables to contend with, the result is a more stable, reliable operating system. And over time, with fewer and fewer legacy products to support, there's at least the potential for Windows bloat to slow or even stop.

Linux tends to be self-correcting. If code bloat becomes an issue within the Linux community, someone will develop some kind of "skinny penguin" distribution that pares away the needless code. (Indeed, there already are special-purpose Linux distributions that fit on just a floppy or two.)

While it's way too soon to declare that we've seen the end of code bloat, I believe the signs are hopeful. Maybe, just maybe, the "code fast, make it work, hardware will catch up" mentality will die out, and our hardware can finally get ahead of the curve. Maybe, just maybe, software inefficiency won't consume the next couple orders of magnitude of hardware horsepower.

What's your take? What's the worst example of bloat you know of? Are any companies producing lean, tight code anymore? Do you think code bloat is the result of the forces Fred outlines, or is it more a matter of institutional sloppiness on the part of Microsoft and other software vendors? Do you think code bloat will reach a plateau, or will it continue indefinitely? Join in the discussion!



TOPICS: Editorial; Miscellaneous
More fodder for all the MS haters out there, but it does raise an interesting question: What do you think are the real reasons for "bloatware"?
1 posted on 12/17/2001 4:33:52 AM PST by damnlimey

To: damnlimey
Greed & laziness. Pure and simple.
2 posted on 12/17/2001 4:42:18 AM PST by Exnihilo

To: damnlimey
I think a lot of it has to do with people wanting the OS to do everything for them, in order to avoid having to do any detail work. In Win3.1, you had to go in and set drivers, load drivers from manufacturers' disks, and set exactly how you wanted the hardware to perform. More and more, people want Plug and Play rather than getting into the details of how a piece of hardware is supposed to work. That takes more code as well. But I'm late for work (help desk is fun, help desk is fun...if I say it enough, I almost believe it....)
3 posted on 12/17/2001 4:42:21 AM PST by Tennessee_Bob

To: damnlimey
I'm an MS hater, but I must say that a 32-bit OS improves the hell out of networking with older programs. There are software areas where a GUI is just unsuitable for data entry, and until another level for that data entry is reached, the fastest networking and user entry is on Win9x with character-based programs. And customers can still have pretty graphical printouts and graphs through the "OS".

'Course, XP blows that away.

4 posted on 12/17/2001 4:43:50 AM PST by jammer

To: jammer
Just for comparison, this site, Tiny Apps, shows just how small and tight code can get.
5 posted on 12/17/2001 4:51:47 AM PST by damnlimey

To: damnlimey
More fodder for all the MS haters

This isn't more fodder. This is one of the reasons I am an MS hater. The other is that, up until Windows 2000, application software could crash and lock up the OS. That is inexcusable.

6 posted on 12/17/2001 4:54:25 AM PST by AndyJackson

To: damnlimey
I am not an MS "hater", but I can see the obvious too.
Perhaps this is a good time to ask the question that "intrigues" me.

A long time ago we had text-based Basic, which allowed anyone to write great, fast, and useful programs to solve all types of problems, including astronomical, engineering, and technical ones.
HP Basic in particular was extremely rich in commands that made anything very easy to program and output.
Where did it go?
With today's machines, that interpreted language would be incredibly fast. And useful.

Why is there no current version?
(That I know of...)

7 posted on 12/17/2001 4:57:02 AM PST by Publius6961

To: damnlimey; bush2000; innocentbystander
Single data point: I run NT 4.0 Server and XP Home on my machine -- a PPro 200, 64MB RAM. My wife runs 98, and 2000 Pro on her machine -- a PIII 933, 128MB RAM.

Frankly, there's darn little performance difference I can detect between the two machines. In fact, I suspect I could do a blind test with a dozen random users and have a 50-50 distribution of accurate guesses as to which was the "fast" machine and which was the "dog".

I used to think my 8-MHz "Turbo" XT w/640KB RAM was the bee's knees (compared to the 4-MHz (and slower!) 8-bit 64KB iron I'd cut my teeth on). Then I got my 10 (or was it 12?) MHz 286, and I was pickin' bugs from my teeth every time I got up from the keyboard. That sucker was fast.

Now, it seems that the iron has gotten so much faster than the apps that it's a "paper competition" with little real-world meaning for the vast majority of users.

Who needs the super iron? I see two classes of users, and for one class, the term "need" is applied in the loosest of all possible senses. The two classes are "gamers", and "network admins".

The only time I start feeling "cramped" on my machine is when I'm running multiple concurrent major apps, i.e., one or two instances of Visual Studio (running an app or two), SQL Server, IIS, and IE. IOW, when I'm doing that, I'm essentially running a whole network in one cramped little box. Most people don't do that.

To chime in on the author's theme, it wasn't that long ago (at least not at the rate that the years seem to keep peeling by at my age) that a 10-16 MHz 286/386-class machine with 1-3MB of XMS memory was a high-end network server, and cost a pile of money. Nowadays, we've got secretaries using machines that would have literally cost millions (and occupied rooms) a few years ago -- as glorified typewriters.

So, my two cents is that the "bloatware" thing is overblown. When 128 megs of RAM costs less than fifty bucks, and a 60 gig hard drive costs a buck a gig (I remember paying $275 for a 20 megabyte drive -- wholesale!), and no mix of OS and apps comes anywhere near taxing the capabilities for 99% of the users, "bloatware" is a non-issue.

8 posted on 12/17/2001 4:58:30 AM PST by Don Joe

To: Bush2000; innocentbystander; SolitaryMan; Don Joe; lelio; Smogger; Dominic Harr; Rodney King...
Ping - let me know if you want to be added/removed from ping list!
9 posted on 12/17/2001 5:03:13 AM PST by stainlessbanner

To: damnlimey
You want small? We use the K language (a descendent of APL) from Kx Systems. (Total size < 200KB - that's right, kilobytes!) We've built a complete database management system that fits onto a single floppy!
10 posted on 12/17/2001 5:05:04 AM PST by ZeitgeistSurfer

To: jammer
"There are software areas where a GUI is just unsuitable for data entry, and until another level for that data entry is reached, the fastest networking and user entry is on Win9x with a character based programming."

Two things: one, there's nothing preventing you from deploying character-mode apps to an NT/2K/XP platform, and two, if a GUI-based data entry app has worse usability than a character-mode counterpart, it's the programmer's fault, not the GUI's. Granted, too many people do little more than drag and drop textboxes and then bind them to fields, but that's their fault. I can drive my car into a brick wall. If I do, that's not an indictment of Toyota.

11 posted on 12/17/2001 5:07:31 AM PST by Don Joe

To: damnlimey
Greed, laziness, ignorance, and stupidity.

For more than 10 years I have watched with amusement as PCs that are much faster than the old ones take LONGER to boot up -- typically, a 1.2-MHz PC XT, vintage 1986, would be fully ready to go in a few seconds, while the current models are more than 1000 times as fast and take nearly 10 times as long, because they are doing 10,000 times as much computational work!

But the REAL problem with software bloat is not the slowness, it is the complexity which makes applications almost impossible to properly debug. NOBODY I know, and I know a LOT of computer types, makes any attempt to fix Microsoft-related errors themselves as they would with Unix or Linux, nor do they bother trying to get Microsoft to fix them because it just won't happen; instead they just shrug, reboot, and work around. A certain level of "Your program has performed an illegal operation and will be shut down" and a (lower) level of total freeze-ups and blue screens of death are simply accepted as a tolerable inconvenience.

But every single time this happens, there are one or more theoretically identifiable HUMANS who made specific MISTAKES that could be tracked down and blamed on them. The practical difficulties of this are sufficient that most of us are willing to simply let them be condemned to hand-simulate the infinite loops of their own programs in programmers' hell after they pass on.

12 posted on 12/17/2001 5:14:30 AM PST by VeritatisSplendor

To: damnlimey
Techno-Bump

This is an interesting "discovery." I guess the author is saying that the latest h/w resources allow the older OS to provide better-than-ever performance, because older OSes don't have the unneeded overhead.

I hate unneeded overhead anyway.

Russ

13 posted on 12/17/2001 5:16:21 AM PST by kinsman redeemer

To: ZeitgeistSurfer
--this software you have, could it work as forum-style software as well? If not, please excuse; I am far from being any sort of alpha geek on these matters.
14 posted on 12/17/2001 5:17:43 AM PST by zog

Comment #15 Removed by Moderator

To: damnlimey
Yea, there's bloat all right...

But there NEEDS to be another Hardware solution...

Simultaneous calls and virtual multiple clocks or something...

Then it'll all work!

Something big, yes sir... that's it!

16 posted on 12/17/2001 5:35:58 AM PST by No!

To: damnlimey
bump
17 posted on 12/17/2001 5:43:51 AM PST by 74dodgedart

To: damnlimey
With 60G of hard drive space for less than $200 and 512 MB of RAM for about $100, who cares how large it is? I recently read they created a 180G hard drive. Storage is cheap! As long as a reasonably fast speed is still there (I don't recall ever waiting for the operating system software to perform any operation), who cares how big the code is?
18 posted on 12/17/2001 5:44:29 AM PST by SW6906

To: damnlimey
I used to hand-code assembly language for the 6502 microprocessor (Apple II - 1 MHz). That little thing was awesome, many instructions executing in a single clock or two, like an early RISC device. You could do amazing things. I graduated to Turbo Pascal on the early 4.77-MHz IBM PC. I used that with some inline assembly code, and that sh*t would fly!

Ahh, the good ole days..

19 posted on 12/17/2001 5:45:03 AM PST by Paradox

To: SW6906
Let me correct that: I do wait about a minute or so for W2K to start up, but from then on, it's lightning fast (1.33-GHz processor, 512MB RAM!).
20 posted on 12/17/2001 5:46:27 AM PST by SW6906

