Posted on 12/17/2001 4:33:52 AM PST by damnlimey
Rethinking 'Software Bloat'

Fred Langa takes a trip into his software archives and finds some surprises--at two orders of magnitude. By Fred Langa
Reader Randy King recently performed an unusual experiment that provided some really good end-of-the-year food for thought:

I have an old Gateway here (120 MHz, 32 Mbytes RAM) that I "beefed up" to 128 Mbytes and loaded with--get ready--Win 95 OSR2. OMIGOD! This thing screams. I was in tears laughing at how darn fast that old operating system is. When you really look at it, there's not a whole lot missing from later operating systems that you can't add through some free or low-cost tools (such as an Advanced Launcher toolbar). Of course, Win95 is from years before all the slop and bloat was added. I am saddened that more engineering for good solutions isn't performed in Redmond. Instead, it seems to be a "code fast, make it work, hardware will catch up with anything we do" mentality.

It was interesting to read about Randy's experiment, but it started an itch somewhere in the back of my mind. Something about it nagged at me, and I concluded there might be more to this than meets the eye. So, in search of an answer, I went digging in the closet where I store old software.

Factors Of 100

When Windows 3.0 shipped, systems typically operated at around 25 MHz or so. Consider that today's top-of-the-line systems run at about 2 GHz. That's two orders of magnitude--100 times--faster. But today's software doesn't feel 100 times faster. Some things are faster than I remember in Windows 3.0, yes, but little (if anything) in the routine operations seems to echo the speed gains of the underlying hardware. Why?

The answer--on the surface, no surprise--is in the size and complexity of the software. The complete Windows 3.0 operating system was a little less than 5 Mbytes total; it fit on four 1.2-Mbyte floppies. Compare that to current software. Today's Windows XP Professional comes on a setup CD filled with roughly 100 times as much code, a little less than 500 Mbytes total. That's an amazing symmetry.
Today, we have a new operating system with roughly 100 times as much code as a decade ago, running on systems roughly 100 times as fast as a decade ago. By themselves, those "factors of 100" are worthy of note, but they raise the question: Are we 100 times more productive than a decade ago? Are our systems 100 times more stable? Are we 100 times better off?

While I believe that today's software is indeed better than that of a decade ago, I can't see how it's anywhere near 100 times better. Mostly, that two-orders-of-magnitude increase in code quantity is not matched by anything close to an equal increase in code quality. And software growth without obvious benefit is the very definition of "code bloat."

What's Behind Today's Bloated Code?

Very little of today's software is handcrafted in lean, low-level code. Instead, most of today's software is produced with high-level programming languages that often include code-automation tools, debugging routines, the ability to support projects of arbitrary scale, and so on. These tools can add an astonishing amount of baggage to the final code.

This real-life example from the Association for Computing Machinery clearly shows the effects of bloat: A simple "Hello, World" program written in assembly comprises just 408 bytes. But the same "Hello, World" program written in Visual C++ takes fully 10,369 bytes--that's 25 times as much code! (For many more examples, see http://www.latech.edu/~acm/HelloWorld.shtml. Or, for a more humorous but less-accurate look at the same phenomenon, see http://www.infiltec.com/j-h-wrld.htm. And, if you want to dive into Assembly language programming in any depth, you'll find this list of links helpful.)

Human skill also affects bloat. Programming is wonderfully open-ended, with a multitude of ways to accomplish any given task. All the programming solutions may work, but some are far more efficient than others. A true master programmer may be able to accomplish in a couple lines of Zen-pure code what a less-skillful programmer might take dozens of lines to do.
But true master programmers are also few and far between. The result is that code libraries get loaded with routines that work, but are less than optimal. The software produced with these libraries then institutionalizes and propagates these inefficiencies.

You And I Are To Blame, Too!

Take Windows. That lean 5-Mbyte version of Windows 3.0 was small, all right, but it couldn't even play a CD without add-on third-party software. Today's Windows can play data and music CDs, and even burn new ones. Windows 3.0 could only make primitive noises (bleeps and bloops) through the system speaker; today's Windows handles all manner of audio and video with relative ease. Early Windows had no built-in networking support; today's version natively supports a wide range of networking types and protocols. These--and many more built-in tools and capabilities we've come to expect--all help bulk up the operating system.

What's more, as each version of Windows gained new features, we insisted that it also retain compatibility with most of the hardware and software that had gone before. This never-ending aggregation of new code atop old eventually resulted in Windows 98, by far the most generally compatible operating system ever--able to run a huge range of software on a vast array of hardware. But what Windows 98 delivered in utility and compatibility came at the expense of simplicity, efficiency, and stability.

It's not just Windows. No operating system is immune to this kind of featuritis. Take Linux, for example. Although Linux can do more with less hardware than can Windows, a full-blown, general-purpose Linux workstation installation (complete with graphical interface and an array of the same kinds of tools and features that we've come to expect on our desktops) is hardly what you'd call "svelte."
The current mainstream Red Hat 7.2 distribution, for example, calls for 64 Mbytes of RAM and 1.5-2 Gbytes of disk space, which also happens to be the rock-bottom minimum requirement for Windows XP. Other Linux distributions ship on as many as seven CDs. That's right: Seven! If that's not rampant featuritis, I don't know what is.

Is The Future Fat Or Lean?

But there are signs that we may have reached some kind of plateau with the simpler forms of code bloat. For example, with Windows XP, Microsoft has abandoned portions of its legacy support. With fewer variables to contend with, the result is a more stable, reliable operating system. And over time, with fewer and fewer legacy products to support, there's at least the potential for Windows bloat to slow or even stop.

Linux tends to be self-correcting. If code bloat becomes an issue within the Linux community, someone will develop some kind of a "skinny penguin" distribution that will pare away the needless code. (Indeed, there already are special-purpose Linux distributions that fit on just a floppy or two.)

While it's way too soon to declare that we've seen the end of code bloat, I believe the signs are hopeful. Maybe, just maybe, the "code fast, make it work, hardware will catch up" mentality will die out, and our hardware can finally get ahead of the curve. Maybe, just maybe, software inefficiency won't consume the next couple orders of magnitude of hardware horsepower.

What's your take? What's the worst example of bloat you know of? Are any companies producing lean, tight code anymore? Do you think code bloat is the result of the forces Fred outlines, or is it more a matter of institutional sloppiness on the part of Microsoft and other software vendors? Do you think code bloat will reach a plateau, or will it continue indefinitely? Join in the discussion!
coupled with inadequate supervision/management of those responsible for 'systems' by their own programming staff and accompanied by...
They got a lot of these straight-out-of-college types running things from where I am sitting. Many of these guys don't know their *ss from a hole in the ground. Lots of engineer types in charge, too. I dunno. A lot of these guys have never written a shred of commercial-quality code.
inadequate programming standards (such as tolerating missing documentation, no modular thinking, no shared libraries/resources)...
We are affected by the lack of documentation. Big companies have an entire documentation department; a large phone company that we work with comes to mind. Their product is buggy and inferior in many ways, however. They have so many analysts this, program managers that, and other managers over there that sometimes I wonder who actually does the work. When you're at a smaller shop, though, documentation suffers. Good programmers should self-document, in the code.
rush deadlines agreed to by managers, forcing their programming staff to take shortcuts and produce inadequate documentation--or none at all;
Hahaha...Hey! You been spying on my company?
The operating environment--the 'shop' itself. For example, high programmer turnover and its impact when somebody has to go behind and decode spaghetti.
This usually cannot be helped. Lord knows I and everyone here have cleaned up plenty of spaghetti.
Good points. Not sure if all of these things are more responsible for bloat than the advent of rapid application development languages.
Bingo! You win a prize!
Yep. Well said.
Thanks. I'm in diagnostics research and quality troubleshooting. IOW, I'm a professional pessimist.
I won that prize back in 1986. My brother was in a EE class and was supposed to write software for a microcontroller that would display letters on an LCD screen. He had all the bits that would turn on the segments of the display and needed to write the code that would translate ASCII into the segments. He was having trouble, so he called me.
I told him to make an indexed table, using the ASCII value minus 'a' as the index, and to put in the table the word that would cause the LCD to display the letter matching that ASCII value. I told him that, if he worked for me, I would prefer a solution that cost several kilobytes of memory per system over one that took him days to write and test, because hardware is cheap and I could build the price of the memory into the system. I couldn't recover his time in the same fashion.
The instructor used his result to demonstrate good engineering to the class.
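The table trick described above can be sketched in C. The segment bit patterns below are hypothetical placeholders (bits 0-6 standing for segments a-g), not values from the actual 1986 project:

```c
#include <stdint.h>

/* Hypothetical 7-segment encodings, one per lowercase letter.
   Bit 0..6 = segments a..g; only a few entries are filled in here. */
static const uint8_t seg_table['z' - 'a' + 1] = {
    ['a' - 'a'] = 0x77, /* illustrative pattern for 'a' */
    ['b' - 'a'] = 0x7C, /* illustrative pattern for 'b' */
    ['c' - 'a'] = 0x39, /* illustrative pattern for 'c' */
    /* ... remaining letters would follow ... */
};

/* Translate an ASCII letter ('a'..'z') to its segment word with a
   single table lookup -- no per-character conditional logic. */
uint8_t ascii_to_segments(char ch)
{
    return seg_table[ch - 'a'];
}
```

The table costs a little memory per unit, but the translation code shrinks to one array index--exactly the memory-for-programmer-time trade described above.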
Shalom.
Argee does hit the mark on programmer costs vs. cheaper hardware.
And, as you said, "rapid application development languages certainly makes it easy to churn out code"
Perhaps Bloatware itself isn't the problem--provided two things: the Bloatware performs as advertised AND you have the CPU cycles to spare (as ctdonath2 noted).
But, if you unfortunately purchased or have had to maintain Crapware, then that Crapware is often Bloatware.....
But usability has been improved. At least that's what they tell me to say.
HA!
Have you seen the requirements for installing Linux?
Linux ISN'T immune to this 'phenomenon' either ...
True, and I would argue that it's better to err on the side of caution. Look at how widely Code Red was able to spread because people had no idea that their Windows machines had a web server running. If a user gets to the point where he needs advanced functionality, he should be sufficiently competent to enable it himself.
I know you're a coder - but I did not blame the individual developer or coder.
As a matter of fact, what I wrote was that it begins with the 'instruction', however that 'developer' obtained the instruction:
I wrote:
It has everything to do with Programming Design or the lack of it. This lack is a result of:
If you are not taught to code properly; if the manager/supervisor does not instruct you otherwise; if you manage to sell your code and people don't complain or have no alternative, you will continue to code poorly, and, perhaps, heavens forbid, churn out Crapware and Bloatware.
You did say software bloat is a complex issue - I agree - but for a different reason: because we have not defined it precisely.
"My favourite is 'writing hard core C to create slick tight code'."
-- Bill Gates
That's not a good example at all! The "bloat" you see there is so that the programmer doesn't need a Master's degree and ability to do this:
title Hello World Program (hello.asm)
; This program displays "Hello, World!"
dosseg
.model small
.stack 100h
.data
hello_message db 'Hello, World!',0dh,0ah,'$'
.code
main proc
mov ax,@data
mov ds,ax
mov ah,9
mov dx,offset hello_message
int 21h
mov ax,4C00h
int 21h
main endp
end main
And can, instead, do this:
#include <iostream>
int main() {
    std::cout << "Hello, World!" << std::endl;
    return 0;
}
"I would have written a shorter letter but didn't have time." -- Blaise Pascal (1623 - 1662).
Simple question, simple answer: the reason for all the bloatware is that programmers will link and compile entire libraries of code into their application, even if they're only using one or two functions from that library.
Much of this article struck a chord with me. My first programming job out of college was with a financial services company that specialized in car dealership management software. The first version of the entire application ran on a single 1.2-Mbyte floppy--and that included months' worth of sales and service records. Granted, this was back in 1987.
In 1990 when I left, the code was up to 3 Mbytes, with a month's sales taking up another 3 Mbytes. As we added functionality, our function libraries got bigger and bigger. When we switched programming tools, our code base tripled, simply from the nature of the compiler and the necessary rewrite of the code to utilize new function libraries.
This article makes perfect sense, the author is spot on. The tools to create the applications are hogs. The libraries required to write code and add functionality are hogs. Programmers (self included) are hogs because we're lazy. We'll gladly link in that entire function library to get the ONE function we need. As long as we don't have to write it from scratch, it's fine by us.
That's the ONLY reason necessary and sufficient. That hardware costs less than programmer time is immaterial. In fact I've worked on projects where that wasn't true, yet bloat still occurs.
As long as someone will pay on an invoice, anything can happen.
The far better question is this:
Why do the people who pay prefer Bloatware? Time after time, bloatware wins the market battle. Why?
And in most cases this is a good thing to do. Every line of code you don't write is a line that won't have a bug. I'll gladly use libraries that reduce the amount of code I have to write, even if the resulting program is bigger. I'll work faster, and users will get better reliability.
Unfortunately, too often today programs seem to be bigger *and* buggier, so linking libraries may not be the culprit there.
You should have made it multiple choice. Can I provide the choices?
a. Bloatware is inherently superior to properly sized software.
b. People who write bloatware have more time to spend developing marketing hype.
c. Evil, unlawful, monopolistic practices.
d. People are idiots.
Shalom.