Posted on 08/25/2004 5:19:10 PM PDT by Lexinom
One of the trends in modern computer science is the push for larger, more generic building blocks. In theory, this leads to faster development cycles, but at the cost of fine-grained flexibility. In practice, I've found that the arrangement of object files (the actual machine-readable code generated from the source code) and their interdependencies plays an often-overlooked role in determining the effort necessary to complete a given project. What do I mean by this?
Let's take an example from the physical world. A bed of rocks, perhaps alongside a river, "arranges" itself such that the smallest grains of sand slip through the cracks made by the larger pebbles, which themselves settle amidst the larger smooth, round rocks. In the software world, the library or module lowest in the dependency chain - we'll call it A1 - is used by A2, which in turn is used by A3. A1 ought to be the most granular, with the fewest dependencies and the broadest use of intrinsic types.
If A1 is a module written in the C language, it may use the char, int, unsigned int, long, unsigned long, and void * data types without worry, because all are intrinsic to the C language. Strong typing buys nothing for performance, and any semantic value it adds can be more safely effected with clear, terse commenting.
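To make that concrete, here's a minimal sketch of what an A1 might look like (the module name and function are invented for illustration):

/* a1.c -- hypothetical "bottom of the pile" module.
 * Only intrinsic C types cross the interface; no third-party headers. */
#include <stdio.h>

/* Sum n bytes of buf into an unsigned long checksum. */
unsigned long a1_checksum(const void *buf, unsigned long n)
{
    const unsigned char *p = buf;
    unsigned long sum = 0;
    while (n--)
        sum += *p++;
    return sum;
}

int main(void)
{
    char msg[] = "hello";
    printf("checksum: %lu\n", a1_checksum(msg, 5UL));
    return 0;
}

Any caller, whatever frameworks it drags in higher up the chain, can use this without inheriting a single extra dependency.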
If A1, on the other hand, uses a fancy STL library and a third-party GUI library, which in turn uses a system-level library (we'll call it LIBSYS) that is itself incompatible with another commonly used system library (say LIBC), we have created a potential monster. What if, for example, someone working on A3 decides to import a GUI library which requires LIBC? A3 still needs A1 with its dependencies, and one such tie is to LIBSYS - incompatible with LIBC. Let's assume, further, that this developer gets fed up with the project and moves to another company. The next poor soul will inherit an intricate labyrinth of a codebase, rife with competing dependencies, left in a sorry and broken state. The "grain of sand" has grown too big to fit under the rocks it was expected to support.
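A contrived sketch of the kind of collision I mean (every name here is invented for the example):

/* conflict.c -- why mixing the hypothetical LIBSYS and LIBC headers breaks. */

/* From the LIBSYS header that A1 depends on: */
typedef unsigned long sys_handle;         /* handles are integers */

/* From the LIBC-based GUI header that A3 wants: */
typedef struct sys_handle_s *sys_handle;  /* handles are pointers --
                                             same name, incompatible type */

/* The two typedefs collide, so this file does not compile -- and neither
 * does any translation unit that needs both libraries at once. */
int main(void) { return 0; }

The compiler catches this version; the nastier cases only surface at link time or, worse, at run time.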
That's my take on dependencies. One is justified in questioning the boost in productivity achieved by running off with the latest, trendiest technologies. "Keep it Simple, Sam".
> Just some ramblings for late in the day.
Glad you explained.
For a minute there, I thought I was on slashdot.
It's been a frustrating day... Just had to vent a little, even if only 5% understand... :-)
Unfortunately, that is not always true. Intel processors can't even store an integer correctly. They use the brain-damaged "little endian" integer storage format.
Please keep me on your ping list for this stuff.
Thanks!
Those technologies aren't designed to boost programmer productivity - they are designed to lock in guaranteed library licensing revenue streams for software companies for a decade or more (the magazines always forget to mention how long legacy applications built with these libraries end up staying in production). ;)
begin
  poor_soul := microsoft_programmer;
end
On the other hand, big-endian storage shows the written representation directly in memory: 0x4000 is stored as the bytes 0x40 0x00.
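A quick way to watch this on any machine, sketched in standard C:

#include <stdio.h>

int main(void)
{
    unsigned short x = 0x4000;
    unsigned char *p = (unsigned char *)&x;

    /* Big-endian hardware prints "40 00"; little-endian prints "00 40". */
    printf("%02x %02x\n", p[0], p[1]);
    return 0;
}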
BTW, was HAL9000 big-endian? :-)
Open Source rules.
Hey, I have to work on this Windows stuff, but my pride and joy is straight C programming on a Tandem mainframe, with an operating system interface developed circa 1980. The focus there is always on engineering a superior product, not on fitting disjointed pieces together into a vast Micro$oft jigsaw puzzle.