Posted on 01/21/2004 10:44:10 PM PST by farmfriend
Today Linux, Tomorrow the World?
By James V. DeLong
The term "open source" is linked with software, and most particularly with Linux, the operating system which, it is hoped or feared, can challenge both Microsoft's position on the desktop and its ambitions to extend its empire into server space.
The theory is that Linux and other open source programs are written by hordes of volunteers, each contributing his/her widow's mite of code, communicating at zero cost over the Internet, and self-organizing their efforts without need for either the incentives of markets or the commands of organizational hierarchies. The proof that this model works, it is said, is all around us: Linux itself; Apache, which runs over half the websites; and other popular programs that are doing the work of the world.
The term "open source" comes from the software business. "Source code" is the term for a program as it is written in a high level programming language, such as C or C++. If a program is to be sold, this source code is kept confidential, and the program distributed as a glob of 1s and 0s, readable by a machine but incomprehensible to humans.
The term "open source" means, at its most fundamental, that the code is not secret but public, and thus available for scrutiny. However, the term has acquired some additional meanings. To be regarded as "Open Source" in the computer community, a program must also be available for modification and for unlimited redistribution. The keeper of the flame is the Open Source Initiative, which has criteria and certifies which of the many available software licenses qualify.
Creations other than software could be produced by similar processes. Movement theoretician Yochai Benkler, in a Yale Law Journal article, renames the phenomenon "commons-based peer production," and identifies a range of applications, from mapping Martian craters to producing ratings on Amazon and eBay.
Benkler argues that peer production avoids the transaction costs inherent in both markets and hierarchies, and becomes an alternative "third model of production." The Electronic Frontier Foundation's Open Audio License and the various licenses offered to authors by Larry Lessig's Creative Commons are designed to create a library of legal licenses that will help sympathetic creators extend the model beyond the software domain.
These hopes are not totally abstract. Just as the open source movement can point to valuable software, it is also producing some interesting things in other areas, such as the popular encyclopedia Wikipedia, and ibiblio.org, "a conservancy of freely available information, including software, music, literature, art, history, science, politics, and cultural studies" put together by a consortium of the Center for the Public Domain and the University of North Carolina.
Partly Legitimate, Partly Vaporware
Like so much of the past decade's worth of New Economy hype, the theory anchoring the open source movement is partly legitimate insight and partly vaporware.
The insights are three: that the Internet dramatically lowers the cost of searching for and aggregating information; that the combined small contributions of large numbers of people can add up to something valuable; and that creativity can bubble up from the broad base of Internet users rather than only from professional institutions.
From these insights, the open source advocates deduce that the Internet makes obsolete our current system of propertizing intellectual creations and funneling them through a market system. By de-propertizing everything, we will produce a cornucopia of plentitude because the need to hassle expensively over who owns what will be eliminated, and current barriers will be replaced by free osmosis of ideas.
"Model of Production"?
The problem is that each of these insights comes with a "but" that makes the concept of open source as a "model of production" exceedingly dubious.
While the Internet can dramatically decrease search costs and agglomerate information, the size of the net it casts can increase other costs, such as filtering and processing. And if the combined contributions of a large number of people can be valuable, it is also certain that not all of them will be so, and that some of them will even be malicious, which means that policing is required. The cost equation has many terms.
Further, the model of major software programs composed by in-kind investment of minor amounts of time by thousands of people turns out to be a myth. Browse through MIT's papers on open source, and it becomes clear that every serious program is produced by a highly professional core augmented, largely in the bug report stage, by a user community. The Internet allows this core to be dispersed, and it can be larger than it might have been some years ago, but the basic model of professionalism remains.
Nor is the story of creativity bubbling up from the horde of Internet users entirely accurate. The open source movement is indeed filled with able and creative people, but current programs are offshoots of Unix, which was developed with billions of dollars from Bell Labs, DARPA, corporations, and universities. None were created from scratch by unpaid labor working at night. And given the merciless numbers on software productivity, none will be. (Very roughly, a professional programmer can produce about 1,000 lines of polished code in a year. A distribution of Windows or Linux has 30 million lines.)
Resource Curse
This brings us to one main problem with open source theory, which is resources. The open source community approaches the need for financing the way Victorians approached sex. Everyone knows you need it to produce more little Victorians, but discussion is kept to a minimum.
When it becomes absolutely unavoidable to discuss support, the open source theorists talk about "indirect appropriation," whereby getting a reputation for solving a software problem might get you hired to solve a similar problem, or where high productivity gets you academic tenure. Or a software programmer writes proprietary works for an employer during the week while doing open source on Sundays. In the background is some economic activity that actually pays the bills while the creator cadges bits of time to engage in his hobby. Or, for the professionalized core, there is some economic entity willing to subsidize the enterprise for its own purposes.
Even for software, the workability of this model is not a slam dunk, and in the software area some very big companies have strong reasons to support the open source movement, as they are indeed doing. IBM, HP, Sun, Dell, and others are putting in billions of dollars, and this is what keeps Linux afloat.
There is nothing wrong with this; it is a sensible business strategy for these companies. But it is not a model that transfers to music, or books, or journalism, or movies, or pharmaceuticals, or games, or practically any other form of intellectual endeavor. Watch the closing credits scroll down on a great movie, such as Master and Commander, and try to imagine an open source process that could produce it. Then try to imagine any model that could do it, except the one we use, which allows millions of people to pool their resources by chipping in their eight bucks apiece. Sponsorship? Ads? That produces Survivor, not Master and Commander. Watch it and be damned, but don't inflict it on me.
Where the movement is producing interesting things, it is doing so with heavy funding from academia, foundations, or corporations, and it is far from clear why such funding is superior in any way -- practically or morally -- to funding through market processes.
Socialization of the Creative Sector
The open source theorists know perfectly well that the model might translate to academia, but not beyond that. In fact, they have another model in mind, which is to make content free, tax the hardware industry, and then distribute the revenues to the creative community according to some complicated government-run formula. (See the work of the Berkman Center, or the Electronic Frontier Foundation.) To even think about this produces a shudder, given the government's unblemished and bipartisan record of pork, politics, and destruction in every industry it touches. (Think schools, energy, telecom.) It is also not even open source, particularly; it is just socialization of the creative sector.
The big question is, Why would anyone want to go down this road? As noted before in these pages, the concept that "price should equal marginal cost even when that is zero" is a product of an artificial logic from which all the reality has been stripped, and is virtually irrelevant to an investment-centered economy.
The characteristics of the Internet cited as enabling open source also allow the creation of vastly improved markets. People from all over the world will be able to use micropayments and digital rights management to pool their resources and support their favorite bands, movies, books, magazines, or weblogs. The open source advocates miss the fact that markets are also communities, and far more efficient and moral ones than are the mythical realms of academic dreaming.
IBM pushes Linux on Power processors
NEW YORK--IBM has put more weight behind its effort to attract customers to Linux that runs on its own Power processors, an initiative that distinguishes Big Blue from its competitors in the server market.
In 2003, Linux on Power was a subsidized development project within IBM, but now it's a group with a revenue responsibility. To that end, IBM is working harder to attract software partners, write its own applications and ensnare customers.
"We're taking the value proposition of Linux and moving it to Power," Jim Stallings, general manager of Linux for IBM, said at a news conference at the LinuxWorld Conference and Expo here.
Among Linux on Power customers IBM announced Wednesday are Kendall-Jackson Wine Estates, Intermountain Health Care, the State University of New York at Albany, LexCom, National Semiconductor, Black Hills and Hitachi Global Storage Technologies.
IBM is trying to attract specific software partners in a handful of market segments, and is moving its own software to Linux on Power as well. At the show, the company is demonstrating its upcoming "Stinger" version of its DB2 database software that will run on 64-bit Power processors and take advantage of the new 2.6 kernel, or heart, of the Linux operating system.
Stallings hinted that more software partners may be appearing soon. "We're working closely with SAP to explore this area of Linux and Power," Stallings said. SAP's software is widely used to run accounting, inventory and other important business operations.
Linux is most widely used on computers using "x86" Intel processors such as Xeon and Pentium or Advanced Micro Devices' Athlon. Indeed, IBM's xSeries server line, which uses Intel processors, was the first foothold the operating system found within the company.
But Linux spread, first to IBM's mainframes and now to its Power processor-based pSeries and iSeries servers. Those systems today most commonly run two IBM operating systems, the AIX version of Unix and OS/400, respectively.
IBM hopes to make Power servers available at the same cost as those using Intel processors, said Brian Connors, IBM's vice president of Linux on Power. A key part of that will be the PowerPC processor, which is used in IBM's JS20 Power blade server as well as in Apple Computer's G5 computers.
Though Big Blue wants Power to be widespread, it doesn't have desktop computing in its crosshairs. "Our focus right now is clearly on the server," Connors said.
IBM--the loudest Linux advocate--was the first company sued by SCO Group in that company's attack on Linux. SCO argues that IBM breached its contract by moving Unix intellectual property such as file system software from Unix to Linux, a charge IBM disputes.
While several companies--Novell, Hewlett-Packard, Red Hat and the Open Source Development Labs--have begun indemnification programs or other legal protections for Linux customers, IBM steadfastly refuses to do so outside its contribution to the OSDL's legal defense fund.
"There's not a reason for having to indemnify if there's no basis for it," Stallings said.
In an interview, Stallings added, "Customers are not asking for indemnification. They're calling and saying, 'Explain to us what's going on.'" Once informed, they are happy to buy Linux, he said.
Customer views have been changing, though. Stuart Cohen, chief executive of OSDL, said in an interview that the Linux consortium began its $10 million legal defense fund for Linux users because of Linux customer requests.
Irving Wladawsky-Berger, the IBM vice president of technology and strategy who spearheaded Big Blue's Linux push, said IBM is addressing the situation as directly as possible with its legal fight against SCO.
"In our legal system, you get it over with by going to court," Wladawsky-Berger said. "We think the actions we're taking are absolutely the right actions to take the issue behind us."
I seriously doubt that 1000 number.
(Let's ignore the argument that Windows code might not be "polished").
30,000,000 divided by 1,000 = 30,000 man-years.
Does Microsoft have 10,000 programmers working full-time on Windows for 3 years? (not counting tech support).
If they pay them $50k each, that's a $500 million annual payroll.
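The back-of-the-envelope arithmetic above can be sketched in a few lines of Python; the productivity figure, salary, and team size are the thread's assumptions, not measured values:

```python
# All figures below are the thread's assumptions, not measured values.
lines_of_code = 30_000_000          # claimed size of a Windows or Linux distribution
lines_per_programmer_year = 1_000   # the article's (disputed) productivity figure
programmers = 10_000                # hypothetical full-time team
salary = 50_000                     # assumed annual pay per programmer

man_years = lines_of_code / lines_per_programmer_year
print(man_years)                    # 30000.0 man-years of effort

print(man_years / programmers)      # 3.0 years for the team

print(programmers * salary)         # 500000000 -> a $500 million annual payroll
```

At the higher productivity figures mentioned elsewhere in this thread (7,700 to 16,700 lines per year), the same arithmetic shrinks the effort to roughly 1,800 to 3,900 man-years.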
All of which is just as true for Microsoft as it is for any of the open-source centers.
If I recall correctly, the 1000 lines per year comes from Brooks's The Mythical Man-Month. For larger projects, the average programmer is, well, average, and there is much overhead for the complexities of large and difficult-to-manage projects. You're lucky to get that many production lines per year per programmer in such cases. See further a good discussion of this in the book reviews of The Mythical Man-Month by Frederick Brooks, Jr.
Other studies show higher productivity of 7,700 to 16,700 lines of code per year. See for example Are U.S. programmers slackers? (Computerworld, April 15, 1999). It really seems to depend on several variables, involving project complexity, programmer motivation and competence, and familiarity with the subject. It doesn't seem to vary much over the years or by programming language. A line of Perl costs about as much as a line of assembly.
If I write a one-liner in Python (the reading man's Perl), it counts as one line, not as a 1,000 lines for the equivalent in assembly code.
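To illustrate the line-counting point with a hypothetical example (not from the thread): one logical line of Python can do work that would take dozens of lines of assembly, yet a raw lines-of-code count treats each line the same.

```python
# One line of Python: sum the squares of 1 through 10.
# An assembly version would need a loop, registers, and many instructions,
# and a lines-of-code metric would count every one of those lines.
total = sum(x * x for x in range(1, 11))
print(total)  # 385
```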
But to repeat myself, it depends.
If sizing something like OS/360 or Windows NT, then the number of lines in the first production release, divided out over the number of people in the programming department and the length of the project from startup to first release, will be more like 1000 lines per year, not per month.
There's a whole lot of 'stuff' other than coding that goes into coding a big project, and a whole lot of programmers that are less productive.
Early programmers didn't have the luxury of color-coded editors and navigational tools like those in VC++.
Yes, there is definitely a skill/motivation element, probably the biggest factor. Code Complete by Steve McConnell is an excellent book and cites many studies, including one that links even the amount of per-programmer office space to average productivity.
I'm just speaking from experience. ALSO, more skilled programmers don't necessarily produce more lines of code.
The fancy editing, navigation and debugging tools are not essential in my view to productivity.
for (i = 0; i < NODE_TOT; i++)
{
    j = pNode->xyz;
}
Wanna be Penguified? Just holla!
Got root?
Open Source = Socialist Fraud
Thanks, I appreciate the laugh before going to work.
Maybe lazy, inept professional programmers. I can do that in a day if I am on a roll or up against a deadline.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.