Posted on 05/16/2013 6:39:16 AM PDT by ShadowAce
High Scalability has a fascinating article up that summarizes a talk by Robert Graham of Errata Security on the development choices needed to support 10 million concurrent connections on a single server. From a small data center perspective, the numbers he is talking about seem astronomical, but not unbelievable. With a new era of Internet-connected devices dawning, the time may have come to question the core architecture of Unix, and therefore of Linux and BSD as well.
The core of the talk seems to be that the kernel is too inefficient in how it handles threads and packets to maintain the speed and scalability requirements for web scale computing. Graham recommends moving as much of the data processing as possible away from the kernel and into the application. This means writing device drivers, handling threading and multiple cores, and allocating memory yourself. Graham uses the example of scaling Apache to illustrate how depending on the operating system can actually slow the application when handling several thousand connections per second.
Why? Servers could not handle 10K concurrent connections because of O(n^2) algorithms used in the kernel.
Two basic problems in the kernel:
Connection = thread/process. As each packet came in, the kernel would walk all 10K processes to figure out which thread should handle the packet.
Connections = select/poll (single thread). Same scalability problem. Each packet had to walk a list of sockets.
Solution: fix the kernel to make lookups in constant time
Threads now context switch in constant time, regardless of the number of threads.
This came with new, scalable epoll()/IOCompletionPort mechanisms that provide constant-time socket lookup (a minimal epoll sketch follows below).
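For readers who have not used it, here is a minimal sketch of an epoll-based event loop in C. It is my own illustration, not code from Graham's talk; error handling is omitted and the port number is arbitrary. The point is that epoll_wait() hands back only the descriptors that are ready, so the per-event cost does not grow with the total number of open connections the way a select()/poll() scan does.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/epoll.h>
#include <sys/socket.h>
#include <unistd.h>

#define MAX_EVENTS 64

int main(void) {
    /* Listening TCP socket on an arbitrary example port. */
    int listen_fd = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = { 0 };
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(8080);
    bind(listen_fd, (struct sockaddr *)&addr, sizeof(addr));
    listen(listen_fd, SOMAXCONN);

    /* One epoll instance watches every connection. */
    int epfd = epoll_create1(0);
    struct epoll_event ev = { .events = EPOLLIN, .data.fd = listen_fd };
    epoll_ctl(epfd, EPOLL_CTL_ADD, listen_fd, &ev);

    struct epoll_event events[MAX_EVENTS];
    for (;;) {
        /* Returns only the descriptors with activity; no walk over all sockets. */
        int n = epoll_wait(epfd, events, MAX_EVENTS, -1);
        for (int i = 0; i < n; i++) {
            if (events[i].data.fd == listen_fd) {
                /* New connection: register it with the same epoll instance. */
                int conn = accept(listen_fd, NULL, NULL);
                struct epoll_event cev = { .events = EPOLLIN, .data.fd = conn };
                epoll_ctl(epfd, EPOLL_CTL_ADD, conn, &cev);
            } else {
                /* Data (or hangup) on an existing connection. */
                char buf[4096];
                ssize_t r = read(events[i].data.fd, buf, sizeof(buf));
                if (r <= 0)
                    close(events[i].data.fd);
                /* else: hand buf to the application layer */
            }
        }
    }
}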
The talk touches on a concept I've been mulling over for months: the inherent complexity of modern data centers. If you are virtualizing, and you probably are, there are most likely several layers of abstraction between your application and the hardware that must be unpacked before its code actually reaches the CPU, or its data is written to disk. Does virtualization actually solve the problem we have, or is it an approach built from spending far too long in the box? That Graham's solution for building systems that scale for the next decade is to bypass the OS entirely and talk directly to the network and hardware tells me that we might be seeing the first slivers of dusk for the kernel's useful life serving up web applications.
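To make "talk directly to the network" a bit more concrete, here is a rough user-space sketch, again mine and far short of the full driver-bypass approach Graham advocates, using Linux's AF_PACKET sockets. A packet socket delivers raw Ethernet frames straight to the application, which then owns all of the protocol processing the kernel would normally do:

#include <arpa/inet.h>
#include <linux/if_ether.h>
#include <linux/if_packet.h>
#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    /* Raw packet socket: every frame on the wire is handed to user space
       as-is, bypassing the kernel's TCP/IP stack (requires CAP_NET_RAW). */
    int fd = socket(AF_PACKET, SOCK_RAW, htons(ETH_P_ALL));
    if (fd < 0) {
        perror("socket");
        return 1;
    }

    unsigned char frame[65536];
    for (;;) {
        ssize_t len = recv(fd, frame, sizeof(frame), 0);
        if (len <= 0)
            break;
        /* From here on, the application owns everything:
           Ethernet, IP, TCP state, and the protocol on top. */
        printf("frame: %zd bytes\n", len);
    }
    close(fd);
    return 0;
}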
So what would come after Linux? Researchers in the UK may have come up with a solution in Mirage. In a paper quoted on the High Scalability site the researchers describe Mirage:
Our prototype (dubbed Mirage) is unashamedly academic; it extends the Objective Caml language with storage extensions and a custom run-time to emit binaries that execute as a guest operating system under Xen.
Mirage is, as stated, very academic, and currently very alpha quality, but the idea is compelling: writing applications that compile directly into a complete machine image, something that runs on its own without an operating system. Of course, the first objection that comes to mind is that this would lead to writing for specialized hardware, and would mean going back in time thirty years. However, combining a next-generation language with a project like Open Compute would provide open specifications and community-driven development at a low level, ideal for eking out as much performance as possible from the hardware.
No matter which way the industry turns to solve the upcoming challenges of an exploding Internet, the next ten years are sure to be a wild ride.
Re: the free software license ...
The LAMP software wasn’t for commercial use and had an end-of-life with the project.
Well, if you want me to talk your ear off with stories, I can tell them all day. :)
There's nothing like living under the hood (at least for a fixed task list).
And yet, try to get a job as an Ada programmer these days...
I could do it, if I had "X years of experience in the field"... but that illustrates another problem: companies want workers "cookie-cutter made" for the position and are unwilling to expend time/effort in training. (This is regardless of programming language, and it is even true of "entry-level" positions.)
An article whose author reveals his illiteracy in the very title is not worth reading.
It's another one of the side effects of the fact that neither companies nor employees have any loyalty to each other these days. Why train someone if they can just take their knowledge somewhere else? Why should an employee have any loyalty to a company when so many are run by folks who see their employees as just another 'resource' to be slotted?

One of the things that I see that is really tragic is that companies put no value on institutional knowledge. I recently was eventually passed over at a company that I used to work for because I was considered "too expensive", even though I had over 10 years worth of institutional knowledge of processes, procedures, and environments (I built a lot of the infrastructure). I was quite clear with them that they were going to have to pay some kind of premium for that knowledge if they wanted me back. The delta I was asking for wasn't really all that much, but to me the principle was important enough to push for it. They choked on it anyway, and I simply wasn't willing to accept their lowball offers. In the long run, I guess that's a good thing because I left the place for a reason, but it was disappointing to see that even with the huge turnover they have had in the past year, they just couldn't see any value in that kind of institutional knowledge.
So every application will have its own set of proprietary device drivers? This is beyond ridiculous.....
Do you want to use a general-purpose OS with an easily coded app to do a specific time-critical job, or do you want to invest the time in developing an optimized, integrated OS and app to perform a specific time-critical job in high volume?
The latter is the way we used to implement embedded real-time control systems, and they were extremely efficient in their use of the extremely limited hardware resources available at the time (~30 years ago).
The logical thing would be to sell the hardware and app together. The people that make the device drivers would also create the apps. This isn’t going in a very good direction....
You are quite correct, and in this issue both sides have legitimate reasons for not expecting the other side to act well. Employers have damaged the ability of employees to be loyal by utterly taking advantage of them: things like Google's "everything you make belongs to us" intellectual property ideas, or Dell's/IBM's/Sony's "you made patentable improvements worth millions... here's a watch." (Or in this economy the "you're lucky to have a job, work more [harder, unpaid overtime, etc] or you won't".) // On the other hand, Employees have damaged the ability of companies to be loyal to them by not joining in on the drive for success [i.e. they don't view the company's success as their own success]. (IMO, this is a problem of leadership: a good leader will draw those under him into the vision/goal.) Another thing that strains things on the employer side is that the employees don't always want to be in their "slot" -- another symptom of viewing people as cogs/components.
One of the things that I see that is really tragic is that companies put no value on institutional knowledge. I recently was eventually passed over at a company that I used to work for because I was considered "too expensive", even though I had over 10 years worth of institutional knowledge of processes, procedures, and environments (I built a lot of the infrastructure).
*nod* -- This is true; but then again, it seems like there's a lot of companies that don't value "non-standard"* knowledge. (In a recent job interview I was kinda talked down to because "they wanted someone with more knowledge/experience in Objects" [as in OOP] -- despite that most of the languages on my resume had OOP -- when what they [apparently] meant was that they wanted someone more experienced with the .NET framework and C#'s method of OOP.) The following is another example of this attitude:
PieterCasparzen: "C is very sufficient for low-level programming and it's so ubiquitous that it's the only sensible choice." That assumes that because C is ubiquitous it is the reasonable choice, implicitly excluding all other solutions. This is the same attitude that many CS grads have regarding extensibility [from OOP] -- "if you can't add new values/functionality then it's useless" -- which I was talking with BuckeyeTexan about earlier... and it precludes applying any other set of knowledge to help one address the problem. (One thing that C is notorious for is difficulty of optimization, particularly due to aliasing issues, IIRC; these issues are not present in LISP or Ada [not sure about FORTH, just starting it]. So by forcing C, a company/client is excluding whole realms of "non-standard" knowledge or solutions.)
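For anyone curious what that aliasing complaint looks like in practice, here is a small C99 sketch, an illustration rather than code from the thread: without restrict the compiler has to assume the two pointers might overlap, while restrict supplies the kind of no-overlap guarantee that stricter languages can build in.

/* Without restrict, a store to dst[i] might change what src points at,
   so the compiler must emit runtime overlap checks or stay conservative. */
void scale(float *dst, const float *src, float k, int n) {
    for (int i = 0; i < n; i++)
        dst[i] = src[i] * k;
}

/* C99 restrict is the programmer's promise that the arrays never overlap,
   letting the optimizer keep values in registers and vectorize freely. */
void scale_restrict(float *restrict dst, const float *restrict src, float k, int n) {
    for (int i = 0; i < n; i++)
        dst[i] = src[i] * k;
}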
Go ahead.
Nah, you should read it. The author’s ignorance is an education in and of itself.
It’s true that many employers want their programmers to work heads-down in their cubicles. It’s equally true that many employees use employers to gain training and certs and then move on for higher salaries. It’s a cycle that feeds itself.
When I interview and hire programmers, I place a premium on loyalty and knowledge (demonstrable skill, not just experience or education.) If I see loyalty in someone who might be lacking in knowledge, I will often hire him over someone who has more knowledge because I can instill knowledge in a loyal programmer. I can teach him how to problem solve and write efficient code.
While the reverse takes more time and effort, it is possible to elicit loyalty from a knowledgeable programmer. It takes more than money and promotions. There is only so much money in the budget to hand out. Likewise, the company hierarchy allows only so many titles before there are too many chiefs and not enough Indians.
The key to eliciting loyalty, speaking from my experience only, is to show it. Loyalty includes respect. Include your programmers in the decision-making process. Seek their opinions, value their advice, acknowledge their contributions and successes, explain your reasoning to them when you must make an unpopular decision, let them know up front about project constraints that you cannot change, fight for them on important technology decisions even if you know you can’t win, respect their personal time, invite them to share ideas and concerns with you, and listen to and consider those ideas and concerns. I could write much more, but it all comes down to loyalty and respect. If they feel appreciated and respected, they will return it.
I’m only 43, but in my 22 years in IT, the most important thing I’ve learned is that there is always something to be learned from your superiors, peers, and subordinates. As an ignorant n00b, I spent a lot of time on the floor of my co-workers’ offices asking questions and soaking up their experiences. It has served me well.
“It’s equally true that many employees use employers to gain training and certs and then move on for higher salaries.”
Which begs the question: how can you possibly expect loyalty if a higher wage can be had elsewhere? If your company’s customers can get a better deal elsewhere, they leave, too.
Funny how things come full circle. Didn't Wordstar used to come with all its printer drivers?
What's the big deal? Most of Linux is alpha quality, too.
As a typical windows user it reads like this:
Πρωτότυπο μας (που ονομάστηκε Mirage) είναι απίστευτα ακαδημαϊκή? Επεκτείνει το στόχο Caml γλώσσα με επεκτάσεις αποθήκευσης και ένα προσαρμοσμένο run-time για να εκπέμπουν εκτελέσιμα που εκτελούν ως φιλοξενούμενο λειτουργικό σύστημα κάτω από το Xen.
It's a logical question. To be clear, I pay programmers well. In my experience, however, employees base loyalty on more than wages. In no particular order, employees are more likely to stay if they are well compensated, receive good/great benefits, are given flexible hours, have the opportunity to advance, get to work with new and interesting technology, regularly receive training and attend industry conferences, and feel appreciated, respected, and acknowledged.
If I offer (and deliver) all of that to a programmer and he chooses to leave for a higher salary, I will attempt to match the salary if he's a dedicated employee and if not, we're better off parting ways. I do my best to prevent employees from looking for higher salaries.
If your company's customers can get a better deal elsewhere, they leave, too.
Absolutely true, but "deal" is the key word there. As with employee loyalty, customer loyalty is based on the whole package offered and delivered. If a company's prices are 5% higher than a competitor's, but that company provides better service, response time, benefits, and quality, customers will stay. While price may be their first consideration, it is not their only consideration.
I have always found that, besides pay, thinking is the most important thing. Engineers want to think, to find solid solutions. They need the time to think, the place to think, and the tools with which to try out their thinking. Unfortunately for engineers, businesses think loud “collaborative” environments, which are cheap for the company, work. Or that fast-paced, “just get ‘er done” environments are productive.
This is very true; I do have some of these Engineer qualities... but, honestly, a lot of it is "untrained" (despite a BS) and "unhoned" (because in my [admittedly limited] experience companies want "coders" and not "software engineers").