Posted on 09/26/2009 1:00:03 PM PDT by ShadowAce
(PhysOrg.com) -- Computer scientists at Sandia National Laboratories in Livermore, Calif., have for the first time successfully demonstrated the ability to run more than a million Linux kernels as virtual machines.
The achievement will allow cyber security researchers to more effectively observe behavior found in malicious botnets, or networks of infected machines that can operate on the scale of a million nodes. Botnets, said Sandia's Ron Minnich, are often difficult to analyze since they are geographically spread all over the world.
Sandia scientists used virtual machine (VM) technology and the power of the lab's Thunderbird supercomputing cluster for the demonstration.
Running a high volume of VMs on one supercomputer, at a scale similar to that of a botnet, would allow cyber researchers to watch how botnets work and explore ways to stop them in their tracks. "We can get control at a level we never had before," said Minnich.
Previously, Minnich said, researchers had only been able to run up to 20,000 kernels concurrently (a kernel is the central component of most computer operating systems). The more kernels that can be run at once, he said, the more effective cyber security professionals can be in combating the global botnet problem. "Eventually, we would like to be able to emulate the computer network of a small nation, or even one as large as the United States, in order to virtualize and monitor a cyber attack," he said.
A related use for millions to tens of millions of operating systems, Sandia's researchers suggest, is to construct high-fidelity models of parts of the Internet.
"The sheer size of the Internet makes it very difficult to understand in even a limited way," said Minnich. "Many phenomena occurring on the Internet are poorly understood, because we lack the ability to model it adequately. By running actual operating system instances to represent nodes on the Internet, we will be able not just to simulate the functioning of the Internet at the network level, but to emulate Internet functionality."
A virtual machine, originally defined by researchers Gerald J. Popek and Robert P. Goldberg as "an efficient, isolated duplicate of a real machine," is essentially a set of software programs running on one computer that, collectively, acts like a separate, complete unit. "You fire it up and it looks like a full computer," said Sandia's Don Rudish. Within the virtual machine, one can then start up an operating system kernel, "so at some point you have this little world inside the virtual machine that looks just like a full machine, running a full operating system, browsers and other software, but it's all contained within the real machine."
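To make Rudish's description concrete, here is a minimal sketch of "firing up" one such little world from a host machine, using QEMU/KVM driven from Python. This is an illustration, not Sandia's actual tooling: it assumes qemu-system-x86_64 is installed, and the kernel and initrd paths are hypothetical placeholders (any small Linux build would do).

    # Minimal sketch: boot one Linux kernel inside a QEMU virtual machine.
    # The kernel/initrd paths below are hypothetical placeholders.
    import subprocess

    def boot_vm(kernel="./bzImage", initrd="./initrd.img", mem_mb=64):
        """Start one isolated guest: a 'little world' inside the real machine."""
        cmd = [
            "qemu-system-x86_64",
            "-m", str(mem_mb),           # give the guest a small slice of RAM
            "-kernel", kernel,           # boot this Linux kernel directly
            "-initrd", initrd,           # minimal in-memory root filesystem
            "-append", "console=ttyS0",  # send kernel messages to the serial port
            "-nographic",                # headless; no display window
        ]
        return subprocess.Popen(cmd)

    vm = boot_vm()  # the guest now runs a full kernel, contained within the host

Each guest gets its own kernel and memory slice, which is what makes it look like "a separate, complete unit" to the software running inside it.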
The Sandia research, two years in the making, was funded by the Department of Energy's Office of Science, the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) program and by internal Sandia funding.
To complete the project, Sandia utilized its Albuquerque-based 4,480-node Dell high-performance computer cluster, known as Thunderbird. To arrive at the one million Linux kernel figure, Sandia's researchers booted 250 VMs, each running its own kernel, on every one of Thunderbird's 4,480 physical machines. Dell and IBM both made key technical contributions to the experiments, as did a team at Sandia's Albuquerque site that maintains Thunderbird and prepared it for the project.
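As a quick sanity check on those figures, a back-of-the-envelope calculation using only the node and VM counts reported above:

    # 250 VMs per node, one Linux kernel per VM, across Thunderbird's
    # 4,480 physical machines.
    nodes = 4480
    vms_per_node = 250
    total_kernels = nodes * vms_per_node
    print(total_kernels)  # 1,120,000: comfortably past the million mark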
"The capability to run a high number of operating system instances inside of virtual machines on a high performance computing (HPC) cluster can also be used to model even larger HPC machines with millions to tens of millions of nodes that will be developed in the future," said Minnich. The successful Sandia demonstration, he asserts, means that development of operating systems, configuration and management tools, and even software for scientific computation can begin now, before the hardware technology to build such machines is mature.
"Development of this software will take years, and the scientific community cannot afford to wait to begin the process until the hardware is ready," said Minnich. "Urgent problems such as modeling climate change, developing new medicines, and research into more efficient production of energy demand ever-increasing computational resources. Furthermore, virtualization will play an increasingly important role in the deployment of large-scale systems, enabling multiple operating systems on a single platform and application-specific operating systems."
Sandia's researchers plan to take their newfound capability to the next level.
"It has been estimated that we will need 100 million CPUs (central processing units) by 2018 in order to build a computer that will run at the speeds we want," said Minnich. "This approach we've demonstrated is a good way to get us started on finding ways to program a machine with that many CPUs." Continued research, he said, will help computer scientists come up with ways to manage and control such vast quantities, "so that when we have a computer with 100 million CPUs we can actually use it."
Hm, bet Eq2 would run without lag on that....
Botnets are proof that Windows is actually the most scaleable OS on the planet. :-)
LOL!
Good story. Linux has always been a versatile kernel. It’s just a shame that the FOSS community has yet to build a truly prime-time OS on top of it.
“Botnets are proof that Windows is actually the most scaleable OS on the planet. :-)”
lol! sweet.
Exceeding the number of real Linux installations by only a factor of twenty.
"Good story. Linux has always been a versatile kernel. It's just a shame that the FOSS community has yet to build a truly prime-time OS on top of it."

The FOSS "community" can't program a user interface to save its life. That's why the good companies (i.e., Apple and, to a lesser extent, Microsoft) are driven by their interface designers.
Linux, Darwin, or some other Unix-like OS will eventually displace Windows, because the advent of the cloud is breaking Microsoft's ability to lock customers in. Once Office ceases to be a factor, it's over.
Oh, I don't know. I rather prefer the Linux interface to the Windows one.
>>> Botnets are proof that Windows is actually the most scaleable OS on the planet. :-) <<<
Heh.
I’m a huge UNIX nerd but have never been a fan of Linux on the desktop until Ubuntu. They’re getting everything right.
I’ve been using Ubuntu as my main box for several years. I really do like Ubuntu, and I am an advocate for it.
That said, they really do need to find a way to automate printer, scanner, and miscellaneous device driver compilation and installation. That’s the Achilles’ heel.
Currently, I am using KDE 4.3 on Fedora 11. I had changed from Gnome to KDE because of odd gdm problems; kdm fixed those, and beyond that, KDE 4.3 is quite good.