I would appreciate it if somebody more knowledgeable than me (a very large group) could explain how much this would speed processing up relative to existing systems.
Think superconductor. Low resistance, uses less power, less waste heat.
If you had an 8-core CPU in your desktop machine and all cores were running at 500 GHz, your PC would qualify as a supercomputer.
It could simulate the physics needed for realistic, real-time, high-res 3D graphics without breaking a sweat. Your current PC would take weeks or months to generate a single frame at that level of fidelity.
It could do strange things like alter your image in real-time so that you could look and sound exactly like someone else on a live video chat.
It could handle simulations of nuclear devices.
If you could feed it the data, it could monitor all US cell traffic in real time and watch for keywords.
Creating the software to take advantage of such power would be the hardest part.
Just imagine 10,000 cores running at 500 GHz.
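To put rough numbers on that, here's a naive back-of-envelope sketch (my own illustrative figures, not from the article) comparing peak throughput as cores × clock, ignoring IPC, memory bandwidth, and every other real-world bottleneck:

```python
# Hypothetical back-of-envelope comparison. "Throughput" here is just
# cores * clock rate -- a huge simplification, for scale only.
desktop  = 8 * 4e9          # today's desktop: 8 cores at 4 GHz
fast_8   = 8 * 500e9        # same 8 cores, but at 500 GHz
fast_10k = 10_000 * 500e9   # 10,000 cores at 500 GHz

print(f"8-core 500 GHz vs today's desktop: {fast_8 / desktop:.0f}x")    # 125x
print(f"10,000-core 500 GHz vs desktop:    {fast_10k / desktop:.0f}x")  # 156250x
```

Even the single-chip case is two orders of magnitude over current desktops; the 10,000-core case is five.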
SEVERAL orders of magnitude. At this point, they're just experimenting with the substrate, meaning the wafer material that goes into semiconductors. Once they start stacking layers and applying current, look out!
The current standard is a 22 nm process for processors, for instance. There's a limit to how many transistors you can cram in at a 22 nm feature size, and we're getting around 4 GHz consistently on newer chips.
If their numbers are correct, 500 GHz across a wafer would mean multi-core processors with hundreds, if not thousands, of cores, each computing at several hundred gigahertz. That's just at the server/desktop hardware level, which would be considered "macro." The real excitement is coming in mobile and even micro-scale computing, where your current iPhone/Android could be shrunk down to the thickness of a sheet of paper with processing power far beyond anything currently available. Wearable tech is also a possibility.
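The "several orders of magnitude" claim is easy to sanity-check. A quick sketch, assuming the commenter's 4 GHz → 500 GHz clock jump and (my own hypothetical figure) 8 → 1,000 cores:

```python
import math

clock_gain = 500 / 4     # per-core clock: 4 GHz -> 500 GHz
core_gain  = 1000 / 8    # hypothetical core count: 8 -> 1,000
total = clock_gain * core_gain

# Combined speedup, again ignoring all real-world bottlenecks.
print(f"{total:.0f}x total, ~{math.log10(total):.1f} orders of magnitude")
```

Roughly four orders of magnitude from clock and core count alone, which is why people throw "several" around so freely here.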