Posted on 07/27/2013 3:06:22 PM PDT by brityank
Another uncertainty for climate models: different results on different computers using the same code
Posted on July 27, 2013 | by Anthony Watts
New peer-reviewed paper finds the same global forecast model produces different results when run on different computers
Did you ever wonder how spaghetti like this is produced and why there is broad disagreement in the output that increases with time?
Increasing mathematical uncertainty from the initial starting conditions is the main reason. But some of it may be due to the fact that while many of the models share common code, they don't produce the same results with that code, owing to differences in the way CPUs, operating systems, and compilers work. With this paper, we can now add software uncertainty to the list of known unknowns about climate and climate modeling.
(Excerpt) Read more at wattsupwiththat.com ...
Go see it all, and the commentary following.
But we all know it’s true, right?
You put manure in, you get manure out.
No matter what kind of data was put in, Al Gore's hokey stick came out.
But there is scientific consensus, said with a lisp.
Thanks brityank.
"Al Gore's hokey stick came out."
Does a “hokey stick” have something to do with al’s second chakra?
Except that with AGW hokie pokie, you put trillions of dollars in and don’t take anything out.
Sigh. There's an entire subfield of computer science called "numerical analysis" that deals with the fact that floating-point computations are imprecise due to the finite precision of the representation, which continuously introduces tiny rounding errors.
There are scientists who are formally trained in numerical analysis to understand and minimize the impact of these tiny errors in models in which trillions of calculations are performed iteratively for millions of cycles. If not properly accounted for, these tiny errors can quickly overwhelm any legitimate model results and render the model useless.
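To make that imprecision concrete, here is a minimal Python sketch (not from the paper or any climate model) showing that floating-point addition is not even associative; the result depends on the order in which the operations are performed:

```python
# Floating-point addition is not associative: the rounding error
# depends on the order in which the additions are performed.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c    # 0.6000000000000001
right = a + (b + c)   # 0.6

print(left == right)  # False
```

A compiler that reorders these additions for speed therefore changes the answer, which is exactly why the same source code can produce different results on different machines.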
Given the completely sloppy, slipshod, and downright fraudulent nature of the way climate "scientists" go about their business, it wouldn't surprise me in the least if they never bother to consult numerical analysts regarding the validity of their computer models with respect to errors introduced by floating-point units.
CPUs have slight differences in their floating-point units. They are supposed to follow the IEEE 754 standard, but there are edge cases where some don't. A worst-case example was the FDIV bug in the original Pentium processor back in 1994.
Even on exactly the same processor, different compilers, different versions of the same compiler, or different compiler options on the same version of the same compiler will generate slightly different machine code.
Over millions and even billions of repeated floating-point operations, these slight differences accumulate, and the final result can change significantly. A reputable modeler freezes everything once a model is validated, and accepts changes only after re-validation.
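The accumulation described above can be seen, and partly tamed, with compensated summation. This is a generic numerical-analysis technique (Kahan summation), sketched here in Python as an illustration, not anything taken from the climate models themselves:

```python
def kahan_sum(values):
    """Compensated (Kahan) summation: carry the rounding error
    forward at each step so it cannot silently accumulate."""
    total = 0.0
    comp = 0.0  # running compensation for lost low-order bits
    for v in values:
        y = v - comp
        t = total + y
        comp = (t - total) - y  # recover what the addition dropped
        total = t
    return total

# A million additions of 0.01: the true answer is 10000.
values = [0.01] * 1_000_000
naive = sum(values)        # drifts away from 10000.0
careful = kahan_sum(values)  # stays much closer to 10000.0
print(naive, careful)
```

The naive running sum picks up a tiny error on nearly every addition; the compensated version tracks and cancels that error, which is the kind of discipline the commenter is saying numerical analysts bring to iterative models.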
The problem: climate modelers can’t validate their models in the first place. They don’t have reliable data to do so.
http://hockeyschtick.blogspot.com/2013/07/new-paper-finds-same-climate-model.html#comment-form
Speaking of Hockey Schtick!!!!
There are two posts that point out the basic problem.
I will illustrate the problem. Let's say you have four-digit precision, with scientific notation (sixteen-digit precision is routine, and thirty-two-digit precision or higher may be necessary for special applications).
The scientific notation part keeps track of the order of magnitude of the number, e.g., ones, billions, billionths, etc.
OK. Start with X = 1, add 1, and repeat. With four-digit precision you go up to 9999, and then to 10000, stored as 1.000E+4. Add 1 to 1.000E+4 and you still get 1.000E+4, because the fifth digit is rounded away. Do this 5 million more times, and you still get 1.000E+4.
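Double-precision floats have the same absorption problem, just at sixteen digits instead of four. A quick Python illustration (the value 1e16 is chosen only because it sits above 2**53, where the gap between adjacent floats exceeds 1):

```python
# Above 2**53, adjacent double-precision floats are more than 1 apart,
# so adding 1 is absorbed by rounding -- the four-digit example at scale.
x = 1e16
print(x + 1.0 == x)   # True: the 1 vanishes

# Adding 1 five million times changes nothing either.
total = x
for _ in range(5_000_000):
    total += 1.0
print(total == x)     # True, even though the true answer is x + 5,000,000
```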
This reveals one of the problems with "Big Data": lots and lots of data, each datum of which is small relative to the order of magnitude of the total. There are strategies to deal with this, but there is a certain culture among the climatologists that because they have lots of data (e.g., hourly readings at thousands of thermometers over thirty years), they are gods. We have a similar problem with exchange rates, stock prices, and so forth in financial markets. But just because you have high-frequency data doesn't mean you have that much more information.
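One of those strategies is exactly-rounded summation, available in Python's standard library as `math.fsum`. A small sketch of the "many small data relative to the total" failure and the fix (the numbers are illustrative only):

```python
import math

# One big value plus a million small ones: the true total is 1e16 + 1e6.
data = [1e16] + [1.0] * 1_000_000

naive = sum(data)        # every 1.0 is absorbed: result is just 1e16
exact = math.fsum(data)  # exactly rounded sum: 1e16 + 1e6
print(naive, exact)
```

`math.fsum` tracks all the partial rounding errors internally and returns the correctly rounded sum, so the million small readings are no longer silently thrown away.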
The next problem in climatology is the kind of calculations they perform. They use power functions. These make big numbers into very, very big numbers, which are a challenge to handle inside the computer with floating-point numbers. We economists have our own sin, which is to invert large matrices.
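Overflow from power functions is easy to demonstrate, and the standard remedy is to work in log space. A Python sketch (the values here are illustrative, not from any climate or economic model):

```python
import math

# Double-precision floats top out near 1.8e308; past that, overflow.
print(math.isinf(1e308 * 10))   # True: the product is inf

# math.exp raises rather than silently returning inf.
try:
    math.exp(1000.0)
except OverflowError:
    print("exp(1000) overflows a float64")

# The usual remedy: carry the logarithm instead of the raw value.
log_value = 1000.0 * math.log(2.0)  # log of 2**1000, no overflow
print(log_value)
```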
Yes, because of computational uncertainty, it is sometimes important to perform calculations on different platforms, operating systems, and application software. And if you get significantly different results, think about what you're doing that the computers find offensive.
I also find that if your model fails to match reality, your model is wrong, not your reality.
Fitting the model to reality: a problem with the climatologists is that they have plug-in values that allow a lot of flexibility in the presumption that CO2 explains variation in global temperature. So what matters is predictive ability: GIVEN a model fitted with data from the past, how well does it predict the future (or "hold-out" data, perhaps the unknown past)? On the basis of zero predictive ability, you can say that climatology has embarrassed itself in public.
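The hold-out idea can be sketched in a few lines: fit on past data, then score on data the fit never saw. Everything below is synthetic and illustrative; it assumes nothing about any real climate model:

```python
# Toy hold-out validation: fit y = a*x + b on the first half of the
# data, then measure error on the second half, which the fit never saw.
xs = list(range(20))
ys = [2.0 * x + 1.0 for x in xs]   # synthetic "past observations"

train_x, train_y = xs[:10], ys[:10]
test_x, test_y = xs[10:], ys[10:]

# Ordinary least squares for a straight line, by hand.
n = len(train_x)
mx = sum(train_x) / n
my = sum(train_y) / n
a = (sum((x - mx) * (y - my) for x, y in zip(train_x, train_y))
     / sum((x - mx) ** 2 for x in train_x))
b = my - a * mx

# Out-of-sample error is the honest measure of predictive skill.
mse = sum((a * x + b - y) ** 2
          for x, y in zip(test_x, test_y)) / len(test_x)
print(a, b, mse)
```

Here the data really are linear, so the out-of-sample error is essentially zero; the commenter's point is that a model with many adjustable plug-in values can fit the training period beautifully and still fail this second-half test.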
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.