**************************************EXCERPT*******************************************
It is claimed that GCMs provide credible quantitative estimates of future climate change, particularly at continental scales and above. Examining the local performance of the models at 55 points, we found that local projections do not correlate well with observed measurements. Furthermore, we found that the correlation at a large spatial scale, i.e. the contiguous USA, is worse than at the local scale.
However, we think that the most important question is not whether GCMs can produce credible estimates of future climate, but whether climate is at all predictable in deterministic terms. Several publications, a typical example being Rial et al. (2004), point out the difficulties that the climate system complexity introduces when we attempt to make predictions. Complexity in this context usually refers to the fact that there are many parts comprising the system and many interactions among these parts. This observation is correct, but we take it a step further. We think that it is not merely a matter of high dimensionality, and that it can be misleading to assume that the uncertainty can be reduced if we analyse its sources as nonlinearities, feedbacks, thresholds, etc., and attempt to establish causality relationships. Koutsoyiannis (2010) created a toy model with simple, fully-known, deterministic dynamics, and with only two degrees of freedom (i.e. internal state variables or dimensions); yet it exhibits extremely uncertain behaviour at all scales, including trends, fluctuations, and other features similar to those displayed by the climate. It does so with a constant external forcing, which means that there is no causality relationship between its state and the forcing. The fact that climate has many orders of magnitude more degrees of freedom certainly complicates the situation further, but in the end it may be irrelevant; for, in the end, we do not have a predictable system hidden behind many layers of uncertainty which could be removed to some extent, but, rather, we have a system that is uncertain at its heart.
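The point about low-dimensional deterministic systems is easy to demonstrate with a sketch. The snippet below uses the Hénon map purely as a generic stand-in (it is not the actual Koutsoyiannis 2010 toy model): two state variables, fixed parameters playing the role of constant forcing, fully known dynamics — and the long-term ("climatic") averages of the trajectory still wander irregularly.

```python
# Illustrative sketch only: the Henon map stands in for a two-variable
# deterministic toy model (it is NOT the Koutsoyiannis 2010 model).
# Two state variables, constant parameters, fully known dynamics --
# yet the 30-step averages of the trajectory fluctuate irregularly.

def henon_step(x, y, a=1.4, b=0.3):
    """One step of the deterministic map; a and b act as constant forcing."""
    return 1.0 - a * x * x + y, b * x

def trajectory(n, x=0.1, y=0.1):
    """Iterate the map n times and record the x component."""
    xs = []
    for _ in range(n):
        x, y = henon_step(x, y)
        xs.append(x)
    return xs

xs = trajectory(3000)
# "Climatic" statistics: non-overlapping 30-step means of the trajectory.
means = [sum(xs[i:i + 30]) / 30 for i in range(0, len(xs), 30)]
print("spread of 30-step means:", max(means) - min(means))
```

Despite the complete absence of randomness or varying forcing, the block of 30-step means does not settle to a constant: the "climate" of this tiny system fluctuates on all the scales one cares to average over.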
Do we have something better than GCMs when it comes to establishing policies for the future? Our answer is yes: we have stochastic approaches, and what is needed is a paradigm shift. We need to recognize the fact that the uncertainty is intrinsic, and shift our attention from reducing the uncertainty towards quantifying the uncertainty (see also Koutsoyiannis et al., 2009a). Obviously, in such a paradigm shift, stochastic descriptions of hydroclimatic processes should incorporate what is known about the driving physical mechanisms of the processes. Despite a common misconception of stochastics as black-box approaches whose blind use of data disregards the system dynamics, several celebrated examples, including statistical thermophysics and the modelling of turbulence, emphasize the opposite, i.e. the fact that stochastics is an indispensable, advanced and powerful part of physics. Other simpler examples (e.g. Koutsoyiannis, 2010) indicate how known deterministic dynamics can be fully incorporated in a stochastic framework and reconciled with the unavoidable emergence of uncertainty in predictions.
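One way to picture the proposed shift from reducing to quantifying uncertainty is a minimal Monte Carlo sketch (again using the Hénon map as a generic stand-in, not any model cited above). The deterministic dynamics are kept fully intact; only the imperfectly known initial state is treated as random, and what gets reported is the ensemble spread rather than a single trajectory.

```python
# Minimal sketch of a stochastic treatment of known deterministic
# dynamics (the Henon map as a generic stand-in; not any model cited
# in the text). The dynamics are used exactly as given, but the
# initial state is treated as uncertain, and we quantify -- rather
# than try to eliminate -- the resulting prediction uncertainty.
import random
import statistics

def henon_step(x, y, a=1.4, b=0.3):
    return 1.0 - a * x * x + y, b * x

def run(x, y, steps=50):
    """Evolve the deterministic dynamics and return the final x."""
    for _ in range(steps):
        x, y = henon_step(x, y)
    return x

random.seed(1)
# Ensemble of initial states: tiny (1e-6) uncertainty around (0.1, 0.1).
ensemble = [run(0.1 + random.gauss(0.0, 1e-6), 0.1) for _ in range(200)]
print("ensemble mean :", statistics.mean(ensemble))
print("ensemble stdev:", statistics.stdev(ensemble))
```

Even a millionth-sized uncertainty in the initial state leaves, after a modest number of steps, an ensemble spread comparable to the system's full range of variability; the honest product of such a model is the distribution, not a point forecast.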
h/t to WUWT reader Don from Paradise
The fraud continues to unravel.
The temperature data gathering is defective from the get-go. Collecting data from blacktop jungles causes a global rise in temps every time it is tried. Collecting faulty data can’t produce believable results.
Thanks very much for your posts on these topics, Ernest. Al Gore should be in prison. The IPCC (UN) should be dismantled. UNaccountable, fraudulent, lying POS totalitarians.
There are some good comments related to this from one of Pournelle’s readers/correspondents here:
http://www.jerrypournelle.com/mail/2010/Q4/mail651.html#Saturday
Basically, he did this same thing with the models about 10 years ago: he “hindcasted” with one of the top models, feeding in data from the first 30 years of the 20th century to see if it would “predict”, say, 1970.
The oceans boiled off in 1950.
So much for that model.
In another case, they *halved* the solar constant. Temperatures continued to rise, albeit more slowly.
I don’t care who you are, anyone who thinks things would continue to warm with half the energy input has a screw loose. Another crap model.
So it goes.
And the typical “garbage in, garbage out” applies on so many levels it is disgusting. They base temperature reconstructions on very selective ancient tree-ring growth, which depends as much or more on rainfall as on temperature. They use data sites compromised by external influences, i.e. concrete, asphalt, air conditioners, etc. They ignore actual water-temperature data, even though most people know (although I’m not sure about climate scientists) that the earth’s surface is largely composed of water. And, worst of all, they change input data to support their preconceived belief in AGW.