Free Republic
Browse · Search
News/Activism
Topics · Post Article

To: elfman2
McIntyre & McKitrick found that the Mann et al. methodology included a data pre-processing step, ...

Muller, in his Technology Review article, gives a brief description of PCA and of Mann's flawed procedure that yielded the hockey stick.

.... McIntyre and McKitrick obtained part of the program that Mann used, and they found serious problems. Not only does the program not do conventional PCA, but it handles data normalization in a way that can only be described as mistaken.

Now comes the real shocker. This improper normalization procedure tends to emphasize any data that do have the hockey stick shape, and to suppress all data that do not. To demonstrate this effect, McIntyre and McKitrick created some meaningless test data that had, on average, no trends. This method of generating random data is called “Monte Carlo” analysis, after the famous casino, and it is widely used in statistical analysis to test procedures. When McIntyre and McKitrick fed these random data into the Mann procedure, out popped a hockey stick shape!
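The Monte Carlo experiment described above can be sketched in a few lines of NumPy. This is my own toy reconstruction of the effect, not McIntyre and McKitrick's actual code: the AR(1) persistence, window lengths, number of trials, and the crude "blade score" are all illustrative choices. It compares the leading principal component of trendless red noise under conventional full-period centering versus the short-window ("decentered") centering described below.

```python
import numpy as np

N_YEARS, N_SERIES = 581, 70        # 1400-1980, ~70 proxy-like series
CAL = slice(-79, None)             # 1902-1980 "calibration" window
FULL = slice(None)                 # 1400-1980: conventional centering

def red_noise(rng, n, phi=0.9):
    """Trendless AR(1) ('red') noise: persistent, but no trend on average."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def pc1_blade(data, center):
    """Normalize each column over `center`, extract PC1 via SVD, and score
    how far PC1's calibration-window mean steps away from the rest of the
    record, in units of PC1's overall spread (a crude hockey-stick score)."""
    d = (data - data[center].mean(axis=0)) / data[center].std(axis=0)
    u, s, _ = np.linalg.svd(d, full_matrices=False)
    pc1 = u[:, 0] * s[0]
    return abs(pc1[CAL].mean() - pc1[:-79].mean()) / pc1.std()

centered, decentered = [], []
for seed in range(8):              # a handful of Monte Carlo trials
    rng = np.random.default_rng(seed)
    data = np.column_stack([red_noise(rng, N_YEARS) for _ in range(N_SERIES)])
    centered.append(pc1_blade(data, FULL))   # standard PCA
    decentered.append(pc1_blade(data, CAL))  # decentered normalization

print(f"centered:   {np.mean(centered):.2f}")
print(f"decentered: {np.mean(decentered):.2f}")
```

On random, trendless input the decentered score comes out consistently larger than the conventionally centered one: the short-window centering manufactures a step between the calibration window and the rest of the record, which PC1 then picks up as a "blade."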

That discovery hit me like a bombshell, and I suspect it is having the same effect on many others. Suddenly the hockey stick, the poster-child of the global warming community, turns out to be an artifact of poor mathematics. How could it happen? What is going on? Let me digress into a short technical discussion of how this incredible error took place.

In PCA and similar techniques, each of the (in this case, typically 70) different data sets has its average subtracted (so it has a mean of zero) and is then multiplied by a number that makes its average variation around that mean equal to one; in technical jargon, we say that each data set is normalized to zero mean and unit variance. In standard PCA, each data set is normalized over its complete data period; for the key climate data sets that Mann used to create his hockey stick graph, this was the interval 1400-1980. But the computer program Mann used did not do that. Instead, it forced each data set to have zero mean for the time period 1902-1980, and to match the historical records for this interval. This is the period when the historical temperature is well known, so the procedure does guarantee the most accurate temperature scale. But it completely screws up PCA. PCA is mostly concerned with the data sets that have high variance, and the Mann normalization procedure tends to give very high variance to any data set with a hockey stick shape. (Such data sets have zero mean only over the 1902-1980 period, not over the longer 1400-1980 period.)

The net result: the “principal component” will have a hockey stick shape even if most of the data do not.
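The variance inflation behind that result can be shown with a minimal numerical sketch. This is my own toy illustration, not Mann's code or data: the series lengths, noise level, and blade size are arbitrary. It compares a trendless noise series against a hockey-stick-shaped one under the two centering conventions.

```python
import numpy as np

def normalize(series, window):
    """Subtract the mean and divide by the standard deviation,
    both computed over `window` only."""
    return (series - series[window].mean()) / series[window].std()

years = np.arange(1400, 1981)      # 581 "years"
full = slice(None)                 # 1400-1980: conventional centering
late = slice(-79, None)            # 1902-1980: the centering described above

rng = np.random.default_rng(0)
flat = rng.normal(0.0, 1.0, years.size)          # trendless noise
stick = np.zeros(years.size)                     # flat shaft...
stick[late] = np.linspace(0.0, 3.0, 79)          # ...plus a 20th-century blade
stick += rng.normal(0.0, 1.0, years.size)

# Conventional (full-period) normalization gives every series unit
# variance over 1400-1980, so PCA weights them comparably:
print(normalize(flat, full).var(), normalize(stick, full).var())

# With 1902-1980 centering, the flat series is barely affected, but the
# hockey-stick series keeps a large offset over 1400-1901, so its
# full-period variance -- the thing PCA responds to -- is inflated:
print(normalize(flat, late).var())    # still close to 1
print(normalize(stick, late).var())   # well above 1
```

The hockey-stick series ends up with a full-period variance well above one, while the trendless series stays near one, so the eigenvector decomposition preferentially loads onto anything blade-shaped.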

42 posted on 01/30/2005 11:21:38 PM PST by mista science


To: mista science
Let me digress into a short technical discussion of how this incredible error took place.

How charitable.

52 posted on 01/31/2005 4:28:18 AM PST by Carry_Okie (There are people in power who are really evil.)

To: mista science; Drammach; shubi; gobucks

Thanks! That’s what I most wanted in order to understand what went wrong.

I've forgotten most of the statistics needed to really get that, but I think it's saying that quirks in the program they chose to use led them to normalize the 20th-century data separately from the earlier data, and to tie only the 20th-century set to modern temperature readings. And that's behind the false claim that world temperatures are rising.


55 posted on 01/31/2005 8:54:07 AM PST by elfman2



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson