Free Republic

To: All
Found this at Watts Up With That?:

Do We Care if 2010 is the Warmist Year in History?

******************************************************************************************

Posted by Ira Glickstein, PhD

Guest Post by Ira Glickstein
According to the latest from NASA GISS (Goddard Institute for Space Studies), 2010 is shaping up to be “the warmest of 131 years”, based on global data from January through November. They compare it to 2005 (“2nd warmest of 131 years”) and 1998 (“5th warmest of 131 years”).

We won’t know until the December data is in. Even then, given the level of noise in the base data and the wiggle room in the analysis, each of which is about the same magnitude as the Global Warming they are trying to quantify, we may not know for several years. If ever. GISS seems to analyze the data for decades, if necessary, to get the right answer.

A case in point is the still ongoing race between 1934 and 1998 to be the hottest for US annual mean temperature, the subject of one of the emails released in January of this year by NASA GISS in response to a FOIA (Freedom of Information Act) request. The 2007 message from Dr. Makiko Sato to Dr. James Hansen traces the fascinating story of that hot competition. See the January WUWT post and my contemporary graphic, which was picked up by several websites at that time.

The great 1934 vs 1998 race for US warmest annual mean temperature. Ira Glickstein, Dec 2010.

[My new graphic, shown here, reproduces Sato's email text, including all seven data sets, some or all of which were posted to her website. Click image for a larger version.]

The Great Hot 1934 vs 1998 Race
1) Sato’s first report, dated July 1999, shows 1934 with an impressive lead of over half a degree (0.541ºC to be exact) above 1998.

Keep in mind that this is US-only data, gathered and analyzed by Americans. Therefore, there is no possibility of fudging by the CRU (Climategate Research Unit) at East Anglia, England, or bogus data from Russia, China, or some third-world country. (If there is any error, it was due to home-grown error-ists :^)

Also note that total Global Warming, over the past 131 years, has been, according to the IPCC, GISS and CRU, in the range of 0.7ºC to 0.8ºC. So, if 1934 was more than 0.5ºC warmer than 1998, that difference is around 70% of the total (0.541/0.75 ≈ 0.72).

At the time of this analysis, July 1999, the 1998 data had been in hand for more than half a year. Nearly all of it was from the same reporting stations as previous years, so any adjustments for relocated stations or those impacted by nearby development would be minor. The 1934 data had been in hand for, well, 65 years (eligible to collect Social Security :^) so it had, presumably, been fully analyzed.

Based on this July 1999 analysis, if I were a betting man, I would have put my money on 1934 as a sure thing. However, that was not to be, as Sato’s email recounts.

Why? Well, given steadily rising CO2 levels, and the high warming sensitivity of virtually all climate models to CO2, it would have been, let us say, inconvenient for 1998 to have been bested by a hot golden oldie from over 60 years before! Kind of like your great grandpa beating you in a foot race.

2) The year 2000 was a bad one for 1934. November 2000 analysis seems to have put it on a downhill ski slope that cooled it by nearly a fifth of a degree (-0.186ºC to be precise). On the other hand, it was a very good year for 1998, which, seemingly put on a ski lift, managed to warm up by nearly a quarter of a degree (+0.233ºC). That confirms the Theory of Conservation of Mass and Energy. In other words, if someone in your neighborhood goes on a diet and loses weight, someone else is bound to gain it.

OK, now the hot race is getting interesting, with 1998 only about an eighth of a degree (0.122ºC) behind 1934. I’m still rooting for 1934. How about you?

3) Further analysis in January 2001 confirmed the downward trend for 1934 (which lost an additional 26th of a degree, about -0.038ºC) and the upward movement of 1998 (which gained an additional 21st of a degree, about +0.048ºC), tightening the hot race to a 28th of a degree (0.036ºC).

Good news! 1934 is still in the lead, but not by much!

4) Sato’s analysis and reporting on the great 1934 vs 1998 race seem to have taken a hiatus between 2001 and 2006. When the cat’s away, the mice will play, and 1998 did exactly that. The January 2006 analysis has 1998 unexpectedly tumbling, losing over a quarter of a degree (-0.269ºC), and restoring 1934’s lead to nearly a third of a degree (0.305ºC). Sato notes in her email “This is questionable, I may have kept some data which I was checking.” Absolutely, let us question the data! Question, question, question … until we get the right answer.

5) Time for another ski lift! January 2007 analysis boosts 1998 by nearly a third of a degree (+0.312ºC) and drops 1934 a tiny bit (-0.008ºC), putting 1998 in the lead by a bit (0.015ºC). Sato comments “This is only time we had 1998 warmer than 1934, but one [on?] web for 7 months.”

6) and 7) The March and August 2007 analyses show tiny adjustments. However, in what seems to be a photo finish, 1934 sneaks ahead of 1998, being warmer by a tiny amount (0.023ºC). So, hooray! 1934 wins and 1998 is second.

OOPS, the hot race continued after the FOIA email! I checked the tabular data at GISS Contiguous 48 U.S. Surface Air Temperature Anomaly (C) today and, guess what? Since the Sato FOIA email discussed above, GISS has continued their taxpayer-funded work on both 1998 and 1934. The Annual Mean for 1998 has increased to 1.32ºC, a gain of a bit over an 11th of a degree (+0.094ºC), while poor old 1934 has been beaten down to 1.2ºC, a loss of about a 20th of a degree (-0.049ºC). So, sad to say, 1934 has lost the hot race by about an eighth of a degree (0.12ºC). Tough loss for the old-timer.
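For anyone who wants to check the bookkeeping, here is a minimal Python sketch that replays the race using only the deltas quoted above (the fractions of a degree are entered as fractions). Running it reproduces the 0.122, 0.036 and 0.305 gaps, and the brief 0.015 lead for 1998:

```python
# Replay the 1934-vs-1998 race across GISS analysis "vintages",
# using only the deltas quoted in this post. Positive gap = 1934 leads.
gap = 0.541  # July 1999: 1934 leads 1998 by 0.541 C

adjustments = [            # (vintage, change to 1934, change to 1998)
    ("Nov 2000", -0.186, +0.233),
    ("Jan 2001", -1 / 26, +1 / 21),   # "a 26th" and "a 21st" of a degree
    ("Jan 2006",  0.000, -0.269),
    ("Jan 2007", -0.008, +0.312),
]

for vintage, d1934, d1998 in adjustments:
    gap += d1934 - d1998
    leader = "1934" if gap > 0 else "1998"
    print(f"{vintage}: {leader} leads by {abs(gap):.3f} C")
```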

Analysis of the Analysis

What does this all mean? Is this evidence of wrongdoing? Incompetence? Not necessarily. During my long career as a system engineer I dealt with several brilliant analysts, all absolutely honest and far more competent than me in statistical processes. Yet, they sometimes produced troubling estimates, often due to poor assumptions.

In one case, prior to the availability of GPS, I needed a performance estimate for a Doppler-Inertial navigation system. They computed a number about 20% to 30% worse than I expected. In those days, I was a bit of a hothead, so I stormed over and shouted at them. A day later I had a revised estimate, 20% to 30% better than I had expected. My conclusion? It was my fault entirely. I had shouted too loudly! So, I went back and sweetly asked them to try again. This time they came in near my expectations, and that was the value we promised to our customer.

Why had they been off? Well, as you may know, an inertial system is very stable, but it drifts back and forth on an 84-minute cycle (the Schuler period: that of a pendulum whose length is the radius of the Earth). A Doppler radar does not drift, but it is noisy and may give erroneous results over smooth surfaces such as water and grass. The analysts had designed a Kalman filter that modeled the error characteristics to achieve a net result considerably better than either the inertial or the Doppler alone. To estimate performance they needed to assume the operating conditions, including how well the inertial system had been initialized prior to takeoff, and the terrain conditions for the Doppler. Change assumptions, change the results.
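That sensitivity is easy to reproduce in miniature. Here is a sketch (hypothetical sigmas, and a static inverse-variance blend standing in for the full Kalman filter) of how the predicted accuracy of a Doppler-inertial blend moves with the assumed operating conditions:

```python
# Toy version of "change assumptions, change the results": the
# minimum-variance blend of two independent, unbiased estimates
# (the static form of the Kalman update) has a predicted error that
# depends entirely on the error sigmas you ASSUME for each sensor.
# All numbers below are hypothetical, not from the real system.

def blended_sigma(sigma_inertial: float, sigma_doppler: float) -> float:
    """1-sigma error of the inverse-variance-weighted blend."""
    info = 1 / sigma_inertial**2 + 1 / sigma_doppler**2
    return info ** -0.5

for label, s_in, s_dop in [
    ("good alignment, benign terrain", 1.0, 2.0),
    ("over water (noisy Doppler)",     1.0, 6.0),
    ("poor pre-takeoff alignment",     3.0, 2.0),
]:
    print(f"{label}: blended 1-sigma = {blended_sigma(s_in, s_dop):.2f}")
```

The filter is identical in all three runs; only the assumed error budget changes, and the accuracy you can promise the customer changes with it.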

Conclusions

Is 2010 going to be declared warmest global annual by GISS after the December data comes in? I would not bet against that. As we have seen, they keep questioning and analyzing the data until they get the right answers. But, whatever they declare, should we believe it? What do you think?

Figuring out the warmest US annual is a lot simpler. Although I (and probably you) think 1934 was warmer than 1998, it seems someone at GISS, who knows how to shout loudly, does not think so. These things happen and, as I revealed above, I myself have been guilty of shouting at analysts. But, I corrected my error, and I was not asking all the governments of the world to wreck their economies on the basis of the results.

6 posted on 01/07/2011 10:00:54 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)


To: All
From the comments to the WUWT article...see post #6:

**********************************EXCERPT************************************

D. J. Hawkins says:

December 25, 2010 at 4:54 pm

I took a quick peek at the GISS website to try and understand how they crank out their numbers, and even a cursory glance was daunting. Has there been a clear presentation of the methodology somewhere? I would think that once you nail down the method, no matter how many times you run the analysis the results should be the same. If the assumptions regarding initial conditions are so fungible as to allow a reversal of the relative values of the anomalies at will, you don’t have a scientific analytical tool, you have a propaganda tool.

7 posted on 01/07/2011 10:06:36 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)

To: All
More from the comments ...see post #6:

********************************EXCERPT*********************************************

Mike Davis says:

December 25, 2010 at 5:01 pm

The method for analyzing temperatures means the records remain fluid. Because this is done with a model, each time the model is rerun you will get different results for all periods. This is not like a spreadsheet where the records are fixed once the output is complete, as they would be following accounting practices. Using three decimal places when they are starting with whole numbers is also cheating. The end result is that we do not know what the temperature has done for the last one hundred and fifty years!
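Whether quoting three decimals from whole-degree readings is “cheating” depends on the error model, which a quick sketch with made-up data makes concrete:

```python
import random
random.seed(0)

# Made-up example: 1000 temperatures recorded only to the nearest
# whole degree, as in many historical records.
true_vals = [random.gauss(15.0, 5.0) for _ in range(1000)]
recorded  = [round(t) for t in true_vals]

print(f"mean of underlying values: {sum(true_vals) / 1000:.3f}")
print(f"mean of whole-degree data: {sum(recorded) / 1000:.3f}")
# If rounding errors are random, they largely cancel in the mean, so
# extra decimals in an average are not automatically meaningless --
# but any SYSTEMATIC recording bias passes straight through, and no
# amount of averaging recovers it.
```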

9 posted on 01/07/2011 10:09:07 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)

To: All
From the comments at #6:

*****************************************EXCERPT***************************************

A C of Adelaide says:

December 25, 2010 at 5:51 pm

It may seem obvious, but I have never seen it explicitly stated, so I say it here. It seems to me that there are five completely independent ways to become a sceptic.

1/ Science. A person can examine the science of AGW theory and become sceptical of the science.
2/ Predictions. A person can take the science purely at face value but become sceptical when the measured global temperatures can be seen not to match those predictions.
3/ Data sets. A person can become a sceptic by simply losing confidence in the global temperature data sets, by noticing the uncertainties in the data collection and “corrections”.
4/ Dirty tricks. A person could rationally ignore the science and ignore the temperature graphs and become sceptical solely on the basis of the known fraud, dirty tricks and bad faith of some of the main AGW crew. Loss of trust.
5/ Money. It would be totally rational to be sceptical of a group of scientists funded by (say) the tobacco industry and consider any of their output as possibly lacking independence. Similarly, it would be entirely rational to become sceptical of a group of scientists who are openly competing for grant money from pro-global-warming funding bodies. One does not need to understand science to understand conflict of interest.

The science is the most difficult and demanding pathway, so I think many people wouldn’t come at it directly from this route – which would explain the AGW camp’s frustration that no one is listening to their “science is settled” mantra anymore. There are so many easier routes by which they have lost credibility. (I note it may also explain why less educated people are less impressed by the “science is settled” mantra.)

My own personal route to scepticism, for example, came through pathway 4: first doubts after the release of the Climategate emails, then outright scepticism after reading Case Study 12 from D’Aleo and Watts (2010), “Hide this after Jim checks it”, which you allude to. The idea that you can make undocumented changes to “raw” data and still call it “raw” was quite shocking.

I guess due to pathways 3, 4 and 5 this “hottest year ever” nonsense has lost traction.

15 posted on 01/07/2011 10:19:34 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)

To: All
More...from comments to article at post #6:

*************************************EXCERPT**************************************

Note: Onion seems to be a defender of AGW....

********************************

Sam Parsons says:

December 25, 2010 at 7:28 pm

Onion Quotes Hansen:
“However, there have been changes of the time of observation by many of the cooperative weather observers in the United States [Karl et al., 1986]. Furthermore, the change has been systematic with more and more of the measurements by United States cooperative observers being in the morning, rather than the afternoon. This introduces a systematic error in the monthly mean temperature change.”

The problem solved was created by sheer idiocy and the solution compounds the idiocy. The problem solved is that persons who recorded temperatures did not do so at the same time. The fact that such a problem exists shows that the persons in charge of collecting data really did not give a damn about the data, or they would have trained their data collectors properly. Because they did not train them regarding time of day, they probably did not train them regarding siting. In other words, for lack of uniform standards, the data is sh*t. It always has been and always will be. But rather than admit that his glorious science is based on worthless data, what does Hansen do? He decides that he will correct all those time-of-day recording errors in one fell swoop.
Fortunately for Hansen, it is possible to do this because the errors are systematic; that is, everyone who made the error made exactly the same error! Lucky Hansen and lucky us! He will use a little program that he wrote and that will make everything hunky-dory.

Onion, you cannot possibly believe this b*llsh*t. Were you never conned out of your lunch money by an older kid at school? You know, the kind of kid who just takes pride in being sleazy and bullying younger kids. Hansen writes in exactly the same way that the school yard con artist talks.
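For readers wondering what a “systematic” time-of-observation error looks like in practice, here is a toy simulation (invented weather, not real station data) of a min/max thermometer read once a day:

```python
import math, random
random.seed(1)

# Hourly temperatures for 90 days: a diurnal sine peaking mid-afternoon
# plus random day-to-day swings, read from a min/max thermometer that
# is reset once a day at a fixed hour.
HOURS = 24 * 90
temps, offset = [], 0.0
for h in range(HOURS):
    if h % 24 == 0:
        offset = random.gauss(0.0, 3.0)       # today runs warm or cold
    diurnal = 5.0 * math.sin(2 * math.pi * (h % 24 - 9) / 24)  # max ~15:00
    temps.append(15.0 + offset + diurnal)

def mean_temp(reset_hour: int) -> float:
    """Mean of (max+min)/2 over successive reset-to-reset 24 h windows."""
    mids, start = [], reset_hour
    while start + 24 <= HOURS:
        window = temps[start:start + 24]
        mids.append((max(window) + min(window)) / 2)
        start += 24
    return sum(mids) / len(mids)

print(f"morning observer (07:00):   {mean_temp(7):.2f}")
print(f"afternoon observer (17:00): {mean_temp(17):.2f}")
# A hot afternoon sits next to the 17:00 reset and gets counted in two
# consecutive windows, so afternoon observers read systematically warm
# relative to morning observers from identical weather.
```

Same instrument, same weather, different reading hour, different monthly mean; that is the systematic shift the quoted passage describes, whatever one thinks of correcting for it after the fact.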


20 posted on 01/07/2011 10:30:21 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)

To: All
This lengthy comment is from someone who did some major work:

***************************************EXCERPT********************************************

E.M.Smith says:

December 25, 2010 at 9:42 pm

D. J. Hawkins says: I took a quick peek at the GISS website to try and understand how they crank out their numbers, and even a cursory glance was daunting.

Yup. Put me off for about 2 years. I kept saying “SOMEBODY needs to look at this!”… then one day I realized “I am somebody.”

I’ve downloaded it to a Linux box and got it to run. It ‘has issues’ but I figured out how to get past them. Details on request and posted on my web site. It required a long slow pass through the code…

Has there been a clear presentation of the methodology somewhere?

No.

There has been a lot of presentation of the methodology. Much of it is in technical papers that present a method that is used in the code, but the code does more than (or sometimes less than, or sometimes just something different from) what the papers describe.

It’s a convoluted complicated beast. Starter guide here:

http://chiefio.wordpress.com/gistemp/

I would think that once you nail down the method, no matter how many times you run the analysis the results should be the same.

You would think that. I thought that. It’s not that…

The code is designed in such a way that EVERY time you run it (with any change of starting data AT ALL) you will get different results. And both the GHCN and USHCN input data sets change constantly. Not even just monthly, but even mid-month things just ‘show up’ changed in the data sets.

So to talk about “what GIStemp does” at any point in time requires specification of the “Vintage” of data used TO THE DAY and potentially to the hour and minute.

Why?

Infill of missing data, homogenizing via the “Reference Station Method”, UHI adjustment via the same method, and a grid/box anomaly calculation that uses different sets of thermometers in the grid/box at the start and end times (so ANY change of data in the recent “box” can change the anomalies…), plus some more too…
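A deliberately extreme toy case (two invented stations, not the actual GIStemp algorithm) shows the kind of artifact a changing thermometer mix can produce:

```python
# Two invented stations in one grid box. Each is dead flat -- no
# warming at either site -- but the cooler station stops reporting
# in 1980, and the box is averaged over whoever reports each year.
years = range(1950, 2011)
warm_site = {y: 14.0 for y in years}                  # full record
cool_site = {y: 10.0 for y in years if y < 1980}      # dies in 1980

def box_mean(year):
    vals = [s[year] for s in (warm_site, cool_site) if year in s]
    return sum(vals) / len(vals)

baseline = sum(box_mean(y) for y in range(1951, 1981)) / 30.0
for y in (1955, 1975, 1995, 2005):
    print(y, f"{box_mean(y) - baseline:+.2f}")   # ~2 C of spurious "warming"
```

Neither site warmed, yet the box “warms” by about two degrees when the cool station drops out. Where in the chain the anomaly is taken, and which stations are in the box when, is exactly the issue being pointed at here.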

If the assumptions regarding initial conditions are so fungible as to allow a reversal of the relative values of the anomalies at will, you don’t have a scientific analytical tool, you have a propaganda tool.

Can I quote you on that? It’s rather well said…

IMHO, the use of “Grid / Box anomalies” (calculated AFTER a lot of data manipulation, adjustment, averaging, homogenizing, etc. done on monthly temperature averages…) mixed with changing what thermometers are in a “grid / box” in the present vs. the past lets you “tune” the data such that GIStemp will find whatever warming you like. It’s cleverly done (or subtle enough that they missed the “bug”… being generous) and if a good programmer devotes about 2 years to it they can get to this point of understanding. Everyone else is just baffled by it. Draw your own conclusions…

I’ve tried explaining it to bright folks. A few ‘get it’. Most just get glazed. Some become hostile. I’ll explain it to anyone who wants to put in the time, but it will take a couple of weeks (months if not dedicated) and few folks are willing to ‘go there’.

Judging from the look of the code, it was written about 30 years ago and never revisited (just more glued on, often ‘at the end’ of the chain). From that I deduce that either Hansen is unwilling to change “his baby” or very few folks are willing to shove their brains through that particular sieve…

The bottom line is that “the GIStemp code” is DESIGNED to never be repeatable and to constantly mutate the results as ANY input data changes, and that makes ALL the output shift. It’s part of the methodology in the design. Don’t know if that’s “malice” or “stupidity”…

It is my assertion that this data sensitivity is what GIStemp finds when it finds warming. The simple fact that as new data is added the past shifts to a warmer TREND indicates to me that the METHOD induces the warming, not the actual trend in the data. I’ve gone through the data a LOT and find 1934 warmer than 1998, with a method that IS repeatable. See:

http://chiefio.wordpress.com/category/dtdt/

http://chiefio.wordpress.com/category/ncdc-ghcn-issues/

Basically, 1934 and 1998 ought to stay constant relative to EACH OTHER even as new data is added, even IF you were ‘adjusting the past cooler’ to make up for something or other (nominally UHI or TOBS – yes, I know, it sounds crazy to make the past cooler for UHI correction, but it’s “what they do” in many cases).

As it is, they jockey for relative position with a consistent, though stochastic, downward drift of the older relative to the newer. That tells me it’s the method, not the data, that’s warming.
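That invariance argument fits in a few lines (hypothetical anomaly values): a uniform shift applied to every year is algebraically incapable of changing the difference between two years, so a drifting 1934-minus-1998 gap implies year-dependent adjustments.

```python
record = {1934: 1.25, 1998: 1.23}      # hypothetical anomalies, deg C
gap = record[1934] - record[1998]

# A uniform shift (pure re-baselining) preserves every year-to-year gap.
shifted = {y: t - 0.30 for y, t in record.items()}
assert abs((shifted[1934] - shifted[1998]) - gap) < 1e-12

# A year-dependent adjustment does not.
adjusted = {1934: record[1934] - 0.05, 1998: record[1998] + 0.09}
print(f"gap before: {gap:+.3f}, after: {adjusted[1934] - adjusted[1998]:+.3f}")
```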


22 posted on 01/07/2011 10:43:26 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)

To: All
More from the comments to article at post #6:

********************************EXCERPT***************************************

Cassandra King says:

December 25, 2010 at 10:10 pm

In the USSR and its enslaved satellite regions that I visited in the early 80s, ordinary unconnected people would look at the state broadcasts of record grain harvests and then at the massive queues for the tiny quantities of bread that were available; the people could easily see the ‘truth deficit’.

The state/establishment/ruling parasite class is desperate to sell the idea of CAGW; so desperate are they that, like the USSR, they have turned to telling ever bigger lies and deceptions. The gap between the observations of ordinary people and what they are told has reached breaking point; henceforth we will see ordinary people acting like those of the USSR: they will not believe anything the lying regime says, whether it is true or not. Trust has broken down, the bonds of trust between the political class and the people are broken, and we know the political class and their stooge Lysenkos are lying through their teeth.

The political class needs CAGW whether it is real or not: it allows them to control carbon and control the masses, it allows the rich to become richer and the powerful to become more powerful. The CAGW fraud is plan A; I can only imagine that a plan B would be a nightmare.

24 posted on 01/07/2011 10:56:55 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)

To: All
More:

****************************************EXCERPT******************************************

E.M.Smith says:

December 25, 2010 at 9:55 pm

Onion says: So this race between 1934 and 1998 was in the US temperature record, that comprises 2% of the Earth’s surface.

Um, no.

The article is a little unclear on that point, but the details are rather devilish, so I’d give them some slack there. The reality is rather complex…

GISS uses a code called GIStemp. It is that US CODE that is finding 1998 warmer than 1934 (sometimes).

GIStemp takes as input BOTH the USHCN (USA-only data) and the GHCN (Global Historical Climate Network – whole-world data). Except that between about 2007(?) and about November of 2010 it took in the USHCN but only used it up to 2007. Then in November it suddenly started using all of it (having finally added the code to use the newer version)… EXCEPT that the new version of USHCN was all different from the old version (warmer), so direct comparisons of old and new GIStemp are not, er, “valid”? “reasonable”?

OK, in the first step of GIStemp, it does a garbled “half averaging” of USHCN and GHCN, but only for the USA stations. Each, you see, has a different ‘adjustment history’, so it tries to undo some of the adjustments in one and put in the adjustments from the other; where it only has one, it just uses whatever one it has, mismatched adjustments and all.

Oh, and it fills in missing data by making it up.

No, honest. It is called “The Reference Station Method” and it is used both to press-fit the data to look like what they think it ought to be (called ‘homogenizing’) and to fill in missing bits with what they think would look nice and fit in well.
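For the curious, the flavor of that infill (a sketch of the published reference-station idea with invented numbers, not GIStemp’s actual code) looks like this:

```python
# Fill a gap at one station from a neighbor, shifted by the mean
# offset between the two stations over the years both report.
# Invented numbers; a sketch of the idea, not GIStemp's code.
target   = {1951: 10.1, 1952: 10.4, 1953: None, 1954: 10.2}
neighbor = {1951: 12.0, 1952: 12.5, 1953: 12.9, 1954: 12.1}

overlap = [y for y, t in target.items() if t is not None]
offset = sum(target[y] - neighbor[y] for y in overlap) / len(overlap)

filled = {y: (t if t is not None else round(neighbor[y] + offset, 2))
          for y, t in target.items()}
print(filled)   # 1953 is now "data", and moves if the neighbor's record does
```

Note that the infilled value inherits any later revision to the neighbor’s record, which is one reason reruns on fresh data can move old numbers.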

THEN that mush goes on to the following steps (detailed in the links I gave above for anyone courageous enough to ‘go there’).

So, “onion”, they use the whole global data set. It’s just what they DO with it that’s, er, odd.


25 posted on 01/07/2011 11:05:01 AM PST by Ernest_at_the_Beach ( Support Geert Wilders)

To: Ernest_at_the_Beach

So... from what I’ve seen, experienced, and read, it would seem that, given the margin of error, the AVERAGE GLOBAL TEMPERATURE has not really changed at all.


27 posted on 01/07/2011 11:28:12 AM PST by UCANSEE2 (Lame and ill-informed post)


