Posted on 12/05/2009 8:43:17 AM PST by ricks_place
No, I was probably wrong about that. They are applying the fudge factor for each 5-year period, so the negative factors fall in the 1940s. I'll have to look at the context of this piece of code.
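For anyone trying to picture what "applying the fudge factor for each 5-year period" means in code, here is a minimal Python sketch. The knot years and adjustment values below are illustrative placeholders of mine, not the actual numbers from the leaked files (the routines in the archive were IDL); the point is just the mechanism: an adjustment is defined at 5-year intervals and interpolated onto each year of the series.

```python
# Minimal sketch (NOT the actual CRU code) of applying a per-5-year
# adjustment array to a yearly series. Knot years and adjustment
# values are hypothetical placeholders for illustration only.
import numpy as np

# Hypothetical 5-year knot points and adjustment ("fudge") factors;
# the negative values fall in the 1940s, matching the pattern
# discussed above.
knot_years = np.arange(1920, 1970, 5)            # 1920, 1925, ..., 1965
adjustments = np.array([0.0, 0.0, 0.0, 0.0,
                        -0.3, -0.5,              # 1940, 1945: negative factors
                        0.0, 0.0, 0.0, 0.0])

def adjust(years, values):
    """Add the adjustment, interpolated to each year, to the raw values."""
    per_year = np.interp(years, knot_years, adjustments)
    return values + per_year

years = np.arange(1920, 1966)
raw = np.zeros_like(years, dtype=float)          # flat series, for illustration
print(adjust(years, raw)[20:30])                 # dips across 1940-1949
```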
**
"HARRY_read_me.txt. This is a 4 year-long work log of Ian (Harry) Harris who was working to upgrade the documentation, metadata and databases associated with the legacy CRU TS 2.1 product, which is not the same as the HadCRUT data (see Mitchell and Jones, 2003 for details). The CSU TS 3.0 is available now (via ClimateExplorer for instance), and so presumably the database problems got fixed. Anyone who has ever worked on constructing a database from dozens of individual, sometimes contradictory and inconsistently formatted datasets will share his evident frustration with how tedious that can be."
Schmidt also said: "What 'Harry' was doing is exactly that - upgrading legacy code so that it actually works and can be used by others."
I tried to follow along on the massive "CRU Hack" threads on RealClimate -- ultimately I had to give up. Schmidt seemed to be saying that the code was related to a temperature dataset that was not widely used. So my question is: was it used? Where? How much? Was it deemed consequential in any papers? Were the papers important?
I don't know, and I don't have the time to find out. Somebody want to enlighten me (us, me and palmer, all of FreeRepublic) -- go right ahead!
They were using Hasbro Easy-Bake-Oven-level software, on a 7th grader's idea of what constituted good data, while promising Michelin-five-star-level results?