Posted on 12/02/2009 4:07:10 PM PST by JoeSeales
The Harry Read Me File Is A Must Read On Climategate
Yesterday, Glenn Beck had on Welcome To Obamaland author James Delingpole from across the pond to discuss Climategate and the decision by climatologist Phil Jones to step down after all those hacked e-mails were released.
About midway through the interview, Beck asked Delingpole what was the one earth-shattering thing that everyone should understand about Climategate. His answer was a file called Harry Read Me, which documents the frustration of a programmer trying to manipulate the climate data in a program to give climatologists the result that they want. It even documents that the real raw climate data has been erased. Delingpole called it data-raping.
The document is not only very telling of the data manipulation going on, but it is also quite funny to read the programmer's frustrated comments when he can't get the program to work. Here are just a few excerpts:
What the hell is supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have : )
You can't imagine what this has cost me to actually allow the operator to assign false WMO (World Meteorological Organization) codes..."
OH F THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done, I'm hitting yet another problem that's based on the hopeless state of our databases.
Before we spend 145 trillion dollars on fixing a problem that doesn't exist, all of the facts should be presented, including these. However, since the media will not touch this, once again we'll all have to do their job and pass this around.
See the file here: http://www.joeseales.com/2009/12/02/the-harry-read-me-file/
(Excerpt) Read more at joeseales.com ...
Tried following the information but was just a bit over my head. I’m sure that many here will find it more explanatory and advise us. It does look like a good site for information though...
I will note that I tried to follow the code (C) but gave up; unless it is taken in the context of the whole, it is not exactly precise. There are sections that make little sense, though... leaving it up to other coders to make some sense of it.
This is really something. America will be the very last to catch on, but we will.
Whoaaa! Wait a minute.
CRU says the dataset gathered from meteorological stations around the globe was destroyed in 1980!
Now what data was Harry (actually Ian (Harry) Harris) working on?
It couldn’t be the base data — that was gone. Was he frustrated at having to wrestle with the “twiddled” data that Jones & Co. had given him?
And, to further confound the issue: what dataset was the base (or original) data that other climate scientists used for their evaluations? And if their results approximated what the CRU had published, after torturing the original data into screaming, how reliable is the meteorological station data they worked with? Or was it all the same?
The Climate Pool (on Facebook)
Their "Mission Statement"
The Climate Pool is a hub for a global discussion on the Climate Summit in Copenhagen, planned to run from December 7-19. It is a Facebook fan page managed by leading news agencies all over the world which are members of the global media network MINDS International.
The issues at stake in Copenhagen affect the lives of billions of people in coming years. The summit, whatever the outcome, is likely to grip the attention of masses of news consumers the world over, yet the complex scientific underpinnings and attendant competing interests make the role of serious journalism more important than ever.
Around this journalism we hope to drive debate of the myriad aspects of the event with readers around the world.
The news agencies involved will be posting blog items, linking to relevant content and driving debate on the page's discussion boards, as well as using other tools of social media, moderating those discussions in accordance with the highest journalistic standards. This site is not aimed at replicating the traditional media coverage of such an event but at providing back stories and a forum for analysis and various points of view. Many of the participating news agencies will concentrate on national aspects concerning their home markets; some will focus more on a global market. All are committed to jointly providing a new model for collaborative journalism and reader engagement. To avoid language barriers, all content of this site will be in English.
The following MINDS news agencies will actively contribute stories in the Notes section:
**************************************
Locate them on your Facebook account by searching on The Climate Pool
If you do not have a Facebook account, this one item alone is enough to justify getting one.
Then help me and the rest of us "let em have it" :-)
bttt
For some reason, my computer won’t bring up the full note. But those extracts are certainly very incriminating.
Yes, this is crime, and these folks should be fired from their jobs and prosecuted for stealing government funds.
Great work Joe!! I'm no software weenie, but I've worked around it enough to see a very haphazard, long-term lack of any process control or configuration control over the changes and manipulations this programmer tried. I'll forward this to those smarter than I to see if they can discern more detail.
I had to click on the ‘plain text’ option to see it.
As did I. Gave me a headache. I am no longer a programmer, but I was for many years. How can you expect to get a viable outcome from that kind of mishmash? Did you notice the hundreds of uses of the word "synthetic"? I'd like to know exactly what that means. It's obvious from his notes that the data was in no way consistent and they were endeavoring to "normalize" it come hell or high water, even if the results were inconsistent. It's also obvious that he was working off a set of data that had already been "normalized" by a previous programmer, and that programmer had left little in the way of documentation and certainly nothing to give anyone confidence that the data had been "normalized" in an appropriate way.
This really is an incredible thing to read. I can’t imagine trying to work in those conditions.
That work is more than sloppy at best. It wouldn’t have passed any independent audit or review that I’ve ever seen conducted.
Back to precip, it seems the variability is too low. This points to a problem with the percentage anomaly routines.
See earlier escapades - will the Curse of Tim never be lifted?
A reminder. I started off using a 'conventional' calculation:
absgrid(ilon(i),ilat(i)) = nint(normals(i,imo) +
     * anoms(ilon(i),ilat(i)) * normals(i,imo) / 100)

which is: V = N + A*N/100
This was shown to be delivering unrealistic values, so I went back to anomdtb to see how the anomalies were constructed in the first place, and found this:
DataA(XAYear,XMonth,XAStn) = nint(1000.0*((real(DataA(XAYear,XMonth,XAStn)) / &
     real(NormMean(XMonth,XAStn)))-1.0))

which is: A = 1000*((V/N)-1)
So, I reverse engineered that to get this: V = N(A+1000)/1000
And that is apparently also delivering incorrect values. Bwaaaahh!!
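For anyone trying to follow the arithmetic in those excerpts: the two formulas use different scale factors for the anomaly A. The anomdtb code builds A as a per-mille quantity (scale 1000), while the "conventional" gridding formula treats A as a percentage (scale 100). Here is a small, hypothetical Python sketch (not CRU code, just an illustration of the mismatch) showing that the reverse-engineered formula round-trips correctly while the percentage formula overstates the anomaly tenfold:

```python
def make_anomaly(v, n):
    # From anomdtb: A = 1000*((V/N) - 1)  -- a per-mille anomaly
    return round(1000.0 * (v / n - 1.0))

def reconstruct_permille(a, n):
    # Reverse-engineered formula: V = N*(A + 1000)/1000
    return n * (a + 1000) / 1000.0

def reconstruct_percent(a, n):
    # "Conventional" gridding formula: V = N + A*N/100
    # -- only correct if A is a *percentage* anomaly
    return n + a * n / 100.0

n = 50.0                 # climatological normal (e.g. mm of precip)
v = 60.0                 # observed value, 20% above normal
a = make_anomaly(v, n)   # 200 per mille

print(reconstruct_permille(a, n))  # -> 60.0  (round-trips correctly)
print(reconstruct_percent(a, n))   # -> 150.0 (anomaly applied at 10x scale)
```

So the reverse-engineered reconstruction is algebraically consistent with how the anomalies were built; if it was still delivering incorrect values, the problem presumably lay elsewhere in the pipeline.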
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.