"Global Warming" SCAM - Hack/Leak FLASH in forum [Ticker]
Posted on 11/25/2009 10:13:48 PM PST by grey_whiskers
Ok.... Who is Tim Mitchell? Did he die or something? There's a very disturbing "HARRY_READ_ME.txt" file in documents that APPEARS to be somebody trying to fit existing results to data and much of it is about the code that's here. I think there's something very very wrong here...
7. Removed 4-line header from a couple of .glo files and loaded them into
Uhm... So they don't even KNOW WHAT THE ****ING DATA MEANS?!?!?!?!
What dumbass names **** that way?!
Talk about cluster****. This whole file is a HUGE ASS example of it. If they deal with data this way, there's no ****ing wonder they've lost **** along the way. This is just unbelievable.
And it's not just one instance of not knowing what the hell is going on either:
The deduction so far is that the DTR-derived CLD is waaay off. The DTR looks OK, well
I've only actually read about 1000 lines of this, but started skipping through it to see if it was all like that when I found that second quote above somewhere way down in the file....
CLUSTER.... ****. This isn't science, it's gradeschool for people with big data sets.
It's justifiably immoral to try to deal in a moral fashion with an immoral entity.
Last modified: 2009-11-21 01:43:45 by asimov
Click through to the link: and read the comments, particularly from the poster "Asimov". The good stuff starts on page 13.
There are scads of quotes which literally had my mouth hanging open.
Examples include: -- an admission that an attempt to replicate THEIR OWN PUBLISHED DATA came out to within half a degree (!!)
These are very promising. The vast majority in both cases are within 0.5 degrees of the published data. However, there are still plenty of values more than a degree out.
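The replication check quoted above (regenerated values versus the published series, counting how many land within a tolerance) is simple to express. This is a hedged sketch with invented numbers, not CRU's actual test or data:

```python
# Hypothetical sketch of the comparison described above. All values invented.
published   = [10.2, 11.5,  9.8, 12.1, 10.9, 11.0]
regenerated = [10.4, 11.1, 10.0, 13.4, 10.8, 11.3]

diffs = [abs(p - r) for p, r in zip(published, regenerated)]
within_half = sum(d <= 0.5 for d in diffs) / len(diffs)  # "within 0.5 degrees"
over_one    = sum(d > 1.0 for d in diffs) / len(diffs)   # "more than a degree out"

print(f"{within_half:.0%} within 0.5 deg, {over_one:.0%} more than 1 deg out")
# -> 83% within 0.5 deg, 17% more than 1 deg out
```

The point of the quote is that even a "very promising" result by this measure still leaves plenty of values more than a degree away from what was published.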
-- an admission that some of their codes EXPLICITLY give incorrect results:
The IDL gridding program calculates whether or not a station contributes to a cell, using.. graphics. Yes, it plots the station sphere of influence then checks for the colour white in the output. So there is no guarantee that the station number files, which are produced *independently* by anomdtb, will reflect what actually happened!!
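For contrast, the station-to-cell assignment the quote describes (plotting a circle and scanning the output for white pixels) can be computed directly with arithmetic. A minimal sketch, with invented coordinates and an assumed fixed radius of influence (the real CRU code uses correlation decay distances; `radius_km` here is a placeholder):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def station_contributes(station, cell_centre, radius_km=450.0):
    """True if the station's sphere of influence reaches the cell centre.

    radius_km is an assumption for illustration, not CRU's actual value.
    """
    return haversine_km(*station, *cell_centre) <= radius_km

print(station_contributes((52.6, 1.3), (52.5, 0.5)))    # nearby station
print(station_contributes((52.6, 1.3), (40.0, -75.0)))  # far away
```

Done this way, the gridding result is deterministic and independent of any graphics output, so a separately produced station list could actually be checked against it.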
-- explicit admission that they are recreating a cloud correlation function from the year 2000 from scratch because both the original correlation data file and construction file have been lost:
; program to construct cloud correlation coefficients (with DTR)
; method approximately follows New et al 2000
; this program is required because Mark New has lost both
; the correlation data file, and construction files
; written by Tim Mitchell 10.01.03
-- a plaintive cry for help that they don't even know which files they are looking *at*:
So.. we don't have the coefficients files (just .eps plots of something). But what are all those monthly files? DON'T KNOW, UNDOCUMENTED. Wherever I look, there are data files, no info about what they are other than their names. And that's useless.. take the above example, the filenames in the _mon and _ann directories are identical, but the contents are not. And the only difference is that one directory is apparently 'monthly' and the other 'annual' - yet both contain monthly files.
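An undocumented tree like the one described can at least be inventoried mechanically. A hedged sketch (the directory names are hypothetical stand-ins for the `_mon` and `_ann` directories mentioned) that flags same-named files whose contents differ:

```python
import hashlib
from pathlib import Path

def checksum(path, algo="md5"):
    """Hash a file's contents so identically named files can be compared."""
    return hashlib.new(algo, Path(path).read_bytes()).hexdigest()

def compare_trees(dir_a, dir_b):
    """Return the same-named files whose contents differ between two dirs."""
    a, b = Path(dir_a), Path(dir_b)
    shared = {p.name for p in a.iterdir() if p.is_file()} \
           & {p.name for p in b.iterdir() if p.is_file()}
    return sorted(name for name in shared
                  if checksum(a / name) != checksum(b / name))

# Hypothetical usage against the directories described above:
# for name in compare_trees("cld_mon", "cld_ann"):
#     print("same name, different contents:", name)
```

It would not say what the files *are*, but it would have caught "identical filenames, different contents" in seconds instead of leaving it to be discovered by hand.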
(and in a similar vein, this gem):
Then - comparing the two candidate spc databases:
I find that they are broadly similar, except the normals lines (which
both start with '6190') are very different. I was expecting that maybe
the latter contained 94-00 normals, what I wasn't expecting was that
they are in % x10 not %! Unbelievable - even here the conventions have
not been followed. It's botch after botch after botch. Modified the
conversion program to process either kind of normals line.
Decided to go with the 'spc.94-00.0312221624.dtb' database, as it
hopefully has some of the 94-00 normals in. I just wish I knew more.
Conversion was hampered by the discovery that some stations have a mix
of % and % x10 values! So more mods to Hsp2cldp_m.for. Then conversion,
producing cldfromspc.94000312221624.dtb. Copied the .dts file across
as is, not sure what it does unfortunately (or can't remember!).
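The mixed-units botch Harry describes (normals stored sometimes as % and sometimes as % x10, even within one station record) is the kind of thing a defensive converter has to guess at. A hedged sketch of one plausible heuristic, not the actual logic in Hsp2cldp_m.for:

```python
def normalise_cloud_percent(values):
    """Coerce a station's cloud normals to plain percent (0-100).

    Heuristic (an assumption for illustration, not CRU's rule): cloud
    cover cannot exceed 100%, so any value above 100 is taken to be
    % x10 and divided back down. A record mixing both conventions is
    handled value by value.
    """
    out = []
    for v in values:
        if v > 100:          # looks like % x10
            v = v / 10.0
        if not 0 <= v <= 100:
            raise ValueError(f"cloud percent out of range after conversion: {v}")
        out.append(v)
    return out

print(normalise_cloud_percent([63, 710, 88, 1000]))  # -> [63, 71.0, 88, 100.0]
```

The obvious weakness is that a genuine value like 95% is indistinguishable from a % x10 value of 9.5%, which is exactly why mixing conventions in one file is unrecoverable without documentation.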
And (to my mind) the coup de grâce:
..knowing how long it takes to debug this suite - the experiment
endeth here. The option (like all the anomdtb options) is totally
undocumented so we'll never know what we lost.
22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software
suites - let's have a go at producing CRU TS 3.0! since failing to do that will be the
definitive failure of the entire project..
Remember. The science is "settled".
Yep. Down at the bottom of the Erlenmeyer flask.
"If you're not part of the solution, you're part of the precipitate."
For once, I am literally *speechless*.
Asimov’s comments were pretty good.
Or the mice ate a few....
....incoming...attention...breaker/breaker.....tag line adjustment....
It’s all just mind-boggling..... a first-semester college freshman is expected to be more competent and rigorous than this. These ass-clowns want to run the whole world and they can’t even run their little 3rd-rate “climate research” center with any competence and honesty!!
Don't know if you've seen this comment (below) in the HARRY_READ_ME files, but he sure does sound incredibly frustrated, noting the hopeless state of our databases.
I'm no programmer, and I've just been browsing around a bit in the comments included in the HARRY_READ_ME files posted at the links below, but this statement of despair does seem to indict the whole project of trying to make any scientific use of this incoherent mess of badly recorded and often undocumented data. (At another point the commenter notes that he hates this project because of the spaghetti mess of garbage data and bad code that has been dumped in his lap.)
**** OH #### THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm
hitting yet another problem that's based on the hopeless state of our databases. There is no uniform
data integrity, it's just a catalogue of issues that continues to grow as they're found. ****
Even without any of this, the way to spot a scam is simple: does it take away your money and your freedom?
You know what alarmist thingy I forgot?
The OZONE LAYER.
I remember reading that by now, our skin would be burning off our face and blistering when we left the house. Whatever happened with the Ozone Layer scare?
It went away when Clinton was elected.
But, but... Al Gore and Ed Begley, Jr., both said the debate is over.
I wish they’d quit *****ing out the cuss words - they’re the only ones I understand.
First of all, you are apparently right. Someone called Tim Mitchell was working there on his PhD, and he wrote most of that "code." Once he got the PhD he walked out the door, never to be seen again - and that's where Harry comes into the spotlight. Harry was tasked with reverse-engineering Tim's mess, and he couldn't make any sense of it. No surprise, if the "settled science" that the CRU published was based on the PhD work of a student - work that was never even checked, let alone peer-reviewed.
I worked with a few scientists, and they are typically awful programmers. This is because they never studied computer science. They may be geniuses in their area of expertise (not the case here, likely) but they are pathetic in implementation of their wondrous algorithms. They write spaghetti code as if there is no tomorrow, and in many cases that's true - they get their degrees and move on. They never need to worry about version control, documentation, maintenance - none of that usual stuff that is *required* for any production software.
So I understand how it all happened. There was no management in the whole process, ever. Stations were sending monthly data in similar but minutely different formats; the disks with reports were not filed but just thrown into desk drawers until they were needed. There were no backups; there was no established process. Your average car mechanic has a better-run computer - he knows that if his database of customers is lost he'd be in trouble. But those guys just didn't care. All they wanted was the "correct" results, and Tim worked hard to produce them. And when you have millions of measurements over tens of years, that's easy - especially if you don't allow anyone to check your work. Tim's boss knew what was up, and it was his task to make sure nobody got to see the data and the code - until now.
So when all these files started coming in, people like Tim, for lack of formal CS training, screwed it up big time. Results were erroneous. They noticed, and Tim added a layer of patches to "fix" the problem - which only made it worse, so they had to patch again and again. At some point they just inserted a huge fudge factor to fit the measurements to the politics. And when it was time to publish, they ran the program, in its sorry state, one last time and published whatever came out. And it was all garbage, of course.
So when Harry was given this mess to sort out, it's no surprise he couldn't recreate the published results. How could he, when there wasn't a single written note on how to do it? The programs had been changed many times since then, with no record of what was changed or why. On top of that, some data files are simply lost, others are mislabeled, and not a single person still there knows how Tim did what he did. Probably even Tim couldn't have repeated the calculations the next day, so disorganized was the whole effort. Professional programmers know very well how to avoid such disasters - by using version control software - but obviously the CRU geniuses were too far above all that mundane stuff.
Is it possible to save the situation? Yes, but it's not easy. First, they need to publish *all* the source (station) data they accumulated over time. Scientists need to reconcile this data with other records and agree on how to interpret it. Then it must be converted to a common format and published again. Everyone must agree that this data is reasonable. Only then can scientists start analyzing the new batch of data. This will take a few years, and none of the CRU crew should be involved - they can't be trusted.
Climategate: 'Greatest scandal in modern science'...
Call for Congressional investigation...
Paper: Junk science exposed among climate-change believers..
Obama: 'Step closer' to climate deal...
Three leading scientists who on Tuesday released a report documenting the accelerating pace of climate change said the scandal that erupted last week over hacked emails from climate scientists is nothing more than a "smear campaign" aimed at sabotaging December climate talks in Copenhagen.
"We're facing an effort by special interests who are trying to confuse the public," said Richard Somerville, Distinguished Professor Emeritus at Scripps Institution of Oceanography and a lead author of the UN IPCC Fourth Assessment Report.
Dissenters see action to slow global warming as "a threat," he said.
The comments were made in a conference call for reporters.
The scientists - Somerville, Michael Mann of Penn State, and Eric Steig of the University of Washington - were supposed to be discussing their new report, the Copenhagen Diagnosis, a dismal update of the UN IPCC's 2007 climate data by 26 scientists from eight nations.
Instead they spent much of the time defusing the hacker controversy, known in the media as "Climate Gate."
Remember ALAR? (The apple growth-regulator scare. Bogus.)
Once he gets his "climate deal" and the enabling legislation, none of the chicanery and incompetence and ideological "science" matters any more. The NYT and the Networks can run it as the Scandal Of The Century and everyone can know but it won't matter any more. The US will be a Command Economy.
You, grey_whiskers? If so, it’s time we all sat down and took a good hard look at ourselves. Thanks, and have a great Thanksgiving.