Posted on 11/29/2009 7:58:10 AM PST by joinedafterattack
SCIENTISTS at the University of East Anglia (UEA) have admitted throwing away much of the raw temperature data on which their predictions of global warming are based.
It means that other academics are not able to check basic calculations said to show a long-term rise in temperature over the past 150 years.
The UEA's Climatic Research Unit (CRU) was forced to reveal the loss following requests for the data under Freedom of Information legislation.
(Excerpt) Read more at timesonline.co.uk ...
They could probably have saved all that data on one SD card.
You would think that it would not be terribly difficult to write software to transcribe an old digital medium to a new one.
Certainly data compression algorithms run with little human intervention.
-—<>-—<>-—<>-—<>-—<>-—
True to a point... the problem is that “holes” appear in the archived data. Magnetic glitches, formatting problems, re-formatting problems (a column or row was completely or partially shifted by inaccurate read/write of the previous archive record). Archeology (that is what we called it) is often needed, and that may require assistance of someone who was involved with the original record(s) to be reasonably confident of interpretation.
In addition, the ancient mag tape or paper tape machine may require maintenance, and those transistors or vacuum tubes are unavailable, so you have your electronics folks tied up attempting to reconstruct circuits that they have never contemplated. The mechanical problems with the old equipment are usually less stringent, but can be equally frustrating. Imagine how long it would take you to reconstruct the head of a 9-track tape machine... or the logic for verifying multiple checksums of blocks of information gleaned from a CDC (Control Data Corporation) written tape (their computer "words" were 60 bits, if I recall correctly... they certainly were not 64 bits). "Optical recognition" of data is definitely less than 100% accurate, and recognition of optical records that have been corrupted over time by asymmetric shrinkage of film or paper, as well as degradation of the media, exacerbates the problems.
These problems I've seen; some I've personally dealt with. I've got a few data tapes that I'm not sure even one machine left in the world can read. Serious recovery of data archives is not at all trivial. Yes, a lot of automation can be used, but unless you are simply transcribing paper records to digital, the task quickly becomes very daunting.
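To make the CDC point above concrete: 60-bit words don't fall on byte boundaries (fifteen bytes hold exactly two words), so even a perfectly read tape image needs bit-level surgery before it means anything. A toy Python sketch, assuming a plain big-endian bit stream (real CDC tape formats also carried block headers and parity, which this ignores):

```python
def unpack_60bit_words(data: bytes):
    """Split a byte buffer into 60-bit words, most significant bit first.

    Treats the whole buffer as one big-endian bit stream and slices it
    into 60-bit chunks. Trailing bits that don't fill a whole word are
    dropped. Illustrative only: a real tape reader would also have to
    handle record gaps, parity, and checksum blocks.
    """
    nbits = len(data) * 8
    stream = int.from_bytes(data, "big")
    mask = (1 << 60) - 1
    words = []
    for i in range(nbits // 60):
        shift = nbits - 60 * (i + 1)   # position of word i from the top
        words.append((stream >> shift) & mask)
    return words
```

Round-tripping a couple of known 60-bit values through 15 bytes is a quick sanity check that the slicing is right.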
“But, that raw temperature data was probably shared by more than the UEA. They're not the only ones tracking global temperatures.”
Yeah, like... NASA
We still had an IBM keypunch machine when I went to college.
Students walking around with shoe boxes filled with punch cards; what memories.
I can imagine trying to transfer data from those (or maybe I can't).
I was working as a young technician during Gulf War I. After the war the US gov did an investigation into one our customers, a radio manufacturer whose equipment failed in the desert heat. As the lead technician I had to provide all the test data cards for the equipment we sold to the manufacturer in question. I was sweating hard as I tore through the old file cabinets and breathed a heavy sigh of relief when I found the original test data sheets. Long story short, several engineers at the radio manufacturer went to prison for falsifying test data. Seems like an apt punishment for these university profs, but I’m not going to hold my breath.
In principle (given enough knowns and few enough unknowns), yes, the calculations, or at least most of them, can be run backwards. Unfortunately, this is a very complex process to complete. In addition, according to DEFINITE notations in various programs, there were adjustments (fudge factors) applied to only a small portion of the data points. That kind of procedure enormously multiplies the universe of possibilities that has to be investigated, and may well mean a “solution” is impossible to reach.
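A toy illustration of why selective fudge factors kill invertibility: if an unknown offset was applied to an unknown subset of the points, every subset/offset combination yields a different candidate "original" series, and the count grows with 2 to the power of the number of points. The function below is purely hypothetical, just to show the blow-up:

```python
from itertools import combinations

def candidate_originals(adjusted, offsets):
    """Enumerate every 'original' series that could have produced the
    adjusted one, given that some unknown constant offset was applied
    to some unknown subset of points. Illustrative only: real climate
    homogenisation is more complex, which only makes matters worse.
    """
    n = len(adjusted)
    candidates = set()
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            chosen = set(subset)
            for off in offsets:
                original = tuple(
                    v - off if i in chosen else v
                    for i, v in enumerate(adjusted)
                )
                candidates.add(original)
    return candidates
```

With just 3 points and one possible offset there are already 8 consistent "originals" (one per subset); with dozens of stations and several undocumented adjustments, exhaustive back-calculation is hopeless without the notes saying which points were touched.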
The amount of data that was released by the “whistleblower” is ONLY about 60 MB. My lab easily generated 50 MB of RAW DATA in one minute! I'm not including e-mails, computer programs, the many, many iterations of processed data, personal notations, spreadsheets, designs of equipment or programs, financial information, personnel records, and so on and so forth. There is a HUGE amount of simple DATA, E-MAIL and COMPUTER COMMENTS and programs, as well as intermediate calculation steps, that have not been released in this data dump and that really ought to be examined if we truly hope to reconstruct the situation in this AGW community.
Note that I addressed what your comment says about intermediate calculations and data. These have not been included in this whistleblower dump.
I guarantee that MUCH of the needed info resides on computers and backups (we made, and SAVED, daily, weekly, monthly and annual backups) ... in the US. Getting it, and then analyzing what it tells us, is critical, but very, very difficult.
BUT BUT BUT - Ed Begley says we just need to read the peer review articles!
It’s unthinkable that a group of scientists would just throw out their raw data. This would be like Microsoft deciding to just throw out the original DOS code ‘cuz Windows is so much prettier.
This thing just got exponentially uglier.
Global warming/climate change/whatever is dead.
Good point! :-)
One of my husband’s worst ‘curses’ to bad drivers in traffic is “Harvard bound moron”—I always wondered what that means! ;-)
Thanks.
And the shredder. And the fireplace.
The two MMs [McKittrick and McIntyre] have been after the CRU station data for years. If they ever hear there is a Freedom of Information Act now in the UK, I think I'll delete the file rather than send to anyone.
From:
http://pajamasmedia.com/blog/global-warminggate-what-does-it-mean/2/
E-mail from Dr. Phil Jones.
This guy should do hard time.
Suggest you incarcerate the entire bodies, not just the innards.
Just trying to save space in the new building!
I don't think they'll walk very far without hearts or lungs. But without stomachs, they won't get hungry either, so maybe it'll be a wash...
There is another major problem that has not been fully addressed in this thread.
From the story: ...the CRU said: “We do not hold the original raw data but only the value-added (quality controlled and homogenised) data.”
Information from the recently released emails indicates that various types of data massaging were applied by Jones, Mann and others to achieve their predetermined output and predictions.
So, what we now have is
A. no original data exists
B. value added data (what kind of cr@p euphemism is that?), and
C. data that was further modified to make it fit a preconceived outcome, but no possible way to compare the twice-modified data to the original.
So essentially, they have been free to transform or modify the (value-added) data in any direction they want with no constraints, since there are no objective standards (original data) to compare their results to.
We are being served data that has been modified at least twice to get the desired results. Gossip that is furthest from the source is the least reliable.
Our area sent TEN buses and called the local paper to invite them to the loading point - no interest, no story.
Nice work. Will they be embarrassed? Probably not, sad to say.
And some little freebie intern could knock it out in two weeks and be thrilled to death to be a part of it.
Do you believe for one second that the data is lost, destroyed, thrown away, or otherwise unavailable? I don't. They were the only copies extant? Supposedly peer-reviewed “science” with no one else on the planet Earth in possession of this data? Ridiculous.
They could easily fetch the numbers, they just don’t want to.