Posted on 08/31/2003 6:35:30 PM PDT by Pikamax
Not just warmer: it's the hottest for 2,000 years
Widest study yet backs fears over carbon dioxide
Ian Sample, science correspondent, The Guardian, Monday September 1, 2003
The earth is warmer now than it has been at any time in the past 2,000 years, the most comprehensive study of climatic history has revealed. Confirming the worst fears of environmental scientists, the newly published findings are a blow to sceptics who maintain that global warming is part of the natural climatic cycle rather than a consequence of human industrial activity.
Prof Philip Jones, a director of the University of East Anglia's climatic research unit and one of the authors of the research, said: "You can't explain this rapid warming of the late 20th century in any other way. It's a response to a build-up of greenhouse gases in the atmosphere."
The study reinforces recent conclusions published by the UN's intergovernmental panel on climate change (IPCC). Scientists on the panel looked at temperature data from up to 1,000 years ago and found that the late 20th century was the warmest period on record.
But the IPCC's report was dismissed by some quarters in the scientific community who claimed that while the planet is undoubtedly warming, it was warmer still more than a thousand years ago. So warm, in fact, that it had spurred the Vikings to set up base in Greenland and led to northern Britain being filled with productive vineyards.
To discover whether there was any truth in the claims, Prof Jones teamed up with Prof Michael Mann, a climate expert at the University of Virginia, and set about reconstructing the world's climate over the past 2,000 years.
Direct measurements of the earth's temperature do not exist from such a long time ago, so the scientists had to rely on other indicators of how warm - or not - the planet was throughout the past two millennia.
To find the answer, the scientists looked at tree trunks, which keep a record of the local climate: the rings spreading out from the centre grow to different thicknesses according to the climate a tree grows in. The scientists looked at sections taken from trees that had lived for hundreds and even thousands of years from different regions and used them to piece together a picture of the planet's climatic history.
The scientists also studied cores of ice drilled from the icy stretches of Greenland and Antarctica. As the ice forms, sometimes over hundreds of thousands of years, it traps air, which holds vital clues to the local climate at the time.
"Drill down far enough and you could use the ice to look at the climate hundreds of thousands of years ago, but we just used the first thousand metres," said Prof Jones.
The scientists found that while there was not enough good data to work out what the climate had been like in the southern hemisphere over that period, they could get a good idea of how warm the northern hemisphere had been.
"What we found was that at no point during those two millennia had it been any warmer than it is now. From 1980 onwards is clearly the warmest period of the last 2,000 years," said Prof Jones.
Some regions may well have been fairly warm, especially during the medieval period, but on average, the planet was a cooler place, the study found.
Looking back over a succession of earlier centuries, the temperature fluctuated slightly, becoming slightly warmer or cooler by 0.2C in each century. The temperature has increased by at least that amount in the past 20 or so years, the scientists report in the journal Geophysical Research Letters.
"It just shows how dramatic the warming has been in recent years," said Prof Jones.
Scientists who do not believe that carbon dioxide is driving climate change are unlikely to run up the white flag just yet, however.
Dr Sallie Baliunas at the Harvard College Observatory in Massachusetts, for example, maintains that the recent warming could all be down to changes in the strength of sunlight falling on the planet.
She concluded that during the 20th century, earth went through a cycle of natural climatic change. According to her data, from 1900 to 1940 the planet warmed slightly, then cooled from 1940 until 1970, then warmed up again from 1970 onwards. Given that 80% of the world's carbon dioxide emissions have been produced since 1940, the expected effect, if carbon dioxide was causing global warming, would be higher temperatures not lower, she said.
Dr Baliunas's data also concluded that the period of warming between 1900 and 1940 must have been due to natural causes, most likely increased sunlight hitting the earth's surface, since carbon dioxide emissions were negligible at the time. The evidence, she said, pointed to variations in the sun's brightness being the cause of the planet's warming up, not carbon dioxide.
But other climatologists have welcomed the new study as the most conclusive evidence to date that the increase in temperature is a result of human activity.
"The importance of the finding is that it shows there's something going on in the climate system that's certainly unusual in the context of the last 2,000 years, and it's likely that greenhouse gases are playing the major role," said Prof Chris Folland of the Met Office's Hadley Centre. "If you look at the natural ups and downs in temperature, you'll find nothing remotely like what we're seeing now."
Cold water on climate claims
Not everyone agrees that climate change is largely driven by human activity. Some believe the warming the planet is experiencing now is part of a natural cycle. Historical anecdotes are sometimes used to support their case, but the new study debunks these claims.
· There were vineyards in the north of Britain
There were indeed vineyards in Britain in the 10th and 11th centuries, but only 50 to 60. There are now more than 350 in this country, with some as far north as Leeds.
· The Vikings went to Greenland
In AD980, Erik the Red and his crew headed from Iceland to Greenland, but it wasn't for the good weather. Erik had been kicked out of Iceland for murder so he took his crew westward where, they were told, they would find land.
· The Thames used to freeze over more often
The river's tendency to freeze over frequently in the 16th and 17th centuries is often cited as evidence that the climate used to be more erratic. But, according to the new study, the major cause was the original London Bridge, completed in the 13th century, which had very small spans between its supports for the Thames to run through. The result was that the river was tidal only as far as the bridge, causing the water to freeze over. When the bridge was rebuilt to a different design in the 1820s, the water flowed more easily and therefore became less prone to ice.
The more we argue against the notion of global warming due to increased CO2, the more we fuel the debate; if we were smart, we would just dismiss it for the unsupportable nonsense it is.
Funny, my newspaper has a picture of a section of I-35 washed out in Emporia, Kansas from rain Saturday and Sunday, and quoted a local meteorologist as saying the area received 8-12 inches in the 24-hour period; I guess one should be careful what one wishes for.
CO2 greenhouse might get you 1-2 watts per square meter, if it doubles. The power needed to maintain a body at a given temperature, indefinitely, goes as the 4th power of the absolute temperature. So a 5C rise needs roughly a 7% increase in the total power; about 4% for just a 3C change.
The base to which you need to apply this increment is the total power operating at the surface to maintain the present average temperature, roughly 18C or 291K. That is significantly higher than direct solar flux, largely due to water vapor greenhouse. (Without water vapor greenhouse, the temperature would be about 40C below zero. Look at the outer atmosphere, up above where jetliners fly.) It is going to be on the order of 500W per square meter. If I'm off by a factor of 2 it won't matter for the conclusion.
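The arithmetic above is easy to check. Here is a minimal Python sketch using the post's own rough figures (a 291K surface mean and ~500W per square meter of total surface power) rather than measured values:

```python
# Back-of-the-envelope check of the 4th-power scaling argument.
# Both baseline figures below are the post's rough assumptions.
T_BASE = 291.0   # K, roughly the 18C surface mean cited above
P_BASE = 500.0   # W/m^2, the post's rough total surface power

def extra_power(delta_t):
    """Extra power needed to hold the surface delta_t warmer,
    using the Stefan-Boltzmann P ~ T^4 relation."""
    return P_BASE * (((T_BASE + delta_t) / T_BASE) ** 4 - 1)

def warming_from(extra_p):
    """Linearized inverse: dP/P ~ 4*dT/T for small changes."""
    return T_BASE * extra_p / (4 * P_BASE)

for dt in (3.0, 5.0):
    p = extra_power(dt)
    print(f"+{dt:.0f}C needs ~{p:.0f} W/m^2 extra ({p / P_BASE:.1%} more power)")

print(f"1 W/m^2 of extra forcing buys ~{warming_from(1.0):.2f} C")
```

On these assumed numbers a 3C rise needs roughly 20W per square meter, a 5C rise roughly 35W, and a 1W forcing buys on the order of 0.15C, which is the magnitude this argument turns on.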
CO2 is a trace gas, about 0.04% of the atmosphere. Its contribution to total greenhouse is a correction term to that provided by water in the atmosphere. The atmosphere itself is the 3rd term contributing to mean temperature by size, behind sun output (1st, biggest, and a warmer) and rotation (2nd, the biggest cooler). Modest changes - a factor of 2 or less, not orders of magnitude - in trace elements in a system don't easily cause large changes.
The observed past CO2 change, as opposed to a hypothetical predicted doubling in the future, is enough to account for about a 0.2C increase in world mean temperature. To get the standard 3-5C prediction requires more like 20-35W per square meter. Please tell me where the extra power is supposed to come from.
Is the earth supposed to turn black as pitch? Not much to be had that way - it is already close to a black body, with an overall absorptivity around .85 (and only a portion of incoming solar flux is subject to reflection anyway - basically the portion of the visible that gets to the ground). Are changes in cloud cover supposed to have runaway effects? Well, they are modest net coolers, while observed changes since CO2 took off are on the order of a percent or less, with the wrong sign.
Please also note that the 5C prediction has not changed in 100 years, since global warming was first proposed, as an explanation of the phenomena of ice ages. Which are now explained by an entirely different mechanism (orbital eccentricity variation). Leaving them overexplained, to those who think modest CO2 changes readily cause 5C mean temperature variations.
It became a giant epicycle hunt once people pointed out they didn't have enough power; they have been searching for 10-fold amplifiers hidden somewhere in climate complexity ever since. Every time they propose something, as mere handwaving, the physicists go look and it either isn't nearly big enough, has the wrong sign, or both.
It is, besides, wildly implausible that the atmospheric system contains 10-fold amplifiers in either direction that can be touched off by tiny power input changes on the order of 1-2W per sq. meter. Because if it did, its time series behavior would look something like a driven oscillator, with violent amplitude swings.
But we in fact see tons of variation, much larger, on short time scales - by location, daily, seasonal, solar over 11-year cycles - without triggering any magic amplifier and running away to a completely frozen ice ball or an iceless sauna. Meanwhile the longer term record looks pretty darn stable, with very modest movements. As you'd expect from a global *mean*, which is of course a giant average.
Noise in the details but a broadly stable overall average is exactly what you'd expect if there are either no significant feedbacks, or the ones that exist are mostly dampeners rather than amplifiers. There can be any number of feedbacks whose follow-on effect is an order of magnitude smaller than the triggering signal, and doubtless there are. There might even be a few about as large as the signal. But feedbacks that boost an input 10 or 20 times are wildly implausible in a system as broadly stable as earth mean temperature.
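The amplifier arithmetic can be made concrete with a toy linear feedback - the textbook gain formula, not a climate model. The 0.2C direct-warming figure is the one used later in this post:

```python
# Toy linear feedback: a direct warming dt0 is amplified by a feedback
# that adds f degrees of further warming per degree of warming.
# Summing the series dt0 * (1 + f + f^2 + ...) gives dt0 / (1 - f).

def total_response(dt0, f):
    """Equilibrium warming after feedback; diverges as f approaches 1."""
    if f >= 1:
        raise ValueError("f >= 1 means runaway: no finite equilibrium")
    return dt0 / (1 - f)

dt0 = 0.2  # C, the direct CO2 warming figure argued for in this thread
for f in (0.0, 0.5, 0.9):
    print(f"feedback f={f:.1f}: gain x{1 / (1 - f):.0f}, "
          f"total {total_response(dt0, f):.1f} C")
```

Turning 0.2C into a multi-degree headline figure needs f near 0.9 - and a feedback that strong would amplify every other 1-2W perturbation by the same factor, which is exactly the implausibility being argued here.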
And why would these entirely hypothetical amplifiers selectively respond only to CO2 changes? If the mechanism is supposed to be that the additional temp from a modest 1-2W of CO2 greenhouse causes ABC, why don't the roughly 0.1% swings in total solar power output over sunspot cycle time scales set off the same or even bigger changes, every 11 years?
Then there are the oceans, which are the surface system's "heat sink". To maintain a higher temperature indefinitely requires a continual power term, because anything hotter glows brighter, re-exporting its higher energy (as infrared photons etc). But in addition, you first have to actually warm all that water to the higher temperature level, which requires an integral of power over a time sufficient to supply the net energy to go from cooler to hotter - all the while against the "restoring force" of re-radiation of gained energy via brighter "glow".
When we go look at the oceans, they aren't getting appreciably hotter. Most data show no change at all, some find 0.01 to 0.1C increase over the course of 50 years. At that rate, the heat sink could go on absorbing the net surplus energy for a thousand years, before the average temperature has actually responded to the higher power and reached its higher equilibrium again.
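The heat-sink timescale can be roughed out. The figures below (total ocean mass, seawater specific heat, a hypothetical 2W per square meter surplus over the whole surface) are round numbers I'm assuming for illustration, not values from the post:

```python
# Rough timescale for the ocean "heat sink" argument. All constants
# are assumed round figures, good to maybe a factor of 2.
OCEAN_MASS = 1.4e21      # kg, approximate total mass of the oceans
SPECIFIC_HEAT = 4.0e3    # J/(kg*K), approximate for seawater
EARTH_SURFACE = 5.1e14   # m^2
SURPLUS = 2.0            # W/m^2, hypothetical net energy imbalance

def years_to_warm(delta_t):
    """Years of the assumed surplus needed to warm the whole ocean by delta_t."""
    energy = OCEAN_MASS * SPECIFIC_HEAT * delta_t   # joules required
    power = SURPLUS * EARTH_SURFACE                 # watts absorbed
    return energy / power / (3600 * 24 * 365)

print(f"Warming the entire ocean 3C takes ~{years_to_warm(3):.0f} years")
```

Even on generous assumptions, soaking a few degrees into the whole ocean takes on the order of centuries, which is the lag being described above.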
Greenhouse can increase global mean temperatures. Past changes in atmospheric concentrations of trace gases might have increased global mean temperatures 0.2C or so. Future ones, on the typical scaremongers' projected trends (double this, triple that, over 50-100 years), might get you 0.5C at the surface eventually, though probably with a long lag before the whole amount shows up, as you try to warm any appreciable depth of the oceans.
Can I explain why, if all this is so, the supposedly reputable scientists working on it do not say so themselves? Sure. They used a linear model, because linear models are much easier to manipulate when you've got scads of variables. They didn't use the 4th power of the temperature for the response to a driving power, even though that is a well understood physical law. I'd guess they made this basic underlying mistake back in the 1970s.
They probably put in 4 times instead, because for small changes that is close enough (1.01^4 ~= 1 + 4x.01). Then they had coefficients in their model to be determined statistically or empirically to give some best fit. Why put in 4xA(i) when A(i) is going to be determined empirically anyway? So drop the 4, and just expect to find an A(i) that is the right amount, 4x the A(i) you were looking for previously. Simpler in the equations, no doubt. But physical nonsense if you then arrive at a conclusion that a 2W power change can lead to an 8W outcome.
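The linear shortcut described here is easy to check numerically - fine for 1% changes, already drifting by 10%:

```python
# The small-change approximation above: for small x,
# (1 + x)**4 is close to 1 + 4*x, so "times 4" can stand in
# for the 4th power, until x stops being small.
for x in (0.01, 0.05, 0.10):
    exact = (1 + x) ** 4
    linear = 1 + 4 * x
    print(f"x={x:.2f}: exact {exact:.4f} vs linear {linear:.4f} "
          f"(error {exact - linear:.4f})")
```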
Back in the 1970s, soon after the climate models were developed, someone pointed out the scale of the power difference, and that it would be equivalent to moving the earth a million or two miles closer to the sun. After a little embarrassment, the climate modelers came back and said: we rejiggered everything, and we are clearly heading for the next ice age. A few years later that prediction went away, and it was back to the old script of a 3-5C, ice-age-scale warming. Had they found a new 20-35W of power? No. My bet is they just rejiggered internal weights derived from data fitting. And in the process, the 4th power law was "forgotten" by their model.
Ever since then (roughly, the early 80s), there has been a noticeable difference between the public explanations peddled by the greens and the actual conversations with the scientists. The greens play on popular misunderstandings of basic physics.
They talk about the 2nd law and imply entropy will build up on earth (the "waxy build up" theory). Nonsense, it is exported to space by infrared light; the earth is not a closed system.
Or they say that if the power in goes "out of balance" with the power out, the earth will warm - implying it will run away warmer indefinitely. Nonsense: as soon as any body gets warmer it automatically shines brighter and re-exports energy. To *keep* a body hotter requires a continual input of energy per unit time, aka a power.
(Prove it to yourself - turn on the electric burner of a stove. It gets hot, but not infinitely hot, if you leave it on. Turn it off, and it does not stay as hot as you got it, but cools rapidly to room temperature. You can feel the infrared pouring off of it if you hold your hand close to it but behind glass.)
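The burner thought experiment can be simulated crudely. Every number below (element area, heat capacity, power) is an illustrative guess, and only radiative exchange with the room is modeled:

```python
# Crude simulation of the burner: with power on, the element warms until
# it radiates as much as it receives; with power off, it cools back
# toward room temperature. All constants are illustrative guesses.
SIGMA = 5.67e-8   # W/(m^2 K^4), Stefan-Boltzmann constant
AREA = 0.03       # m^2, radiating area of the element (guess)
HEAT_CAP = 500.0  # J/K, heat capacity of the element (guess)
T_AMB = 293.0     # K, room temperature
P_IN = 1000.0     # W, burner power when switched on

def step(T, power_in, dt=1.0):
    """One Euler step: net power in minus net radiation changes stored heat."""
    net = power_in - SIGMA * AREA * (T ** 4 - T_AMB ** 4)
    return T + net * dt / HEAT_CAP

T = T_AMB
for _ in range(3600):          # one hour with the burner on
    T = step(T, P_IN)
print(f"on for an hour: ~{T:.0f} K - hot, but finite")
for _ in range(7200):          # two hours after switching it off
    T = step(T, 0.0)
print(f"off for two hours: ~{T:.0f} K - back near room temperature")
```

The element settles where it radiates exactly what it receives, and falls back once the power term is removed - the same point being made about the earth needing a continual power input to stay warmer.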
Among the scientists, on the other hand, everybody knows that they need a power term and a big honking one, and that none of the popular handwaving will do. So they chase epicycles, proposals packed off into the complexities of global climate, where they can allege things without their present interlocutor being an expert on them. Or they can shove their needed amplifier into the gaps, places nobody knows enough about.
Aerosols were going to save them. Then it turned out aerosols were coolers. Did that daunt them? No, certainly not - not accomplished masters of spin. If aerosols are coolants, then without them the world would be even warmer than it is now. So the temp increase from CO2 alone must be 10 times as large, just temporarily "masked" by those saving aerosols.
That this is physical nonsense - because we can actually look at the sky, see the wavelength band of CO2 absorption, and measure the energy in that band on the ground and up in a plane, and the power is about 1W - does not disturb them at all. They just say, "the climate is more sensitive than we thought." (The honest thing would be to admit aerosols cannot possibly support any significant feedback leading to a higher power term.)
We have by now seen this repeated a half a dozen times with a half a dozen candidate explanations. The prediction is fixed. The theory is variable.
And it is not physics. Anyone who thinks otherwise has merely to tell me where he proposes to get the power needed to make the entire earth shine 4-7% brighter indefinitely. I'm waiting.
I've explained to him that past CO2 greenhouse by IPCC numbers can account for 0.2C warming but not more, and a future with double the CO2 change of the past 250 years packed into the next 100 could add slightly under 0.5C more, but not 3-5C. I've asked him for a power budget several times without seeing anything remotely like one.
I've provided him a back of the envelope one myself, with a giant +/- 30% error bar allowed, and shown it leads to the statements above. And that therefore the warming crowd's headline figures are an order of magnitude larger than what their own numbers are telling them. I've asked him to show me where I am wrong by an order of magnitude.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.