Posted on 03/24/2007 5:51:38 AM PDT by moneyrunner
UV can only form clouds during daytime
High-level ice clouds, which would be one of the primary results of UV interactions, are persistent through the night and hold atmospheric LW radiation in, much as a blanket does.
I would also suspect those are high clouds, therefore cooling.
You have it just backwards: high clouds, on balance, warm by absorbing LW radiation and re-emitting it back into the atmosphere, thus acting as a blanket. The partially reflecting mirror of high CO2 clouds and the high-level sulfate clouds of Venus are extreme examples of the class.
Mid-level clouds tend to balance reflection against absorption/re-emission, neither warming nor cooling on net.
Low clouds tend to cool on net, by reflecting visible wavelengths and absorbing/re-emitting LW back up towards space.
The flip side is that UV-induced clouds will be diffuse and uniform, and diffuse clouds are more cooling than concentrated ones.
Sorry, that one is just flat invalid: diffusion tends to increase the path length LW radiation must follow to find its way out of the atmosphere, thus acting to warm more than cool. Scattering away from the direct vertical tends to trap radiation in the atmosphere, both through refractive effects and because the longer path through the atmosphere assures any LW is more efficiently captured.
So what's the bottom line? Beats me, but a decent model should be able to figure it out.
Ahmmm, there is no such animal as a decent model for clouds, which, along with spatial resolution, is one of the greatest sources of error in current climate models.
A climatologist last week stated that the rise of CO2 follows the temperature increase. You have to actually look at the appropriate chart to see this, rather than assuming that the CO2 increase causes the temperature increase, which is something a non-scientist would have trouble doing.
The greatest problem with ANYTHING relying on the GCMs is clouds/aerosols, which even the alarmist scientists consider to be their weakness. The error bars they put on those inputs are bigger than their inputs!
Consider: Assume the Sun were constant, not increasing in radiance since 1900. To account for ALL the temperature change on the earth since 1900, a 1% decrease in the total amount of cloudiness would be sufficient (call the clouds an average albedo of 0.60, and the earth 0.30), allowing an additional 1% of the surface to absorb its full complement of solar energy in 2000 versus 1900. I'm not going to do the calculations right now, but this would easily result in more than sufficient additional watts of "forcing". Look up in the sky, and tell me how to measure within 1% what the cloudiness is over the course of a week or two.
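For the curious, here is a back-of-the-envelope version of that calculation in Python. The cloud albedo of 0.60 comes from the numbers above; the clear-sky albedo of 0.15 is my own assumed value, so treat the result as illustrative only:

# Rough forcing from a 1% (absolute) decrease in cloud fraction.
S = 1366.0                  # solar constant, W/m^2
insolation = S / 4.0        # global average insolation, ~341.5 W/m^2
albedo_cloud = 0.60         # average cloud albedo (from the text above)
albedo_clear = 0.15         # assumed clear-sky albedo (my assumption)
delta_cloud = 0.01          # 1% of the surface goes from cloudy to clear

# That 1% of the surface now reflects at 0.15 instead of 0.60:
extra_flux = delta_cloud * insolation * (albedo_cloud - albedo_clear)
print(f"extra absorbed flux: {extra_flux:.2f} W/m^2")   # ~1.5 W/m^2

That ~1.5 W/m^2 is in the same ballpark as the forcing usually attributed to all added greenhouse gases since pre-industrial times.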
I'm not saying this is the mechanism. It is only a suggestion of how little the temperature rise seen could have come about with a CONSTANT sun, which it's not.
The GCMs can NOT be counted on to predict temperatures when they can't even be given correct input values. They can NOT figure out within 1% what the cloudiness or average cloudiness will be. They can not figure out within 1% the physical processes that produce clouds. They can not even figure out what the effect of, say, exactly 50% cloudiness would be.
Then it becomes the old computerman's saw: Garbage In, Garbage Out.
The GCMs throw out more garbage than anything else on earth ever has... and they throw it out very, very fast.
Assuming no albedo and no GH effect, here is the temperature at which the earth's emitted radiation balances the incident radiation in your chart (1366 W/m^2 now, 1364.5 in 1900). The Stefan-Boltzmann constant is 5.67x10^-8, so:
4th root of ((1364.5 / 4) / (5.67 x (10^(-8)))) = 278.50476
4th root of ((1366 / 4) / (5.67 x (10^(-8)))) = 278.581269
or a 0.03 percent temperature increase from the increased irradiance, with the rest from other sources.
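For anyone who wants to check the arithmetic, the same computation as a short Python script (values taken straight from the lines above):

# Effective blackbody temperature: T = (S/4 / sigma)^(1/4)
SIGMA = 5.67e-8             # Stefan-Boltzmann constant, W/(m^2 K^4)

def t_eff(irradiance):
    # radiative equilibrium temperature, no albedo, no greenhouse
    return (irradiance / 4.0 / SIGMA) ** 0.25

t_1900 = t_eff(1364.5)      # 278.505 K
t_now = t_eff(1366.0)       # 278.581 K
print(f"{t_1900:.3f} K -> {t_now:.3f} K")
print(f"increase: {100.0 * (t_now - t_1900) / t_1900:.3f}%")   # ~0.03%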
Yes.
It is clear in the Vostok ice core data that CO2 LAGS temperature increases over the last several hundred thousand years.
Of course the alarmists have decided that they only have to say "but then the additional CO2 is a positive feedback and raises the temperature ever higher." But then they have a problem, because the temperature stops going up, the added CO2 stops raising the temperature ever higher, and they have to come up with another cute way to "explain away" that little beauty.
I believe that the blog here has it pretty well nailed as far as CO2 and temperature, though the details may be a little rocky:
http://www.rocketscientistsjournal.com/2006/10/co2_acquittal.html
I have to run now... have fun, all.
With 70% of the incident radiation absorbed (an albedo of 0.3), the equation in my previous post becomes:
4th root of (((1364.5 / 4) * 0.7) / (5.67 x (10^(-8)))) = 254.745859
With 70.7% absorbed (an albedo of 0.293) it is
4th root of (((1364.5 / 4) * 0.707) / (5.67 x (10^(-8)))) = 255.380349
which is about the observed increase. Your hypothesis works out to the right numbers, which shows even more clearly that the change in irradiance is pretty meaningless.
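A sketch of that calculation in Python, with the absorbed fraction made explicit (note that multiplying by 0.7 means 70% absorbed, i.e., an albedo of 0.3):

SIGMA = 5.67e-8             # Stefan-Boltzmann constant, W/(m^2 K^4)

def t_eff(irradiance, absorbed):
    # equilibrium temperature with a fraction of sunlight absorbed
    return (irradiance / 4.0 * absorbed / SIGMA) ** 0.25

t_a = t_eff(1364.5, 0.700)  # 254.746 K (albedo 0.300)
t_b = t_eff(1364.5, 0.707)  # 255.380 K (albedo 0.293)
print(f"delta-T: {t_b - t_a:.2f} K")   # ~0.63 K, roughly the observed rise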
Yeah... ignore water vapor, and albedo, and it's pretty cold that way, ain't it?
Obviously, something else must be going on that is not accounted for by Stefan-Boltzmann, huh? If not, it would be mighty cold on this ol' rock.
Let me propose one of the mechanisms that might, just possibly, just maybe, respond slightly more linearly than SB: water vapor, and one of its effects, clouds.
That the % increase in temperature so closely matches the % increase in the irradiance of the Sun for over a thousand years might just serve as a clue for us, if we weren't so "intelligent".
which is about the observed increase.
Hmm,
Interesting: overall cloud cover decreased a total of 4% from 1987 to 2000. It looks to be reversing thereafter as we peak out in solar activity.
Overall net global albedo in the same period decreased about 1% with a similar reversal since 2000.
http://isccp.giss.nasa.gov/zFD/an9090_ALB_toa.gif
The large positive anomalies in global albedo are due to the El Chichón (1982) and Mt. Pinatubo (1991) volcanic eruptions injecting sulfate aerosols into the upper atmosphere, increasing reflectivity (i.e., albedo).
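For scale, a quick Python conversion of that ~1% albedo change into watts. I am assuming the 1% is relative to a mean planetary albedo of 0.30 (the graph itself would settle whether it is a relative or absolute change), so treat this as a rough sketch:

# Forcing implied by a small albedo change: dF = d(albedo) * S/4
S = 1366.0
insolation = S / 4.0                 # ~341.5 W/m^2
delta_albedo = 0.01 * 0.30           # reading "1%" as a relative change
print(f"{delta_albedo * insolation:.2f} W/m^2")   # ~1.0 W/m^2 more absorbed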
Temperature in the period:
http://data.giss.nasa.gov/gistemp/graphs/Fig.E_lrg.gif
Blue semicircles mark La Niñas, red rectangles mark El Niños, and green triangles mark large volcanic eruptions.
The marked volcanic eruptions are Surtsey (1963), El Chichón (1982), and Pinatubo (1991).
A point of fact that Anthropogenic Global Warming alarmists tend to overlook is that cloud cover has been decreasing, increasing the solar flux incident at the surface and warming the surface more than any effect CO2 can have.
http://isccp.giss.nasa.gov/projects/browse_fc.html
"The overall slight rise (relative heating) of global total net flux at TOA between the 1980's and 1990's is confirmed in the tropics by the ERBS measurements and exceeds the estimated climate forcing changes (greenhouse gases and aerosols) for this period. The most obvious explanation is the associated changes in cloudiness during this period. The variations of the total net flux at the surface reflect the variations in the upwelling LW flux for the most part."
For general background articles, also see:
The black body radiation of earth is proportional to the temperature to the 4th power, so the fractional increase in incoming radiation (and outgoing, since earth is in equilibrium) goes as the 4th power of the fractional increase in temperature. Or IOW, the temperature increase is proportional to the fourth root of the radiation increase. I'm sure you are aware of this, so I am clueless as to why you are saying they are equal.
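In numbers, for small changes: since F is proportional to T^4, a fractional irradiance change dF gives a fractional temperature change of about dF/4. A quick check in Python, using the 0.2% figure from the posts above:

dF = 0.002                         # 0.2% irradiance increase
dT_exact = (1 + dF) ** 0.25 - 1    # exact fourth-root scaling
dT_approx = dF / 4                 # small-change approximation
print(f"{dT_exact:.4%} vs {dT_approx:.4%}")   # both ~0.05%, not 0.2%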
I agree. I have no doubt the GCMs can't achieve 1% accuracy on albedo predictions, or anything close to that, so there is going to be a large error in their predicted % temperature increase.
What makes you so certain that approximating Earth with blackbody radiation equations alone is accurate? Earth doesn't even come close to meeting the definition of a "perfect blackbody".
Using long-term temperature and long-term solar irradiance records is the only possible way to even approximate the deviation from "perfect", and the graph that I showed above, and other similar temp vs. irradiance records, suggest that a linear relationship is much more accurate than blackbody, at least over the short times (a couple thousand years) for which we have that data. Clearly there are processes in the Earth-Sun system that lack thermal equilibrium, and the lags resulting from that invalidate the strict use of blackbody: lags that range from minutes to days to years to decades to millennia. Of course there is no simple comparison that will match the solar data to the temperature data! Yet, for the times we have available there are some striking similarities in the behavior, even INCLUDING the last couple of decades, now that it is clear that aerosols have likely been suppressing the temperature response to the Sun since 1940.
The computations you make above only point out the problems with using strict blackbody approximations for Earth. They don't prove in any way that the 0.2% of increased irradiance is likely to be wholly responsible for the temperature changes, or even for on the order of 85% of the change.
gotta run for the night... family calling...
My computations pointed out the opposite, that 0.2% of increased irradiance is NOT responsible for any serious amount of temperature change. And that therefore you are right in two ways, the black body model is vastly oversimplified and that the albedo and other changes are much more important.
The page you linked to that equates the 0.2% increase in irradiance with a 0.2% increase in temperature is junk. I'm still not sure why you pointed to that in post 159; you have proven beyond a doubt that you know it's not true.
My computations pointed out the opposite, that 0.2% of increased irradiance is NOT responsible for any serious amount of temperature change.
You are correct. In fact, the Lean series used in #159 is an ad hoc reconstruction having its ultimate roots in 10Be isotope abundance and sunspot number, taken as proxies for solar activity and arbitrarily scaled to current satellite measures of solar irradiance.
Actually the measure is more one of solar activity in general, and more properly a function of coronal magnetic flux than of irradiance or thermal flux.
One would be better off to use magnetic flux scaling, the genesis of the measure, instead of pretending it is a measure of irradiance, which it is not:
A Doubling of the Sun's Coronal Magnetic Field during the Last 100 Years
M. Lockwood, R. Stamper, and M.N. Wild
NATURE Vol. 399, 3 June 1999. Pages 437-439
http://wdcc1.stp.rl.ac.uk/wdcc1/papers/nature.html
Cover: The total solar magnetic flux emanating through the coronal source sphere, Fs, derived from the geomagnetic aa data for 1868-1996 (black line bounding grey shading) and the values from the interplanetary observations for 1964-1996 (blue line). The variation of the annual means of the sunspot number <R> is shown by the area shaded purple. The magnetic flux in the solar corona has risen by 40% since 1964 and by a factor of 2.3 since 1901.
That would lead to less confusion regarding solar activity connections to climate and global surface temperatures, as the solar activity connection is much more than one of merely radiative transfer physics.
You are wrong. There are water vapor clouds on Mars.
And yes, like on Earth, these clouds drive weather on Mars.
Mars' climatic interactions between dust and water ice clouds: the dusty perihelion climate was observed by Viking and Mariner 9, and by NRAO in 1992, 1994, and 1996. What the 1970s orbiters did not identify was the very distinctive climate of Mars at aphelion (the farthest point in its orbit from the Sun), with its planet-wide belts of water ice clouds. It is the cold atmospheric conditions of Mars during aphelion, when the Sun's effect is much weaker, that stimulate the formation of these water ice clouds. The clouds reduce atmospheric temperatures by forming around dust particles, which act as condensation nuclei, and reflecting sunlight back out into space; once frozen over, the dust falls to the ground. This competition between dust heating in the summer and cloud cooling in the winter drives the sweeping annual and short-term regional changes in Mars' climate.
Fewer cosmic rays = less cloud formation on Mars = less cooling = more warming.
That explains the huge dust storms in 2001, 2003, and 2005.
Yes, I didn't realize there was so much weather on Mars, and the same sulfur and water vapor are there to produce the same condensation effect (or lack thereof) from cosmic rays. Maybe the guys who ran the lab experiment on earth's atmosphere can do the same with Mars.
I have started up a business of selling those carbon credits and have a whole empty warehouse full of them.
" Changes in Solar Brightness Too Weak to Explain Global Warming"
"Brightness
From Wikipedia, the free encyclopedia
Brightness is an attribute of visual perception in which a source appears to emit a given amount of light. In other words, brightness is the perception elicited by the luminance of a visual target. This is a subjective attribute/property of an object being observed.
"Brightness" was formerly" used as a synonym for the photometric term luminance and (incorrectly) for the radiometric term radiance. According to Federal Standard 1037C, "brightness" should now be used only for nonquantitative references to physiological sensations and perceptions of light."
The solar brightness is measured by the Gore Brightness Meter invented by Al Gore in his childhood one million years ago (or does he just seem that old?).
It's the solar non-brightness that'll kill you.