Posted on 10/13/2002 6:36:49 PM PDT by PeaceBeWithYou
The Earth system - comprising atmosphere, ocean, land, cryosphere and biosphere - is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales.
Thus begins the abstract of an enlightening essay on the many shortcomings of today's climate models (O'Neill and Steenman-Clark, 2002) in which the implications of this situation are discussed within the context of developing "reliable numerical models that can be used to predict how the Earth system will evolve and how it will respond to man-made perturbations." The challenge of this enterprise, as the authors describe it, is truly daunting.
They begin by noting that the system "must be modeled as an interactive whole," and that "because of the complexity of the processes and interactions involved, high-performance computing is absolutely essential." As they go on to elaborate, however, today's climate models are sorely lacking in this "absolutely essential" characteristic, as they are also deficient in many other important properties, which clearly implies that even our best climate models are not yet up to the task required of them, i.e., accurately predicting the future evolution of earth's climate.
O'Neill and Steenman-Clark note, for example, that there are "considerable gaps in knowledge about the interactions among the sub-systems," and that "current models include only a limited set of the necessary components," which leads us to ask: Are we way off-base in suggesting that gaps in knowledge large enough to be described as "considerable" ought to be filled before one puts much credence in the models' predictions? And what about the models possessing only a limited set of the necessary components? Wouldn't one want them to have all of the necessary components before their predictions were deemed correct?
Two examples of the coupling of subsystems that are "poorly treated at present," say O'Neill and Steenman-Clark, are the coupling of changes in atmospheric chemistry with climate and the coupling of the biosphere with climate. Moreover, they note that "individual subsystems like the atmosphere exhibit enormous complexity in their own right," and that "an increase of high-performance computer power of several orders of magnitude is needed to make significant progress." This being the case, we again are forced to ask: If an increase of "several orders of magnitude" in computer power is needed merely to make "progress," are we way off-base in concluding that there is a very real likelihood that current climate models are nowhere near being able to produce an accurate description of earth's future climate?
In addition to the maddening complexity of the planet's climate system and the great gaps that exist in our knowledge of its workings, the lack of sufficiently fine spatial resolution is another enormous hurdle that stands in the way of accurate climate change predictions via numerical model calculations. With respect to the fast and dramatic climate changes that are thought to be linked to similar changes in the thermohaline circulation of the world's oceans, for example, O'Neill and Steenman-Clark say that "predicting rapid change reliably will require coupled models of the atmosphere and ocean with much finer spatial resolution than is used at present." It is therefore "imperative," as they put it, to bring "much greater high-performance computer resources to bear on the problem to allow the Gulf Stream and related circulations to be adequately simulated." And if that need is truly imperative, as they say, we ask ourselves yet again: Are we way off-base in our belief that this need should be satisfied before we start turning the world's economy upside down in an effort to forestall model-based predictions of catastrophic global warming?
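To give a rough sense of why finer resolution is such a computational burden, consider the back-of-the-envelope sketch below. It is purely illustrative - the numbers are assumed, not taken from O'Neill and Steenman-Clark - and it simply supposes a three-dimensional grid whose time step must shrink in proportion to the grid spacing, so that each refinement multiplies the work by roughly the fourth power of the refinement factor.

```python
# Rough, illustrative estimate of how compute cost grows with grid refinement.
# Assumes a 3-D grid whose time step shrinks with the grid spacing (CFL-style),
# so halving the spacing multiplies the work by roughly 2**4 = 16.
# These numbers are assumed for illustration only.

def relative_cost(refinement_factor: float) -> float:
    """Work grows ~factor**3 for the 3-D grid and ~factor for the time step."""
    return refinement_factor ** 4

for factor in (2, 4, 8, 10):
    print(f"{factor}x finer grid -> ~{relative_cost(factor):,.0f}x more computation")
```

On those assumptions, a ten-fold finer grid demands roughly ten thousand times the computation, which conveys the sense in which "much greater high-performance computer resources" are needed.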
Then there is O'Neill and Steenman-Clark's statement that "it is widely recognized that the representation of convection, clouds and their interactions with radiation is one of the greatest weaknesses of current climate-prediction models," which is also a consequence of insufficiently fine spatial resolution. And what is their prescription for solving this problem? They say that "a major drive in climate modeling must be to reduce the impact of uncertain parameterizations, such as that of convection, by resolving important processes to a greater extent," which clearly requires you-know-what and which prompts us to ask yet one more time: Are we way off-base in demanding that the models resolve these processes before we start letting them make our decisions for us?
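For readers unfamiliar with the jargon, a "parameterization" is a simplified stand-in for a process the model grid is too coarse to resolve explicitly. The toy sketch below is a loose illustration of the flavor of one such scheme - a simple convective adjustment - and is not drawn from any actual climate model: whenever a column's temperature falls off too steeply with height, the scheme simply nudges adjacent layers back toward an assumed critical lapse rate.

```python
# Toy "convective adjustment" parameterization for a single model column.
# Purely illustrative: real convection schemes are far more elaborate.
# Heights in km, temperatures in K, critical lapse rate in K per km (assumed).

CRITICAL_LAPSE = 6.5  # K per km - an assumed, illustrative value

def convective_adjustment(temps, heights, passes=10):
    """Relax unstable adjacent layers toward the critical lapse rate,
    conserving the two-layer mean temperature at each adjustment."""
    temps = list(temps)
    for _ in range(passes):
        for k in range(len(temps) - 1):
            dz = heights[k + 1] - heights[k]
            lapse = (temps[k] - temps[k + 1]) / dz
            if lapse > CRITICAL_LAPSE:            # layer pair is unstable
                mean = 0.5 * (temps[k] + temps[k + 1])
                temps[k] = mean + 0.5 * CRITICAL_LAPSE * dz
                temps[k + 1] = mean - 0.5 * CRITICAL_LAPSE * dz
    return temps

heights = [0.0, 1.0, 2.0, 3.0]          # km
column = [300.0, 290.0, 283.0, 276.0]   # K; the lowest layers are unstable
print(convective_adjustment(column, heights))
```

Real schemes are vastly more elaborate, and it is precisely their approximate, tunable nature that makes them the "uncertain parameterizations" the authors want to see replaced by explicitly resolved processes.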
Of course we're not off-base; our questions and their implied answers are right on the mark. The nature of the well-chosen words so aptly employed by O'Neill and Steenman-Clark leaves no doubt about it - computer modeling of earth's climate, as far as it has come, still has a long, long way to go before it is up to the task of accurately defining future climate. And until it gets there, the three of us have no intention of letting an inadequately programmed computer usurp the responsibility we have to do our own thinking on the vitally important issue of carbon dioxide and global change.
The stakes are just too high.
Sherwood, Keith and Craig Idso
Reference: O'Neill, A. and Steenman-Clark, L. 2002. The computational challenges of Earth-system science. Philosophical Transactions of the Royal Society of London, Series A 360: 1267-1275.
Garbage In = Garbage Out.
They are more akin to the Walt Disney "It's a small world" ride than to the real world. And, yet it seems that half of the world is willing to commit economic suicide based on their results.
Not this cowboy, and thankfully not the President either, thank God.
Looks to me like a corporate plan that worked. The old "scorched earth" method at work. If everybody can make it without paying us the royalties, then nobody can make it.
Then they simply come up with a new product (or roll out one that's already set to go into production) and feast on the royalties of a brand-new patent.
Your point is valid. There's simply too much guesswork involved for any of the current climate models to qualify as predictive models which can be relied on for accuracy.
Aren't you glad Kyoto Algore isn't president?
In the automotive sector, the synthetic lubricants first used with the new refrigerants were more corrosive, especially to the O-rings in the compressors, the hose connections, and the hoses themselves. This made it necessary to change out components and drain all the oil from the system. Add to that the smaller size of the molecule and the higher operating pressures, and you have a system that is more leak-prone than the previous one.
Thankfully, it wasn't too long before a lubricant was found that would mix with the old one, wouldn't eat out the seals in a few months, and didn't require changing the compressor, seals, and hoses or draining the old lubricant.
With the DIY kits, you can now convert most car A/Cs yourself for about 30-40 bucks, or you can hire it done for around $100. Some folks paid over $1000 to convert entire systems.
The higher pressures will make compressors and hoses fail sooner. My guess would be that the lifetime will be reduced by about a third in most, half in some.
The real problem with the models is that there is an agenda behind them, so we are getting worst-case assumptions. And when you start assuming worst-case with dozens of variables, the errors just multiply and the results get ridiculous. No one in the global warming business cares about the truth; they care about proving global warming and offering doomsday predictions.
When I worked in a theoretical meteorology office many years ago, they were beginning to devise ways to add continents to their billiard ball. They complained even then of not having enough real data; there were only about 200 suitable weather stations on the planet. There are more weather stations now, but the data goes back only so far.
Point is, modeling is not easy, and there are built-in limits to the data used to check the models.
Indeed, astrophysicist Sallie Baliunas underscored the complexity of the problem when she calculated that to accurately predict 50 years into the future we would need a climate program that could handle approximately 5 million variables and computers many orders of magnitude faster than current supercomputers; otherwise we would need all of the time since the beginning of time to compute the results with what we have today.
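A purely hypothetical back-of-the-envelope tally - every number below is assumed for illustration, and this is not a reconstruction of Baliunas's calculation - gives the flavor of why the required computer power is so far out of reach:

```python
# Hypothetical tally of the arithmetic in a convection-resolving 50-year run.
# Every figure is assumed for illustration; this is not Baliunas's calculation.

columns        = 510_000_000   # ~1 km^2 cells over Earth's ~5.1e8 km^2 surface
levels         = 100           # vertical levels (assumed)
variables      = 10            # prognostic fields per grid point (assumed)
flops_per_step = 1_000         # operations per variable per time step (assumed)
steps_per_day  = 8_640         # a ~10-second time step, as a ~1 km grid would need
years          = 50

total_flops = (columns * levels * variables * flops_per_step
               * steps_per_day * 365 * years)

teraflop_machine = 1e12        # flops/sec for a top-end machine of the early 2000s
seconds = total_flops / teraflop_machine
print(f"~{total_flops:.1e} operations -> ~{seconds / 3.15e7:,.0f} years of wall-clock time")
```

Even with these loose guesses, a single 50-year run at convection-resolving resolution would occupy a teraflop-class machine for a couple of millennia, which is the flavor of the point being made.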
It's time we declared Environmentalism either a Religion or a disease, and either invoked the separation of Church and State or found a cure/vaccine.