Posted on 02/22/2016 10:57:24 AM PST by Reeses
Agreed. There was a plan to “guide” or maybe coerce humanity in a direction that benefited the Empire. Funny how a few years and a little wisdom help a person to see the underlying message.
Maybe the glass-half-full message is this: human nature (as a group and as a whole) is predictable. I'm not certain I believe that, since an equation that might describe it would be rather complicated. But there are some basic, underlying variables in human nature that are somewhat predictable: survival, ensuring that natural resources are available, and procreation. Then there are the more ethereal variables: belief in a deity and compassion for others.
We're all wave generators interacting with each other and the universe. No quantum weirdness here!
George E. P. Box said that all models were wrong, but some are useful.
I wouldn’t include the AGW/CC models in the “useful” category, except as a political tool for political tools.
Maybe as useful as “horseshoes and hand grenades,” yet the models have to be precise or the results are garbage, which was your note about AGW.
Then additional accuracy is lost when they digitize a nonlinear model (an approximation in the first place) to make it compatible with a computer simulation. Compound this with the thousands of variables, each with its own nonlinearities, plus an incomplete understanding of the system's feedback and feed-forward mechanisms and the interactions between variables, plus initial conditions with gross errors, and your simulation is garbage. Sorry, ranting about AGW again....
So if the temperature/pressure etc. measurements were taken in the backyard of Bob's weather station emporium and there were gross errors, then it will dramatically affect the results — garbage in, garbage out.
Also, doesn't the computer algorithm iterate on the computations? The output of iteration 1 is loaded as the initial conditions for iteration 2, and so on, and any error is compounded at each iteration until it swamps out the actual data (the signal-to-noise ratio drops considerably).
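That compounding can be sketched with made-up numbers (the initial error size and the per-iteration amplification factor below are assumptions for illustration only):

```python
signal = 1.0   # the "true" quantity being simulated (arbitrary units)
error = 1e-6   # assumed initial measurement error
growth = 2.0   # assumed error amplification per iteration

iterations = 0
while error < signal:   # iterate until the error swamps the signal
    error *= growth     # each pass feeds the amplified error forward
    iterations += 1

print(iterations)       # a one-part-per-million error overtakes the
                        # signal after only 20 doublings
```

With even a modest per-step amplification, the signal-to-noise ratio collapses after a handful of iterations, exactly the garbage-in, garbage-out problem described above.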
So the question might be — how accurate does the measured data have to be (and the computer models) to accurately predict temperatures — an order of magnitude, two orders of magnitude, three orders of magnitude?
If I remember correctly, running the data through a statistical algorithm will filter out some of the “random” nonsense. Then again, aren't they selectively changing the data, and how might that affect the predictions?
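Averaging does beat down purely random error; the catch is that it does nothing for systematic bias or selective adjustment. A sketch, with the "true" temperature and the noise size made up for illustration:

```python
import random

random.seed(0)       # fixed seed so the sketch is repeatable
true_value = 15.0    # assumed "true" temperature, deg C
noise_sigma = 2.0    # assumed random measurement error, deg C

# 10,000 noisy readings; their mean recovers the true value to within
# roughly noise_sigma / sqrt(N) ~ 0.02 deg, because the errors are random.
readings = [true_value + random.gauss(0, noise_sigma) for _ in range(10_000)]
mean = sum(readings) / len(readings)
print(round(mean, 2))
```

If the errors were a systematic offset rather than random noise, no amount of averaging would remove them, which is the worry about selectively adjusted data.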
Bottom line: it is an extremely difficult (if not impossible) thing to write a simulation program to predict the earth's climate. Yet these so-called scientists claim that it is a done deal. Ya know, the consensus.
Worst of all, they are betting our security and economic futures on something that they don't even understand. Any scientist who supports AGW without question should be fired.
From what I've read (an example is found here), the time over which the solution stays accurate is logarithmic in the error. That means if you can predict the weather for 4 days with 1% error, then with 0.1% error you can predict 6 days, and with 0.01% error, 8 days. By that scaling, predicting even a month out would require an impossibly accurate model.
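The scaling can be sketched numerically. The constants below (4-day base horizon, 2 extra days per factor-of-10 improvement) just encode the 1%/0.1%/0.01% example above; they are illustrative assumptions, not measured values.

```python
import math

def horizon_days(error, base_error=0.01, base_days=4.0, days_per_decade=2.0):
    # Prediction horizon grows only logarithmically as error shrinks:
    # each factor-of-10 improvement in accuracy buys a fixed 2 extra days.
    return base_days + days_per_decade * math.log10(base_error / error)

print(horizon_days(0.01))    # 4.0 days at 1% error
print(horizon_days(0.001))   # 6.0 days at 0.1% error
print(horizon_days(0.0001))  # 8.0 days at 0.01% error
# Under these assumptions, a 30-day horizon would need error near 1e-15.
```

Run forward, a 30-day forecast under these toy numbers would demand accuracy near one part in a quadrillion, which is the "impossibly accurate model" point.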
Now, there is the argument some will give about general trends rather than exactly what the weather will be in Timbuktu in 2116. For instance, if I dump a teaspoon of sugar into a cup of water, I don't need to know the percentage of sugar in each gram of water, because I know what the average is.
Similarly, there are models that smooth the data. This is typically done in certain types of fluid simulation, where turbulence is not explicitly modeled but replaced with models that mimic its behavior. However, even these models are only good for a limited set of conditions and should be compared against experimental data.
And that is the most damning piece of evidence in the whole AGW/CC debate. Their models don't accurately capture even the general trends. The last 15 years or so of temperature data were not anticipated by their models.
Finally, as an aside from someone who has done many computer simulations: the old "hockey stick" result from Mann at PSU looks a lot like a computer simulation that has gone unstable.
Just to be clear, I am not an expert and only know enough to be dangerous. I took a couple of Systems Theory classes back in college. We worked with MIMO systems a bit. I always found it interesting. Thanks for the tutorial! It will come in handy on my other blogs.
Wow. Logarithmic; and their predictions are 10, 15, 20 years out. This is the most obvious error in the whole theory of AGW.
Average — got it. Isn't the downside to this that the model may not converge and settle out if the system being modeled is indeed erratic and random? There is a trade-off, of course: even if the simulation converges, the predicted result may not match actual, real-world measurements.
The computer model might even oscillate. Maybe it has to do with the Nyquist rate of the system, which says there is a minimum sample rate needed to model a system accurately. Averaging implies that the effective sample rate is set very low and is possibly inadequate.
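The undersampling point can be sketched with arbitrary assumed frequencies: sampled below its Nyquist rate, a fast signal is indistinguishable from a slow "alias," so the sampled record tells a false story.

```python
import math

f_signal = 0.9  # assumed signal frequency, cycles per day (illustrative)
# Sampling once per day is below the Nyquist rate of 2 * 0.9 = 1.8 / day.
samples = [math.sin(2 * math.pi * f_signal * n) for n in range(20)]

# The same samples are produced by a slow 0.1-cycle/day alias signal:
alias = [-math.sin(2 * math.pi * 0.1 * n) for n in range(20)]
max_diff = max(abs(a - b) for a, b in zip(samples, alias))
print(max_diff)  # essentially zero: the fast signal masquerades as slow
```

A daily daytime temperature reading, for example, can misrepresent any variation faster than its sample rate in exactly this way.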
There is also the variable of location; where the data is taken (which comes with a thousand different variables that one has to account for — time of year, time of day, random weather conditions, etc.). Can you measure temperature at Timbuktu and a handful of other locations and claim that that small sample size is adequate to describe the whole planet's climate? I would think not.
As you mention, one extremely important job of the scientist is to >objectively< compare his/her theoretical model with actual real world data. If they don't match, then the model is wrong.
Just realized: when you say average, then maybe you mean “trending” in the statistical sense.
And that might be easier to achieve than a precise and complex computer simulation.
In other words, does the data imply that global temperatures are trending warmer?
Of course, everything that we have discussed — initial conditions, data accuracy, etc. still apply.
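A trend in that statistical sense can be sketched with an ordinary least-squares slope. The yearly anomaly numbers below are made up for illustration only, not real measurements:

```python
# Hypothetical yearly temperature anomalies (deg C) -- illustrative only.
years = list(range(2000, 2010))
temps = [0.40, 0.52, 0.61, 0.60, 0.53, 0.67, 0.62, 0.65, 0.53, 0.63]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(temps) / n

# Ordinary least-squares slope: covariance(x, y) / variance(x)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
         / sum((x - mean_x) ** 2 for x in years))
print(round(slope, 4))  # positive slope = warming trend in this toy data
```

A single fitted slope like this is far simpler than a full physical simulation, but it inherits every caveat already raised: it is only as good as the accuracy and coverage of the data fed into it.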
Correct on all counts.
With its ideas of particles zipping in and out of existence, quantum mechanics is probably the kookiest-sounding theory in science.
...
As the article indicates, it’s more fundamentally the experimental results that are kooky. The theory follows suit.
Many times in the history of science some effect was declared to be completely random,
...
Are you sure about that? Randomness in nature seems to always occur within a framework of non-random laws.
Randomness is really just a declaration of ignorance, not an actual property of the universe. It's stunning that the majority of physicists believe that, way down there, particles can move without cause. If they are right about a wall of magic, there's really not much future need for physicists.
Then comes Chaos Theory, which shows that unless you have absolute precision, you cannot predict the behavior of many nonlinear systems over a long period of time.
Physicists have done remarkably well with predictions considering these limitations, but no one has come up with a way around them.
Randomness is really just a declaration of ignorance,
...
I believe that you believe that, in your own random way.