For if they can do that, then their predictions even further into the future should hold some weight. But if they and their supercomputers cannot even predict the average temperature for July two years in the future, why should we give any credence to what they say the temperature will be in 100 years?
Hell, the weatherman has great trouble predicting the weather five days from now!
Predicting weather is not necessary for climate models. But modeling weather in some detail is necessary, despite the contention from both sides of the debate that all they need is energy equations. The reason, in a nutshell, is that water vapor causes warming, and without detailed water vapor models (especially where it counts, like the tropics), the climate cannot be predicted with any useful accuracy. But accurate predictions of temperature or precipitation for any one place or time are not necessary, because those simply don't matter.
Two more points. First, computing power will increase enough in the next 10-20 years to make my argument moot, and the GW debate will be resolved except for the politics. So there's no reason to do any mitigation now. Second, mitigation is cheap and easy with accurate climate models. Ask the leftist scientists why they don't use their models to tell us how much sulfur to put in the upper atmosphere to cool the earth, should that become necessary. They won't answer, because they want to limit carbon (i.e., wealth), not warming.
I think more is required. They should show that their models predict better than alternatives. As an example, I took some public hurricane predictions and compared them against random models, and the predictions did no better than the random ones. I can provide more details if you like.
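The kind of comparison described above can be sketched as follows. This is a minimal illustration, not the commenter's actual analysis: the observed and predicted hurricane counts below are made-up placeholder numbers, and the "random model" is assumed here to mean predictions resampled from the observed historical counts, scored by mean absolute error.

```python
import random
import statistics

# Hypothetical (made-up) data: observed seasonal hurricane counts
# and a forecaster's published predictions for the same seasons.
observed = [15, 8, 12, 10, 19, 7, 14, 11, 9, 16]
predicted = [12, 12, 12, 13, 12, 12, 13, 12, 12, 13]

def mae(a, b):
    """Mean absolute error between two equal-length sequences."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

forecast_err = mae(predicted, observed)

# Random baseline: draw "predictions" by resampling from the observed
# history itself, repeated over many trials.
random.seed(0)
trials = 10_000
random_errs = [
    mae([random.choice(observed) for _ in observed], observed)
    for _ in range(trials)
]

baseline = statistics.mean(random_errs)
# Fraction of random trials that score strictly better than the forecast:
beat = sum(e < forecast_err for e in random_errs) / trials
print(f"forecast MAE={forecast_err:.2f}, random MAE={baseline:.2f}, "
      f"random beats forecast in {beat:.0%} of trials")
```

If the random baseline beats the forecast in a large fraction of trials, the forecast shows no skill over chance by this metric; with real published predictions and observed counts substituted in, this is one simple way to run the test the comment describes.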