The greatest problem with ANYTHING relying on the GCMs is clouds/aerosols, which even the alarmist scientists consider to be their weakness. The error bars they put on those inputs are bigger than their inputs!
Consider: Assume the Sun were constant, not increasing in radiance since 1900. To account for ALL of the temperature change on the earth since 1900, a 1% decrease in total cloudiness (call clouds an average albedo of 0.60, and the earth 0.30) would be sufficient - that lets an additional 1% of the surface absorb its full complement of solar energy in 2000 compared to 1900. I'm not going to do the calculations right now, but this would easily result in more than sufficient additional Watts of "forcing". Look up in the sky, and tell me how to measure cloudiness to within 1% over the course of a week or two.
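Here is a rough sketch of that back-of-envelope arithmetic, using the albedos and the 1% cloud-fraction change assumed above (these are illustrative numbers, not measurements):

```python
# Back-of-envelope: extra absorbed sunlight from a 1-point drop in cloud fraction.
# Assumed inputs from the argument above, not measured values.

S = 1364.5               # W/m^2, total solar irradiance (assumed constant)
avg_insolation = S / 4   # W/m^2, averaged over the whole sphere

cloud_albedo = 0.60      # assumed average cloud albedo
clear_albedo = 0.30      # assumed earth/clear-sky albedo
delta_cloud_fraction = 0.01  # 1 percentage point less cloud cover

# Planetary albedo drops by the fraction change times the albedo contrast,
# so that much more sunlight is absorbed instead of reflected.
delta_albedo = delta_cloud_fraction * (cloud_albedo - clear_albedo)
extra_forcing = avg_insolation * delta_albedo  # W/m^2

print(f"Extra absorbed flux: {extra_forcing:.2f} W/m^2")
```

That comes out to roughly a watt per square meter, which is in the same range as the "forcings" usually being argued over.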
I'm not saying this is the mechanism. It is only a suggestion of how small a change could account for the temperature rise seen, even with a CONSTANT sun - which it isn't.
The GCMs can NOT be counted on to predict temperatures when they can't even be given correct input values. They can NOT figure out within 1% what the cloudiness or average cloudiness will be. They can not figure out within 1% the physical processes that produce clouds. They can not even figure out what the effect of, say, exactly 50% cloudiness would be.
Then it becomes the old computerman's saw: Garbage In, Garbage Out.
The GCMs throw out more garbage than anything else on earth ever has... and they throw it out very, very fast.
With (1 - albedo) = 0.7, i.e. an albedo of 0.3, the equation in my previous post becomes:
4th root of (((1364.5 / 4) * 0.7) / (5.67 x (10^(-8)))) = 254.745859
With (1 - albedo) = 0.707, i.e. an albedo of 0.293, it is
4th root of (((1364.5 / 4) * 0.707) / (5.67 x (10^(-8)))) = 255.380349
a difference of about 0.63 K, which is about the observed increase. Your hypothesis works out to the right numbers, which shows even more clearly that the change in irradiance is pretty meaningless.
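The two effective-temperature calculations, spelled out as a quick script (same numbers as above; 5.67e-8 is the Stefan-Boltzmann constant):

```python
# Effective blackbody temperature: T = ((S/4) * (1 - albedo) / sigma)^(1/4)

SIGMA = 5.67e-8  # W/m^2/K^4, Stefan-Boltzmann constant
S = 1364.5       # W/m^2, total solar irradiance

def t_eff(one_minus_albedo):
    """Effective temperature for a given (1 - albedo) absorption factor."""
    return ((S / 4) * one_minus_albedo / SIGMA) ** 0.25

t1 = t_eff(0.700)  # albedo 0.300
t2 = t_eff(0.707)  # albedo 0.293

print(f"{t1:.6f} K, {t2:.6f} K, difference {t2 - t1:.3f} K")
```

So a planetary albedo shift of just 0.007 moves the effective temperature by about 0.63 K, all with the solar constant held fixed.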