Larry Bell writes at Forbes:
One highly plausible answer to this mystery is that the climate models upon which IPCC’s failed projections are based exaggerate climate sensitivity to CO2, underestimate known natural forcings, and simply don’t understand how to factor in and calibrate other influences such as ocean cycles and solar activity. Numerous recent scientific papers suggest that overestimation of sensitivity by at least 30% may account for much of the problem.
The IPCC has crudely estimated an approximate 0.30°C to 0.15°C per decade mean global temperature increase in previous reports. Assuming a sensitivity of 3°C, that each 1 ppm of CO2 will add about 0.01°C (at current saturation levels), and that CO2 has accumulated at about 2 ppm per year, then temperatures should have risen about 0.3°C during the past 15 years. If so, a reduction of 30% would still leave 0.2°C of missing heat which must still have been offset by natural cooling.
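The arithmetic above can be sketched as a quick back-of-envelope calculation. The values below (3°C per doubling, a ~390 ppm baseline, 2 ppm/year accumulation) are illustrative assumptions consistent with the paragraph, not figures taken from the original article; the per-ppm warming follows from treating CO2 forcing as logarithmic in concentration.

```python
import math

# Illustrative check of the paragraph's arithmetic.
# Assumed values (not from the original article):
sensitivity_per_doubling = 3.0   # deg C of warming per doubling of CO2
co2_now_ppm = 390.0              # rough current CO2 concentration
growth_ppm_per_year = 2.0        # rough annual CO2 accumulation
years = 15

# With logarithmic forcing, warming per added ppm near the current
# concentration is approximately sensitivity / (ln(2) * concentration).
warming_per_ppm = sensitivity_per_doubling / (math.log(2) * co2_now_ppm)
print(f"warming per ppm: {warming_per_ppm:.4f} C")          # ~0.011 C

expected = warming_per_ppm * growth_ppm_per_year * years
print(f"expected warming over {years} years: {expected:.2f} C")  # ~0.33 C

# Cutting sensitivity by 30% still leaves most of the gap unexplained:
print(f"after a 30% sensitivity reduction: {0.7 * expected:.2f} C")  # ~0.23 C
```

This reproduces the paragraph's point: roughly 0.01°C per ppm times 30 ppm of accumulation gives ~0.3°C of expected warming, and a 30% lower sensitivity still implies ~0.2°C that would have to be offset by natural cooling.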
After all, the importance of those natural influences shouldn’t be that surprising, given that history shows temperatures have been higher when CO2 levels were lower, and vice versa. In fact, the past century has witnessed two generally accepted periods of warming. The first occurred between 1900 and 1945. Since CO2 levels were relatively low then compared with now, and didn’t change much, they couldn’t have been the cause before 1950.
The second possible, very small warming, following a slight cool-down, may have begun in the late 1970s and lasted until 1998, a strong Pacific Ocean El Niño year. Yet even if global temperatures actually did rise slightly during that second period, the U.K. Hadley Centre and U.S. NOAA balloon instrument analyses fail to show any evidence whatsoever of the telltale warming “signature” of human CO2 emissions in the upper troposphere over the equator, as predicted by all IPCC general circulation models. In fact, about half of all estimated warming since 1900 occurred before the mid-1940s, despite continuously rising CO2 levels since that time.