The recent advice given to the UK Government by the Parliamentary Office of Science and Technology (POST), in a report called “Climate Variability and Weather,” highlights an interesting dilemma facing anyone trying to predict future climate change, or to attribute causes to climate change observed in the recent past.
The summary of the report was: “Short-term differences from long-term climate, or ‘climate variability’, can increase the risk of damaging extreme weather events. This POSTnote examines the causes of climate variability and the use of global climate models to understand, and predict, these variations.”
I expect that, with the global annual average temperature having increased very little, if at all, in the past 10-15 years, the new talisman behind which many will rally will be the supposed increased prevalence of extreme weather events due to “increased energy in the Earth’s climate system.” Such thinking is apparent in recent reports about the Russian heatwave. See here, here and here.
The dubious logic of concluding, in order to explain a single outlier event, that man-made global warming made the Russian heatwave three times more “likely” even though it lay within the natural limits of variability is something to return to in the future.
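For readers unfamiliar with where a figure like “three times more likely” comes from, the following is a minimal sketch of the general approach used in event-attribution studies: compare how often an event threshold is exceeded in model ensembles run with and without anthropogenic forcing. It is not the methodology of any particular study of the Russian heatwave, and every number in it is invented for illustration.

```python
import random

random.seed(0)

def exceedance_prob(samples, threshold):
    """Fraction of ensemble members exceeding the event threshold."""
    return sum(1 for s in samples if s > threshold) / len(samples)

# Invented ensembles of summer temperature anomalies (degrees C).
# The 0.5 C shift in the "forced" ensemble is an assumption for
# illustration, not an estimate from any climate model.
natural = [random.gauss(0.0, 1.0) for _ in range(100_000)]
forced = [random.gauss(0.5, 1.0) for _ in range(100_000)]

threshold = 2.0  # an extreme anomaly, still within natural variability

p_nat = exceedance_prob(natural, threshold)
p_for = exceedance_prob(forced, threshold)

risk_ratio = p_for / p_nat   # the "N times more likely" figure
far = 1.0 - p_nat / p_for    # "fraction of attributable risk"
print(risk_ratio, far)
```

Note that the event itself occurs in both ensembles; the attribution claim is purely a statement about relative frequencies, which is exactly why it can coexist with the event being “within natural limits of variability.”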
The POST report says that for the UK, natural climate variability will be dominant over the next few decades, although the influence of greenhouse gases will increase “as the century progresses.” The report is also a plea for the UK Met Office to get bigger computers.
When will the man-made global warming signal emerge on regional scales? How can it be apparent in global climate statistics if it is swamped by natural climate variability over the Earth’s component regions?
The IPCC says that it is unequivocal that, when viewed globally, the man-made signal has emerged, although in this report, as is typical, details about past attribution are skimpy; everything is about model projections for the future.