Maggie Koerth-Baker writes in the New York Times:
A new method of statistical analysis called “event attribution,” developed by the Oxford climate scientist Myles Allen, allows climate scientists to better understand how weather patterns work today. It examines recent severe weather events, assessing how much of their probability can be attributed to climate change. These impacts are so complex that isolating them would be like taking the sugar out of a chocolate-chip cookie — nearly impossible; everything is so intertwined. Event attribution tries to break through this ambiguity using brute force.
Harnessing a tremendous amount of computing power, scientists create two virtual worlds: one where the atmosphere and climate look and operate like ours do today, and one that looks more like the preindustrial world, before we started releasing greenhouse gases from factories, cars and buildings. They alter the weather in both simulated environments and see how natural disasters play out given differing sea-ice levels, greenhouse-gas concentrations and sea-surface temperatures. They do this over and over and over, tens of thousands of times, producing an estimate of how much our altered climate affected the outcome.
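The two-world logic can be sketched in miniature. The toy model below is purely illustrative — real attribution studies run full climate models, not one-line random draws — and every number in it (the flood threshold, the rainfall means) is a hypothetical stand-in. It shows the core idea: simulate many "years" under each climate and compare how often the extreme event occurs.

```python
import random

random.seed(0)

FLOOD_THRESHOLD = 130.0  # hypothetical rainfall index above which a "flood" occurs
N_RUNS = 10_000          # ensemble size; real studies also use tens of thousands

def simulate_rainfall(mean, spread):
    # Stand-in for one model run: a single rainfall index drawn from a
    # Gaussian. A real run integrates full atmospheric physics instead.
    return random.gauss(mean, spread)

def exceedance_prob(mean, spread):
    # Fraction of simulated years in which the flood threshold is exceeded.
    hits = sum(simulate_rainfall(mean, spread) > FLOOD_THRESHOLD
               for _ in range(N_RUNS))
    return hits / N_RUNS

# Hypothetical parameters: the factual (warmed) world is slightly wetter
# than the counterfactual preindustrial world.
p_factual = exceedance_prob(mean=100.0, spread=20.0)
p_counterfactual = exceedance_prob(mean=95.0, spread=20.0)

print(p_factual, p_counterfactual, p_factual / p_counterfactual)
```

Even a small shift in the mean climate noticeably changes how often the tail event occurs, which is why the method needs so many runs: the events being counted are rare in both worlds.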
It’s a slow process that requires sophisticated software, which is why it’s a relatively recent development. It took Allen and his team six years and 50,000 simulations to analyze the causes behind an episode of fall flooding in Britain in 2000. Eventually, they were able to say this: 9 times out of 10, the world with climate change had a 20 percent greater chance of experiencing those floods than the world without.
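The "9 times out of 10" phrasing is a statement about uncertainty in the risk ratio, not a single number. One common way to extract such a bound is bootstrap resampling of the two ensembles; the sketch below uses made-up flood counts (not the study's actual data) to show the mechanics.

```python
import random

random.seed(1)

# Hypothetical ensemble outcomes: 1 = flood occurred in that run, 0 = not.
# Counts are for illustration only.
N = 5000
factual = [1] * 600 + [0] * (N - 600)          # flood in 12% of runs
counterfactual = [1] * 450 + [0] * (N - 450)   # flood in 9% of runs

def risk_ratio(a, b):
    # How much more likely the event is in ensemble a than in ensemble b.
    return (sum(a) / len(a)) / (sum(b) / len(b))

# Bootstrap: resample both ensembles with replacement, many times,
# and collect the resulting risk ratios.
ratios = []
for _ in range(2000):
    fa = random.choices(factual, k=N)
    cf = random.choices(counterfactual, k=N)
    ratios.append(risk_ratio(fa, cf))

ratios.sort()
# The value exceeded in 9 resamples out of 10 is the 10th percentile.
bound = ratios[len(ratios) // 10]
print(f"9 times out of 10, the risk ratio is at least {bound:.2f}")
```

The sorted list of bootstrap ratios is the estimated distribution of the risk ratio; reading off its 10th percentile gives a claim of the same shape as the one quoted above, a lower bound that holds in 9 cases out of 10.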