In a recent blog post, Judith Curry gave her reasons for not believing that global warming is exacerbating heat waves. Her conclusion was summarized thus:
Bottom line is that the intuitively reasonable attribution of more heat waves to a higher average temperature doesn’t work in most land regions.
Her reference to the “intuitively reasonable attribution of more heat waves to a higher average temperature” is illustrated by this graph she got from a presentation by Prashant Sardeshmukh, who took the graph from the IPCC SREX SPM report:
The first panel shows that if the average temperature increases while, in other ways, the distribution remains the same (where “in other ways” means that the shape and standard deviation remain unchanged), then the number of extremes increases — where “extreme” refers to values high enough to get into those red areas on the right-hand side. It uses the “Gaussian” (a.k.a. “normal”) distribution as an illustrative example.
Notice that the cutoff limit for defining “extreme” doesn’t change. It’s like defining “extreme” as 100 degrees (Fahrenheit) or hotter even if the average temperature rises or falls. That’s an important point. If instead one defines “extreme” as, say, more than 2 standard deviations above the mean, then it makes no difference whether the mean and/or standard deviation change; so long as the distribution remains Gaussian, the probability of exceeding the 2-sigma limit remains the same (equal to 0.02275). Essentially, as the mean and standard deviation change, so does the cutoff limit for defining “extreme.” But in terms of the impact on human civilization that just doesn’t make a lot of sense; if the mean temperature rises to 200 degrees (Fahrenheit or Celsius, take your pick) it’s lethal, even though the probability of exceeding 2-sigma is still just 0.02275.
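The difference between a fixed cutoff and a moving one is easy to check numerically. Here’s a minimal Python sketch (the `exceed_prob` function and the temperature numbers are my own illustration, not from the presentation):

```python
from math import erfc, sqrt

def exceed_prob(cutoff, mean, sd):
    """Probability that a Gaussian N(mean, sd) exceeds `cutoff` (upper tail)."""
    z = (cutoff - mean) / sd
    return 0.5 * erfc(z / sqrt(2))

# Fixed cutoff at 100 degrees: warming the mean raises the exceedance probability.
print(exceed_prob(100, mean=90, sd=5))   # ~0.02275
print(exceed_prob(100, mean=95, sd=5))   # ~0.1587 -- about 7 times higher

# Cutoff re-defined as mean + 2*sd: the probability never changes,
# no matter what the mean and standard deviation do.
for mean, sd in [(90, 5), (95, 5), (95, 6)]:
    print(exceed_prob(mean + 2 * sd, mean, sd))  # always ~0.02275
```

The second loop is the crux: a 2-sigma threshold that rides along with the distribution is, by construction, exceeded with the same probability forever.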
The presentation from which Curry takes her graphs investigated how extremes (which it defines as 2 standard deviations or more above the mean) have changed over time. To do so, it uses temperature data not from thermometers but from reanalysis data. It also uses temperature not at Earth’s surface (where we live), but at the 850 hPa level, about a mile and a half in altitude. It was (quite relevantly) pointed out to Curry that we don’t live at an altitude of a mile and a half. It further uses temperatures during the months of December, January, and February only. But none of that fazes her. Nor is it relevant to what I’m about to discuss.
The “money graph” in Curry’s post comes from the same presentation, and displays how the mean temperature, the standard deviation of daily temperature, and the probability of exceeding 2 standard deviations, have changed from the early 20th century (1901-1925) to more modern times (1981-2005):
The upper three graphs are for the 20th century reanalysis data, the lower three graphs are the results using AMIP simulations.
Let’s take a closer look at a region of the graphs for the 20th century reanalysis. I’ve put a box around the region I want to consider in this enlargement of the graph of how the mean changed:
For that area, the mean temperature increased by more than one full standard deviation. That’s a lot! It should, in fact, increase the probability of exceeding an “extreme” limit considerably, unless the shape of the distribution also changed in such drastic fashion that, well, it’s frankly hard to believe.
We can also take a look at how the standard deviation has changed in the same region:
The standard deviation also increased, by around 20% (just a rough estimate based on the scale attached to the original graph).
Here’s the thing: if the shape of the distribution doesn’t change, then increasing the mean and increasing the standard deviation will each increase the probability of exceeding some fixed cutoff limit. In fact, for a Gaussian distribution where we set the cutoff limit at 2 standard deviations to begin with (so the probability of exceedance is 0.02275), raising the mean by one standard deviation and the standard deviation by 20% raises that probability to 0.2023, about 9 times higher than it used to be. Unless, of course, we also raise the cutoff limit.
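That 0.2023 figure is straightforward to reproduce. Working in units of the original standard deviation (so the original distribution is standard normal and the cutoff sits at 2), a quick sketch:

```python
from math import erfc, sqrt

def exceed_prob(cutoff, mean, sd):
    """Upper-tail probability for a Gaussian N(mean, sd)."""
    return 0.5 * erfc((cutoff - mean) / sd / sqrt(2))

# Units of the original standard deviation: original mean 0, original sd 1,
# cutoff held fixed at the original 2-sigma level.
p_before = exceed_prob(2.0, mean=0.0, sd=1.0)
p_after  = exceed_prob(2.0, mean=1.0, sd=1.2)  # mean up 1 sigma, sd up 20%

print(p_before)            # ~0.02275
print(p_after)             # ~0.2023
print(p_after / p_before)  # ~8.9, i.e. about 9 times higher
```

The new z-score of the old cutoff is (2 − 1)/1.2 ≈ 0.833, and the Gaussian upper tail beyond 0.833 standard deviations is about 0.2023.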
But when we look at the plotted value from the 3rd panel showing how the probability of exceeding 2-sigma changed, we note that it didn’t increase at all:
I have to wonder: how did that happen? If we keep the cutoff limit unchanged, but increase the mean by 1 standard deviation and the standard deviation by 20%, I suppose it’s possible to devise some contorted, bizarre change in the shape of the distribution which might make that happen. But the change in shape would have to be truly astounding, truly so unlikely as to defy belief. If you’re trying to make that happen, it’s damn hard. And I’ve looked at many distributions of daily temperature and how they’ve changed over time, but never seen any such change or anything even close.
All of which leads me to believe that, in Sardeshmukh’s analysis, he did change the cutoff limit. He’s computing the chance of exceeding 2-sigma as the mean, the standard deviation, and the cutoff limit for “exceeding 2-sigma” change.
In fact I think we can be sure of it. Look at an earlier figure of his from Curry’s post:
Notice that for these data (from the 1981-2005 time span) he states that “both probabilities would be 0.022 if the distribution were Gaussian.” That can only be the case if he’s defining “exceeding +/- 2-sigma” using the mean and standard deviation from that same data, rather than defining the 2-sigma limit from the entire time span (or the combination of the early 1901-1925 and late 1981-2005 data).
From this, we can see why his “pattern of change in extreme warm daily temperature probabilities … looks nothing like the mean warming pattern.” That’s because he changed the cutoff limit from one time span to the next. It’s 2 standard deviations in both cases, but for those cases the 2-sigma limit gives a temperature which is quite a bit different. Quite a bit.
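Using the same hypothetical shifts as before (mean up one old sigma, standard deviation up 20%, in units of the old sigma), a short sketch shows what a per-period 2-sigma definition does:

```python
from math import erfc, sqrt

def exceed_prob(cutoff, mean, sd):
    """Upper-tail probability for a Gaussian N(mean, sd)."""
    return 0.5 * erfc((cutoff - mean) / sd / sqrt(2))

# Illustrative numbers (units of the early period's standard deviation):
early = dict(mean=0.0, sd=1.0)
late  = dict(mean=1.0, sd=1.2)  # mean up 1 old sigma, sd up 20%

# Re-define the cutoff as each period's own mean + 2 sigma:
for period in (early, late):
    cutoff = period["mean"] + 2 * period["sd"]
    print(cutoff, exceed_prob(cutoff, **period))  # both ~0.02275 -- "no change"

# But in absolute temperature, the "extreme" threshold itself moved:
shift = (late["mean"] + 2 * late["sd"]) - (early["mean"] + 2 * early["sd"])
print(shift)  # 1.4 old standard deviations warmer
```

Both periods dutifully report an exceedance probability of 0.02275, yet the threshold that counts as “extreme” has quietly moved 1.4 old standard deviations warmer. That’s the re-definition doing all the work.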
And that’s why his graph implies that the probability of extreme temperature has changed only a little, and in widely variable fashion: because from one time span to the next he has (in terms of absolute temperature) re-defined the limit for extreme temperature.
Which makes me wonder, what the hell is going on? If he was trying to emphasize that we can’t use the change in mean and standard deviation to understand the 2-sigma exceedance, well duh. When the mean goes up and the standard deviation goes up and you change the limit to define “extreme” temperature, you should expect nothing less.
And if that is what he did (I don’t see how it could be otherwise), then why did he put in that graph from IPCC SREX SPM showing that an increase in mean/standard deviation will increase extreme heat when the limit defining extreme does not change? And why did he say (as Curry quotes) this?
The fact that changes in extreme anomaly risks cannot be deduced from the mean shifts alone is disturbing … but entirely understandable in terms of basic weather dynamics and the Climate–Weather connection.
Not so. The fact that “changes in extreme anomaly risks cannot be deduced from the mean shifts alone” has nothing at all to do with weather dynamics or the climate–weather connection; it has to do with very basic and very fundamental mathematics.
Perhaps Sardeshmukh is confused, or doesn’t understand what he himself did.
As for Curry, the idea that she doesn’t understand what he did, or what the results really mean, doesn’t surprise me at all.