We regularly hear opinions from pundits who are very good at making nonsense sound plausible, even clever. It’s still nonsense of course, but with the right window dressing it looks very sharp indeed. Such are those claiming that the increase of wildfire in California has nothing to do with climate change, and maybe even nothing to do with how much rainfall the state gets.
Robert Rohde has mentioned using May-through-October values of temperature and rainfall to correlate with California wildfire. Let’s start with the data for acres burned by wildfire in California (from the source referred to by Willis Eschenbach):
The presence of a trend is clear. And, clearly it’s not linear.
Another crucial thing to notice is that these data exhibit heteroskedasticity, meaning that as the values get larger, the level of fluctuation (i.e. the noise level) also increases. Fitting a smooth curve to the data, then plotting the residuals from the smooth fit, makes this plain:
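The pattern is easy to reproduce with a sketch. The series below is hypothetical (an assumed growth rate and noise level, not the actual acres-burned data), but it shows the same diagnostic: fit a smooth curve, then watch the residual spread grow along with the level of the series.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for the acres-burned series: an upward trend whose
# noise grows in proportion to the level (heteroskedasticity).
t = np.arange(34)                                  # years since start of record
level = np.exp(0.08 * t)                           # rising mean level (assumed)
area = level * (1 + 0.3 * rng.standard_normal(t.size))

# Fit a smooth curve (a simple quadratic here) and inspect the residuals.
smooth = np.polyval(np.polyfit(t, area, deg=2), t)
resid = area - smooth

# The residual spread grows as the series rises -- the heteroskedastic signature.
early_spread = resid[: t.size // 2].std()
late_spread = resid[t.size // 2 :].std()
print(early_spread, late_spread)
```

A quadratic stands in here for whatever smooth fit you prefer (lowess, spline); the point is only that the residuals fan out over time.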
We can address both issues by taking the logarithm of the area burned (in millions of acres):
When we do so, we find that a straight-line increase (shown as a red line) is a good approximation of the trend. It’s important to log-transform the area burned, in order to bring the ever-increasing noise level under control.
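As a sketch of why the log transform works, here is the same idea on synthetic data: exponential growth with multiplicative noise becomes, after taking logs, a straight line with additive noise of roughly constant size, so an ordinary linear fit recovers the growth rate. The growth rate and noise level are assumptions for illustration, not estimates from the California record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical exponentially growing burned-area series (millions of acres)
# with multiplicative noise; the 0.06/yr growth rate is an assumption.
t = np.arange(34, dtype=float)                     # years since start of record
area = np.exp(0.06 * t + 0.3 * rng.standard_normal(t.size))

# On the log scale, exponential growth becomes a straight line and the
# multiplicative noise becomes additive with roughly constant variance.
log_area = np.log(area)
slope, intercept = np.polyfit(t, log_area, deg=1)
print(f"estimated growth rate: {slope:.3f} per year (true value 0.06)")
```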
It’s easy to blame that straight-line trend on climate change, but from these data alone, without other information, such a conclusion would be premature. Yes, we’ve seen higher temperatures in California and we expect that to promote wildfire. But other things have trended too, including factors that don’t necessarily relate to climate at all. All we know so far is that wildfire area burned is trending upward — which of the upward-trending variables are the root causes (there can be more than one) remains an open question.
But we do know, from basic physics, that temperature and rainfall (we’ll use the May-Oct averages) should influence wildfire area: hotter and drier means more flammable. They are certainly correlated with wildfire area, as we can see just from a plot of each year’s average temperature (May-Oct) against its total rainfall (May-Oct), with circles showing the area burned:
We can also plot how different the area burned is from average, with the highest-burn years as big red circles and the lowest-burn years as big blue x’s:
Perhaps it’s no coincidence that years with below average rainfall and above average temperature dominate the big-wildfire list.
We can model the logarithm of the wildfire area as a linear function of temperature and rainfall, which gives us this:
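A minimal sketch of that two-predictor fit, using made-up temperature and rainfall series and assumed coefficients (hotter raises log area, more rain lowers it), looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 34

# Hypothetical May-Oct predictors; the coefficients below are illustrative
# assumptions, not fitted values from the California record.
temp = 20 + rng.standard_normal(n)                         # mean temperature
rain = np.clip(5 + 2 * rng.standard_normal(n), 0.1, None)  # total rainfall
log_area = 0.5 * temp - 0.3 * rain + 0.5 * rng.standard_normal(n)

# Ordinary least squares with design matrix [1, temp, rain].
X = np.column_stack([np.ones(n), temp, rain])
beta, *_ = np.linalg.lstsq(X, log_area, rcond=None)
_, b_temp, b_rain = beta
print(f"temperature coefficient: {b_temp:+.2f}")
print(f"rainfall coefficient:    {b_rain:+.2f}")
```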
It’s not a bad fit; in fact it’s better than the straight-line model, though not by much. However, this model suggests that the influence of rainfall amount is not statistically significant, which runs counter to physical intuition. Also, when we look at the residuals from this fit we notice that there’s still a trend present, one which is statistically significant:
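Testing the residuals for a leftover trend is itself a simple regression. In this sketch the data are built (by assumption) with a time trend that temperature and rainfall can’t account for, so after fitting the two-predictor model we regress the residuals on time and compute a t-statistic for the slope:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 34
years = np.arange(n, dtype=float)

# Hypothetical data where log(area) carries an extra time trend that
# temperature and rainfall alone cannot explain.
temp = 20 + rng.standard_normal(n)
rain = np.clip(5 + 2 * rng.standard_normal(n), 0.1, None)
log_area = 0.5 * temp - 0.3 * rain + 0.05 * years + 0.5 * rng.standard_normal(n)

# Fit the temperature + rainfall model, then test the residuals for trend.
X = np.column_stack([np.ones(n), temp, rain])
beta, *_ = np.linalg.lstsq(X, log_area, rcond=None)
resid = log_area - X @ beta

slope, intercept = np.polyfit(years, resid, deg=1)
fit = slope * years + intercept
se = np.sqrt(np.sum((resid - fit) ** 2) / (n - 2)
             / np.sum((years - years.mean()) ** 2))
t_stat = slope / se
print(f"trend in residuals: t = {t_stat:.1f}")  # |t| well above 2 flags a trend
```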
Let’s try something crazy. Let’s model the logarithm of wildfire area using time (to represent a general trend) and temperature and rainfall. That will really test the influence of temperature and rainfall, because to have a significant impact they’ll have to do more than just follow the common trend. This model fits considerably better than the others, with the lowest AIC (Akaike Information Criterion) and lowest BIC (Bayesian Information Criterion) by far, and here it is:
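The model comparison can be sketched the same way. Below, three candidate models are fit to hypothetical data (built, by assumption, with a trend plus temperature and rainfall effects), and AIC/BIC are computed from the residual sum of squares under Gaussian errors; the full model should score lowest on both.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 34
years = np.arange(n, dtype=float)

# Hypothetical setup: a time trend plus temperature and rainfall effects.
temp = 20 + rng.standard_normal(n)
rain = np.clip(5 + 2 * rng.standard_normal(n), 0.1, None)
log_area = 0.05 * years + 0.5 * temp - 0.3 * rain + 0.5 * rng.standard_normal(n)

def gaussian_aic_bic(X, y):
    """AIC and BIC for an OLS fit with Gaussian errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1] + 1                  # coefficients plus error variance
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

ones = np.ones(n)
models = {
    "trend only":          np.column_stack([ones, years]),
    "temp + rain":         np.column_stack([ones, temp, rain]),
    "trend + temp + rain": np.column_stack([ones, years, temp, rain]),
}
scores = {name: gaussian_aic_bic(X, log_area) for name, X in models.items()}
for name, (aic, bic) in scores.items():
    print(f"{name:22s} AIC = {aic:7.1f}  BIC = {bic:7.1f}")
```

The extra parameter costs the full model a small penalty, but the drop in residual variance more than pays for it, which is exactly how AIC and BIC arbitrate the trade-off.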
Perhaps just as relevant, this model also confirms the statistical significance of all three variables. So yes, temperature and rainfall both have a significant effect, and other things are contributing to the upward trend as well.
As easy as it is to show the temperature/rainfall impact on wildfire in California, this only scratches the surface of what we find in a rich literature on the whole topic. For instance, the May-through-October temperature isn’t necessarily the best choice for some regions or ecosystem types. The timing of spring snowmelt influences the dryness of fuels, as does the number of days with zero precipitation (esp. consecutive days). Some impacts aren’t felt in the year they occur but in the year following, particularly those that can influence the availability and dryness of next year’s fuel load.
So it’s a lot more complicated than just the May-Oct average temperature and total rainfall. Yet the recurring theme is simple, that hotter and drier means more flammable: there’s little doubt, none reasonable, that higher temperatures and lower rainfall make wildfire seasons worse. Alas, there’s no doubt that man-made climate change is making California temperatures higher and drought worse. And not just California; both climate change and increased wildfire danger have spread throughout the western U.S.A.
This blog is made possible by readers like you; join others by donating at My Wee Dragon.