Temperature, Rainfall, and California Wildfire

We regularly hear opinions from pundits who are very good at making nonsense sound plausible, even clever. It’s still nonsense of course, but with the right window dressing it looks very sharp indeed. Such are those claiming that the increase of wildfire in California has nothing to do with climate change, and maybe even nothing to do with how much rainfall the state gets.


Robert Rohde has mentioned using May-through-October values of temperature and rainfall to correlate with California wildfire. Let’s start with the data for acres burned by wildfire in California (from the source referred to by Willis Eschenbach):

The presence of a trend is clear. And clearly, it’s not linear.

Another crucial thing to notice is that these data exhibit heteroskedasticity, meaning that as the values get larger, the level of fluctuation (i.e. the noise level) also increases. Fitting a smooth curve to the data, then plotting the residuals from the smooth fit, makes this plain:

We can address both issues by taking the logarithm of the area burned (in millions of acres):

When we do so, we find that a straight-line increase (shown as a red line) is a good approximation of the trend. It’s important to log-transform the area burned, in order to bring the ever-increasing noise level under control.
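The variance-stabilizing effect of the log transform can be sketched with synthetic data (a toy series invented for illustration, not the actual wildfire record):

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1987, 2018)

# Toy "area burned" series: exponential growth with multiplicative
# (lognormal) noise, so the raw fluctuations grow with the level.
area = np.exp(0.03 * (years - years[0])) * rng.lognormal(0.0, 0.3, years.size)

# On the log scale the multiplicative noise becomes additive and
# roughly constant, so a straight line is a sensible trend model.
log_area = np.log10(area)
slope, intercept = np.polyfit(years, log_area, 1)
residuals = log_area - (slope * years + intercept)

print(f"fitted trend: {slope:.4f} (log10 units per year)")
print(f"residual std on log scale: {residuals.std():.3f}")
```

On the raw scale, the same straight-line fit would leave residuals whose spread grows with the fitted value; that's the heteroskedasticity the log transform tames.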

It’s easy to blame that straight-line trend on climate change, but from this data alone, without other information, such a conclusion would be premature. Yes, we’ve seen higher temperatures in California and we expect that to promote wildfire. But other things have trended too, including factors that don’t necessarily relate to climate at all. All we know so far is that wildfire area burned is trending upward; which of the upward-trending variables are the root causes (there can be more than one) remains an open question.

But we do know that temperature and rainfall (we’ll use the May-Oct averages) should influence wildfire area, from basic physics: hotter and drier means more flammable. And they certainly are correlated with wildfire area, as we can see from a plot of each year’s average temperature (May-Oct) against its total rainfall (May-Oct), with circles showing the area burned:

We can also plot how different the area burned is from average, with the highest-burn years as big red circles and the lowest-burn years as big blue x’s:

Perhaps it’s no coincidence that years with below average rainfall and above average temperature dominate the big-wildfire list.

We can model the logarithm of the wildfire area as a linear function of temperature and rainfall, which gives us this:

It’s not a bad fit; in fact it’s better than the straight-line model, though not by much. However, this model suggests that the influence of rainfall amount is not statistically significant, which runs counter to physical intuition. Also, when we look at the residuals from this fit we notice that there’s still a trend present, one which is statistically significant:

Let’s try something crazy. Let’s model the logarithm of wildfire area using time (to represent a general trend) and temperature and rainfall. That will really test the influence of temperature and rainfall, because to have a significant impact they’ll have to do more than just follow the common trend. This model fits considerably better than the others, with the lowest AIC (Akaike Information Criterion) and lowest BIC (Bayesian Information Criterion) by far, and here it is:

Perhaps just as relevant, this model also confirms the statistical significance of all three variables. So yes, temperature and rainfall both have a significant effect, and other things are contributing to the upward trend as well.

As easy as it is to show the temperature/rainfall impact on wildfire in California, this only scratches the surface of what we find in a rich literature on the whole topic. For instance, the May-through-October temperature isn’t necessarily the best choice for some regions or ecosystem types. The timing of spring snowmelt influences the dryness of fuels, as does the number of days with zero precipitation (esp. consecutive days). Some impacts aren’t felt in the year they occur but in the year following, particularly those that can influence the availability and dryness of next year’s fuel load.

So it’s a lot more complicated than just the May-Oct average temperature and total rainfall. Yet the recurring theme is simple: hotter and drier means more flammable. There’s little doubt, none reasonable, that higher temperatures and lower rainfall make wildfire seasons worse. Alas, there’s also no doubt that man-made climate change is making California temperatures higher and drought worse. And not just in California; both climate change and increased wildfire danger have spread throughout the western U.S.A.


This blog is made possible by readers like you; join others by donating at My Wee Dragon.


26 responses to “Temperature, Rainfall, and California Wildfire”

  1. “Let’s model the logarithm of wildfire area using time (to represent a general trend) and temperature and rainfall.”

    I have no idea what you mean here. Could you spell it out more, please? What is actually being graphed?

    • y axis = log(WildfireArea)
      x axis = linear combo of time, temps (seasonal), and rain (seasonal) from multiple regression procedure.

      Tricky part conceptually is separating out what time represents here independent of any general temps/rains increases/decreases since regression partials out correlated variance in the predictors.

      • Well, maybe. But the X axis is only labeled “year”. And those other variables are not dimensionally equivalent to “year”. And anyway, what is the actual linear combination that was used? I’d be more inclined to think that the *Y* axis was a linear combination. In any case, I can’t decipher what the axes represent.

        [Response: The x-axis is time. The y-axis is the logarithm (base-10) of the area burned by wildfire in CA that year, in millions of acres.

        The black dots connected by a thin line represent the measurements. The thick red line is a linear model using the predictor variables (time, temperature, rainfall).]

    • Hi Tom,
      My reaction was similar to yours. Relabelling the axes seems to be in order, either by explaining that the areas have been normalised by some model-based index, with details added, or by including the dimensionally correct units. Log-area vs years does not seem to be an adequate account of what is shown.
      My guess would be that the y-axis is the one that has been normalised/adjusted, but that needs to be made explicit.
      Leto.

  2. Zeke Hausfather

    This 2006 Science paper by Westerling et al on wildfires and climate change in the Western US might be of interest: http://science.sciencemag.org/content/313/5789/940

    They find a strong correlation between March-August temps and area burned (though other factors also matter, of course).

  3. This October 2016 PNAS paper found that climate change has doubled the area affected by forest fires in the Western US in the last 31 years.

    Lead author John Abatzoglou said: “A lot of people are throwing around the words climate change and fire – specifically, last year fire chiefs and the governor of California started calling this the ‘new normal’. We wanted to put some numbers on it”.

  4. rhymeswithgoalie

    At least here in Central Texas, the worst-case scenario is a wet spring producing a lot of vegetation, followed by a relatively hot and/or dry summer.

    • That’s basically the scenario I heard about in reference to CA, when the drought first broke in the winter of 2016-17. Lots of new growth, which then dried out to become lots of new fuel. Sorry, don’t recall the source.

  5. In the latest statement from the moron who sits where a President should be, he contrasts the experience of California with – of all places – Finland. I suspect anyone who isn’t descending into senility can appreciate that Finland’s geographic position gives it a very different climate.

  6. I think George Wuerthner is one of THE experts on the subject. This interview was recorded a few weeks before the horror in Butte Co., but it is prescient of what is happening. He is a frequent guest on Resistance Radio, and his past interviews are great sources of information.

  7. I’ll keep on banging the same drum I usually do – I believe that increasing CO2 in the atmosphere does cause vegetation to grow faster (provided there is water etc). And that is why I think your fit works better when time is included – time is a proxy for CO2 level, and hence the amount of vegetation.

    • David B. Benson

      Liebig’s Law of the Minimum.

      • By all means. But what if CO2 is the scarcest nutrient? It certainly won’t be in many situations, but there is a chance that in some situations it is. Certainly it is added in some greenhouses.
        But I’d be happy if someone comes up with another reason why adding time improves the fit.

      • Greenhouses that add CO2 also add fertilizers containing other ingredients. Deniers always neglect to mention this 2nd point.

      • From http://www.plantphysiol.org/content/111/3/909

        “Growth analysis indicated that increased [CO2] may allow eucalyptus species to perform better during conditions of low soil moisture.”

        I found this by searching on eucalyptus, because here in Australia that is what burns best, and I can’t imagine California is too much different.

  8. Wow… starting to think that Cliff Mass didn’t start his version of the first graph in 1987 and end it in 2016 by accident.

    *sigh*

  9. Another point that’s been nagging slightly: from what I’ve been hearing, a prominent aspect of the changing climate and its impact on California wildfires has been the extension of what used to be a relatively well-defined fire season. That’s particularly relevant in the present instance, obviously, since we’re talking about a wildfire outbreak doing record amounts of damage *in early-mid-November*. So, what patterns do we see outside of the May-October ‘bin’? I actually tried to take a quick-and-dirty look at that myself, just for November values, but got lost amid the voluminous NOAA pages and ran out of time to search further.

  10. By the way, it’s worth keeping in mind that the acreage burned every year is strongly affected–historically speaking, that is–by modern fire suppression technology and practice. Far and away the biggest influence, human or otherwise, on acreage burnt has been firefighting.

    That’s extremely plain if you look at the acreage burned over the span of the NIFC record, which starts in 1926. (They do note that data prior to 1983 can’t be methodologically verified, so there’s an inhomogeneity issue.)

    While most of the late 20th century is relatively ‘quiet’, with not many years logging more than 5 million acres burned, the years prior to about 1957 or so sport some really eye-popping totals. The record appears to be 1930–well, it would be, wouldn’t it, speaking as we have been of the correlations among temps, precip and wildfire?–when the total hit this rather impressive number:

    52,266,000 (!)

    https://www.nifc.gov/fireInfo/fireInfo_stats_totalFires.html

    • Careful, Doc. Those older numbers are very uncertain. The NIFC itself has stated that it believes there is significant double counting in those older numbers:
      https://www.carbonbrief.org/factcheck-how-global-warming-has-increased-us-wildfires

      • I did note the caveat they have–and while it makes sense that there are hefty uncertainties in that old data, an order of magnitude leaves lots of room for ‘slop’.

        And, after all, that was just a blog comment–not even an actual post!

        That said, thanks for furthering the point. Good to know.

      • From the Carbonbrief piece:

        “While the early 20th century data is not reliable and likely double or even triple-counted actual fires, Eardley says that it is possible that fire extents were higher back then for a simple reason: there was no large-scale firefighting organisation in the first half of the 20th century. Therefore, fires would burn through larger areas before being extinguished or burning themselves out, particularly when they were not close to towns or settlements.

        “Today, the US has larger and more organised firefighting operations in place. Therefore, recent increases are not due to any change in firefighting approach. If anything, many more resources have been devoted to fighting fires in the past few decades than in any prior period.”

        That was in fact my main point. I’d support the interpretation that fire suppression is the likely explanation for the decline in the available data (whatever its quality), as opposed to the decline being purely an artifact, based on the observation that the decline in acres burned begins in the late 50s, which I believe to be the time frame in which fire-fighting technology and practice rapidly changed, rather than in or around 1983, which is when data collection became more reliable. (Though I’d admit that, since the methodology is basically unknown prior to ’83, it’s possible that it improved gradually during the observed ‘decline’ period, but without the improvement being documented.)

        On the other hand, the evolution of fire fighting policy and tech is hardly an unknown factor:

        https://en.wikipedia.org/wiki/History_of_wildfire_suppression_in_the_United_States#Suppression_as_a_rule

  11. The stats for the months prior to November paint a different picture. Camp Fire area (Sacramento drainage):

    March – May (1895-2017)
    Temperature: + 0.01 F / decade
    Precip: + 0.04″ /decade

    More precipitation in spring is conducive to a larger fuel load. This is followed by 5 months with a warming/drying trend, setting the stage for bigger fires when the Santa Anas start to blow towards summer’s end:

    June – October
    Temperature: + 0.02 F /decade
    Precip: – 0.01″ / decade

    (Climate at a Glance)