There’s a new reconstruction of past temperature covering the last 11,300 years by Marcott et al. (2013, A Reconstruction of Regional and Global Temperature for the Past 11,300 Years, Science, Vol. 339 no. 6124 pp. 1198-1201, DOI:10.1126/science.1228026). Data for their reconstructions, and the proxy data on which they’re based, are part of the supplementary materials.
Jos Hagelaars has joined the Marcott reconstruction to the Shakun reconstruction for the time before it, to the HadCRUT4 global temperature data for the time since, and to the projected temperature change under the A1B scenario for the future, in order to give us some perspective on climate change past, present and future.
This graph has been dubbed the “wheelchair.” Compared to the past, what’s happening in the present is scary. The future is scary as hell.
Marcott et al. took 73 proxy data sets distributed around the globe and combined them to form an historical temperature reconstruction. Their approach differs from that of others in some important respects. First, the proxy data were already converted to temperature estimates before being combined into a reconstruction. Second, since most dates were estimated by radiocarbon dating, dates were re-computed using the most up-to-date calibration (the “Calib6.0.1” program using the “IntCal09” data). Third, most of their proxy data sets are ocean-based rather than land-based, making for a more representative global picture.
Fourth, since their purpose is to understand what happened in the past 11,300 years, their data have a time coverage concentrated on the past rather than the present. In fact the data coverage is much better for the distant past than for the last century: all 73 proxies overlap in time during the period from 5500 to 4500 years ago (a.k.a. “BP” for “before present”, where “present” is, as usual in such studies, the year 1950), but only 18 proxies extend all the way to the year 1940 (the final year of the reconstruction).
This is in sharp contrast to other reconstructions, for which it is usual that data coverage shrinks to ever smaller numbers of proxies the further back one goes in time; for the Marcott et al. reconstruction data coverage shrinks as one gets closer to the present. But that’s not such a problem because we already know how temperature changed in the 20th century.
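A quick way to see this coverage pattern is simply to count how many records span each time step. Here’s a minimal sketch in Python, assuming each proxy has been read from the supplement into a two-column array of calendar age (yr BP) and temperature; the container and variable names are mine, not the paper’s:

    import numpy as np

    # Hypothetical container: each proxy is a 2-column array,
    # column 0 = calendar age (yr BP), column 1 = temperature (deg C).
    def coverage(proxies, ages):
        """Count how many proxy records span each requested age (yr BP)."""
        counts = np.zeros(len(ages), dtype=int)
        for p in proxies:
            lo, hi = p[:, 0].min(), p[:, 0].max()
            counts += (ages >= lo) & (ages <= hi)
        return counts

    ages = np.arange(10, 11310, 20)     # 20-year steps from 10 BP (1940) back ~11,300 years
    # counts = coverage(proxies, ages)  # roughly 73 near 5000 BP, but only 18 at 10 BP (1940)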
The proxy data sets were aligned to match during their common period of overlap, 5500 to 4500 BP (calendar years -3550 to -2550). Then they were combined in a number of ways. The “main” method (if there is one) was to use the data to estimate gridded temperature on a 5×5 degree latitude-longitude grid, then compute an area-weighted average. The same procedure was also applied using a 30×30 degree latitude-longitude grid. They also estimated averages over latitude bands 10 degrees wide, and an area-weighted average of those. In addition, they applied RegEM (regularized expectation maximization) to infill gaps before computing the area-weighted gridded averages. They also computed a simple average of all the proxy data (without area weighting), both for the unadorned proxy data and after infilling with RegEM, giving results similar to the area-weighted averages (which argues for good geographic distribution of proxies). They also computed averages by the “jackknife,” in which multiple reconstructions, each with randomly chosen proxies omitted, are averaged.
That’s a lot of reconstructions!
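To make the “Standard 5×5” idea concrete, here’s a rough sketch of the gridding-and-weighting step for a single time step. This is my own simplified stand-in, not the authors’ code: drop each proxy’s value into its 5×5 degree box, average within boxes, then average the occupied boxes with cosine-of-latitude weights.

    import numpy as np

    def area_weighted_mean(lats, lons, temps, cell=5.0):
        """Grid one time step's proxy values into cell x cell degree boxes,
        average within each box, then take a cos(latitude)-weighted mean
        over the occupied boxes."""
        nlat, nlon = int(180 / cell), int(360 / cell)
        i = np.clip(np.floor((lats + 90.0) / cell).astype(int), 0, nlat - 1)
        j = np.clip(np.floor((lons + 180.0) / cell).astype(int), 0, nlon - 1)
        grid = np.full((nlat, nlon), np.nan)
        for bi in range(nlat):
            for bj in range(nlon):
                sel = (i == bi) & (j == bj)
                if sel.any():
                    grid[bi, bj] = temps[sel].mean()   # proxies sharing a box are averaged first
        lat_centers = -90.0 + cell * (np.arange(nlat) + 0.5)
        weights = np.cos(np.radians(lat_centers))[:, None] * np.ones((1, nlon))
        ok = ~np.isnan(grid)                           # only occupied boxes contribute
        return np.sum(grid[ok] * weights[ok]) / np.sum(weights[ok])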
And as if that’s not enough, in order to get a handle on some of the uncertainties they actually perturbed the data, both the temperature estimates and the times at which they apply, according to their uncertainties, doing so randomly 1000 times for each method. This “Monte Carlo” approach yields an ensemble of estimates, which was averaged to create the reconstruction for each method and gives important information about its inherent uncertainty.
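The perturbation step can be sketched like this. Again this is a simplified stand-in: I assume Gaussian errors and plain linear interpolation onto the reconstruction’s time grid, which glosses over the details of how the paper actually perturbs the age models.

    import numpy as np

    rng = np.random.default_rng(0)

    def perturbed_realizations(age, temp, age_err, temp_err, grid, n=1000):
        """Jitter one proxy's ages and temperatures within their (assumed Gaussian)
        uncertainties n times, interpolating each realization onto a common time
        grid (a numpy array of ages in yr BP).  Averaging many realizations is
        what smooths the published curves relative to a straight average of the
        raw data."""
        out = np.empty((n, grid.size))
        for k in range(n):
            a = age + rng.normal(0.0, age_err, age.size)       # perturb the age model
            t = temp + rng.normal(0.0, temp_err, temp.size)    # perturb the temperatures
            order = np.argsort(a)                              # np.interp needs increasing ages
            out[k] = np.interp(grid, a[order], t[order])
        return out

    # grid = np.arange(10, 11310, 20)
    # mean_curve = perturbed_realizations(age, temp, age_err, temp_err, grid).mean(axis=0)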
I carried out the “simple average” procedure myself, and got this result (in red, labelled “Calib6.0.1 Ages”) compared to the Marcott et al. simple average result (in black):
Note that my simple average shows more fluctuation than theirs despite both being based on a straight arithmetic average. That’s because theirs is “smoothed” by the process of generating 1000 perturbed data sets and averaging the results of each. This tends to “smear out” the uncertainty both of temperature and of time, which is appropriate because the very rapid small fluctuations in the simple average are unreliable — we just don’t know the times at which measurements apply with sufficient precision. But note also that my simple average follows the overall pattern of changes in the Marcott et al. reconstruction with outstanding fidelity. This is what the data show.
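In essence the simple average amounts to something like this: interpolate each re-dated proxy onto the common time grid, subtract its mean over the 5500 to 4500 BP reference interval, and average whatever proxies are present at each time step. A minimal sketch, with my own variable names:

    import numpy as np

    def simple_average(proxies, grid, ref=(4500, 5500)):
        """Straight (unweighted) average: interpolate each proxy onto the common
        grid, subtract its mean over the 4500-5500 BP reference interval, then
        average whatever proxies are present at each time step."""
        curves = []
        in_ref = (grid >= ref[0]) & (grid <= ref[1])
        for p in proxies:                   # each p: 2 columns, age (yr BP) and temperature
            a, t = p[:, 0], p[:, 1]
            order = np.argsort(a)
            ti = np.interp(grid, a[order], t[order], left=np.nan, right=np.nan)
            curves.append(ti - np.nanmean(ti[in_ref]))   # anomaly relative to the overlap period
        return np.nanmean(np.vstack(curves), axis=0)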
The sharp uptick at the end — which the straight average shows even more strongly than the ensemble average of perturbed series — is probably not correct. This is clearest if we paste a temperature time series onto the end of the reconstruction. Here I’ve added the HadCRUT4 series (20-year averages centered on the times of the Marcott reconstruction) aligned to the data leading up to the 1940 spike (which I think is a necessary step, more about that later):
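By “aligned” I mean nothing mysterious: shift the instrumental series so that its mean over some window before the spike matches the reconstruction’s mean over the same window. A minimal sketch; the window used here is purely illustrative:

    import numpy as np

    def align_to_recon(recon_age, recon, had_age, had, window=(30, 110)):
        """Shift the 20-year HadCRUT4 means (ages in yr BP) so their average over
        the chosen window matches the reconstruction's average over the same
        window.  The default window (30-110 BP, i.e. roughly 1840-1920) is just
        an illustrative choice."""
        r = (recon_age >= window[0]) & (recon_age <= window[1])
        h = (had_age >= window[0]) & (had_age <= window[1])
        return had + (np.mean(recon[r]) - np.mean(had[h]))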
The too-large uptick is an “artifact” of proxy drop-out: as proxies drop out of the reconstruction (because they don’t extend far enough forward in time), losing “cooler” proxies makes what follows artificially warm, while losing “warmer” proxies makes what follows artificially cool.
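A toy example with synthetic numbers (not the Marcott data) shows how drop-out alone can manufacture such a shift:

    import numpy as np

    # Two made-up "proxies": one always reads +0.2, one always reads -0.2,
    # so as long as both are present their average is exactly zero.
    warm = np.full(100, +0.2)
    cool = np.full(100, -0.2)
    cool[80:] = np.nan                      # the cooler proxy ends early (drops out)

    avg = np.nanmean(np.vstack([warm, cool]), axis=0)
    print(avg[:80].mean())                  # 0.0  while both proxies are present
    print(avg[80:].mean())                  # 0.2  a spurious "warming" from drop-out alone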
In fact the RegEM reconstructions (which infill missing data) don’t show such a large uptick; theirs is much more modest:
Because of this discrepancy between the different reconstruction methods, Marcott et al. themselves say that they do not consider the large uptick at the end of the reconstruction to be “robust.” In my opinion (for reasons we’ll elaborate soon) I agree. There is an uptick — but it’s not as dramatic as their “main” reconstruction (the “Standard 5×5”) suggests.
I’ll deal with several issues in future posts, including the recent uptick, the impact of proxy drop-out on the temperature reconstruction, the necessity for alignment with modern instrumental data, and the effect of re-calculating the ages of the proxy data (an issue which Steve McIntyre really doesn’t get — and in my opinion willfully so). Right now I’d just like to review what the Marcott et al. reconstruction is really telling us.
Let’s take the RegEM reconstruction and add the HadCRUT4 data on the end of it, aligning the records during their period of overlap:
Look at the spike at the end. The big, and most importantly the steep, scary spike at the end. That’s not an artifact of the way proxy ages were computed, or how the reconstruction was done, or the effect of proxy drop-out as records become more sparse in the later period. It’s what the thermometers say. Ignore them at your peril.
As scary as that is, what’s far more frightening is that it’s not going to stop.
The last deglaciation ended about 10,000 years ago. There followed a period of nearly 5,000 years when global temperature was surprisingly stable. In the 5,000 years following that, up to about 1800, global temperature declined by a total of nearly 0.7 deg.C, culminating in the depth of the “little ice age.” From then until 2000, it rose by about 0.8 deg.C, and now exceeds temperature during any prior period of the Holocene.
Marcott et al. claim differently, saying “Our results indicate that global mean temperature for the decade 2000–2009 has not yet exceeded the warmest temperatures of the early Holocene (5000 to 10,000 yr B.P.).” I disagree, because their reconstruction is not optimally aligned with instrumental data.
Whoever is right about this, modern temperature is surely close to the warmest temperatures of the early Holocene. The dangerous part is that it has happened so fast. In the span of a century or two, man-made changes to the atmosphere wiped out 5,000 years of natural climate change. People can argue about the uptick at the end of the Marcott reconstruction — I’ll do so myself — but for most who do so, it’s just an attempt to divert attention from the fact that global temperature really has increased in the last century, at a speed not seen in at least the last 11,300 years. We know this; thermometers have made it plain; only those in denial still deny it.
We are changing the climate rapidly — on the geologic time scale, in the blink of an eye. This is exactly the kind of rapid change which has caused extinction events in the past. What’s far more frightening is that it’s not going to stop.
Don’t let human civilization be the next victim of climate-induced extinction.
In the next post we’ll look at the effect of using re-calculated proxy ages on the recent uptick.