Some organizations that estimate global temperature change have fallen behind schedule because of the U.S. government shutdown. But the folks at the Berkeley Earth Surface Temperature project have released their figures for December’s temperature, bringing the year 2018 to a close. It was a hot one.

Last year was the 4th-hottest on record. Of course we already know that global temperature jitters around its trend value; it’s a combination of the *trend*, the long-term pattern, with the *noise*, i.e. the jittering around that never stops but never really gets anywhere. Welcome to Earth.

There are lots of ways to estimate what that elusive *trend* value is, an important task if we want to know how the underlying system is changing. The noise changes won’t tell us about that; whatever the climate does, noise will be noise. One often starts by using a smoothing method (some of us confess to enjoying getting fancy about that), and my “go-to” choice is a modified lowess smooth:

According to this smoothed estimate, the *trend* — which is where we look for climate change (man-made or not) — has shown two episodes of warming since the year 1900. During the first, from about 1910 to 1945, it rose roughly 0.43°C, while from about 1975 to now it has risen around 0.84°C. That’s basically twice as much.
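For readers who want to experiment, here is a minimal sketch of a lowess-style smoother in plain NumPy: local linear fits with tricube weights. This is only a sketch, not Tamino’s actual (modified) method, and the span parameter is an assumption.

```python
import numpy as np

def lowess_smooth(x, y, span_years=20.0):
    """Basic lowess-style smoother: at each point, fit a straight line
    to nearby data using tricube weights, and keep the fitted value."""
    half = span_years / 2.0
    out = np.empty_like(y, dtype=float)
    for i, x0 in enumerate(x):
        # tricube weights: zero beyond half the span
        w = np.clip(1.0 - (np.abs(x - x0) / half) ** 3, 0.0, None) ** 3
        sw = np.sqrt(w)
        A = np.vstack([np.ones_like(x), x - x0]).T
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
        out[i] = beta[0]  # fitted value at x0
    return out

years = np.arange(1900.0, 2019.0)
trend = 0.01 * (years - 1900.0)        # a pure linear "signal"
smooth = lowess_smooth(years, trend)   # a local linear fit reproduces a line
```

Because each local fit is linear, a noiseless straight-line input passes through unchanged; the interesting behavior appears when noise is added.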

The thing about a smooth is that it has an inherent *time scale*. Changes that are much faster than that will get smoothed out, but changes slower than the time scale will be mostly unaffected. The idea is that the noise changes are fast, the signal changes are slower, so with the right time scale we can blunt the noise and let the signal shine through.

For my smooth (above) I used a time scale of 20 years. Climate is usually defined as 30 years or more, but I wanted a shorter time scale to pick up more fluctuations, since they’re not all noise! But I know from experience that with time scales shorter than 20 years, the noise will start to dominate the result and I’ll lose the very noise reduction we need to get the signal out of the data.

Even so, on any time scale the noise is still there. Allow me to demonstrate.

Let’s make some random noise, of the same size as the random noise in yearly global average temperature (about 0.1°C). In this case we know, ahead of time, what the signal is: it’s flat as a pancake, going nowhere. These data are only noise.
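Generating such a noise-only series takes just a few lines; the length and standard deviation here are assumptions chosen to match the text.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1880, 2019)
# flat signal (zero trend), noise sd ~0.1 °C, matching yearly global temperature
noise = rng.normal(0.0, 0.1, size=years.size)
```

Whatever structure a smooth finds in this series is, by construction, pure noise.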

Now let’s smooth it. We’ll start with a choice popular with those who *want* to let the noise affect the results, because its time scale is too short: a 10-year Gaussian smooth:

The fluctuations cover a range of 0.14°C. Some (not Tamino, to be sure) might even claim there was global cooling of that amount from about 1950 to 1960. But we know these data are just noise.
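A Gaussian kernel smoother is easy to sketch in NumPy; applied to pure noise it still wanders over a nonzero range, just as described. The sigma value here is an assumption standing in for a roughly 10-year time scale.

```python
import numpy as np

def gaussian_smooth(y, sigma):
    """Gaussian kernel smoother, with weights renormalized at the edges."""
    i = np.arange(y.size)
    w = np.exp(-0.5 * ((i[:, None] - i[None, :]) / sigma) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return w @ y

rng = np.random.default_rng(42)
noise = rng.normal(0.0, 0.1, 139)        # pure noise, sd ~0.1 °C
smooth = gaussian_smooth(noise, sigma=5)  # sigma ~5 yr as a stand-in time scale
span = smooth.max() - smooth.min()        # nonzero range, from noise alone
```

The smoothed series has a much smaller standard deviation than the raw noise, but its range never shrinks to zero.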

What about my choice of lowess smooth at 20-year time scale? This:

This too shows fluctuations, covering a range of about 0.07°C. The noise is still there — and even though smoothing reduces the noise level, it doesn’t eliminate it completely. In this case, the noise is all there is.

The “smoothest” choice is a straight-line fit:

There’s still noise, and it still causes the smooth (the line) to cover a nonzero range, but in this case, at 0.007°C total, it really won’t bias our estimates of total change.
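A quick synthetic check of that claim: fit a straight line to pure noise and measure the total change the line implies. The series length and noise level are assumptions matching the text.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1880, 2019, dtype=float)
noise = rng.normal(0.0, 0.1, years.size)   # flat signal, sd ~0.1 °C

slope, intercept = np.polyfit(years, noise, 1)
line = slope * years + intercept
total_change = abs(line[-1] - line[0])     # tiny: the line barely tilts
```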

The model with the most noise reduction, but which still includes all the changes that we can *actually demonstrate*, is a piecewise linear model, i.e. a set of straight lines representing the episodes of *demonstrable* global temperature change (not just the stuff that merely “looks like” change). The times should be chosen to fit the data best (something called “changepoint analysis”), and I get this:

Using this model to estimate the trend changes (apart from the noise), the early warming amounted to about 0.39°C, but later warming covered 0.83°C, more than twice as much.
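Tamino’s changepoint analysis is more sophisticated than this, but a toy version conveys the idea: brute-force a single breakpoint by picking the split that minimizes the total squared error of two independent straight-line fits. The synthetic data here (flat, then warming) is an assumption for illustration.

```python
import numpy as np

def best_breakpoint(x, y, min_seg=10):
    """Return the split index minimizing total SSE of two line fits."""
    best_k, best_sse = None, np.inf
    for k in range(min_seg, x.size - min_seg):
        sse = 0.0
        for xs, ys in ((x[:k], y[:k]), (x[k:], y[k:])):
            c = np.polyfit(xs, ys, 1)
            r = ys - np.polyval(c, xs)
            sse += r @ r
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

# synthetic series: flat until index 60, then a steady rise, plus noise
rng = np.random.default_rng(1)
x = np.arange(100, dtype=float)
y = np.where(x < 60, 0.0, 0.02 * (x - 60)) + rng.normal(0, 0.05, x.size)
k = best_breakpoint(x, y)   # recovered breakpoint lands near the true one
```

Real changepoint methods also penalize the number of breakpoints so the model doesn’t chase noise; this sketch assumes exactly one.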

Still, some insist on nonsense: that early warming was “almost as large” as later warming. This is usually done without even estimating, let alone saying out loud, how much warming there was. When called to task, their acolytes will try to save appearances by using every trick imaginable to exaggerate what happened early while minimizing what happened late. They hope in vain that if they can pull the wool over people’s eyes about early warming, it will comfort them about recent warming. Recent warming is worrisome. The early global warming is no comfort at all.

What’s scary as hell is what’s to come … because that trend line isn’t just a lot higher than it used to be, **it’s still going up**.

This blog is made possible by readers like you; join others by donating at My Wee Dragon.

Can you compute the correlation between CO2 and temperature for the two periods?

What is the value for each one?

Maybe it’s better to use the rate of increase for CO2?

Some months ago, I did a simple linear regression of temperature anomaly (data.giss.nasa.gov/gistemp) vs CO2 levels (from the Mauna Loa observatory) for the period 1960 through 2017. There was a correlation, and the regression equation was Y = -3.367 + 0.0104X, with R^2 = 0.898 and P = 1.74e-29, where X = CO2 concentration in ppm and Y is the temperature anomaly (vs the 1950-1981 baseline) in deg. C. So the linear fit looked pretty good to me. I didn’t look at the earlier period; Mauna Loa observatory data starts around 1959. I would post the graph, but I don’t know how to do that on this blog.
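The shape of this regression is easy to reproduce with synthetic data built from the commenter’s own equation; the CO2 range and noise level below are assumptions, not the real series.

```python
import numpy as np

rng = np.random.default_rng(7)
co2 = np.linspace(317, 407, 58)                  # rough Mauna Loa span, 1960-2017
# commenter's fitted relation plus assumed scatter
temp = -3.367 + 0.0104 * co2 + rng.normal(0, 0.09, co2.size)

b, a = np.polyfit(co2, temp, 1)                  # slope, intercept
resid = temp - (a + b * co2)
r2 = 1.0 - resid.var() / temp.var()              # high R^2, echoing the comment
```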

[Response: I prefer readers post a link to a graph.]

Rather than correlating temperature with CO2 in ppm, you should correlate it with the base-2 logarithm of CO2 … because climate sensitivity for CO2 is proportional to the number of doublings of concentration.

It seems to work very well. I have seen this on David Appell’s blog, and I think he got the idea from here:

https://climategraphs.wordpress.com

It was also the subject of a thread at Neven’s sea ice forum. Here’s Cowtan and Way, and Berkeley Earth, 1850-present (nearly 170 years!) in a straight line:

log co2 vs temperature

I’d love to see Tamino’s take on this.

Ned, to be clear, the base of the logarithm doesn’t matter.

Ned — In principle, I agree with your comment. However, over the range of values of CO2 and temperature since 1960, it doesn’t make a difference. The linear fits using CO2, log10(CO2), or log2(CO2) appear almost identical, with almost identical R^2 and P values. So if your focus is what is happening, or likely to happen, over the lifetime of current readers of this blog, a simple linear fit with CO2 is likely to tell the story.
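That point is easy to check numerically: over the observed range (roughly 317–410 ppm, an assumption here), log2(CO2) is almost perfectly correlated with CO2 itself, so regressions against either can hardly differ.

```python
import numpy as np

co2 = np.linspace(317, 410, 200)          # roughly Mauna Loa's range since 1960
r = np.corrcoef(co2, np.log2(co2))[0, 1]
# r is essentially 1: over this narrow range the logarithm is nearly linear
```

The distinction between CO2 and log2(CO2) only starts to matter over much wider concentration ranges, where the curvature of the logarithm becomes visible.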

No surprises. Sobering data and analysis. Thanks for your beautiful work and solid communication of same.

I also did the linear regression with Keeling and HadCRUT4 and basically got the same results for 1960-2016. I used annual data. When I split this time series into three equally long periods, things became a bit confusing.

R^2 for 1960-2016 = 0.85, R^2 for 1960-1977 = 0.04, R^2 for 1978-1997 = 0.54, and R^2 for 1998-2016 = 0.46. In other words, a very strong relation between CO2 and temperature for the whole time series, and weak ones for the three equally long periods into which it was divided. What to think about it?

Raymond, Isn’t that precisely what you’d expect for a long-term trend on a noisy (and auto-correlated) signal?
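A quick synthetic check illustrates this reply: a steady trend plus noise gives a strong R^2 over the full span but much weaker ones over short sub-windows. The trend and noise levels below are assumptions, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.arange(57, dtype=float)                  # ~1960-2016, annual steps
y = 0.017 * x + rng.normal(0, 0.1, x.size)      # assumed trend plus noise

def r_squared(xs, ys):
    b, a = np.polyfit(xs, ys, 1)
    r = ys - (a + b * xs)
    return 1.0 - r.var() / ys.var()

full = r_squared(x, y)                           # strong over the full span
thirds = [r_squared(x[i:i+19], y[i:i+19]) for i in (0, 19, 38)]
```

On a 19-year window the trend accumulates little change relative to the noise, so the fraction of variance explained drops sharply even though the underlying relation never changes.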

I think “accolades” in your penultimate paragraph should be “acolytes”?

[Response: Yes, of course. Thanks. Fixed.]

And “straight line fite” -> “straight line fit” (typo).