Many things affect global temperature; the three best-known, other than greenhouse gases, are the El Niño–Southern Oscillation (ENSO), atmospheric aerosols from volcanic eruptions, and variations in the sun's output. We can use historical data to estimate how strongly these factors affect global temperature. I've done so in the past, and by request here's an updated version which includes recent data.
ENSO, when in its warm El Niño phase, warms up the atmosphere (and hence our weather), but in its cool La Niña phase cools us down. Large volcanic eruptions put sulfates in the atmosphere, cooling us off for a few years. When the sun gets hotter or cooler, Earth tends to as well. Here are the estimated impacts of these factors on global temperature (using temperature data from NASA) since 1950 (where "MEI" stands for the Multivariate ENSO Index):
The effect of all three factors is statistically significant, although the impact of solar variation tends to be much smaller than that of either volcanoes or ENSO. When we add all three factors together, we get an estimate of their combined impact:
The strongest peak warming due to these factors came during the super-strong El Niño event leading up to 1998, while the peak cooling was in 1976, when all three combined to cool us off: volcanic eruptions, the cool (La Niña) phase of ENSO, and the cool part of the solar cycle.
We can subtract the estimated effect of these factors to see how temperature has changed apart from these known factors. I'll show both the NASA data as is, and the NASA data with the impact of known factors removed. I think it's clearer when showing yearly averages rather than monthly data (note: 2018 isn't complete yet), so here goes (open blue dots for data as is, filled red for adjusted data with known factors removed):
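The estimate-and-subtract step is essentially a multiple regression. Here's a minimal sketch of the idea using synthetic data, since the post doesn't include code; the series, coefficients, and noise levels below are my own placeholders, not the actual data or fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the real series (all values here are
# illustrative, not the data used in the post).
n = 12 * 60                                   # monthly data, ~1950-2010
t = np.arange(n) / 12.0
mei = rng.normal(0, 1, n)                     # ENSO index stand-in
aod = np.maximum(0, rng.normal(-1, 0.6, n))   # occasional volcanic aerosol spikes
tsi = 0.5 * np.sin(2 * np.pi * t / 11)        # ~11-year solar cycle stand-in
temp = (0.018 * t + 0.07 * mei - 0.25 * aod + 0.05 * tsi
        + rng.normal(0, 0.08, n))             # "observed" anomaly

# Multiple regression: anomaly on a linear trend plus the three factors.
X = np.column_stack([np.ones(n), t, mei, aod, tsi])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)

# The estimated natural contribution is the fitted MEI + volcanic + solar
# part; subtracting it gives the "adjusted" series.
natural = X[:, 2:] @ beta[2:]
adjusted = temp - natural
print(beta[2:])                    # estimated MEI, volcanic, solar coefficients
print(temp.std(), adjusted.std())  # adjusted series fluctuates less
```

In practice the regressors are usually lagged (the atmosphere responds to ENSO and eruptions with a delay of a few months), which this sketch omits.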
The difference seems rather plain: after removing the impact of these known factors, we end up with a temperature series with less fluctuation but about the same trend. One way to estimate the present trend rate is by fitting a "piecewise linear" model, which is just two straight lines that meet at their common endpoint:
Before adjustment the estimated trend rate (now) is 1.83 +/- 0.21 °C/century; after adjustment it's 1.75 +/- 0.13 °C/century; the difference is not statistically significant. The standard deviation (a measure of the amount of fluctuation) of yearly averages before adjustment is 0.094 °C; after adjustment it's 0.057 °C.
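The two-lines-meeting-at-a-breakpoint model can be fit by ordinary least squares using a "hinge" term. Here's a sketch on synthetic data; the breakpoint year, rates, and noise level are illustrative placeholders, not the values fitted in the post.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic yearly anomalies: flat before 1975, warming after
# (breakpoint and rate are placeholders, not the post's fitted values).
years = np.arange(1950, 2018)
t0 = 1975
true = 1.8 * np.maximum(years - t0, 0) / 100   # 1.8 °C/century after t0
y = true + rng.normal(0, 0.06, years.size)

# Design matrix: intercept, slope, and a hinge term that adds extra
# slope after the breakpoint while keeping the fit continuous at t0.
hinge = np.maximum(years - t0, 0)
X = np.column_stack([np.ones(years.size), years - 1950, hinge])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

rate_before = beta[1] * 100             # °C per century before 1975
rate_after = (beta[1] + beta[2]) * 100  # °C per century after 1975
print(rate_before, rate_after)
```

With the breakpoint fixed this is an ordinary linear regression; letting the data choose the breakpoint requires a search over candidate years and changes the significance calculations.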
I’ve analyzed the adjusted data, just as I have the “as is” data, to look for any significant sign of deviation from the piecewise-linear trend (in particular, anything else we can say with confidence since about 1975). It’s not there. Just for a clear view, here’s the adjusted data with its estimated trend:
That doesn’t mean it has followed two straight lines since 1950, but it does mean we don’t have enough statistical evidence to claim that it hasn’t.
I’m also interested in how this looks using data over land areas only. I’ll use two different data sets, in part to show how well they agree with each other, and partly as a response to the recent “study” of Hadley Centre/Climate Research Unit data (the CRUTEM4 data for land areas only) referred to by a reader. The “study” calls itself an audit, but in my opinion it’s nothing more than a hatchet job. I’ll also use the land-only estimate from the Berkeley Earth Surface Temperature project.
Here they both are (annual averages, note that 2018 isn’t complete yet and I’ve set them to the same baseline) “as is”:
Here they both are after estimating and removing the impact of ENSO, volcanoes, and solar variations:
Again it’s abundantly clear that the two data sets are in excellent agreement. The Berkeley data show greater fluctuation, so let’s take a look at just that. Here’s the Berkeley data for land areas only, together with a piecewise-linear trend approximation in blue and a smooth-fit trend approximation in red:
Of particular interest is the rate of warming, which we can approximate using either the smooth-fit or piecewise-linear fit. Here are the results:
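A smooth-fit trend in the lowess spirit can be sketched as a Gaussian-weighted local linear regression, which delivers both a smoothed value and a local rate of change at any chosen time. This is my own minimal stand-in on synthetic data, not the post's actual smoothing method or its results.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic yearly series with a gently accelerating trend (illustrative).
years = np.arange(1950, 2018).astype(float)
y = 0.0001 * (years - 1950) ** 2 + rng.normal(0, 0.06, years.size)

def local_linear(t, x, yv, h=10.0):
    """Gaussian-weighted local linear fit at time t with bandwidth h.
    Returns (smoothed value, local slope)."""
    w = np.exp(-0.5 * ((x - t) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - t])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ yv)
    return beta[0], beta[1]

val, slope = local_linear(2010.0, years, y)
print(val, slope * 100)   # smoothed anomaly and its rate in °C/century
```

The local slope here uses only nearby points, which is why its probable error is larger than the slope of a straight line fit through a whole segment of data.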
An important thing to note is that the rates based on the piecewise-linear fit are really the average rates over each piece, which is why they have smaller probable errors than the rates based on the smooth fit.
Just for those who are interested in numbers, here is the data (both “as is” and adjusted for ENSO, volcanic, solar) for the NASA data covering the entire globe:
Important note: WordPress won’t allow me to upload a “csv” file, so I changed the suffix to “xls” to get it uploaded. Change the name suffix from “xls” back to “csv” and treat it as a csv.
This blog is made possible by readers like you; join others by donating at My Wee Dragon.
So, the global warming rate over land area has been approx. 2.7 °C / century for the last 25 years? That’s even a “little bit” more alarming than the 1.8 °C / century-value for total global warming …
Yes. One of the things that ‘everybody knows’ but most of the time forget to think about is that warming over land is much greater. So, for instance, hitting the 2 C warming mark globally would imply a mean change over land of ~4 C.
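The arithmetic behind "2 °C globally implies ~4 °C over land" is just an area-weighted average. A quick check, with the land/ocean fractions and the 4 °C land figure taken as rough illustrative values:

```python
# The globe is ~29% land, ~71% ocean; air over land warms faster than
# air over the ocean. These numbers are rough, for illustration only.
land_frac, ocean_frac = 0.29, 0.71
dT_land = 4.0     # hypothetical mean warming over land
dT_global = 2.0   # global mean warming

# Solve land_frac*dT_land + ocean_frac*dT_ocean = dT_global for dT_ocean.
dT_ocean = (dT_global - land_frac * dT_land) / ocean_frac
print(round(dT_ocean, 2))  # implied mean warming over the ocean, ~1.18 °C
```

So a 2 °C global average with 4 °C over land is consistent with only about 1.2 °C of warming over the ocean surface.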
The naive always think it doesn’t sound like much, partly because the natural tendency is to compare the change to daily variability, which in temperate zones may easily be 15 degrees, or even much more in drier areas. But, to bring in a personal instance, this summer in mid-South Carolina has been persistently warm, with very few days failing to reach a high of 90 degrees F, from May well into October. I really, seriously, wouldn’t relish a situation where that daily max would be 97.2 F for months on end–nor where the really warm days would be hitting 111.2, instead of ‘just’ 104. At 80% relative humidity, and nominally normal pressure of 1000 hPa, that’s a wet bulb temperature of over 104 F, which carries a serious risk of lethality if exposure is prolonged.
https://www.weather.gov/epz/wxcalc_rh
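The commenter's wet-bulb figure can be sanity-checked without the linked calculator using Stull's (2011) empirical approximation, which is valid near standard sea-level pressure. The function below is my own sketch, not taken from the post or the NWS calculator.

```python
import math

def wet_bulb_stull(T_c, rh):
    """Stull's (2011) empirical wet-bulb formula: T_c in °C, rh in %.
    Valid near standard sea-level pressure, RH roughly 5-99%."""
    return (T_c * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(T_c + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

T_f = 111.2                        # the commenter's hot-day scenario
T_c = (T_f - 32) * 5 / 9           # 44 °C
tw_c = wet_bulb_stull(T_c, 80.0)   # at 80% relative humidity
tw_f = tw_c * 9 / 5 + 32
print(round(tw_f, 1))              # a bit over 104 °F, as stated
```

The result agrees with the comment: a wet-bulb temperature in the vicinity of 105 °F, well into the range where prolonged exposure is dangerous.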
The other, larger, factor is manmade aerosols (coal smoke, etc.), though they have a more consistent impact than volcanoes, El Niño, and the Sun. My understanding is that manmade aerosols are “hiding” about half of the global warming signal.
Incoming radiation is about equal on land and ocean surfaces at any given latitude. Land surfaces re-radiate almost everything back into the atmosphere; oceans dump a good part of the energy into the depths below, where it is hidden for a very long time. Hence the difference in the way the air over the two surfaces warms up, weighted 70% in favor of oceans. I have been writing about this in my Climate Letters lately, with some good charts, at http://www.climatecarl.com/
Instead of piecewise linear, why not a quadratic? (Not a rhetorical question.)
[Response: Using just the data since 1950 (as in this post), I see little reason to prefer one over the other. A modified lowess smooth (with a good choice of time scale) lies between the two.
But using the full data set, I think a piecewise-linear model (with more pieces) comes out preferable to either a high-order polynomial (you’d have to go higher than quadratic) or a “piecewise quadratic (or whatever)” model. I’m also always reluctant to use high-order polynomials because they’re great in the heart of the data, but tend to explode unrealistically near the endpoints.
Do bear in mind that the piecewise-linear model was really implemented in the first place to test whether or not there is a *demonstrable* slope change since 1975.
All of this emphasizes that statistics doesn’t allow us to say it *is* piecewise-linear or quadratic (since 1950), it just allows us to show that you can’t (yet) prove it’s not.]
Thanks for this exercise. As I was looking at the graph with the 3 natural factors removed, I started to wonder why there is still variation. Presumably, part of the answer is that there are still other variable natural factors that aren’t accounted for. If those could be understood, would we be able to produce a graph of human-forced changes that looked almost like a smooth line? Just wondering if the still-noticeable ups and downs indicate a lack of knowledge about the many other natural variables (mind you, it’s also possible that what we call “natural” might be affected by what we’re doing to the climate).
Another thought: if we split out the northern and southern hemispheres, what do the data show? I’ve seen some graphs of this split on GISTEMP and it does seem that the SH warming isn’t as rapid as the NH. I know Michael Mann estimated, a couple of years ago, that the NH had already warmed 1.2C by the base period usually used to represent pre-industrial, so the NH is much closer to 1.5C (and 2C) than is generally reported. I’m not sure how “current” average anomalies are calculated, but, from those GISTEMP graphs, the NH looks to be about 1.4C (on the usual baseline, 1.6C on the Mann baseline) and the SH is about 0.8C (on the usual baseline, unknown for true pre-industrial baseline). So it looks like we could be at 1.1C globally, even on the usual baseline. I’d love to know what scientists calculate the real figures to be.
The use of the MEI index has a disadvantage: this index has a warming trend because it is not corrected for the warming of the ocean. It would be better to use a detrended version of this index (or the ONI index, which is corrected for the long-term warming in the NINO3.4 region).
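The commenter's suggested fix, detrending the index before using it as a predictor, is straightforward. A sketch with a synthetic MEI-like series (the trend size is illustrative, not an estimate of the real one):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic MEI-like index with a spurious warming trend mixed in.
months = np.arange(12 * 68)          # monthly, 1950 onward
mei = rng.normal(0, 1, months.size) + 0.002 * months

# Remove the linear trend before using the index as an ENSO predictor,
# so the regression can't attribute the warming trend itself to ENSO.
slope, intercept = np.polyfit(months, mei, 1)
mei_detrended = mei - (slope * months + intercept)
print(np.polyfit(months, mei_detrended, 1)[0])  # ~0: trend removed
```

Using the detrended index keeps the ENSO regressor from being collinear with the long-term trend term in the regression.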