Second-Least-Cold Year

At the risk of repeating myself, I’ll repeat myself. Global temperature is a combination of trend and fluctuation.

The fluctuations make it jitter about from year to year (or month to month, or day to day, or whatever), but the trend of late has been steadily upward. It’s called global warming (although some might prefer “global un-cooling”). We even know some (but not all!) of the causes of the fluctuations: things like the El Niño Southern Oscillation (ENSO), massive volcanic eruptions, and variations in the output of the sun itself. But the fluctuations don’t last. The trend, however, continues upward unabated.

Earth’s hottest year on record remains 2016, when the warming from man-made climate change combined with extra heat from the strong El Niño of that year. But even without an El Niño, this year is likely to come in a strong 2nd. Here are yearly averages since 1950 from NASA, with the 2019 value year-to-date (January through October):

I regularly estimate the impact of things like el Niño, volcanic eruptions, and solar variations on global temperature. Then I can remove those temporary influences for a clearer picture of the long-term changes: the global warming. I did this with the latest NASA data, and here they are plotted as red triangles (un-adjusted data still as blue x’s):
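For readers curious about the mechanics: the adjustment amounts to a multiple regression of temperature on a trend plus the exogenous factors, then subtracting only the fitted exogenous part. Here is a minimal sketch, not the actual analysis — in real work the predictors are specific series (e.g. MEI for ENSO, aerosol optical depth for volcanoes, TSI for the sun), with lags estimated rather than assumed:

```python
import numpy as np

def adjust_temperature(temp, enso, volcanic, solar):
    """Remove estimated ENSO, volcanic, and solar influences from a
    temperature anomaly series via multiple regression.  All inputs are
    equal-length 1-D arrays; the three predictor series stand in for
    indices like MEI, aerosol optical depth, and TSI anomalies."""
    t = np.arange(len(temp), dtype=float)
    # Design matrix: intercept, linear trend, and the three exogenous factors.
    X = np.column_stack([np.ones_like(t), t, enso, volcanic, solar])
    beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
    # Subtract only the fitted exogenous contributions,
    # leaving the trend plus unexplained residual intact.
    exogenous = X[:, 2:] @ beta[2:]
    return temp - exogenous
```

Note that only the fitted exogenous part is removed; the trend itself (and whatever fluctuation the predictors don’t explain) is deliberately left in, which is why the adjusted series still shows the warming.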

Interestingly, while the raw-data 2019 value will probably come in 2nd, the adjusted-data 2019 value will take over 1st place.

NOAA data tell a slightly different story. The raw data (blue x’s) have 2019-so-far in 2nd place (just barely beating 3rd place), and the adjusted data (red triangles) have 2019-so-far also in 2nd place (just barely failing to beat 1st place).

Turning to the data from HadCRU (the Hadley Centre/Climatic Research Unit in the U.K.), we find that 2019-so-far only makes it to 3rd place, behind 2016 and 2015, but when adjusted it takes over 1st place:

Cowtan & Way provide a modified version of the HadCRU data which, like NASA’s, has 2019-so-far in 2nd place and adjusted 2019-so-far in 1st place:

Finally, the global estimate from the Berkeley Earth Surface Temperature project imitates NASA and Cowtan & Way, with 2019-so-far in 2nd place, adjusted-2019-so-far in 1st place:

Four of the five data sets have 2019 coming in 2nd place, only HadCRU has it 3rd. Four of the five adjusted data sets have 2019 coming in 1st, only NOAA puts it second.

Thanks to those who have generously supported this blog. If you’d like to help, please visit the donation link below.

This blog is made possible by readers like you; join others by donating at My Wee Dragon.


22 responses to “Second-Least-Cold Year”

  1. Susan Anderson

For the scientifically challenged (but not entirely innumerate or blind) like me, who are grateful to the likes of Tamino for doing the science, I would suggest that it’s very simple.

    Just scroll down and look at the pictures. It’s called acceleration.

Uncertainty is not our friend. But certainty, while delayed and henceforth even more dangerous than people think, is arriving fast. Here are the four monkeys:

  2. I wonder if the temp increase is happening as it was projected to occur. In order to evaluate this, it would be useful to have the projections for 2020 that were made in 1990, again in 2000, and again in 2010.

    If this were done, I think it would be clear that global temp rise is happening faster than anticipated, but I am not certain of that. If it turns out to be the case that our decadal projections have been consistently low, it would be helpful to have that consistent underestimation issue front and center every time that climatologists present their projections. Almost nobody likes to deliver bad news for a variety of reasons, but if the facts and analysis of same are bad news, I think we need to discuss that in no uncertain terms. The denialists/lukewarmers will do their best to shoot the messenger, but maybe the message is important enough to risk it.

Thanks for the good work you are doing.



I posted something elsewhere the other day about the folks who still think emails stolen ten years ago are still relevant. They’re not. Some still go on as if “the pause” was real, too. The problem with the denial industry is that it gets harder and harder to shift their goal posts to keep up with reality.

In 1999, the warmest year in the instrument record was 1998, and it was so unusual it made a great cherry for deniers to pick as the start of graphs for years to follow (especially if they stuck it up in the troposphere).

Once 2019 ends, 1998 might still be in tenth place in some of the records, but it will be off the list in others. Along with the groups Tamino covered, here’s the Japan Meteorological Agency’s global temperature information.

    Under the graph, note the five warmest years. Hard for the denial sites to do much when the SIX warmest years in the record are the six current years – and in some cases make the list without help from El Niño, like the boost 1998 needed.

    1st. 2016(+0.45°C), 2nd. 2015(+0.42°C), 3rd. 2017(+0.38°C), 4th. 2018(+0.31°C), 5th. 2014(+0.27°C)

    Back in 1988 when Hansen predicted the signal would rise above the noise at the turn of the century, maybe some people forgot that he wasn’t meaning the signal would then sink back down into noise.

    • b fagan noted: “Under the graph, note the five warmest years. Hard for the denial sites to do much when the SIX warmest years in the record are the six current years – and in some cases make the list without help from El Niño, like the boost 1998 needed.”

I think when you look at 2019 concluding as the second hottest year in the records with only a very modest El Niño bump, you might conclude that the signal has risen above the noise. There is certainly still fluctuation/jitter, but “the trend of late has been steadily upward.”

      The trend may turn out to have a signal path that looks like a hockey stick. That would be a bitch, would it not? Nothing to do but watch the temps for another decade to see if that steady upward trend continues and, perhaps, accelerates.

    • @B Fagan, and all,

The thing of it is, a lot of the public is uninterested in quantitative and other direct evidence, favoring “gut feel” and the shared opinions of their friends.

      Don’t know what to do about that.

While there are massive amounts of income inequality, it is also true that many people are terrible managers of their own monies, another area of high self-interest which demands respect for quantitative evidence.

      • Susan Anderson

@ecoquant, sadly, given the effects of delay on a problem that should have been stopped decades ago, there is something happening that is doing something about that.

        Reality is breaking in. You don’t need to be a scientist to notice the accumulation of extremes.

I participate sporadically at Wunderground’s Category 6, and Jeff Masters has started a blog at Scientific American that continues his work. Since climate is weather measured over space (the globe and large segments of it, plus the atmosphere) and time (the longest possible, but at least decades), weather is useful to bring laypeople to question their “team” when the lies pile up.

      • @Susan Anderson,

It’s interesting, but even some climate scientists don’t appear to be doing the statistics of these extremes correctly. The extremes are probably even more likely than their calculations show, because these calculations should use GEV (generalized extreme value) distributions, nothing like Gaussians. Arguably, a single occurrence of a very rare event suffices to invalidate a Gaussian or even a Gamma model.

This is important, because if a lukewarmer uses a Gaussian model, their tail densities will by definition be inappropriate.

        I mention this all in part because there’s a nice review article of these by Lee Fawcett in the current issue of Significance, an article I enjoyed while eating breakfast today.
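To make the GEV-versus-Gaussian point concrete, here is a minimal sketch comparing fitted tail probabilities for a sample of block maxima, using scipy’s `genextreme`. The data, threshold, and function name here are purely illustrative, not anyone’s actual analysis:

```python
import numpy as np
from scipy.stats import genextreme, norm

def tail_prob_comparison(maxima, threshold):
    """For a sample of block maxima, compare the exceedance probability
    P(X > threshold) under a fitted GEV model versus a fitted Gaussian.
    Illustrative only: real extreme-value work needs care with block
    choice, fit diagnostics, and uncertainty in the shape parameter."""
    # Fit a GEV; note scipy's shape c has the opposite sign of the usual xi.
    c, loc, scale = genextreme.fit(maxima)
    p_gev = genextreme.sf(threshold, c, loc=loc, scale=scale)
    # Fit a Gaussian to the same maxima (the model a "lukewarmer" might use).
    mu, sigma = norm.fit(maxima)
    p_norm = norm.sf(threshold, loc=mu, scale=sigma)
    return p_gev, p_norm
```

The Gaussian is simply the wrong family for block maxima, so however the two numbers compare for a given sample, only the GEV exceedance probability is meaningful.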

Probably a bit of luck, but I would still say that the 1982 predictions found in the link below were quite good. Prediction for 2015: CO2 concentration of 407 ppm and 0.8 degrees of warming.

On the models, the steady upward trend, etc., it might be a good time to note that a lot of the models say that ECS is 5 degrees C or more:

    “For nearly 40 years, the massive computer models used to simulate global climate have delivered a fairly consistent picture of how fast human carbon emissions might warm the world. But a host of global climate models developed for the United Nations’s next major assessment of global warming, due in 2021, are now showing a puzzling but undeniable trend. They are running hotter than they have in the past. Soon the world could be, too.

    In earlier models, doubling atmospheric carbon dioxide (CO2) over preindustrial levels led models to predict somewhere between 2°C and 4.5°C of warming once the planet came into balance. But in at least eight of the next-generation models, produced by leading centers in the United States, the United Kingdom, Canada, and France, that “equilibrium climate sensitivity” has come in at 5°C or warmer. Modelers are struggling to identify which of their refinements explain this heightened sensitivity before the next assessment from the United Nations’s Intergovernmental Panel on Climate Change (IPCC). But the trend “is definitely real. There’s no question,” says Reto Knutti, a climate scientist at ETH Zurich in Switzerland. “Is that realistic or not? At this point, we don’t know.”

    • @smallbluemike,

Okay, but electing a subset of models which have high ECS — or any other statistic — is not how ensemble models are supposed to be used. Rather, the population of numbers and its density are what matter; to the degree that measures of central tendency like mean, median, and mode are informative, those are useful.

That noted, what can be said is that the density of ECS given by the ensemble has significant probability mass above 5°C, and that oughtn’t be ignored in favor of an obsession with central tendency. Indeed, for asymmetric densities like that of ECS it’s critical to consider the dispersion of estimates, and the same asymmetry means standard deviation is a poor characterizer. I’d recommend the HDPI (highest density probability interval) but, in any case, someone who sees a mean <3C in such densities and concludes "It's all good" either does not, in my opinion, know how to read a graph or is unfamiliar with how ensemble methods work, present company excepted, of course.
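A minimal sketch of the kind of interval ecoquant means, computed empirically from ensemble samples as the shortest interval covering a given probability mass (the function name and inputs are placeholders, not from any particular analysis):

```python
import numpy as np

def highest_density_interval(samples, mass=0.90):
    """Empirical highest-density interval: the shortest interval
    containing `mass` of the sample.  For a skewed density, like an
    ensemble's ECS estimates, this is more informative than
    mean +/- standard deviation."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    k = int(np.ceil(mass * n))          # points the interval must cover
    widths = x[k - 1:] - x[:n - k + 1]  # width of every candidate interval
    i = int(np.argmin(widths))          # index of the shortest one
    return x[i], x[i + k - 1]
```

Because it hugs the region of highest density, the interval is asymmetric about the mean for a skewed sample, which is precisely the point.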

  6. b fagan

    You wrote above concerning JMA’s yearly temperature ranking and El Nino’s influence.

    An interesting comparison is that of
    – El Nino in the Multivariate ENSO Index

    – JMA’s global yearly temperatures as latitude/longitude grids for
(1) 2015/16 (and 2016 ditto)
(2) 1997/98 (and 1998 ditto)

    If this so-called SUPER El Nino in 2015/16 was, according to MEI, a lot weaker than the 1997/98 edition, why then are there so many more red dots for 2015/16?

  7. I wonder what this looks like if a person did a little smoothing of one sort or another?

    • Ooh, don’t say it!

      ‘Cause one part of that plot doesn’t look like the other.

      Decadal binning has to be about the simplest analysis you can do. But it’s pretty telling, isn’t it? Even just for CONUS.
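Decadal binning really is about the simplest analysis there is; a sketch (the array names and data are placeholders):

```python
import numpy as np

def decadal_means(years, anomalies):
    """Average yearly temperature anomalies into decade bins
    (1950s, 1960s, ...).  Returns (decade_start_years, decade_means)."""
    years = np.asarray(years)
    anomalies = np.asarray(anomalies, dtype=float)
    decades = (years // 10) * 10            # 1957 -> 1950, 1968 -> 1960, ...
    starts = np.unique(decades)
    means = np.array([anomalies[decades == d].mean() for d in starts])
    return starts, means
```

Averaging over ten-year bins crushes most of the year-to-year jitter, which is why even this crude treatment makes the trend hard to miss.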

It reminds me of a certain sports stick, but sometimes a sports stick is just a sports stick. It doesn’t always pay to read too much into a thing.

      • I know. Especially when the sport is at risk over much of its historical range, and may require artificial ice to survive.

      • A cricket bat!
        A hurling stick?
        A niblick?
        Ooh! A caber, like the Highlanders toss! It’s a big straight line, like so many charts some people reference.
        Well, don’t tell me. I’ll get it.
        Hmm. Doc said artificial ice.