Hansen’s 1988 Predictions

On another forum entirely, discussion arose of Jim Hansen’s 1988 computer model simulation and its prediction for future temperature change. Talk centered around a graph from this post by the GWPF (Global Warming Policy Foundation):


[Figure: GWPF graph comparing Hansen's 1988 scenarios to observed temperature]

Hansen estimated future temperature based on three scenarios of possible greenhouse gas emissions: high (scenario A), medium (scenario B), and low (scenario C). The GWPF states that


“… not only has the world matched the ‘BAU’ growth of the 15 years prior to the 1988 testimony, we have increased the CO2 emission tonnes growth from 1.8% per year to 2.2% (the 15 years prior to 2013). To put those numbers into context, from 1972 through 1987, humans emitted 302 billion tonnes of CO2; in contrast, from 1998 through 2012 humans produced 461 billion tonnes.”

Therefore, they suggest, temperature should have increased at least as fast as the fastest simulation. But it hasn’t. On that basis, one person concluded


“It seems that if CO2 was the culprit, then the temperature would have done what he predicted.”

I’m quite sure that’s exactly what the GWPF wants him to believe.

At the outset, it seems ridiculous to suggest that “Empirical evidence confirms failed performance of global climate models” based on one run each of three scenarios from 25 years ago. But that’s hardly the biggest problem with the GWPF presentation.

The biggest problem is that the GWPF has presented the ultimate simpleton’s viewpoint of the situation. They imply that temperature had to come closest to Hansen’s scenario A because CO2 increase has most closely matched Hansen’s scenario A. But global temperature is affected by a lot more than just CO2; it is affected by the entirety of climate forcing, of which CO2 is only one component.

Here are the net climate forcings according to the 2005 IPCC report:

[Figure: IPCC chart of net radiative forcings]

Notice that even among man-made greenhouse gases, CO2 isn’t the only forcing. There’s also methane (CH4), NO2, and a variety of halocarbons. Then there are the aerosols, which have a cooling effect on climate.
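For readers who want a rough feel for how concentrations translate into forcing, here is a minimal sketch using the simplified expressions of Myhre et al. (1998). These postdate Hansen's 1988 paper and are not the exact formulas his model used, and the CH4 term here omits the CH4–N2O spectral-overlap correction, so treat the numbers as ballpark figures only:

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def ch4_forcing(m_ppb, m0_ppb=722.0):
    """Simplified CH4 forcing in W/m^2, ignoring the CH4-N2O
    spectral-overlap term (so this slightly overestimates)."""
    return 0.036 * (math.sqrt(m_ppb) - math.sqrt(m0_ppb))

# CO2 at ~400 ppm against a ~278 ppm preindustrial baseline:
print(round(co2_forcing(400.0), 2))   # ~1.95 W/m^2
# CH4 at ~1800 ppb against a ~722 ppb preindustrial baseline:
print(round(ch4_forcing(1800.0), 2))  # ~0.56 W/m^2
```

The logarithm in the CO2 term is why forcing grows more slowly than concentration; the square root plays a similar role for methane.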

How do their changes compare to the forcing used in Hansen’s 1988 predictions? Here’s a comparison of the methane concentration (the 2nd-most-important man-made greenhouse gas) used in those models, to observed methane concentration since 1983 (measured at the Mauna Loa atmospheric observatory):

[Figure: methane concentration in Hansen's scenarios vs. Mauna Loa observations since 1983]

It turns out that methane has increased more slowly than the slowest of the scenarios. If we do like the GWPF did and account for only one climate forcing, but use methane instead of CO2, then temperature should have increased less than the slowest of the scenarios. The GWPF certainly didn’t mention that. Just my opinion, but I wouldn’t put it past them to neglect mentioning it even if they knew. And I wouldn’t be at all surprised if they didn’t even know.

Let’s not play the one-climate-forcing-only game. Let’s look at data for all climate forcings, and compare what was used in Hansen’s scenarios to best estimates of what has actually happened since the prediction began. This was done 7 years ago on RealClimate, but we’ve learned more about climate forcing since then (and we have more data), so let’s update that comparison.

As in the RealClimate post, I’ll leave out volcanic forcing because it’s short-lived, and the prediction can’t be expected to forecast volcanic eruptions anyway. Here’s the climate forcing used in the three scenarios compared to actual climate forcing since the prediction began:

[Figure: climate forcing in Hansen's three scenarios vs. actual forcing since 1984]

Notice that I aligned the forcing estimates at 1984, because that’s when the prediction began. The paper was published in 1988, but the models had to be run (which took a lot longer in those days) before that, the paper had to be written, and reviewed, then published. The actual prediction started in 1984.

Notice also that actual forcing follows scenario C more closely than the other scenarios. On that basis, we should expect observed temperature change to match scenario C more closely than the other scenarios. And it does.

In fact, the match of observed temperature to Hansen’s scenario C is quite good. Damn good. So good that I suspect the match isn’t entirely due to the excellence of the model; in part it was coincidence. Regardless of that, to claim on the basis of these 25-year-old model runs that “Empirical evidence confirms failed performance of global climate models” is downright ludicrous. Really. Ludicrous.

Think about how much work I had to do to track down the data and produce this analysis. And I’ve been analyzing climate data for years! Now think about the guy in the discussion who is new to the climate issue. How easy would it be for him to ferret out the truth about this? Fortunately, he seems to have an open mind and to want to know the truth regardless of political considerations. But given the complexity of the situation, given the tremendous amount of information which the GWPF failed to mention (and maybe doesn’t even know), what chance does he have to get at the truth in this situation? The only thing that’s “easy” about this is how ridiculously easy it was for the GWPF to give the wrong impression.

But the ridiculously simplistic viewpoint about climate forcing isn’t the only problem with the GWPF presentation. Notice that at the top of their graph they state “The NASA/GISS 2012 “correction” revisions of +0.05° per year average increase adjusted for.” That’s their way of saying (with appropriate “scare quotes”) that they’ve subtracted 0.05° from the temperature data according to NASA/GISS. They changed the data to make observed temperature seem lower. That sounds like cheating to me.

Look also at the line stating “HadCRUT3 Observed Annual Temperature Anomaly – prior to 2010, every 5 years.” Does that make you wonder why they chose to plot it only for every 5 years?

Let’s compare the temperature data from NASA GISS to that from HadCRUT3 using all the years from 1880 through 2013:

[Figure: NASA GISS vs. HadCRUT3 annual temperature anomalies, 1880-2013]

Notice a difference? The HadCRUT3 data are consistently lower than the NASA GISS data. That’s because the two organizations use different baselines — NASA data are the temperature difference from the 1951-1980 average while HadCRUT3 data are the temperature difference from the 1961-1990 average. Comparing the two without compensating for the different baselines is like comparing an infant’s height above the ground (which is a positive number) to my height above Shaquille O’Neal (which is negative because he’s taller than I am). To compare our heights properly, you’d have to use the same baseline. To compare NASA GISS and HadCRUT3 temperatures properly, you’d have to use the same baseline. The GWPF didn’t.
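Putting two anomaly records on a common baseline is just a bit of arithmetic. Here is a minimal sketch (with made-up toy numbers, not the real GISS or HadCRUT3 data) of how re-baselining removes the spurious offset:

```python
def rebaseline(years, anoms, base_start, base_end):
    """Shift an anomaly series so its mean over the baseline period is zero."""
    base = [a for y, a in zip(years, anoms) if base_start <= y <= base_end]
    offset = sum(base) / len(base)
    return [round(a - offset, 3) for a in anoms]

# Toy example: the same warming signal reported against two baselines.
years = list(range(1950, 2000))
signal = [0.01 * (y - 1950) for y in years]            # a steady trend
giss_style = rebaseline(years, signal, 1951, 1980)     # 1951-1980 baseline
hadcrut_style = rebaseline(years, signal, 1961, 1990)  # 1961-1990 baseline

# The two series now differ by a single constant offset, not in substance:
diffs = {round(g - h, 3) for g, h in zip(giss_style, hadcrut_style)}
print(diffs)   # one constant value
```

Any differences that remain after re-baselining reflect genuine disagreement between the records, not an accounting artifact.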

I find it very hard to believe that they didn’t know about this. If they truly didn’t, then they’re so ignorant about temperature data they have no business discussing the issue at all. If they did know, but failed to compensate or even to mention it because that makes the HadCRUT3 data look even lower, then … draw your own conclusion. I can’t be sure, but it certainly seems to me that this difference is the reason they only plotted HadCRUT3 data “every 5 years” before 2010.

If you’ve already imagined how hard it would be for an honest, but inexperienced, investigator of climate science to get at the truth about Hansen’s 1988 predictions, now imagine how much harder it is when data are plotted in misleading fashion.

Computer models of global climate are far from perfect. But they remain our best tool to predict future climate change. Even so, they are not the basis for concluding that climate change is really happening, that it’s mainly because of us, and that it’s dangerous. Very dangerous. Those conclusions are based on observation of past climate changes (including way back to the glacial cycles over the last million years and more), and those pesky little laws of physics.

But the fact remains that far too many people, and organizations, who really don’t have the knowledge to know what they’re talking about, bombard us with misleading claims. That’s one of the main reasons our society remains paralyzed, doing next to nothing to address what is really the defining issue of this century.

In the meantime, I advise everyone who hears anything at all from the GWPF to be skeptical. Be very skeptical.


63 responses to “Hansen’s 1988 Predictions”

  1. Excellent. Thanks for doing the work – this tiresome, endless work of trying to mop up the mess, purposely made by little children.

    Appreciated!

  2. Sadly and regardless, the practical reality is that if temps don’t up-tick significantly in the next 2-3 years, general public confidence in the science and the danger, at whatever level it is, will probably suffer. If they do within 4-5 years, then we get them back, but in the meantime that time is lost for the more effective action that might otherwise have been taken.

    Tamino is a huge help for honest folks who are truly interested and curious and willing to do the work. With or without the likes of WUWT or GWPF, only increasing temps will work for everybody else.

    • Surface temperatures are interesting because they’re the temperatures where we live. However, they are not the best gauges of global warming, because the planet warms in depth, and the oceans are especially effective at holding absorbed energy. The challenge for climate communication is to get people to pay more attention to energy absorption, which is the true measure of global warming and has been increasing unabated before and during the recent years’ reduced trend in surface-temperature rise.

    • I agree with Jay Dee Are; when trying to evaluate the models’ accuracy, we should start with examining the input (forcings), as done in this post, but then examine the total energy balance prediction first, before trying to understand the surface temperature predictions. The models seem to overpredict the total energy imbalance by about 20-30%, based on estimates of ocean warming (which absorbs about 92% of the annual thermal energy increase of the planet). And contrary to many comments’ POV, the world has heated up every year, with the last possible cooling period being the twelve months after the Mt. Pinatubo eruption in 1991.

      If the models have somewhat overpredicted the heating rate, what about the surface temperatures? The temperature trends have been fairly consistent with the models, except for NH mid-latitude winters. As Judah Cohen showed in his paper, the winter cooling trend in this region is the only trend inconsistent with the models. The models suggest winters warm faster than the summers, but in the heavily populated NH mid-latitudes, the opposite has occurred. The winters have gotten cooler or warmed less, and the summers have warmed more.

      The other big miss for the models so far: the extensive melt of the Arctic ice cap, probably due to Arctic amplification, has happened much faster than predicted. The loss of NH snow cover has likely contributed to this, and perhaps changes in the Arctic meteorological systems. In any case, the amount of thermal energy that melts the ice cap each summer, and is released again each fall and winter, is huge. Compared to average ice cap melt from estimated ice cap volumes 30 years ago, the extra heat absorbed and re-released is about 20-30 (x10^20) joule beyond normal ice cap seasonal swings. The entire planet heats about 100 (x10^20) joule annually, so this massive swing of thermal energy from summer into the late fall/winter in a very small portion of the planet (the Arctic) must be having some effect. There seems to be a counterintuitive connection between this seasonal release of stored thermal energy and the NH mid-latitude late fall/winter temperature trends.

      Instead of knocking down each attempt to evaluate the models’ accuracy, it might be helpful to get a credible analysis of where the models have successfully predicted climate changes so far, and where the models seem to have missed. The misses so far don’t seem reassuring.

      • Paul Klemencic,
        Some interesting ideas but I think your numbers may be a bit high. And talk of “ice cap” can be a little ambiguous. Does it mean Greenland or the Arctic sea ice? I think ASI.
        The GRACE data shows Greenland does gain ice during the winter, an amount that has stayed pretty constant since 2002 at ~200Gt (and now entirely dwarfed by the annual melt).
        PIOMAS models the sea ice volume freezing up 15,800 Gt in the early 1980s and 18,200 Gt in recent years (although the most recent freeze season topped 19,600 Gt). So the freeze is now about 2,500 Gt bigger, which is something like 0.8 ZJ in latent heat, somewhat less than your 2-3 ZJ figure.
        I assume by “the entire planet heats about 100e20 J annually” you are basing this on OHC+. Another number that makes a useful comparison is the 50 ZJ that according to Vallis & Farneti (2009) is transported annually by the climate into the Arctic from temperate regions.
        So while the extra latent energy released during the recent winter Arctic Ocean freezes is not as high as you say, it is still a very significant amount to be suddenly sloshing around the high Northern latitudes (assuming it doesn’t just leak straight out into space).

      • Al Rodger: My estimate of the original planetary heating rate is based on the Trenberth papers. Skeptical Science had a pretty good overview:

        http://www.skepticalscience.com/Understanding-Trenberths-travesty.html

        The rate was estimated at 145 x10^20 joule based on a 0.9 W/m^2 imbalance in the Trenberth work referenced. Other estimates, like Hansen 2005, had the imbalance at 0.85 +/- 0.15, which could have put the imbalance as low as 110 x10^20 joule. The actual heating over the last several decades appears to be in the range of 90-100 x10^20 joules, with the bulk from ocean heating. Again, Skeptical Science gives a good overview, with references to the papers:

        http://www.skepticalscience.com/global-warming-not-slowing-its-speeding-up.html

        Regarding thermal energy storage caused by Arctic amplification (heat melts sea ice and heats the top mixed layer of the Arctic ocean in the summer, with heat released in the fall and winter as ice refreezes), look at seasonal PIOMAS ice volume estimates. Use sea ice density of 917 kg/m^3 and ice heat of fusion of 333 kJ/kg to estimate heat stored when ice melts, and released when ice forms. Per 1000 km^3, the estimated heat is about 30 x10^20 joules.

        Likewise for a 50 meter thick mixed layer heated about 2 degC: using seawater density of 1027 kg/m^3 and specific heat capacity of about 3985 J/kg K, each million square km of mixed layer requires about 4 x10^20 joule. If the area cleared of sea ice from summer into fall/winter has grown by an extra 2 million square km, this adds about 8 x10^20 joule.

        So any way you look at it, the number I used of 20-30 x10^20 joule for the extra seasonal swing in thermal energy in the Arctic probably is low. To put this amount of thermal energy in perspective, burning all fossil fuels globally is about 25 x10^20, and Arctic amplification due to changing sea ice/seawater albedo is about 25-35 x10^20.

        By contrast, the estimated atmospheric heating which drives the surface temperature changes discussed in this post is about 2 x10^20 joule.

        My point is that we should be focused on these heat flows, and on where the models haven’t forecasted the correct results in terms of heat flows.
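        A quick sanity check of two of the conversions quoted above, under the same assumptions (a 0.9 W/m^2 global imbalance; a 50 m mixed layer warmed 2 degC, per million km^2 of ocean):

```python
EARTH_AREA = 5.10e14      # m^2, Earth's surface area
SECONDS_PER_YEAR = 3.156e7

def imbalance_joules_per_year(w_per_m2):
    """Convert a global-mean energy imbalance (W/m^2) to joules per year."""
    return w_per_m2 * EARTH_AREA * SECONDS_PER_YEAR

print(imbalance_joules_per_year(0.9) / 1e20)   # ~145 x10^20 J/yr

# Heating a 50 m mixed layer by 2 degC, per million km^2 of ocean:
rho_sw, cp_sw = 1027.0, 3985.0    # kg/m^3 and J/(kg K), as quoted above
volume_m3 = 1.0e12 * 50.0         # 1e6 km^2 = 1e12 m^2, times 50 m depth
mixed_layer_heat = volume_m3 * rho_sw * cp_sw * 2.0
print(mixed_layer_heat / 1e20)    # ~4.1 x10^20 J
```

        Both of the commenter's figures check out to the precision quoted.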

      • Paul Klemencic,
        I think you have managed to find an extra factor of 10 in your calculation somewhere. Hey ho. We all do it. And on energy & polar ice melt it is not so uncommon. This will not be the first time I call as my witness the PIOMAS page, where the paragraph entitled Perspective: Ice Loss and Energy helpfully tells us:-

        “It takes energy to melt sea ice. How much energy? The energy required to melt the 16,400 Km3 of ice that are lost every year (1979-2010 average) from April to September as part of the natural annual cycle is about 5 x 10^21 Joules.”

        The energy may not be as great as you calculated, but it is still large enough to be worth thinking about.
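        The PIOMAS figure quoted above is easy to reproduce from the latent heat of fusion alone, and the same arithmetic gives the ~0.8 ZJ for the ~2,500 Gt growth in the seasonal freeze (a sketch that ignores the smaller sensible-heat terms):

```python
L_FUSION = 3.34e5   # J/kg, latent heat of fusion of ice

def melt_energy_zj(mass_gt):
    """Latent heat in zettajoules (1 ZJ = 1e21 J) to melt mass_gt Gt of ice."""
    return mass_gt * 1.0e12 * L_FUSION / 1.0e21

# PIOMAS: 16,400 km^3 melts per year; at 917 kg/m^3 that's ~15,000 Gt
print(melt_energy_zj(16400 * 0.917))   # ~5.0 ZJ, matching the quote

# The ~2,500 Gt growth in the recent seasonal freeze:
print(melt_energy_zj(2500))            # ~0.8 ZJ
```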

      • Pete Dunkelberg

        Paul, good points whatever the exact numbers. Keep an eye on RC. The annual model-data comparison thread may start any day now.

  3. Tamino, a correction: the GWPF just reblogged something that fit their desired storyline (as it does all the time). The original story is from C3, and no, I will not provide a link to that here, it’s in the GWPF link when you click on “full story”.

    It is another example of GWPF having no trouble reblogging misinformation without any quality control whatsoever. “Fits our desired story? Yes, it does. Then reblog!”

    And to imagine it is a supposed educational charity…

  4. Roy Spencer and others do the same thing – they choose the forcing scenarios from past publications which have turned out to be furthest from reality and then say “Ha! Look, the model projections failed!”. They are obviously well aware that the model forcing scenarios which were closest to reality have predicted global temperature rather accurately. It’s not just ignorance, it’s dishonesty.

  5. Jim Hansen’s 1988 modelling, despite using a high sensitivity (4.1 was it?) and a slightly higher rating for CO2 forcing, does a very good job at projecting our future global temperature, even down to that “pause” which climate denialists get all so heated about. (A complete coincidence, as it happens; the projection staged a major volcanic eruption to cause their “pause”.)

    The English language is not the greatest help when you need to accurately describe folk like the Gentlemen Who Prefer Fantasy. It is in all likelihood incorrect to call them “bloody liars.” It is unquestionably true that they exaggerate their message and do so well beyond the point of an honest messenger. But they probably believe their message is very important. And if you have an important message, do you not emphasize the data which supports it (which is of course the important data) and downplay the data which contradicts it (which of course is not so relevant) – don’t you? Don’t we all? The question is how far you will go in so doing.
    The GWPF go too far. Far too far!!! (And being a UK-registered charity, these messages which go beyond being factually-based, are provided with the assistance of the UK tax payer. GWPF are bringing UK charities & the UK charity commission into disrepute.)

    The one word that goes halfway to describing the GWPF (apart from “deluded”) is “untrustworthy.” Indeed, they are so good at this, I feel a new word is required – “untrustable.” Further, it is worth adding adjectives to emphasize how untrustable they are, something the English language is better suited to. Incredibly. Completely. Mind-blowingly. Ubiquitously. “Untrustable.”

  6. “The biggest problem is that the GWPF has presented the ultimate simpleton’s viewpoint of the situation.”

    Isn’t that their role?

  7. This would be shocking, except that by now it’s far from novel. As Mark said at the top, thanks, Tamino, for mopping up. Tedious, I imagine, but helpful.

  8. You probably meant N2O rather than NO2.

  9. skeptictmac57

    Gee,it’s almost as if the GWPF has some sort of,I don’t know..uh agenda?
    No,that can’t be right,because that would just be supplying their intended audience with fodder for confirmation bias…and that might lead to a dangerous misunderstanding of where our climate is headed.That would be a bad outcome for our world .
    Surely the GWPF would not want that…would they?

  10. GWPF Academic Advisory Council members include Bob Carter, Vincent Courtillot, Freeman Dyson, William Happer, Richard Lindzen, Ross McKitrick, Ian Plimer and Richard Tol. That’s not everyone in the Council, but these names should be familiar (or infamous?) to all who follow and participate in the climate debate. For a quick review, just enter any of their names into say Realclimate’s search engine. Many are regulars on the WSJ op-ed page. Lindzen is the only climate scientist in the bunch by any reasonable definition (e.g. peer-reviewed publication).
    ’nuff said.

  11. Rob Nicholls

    Thank you so much Tamino for all the work you put in to debunking this nonsense. “The honest, but inexperienced, investigator of climate science” certainly has a hard time, but from my personal experience at least, your website is a great help for those who are carefully trying to assess the evidence and work out which side of the “debate” over climate change has the truth on its side.

    In the UK we often get the joy of watching GWPF’s chairman Lord Lawson presenting himself as some sort of expert on climate change when the BBC feels like adding a bit of false balance (although I believe Andrew Montford has been helping out recently). On one level it is hilarious – Lawson’s arguments are clearly contradicted by a mountain of scientific evidence (see e.g. IPCC AR5, or http://www.skepticalscience.com) – but he is fairly good at rhetoric, so many people may be taken in.

  12. I find it strange that people aren’t accepting the research that has repeatedly shown that psychology and ideology, not facts, are the main drivers of one’s acceptance, or lack thereof, of anthropogenic climate change.
    Right wingers (conservatives) form the bulk of climate change denial, and one study showed that if you present facts of climate change in a confronting way they REGRESS MORE into denial, on average.
    Another study found that right wingers have, on average, a larger amygdala, which has been shown in primates to be a ‘fear centre’ within the brain. This embryonic research may begin to explain the denial at a neurological level – more fear means one can’t cope with the consequences of climate change, hence the coping mechanism is greater denial.
    Why is it that right wingers are more religious? Is it because religious belief (denial) is a coping mechanism to deal with awareness of the human condition?
    Just joining the dots……..

  13. Thanks for this post. I have a few questions and would greatly appreciate hearing your thoughts, if you have time.

    It looks like the GISTEMP data follow the predicted temperatures from Scenario C pretty closely, and you show that the actual forcings (minus volcanoes) follow the predicted forcings from Scenario C. So … that suggests that Hansen’s 1988 model did a reasonable job of deriving temperature change from forcings. Is that right?

    But … I’m curious about why the total forcing followed scenario C so closely. From 1970-1990, it looks like the net forcing was about +0.8 W/m2. Over the next 20 years (1990-2010) it looks more like +0.2 W/m2.

    I know that methane was rising much faster before, and slowed down a lot after the 1980s. I’m guessing that aerosols have increased, so that would also reduce the forcing. But I’m still surprised at how little increase there’s been in the net forcing over the past 20 years.

    Which individual forcings kept the total forcing 1990-2010 so low compared to the total forcing 1970-1990? That would seem important when considering what’s going to happen over the next century. Returning to the 1970-1990 trend for the next 85 years would make it a lot hotter than sticking with the 1990-2010 trend in forcing.

    I hope this is clear and that I’m not misunderstanding anything.

    • One of the forcings that was predicted to increase, and actually started to decline, was CFCs. Thanks to the Montreal Protocol, CFCs were phased out. By itself, probably not too important. Even though CFCs are estimated to be 3,000 to 5,000 times more potent as greenhouse gases, their total concentration was probably less than 1 ppb, but it looks like their atmospheric concentration was increasing at about 5% a year.

      • And as Hansen pointed out at the time (his fig 2) the CFCs have a larger contribution than CO2 to Scenario A by now.

  14. Tamino,

    Nice post.

    Can you explain in more detail how you got the forcing values? (In particular, what did you assume for anthro aerosols?) Also, looking at your values for the forcing vs the ones in the RealClimate post, they look different by the end of the RealClimate data series (i.e., in the early 2000s) even though you both have the same 1984 baseline. Is this due to updated estimates of forcings for that period?

    [Response: I used the forcing data from NASA GISS. The differences are indeed due to updates.]

  15. Sandy Hawkins

    This is a wonderful analysis. I would love it if you would also write a blog (or repost if it has already been done) linking to a variety of references showing how we know that global warming is caused by humans. So many deniers claim that there is no proof. The article on Skeptical Science isn’t terribly convincing.

    Thanks for all of your hard work!

    • Sandy, that’s perhaps a bigger question than you realize. The subject of climate science is really, really, big, encompassing lots of physics, meteorology, oceanography, computer science (as necessary means, not as subject matter) and more. One of the things that helped me grasp some of this was to look at how the basic science got started. It ended up turning into a whole series of articles, which you can access starting here:

      http://doc-snow.hubpages.com/hub/Global-Warming-Science-In-The-Age-Of-Washington-And-Jefferson-William-Charles-Wells

      Or, perhaps, better, here:

      http://hubpages.com/hub/Global-Warming-Science-A-Thumbnail-History

      (The articles are linked in a ‘group,’ so you can advance to adjacent items in the series with one click by using the pointers labeled “preceding” and “next”, just above the comments.)

      That certainly doesn’t address your whole question, and may be both lengthier than you are looking for, and less directly on target. But the 19th-century science is relatively accessible, and knowing the depth of the research roots renders pursuit of some dead ends unnecessary.

      At any rate, I’m sure that others will have some recommendations for you.

    • The quick answer is that the ratio of carbon-13 to carbon-12 in atmospheric CO2 has been declining. Organisms are deficient in carbon-13. Fossil fuels, being composed of dead organisms, are deficient in carbon-13. Fossil-fuel combustion emits carbon-13-deficient CO2 into the atmosphere. Measurements have shown that the carbon-13 to carbon-12 ratio in atmospheric CO2 has been declining.

      Of course, this is only the attribution part of the argument. You may have to start by convincing people that increasing atmospheric CO2 increases global warming. You could begin with the very simple argument that, without greenhouse gases, Earth would be encased in ice, provable by a simple calculation.
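      That “simple calculation” is the classic effective-temperature estimate: balance absorbed sunlight against blackbody emission. A sketch, using round numbers for solar output and albedo:

```python
SOLAR_CONSTANT = 1361.0   # W/m^2 at Earth's distance from the sun
ALBEDO = 0.3              # fraction of sunlight reflected
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

# Sunlight absorbed, averaged over the whole sphere (hence the factor of 4):
absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0
# Temperature at which blackbody emission balances what is absorbed:
t_eff = (absorbed / SIGMA) ** 0.25
print(round(t_eff))       # ~255 K, about -18 C: well below freezing
```

      With ice-albedo feedback pushing the albedo even higher, a truly greenhouse-free Earth would be colder still.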

    • “_The_ article on SkS?” – Well, as a source linking to actual science (as opposed to made-up stuff like WUWT) I think SkepticalScience does a brilliant job. They also have several levels of detail in how things are explained. Each “myth” or denial claim is also very well explained. So you might wonder: why does SkS need tons of pages to explain global warming? On the surface it might seem very complicated, as the planet’s climate system is affected by many different things, but in essence there are only a few things you need to look at to be convinced of global warming from human activities and the resulting climate changes.

      First you must be convinced that CO2 is indeed a greenhouse gas and that even at its current 400 ppm concentration it has a substantial effect on our planet’s ability to absorb energy. To get a rough idea of the “masking” that CO2 does, look on YouTube for some of the experiments where they film a candle in infrared light behind a glass chamber they fill with CO2 and watch the “magic trick”. For many skeptics/deniers I often mention that in the pursuit of heat-seeking missiles we really pinned down the attributes of CO2. As a theory it has been around for 200 years, so it’s not something Al Gore invented, as many would lead you to believe.

      Then you have to acknowledge that the amount of energy that leaves the earth is less than the amount that enters it. And that means the body is warming up. The system is seeking equilibrium. This is why they made the Hiroshima Atomic Bombs widget – even though many criticize it – we have to realize that earth is building up an incredible amount of energy every second due to the elevated CO2 levels.

      So the energy initially comes from the sun, right? You might wonder: well, what if the sun’s output has gone up? It’s true that previously the amount of energy from the sun was a primary driver of climate variations on earth, but sometime mid-last-century they “parted”. The sun’s output went flat or down while the temperature of earth still went up. This is one of the primary observations that really shows the sun’s variation doesn’t have the same effect as before, even though it still modulates warming a bit. We also know that the amount of energy that comes in has varied so much, due to variations in earth’s orbit, that we go in and out of ice ages. Look up Milankovitch for information about this.

      And then you have to acknowledge that human beings are the reason for the elevated CO2 levels. Fingerprinting, as some have mentioned here, is one of the main proofs, and we now have a very good estimate of how much stuff we have burned over the years, and the amount of CO2 this adds is substantial. We can even measure higher CO2 concentrations in the oceans as well, as almost half of all CO2 we have emitted is in the oceans already. With it come lower pH values, which has its own set of consequences for marine life. CO2 also lingers a long time in the atmosphere, and the earth’s carbon cycle is not equipped with anything to “wash out” (weathering) this CO2 faster than we add it. The Mauna Loa data will show you this.

      Many deniers actually accept all of the above, but they think that either sensitivity to a doubling of CO2 is lower than what e.g. the IPCC says, or that even if the planet is warming there will be no big changes to our habitat. Both are seriously wrong, and this is really where you should be studying the details, as they really tell us something about how serious this problem is. Like Michael Mann’s last piece in Scientific American – even if equilibrium sensitivity is down towards the low estimate of +2.5C warming for a doubling, that only buys us an additional 10 years before we reach +2C warming, which is widely considered a serious threshold for “safe” warming. I’d say +1C is already high enough, as we are experiencing some major changes to the biosphere as we speak.

  16. Tamino,

    Excellent as always. I had two questions, if anyone knows the answer:

    1. Where else other than methane did anthropogenic forcings differ from Hansen 1988? I would assume aerosols were underestimated, but I don’t know for sure.

    2. It was interesting to see Tamino use atmospheric concentration of methane for his analysis, while GWPF ran their comparison against emissions only. Obviously, the atmospheric concentration is what should be used to compare with actual temperatures. I know emissions and CO2 concentration are highly correlated. However, from what I can tell, global anthropogenic CH4 emissions are increasing, but the atmospheric concentration looks like it has leveled off a bit. Is there any known mechanism that would argue why this would be the case, or is my source (https://www.globalmethane.org/documents/analysis_fs_en.pdf) simply wrong?

    • David C.
      The data you link to, with a graph of CH4 emissions, are ‘correct’ but do not delve far enough back into history. (I say ‘correct’ because quantifying CH4 emissions is not easy. Measured atmospheric levels are the only certainty in such quantifying.) Hansen et al 1988 was written when atmospheric CH4 levels were rising alarmingly. Since then, they have slowed almost to a halt before beginning to rise once more. It is only this renewed rise your graph is showing. For a graph of CH4 levels back to 1983 see here (usually 2 clicks to ‘download your attachment’)

  17. Emissions are cumulative. Since CH4 oxidizes to CO2 and H2O in about 10 years, any series longer than five or ten years will overestimate CH4 forcing (you can argue about ten versus twenty years; the oxidation product does contribute to CO2 forcing, and it also leads to some smog). Chemistry details at RR
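    The ~10-year lifetime argument above can be sketched with a one-box decay model (the 10-year lifetime is this comment's round number, not a precise figure):

    ```python
    import math

    TAU = 10.0  # assumed one-box atmospheric CH4 lifetime, in years

    def remaining_fraction(years):
        """Fraction of a CH4 emission pulse still airborne after `years`,
        under simple exponential removal with lifetime TAU."""
        return math.exp(-years / TAU)

    # After one lifetime ~37% of a pulse remains; after 25 years only ~8%,
    # so summing emissions over many decades overstates today's CH4 burden.
    for t in (5, 10, 25):
        print(t, round(remaining_fraction(t), 3))  # 0.607, 0.368, 0.082
    ```

    This is why atmospheric concentration, not a long emissions tally, is the quantity to compare against the scenarios.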

  18. For those interested, here is Hansen et al 1988.
    The ECS of the model is quoted as 4.2ºC. The effective ECS would presumably be a little less in today's terms (i.e. ECS ≈ 3.9ºC), as the forcing used for a doubling of CO2 was 4 Wm^-2 back in those days (rather than the 3.7 Wm^-2 used today).
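    A minimal sketch of that rescaling, assuming the model's sensitivity per unit forcing is what stays fixed (an interpretation for illustration, not something stated in Hansen et al 1988):

    ```python
    # Sensitivity per unit forcing is ECS / F_2x; holding that fixed while
    # revising the 2xCO2 forcing from 4.0 down to 3.7 W/m^2 rescales the ECS.
    ecs_1988 = 4.2      # deg C per CO2 doubling, quoted with F_2x = 4.0 W/m^2
    f2x_1988 = 4.0      # W/m^2 per doubling used in 1988
    f2x_modern = 3.7    # W/m^2 per doubling in common use today
    ecs_effective = ecs_1988 * f2x_modern / f2x_1988
    print(round(ecs_effective, 1))  # 3.9
    ```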

    • He also pointed out in the 1990s that it didn't make a damn's worth of difference, because trends over a few decades are not very sensitive to effective ECS.

  19. Horatio Algeranon

    “Heads you lose, tales I win”
    — by Horatio Algeranon

    Science stands still, and has been dead
    Since 1988
    The cry is shrill, for Hansen’s head
    Upon a silver plate

  20. From XKCD over at RC. Too good not to share: https://what-if.xkcd.com/88/

  21. Tamino, I think you are being a little bit unfair to the folks at GWPF. You state that the Hansen forcings assumed methane concentrations would continue to increase in the atmosphere, whereas actual methane concentrations have leveled off well below where Hansen projected them to be at the present time. You show a graph of these methane data plotted in ppmv, with no data on the impact to forcing from the deviation of actual from projected methane concentrations. I believe the impacts are logarithmic in nature.

    You then take a look at all of the forcings from current GISS data and show that the rate of forcing increase has slowed dramatically since 1989, matching Hansen's scenario C rather than scenario A. The GISS forcing data do not break out methane impacts, but they do show that most of the slowdown in forcing increases is due to a substantial increase in suspended aerosols, not methane.

    A reasonable person or organization could be expected to go to the IPCC AR5 report for data such as this. They would find that IPCC rates the quality of the aerosol data to be low. Further, Figure 8.18 in AR5 shows the overall climate forcing levels rising steadily over the period between 1989 and 2011 whereas GISS data show marked flattening of the forcings. Figure 8.18 also shows a leveling of aerosol concentrations since 1989 rather than the marked increase shown by GISS.

    I am not saying you are wrong or correct here, just that it is recognized that the aerosol data that makes your case is uncertain, and credible organizations (IPCC) present different data that does not support your argument or criticism of GWPF.

    [Response: Complete bullshit.

    As a side note, methane forcing is not logarithmic, it’s closer to proportional to the square root of concentration, although it also involves the concentration of N2O since their effects interact. The relevant fact is that it was certainly less than used in Hansen’s 1988 scenarios, because the methane concentration was less.

    Of course there’s uncertainty in forcing estimates. But there is no uncertainty at all in the fact that actual forcing has followed a pathway well below Hansen’s scenario A. The GWPF post nonetheless ignores everything but the increase in CO2 concentration, specifically so they can imply that temperature should have followed that scenario closer than the others. That is absolutely misleading and insultingly simplistic. It’s especially reprehensible because the difference in CO2 forcing between scenarios A and B is actually quite small — forcing depends on concentration, not emissions.

    Hansen’s scenario A — on which the GWPF post bases its criticism — had net anthropogenic forcing in 2013 at 3.25 W/m^2 above 1958 values, which makes it more than 3.75 W/m^2 over pre-industrial. The IPCC AR5 estimates net forcing only 2.3 W/m^2 above pre-industrial. Hell, EVEN IF YOU SET THE ENTIRE AEROSOL COOLING EFFECT TO ZERO the IPCC estimated forcing remains below scenario A. Showing the falsehood of the GWPF post's argument does not depend on uncertain aerosol forcing; rather, the idea that IPCC estimates can in any way reconcile scenario A with reality is complete bullshit.

    I was in no way unfair to them. They were unfair to everybody who is trying to figure out what the truth is.]
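    The square-root dependence of CH4 forcing mentioned in the response above, and its interaction with N2O, can be illustrated with the simplified expressions of Myhre et al. (1998) as tabulated in the IPCC TAR; the concentrations used below are round illustrative values, not Hansen's scenario numbers.

    ```python
    import math

    def overlap(m, n):
        """CH4/N2O band-overlap term from Myhre et al. (1998); m, n in ppb."""
        return 0.47 * math.log(1.0 + 2.01e-5 * (m * n) ** 0.75
                               + 5.31e-15 * m * (m * n) ** 1.52)

    def ch4_forcing(m, m0=722.0, n0=270.0):
        """CH4 forcing (W/m^2) at m ppb relative to preindustrial m0,
        with N2O held at its preindustrial level n0; note the sqrt dependence."""
        return (0.036 * (math.sqrt(m) - math.sqrt(m0))
                - (overlap(m, n0) - overlap(m0, n0)))

    def co2_forcing(c, c0=280.0):
        """CO2 forcing (W/m^2): logarithmic, 5.35 * ln(C/C0), c in ppm."""
        return 5.35 * math.log(c / c0)

    print(round(ch4_forcing(1800.0), 2))  # ~0.49 W/m^2 at ~1800 ppb CH4
    print(round(co2_forcing(400.0), 2))   # ~1.91 W/m^2 at 400 ppm CO2
    ```

    The point this makes quantitative: a shortfall in CH4 concentration feeds through the square root, so lower-than-scenario methane means lower forcing regardless of any emissions bookkeeping.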

    • I repeat, it is NOT the GWPF that has done the analysis. It just blindly reblogged a story from C3. Plausible deniability, you know: “oh, is it wrong? We didn’t know, we didn’t do this analysis, we’re merely providing relevant information from all around.”

  22. Well done! It will be interesting to watch how these purported "skeptics" and "contrarians" (really just merchants of doubt) flop about in the bottom of the boat as the coming El Niño raises global surface temperatures by 0.4C over the next 2 years (as it did in 1996-1998).

    Then we will see how they set themselves up with all of these claims of “hiatus” and “arctic ice recovery”

    Reminds me of the U.S. affordable care act discussion about “death panels” somehow. . .hmmmmmm I wonder if they are related somehow. . .

  23. Hi Tamino,
    you claim in the text that CO2 increase has most closely matched Hansen's scenario A. I have checked this, comparing the assumed CO2 concentrations to the actual ones. The real-world CO2 concentrations from 1992 to 2012 run slightly below those of scenario B. So this is not true.

    • You may want to reread the text. Tamino claimed no such thing; his discussion is relative to all forcings, not just CO2.

      • Tamino has written:
        “They imply that temperature had to come closest to Hansen’s scenario A because CO2 increase has most closely matched Hansen’s scenario A. But global temperature is affected by a lot more than just CO2, it is affected by the entirety of climate forcing. CO2 is only one of many.”
        Maybe I have understood this differently.

        Right. His statement summarizes what the GWPF ("they") said, or implied. He himself is not making a claim about the measured CO2 concentration trajectory.

        One could, however, argue that he missed a trick there by not pointing out, as you did, that the GWPF was wrong about that. (Or rather, "that, too.") Continuing the bridge metaphor, though, it was just an overtrick; he'd already essentially made his contract.

  24. As Hansen et al. pointed out “Note also that the forcing in scenario A exceeds that for the period from 1958 to the present, even though the forcing in that period is nominally based on observations; this is because scenario A includes a forcing for some speculative trace gas changes in addition to the measured ones”.
    Also in the intro: “The range of climate forcings covered by the three scenarios is further increased by the fact that scenario A includes the effect of several hypothetical or crudely estimated trace gas trends (ozone, stratospheric water vapor, and minor chlorine and fluorine compounds) which are not included in scenarios B and C.”

    If you look at their Fig. 2 you'll see that the contribution due to CO2 alone is slightly less than 0.5ºC, whereas those trace gases increase the total to ~0.9ºC. I recall McIntyre writing a blog post about the failings of Hansen's predictions; I put him straight on this and he very grudgingly admitted his error! GWPF appear to have made the same error.

  25. > GWPF appear to have made the same error.

    Why make the effort to create a fresh new misdirection, when rebunking an old climate denial zombie argument is so easy?

  26. I cringe when I see people talk about predictions rather than projections.

  27. Given the topic of the post and this discussion, it is interesting to go back and read a 2006 paper by Hansen himself that addresses this topic. It is entitled “Spotlight on Global Temperature,” by Hansen, Sato, Ruedy, Lo, Lea and Medina-Elizalde, subtitled “Early model predictions of global warming proved accurate, the Pacific Ocean seems charged for a potential super-El Nino, and global temperature is poised to reach record, perhaps dangerous, levels.” The paper is dated March 29, 2006 and is labeled DRAFT. I could not find a final version. It appears the paper was written in part to refute the views of Michael Crichton, of all people.

    Hansen starts his comparison in 1988, not 1984 as Tamino did. He states "Because of this chaotic variability a 17-year period is too brief for precise assessment of model predictions, but distinction among scenarios and comparison with the real world will become clearer within a decade." He calls Scenario B the most realistic, and there is pretty good agreement between actual temperatures and Scenarios B and C. B and C diverge after 2005, and we now know the direction temperatures have taken. He notes the need for more complete analyses of industrial-era climate forcings and simulation periods.

    All good so far. He then goes off on a discussion of why a super El Niño is imminent, and some other topics.

  28. @ B Buckner, the paper in question was published in revised form in PNAS, deleting (appropriately, in my view) the ephemeral Crichton references, under the title “Global temperature change”.

    http://www.pnas.org/content/103/39/14288.short

  29. You can lead a horse to water but you cannot make him drink.
    You can lead a skeptic to science but you cannot make him think.

  30. John Garland

    Somewhat off topic, but interesting to readers possibly…

    From 538 blog at

    http://fivethirtyeight.com/datalab/fivethirtyeight-to-commission-response-to-disputed-climate-article/

    … “we see value in running a rebuttal to Roger’s article at FiveThirtyEight itself. So we are in the process of commissioning one from someone who 1) has not yet weighed in on Roger’s article and 2) has very strong credentials. The scientist who is our No. 1 choice is traveling, and so the turnaround will not be instantaneous.

    We appreciate your patience in the meantime. Climate change is not going away as an issue, and we want to get this right. All journalism relies on trust — between reporters and sources, between editors and writers, between a publication and its readers. Any time that trust is undermined, it’s a huge concern for us. We thank you for your continued feedback. We’re listening and learning.”

  31. > someone who 1) has not yet weighed in on Roger’s article and
    > 2) has very strong credentials. The scientist who is our
    > No. 1 choice is traveling ….

    I’d like to see someone open a betting pool on who they’re hoping to get.
    I’d bet he’s thinking of someone from this short list myself.

    But perhaps I’m too cynical. Y’wanna bet?

  32. In other news, the recent AGU GRL

    http://onlinelibrary.wiley.com/doi/10.1002/grl.v41.5/issuetoc

    is full of interesting* abstracts.
    One on statistics:
    Statistical significance of climate sensitivity predictors obtained by data mining
    Peter M. Caldwell, Christopher S. Bretherton, Mark D. Zelinka, Stephen A. Klein, Benjamin D. Santer and Benjamin M. Sanderson
    7 MAR 2014 | DOI: 10.1002/2014GL059205
    Key Points: Correlation magnitude is not sufficient proof of predictive skill; Significance testing is complicated by model nonindependence in ensembles; The best predictors of climate change are related to the Southern Ocean
    (http://onlinelibrary.wiley.com/doi/10.1002/2014GL059205/abstract)
    ____________________
    * for values in the “uh, oh” range