Now that 2011 is complete, most of the major global temperature estimates have updated their data to include the complete calendar year. The only one which hasn’t yet is HadCRU, for which data are available through November of 2011 but December’s estimate is not yet online.
It turns out that 2011 wasn’t as hot as 2010, as can be seen from a graph of the raw data:
For GISS data, 2011 was the 9th-hottest year on record, for NCDC it was 11th-hottest, for HadCRU (using only data through November) it was 12th-hottest, for RSS it was 12th-hottest, and for UAH 9th-hottest.
Some interpret last year’s data as “Global temps in a Crash.” Of course they do so without any analysis, based only on the fact that recent numbers are lower than previous numbers. And they don’t properly account for all those other factors that influence temperature, especially the fact that 2011 was a strong la Nina year, so we expect it to be on the cool side of the prevailing trend.
But we can estimate, and remove, the influence of exogenous factors, including el Nino, aerosols, and solar variations. I’ve done this before, and it demonstrates that the man-made influence on temperature is creating a sizeable and inexorable trend. For the updated results, rather than use TSI (total solar irradiance) to represent solar variations I used sunspot counts, because they’re easy to get, kept up-to-date, and already available as monthly averages. Here is the adjusted temperature data, with the exogenous influence removed:
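For anyone curious about the shape of that calculation, here is a minimal sketch of the multiple-regression adjustment in Python (the actual analysis is done in R by the programs linked below; the data and coefficients here are synthetic, purely for illustration):

```python
import numpy as np

# Synthetic monthly series standing in for the real inputs (illustrative only)
rng = np.random.default_rng(42)
n = 384                                       # 32 years of monthly data
t = np.arange(n) / 12.0                       # time in years
mei = rng.normal(0, 1, n)                     # ENSO index stand-in
aod = np.maximum(0, rng.normal(0, 0.02, n))   # volcanic aerosol optical depth
ssn = 60 + 50 * np.sin(2 * np.pi * t / 11)    # sunspot number, ~11-year cycle

# "True" temperature: man-made trend plus exogenous influences plus noise
temp = 0.017 * t + 0.08 * mei - 2.0 * aod + 0.0005 * ssn + rng.normal(0, 0.05, n)

# Fit trend and all exogenous factors simultaneously by least squares
X = np.column_stack([np.ones(n), t, mei, aod, ssn])
coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)

# Subtract the fitted exogenous contributions, keeping intercept and trend:
# this is the "adjusted" temperature series
adjusted = temp - X[:, 2:] @ coeffs[2:]
```

The real programs also fit seasonal terms and choose optimal lags for each factor, which this sketch omits.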
When the other factors are accounted for, the continuing trend is obvious — and any talk of “Global temps in a Crash” is shown to be the idiocy that it is.
In the adjusted data, GISS ranks 2011 as the 2nd-hottest year on record (just behind 2010), NCDC ranks it 5th, HadCRU (using only data through November 2011) ranks it 5th, RSS ranks it 2nd, and UAH 2nd. No, global temps are not in a crash — they still fluctuate (for a lot of reasons, including exogenous factors) but the trend continues. It’s called “global warming.” It’s caused by human activity. It’s dangerous.
We can even average adjusted data for the 5 major global temperature records to give us a composite estimate of the man-made component of global warming:
2011 is second only to 2010 as hottest on record. The trend continues.
For those of you who wish to keep current with this analysis, I’ve modified the programs to use sunspot counts rather than TSI to represent solar variations, and I’ve written a program to download and format the necessary data from the internet automatically. It’s in this package:
Since wordpress won’t allow me to upload a “zip” file, I changed the name from “NewSoft.zip” to “NewSoft.xls” — after downloading, you should change the name back to “NewSoft.zip” then unzip the file. It contains two R programs and some data files, but all you really need are the R programs. One of them is “datagetform.r” — when you run that it will automatically download and format all the data, creating a master data file called “rawdata.dat”. That file is input to the other program, “allfit.r”, which will compute the influence of exogenous factors, then create two files: “Adjusted.dat” is the adjusted temperature data and “coeffs.txt” is the regression coefficients.
These programs come with no support and no guarantee, so if they transform your computer into a super-intelligent destroyer of worlds intent on removing the human infestation from the planet, I disavow any responsibility.
Tamino, the old canard that the 1910-1940 trend rate matches that from 1970-2011 keeps raising its head. It occurs to me that using your software to adjust for exogenous factors over the periods 1880-1940, 1910-1940, and 1880-2011 would be very interesting in showing the relevant causes of the different warmings. (That is leaving aside the fact that the 1910-1940 warming rate was in fact slower than the current warming.) Do you have any plans in that direction (or could you make some)?
“Global temps in a Crash” is shown to be the idiocy that it is.
Very nicely put!
I suppose some nit-picker will wonder which SSN you are using – and claim that if you’d only used a different one, it would show something else entirely. There’s no accounting for idiocy.
The BBC radio program More or Less had a short update on a four-year-old 100-pound bet between James Annan and David Whitehouse over whether or not the 1998 record temperature would be broken over 2008-2011 (based on HadCRU). It’s interesting to listen to Whitehouse’s horrible “analysis” of recent global temperatures and his audible background guffaw at Annan’s perfectly reasonable explanation.
The following link will let you access the podcast version (for the next week or so at least):
The podcast is only 10 minutes long and the climate bet part is at the beginning, so it’s a quick listen if you’re interested.
We’ve got a post on that wager at Skeptical Science.
Annan should simply double down for the next four years and make the bet contingent on the average of the major reconstructions, rather than the conservative HadCRU one.
My guess is that Whitehouse would refuse …
However, as Annan himself acknowledges, “there is little sign of the acceleration in warming that most models had predicted, and it increasingly seems that the Smith et al forecast (for example) was a bit excessive.”
The question here is whether this may point to a lower climate sensitivity. Apart from the solar minimum and a bit of variability, in line with the recent Gillett et al. paper, James Annan also mentions another likely contribution to recent temps: “Slightly lower transient sensitivity, which then points to (probably) slightly lower equilibrium sensitivity too”. (Yes, I noticed that it would be just “slightly”.)
I hope that the warming doesn’t accelerate. Summer here in Perth is hot enough!
I read a paper yesterday about tipping points. See http://www.pnas.org/content/105/38/14308.full
It looked at abrupt climate changes, and the climate behaviour before the change. It suggested that you look at the transition as a chaotic system moving from one attractor to another, and that there is a slowing down of fluctuations immediately prior to the transition. But maybe we are changing things too quickly for this to be particularly noticeable.
The planet is not a prisoner of HadCrut3, which is clearly wrong, so why should Smith et al be a prisoner of HadCrut3? After Annan made that comment, he posted an article about HadCRUT4, in which 2010 is reported to be warmer than 1998. So Smith et al opens with a win in 2010, and appears to be in good shape to win it all.
Because the planet has behaved pretty much as they predicted.
Well, I don’t really have a clue, but if you read his post, Annan said that “there is little sign of the acceleration in warming that most models had predicted, and it increasingly seems that the Smith et al forecast (for example) was a bit excessive” just after taking a look at the three data sets, not just HadCRUT. Besides, Annan’s suggestions of possible explanations for these lower-than-projected temps came after he wrote that post about HadCRUT4, so I think he is referring to any of the data sets. If Smith et al get it right with the right data sets (GISS, NCDC and HadCRUT4), for me that would be a fulfilled prediction, given that the prediction should be judged against real temps, not flawed temp records.
Have a look at https://sites.google.com/site/arctischepinguin/home/fr2011
Wipneus is very fast in updating graphs. He/she told me on the Dutch website http://klimaatverandering.wordpress.com (klimaatverandering = climate change) that he/she used your published code with TSI data instead of sunspots, available at: ftp://ftp.pmodwrc.ch/pub/data/irradiance/composite/DataPlots/ext_composite_d41_62_1201.dat
The negative influence of La Nina in 2011 is clearly visible in Wipneus’s graphs. A La Nina won’t last forever and is normally followed by an El Nino. It will be very interesting to see what happens to global temperature when the next El Nino appears.
[Response: Clearly the results are nearly identical. That’s no surprise, since sunspot numbers and TSI are both good proxies for solar activity. I suspect TSI is a little better.]
Those graphs are updated now with the HadCRU update.
On the http://klimaatverandering.wordpress.com blog we had a little 2012 GISS temperature anomaly forecasting competition. I based my entry on:
– the regression coefficients as calculated by the FR2012 code for ENSO (MEI index), volcanic aerosol, TSI, linear growth and seasonal variations;
– no volcanic aerosols in the first half of 2012;
– TSI value of 1366 W/m2;
– the MEI index following a pattern of previous “double dips”: 4 more months at the current (assumed) minimum and then a rise to +0.25 in August; MEI values are lagged by 4 months so that is the final value.
– A residual in December of -0.14 °C leads to residuals of -0.09 and -0.03 °C in January and February.
Graphs for MEI and TSI showing my guesswork:
Computed monthly GISS values for 2012 [°C]:
Mean Tyear= (Tjan+…+Tdec)/12:
The estimated 2-sigma error is +/- 0.11 °C
In speculative mode, I would say in case these forecasts turn out to be correct:
The year 2012 was still largely (11 months out of 12) dominated by negative ENSO conditions. Despite that, it ranks as the third warmest year in the gistemp record. Ahead of it are only 2005 (0.62 °C) and 2010 (0.63 °C); 1998 was “only” 0.58 °C.
The last 2 months of 2012 show a glimpse of what 2013 will bring. Those records will be broken by a large amount, unless a volcano eruption brings some cooling.
Of course we must be careful that, when the next strong El Nino arrives or TSI increases and produces a new un-adjusted global record, similar adjusted data sets are produced, as I am more interested in the human contribution than in variations produced by internal and external effects. Not to mention the claims of bias from Willard.
[Response: What are the odds that when the next strong el Nino hits, somebody at WUWT will use my code to claim that the incredibly high raw temperature is “only” due to el Nino?]
I would estimate that chance as 95% +- 5%
I’d go one further, and estimate that chance as 95% +5%/-(-5%)
I have a better estimation 95%+5%, without – :-)
That’s so great, thanks!
I know there is no support, but how does one do the plots of all 5 adjusted data sets? I’m a total R noob, haven’t delved into stats packages since the DOS version of SPSS oh so long ago…
“What are the odds. . .”
10:1? Or am I too conservative?
Tamino, when you say “It’s called “global warming.” It’s caused by human activity. It’s dangerous.”, do you think your data prove the last two assertions, scientifically speaking?
I don’t see any argument supporting that the GW is “caused by human activity” based on your statistics, nor that it is “dangerous”. The arguments are based on other considerations – use of physical models, and extrapolations of consequences. In principle, statistics could be used to reinforce our confidence in these predictions, of course, but I don’t really see where your statistics do this job – there is no accurate comparison of data with actual predictions based on these claims, for instance. I think it would really be useful to bring this kind of support.
I thought it was clear from context that those assertions were not following from the present issue, but rather unchanged from previous discussions–“other considerations,” as you put it.
Science doesn’t advance by proving what is true, it advances by disproving what is false. After removing the Sun, ENSO, and volcanic effects, the temperature is still going up in a nearly linear fashion. That increase cannot be caused by the Sun, nor by ENSO, from this data. So what’s your explanation for the increase? Magic?
Meanwhile, we know that human-caused greenhouse gases have been increasing in a near-linear fashion during the same time period, and that increase amounts to 1.1 Watts per square meter since 1979 (http://www.esrl.noaa.gov/gmd/aggi/). What is the *expected result* of adding 1.1 Watts per square meter to the surface of the Earth, in your physics? Think hard, and see if you can see the connection.
What Tamino shows is a 30-year warming trend in all the temperature records when natural influences are removed. The earth doesn’t just randomly change temperature–the sun’s output must change, or the amount reflected by the earth, or the amount of greenhouse trapping.
You can’t just say you don’t believe–what is causing the change? As you should be aware, there is a well-documented change in greenhouse forcing adequate to explain the trend. We also see albedo (reflectivity) changes in line with, or somewhat greater than, predicted feedbacks from greenhouse gases. If greenhouse gases aren’t causing the warming, what is? And what might be compensating for the greenhouse heating?
Of course you cannot get attribution from a single graph. However, a 35-40 year trend is in and of itself anomalous.
For attribution, one looks at the spatial and temporal distribution of the warming–and finds it matches predictions for a greenhouse gas. One looks at the fact that the stratosphere is cooling as the troposphere warms–and THAT is a smoking gun held in the hand of greenhouse warming. Barton Paul Levenson has a list of 17 confirmed predictions of climate models–a pretty impressive record. And then, of course, there is the fact that the greenhouse nature of CO2 and even anthropogenic causation of warming are established, even venerable, tenets of climate science.
As to danger, we have already seen 27% more land in drought since the ’70s. Sea levels are rising and will continue to rise. Extreme weather is on the rise–both heat waves and impulsive precipitation causing flooding.
Yes, you have to look at ALL of the evidence, but that is what science forces us to do. If you aren’t convinced by this point, you haven’t been paying attention.
Temperatures rose at a rate of 0.0143 C/yr from 1916 to 1945, a 30 year period with a 0.43C temp increase at almost the same rate (0.0174 C/yr) as the most recent 30 years based on GISS data. The GCMs are based on physics as I understand it, and they miss this temperature rise in their hindcasts.
[Response: Wrong. Get your facts straight.]
What was wrong with my facts? I think the math is correct. I don’t want to clog up your blog with a back and forth on this. Maybe you could drop me an e-mail to let me know where I strayed.
What’s wrong with your facts (below) is that GCMs do indeed see this rise. They do not “miss” it as you stated. See, for example, Bhat et al., Figure 2:
Click to access Bhat_JABE_11.pdf
text to go with figure:
FAQ 8.1, Figure 1. Global mean near-surface temperatures over the 20th century from observations (black) and as obtained from 58 simulations produced by 14 different climate models driven by both natural and human-caused factors that influence climate (yellow). The mean of all these runs is also shown (thick red line).
You’re cherry picking the 1916-1945 interval. 1916 and 1945 are anomalously low and high years, respectively. If you think the 1916-45 interval is so important, what about the 1915-1946 interval – the red line gets that trend perfectly.
What does it say about your method of data interpretation if you can conclude from virtually the same information that the models are both highly inadequate and almost perfect?
It would suggest that it means that you’re interpreting the data in a completely inappropriate way.
“B” is probably beating this drum again:
(Not sure that’s an operable link, but don’t really care.) A very clever study, which definitively proved that apples aren’t oranges.
Expanding a bit, the problem with the 2010 WUWT post is actually relevant to interpreting the Fig. 2 linked by KAP. By the way, here is another graphic that requires less scrolling, and is arguably clearer visually:
The ensemble mean doesn’t capture the *entire* rise over the period B specifies; but we shouldn’t expect that it would: the observations are equivalent to just one realization of the range of possible warming trajectories. As such, it has higher variability than the ensemble mean.
Individual runs of models *do* have similar variability as the observations, but of course the details of the shapes of the curves do not, in most cases, match. Essentially, the ‘weather’ differs, though the large trajectory of climate change does not.
So the comparison made between observations and the ensemble mean is apples to oranges; we would expect to see some resemblance, but not identity. And we do: the observations, even at the height of the 40s ‘bump,’ remain within the hindcast ‘envelope.’
So there’s an element of ‘strawman’ in B’s argument, since the contention that the 40s rise is not exactly and fully matched by the ensemble mean fails to recognize that there is no reason for anyone to expect that it ever would be. There is also an element of the cherry pick, since it just ‘happens’ that the big ‘bump’ in the observed temps in the early 40’s creates the biggest apparent ‘mismatch.’
OK, 30 years is generally considered to be the threshold between climate and weather. What physical mechanism caused temperatures to rise 0.43C during the period between 1916 and 1945? Yes, that is cherry picked, but in the same way the current rise is measured from the late 70s. And if we don’t know, how much confidence do we have in our ability to attribute the current rise in temperatures?
[Response: Have you really not investigated this?
1: Increase in solar output. 2: decrease in volcanic activity. 3: Greenhouse gases (despite what denialists claim, they were increasing back then too).]
“the observations, even at the height of the 40s ‘bump,’ remain within the hindcast ‘envelope.”
Not quite, and not at the low point either. Also, the hindcast envelope is so broad that one can even draw a straight line through it from the late 70s to today.
[Response: Wrong. Get your facts straight.]
Actually I’m still wondering how we can ascertain the exact amount of GW due to GHG. It may not be black or white – 0% or 100%. GHG must contribute to warming, but their effect has to be amplified by some factor to explain the observed warming, and the feedback factor is far from well determined. So it’s not obvious – for me – how much of this warming is due to anthropogenic causes.
I think the cooling of the stratosphere can indeed be explained by the cooling effect of CO2 in the optically thin regime (increasing the radiative cooling efficiency), but this doesn’t really help answer the question. Concerning the droughts – actually it is also unclear whether they are really due to GHG. Oceanic oscillations can also modify the distribution of rainfall, and climate models give contradictory results on the effects of GW on the African monsoon, for instance.
There is an easy bias: just collecting everything going wrong on the Earth and saying “look, it’s GW”. Actually astrologers do just that: collecting all catastrophes and saying “oh, it’s Pluto/Mars/Jupiter …”. I’m rather reluctant to accept these conclusions without careful studies – and mere correlation is not a careful study.
jarch, there is a whole field of study called “attribution.” There have been many, many papers published dealing with just this question.
Why don’t you start by Googling ‘climate change attribution’–preferably on Google Scholar, not the main service–and seeing what you find?
If you get stuck, let me know and I’ll try to bird-dog it some for you. But you’ll learn more by doing it yourself.
A Comprehensive Review of the Causes of Global Warming, with links to the primary literature, might be a good place to start. For instance…
Jarch, your inability to distinguish between science and astrology is telling.
Most of the feedbacks are well determined–especially the main one, increased water vapor. Clouds and aerosols are less certain, but you have about a dozen independent lines of evidence that all point to a CO2 sensitivity between 2 and 4.5 degrees (90% confidence).
Why do you persist at grasping at straws?
I persist in not understanding what a “90% confidence” level means if the theory itself is uncertain. I know what a confidence level means if the theory is well established (gravitation, for instance) and only the values of parameters are to be determined (the orbital parameters of an asteroid), but not in the case where the theory itself is still being discussed.
The argument “I can fit the data with such a model and with such parameters” does not independently prove that the model is true AND the parameters are true. It proves only that IF the model is true, then the parameters must have these values. But there are a great number of cases in physics where different explanations can fit the same data equally well! So to gain confidence, you need something else, such as
* demonstrating that all other explanations are very far from being able to explain data
* demonstrating an accuracy so good that it would be very unlikely that the model is wrong.
I do not clearly see these features in the current climate discussion … but I’m open to accepting it if you show it to me!
Jarch, all theories are uncertain. The confidence levels estimate the uncertainty.
I’m not even sure what you mean by “the model is uncertain”.
As to your criteria, I would contend that they are quite well met by climate models–as evidenced by the fact that no one has managed to develop a model that explains Earth’s climate without also predicting anthropogenic causation of global warming. This has been known since Arrhenius!
As to predictions, the models:
That is a pretty impressive set of accomplishments. I can only assume that you haven’t looked very hard for evidence. Now why would that be?
Ray, as far as I can see, this is a set of facts that fit the predictions of climate models. But I would be very surprised if climate models didn’t make any right predictions, so finding a number of correct predictions is not really a surprise! After all, there is no revolutionary physics in climate models; the complexity arises just from the great number of degrees of freedom in a complicated, “real” system – and “real” systems are notoriously difficult to describe.
I am not saying that GHG are not emitted by mankind, nor am I saying they don’t contribute to global warming. I’m just saying that the exact amount is still uncertain, and so is the future “danger” – compared to many others that could threaten our society.
jarch, I’m guessing ‘the exact amount’ means ‘the exact amount of warming.’ In that case, I don’t see what gives you any realistic comfort. Yes, there are uncertainties in the degree of warming (quite apart from the fact that this is strictly indeterminate, as it partly depends in principle on (in)actions not yet taken.) Yes, there are uncertainties in the resultant harms.
However, consider future warming. We’ve been experiencing around .2 degrees C per decade. Let’s assume a linear trend–almost certainly a severe underestimate, given significant delays to reach equilibrium combined with the ongoing increase of CO2 concentrations at roughly 2 ppm per year. By 2100, a year some now alive will see, we’d see a rise of 1.8 C above today’s temperatures, or something like 2.5 C above pre-Industrial conditions.
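Spelled out, the arithmetic in that paragraph is just the following (the 0.7 °C pre-industrial-to-present offset is implied by the gap between the 2.5 and 1.8 figures):

```python
rate = 0.2 / 10              # 0.2 °C per decade = 0.02 °C per year
years = 2100 - 2010          # roughly "today" to 2100
rise = rate * years          # 1.8 °C above today's temperatures
preindustrial_offset = 0.7   # implied by the gap between 2.5 and 1.8 above
total = rise + preindustrial_offset  # about 2.5 °C above pre-industrial
```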
Given the more or less known results of such conditions on the cryosphere, soil moisture, and specific humidity, as well as the probable effects on biodiversity, is there any conceivable way that such numbers (conservative, remember) can fail to be highly damaging, at best?
The “exact amount” is constrained by about a dozen separate lines of evidence, all of which favor a value around 3 degrees per doubling, and which together constrain the effect to between 2 and 4.5 degrees per doubling. Even 2 degrees per doubling is dangerous if we get above 600-700 ppmv. Are you suggesting we bet the future of humanity on a 20:1 longshot?
And you asked for a demonstration that other explanations cannot explain the data. Well, do you know of any models that don’t suggest substantial warming from anthropogenic GHG AND which actually make correct predictions? I’ve shown you the correct predictions. Are you now suggesting we move the goalposts?
Ray Ladbury wrote:
The list of seventeen predictions made by the models that Barton Paul Levenson put together is impressive, particularly when he gives the references for the years that the predictions were made and the years that corroborating evidence was discovered. Well worth checking out — and looking up the literature if you get the chance, no doubt. (Some of it I have already seen, but I hope to look up the rest as soon as I get the chance.)
“There is an easy bias : just collecting everything going wrong on the Earth and say ‘look, it’s GW’. Actually astrologists just do that : collecting all catastrophes and say ‘oh it’s Pluto/Mars/jupiter …’”
Strawman. Climate science doesn’t do what you describe. Please stick to arguments that are based on accurate assessments of what climate science actually tells us.
Actually, Foster & Rahmstorf by itself puts transient sensitivity at very close to 2°C, as follows:
1. Mean regression slope of all adjusted global datasets is .17°/decade.
2. Regression slope for Aggregate Greenhouse Gas Index is .346 W/m²/decade. (http://www.esrl.noaa.gov/gmd/aggi/)
3. Sensitivity is therefore .17/.346 = .49 K/W/m².
4. CO2 doubling (the conventional measure of sensitivity) will provide an additional 4.0 W/m².
5. Expected temperature increase from 4.0 W/m² is therefore 4.0 * .49=1.96°C.
That’s the transient sensitivity from 30 years of measurement. The equilibrium sensitivity will be higher, because of long-term positive feedbacks from things like vegetation changes and ice-sheet changes.
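The arithmetic of steps 1-5 is easy to check (whether the ratio of two trends is a valid sensitivity estimate is a separate question):

```python
temp_trend = 0.17         # °C per decade: mean adjusted-data trend (step 1)
forcing_trend = 0.346     # W/m² per decade: AGGI trend (step 2)
sensitivity = temp_trend / forcing_trend  # step 3: about 0.49 K per W/m²
f_2xco2 = 4.0             # step 4: forcing from a CO2 doubling, W/m²
tcr = f_2xco2 * sensitivity               # step 5: about 1.96 °C
```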
[Response: Correction: F&R 2011 didn’t compute sensitivity. You did. And your computation ignores time scales altogether. If greenhouse gases suddenly stopped increasing but temperature continued to rise, would you conclude that sensitivity = .17/0 = infinity?]
Sorry, I didn’t mean to imply my computation of sensitivity was F&R’s, just that I used their data. Regarding timescales, both AGGI and F&R use 1979-2010, so I don’t see the issue. If temperature stopped rising and AGGI didn’t, the computed sensitivity would decrease. If AGGI stopped rising and temperature didn’t, the computed sensitivity would increase. In either case, the linearity of one of the datasets would be broken, which would require explanation.
In any case, the derived value of 2° for transient sensitivity is pretty much exactly what others have derived.
KAP, I would question whether you can compute a sensitivity as you did–for the simple fact that the climate has not returned to radiative equilibrium.
Which is why the computed sensitivity is NOT an equilibrium sensitivity, but a transient one. An equilibrium sensitivity would be higher. The number computed is close to other computations of transient sensitivity.
“. . . the transient climate response (TCR). . . is defined as the average temperature response over a twenty year period centered at CO2 doubling in a transient simulation with CO2 increasing at 1% per year.”
Or, “the global mean temperature change that is realized AT THE TIME of CO2 doubling under an idealised scenario in which CO2 concentrations increase by 1% yr–1.”
I have to say that I don’t fully understand this definition (or at least how it applies in practice.) But it’s clearly a different calculation than the one given above.
Tamino, with your permission I’d like to try adapting this analysis (fully referenced to F&R 2011 and Open Mind) as an example for a grad-level stats book. Is that something you’d be interested in discussing offline?
[Response: Not really, but feel free to adapt it as you see fit. If you have a question about some important aspect, feel free to post it.]
Just spotted this.
Interesting NASA graphic on current conditions, too.
j: I am not saying that GHG are not emitted by mankind, nor am I saying they don’t contribute to global warming. I’m just saying that the exact amount is still uncertain, and so is the future “danger” – compared to many others that could threaten our society.
BPL: What part of “every harvest in the world will fail and we’ll all starve to death” do you not understand?
One of the (many) things that escape the denialist mind is that when an arbitrary mean warming-over-Pre-Industrial temperature occurs – say, a 2 degree Celsius increase in mean global temperature – the full biological/agricultural effects will not be immediately realised.
First, there’s the inevitable temperature overshoot that will occur as a planet’s thermal mass and complex energy-relocating systems continue to move toward an eventual equilibrium. This message is slowly starting to sink in. However, there will also be a lag in species and ecosystem responses, and also in abiotic, temperature-dependent parameters that impact on biological processes, with further delayed biological responses. Some of these responses will take years to occur after eventual equilibrium, some will require decades, and some will not manifest for centuries or even for millennia.
Most non-scientists are trapped by the type of thinking that imagines that all consequences are resolved in time-spans on the scale of the end of a financial year, or by the end of an electoral cycle. Not so. Temperature equilibrium will seem instantaneous compared to the time that it will take for biological equilibrium to occur.
I expect that there will be many future generations seeking out the graves of the Watts, the Moncktons, and the Bolts of the early 21st century world, on which to spit – or worse – long after a new stable temperature regime has been achieved…
If they are remembered at all. . .
I forecast a brisk sale in diuretics and laxatives at the inns that cater to this pilgrimage trade.
A simple “back of the envelope” calculation gives 2011 as the ~2nd warmest year.
1. I made a correction only for ENSO, using the Multivariate ENSO Index (MEI).
2. With Excel I got the best linear fit to the 1990-2011 yearly GISS data.
3. For each year, I took the temp minus the value from the line.
4. I averaged MEI over 14 months (Nov, Dec, then Jan-Dec), because it takes some months for ENSO to spread its influence.
5. I plotted average MEI against the result from step 3.
6. I took out 4 years: 3 after Pinatubo plus 1 that looks off the line.
7. With Excel I got the best fit (Pearson = 0.888) and used the slope times the average MEI.
A better correlation (Pearson = 0.91) comes from a 16-month average:
(Sep + Oct + 2(Nov – Oct) + Nov + Dec)/26
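That procedure can be sketched in Python, using synthetic data in place of the GISS and MEI series (all numbers here are made up for illustration; only the structure of the calculation matches):

```python
import numpy as np

# Synthetic stand-ins for the 1990-2011 yearly GISS anomalies and averaged MEI
years = np.arange(1990, 2012)
rng = np.random.default_rng(0)
mei_avg = rng.normal(0, 1, years.size)      # 14-month average MEI per year
temp = (0.4 + 0.017 * (years - 1990)        # underlying trend
        + 0.08 * mei_avg                    # ENSO influence (assumed size)
        + rng.normal(0, 0.02, years.size))  # everything else

# Best-fit line to the yearly temperatures
trend = np.polyfit(years, temp, 1)
# Each year's deviation from the line
resid = temp - np.polyval(trend, years)

# Regress the deviations on averaged MEI, keep the slope
slope, intercept = np.polyfit(mei_avg, resid, 1)
r = np.corrcoef(mei_avg, resid)[0, 1]       # Pearson correlation

# The ENSO-corrected temperatures
corrected = temp - slope * mei_avg
```

The step of dropping the Pinatubo years is omitted here, since the synthetic data have no volcanic signal.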