A question was asked recently:
Sorry, it’s about the Arctic not the Antarctic, but the Arctic thread seems to be already closed for commenting. There’s a thing I can’t understand about Arctic sea ice – how come according to NSIDC the maximum sea ice extent in 2013 was the 6th lowest in satellite history while JAXA is showing it to be higher than the 2000′s (not even 2010′s) average?
The short answer is: in the JAXA plot, the line labelled “2000′s Average” isn’t the 2000′s average.
That’s because JAXA data don’t start until June of 2002. Therefore they don’t record the maximum extent in 2000, 2001, or 2002.
We can compare extent figures from NSIDC and JAXA:
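For anyone who wants to reproduce this kind of comparison, here’s a rough Python sketch (not the code behind the plots here). It assumes the daily extent data from each source have already been downloaded to local CSV files; the file names and column names below are placeholders, and extent is in millions of square kilometers:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder file names: assume each CSV has a "date" column and an
# "extent" column giving daily Arctic sea ice extent in million km^2.
nsidc = pd.read_csv("nsidc_daily_extent.csv",
                    parse_dates=["date"], index_col="date")["extent"]
jaxa = pd.read_csv("jaxa_daily_extent.csv",
                   parse_dates=["date"], index_col="date")["extent"]

# Overlay the two daily series on a common time axis.
fig, ax = plt.subplots()
nsidc.plot(ax=ax, label="NSIDC")
jaxa.plot(ax=ax, label="JAXA")
ax.set_ylabel("Sea ice extent (million km$^2$)")
ax.legend()
plt.show()
```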
They reach clearly different annual maxima, with NSIDC reporting more ice at maximum than JAXA, but their annual minima are about the same. If we look at the difference:
we see there’s a strong seasonal pattern. I don’t know, but I suspect that’s because there are regions which NSIDC includes but JAXA doesn’t (perhaps Hudson Bay and/or the Great Lakes?). But that’s not the reason for the puzzlement.
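The difference itself is easy to compute once the two series are restricted to the dates they have in common (a sketch, continuing with the nsidc and jaxa series from above; the overlap only starts in June 2002, where JAXA begins):

```python
import matplotlib.pyplot as plt

# nsidc, jaxa: daily extent Series from the earlier sketch.
# Restrict to dates present in both records before subtracting.
common = nsidc.index.intersection(jaxa.index)
diff = nsidc.loc[common] - jaxa.loc[common]

diff.plot()
plt.ylabel("NSIDC minus JAXA (million km$^2$)")
plt.show()
```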
If we compute the annual maximum extent for each record we get this:
Note that although they’re offset from each other, by and large they show the same pattern of changes. The horizontal lines show the averages over the time span plotted, but that’s not a true “2000′s Average” because the span doesn’t include 2000, 2001, or 2002.
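Here’s a sketch of that step, continuing with the same daily series. It restricts both records to complete years (2003 onward, since JAXA’s record begins after the 2002 maximum) so the annual maxima and their averages cover the same span:

```python
import matplotlib.pyplot as plt

# Complete years only: JAXA begins in June 2002, after that year's
# maximum, so 2003 is its first full year.
nsidc_full = nsidc.loc["2003":]
jaxa_full = jaxa.loc["2003":]
nsidc_max = nsidc_full.groupby(nsidc_full.index.year).max()
jaxa_max = jaxa_full.groupby(jaxa_full.index.year).max()

fig, ax = plt.subplots()
nsidc_max.plot(ax=ax, marker="o", label="NSIDC annual max")
jaxa_max.plot(ax=ax, marker="o", label="JAXA annual max")
# Horizontal lines: each record's mean maximum over the years plotted
# (not a true 2000's average, since 2000-2002 are missing).
ax.axhline(nsidc_max.mean(), linestyle="--")
ax.axhline(jaxa_max.mean(), linestyle="--")
ax.set_ylabel("Annual maximum extent (million km$^2$)")
ax.legend()
plt.show()
```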
We can also plot the ranks of the annual maxima:
which shows that this year did indeed reach the 6th-lowest maximum in the NSIDC record, but the 5th-lowest in the JAXA record.
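Ranking works the same way for both records; a sketch, using the nsidc_max and jaxa_max series computed above:

```python
import matplotlib.pyplot as plt

# nsidc_max, jaxa_max: annual maximum Series from the previous sketch.
# Rank each year's maximum; rank 1 = lowest maximum in that record.
nsidc_rank = nsidc_max.rank(method="min")
jaxa_rank = jaxa_max.rank(method="min")

print("2013 rank, NSIDC:", int(nsidc_rank.loc[2013]))
print("2013 rank, JAXA: ", int(jaxa_rank.loc[2013]))

fig, ax = plt.subplots()
nsidc_rank.plot(ax=ax, marker="o", label="NSIDC")
jaxa_rank.plot(ax=ax, marker="o", label="JAXA")
ax.set_ylabel("Rank of annual maximum (1 = lowest)")
ax.legend()
plt.show()
```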
If we compute the annual maxima over each record’s full span (which brings in a lot more data from NSIDC, whose satellite record goes back to the late 1970s) we see the trend in maximum extent:
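A simple way to estimate that trend is an ordinary least-squares fit of annual maximum against year, this time over a record’s full span; here’s a sketch for NSIDC:

```python
import numpy as np

# Annual maxima over the full NSIDC record (in practice you'd drop any
# incomplete first or last year before fitting).
nsidc_all_max = nsidc.groupby(nsidc.index.year).max()

# Ordinary least-squares trend of maximum extent versus year.
years = nsidc_all_max.index.values.astype(float)
slope, intercept = np.polyfit(years, nsidc_all_max.values, 1)
print(f"NSIDC trend in annual maximum: {slope:.4f} million km^2 per year")
```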