Iowahawk – the funniest man on the internet – explained the dealio with the leaked/hacked emails/files from CRU, how these emails/files impact the debate on Anthropogenic Global Warming, and how all of this plays out on the global stage. I’m taking this from Ace of Spades’ post here and reposting it because people should read it to understand it. All 3 of you who read my blog. (Thank you!)
For the benefit of true believers like Stewart, lemme see if I can squeeze this whole methodology thing into a nutshell:
Jones, Mann, et al., practice paleoclimatology; that is, the statistical reconstruction of historic climate records. Their approach works something like this —
let y = a time series of observed global temperature records.
Unfortunately, reliable time series only go back 100-150 years or so, a blip on the geological time scale. To figure out if there is any sort of significant millennial trend, the series needs to go much farther back, 1000 years or so. Great-grampa Ogg was too busy avoiding plague-infested rats to write down the temperature, so we need to deduce it from “proxy variables”: measurements of annually striated phenomena like tree rings, ice core samples and so on. So…
let x1 … xp = a time series array of proxy variables.
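In code terms, the setup looks something like this (a toy sketch with made-up numbers, not CRU’s actual data; every name and value here is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# y: ~150 years of observed (instrumental) global temperatures.
y = 14 + 0.01 * np.arange(150) + rng.normal(0, 0.2, 150)

# x1 ... xp: p proxy series (tree rings, ice cores, ...) spanning ~1000 years,
# expressed in their own units (ring widths, densities), NOT degrees.
n_years, p = 1000, 8
X = rng.normal(0, 1, (n_years, p))

# The mismatch is the whole problem: X covers 1000 years, y only the last 150.
print(X.shape, y.shape)  # (1000, 8) (150,)
```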
Great! Now them thar proxy variable records will get us back 1000 years. But they’re expressed in measures of tree ring width, band coloration, ice density, etc., not in temperature. And contrary to popular belief there isn’t a physical law or textbook formula that converts these proxy measures into temperature. To do this Mann, et al., use a statistical approach —
1. Perform a Principal Components Analysis (PCA) of the proxy variables. PCA is a standard statistical technique for linearly transforming/reducing a set of raw correlated variables (x1 … xp) into a set of variables called Principal Components (PC1 … PCp) which retain the information in the original data. The PCs are orthogonal (uncorrelated) with one another.
2. Next, Mann et al. regressed the 100 years or so of observed temperatures against the proxy variable principal components:
y = b0 + b1*PC1 + b2*PC2 + … + bp*PCp + error
The regression coefficients (b’s) estimated from recent data were then applied to the older proxy PCs to obtain retrospective “backcasts” or “hindcasts” of the temperatures in 1015, 1016, … 1850.
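Putting steps 1 and 2 together, the whole pipeline can be sketched in a few lines of NumPy. This is a toy reconstruction on synthetic data; it assumes nothing about Mann’s actual implementation, and every variable in it is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic world: 1000 years of hidden "climate", 5 noisy proxies of it,
# and instrumental temperatures for only the last 150 years.
n_years, n_proxies, n_obs = 1000, 5, 150
true_signal = np.linspace(0.0, 2.0, n_years) + np.cumsum(rng.normal(0, 0.03, n_years))
X = true_signal[:, None] + rng.normal(0, 1.0, (n_years, n_proxies))
y_obs = true_signal[-n_obs:] + rng.normal(0, 0.1, n_obs)

# Step 1: PCA of the proxies (center, then extract components via SVD).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
PCs = Xc @ Vt.T                      # columns are PC1 ... PCp

# Step 2: regress the observed temperatures on the recent rows of the PCs,
# i.e. y = b0 + b1*PC1 + ... + bp*PCp + error.
A = np.column_stack([np.ones(n_obs), PCs[-n_obs:]])
b, *_ = np.linalg.lstsq(A, y_obs, rcond=None)

# Apply the fitted coefficients to the older PCs to "backcast" the
# pre-instrumental temperatures.
A_full = np.column_stack([np.ones(n_years), PCs])
reconstruction = A_full @ b
backcast = reconstruction[:-n_obs]   # the 850 years with no thermometer data
print(backcast.shape)                # (850,)
```

Note the leverage in this design: the 150 years of observed y are the only thing anchoring the proxy units to actual degrees, which is why the quality of y matters so much later in the story.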
Voila! The Mann et al. statistical model resulted in the now infamous hockey stick, showing a radical increase in global temperatures in recent years versus the relatively flat millennial variation. This was in large part the basis for the IPCC report.
The initial controversy about this result was raised by McIntyre and McKitrick (M&M), who noted that the backcasts of Mann’s reconstructed temperatures didn’t reproduce the amplitude of the Medieval “warm period” or the subsequent “little ice age” that previous research had estimated. That previous work suggested the recent uptick in temperatures is no big whoop compared to previous periods in the past 1000 years, but Mann’s result showed it off the charts. They published a couple of papers suggesting the flat reconstructed historical temperatures were artifacts of Mann’s selection of a time frame for extracting principal components (see step 1 above), which artificially suppressed the variation in the temperature backcasts. This is likely what the CRU emails were talking about when they referred to “Mike’s Nature trick.” This artifact explanation was largely confirmed by George Mason U statistician EJ Wegman (methods editor for JASA), who blistered Mann’s model in a 2006 report commissioned by the congressional Energy & Commerce Committee. Amusingly, Wegman showed that replacing Mann’s proxy data with repeated samples of trendless red noise continued to produce the same hockey stick shape.
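That demonstration is easy to replicate in spirit. The sketch below is my own toy code, not Wegman’s: it feeds persistent red noise (random series with no real trend, as used in published replications of this critique) through PCA twice, once centered over the full record and once “short-centered” over only the recent calibration window, and scores how hockey-stick-shaped the first PC comes out:

```python
import numpy as np

def red_noise(n_years, n_series, rng, phi=0.9):
    """AR(1) 'red noise': each series wanders persistently but has no trend."""
    X = np.zeros((n_years, n_series))
    for t in range(1, n_years):
        X[t] = phi * X[t - 1] + rng.normal(0, 1, n_series)
    return X

def pc1(X, short_center=False, cal=100):
    """First principal component; optionally center on the last `cal` years only."""
    mean = X[-cal:].mean(axis=0) if short_center else X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[0]

def stick_index(pc, cal=100):
    """Blade offset (calibration era vs. the rest) relative to shaft wiggle."""
    return abs(pc[-cal:].mean() - pc[:-cal].mean()) / pc[:-cal].std()

rng = np.random.default_rng(0)
short_scores, full_scores = [], []
for _ in range(10):                           # average over 10 noise worlds
    X = red_noise(600, 50, rng)
    short_scores.append(stick_index(pc1(X, short_center=True)))
    full_scores.append(stick_index(pc1(X, short_center=False)))

# Short-centering systematically mines trendless noise for hockey sticks.
print(round(np.mean(short_scores), 2), round(np.mean(full_scores), 2))
```

The intuition: centering only on the recent window means any series that happens to wander during that window gets a large apparent offset everywhere else, and PCA dutifully promotes exactly those series into PC1.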
Now here’s where the fun begins. McIntyre and McKitrick wanted to follow up on their research, and asked Mann and Jones for their source data. This is where M&J started stonewalling, to the point where M&M made FOIA requests, which were ignored. The emails give some sense of how desperately the CRU group wanted to avoid providing it. Why? Because, I suspect (and it seems obvious from the “harry_read_me.txt” programmer’s notes), the basic observed temperature variables — the linchpin of truth in Mann’s model — are hopelessly, utterly corrupted.
Now, if you’ve been following this, Mann’s entire temperature reconstruction method rests on knowing (observing) recent periodic global temperatures, y. Quibbling about principal components aside, that’s the dependent variable in the backcasts. But as is now becoming increasingly plain, y was constructed from an undocumented process that took raw ground station data and ran it through a black box that included smoothing, filtering, inference, manipulation, baling wire, glue and the juice of one whole lemon. This is what the CRU people are calling “value-added homogenized data.” Or what normal people call “made up horseshit.” It’s also the temperature data that dozens, if not hundreds, of AGW studies are based on.
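To make “value-added” concrete, here is a toy illustration (entirely my own invention, not CRU’s actual procedure) of how an undocumented infill-and-smooth step yields a y that is not the same series as the raw station average:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "raw ground station data": 100 years x 20 stations, 20% of readings lost.
raw = 10 + 0.005 * np.arange(100)[:, None] + rng.normal(0, 0.5, (100, 20))
raw[rng.random(raw.shape) < 0.2] = np.nan

# Version 1 of y: a plain average of whatever raw readings exist each year.
raw_mean = np.nanmean(raw, axis=1)

# Version 2 of y, "value-added": infill gaps with each station's mean, then
# smooth with a 10-year moving average. Reasonable-sounding, but every choice
# here is an undocumented judgment call that reshapes the record.
filled = np.where(np.isnan(raw), np.nanmean(raw, axis=0), raw)
annual = filled.mean(axis=1)
homogenized = np.convolve(annual, np.ones(10) / 10, mode="valid")

# The smoothing even silently changes the record length (100 -> 91 values).
print(homogenized.shape)  # (91,)
```

The point is not that infilling or smoothing is inherently wrong; it’s that without the source code, nobody downstream can tell which of these choices were made, or what they did to the trend.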
In the last few days, the CRU has cynically offered to “share the data,” but what they are offering to share is this numerical sausage. What they won’t share is the source code for their computational raw data meatgrinder, which I suspect contains a treasure trove of numerical shenanigans. –
Iowahawk. American Humor Treasure.
And, since I’m ripping off Ace already, I’m reposting this supertastic post by Michael Crichton on how Space Aliens Caused Global Warming. I miss him. Badly.
Now, maybe, we can get back to when science was science and there was none of that “consensus” nonsense. Consensus science brings us things like the Spanish Inquisition.