The global warming story has in part been based on a 1998 paper that showed a "hockey stick" increase in temperatures in the 20th century. Unfortunately that hockey stick looks to have been based on iffy statistical analysis of iffy data rather than on anything solid.
See this MIT Technology Review article, and for the statistically aware, the raw source and further details are at the authors' website. There are numerous relevant pages linked from the one I linked to.
The key bit IMO (from the MIT article):
Canadian scientists Stephen McIntyre and Ross McKitrick have uncovered a fundamental mathematical flaw in the computer program that was used to produce the hockey stick. In his original publications of the stick, Mann purported to use a standard method known as principal component analysis, or PCA, to find the dominant features in a set of more than 70 different climate records.
But it wasn’t so. McIntyre and McKitrick obtained part of the program that Mann used, and they found serious problems. Not only does the program not do conventional PCA, but it handles data normalization in a way that can only be described as mistaken.
Now comes the real shocker. This improper normalization procedure tends to emphasize any data that do have the hockey stick shape, and to suppress all data that do not. To demonstrate this effect, McIntyre and McKitrick created some meaningless test data that had, on average, no trends. This method of generating random data is called “Monte Carlo” analysis, after the famous casino, and it is widely used in statistical analysis to test procedures. When McIntyre and McKitrick fed these random data into the Mann procedure, out popped a hockey stick shape!
That discovery hit me like a bombshell, and I suspect it is having the same effect on many others. Suddenly the hockey stick, the poster-child of the global warming community, turns out to be an artifact of poor mathematics. How could it happen? What is going on? Let me digress into a short technical discussion of how this incredible error took place.
In PCA and similar techniques, each of the (in this case, typically 70) different data sets have their averages subtracted (so they have a mean of zero), and then are multiplied by a number to make their average variation around that mean to be equal to one; in technical jargon, we say that each data set is normalized to zero mean and unit variance. In standard PCA, each data set is normalized over its complete data period; for key climate data sets that Mann used to create his hockey stick graph, this was the interval 1400-1980. But the computer program Mann used did not do that. Instead, it forced each data set to have zero mean for the time period 1902-1980, and to match the historical records for this interval. This is the time when the historical temperature is well known, so this procedure does guarantee the most accurate temperature scale. But it completely screws up PCA. PCA is mostly concerned with the data sets that have high variance, and the Mann normalization procedure tends to give very high variance to any data set with a hockey stick shape. (Such data sets have zero mean only over the 1902-1980 period, not over the longer 1400-1980 period.)
The net result: the “principal component” will have a hockey stick shape even if most of the data do not.
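For the statistically curious, here is a minimal sketch (in Python, and emphatically not Mann's actual program) of the effect Muller describes: generate a batch of trendless red-noise series, normalize them once over the full 1400-1980 period and once over a short 1902-1980 "calibration" window, and compare the leading principal component. The series count, the AR(1) persistence, and the "hockey stick index" are all illustrative assumptions on my part, loosely following the Monte Carlo setup McIntyre and McKitrick describe.

```python
# Minimal sketch of the "short centering" artifact: random, trendless data
# normalized over a short calibration window can yield a hockey-stick PC1.
# All parameters here are illustrative assumptions, not MBH98's actual values.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1400, 1981)          # 581 annual values per series
n_series, n_years = 70, len(years)
calib = (years >= 1902)                # the 1902-1980 "calibration" window

# Trendless red noise: x_t = phi * x_{t-1} + eps_t, with no underlying trend
phi = 0.9
eps = rng.standard_normal((n_series, n_years))
data = np.zeros_like(eps)
for t in range(1, n_years):
    data[:, t] = phi * data[:, t - 1] + eps[:, t]

def leading_pc(x, mask):
    """Normalize each series to zero mean / unit variance over the masked
    window, then return the leading principal component via SVD."""
    mu = x[:, mask].mean(axis=1, keepdims=True)
    sd = x[:, mask].std(axis=1, keepdims=True)
    normalized = (x - mu) / sd
    _, _, vt = np.linalg.svd(normalized, full_matrices=False)
    return vt[0]                       # PC1 as a pattern across the years

pc1_standard = leading_pc(data, np.ones(n_years, dtype=bool))   # 1400-1980
pc1_short    = leading_pc(data, calib)                          # 1902-1980 only

def hockey_stick_index(pc):
    """How far the 1902-1980 mean departs from the whole-series mean,
    in units of the whole-series standard deviation (sign ignored)."""
    return abs(pc[calib].mean() - pc.mean()) / pc.std()

print("standard centering :", round(hockey_stick_index(pc1_standard), 2))
print("short centering    :", round(hockey_stick_index(pc1_short), 2))
# Under short centering the index is typically much larger: PC1 bends away
# sharply in the calibration window even though the inputs have no real trend.
```

Run it with a few different seeds and the short-centered PC1 tends to score well above the conventionally centered one, which is exactly the artifact the article describes: the shape comes from the normalization choice, not from the data.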
As the MIT article says, this doesn't disprove global warming per se, but it most certainly means that some of the more alarmist claims need to be re-examined. After all, about the worst outcome of all would be for policy makers to conclude that the environmentalists were just BSing when in fact there was a real problem struggling to be seen beneath all the shoddy data.