L'Ombre de l'Olivier

The Shadow of the Olive Tree

being the maunderings of an Englishman on the Côte d'Azur

12 November 2008

Open Source Science

There's been a bit of a brouhaha in the climate blogosphere recently over what can only be described as an embarrassing mistake by the scientists behind the GISTEMP metric. Needless to say, the error was spotted by the AGW-skeptical blogs and their commenters, who not only queried the oddness but dug in and figured out what was wrong: in a number of locations, September's data had been used in place of October's.
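A duplication like that is exactly the kind of thing a cheap automated sanity check could flag before publication. Here is a minimal sketch of such a check; the station IDs, values, and function name are invented for illustration and bear no relation to the actual GISTEMP code:

```python
def find_duplicated_months(station_data, month_a, month_b):
    """Return station IDs whose month_b values exactly repeat month_a.

    station_data maps station ID -> {month: mean temperature}.
    An exact repeat of a full month's value at many stations is far
    more likely to be a data-handling error than a real coincidence.
    """
    return [
        station
        for station, months in station_data.items()
        if months.get(month_a) is not None
        and months.get(month_a) == months.get(month_b)
    ]

# Invented example data for two stations (monthly means, degrees C):
data = {
    "STATION_A": {"2008-09": 12.4, "2008-10": 12.4},  # suspicious repeat
    "STATION_B": {"2008-09": 15.1, "2008-10": 9.8},   # normal seasonal drop
}

print(find_duplicated_months(data, "2008-09", "2008-10"))  # -> ['STATION_A']
```

A real check would of course look at the fraction of stations repeating, not individual ones, but the point stands: this is a few lines of code, not a research programme.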

The grown-up response to this ought to be "oops, we goofed, and thanks for spotting it", but the actual response is rather more, umm, prickly. I'm not going to pick on the prickly author (or even make cheap shots), because a) others have already done it and b) it's not actually helpful. However, there are some things in the comments on this prickly response that I am going to address.

First is this one:

Watts, exaggerates every finding he has, even when there is actual cause for concern; he also is trying to discredit every weather station in the country, as if no other data exists and suddenly every thermometer is improperly placed and is not calibrated…funny really, and many people follow him as if he is the best bet to discredit AGW. I agree with Eyal, I wish this were all not true, however, scientists and science itself is not incompetent in this day and age to be wrong; should we feel good the science is good enough to offer warning or angry that it is correct and so little is being done?

Now I admit this is just a semi-anonymous blog commenter. Blog commenters are quite often more extreme than the actual owner of the blog, so this commenter may just be a random kook; however, I suspect the bolded bit is a not uncommon reaction. It is also extremely dangerous. Right now governments are passing laws restricting CO2 emissions and so on because they have been told that CO2 is causing global warming, that global warming is increasing at over 2°C per century, and that if the world sees another century of 2+°C temperature rise it will be bad for mankind. I'm not going to quibble with the third assumption (I'll leave that to Bjorn Lomborg), but the other two are worth examining. If, for example, the world is not heating up at 2+°C/century but at something rather more modest (say 1°C), then the urgency for a fix is reduced. And if, as is not impossible, CO2 is not the main cause of any rise, then curbing CO2 is not going to have any effect. The surface stations project is intended to get a feel for the accuracy of the measurements that report the 2°C increase. If this data is inaccurate then we're looking at GIGO, and all the corrections and adjustments in the world won't extract the real signal.

So complaining about someone actually auditing the raw data sources sounds more like religion than science. It should be noted that what Watts and the other volunteers have found is that the surface station network is of mixed quality and that some of the automated adjustment algorithms seem questionable. This ought to be something that people welcome (and indeed some scientists do), yet many are like this commenter and apparently think this kind of questioning is akin to heresy.

Next there is this exchange:

I think the complaints about GISS data-checking might be well-taken :-)

But I think that means that commenters should:

a) Write their Reps and Senators demanding that GISS’s budget be increased.
b) Send a check with every request for GISS to do more work.

1) How much more staff and $$ would you need to do all the things that RC readers here wish you to do?
2) If you had that extra resource, is that the way you’d spend it, or would other things have higher priority?

[Response: Good questions. 1) Current staffing from the GISTEMP analysis is about 0.25 FTE on an annualised basis (i’d estimate - it is not a specifically funded GISS activity). To be able to check every station individually (rather than using an automated system), compare data to the weather underground site for every month, redo the averaging from the daily numbers to the monthly to double check NOAA’s work etc., to rewrite the code to make it more accessible, we would need maybe a half a dozen people working on this. With overhead, salary+fringe, that’s over $500,000 a year extra. All contributions welcome! 2) No. Those jobs are better done at NOAA who have a specific mandate from Congress to do these things. With extra resources, I’d hire experts on ice sheet models, cloud parameterisations, model analysts and programmers. - gavin]

This comment and response are highly reminiscent of the arguments put forward by Microsoft and others with regard to open source software, and really they can be defeated in much the same way. A lot of Eric S. Raymond's "The Cathedral and the Bazaar" essay would seem to fit here. Instead of getting uptight and demanding a bigger budget for their own research, it seems to me a sensible approach would be to leverage the crowds of skeptics who are willing to pore through the evidence for free. The surface stations project is one example, and the work by Steve McIntyre and his commentariat another. The McIntyre commentariat seem to be well on the way to doing precisely what Dr Schmidt (gavin) is asking for in terms of rewriting code. Of course they aren't rewriting all of it, but they have rewritten chunks, and no doubt they could rewrite (and audit) more if they received more cooperation from the owners of the code and the data so that they could check their intermediate results.

Currently the climate science establishment seems to be acting very much like Microsoft & co. with respect to their open source rivals, and this does not fill me with confidence that the establishment is right.