
Understanding what’s going on in climatology

There are two good posts today that help make sense of the brouhaha in climatology. Iowahawk has Fables of the Reconstruction (Or, How to Make Your Own Hockey Stick), a step-by-step tutorial on how to replicate Mann’s famous hockey stick graph, complete with OOo Calc spreadsheets.

My goal was to provide interested people with a hands-on DIY example of the basic statistical methodology underlying temperature reconstruction, at least as practiced by the leading lights of “Climate Science.” … Is there anything wrong with this methodology? Not in principle. In fact there’s a lot to recommend it. … The devil, as they say is in the details. In each of the steps there is some leeway for, shall we say, intervention.
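The basic methodology Iowahawk walks through can be sketched with synthetic data. This is not his spreadsheet nor Mann’s actual PCA-based method, just a minimal illustration of the calibrate-then-reconstruct idea: fit a proxy series (say, tree-ring widths) to instrumental temperatures over the period where both exist, then apply that fitted relation to the full length of the proxy record. All the numbers here are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: a 600-year proxy series and a shorter
# instrumental temperature record that overlaps its end.
years = np.arange(1400, 2000)
proxy = np.cumsum(rng.normal(0, 0.1, years.size))   # made-up proxy record
overlap = years >= 1880                             # instrumental era
temp_obs = 0.5 * proxy[overlap] + rng.normal(0, 0.05, overlap.sum())

# Step 1: calibrate -- fit the proxy to temperature over the overlap.
slope, intercept = np.polyfit(proxy[overlap], temp_obs, 1)

# Step 2: reconstruct -- apply the fitted relation to the whole record.
temp_recon = slope * proxy + intercept
```

The “leeway for intervention” Iowahawk describes lives in choices this sketch glosses over: which proxies to include, how to standardize them, and which period to calibrate against.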

That deals with what is called “homogenized” data, where temperature records have been ‘adjusted’ to smooth out variability and compensate for known errors. The rationale for doing this has much to do with such things as measurement accuracy, with error noted to be greater than 2°C for two thirds of US surface stations, as well as the fact that temperature is measured at a point in space and time while what you are really after is atmospheric heat content.
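One common homogenization adjustment can be sketched in a few lines (a hypothetical example with synthetic data, not any agency’s actual algorithm): when a station record has a documented discontinuity such as a site move, the offset is estimated from the means on either side of the breakpoint and subtracted out.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic station record with a known discontinuity: suppose the
# station moved in year 50, introducing a spurious +1.5 C step.
n = 100
true_temp = 15 + rng.normal(0, 0.3, n)
raw = true_temp.copy()
raw[50:] += 1.5                       # site-change artifact

# Estimate the offset across the documented breakpoint and remove it.
offset = raw[50:].mean() - raw[:50].mean()
homogenized = raw.copy()
homogenized[50:] -= offset
```

Note the catch: this adjustment also removes any real change that happened to coincide with the breakpoint, which is exactly the kind of assumption buried in each manipulation step.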

Basil Copeland takes on the temperature measurement problem in Would You Like Your Temperature Data Homogenized, or Pasteurized?. His point is that you don’t want to remove inhomogeneities but rather just clean up the data, since the lumps and their distribution carry meaning, too.

with temperature data, I want very much to see the natural variability in the data. And I cannot see that with linear trends fitted through homogenized data. It may be a hokey analogy, but I want my data pasteurized – as clean as it can be – but not homogenized so that I cannot see the true and full range of natural climate variability.
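Copeland’s point can be put numerically with a toy example (synthetic data, with a moving average standing in for “homogenization”): smoothing a series before fitting a linear trend suppresses exactly the year-to-year variability he wants to see.

```python
import numpy as np

rng = np.random.default_rng(2)

# A noisy series with real year-to-year variability plus a small trend.
years = np.arange(100)
raw = 0.01 * years + rng.normal(0, 0.5, years.size)

# A 10-year moving average, standing in for homogenization's smoothing.
kernel = np.ones(10) / 10
smoothed = np.convolve(raw, kernel, mode="valid")

def detrended_sd(y):
    """Standard deviation of residuals around a fitted linear trend."""
    x = np.arange(y.size)
    fit = np.polyval(np.polyfit(x, y, 1), x)
    return (y - fit).std()
```

Comparing `detrended_sd(raw)` with `detrended_sd(smoothed)` shows the smoothed series retains far less residual variability around its trend, even though both contain the same underlying signal.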

The problem with any of these approaches to cleaning raw temperature data gets back to what Iowahawk illustrated, and it is fundamental to understanding the accuracy of climate studies: each step of data manipulation brings in assumptions and sources of error.

That is where ideas such as those noted in Models for gravity and heat become interesting. Instead of temperatures being a starting point for analysis and calculation, they become an end point, a verification of a model. The model describes the variables involved in determining temperature at a given point in time and space. The problem with current models is that they are very good for current conditions but their accuracy rapidly degrades the farther from the present you go. That makes them nearly useless for climatology. It is, in turn, both why climatology turns elsewhere and an illustration of just how tough the problem of predicting climate change is.
