Over the past week or so, the Bureau of Meteorology has stood accused of fudging its temperature data records to emphasise warming, in a series of articles in The Australian. The accusation hinges on the method that the Bureau uses to remove non-climate-related changes in its weather station data, referred to as “data homogenisation”. If true, this would be very serious because these data sets underpin major climate research projects, including deducing how much Australia is warming. But it’s not true.
Lisa Alexander is a Chief Investigator at the ARC Centre of Excellence for Climate System Science and a Senior Lecturer at the Climate Change Research Centre at UNSW Australia. Andy Pitman is the Director of the ARC Centre of Excellence for Climate System Science at UNSW Australia. Originally published on The Conversation.
Crunching the numbers
Data homogenisation techniques are used to varying degrees by many national weather agencies and climate researchers around the world. Although the World Meteorological Organization has guidelines for data homogenisation, the methods used vary from country to country, and in some cases no data homogenisation is applied.
Homogenisation can be necessary for a range of reasons: sometimes stations move, instruments or reporting practices change, or surrounding trees or buildings at a site are altered. Changes can be sudden or gradual. These can all introduce artificial “jumps” (in either direction) in the resulting temperature records. If left uncorrected, these artefacts could leave the data appearing to show spurious warming or cooling trends.
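To make the idea concrete, here is a toy sketch of how a single, known station move might be corrected. This is not the Bureau's ACORN-SAT procedure (which uses many neighbouring stations and sophisticated statistical tests); it simply shows the core logic: estimate the artificial step from a target-minus-reference difference series, so that real climate signals shared by both stations are not removed, and then shift the earlier segment to match.

```python
# Toy illustration of homogenisation (NOT the Bureau's actual method):
# a station move at a known date introduces an artificial step in the
# target series. Comparing against a nearby reference station isolates
# the jump, which is then removed from the pre-move segment.

def adjust_step(target, reference, break_idx):
    """Remove an artificial step at break_idx from the target series.

    The step size is estimated from the target-minus-reference
    difference series, so genuine warming or cooling common to both
    stations is preserved.
    """
    diff = [t - r for t, r in zip(target, reference)]
    before = sum(diff[:break_idx]) / break_idx
    after = sum(diff[break_idx:]) / (len(diff) - break_idx)
    step = after - before
    # Align the earlier segment with the post-move instrument siting.
    return [x + step for x in target[:break_idx]] + list(target[break_idx:])

# Hypothetical example: a +1.0 degree jump when the station moved at index 4.
reference = [20.0, 20.5, 21.0, 20.5, 20.0, 20.5, 21.0, 20.5]
target    = [19.0, 19.5, 20.0, 19.5, 20.0, 20.5, 21.0, 20.5]
adjusted = adjust_step(target, reference, 4)
```

In this contrived case the adjusted series matches the reference exactly, because the only difference between the two stations was the artificial step. Real adjustments work with noisy data, undocumented breaks, and multiple reference stations, which is what makes the problem genuinely hard.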
There are many methods that can be used to detect these “inhomogeneities”, and there are other methods (although much harder to implement) that can adjust the data to make sure it is consistent through time. The Bureau uses such a technique to create its Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) data set. These data are then used to monitor climate variability and change in Australia, to provide input for the State of the Climate reports, and for other purposes too.
In a statement about its climate records, the Bureau said:
The Bureau measures temperature at nearly 800 sites across Australia, chiefly for the purpose of weather forecasting. The ACORN-SAT is a subset of this network comprising 112 locations that are used for climate analysis. The ACORN-SAT stations have been chosen to maximise both length of record and network coverage across the continent. For several years, all of this data has been made publicly available on the Bureau’s web site.
Australia has played a leading role in developing this type of complex data-adjustment technique. In 2010, the Bureau’s Blair Trewin wrote a comprehensive article on the types of inhomogeneities that are found in land temperature records. As a result, the International Surface Temperature Initiative (ISTI) has set up a working group to compare homogenisation methods.
Some of our own research at the ARC Centre of Excellence for Climate System Science has tried, with the help of international colleagues, to assess the impacts that different choices can make when using these different homogenisation methods. Much of our work focuses on temperature extremes. We have studied the impacts on large-scale extreme temperature data of changing station networks, different statistical techniques, homogenised versus non-homogenised data, and other uncertainties that might arise.
Our data on extreme temperature trends show that the warming trend across the whole of Australia looks bigger when you don’t homogenise the data than when you do. For example, the adjusted data set (the lower image below) shows a cooling trend over parts of northwest Australia, which isn’t seen in the raw data.
Trends in the frequency of hot days over Australia – unadjusted data using all temperature stations that have at least 40 years of record available for Australia from the GHCN-Daily data set.
Trends in the frequency of hot days over Australia – adjusted ACORN-SAT data. The period of trend covers 1951-2010 when both datasets have overlapping data. All data used in figures are available from www.climdex.org
Far from being a fudge to make warming look more severe than it is, most of the Bureau’s data adjustments have in fact had the effect of reducing the apparent extreme temperature trends across Australia. Cherry-picking weather stations where data have been corrected in a warming direction doesn’t mean the overall picture is wrong.
Data homogenisation is not aimed at producing a predetermined outcome, but rather is an essential process in improving weather data by spotting where temperature records need to be corrected, in either direction. If the Bureau didn’t do it, then we and our fellow climatologists wouldn’t use its data because it would be misleading. What we need are data from which spurious warming or cooling trends have been removed, so that we can see the actual trends.
Marshalling all of the data from the Bureau’s weather stations can be a complicated process, which is why it has been subjected to international peer-review. The Bureau has provided the details of how it is done, despite facing accusations that it has not been open enough.
Valid critiques of data homogenisation techniques are most welcome. But as in all areas of science, from medicine to astronomy, there is only one forum where criticisms can legitimately be made. Anyone who thinks they have found fault with the Bureau’s methods should document those faults thoroughly and reproducibly in the peer-reviewed scientific literature. This allows others to test and evaluate the claims, find errors, or produce new methods.
This process has been the basis of all scientific advances in the past couple of centuries and has led to profoundly important advances in knowledge. Abandoning peer-reviewed journals in favour of newspaper articles when adjudicating on scientific methods would be profoundly misguided.