Doomed Planet

Our hottest week ever?


Intrigued by reports that Australia has never seen it hotter than during the recent heatwave, Quadrant Online sought enlightenment from the Bureau of Meteorology. A full week later, a response arrived from the public relations department. It all seemed very reasonable, entirely scientific and so forth, but what would a layman know?


To get a specialist appraisal of the BoM’s investigation, its response was forwarded for comment to QoL contributor John McLean. What follows is QoL’s request for information, the BoM’s explanation, and McLean’s observations, which raise some fresh questions.

The correspondence is presented as sent and received, with only slight edits in the name of grammar and the removal of the identities and email addresses of BoM PR staffers. Now, read on:

Dear BoM,
In following all the news of our current Australian heatwave I have been intrigued to see repeated references to our nation’s "average temperature" and the BoM’s advice that, day by day, we have never seen it hotter.

You will pardon my ignorance, I hope, but I cannot recall ever before hearing a single daily figure cited as representing our entire island continent, which is rather a large place and includes many different climate zones and microclimates. That your numbers are specific to a decimal point suggests an absolute certainty, and this intrigues me even more.

In order to satisfy my curiosity and that of Quadrant Online readers, I wonder if you could tell me how BoM arrives at this daily figure? I am particularly keen to learn:

 

1/ the number and locations of the current sites in your sample?
2/ the names and locations of sites used in previous sampling list(s)?
3/ are the numbers used in the national daily average gleaned from specific sites, or are they composites derived from a sampling of regional reports?
4/ the statistical method(s) used to process the data, both now and in the list’s earlier (if any) incarnations?
5/ is the number based on raw daily data from the nation’s very many reporting stations? If not, why not?
6/ is the data "homogenised", which I understand to be the term your experts use when alluding to the tweaks and adjustments they perform on raw data?
7/ has the plotting of the national average temperature been done since formal record-keeping began, or is the daily averaging a more recent innovation?

I appreciate you must be busy just now and I apologise for burdening you with this request, but after a good poke-around at your BoM site I have been unable to find anything that satisfies my curiosity on the points listed above. Equally frustrating, while the soaring mercury has been earning much attention in the press, not one of those media reports has included an explanation of BoM’s methods and the daily figures’ statistical basis.

Many thanks in advance.

Roger Franklin
Editor, Quadrant Online

PS: Don’t skimp on the air-conditioning today. The word is that it will be a hottie!

A week later, after a little prompting, the following response arrived:

 

Background to temperature data at the Bureau of Meteorology

All scientific work at the Bureau is subject to expert peer review. Almost all of our methodologies are published in the peer-reviewed scientific literature. The associated publications are available widely.

The Bureau of Meteorology last year completed an extensive and dedicated international peer review, through a panel of world-leading experts, of its preparation of observational temperature data for Australia.

That review ranked the Bureau’s procedures and data analysis as amongst the best in the world, and the results of this review have been published.

A package of information around the temperature record can be found at the following links: http://www.bom.gov.au/climate/change/acorn-sat/documents/ACORN-SAT-Fact-Sheet-WEB.pdf and http://www.bom.gov.au/climate/change/acorn-sat/

The Bureau of Meteorology prepares and maintains multiple temperature analyses that are fit for purpose. This means that temperatures are analysed using more than one independent methodology, for example with and without homogeneity adjustments, as a consistency check and to provide the most appropriate tailored information for various uses.

The two main datasets are the real-time temperature monitoring system, developed under the Australian Water Availability Project (known as AWAP), and the Australian Climate Observations Reference Network–Surface Air Temperature dataset (known as ACORN-SAT). Both datasets commence in 1910, corresponding to the widespread standardisation of temperature instrument screens (“Stevenson Screens”) following the formation of the Bureau of Meteorology in 1908.

Temperature data from before approximately 1910 used a wide variety of non-standard configurations and are therefore not directly comparable to modern measurements.

The real-time monitoring (AWAP) data is part of a suite of daily products that is generated at 2pm each day for the previous day.  This suite of monitoring products includes maximum and minimum temperatures, as well as real-time rainfall, humidity (vapour pressure), solar exposure and vegetation greenness (NDVI).

These data are freely available at the following link: http://www.bom.gov.au/climate/maps/

 

Describing the recent heatwave.

The major heatwave experienced during the first two weeks of January was exceptional for two reasons: (i) the spatial extent of very high temperatures across the country and (ii) the duration of that continental-scale pattern of extreme temperatures.

There are two appropriate diagnostics for representing the exceptional nature of the heatwave as described above. They are the proportion of the continent above particular threshold values (such as through percentile analysis) or, perhaps more intuitively, the area-averaged daily temperature. The area-averaged daily temperature is a grid-point average based on a spatially interpolated surface of station-based temperature measurements.
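By way of illustration only, the following is a minimal sketch of what a grid-point average of a spatially interpolated field can look like. It is not the Bureau’s code, and the station coordinates and values are invented; the AWAP analysis itself is far more sophisticated (it accounts for elevation and applies a land mask, among other things). The sketch simply interpolates a handful of station readings onto a regular grid and takes a latitude-weighted mean.

```python
# Minimal sketch (not the Bureau's code) of an area-averaged temperature:
# interpolate station readings onto a regular grid, then average the grid
# cells with latitude weighting so each cell counts in proportion to its area.
# All station coordinates and values below are invented for illustration.
import numpy as np

# (lat, lon, temperature in degrees C) for a handful of hypothetical stations
stations = np.array([
    (-12.4, 130.8, 33.1),   # tropical north
    (-23.7, 133.9, 41.5),   # arid interior
    (-31.9, 115.9, 38.2),   # west coast
    (-33.9, 151.2, 29.4),   # east coast
    (-42.9, 147.3, 24.8),   # Tasmania
])

# Regular 0.5-degree grid roughly covering the continent (no land mask here)
lats = np.arange(-44.0, -10.0, 0.5)
lons = np.arange(112.0, 154.0, 0.5)
grid_lat, grid_lon = np.meshgrid(lats, lons, indexing="ij")

def idw_interpolate(grid_lat, grid_lon, stations, power=2.0):
    """Simple inverse-distance-weighted interpolation of station values."""
    field = np.zeros_like(grid_lat)
    weight_sum = np.zeros_like(grid_lat)
    for lat, lon, temp in stations:
        dist = np.hypot(grid_lat - lat, grid_lon - lon) + 1e-6  # avoid /0
        w = 1.0 / dist ** power
        field += w * temp
        weight_sum += w
    return field / weight_sum

field = idw_interpolate(grid_lat, grid_lon, stations)

# Latitude weighting: grid cells shrink towards the poles as cos(latitude)
weights = np.cos(np.deg2rad(grid_lat))
area_average = np.sum(field * weights) / np.sum(weights)
print(f"Area-averaged temperature: {area_average:.1f} C")
```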

To gauge the extent of heat across the continent, the most appropriate dataset to use is the AWAP daily gridded data. This dataset uses all available and unhomogenised temperature measurements from around 700 stations daily. The network is not fixed in time, owing to network changes and basic quality control; however, sensitivity analyses (as documented in the published papers referenced in the links above) show stability in the calculations post-1950. This period also corresponds to the most significant trends in Australian temperature, as described in the linked references.

The spatial distribution of the stations used in real time is shown at http://www.bom.gov.au/jsp/awap/temp/index.jsp .

The number of sites and their locations are provided in the maps under RMSE – see http://www.bom.gov.au/jsp/awap/temp/index.jsp (for an example). Each dot indicates a station location, and the size of the dot indicates the influence that station has on the analysis.

The calculation of area averages is a long-standing practice. The current technique dates back to the mid-1990s, when computing power first allowed for these intensive calculations. The analysis is applied retrospectively from 1910. It should be noted that calculations of monthly and annual temperatures are based on the technique for analysing daily temperatures and comprise the same data. In other words, the daily calculation is mathematically identical to that used for monthly and annual temperatures reported by the Bureau. Calculations of global-mean temperature by international research centres also use similar techniques, with the earliest estimates dating back to at least the 1980s.

We estimate the AWAP network definition alone (which stations are used at each time-step) comprises around 11 million records over 103 years of data. This information is archived at the Bureau of Meteorology. In effect, the input data represents nearly all of the digital temperature archive managed by the Bureau of Meteorology, with the same data also available on the Bureau’s website.

It is therefore perhaps more convenient to absorb the methodology as described in the peer-reviewed paper Jones et al. (2009), available at http://www.bom.gov.au/amm/docs/2009/jones.pdf, and specifically Figure 1 of that paper. The currently available data is now a little better than shown in Figure 1, as a result of continued historical digitisation.

Daily maps of all the AWAP variables are presented online, with various drop-down menus available for the user. Area averages, such as state and cropping-region averages, as well as a whole host of other diagnostics from the real-time monitoring system, are calculated daily. However, we do not explicitly publish all of our diagnostics each day, as the data volumes are very large. The area-average diagnostics find their way into regular reporting products or Special Climate Statements as appropriate to describing significant events. Special Climate Statements are routinely prepared for significant weather and can be found at the following link: http://www.bom.gov.au/climate/current/special-statements.shtml

 

Extreme nature of the event.  Comparisons and reasons for using AWAP data.

The AWAP data is the most appropriate for capturing the spatial extent of the temperature anomalies since it maximises the network coverage across Australia. The ACORN-SAT data uses a more limited and fixed network of temporally homogenised data that is specific for estimating change over time. Additionally, the ACORN-SAT dataset is analysed as anomalies (which is standard international practice for homogenised temperature data) whereas the AWAP data, a spatial interpolation of all available measurements, is more suited to absolute temperatures. The spatial analysis accounts for spatial (x,y,z) changes in the network through time.  In terms of the impact of temperatures, such as fire potential, it is more obvious to use absolute temperatures.

A detailed comparison of AWAP and ACORN-SAT datasets was performed as part of the international review of our practices, and is contained in the scientific papers linked to above.

Additionally, comparisons are routinely made between the two datasets in real time, as a consistency check.

As such, the ranks for daily-averaged temperature for the current event are consistent across the independently analysed AWAP data and the ACORN-SAT dataset. Specifically, ranks for the highest three days on record are the same in both analyses. That is, 1st and 3rd were set in January 2013, while 2nd was set in December 1972.

The 20 highest AWAP-ranked daily maximum temperatures also compare closely with the ACORN-SAT-ranked daily maximum temperatures.

However, perhaps the most notable feature of this event was the duration for which continent-wide extreme temperatures persisted.

In this regard, the only two events that are comparable in the entire historical record are a two-week period during the summer of 1972-73 and the heatwave at the start of January in 2013.  This is clear from all available datasets and methods. In fact, these events are so unusual in their characterisation that sensitivity and uncertainty due to elements such as network changes are very unlikely to affect their comparison with climatology. This fact greatly increased our confidence in reporting on the exceptional nature of the January 2013 event.

In summary, the recent event in aggregate terms was warmer than the 1972-73 event on individual days.  Those extreme high temperatures, both in average daily maximum temperature and, more particularly, mean temperature (the average of daytime maximum and night-time minimum) also persisted for significantly longer.

This fits with a shift in the daily distribution of weather toward higher temperatures observed locally, accompanied by a notable increase in hot days and extreme hot days, particularly in the past 15 years. This in turn is consistent with an additional global warming of approximately 0.4°C since the early 1970s.

Cheers,

etc etc etc

This was all a bit too technical for a humble website editor, so the BoM response was forwarded for analysis and comment to John McLean. He made the following observations:

The Bureau of Meteorology temperature recording is not straightforward. It’s not the minimum and maximum temperatures from midnight one day to midnight the next as some people think, and if you think about the "observers" going outside to read thermometers once per day you’ll understand why.

The convention used by the Bureau is that the maximum and minimum temperatures are read at 9:00am. The minimum temperature is recorded against "today" and the maximum temperature recorded against "yesterday", the latter on the assumption that it more likely occurred yesterday afternoon than during the first nine hours of today.
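A minimal sketch, using hypothetical dates and readings, of how that convention maps each 9:00am observation pair onto the daily records:

```python
# Minimal sketch of the recording convention McLean describes (hypothetical
# data, not Bureau code): both thermometers are read at 9:00am; the minimum
# is booked against the day of the reading, while the maximum is booked
# against the previous day, on the assumption it occurred that afternoon.
from datetime import date, timedelta

# 9:00am readings for a few hypothetical days: (reading date, min, max)
readings_9am = [
    (date(2013, 1, 7), 21.4, 38.9),
    (date(2013, 1, 8), 24.1, 40.2),
    (date(2013, 1, 9), 23.0, 39.5),
]

records = {}  # date -> {"min": ..., "max": ...}
for read_date, t_min, t_max in readings_9am:
    # Minimum: recorded against "today" (the date of the 9am reading)
    records.setdefault(read_date, {})["min"] = t_min
    # Maximum: recorded against "yesterday"
    records.setdefault(read_date - timedelta(days=1), {})["max"] = t_max

# Note the one-day offset: the first date carries only a maximum and the
# last only a minimum, because each reading pair straddles two days.
for day in sorted(records):
    print(day, records[day])
```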

The Bureau’s new temperature dataset, referred to as "ACORN-SAT", was prepared from historical data, incorporating various "quality controls" (a.k.a. "sanity checks"). One such check was that the recorded maximum temperature had to be higher than the recorded minimum temperature across the same period (see section 6.1 of http://cawcr.gov.au/publications/technicalreports/CTR_049.pdf).

The check documented by the Bureau is in fact seriously flawed.

The two conditions described in the document are as follows:

1.    Tx(d) ≥ Tn(d)

2.    Tx(d) ≥ Tn(d+1)

(where T is temperature, x denotes maximum, n minimum, d the day in question, and d+1 the next day.)

The second condition is fine. It says that the maximum temperature recorded for a given day should be greater than or equal to the minimum temperature recorded against the next day, which, as we saw above, covers the same 24-hour period from 9am one day to 9am the next.

The first condition is nonsense. It requires that the maximum temperature for today (which covers 9:00am today to 9:00am tomorrow) be greater than the minimum temperature for today, which covers 9:00am yesterday to 9:00am today.

The documentation from the Bureau of Met fumbles around saying that the temperature at exactly 9:00am should be taken into account in both days, presumably because one second earlier or one second later will be in a different 24-hour period.

If the minimum temperature occurred before 9:00am it occurred in the time period before the period over which today’s maximum temperature was determined.

Ken Stewart has a list of over 950 instances of this occurring in the supposedly quality-controlled ACORN data at http://joannenova.com.au/2012/07/boms-new-data-set-acorn-so-bad-it-should-be-withdrawn-954-min-temps-larger-than-the-max/

The situation is not unusual, especially in southern Australia and particularly around the time of a shift to daylight saving. A cool morning followed by a warm day and a night where the temperature doesn’t fall below the 9:00am temperature the previous morning will result in that 9:00am temperature being recorded for the current day even though it occurred yesterday.

There’s no need at all for the first of the two conditions mentioned above, because the 9:00am temperature will be incorporated into the data for the 24 hours to 9:00am the next day, which is the same period over which today’s maximum temperature will occur. Condition 2 above is the only check that needs to be made, as the sketch below illustrates.
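To make the point concrete, here is a minimal sketch, using hypothetical numbers rather than any actual station record, of how condition 1 can flag perfectly valid observations while condition 2 passes:

```python
# Minimal sketch (hypothetical numbers, not Bureau code) of the two checks
# as described above, applied under the 9am recording convention:
# Tn[d] covers 9am(d-1) to 9am(d); Tx[d] covers 9am(d) to 9am(d+1),
# so the two values compared in condition 1 belong to different 24-hour periods.

# Hypothetical case: a warm night leading up to 9am on day 2 (so Tn[2] = 22.0),
# then a cold change just after 9am, so the next 24 hours peak at only 20.0.
tn = {1: 15.0, 2: 22.0, 3: 12.0}   # minima booked against each day
tx = {1: 34.0, 2: 20.0, 3: 18.5}   # maxima booked against each day

for d in (1, 2):
    cond1 = tx[d] >= tn[d]       # compares values from different 24-hour periods
    cond2 = tx[d] >= tn[d + 1]   # compares values from the same 24-hour period
    print(f"day {d}: condition 1 {'passes' if cond1 else 'fails'}, "
          f"condition 2 {'passes' if cond2 else 'fails'}")

# Day 2 fails condition 1 (20.0 < 22.0) even though the observations are
# perfectly valid; condition 2, which compares like with like, passes.
```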

The Bureau of Met emphasises that the ACORN system was reviewed by an international panel, but I couldn’t find any indication that the quality control was examined, especially the two conditions above.

If they were checked and approved by the review panel, the error described above undermines the credibility of the review process. If the two conditions were not checked, then why not, and what else is the Bureau of Meteorology implying was checked but in fact wasn’t?

As a footnote, the "panel of world-leading experts" comprised one Australian (Ken Matthews), one New Zealander (David Wratt), Thomas Peterson (USA) and Xiaolan Wang (Canada). (See http://www.bom.gov.au/climate/change/acorn-sat/documents/ACORN-SAT_IPR_Panel_Report_WEB.pdf )

Ken Matthews has a Bachelor of Economics and, according to his website, http://www.kenmatthews.com.au/, "provides high level consultancy services in the areas of water policy and water management, as well as in all areas of public administration". He appears to have no experience in processing meteorological data.

David Wratt is from the New Zealand National Institute of Water and Atmosphere (NIWA), which, in a court challenge over temperature adjustments made when observation stations are relocated, failed to describe the methods by which those adjustments were made and elected not to present an Australian Bureau of Meteorology review of those methods.

Thomas Peterson is from the National Climatic Data Center (NCDC), which itself is under fire over its temperature adjustments, especially the downward adjustment of older data.

Xiaolan Wang is from the Meteorological Service of Canada and has a background in processing meteorological data.

The panel therefore comprised one administrator with no apparent background in climate-data processing and three people with experience in this area, two of whom are from organisations under something of a cloud as regards their processing of data of this type.

Regards,

John McLean

 

John McLean is a PhD student through James Cook University. He has written widely on temperature issues and the IPCC, including a detailed essay cited in the US Senate.

