The fate of the planet is at stake, but the key temperature data set used by climate models contains more than 70 different sorts of problems. Trillions of dollars have been spent because of predictions based on this data — yet even the most baby-basic quality control checks have not been done.
Thanks to Dr John McLean, we see how the IPCC demands for cash rest on freak data, empty fields, Fahrenheit temps recorded as Celsius, mistakes in longitude and latitude, brutal adjustments and even spelling errors.
Why. Why. Why wasn’t this done years ago?
This busts the facade. How can people who care about the climate be so sloppy and amateur with the data? …
There are cases of tropical islands recording a monthly average of zero degrees — this is the mean of the daily highs and lows for the month. One site in Colombia recorded three months of over 80 degrees C. That is so incredibly hot that even the minimums there were probably hotter than the hottest day on Earth. In some cases boats on dry land seemingly recorded ocean temperatures from as far as 100km inland. A spot in Romania spent one whole month averaging minus 45 degrees. The only explanation that could make sense is that Fahrenheit temperatures were mistaken for Celsius, and for the next seventy years at the CRU no one noticed.
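Errors of this kind would fall to the most elementary range check. A minimal sketch in Python of such a check, with invented values that mirror the cases described above (the bounds and data here are illustrative, not HadCRUT4's actual formats or thresholds):

```python
# Sanity check for monthly-mean station temperatures.
# Bounds are illustrative: they can be global defaults or,
# better, set per station from its own climate.

def flag_absurd(monthly_means, lo=-75.0, hi=60.0):
    """Return (month_index, value) pairs outside plausible bounds."""
    return [(i, t) for i, t in enumerate(monthly_means)
            if t < lo or t > hi]

# Invented series mirroring the errors described above:
colombia = [26.5, 27.1, 81.5, 83.0, 82.2, 27.0]   # three months over 80 C
romania = [-3.0, -45.0, 2.1]                      # Fahrenheit read as Celsius?

print(flag_absurd(colombia))                # → [(2, 81.5), (3, 83.0), (4, 82.2)]
print(flag_absurd(romania, lo=-30.0))       # → [(1, -45.0)]
```

A global bound catches the Colombian 80-degree months outright; the Romanian minus-45 only looks absurd against station-specific limits, which is exactly why per-station checks matter.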
Dr McLean audited the HadCRUT4 global data from 1850 onwards for his PhD thesis, and then continued afterwards until the audit was complete:
“I was aghast to find that nothing was done to remove absurd values… the whole approach to the dataset’s creation is careless and amateur, about the standard of a first-year university student.”
His supervisor was Peter Ridd, famously sacked for saying that “the science was not being checked, tested or replicated” and for suggesting we might not be able to trust our institutions …
McLean’s findings show there is almost no quality control on this crucial data. The Hadley Met Centre team have not even analyzed this data with a tool as serious as a spell checker. Countries include “Venezuala”, “Hawaai”, and the “Republic of K” (also known as South Korea). One country is “Unknown” while other countries are not even countries – like “Alaska”. …
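Metadata errors at this level need nothing more than a comparison against a reference list. A sketch of such a check, with a truncated, illustrative list of accepted names and the entries described above:

```python
# Spell-check-level validation of station metadata: any country
# field that matches no accepted name is flagged for review.
# Reference list truncated for illustration.

VALID_COUNTRIES = {"Venezuela", "United States", "South Korea",
                   "Romania", "Colombia"}

def invalid_country_fields(fields):
    """Return the metadata entries matching no accepted country name."""
    return [f for f in fields if f not in VALID_COUNTRIES]

fields = ["Venezuala", "Hawaai", "Republic of K",
          "Unknown", "Alaska", "Romania"]
print(invalid_country_fields(fields))
# → ['Venezuala', 'Hawaai', 'Republic of K', 'Unknown', 'Alaska']
```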
Adjustments Make the Past Cooler so the Warming Trend Appears Stronger:
In probably the worst systematic error, the past is rewritten in an attempt to correct for site moves. While some corrections are necessary, these adjustments are brutally sweeping. Thermometers do need to move, but corrections don’t have to treat old sites as if they were always surrounded by concrete and bricks.
New sites are usually placed in good open locations. As a site “ages”, buildings and roads appear nearby, and sometimes air conditioners, all artificially warming it. So a replacement thermometer is opened in an open location nearby. Usually the national meteorological centre compares both sites for a while and works out the temperature difference between them. Then it adjusts the readings from the old location down to match the new one. The problem is that the algorithms also slice right back through the decades, cooling all the older original readings — even readings that were probably taken when the site was just a paddock. In this way the historic past is rewritten to be colder than it really was, making recent warming look faster than it really was. Thousands of men and women trudged through snow, rain and mud to take temperatures that a computer “corrected” a century later. …
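The adjustment process described above can be sketched in a few lines. This is a simplified illustration with invented numbers, not the Met Office’s actual algorithm: the offset is the mean difference between the two sites over the comparison period, and the contrast is between subtracting it from the entire old record versus only from readings taken after the site became built up.

```python
# Simplified illustration of the site-move adjustment described above.

def overlap_offset(old_overlap, new_overlap):
    """Mean(old site) - mean(new site) over the comparison period."""
    return (sum(old_overlap) / len(old_overlap)
            - sum(new_overlap) / len(new_overlap))

def adjust_sweeping(old_series, offset):
    # Cools the entire record, including the paddock years.
    return [t - offset for t in old_series]

def adjust_from(old_series, offset, urban_start):
    # Cools only readings taken after buildings appeared.
    return [t - offset if i >= urban_start else t
            for i, t in enumerate(old_series)]

# Invented numbers: old site reads 0.5 C warmer during the overlap.
offset = overlap_offset([15.5, 16.5], [15.0, 16.0])      # 0.5
history = [14.0, 14.5, 15.0, 15.5]   # index 2 = urbanization begins
print(adjust_sweeping(history, offset))   # → [13.5, 14.0, 14.5, 15.0]
print(adjust_from(history, offset, 2))    # → [14.0, 14.5, 14.5, 15.0]
```

The sweeping version cools the paddock-era readings by the same half degree as the urbanized ones; the second version leaves the early record alone, which is the distinction the paragraph above is making.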
The first audit. Seriously, at this late stage?
As far as we can tell this key data has never been audited before. (What kind of audit would leave in these blatant errors?) Company finances get audited regularly but when global projections and billions of dollars are on the table climate scientists don’t care whether the data has undergone basic quality-control checks, or is consistent or even makes sense. …
The full report:
The 135-page audit with more than 70 findings is available for US$8 from Robert Boyle Publishing. You can help support months of work that should have been done by official agencies years ago.
Joanne and I helped John McLean set up Robert Boyle Publishing so we could publicize this important work. The major political problem with the whole carbon imbroglio was a lack of due diligence and no audits. Well finally, someone audited the main temperature dataset — and found it laughably poor and systematically biased. Oh dear.
UPDATE: Climate Bombshell: Global Warming Scare Is Based on ‘Careless and Amateur’ Data, Finds Audit, by James Delingpole.
The first ever audit of the world’s most important temperature data set has found it to be so riddled with errors that it is effectively useless.
HadCRUT4 is the primary dataset used by the Intergovernmental Panel on Climate Change (IPCC) to make its dramatic claims about “man-made global warming”, to justify its demands for trillions of dollars to be spent on “combating climate change” and as the basis for the Paris Climate Accord.
But according to a groundbreaking analysis by Australian researcher John McLean it’s far too sloppy to be taken seriously even by climate scientists, let alone a body as influential as the IPCC or by the governments of the world. …
As McLean says:
“Governments have had 25 years to check the data on which they’ve been spending billions of dollars. And they haven’t done so once.”
McLean is the Australian IT analyst who broke another scandal about the global warming scare: that it was effectively the creation of just 53 people.