What if the putative smoking gun wasn't, I dunno, a smoking gun?
By joe
This is ridiculous. They rely upon statistical methods, misuse them to create a smoking gun, are called on the carpet for it, and then …
erm … if you use incorrect methods to analyze your data, methods which admit biases and errors, it's rather hard … no … fundamentally impossible … to make a reasonably valid claim that the "underlying data" (which you analyzed incorrectly) actually supports the conclusions you reached.
This is the mark of a true believer. This isn't science, it's dogma. Combine that with the yahoo who wants to make it illegal to be skeptical about these results, to literally prosecute "thought crimes" … talk about a slippery slope. One that has nothing whatsoever to do with science, or reality. This is a very dangerous direction, and this group needs to be aware of who their fellow travelers are, and disavow, loudly and repeatedly, any such nonsense.
Sadly, these were not isolated incidents. Many hats had been hung on that particular smoking gun of a hockey stick graph. But the gun wasn't really smoking, was it … it was misanalyzed. There may be a real signal in there, but the folks who gathered and interpreted the data did such a bad job of it, in general, that there is no way to establish good provenance, correct methods, etc. This is just like a crime lab fudging its techniques. They could have so muddied real data, or gathered incorrect data (like moving measurement stations close to cities, and other acts that significantly influence the measurements), that there may not be any real validity to the underlying data.

If you get a hockey stick, you need to be skeptical as a scientist. You really need to be. Because if you declare eureka for something that is decidedly a non-eureka moment … you get the current fiasco.

This is sad. There are lots of tremendously important questions that need answering, or more correctly, deeper investigation. The deeper we look at things, the more we discover how little we know. The more we discover how much we need to create new techniques and new methods for approaching problems. Let's hope some sanity returns to these efforts, and that bad data is recognized for what it is, bad analyses for what they are, and weak conclusions, based upon bad data and bad analyses, shown the level of consideration that they do deserve; that is, rapidly and permanently discarded.