Tim Harford, a.k.a. The Undercover Economist, blogs about the credibility revolution that econometrics – the statistical branch of economics – is going through.
From my personal experience in the subject, for all its obscure statistical jargon, econometrics is an art; the art of arm-twisting numbers into confessing the answers you want to hear. Like any interrogator or torturer worth his salt, the econometrician has an array of tools for getting his dataset to talk, each with a different degree of invasiveness. The simple (and relatively painless) regressions not giving you the answer you want? Then subject the numbers to more sophisticated techniques such as vector autoregression (VAR), a method so convoluted that my professor discouraged its use for fear of overcomplicating things.
This brings me to my main point: do more data, more powerful computers and more sophisticated techniques automatically lead to better credibility? The first two give you higher statistical power, which means you can be more confident in the results of your analysis, but they are no defence against the basic and most crucial problems of statistics, such as omitted variables and reverse causality. And do increasingly sophisticated techniques necessarily inspire trust in readers and policy-makers? Readers are unlikely to be convinced by the results of a study if they do not understand the method used to derive them. And as anyone in the business of selling things should know, overcomplication is a sure-fire way of rousing suspicion. Economists too are salesmen, in the business of selling ideas and contributing answers to the body of knowledge, and they would do well to remember that.
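The point about data and statistical power can be made concrete with a toy calculation. The sketch below assumes a simple two-sided z-test and an arbitrary true effect of 0.2 standard deviations; none of these numbers come from Harford's post, they are purely illustrative.

```python
import math

def normal_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power(n, effect=0.2, alpha_z=1.96):
    """Approximate power of a two-sided z-test with n observations.

    `effect` is the true effect in standard-deviation units and
    `alpha_z` the critical value (1.96 for a 5% significance level).
    Both are illustrative assumptions, not figures from the post.
    """
    shift = effect * math.sqrt(n)
    # P(reject the null | the effect is real) = both rejection tails
    return (1 - normal_cdf(alpha_z - shift)) + normal_cdf(-alpha_z - shift)

# Power climbs steeply with sample size: roughly 0.29 at n=50,
# around 0.81 at n=200, and essentially 1 by n=800.
for n in (50, 200, 800):
    print(n, round(power(n), 2))
```

Note what the calculation does and does not say: a bigger sample makes a genuine effect easier to detect, but it does nothing about whether the regression was specified sensibly in the first place.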