I've been told that I get agitated when people in positions of authority misinterpret data. It might be geeky of me, but it's just not on.
The well-known mantra, "correlation does not imply causation", was heavily drummed into my head during my undergraduate Psychology degree. I'm confident it would have entered my brain even without any formal education; the idea is a straightforward one, and utterly common-sense. Despite the simplicity of this concept, it appears to be forgotten with alarming frequency in pieces of journalism published by (allegedly) high-quality and respectable sources. In casual conversation, where nobody is scrutinising every utterance, these sorts of lapses are normal and to be expected. However, when such errors are made by professional writers and journalists publishing formal pieces of work to large audiences, it's just sloppy.
For example, in a recent article about illegal music downloading that I was reading on the Guardian website, the author claimed that file sharing wasn't responsible for diminished profits in the music industry. Instead, he pointed to recent figures showing a large increase in video game and DVD sales, and claimed that it was these that were killing the music industry, because people had no money left over to buy music. As far as he was concerned, this was undoubtedly what those numbers meant, make no mistake about it.
Clearly, whilst he may be right, the causal claims he makes are in no way backed up by the data he cites. Ironically, he berates music industry bosses for their unwarranted inference of a "cause and effect" relationship between music downloading and a decrease in music sales, but then, in the very same paragraph of his article, goes on to say that "there's only a limited amount of short-term spending cash available to people" and that "instead of buying music, they choose to spend it on other things". He replaces one unwarranted claim of causation with another.
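To make the point concrete, here's a minimal sketch in Python (my own illustration, with entirely made-up numbers, not anything from the article): two simulated sales trends that come out almost perfectly negatively correlated, even though, by construction, neither has any effect on the other.

```python
# A minimal sketch showing how two trends can be strongly correlated
# with no causal link between them: here, simulated music sales fall
# and game sales rise for entirely independent reasons.
import random

random.seed(0)
years = range(2000, 2010)

# Music sales decline over time for one unmodelled reason; game sales
# grow over time for another. Neither series depends on the other.
music = [100 - 6 * t + random.gauss(0, 3) for t, _ in enumerate(years)]
games = [20 + 8 * t + random.gauss(0, 3) for t, _ in enumerate(years)]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Prints a correlation close to -1, yet by construction neither trend
# causes the other: the numbers alone cannot tell you which story is true.
print(round(pearson(music, games), 2))
```

The correlation here is real, but the "people spent their music money on games instead" story is only one of many explanations consistent with it, which is precisely the trap the article falls into.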
This is just one example of many such articles I've read recently. Academic journals use peer review, and you rarely see anyone getting away with publishing conclusions from their data that aren't thoroughly qualified. You get tentativeness, "we might be able to conclude this, but we need more studies to confirm it", and admission of the shortcomings of their current knowledge. Essentially, you get humility and honesty - the exact opposite of many sensationalist news articles that bastardise the real research and communicate it in that warped form to the general population. This may be part of why people consider psychology a "pseudoscience", if most of their exposure to psychological research has been in the form of watered-down, poorly written journalistic articles.
Are people aware that they are making these sorts of assumptions? Do they think that mentioning a few figures or the odd research paper (e.g. "studies have shown...") qualifies their subsequent conjecture as fact? I can only assume that such unfounded conclusions are being drawn knowingly; on-the-fence articles may be considered inadequately gripping. Still, I'd rather read about the truth and be presented with the data in an accurate way, even if it might be rather less exciting sometimes.
*Actually, a man ate a banana once, but he also ate lots of pies and smoked lots of cigarettes, and then got diagnosed with cancer a few years later. Sorry, I know that's boring.