In an online discussion about why there are so few women in computer science, one student wrote that he thought it was because there are fundamental differences between the sexes. When asked what data he was basing that opinion on, he replied:
It's more of an unrefuted hypothesis based on personal observation.
More recently, in a comment on an earlier post asking what data there was to back up the claim that Erlang and Haskell made parallel programming easier, Neil Bartlett said:
As a working programmer I favour my own personal experiences and experimentation over academic studies...
Put that way, it sounds very pragmatic, but what if you replace "academic studies" with the phrase "careful observation and analysis of a large number of people followed by critical peer review of the claims"? Or, if you like shorter sentences, "science".

I wouldn't believe a drug company's claims about a new headache remedy unless they'd been subjected to independent scrutiny---in fact, while the rules are bent all the time, the drug company wouldn't actually be able to make any claims publicly until they were able to prove them. "My Uncle Vlad used it, and his migraines went away" isn't good enough---there could be a thousand other reasons why Uncle Vlad's headaches went away, and there's no way of knowing whether the effects on one (self-selected) subject are representative of the population as a whole. "It's made from mushrooms, which contain lots of mitochlorians, which are an effective vasodilator" isn't good enough either---there are at least three leaps of logic in that particular "just so" story, any one of which could be invalidated by some factor not included in the argument.

Unfortunately, most of the claims about programming and related topics that I read online or hear in pubs are equally broken. Having worked with scientists and engineers for much of the last 25 years, though, I've learned that the good ones don't believe or disbelieve something until there is evidence---not just anecdote, personal experience, or a plausible chain of reasoning, but evidence---to back it up.

As a profession, we are finally starting to accumulate that kind of evidence: see Glass's Facts and Fallacies of Software Engineering for an easy-to-read summary, or the journal Empirical Software Engineering for more recent results. It's hard work, but I suspect the real challenge will lie in persuading working programmers to say "evidence, please" more often.