Sunday, September 14, 2008

Confirmation Bias

Adam Shostack has a couple of posts on Confirmation Bias. I've added some comments on Adam's blog; here's a digest of the discussion.

Things Only An Astrologist Could Believe

Adam picks up an astrological analysis of a recent action by Google: apparently the timing of the Chrome release was astrologically auspicious.

Vedic Astrologer: "Such a choice of excellent Muhurta with Chrome release time may be coincidental, but it makes us strongly believe that Google may not have hesitated to utilize the valuable knowledge available in Vedic Astrology in decision making."

Adam: "This is a beautiful example of confirmation bias at work. Confirmation bias is when you believe something (say, Vedic astrology) and go looking for confirmation. This doesn't advance your knowledge in any way. You need to look for contradictory evidence. For example, if you think Google is using Vedic astrology, they have a decade of product launches with some obvious successes. Test the idea. I strongly believe that you haven't."

Myself: "What our Vedic friend is actually telling us is that Google "may not have hesitated" in its use of Vedic astrology. To be honest, I also find it hard to believe that Google executives sat around dithering about whether to use Vedic astrology or not."

In further comments, the Vedic astrologer argues that the astrological method is no different from other forms of observational science: it uses the scientific method, which requires the prediction of future results. For example:
  • Hypothesis: The sun comes up every 24 hrs.
  • Method: I will time when the sun crosses the horizon.
  • Results: I successfully predicted 50 sunrises with 100% accuracy. This is further evidence that my hypothesis is correct.
  • Caveat: Since 24 hours is, by definition of a day, the period between sunrises, this is circular.
Actually, the statement that the sun rises at exactly 24-hour intervals is only believable if you live near the equator and know nothing about astronomy, or if you adopt a solar method of measuring the length of an hour. A rough calculation makes the point.
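Here is a minimal Python sketch (my own illustration, not part of the original exchange), using the standard sunrise equation with an approximate solar declination and ignoring the equation of time and atmospheric refraction. At the equator the sunrise-to-sunrise interval stays at exactly 24 hours; at 52°N it drifts by a couple of minutes per day around the equinoxes.

    import math

    def solar_declination(day_of_year):
        # Rough solar declination in degrees (simple cosine approximation).
        return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

    def sunrise_solar_time(latitude_deg, day_of_year):
        # Local solar time of sunrise in hours, from the standard sunrise equation.
        # Ignores the equation of time and atmospheric refraction.
        phi = math.radians(latitude_deg)
        delta = math.radians(solar_declination(day_of_year))
        cos_omega = -math.tan(phi) * math.tan(delta)
        cos_omega = max(-1.0, min(1.0, cos_omega))   # clamp to cover polar day/night
        omega = math.degrees(math.acos(cos_omega))   # hour angle of sunrise, in degrees
        return 12.0 - omega / 15.0                   # 15 degrees of hour angle per hour

    for latitude in (0.0, 52.0):
        drifts = [abs(sunrise_solar_time(latitude, d + 1) - sunrise_solar_time(latitude, d)) * 60.0
                  for d in range(1, 365)]
        print("latitude %5.1f: sunrise-to-sunrise interval differs from 24h by up to %.1f minutes"
              % (latitude, max(drifts)))

In other words, "the sun comes up every 24 hours" is only strictly true if you define your clock by the sun in the first place.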

What confuses me about the hypothesis posed by our Vedic friend is whether he is trying to predict the decision-making behaviour of Google executives or the successful outcome of their decisions. Even if Google executives are making auspicious decisions, this could be "explained" either by the fact that they are employing the services of an astrologer, or by the fact that Google happens to have good (= astrologically blessed) executives. Or something.


More on Confirmation Bias

According to an old article by Michael Shermer in Scientific American [The Political Brain, June 2006], "a recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias". Devan Desai concludes that "hardcore left-wing and hardcore right-wing folks don’t process new data".

When I first read that line about "hardcore left-wing and hardcore right-wing folks" quoted in Adam's blog, I assumed it was talking about serious extremists - communists and neo-Nazis. Turns out it was just looking at people with strong Democrat or Republican affiliations. Maybe any party affiliation at all seems pretty hardcore to some people.

As far as I can see, the study only actually looked at people with strong political opinions, and didn't compare them with any control group. Like, er, the middle-of-the-road folks who fund and write up this kind of research.

I wonder whether anyone would get research funding or wide publicity for exploring the converse hypothesis - that people with strong political opinions are actually relatively open-minded, and that the people who have the most entrenched opinions are the bureaucrats who staff the research funding bodies and the people who write popular articles for Scientific American.

(Of course I'm jumping to conclusions myself here; that's what bloggers do, isn't it?)

I'm not saying I believe that bigots are more open-minded than wishy-washy middle-of-the-roaders. I'm just saying we need to be mistrustful of studies that are designed to confirm the prejudices of the researchers, and suspicious of people who latch onto these studies to prove a point. The problem is that there may be confirmation bias built into the way this kind of pseudo-scientific study is funded, organized and then publicized. Not surprising, then, if "the FMRI findings merely demonstrate what many of us already knew".

As director of the Skeptics Society, Michael Shermer latches onto a study showing that people are biased. Shermer himself has a particular set of bugbears, including evolutionary psychology, alien abductions, prayer and healing, and alternative medicine. Are we really to imagine that he approaches any of these topics with a truly open mind? And why should he anyway? The rest of us don't.

1 comment:

iCARE said...

Confirmation Bias -- excellent and timely topic! Your blogs are my first discovery this New Year morning. First, a New Year gift to you: Journal of Research Practice (JRP).

"Research practice," with all its frailties (some of which are identified by you in your article on peer review), still seems to be the only systematic antidote to confirmation bias (and other types of cognitive bias). Given the broad canvas of your writings, you may have already written something on this, have you?

Joyful 2009!

D. P. Dash