It's Hard to Change a Human's Mind
Recently I found myself in a debate with homebirth advocates that reminded me why I gave up debating certain issues in the first place. I realized that very few people debate in order to get at the truth; most debate in order to defend their own position. They seemed to enjoy the debate, and it often seemed to me that it reinforced their beliefs no matter what information was put in front of them. This seemed to hold true across the board: Democrats or Republicans, theists or atheists, evolutionists or creationists.
There is an article in Scientific American about an interesting study by Drew Westen (not yet published, but soon to be) on political bias, in which self-described strong Democrats and Republicans were shown several statements by Kerry and Bush while undergoing an fMRI scan. Each candidate contradicted himself. The subjects came down hard on the candidate from the opposing party and let their own party's man off the hook. It was the emotional and conflict-resolving areas of the brain, rather than the reasoning dorsolateral prefrontal cortex, that were involved.
If the study proves valid, it only substantiates what we've known for a long time: we are biased, and we prefer data that supports our bias, sometimes to an amazing degree. The study also showed reward centers being activated once a subject had processed the information and come to a conclusion that agreed with his political leanings. So we feel good when we conform to our own bias.
I speculate that the advantage of this would be to cement social bonds and foster cooperation, ensuring the survival of more members of the clan. It was a strategy suited to limited information and a need to act. Conservatism (not the political kind, but the slow-to-change kind) would also protect against taking chances that are too risky. It looks like safe courses of action are rewarded in the brain as well as in social circles.
It is a strategy that serves us poorly in a more varied society, where there is far more information that needs to be weighed carefully and far less need to act for our immediate survival.
I especially liked the conclusion of the article: politics needs to be peer reviewed, with skepticism being rewarded.
Anyway, remind me to tell the story of how I think I broke my own confirmation bias. One can never really be sure about those things, though, can one?