That’s hardly a surprise to any skeptic who has ever debated a believer. But I am talking here just about ignoring facts when making political decisions. New research shows that people are adept at making political decisions without letting the facts get in the way, and the researchers have the brain scans to prove it. A summary is reported in Live Science (all bold mine):
Researchers asked staunch party members from both sides to evaluate information that threatened their preferred candidate prior to the 2004 Presidential election. The subjects' brains were monitored while they pondered.
"We did not see any increased activation of the parts of the brain normally engaged during reasoning," … "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts."
The subjects eventually rationalized what they had been told, reaching a biased conclusion based on their prior political preference. Their brains continued to be monitored:
Then, with their minds made up, brain activity ceased in the areas that deal with negative emotions such as disgust. But activity spiked in the circuits involved in reward, a response similar to what addicts experience when they get a fix, Westen explained.
The study points to a striking absence of reasoned deliberation in partisan political decision-making.
This shouldn’t be too surprising to anyone who has debated politics either, but it is interesting to see which parts of the brain are being used (or not) and when. I find it especially interesting that the reward circuits in the brain light up once the unwelcome data has been rationalized away.
Of course, there is a general lesson here for critical thinkers: we should try to be aware of our own biases when presented with political (and other) information, and should try to evaluate that information honestly, even if it challenges our political views (whatever they may be). This is hard, of course. I try to do it, but like everyone else I know I engage in some of the rationalizations described above at least some of the time. Half the battle, if you want to be a critical thinker, is being aware of your own biases and your own rationalization processes – or “the art of thinking about thinking with a view to improving it,” as criticalthinking.org puts it. Something we should all try to do.
Of course, if you didn’t come by your opinions through reason, you’re unlikely to change them through reason either, which is why woo beliefs are virtually immune to contradictory evidence. Perhaps with some woos, the “parts of the brain normally engaged during reasoning” never get much of a workout.