That’s hardly a surprise to any skeptic who has ever debated a believer. But I am talking here just about ignoring facts when making political decisions. New research shows that people are adept at making political decisions without letting the facts get in the way, and the researchers have the brain scans to prove it. A summary is reported in Live Science (all bold mine):
Researchers asked staunch party members from both sides to evaluate information that threatened their preferred candidate prior to the 2004 Presidential election. The subjects' brains were monitored while they pondered.
…
"We did not see any increased activation of the parts of the brain normally engaged during reasoning," … "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts."
The subjects eventually rationalized what they had been told, reaching a biased conclusion based on their prior political preference. Their brains continued to be monitored:
Then, with their minds made up, brain activity ceased in the areas that deal with negative emotions such as disgust. But activity spiked in the circuits involved in reward, a response similar to what addicts experience when they get a fix, Westen explained.
The study points to a total lack of reason in political decision-making.
This shouldn’t be too surprising to anyone who has debated politics either, but it is interesting to see which parts of the brain are being used (or not) and when. I find it especially interesting that the reward circuits in the brain light up once data the person doesn’t like has been rationalized away.
Of course, there is a general lesson here for critical thinkers: we should try to be aware of our own biases when presented with political (and other) information, and should try to evaluate that information honestly, even if it challenges our political views (whatever they may be). This is hard, of course. I try to do this, but, like everyone else, I know that I engage in some of the rationalization described above at least some of the time. Half the battle, if you want to be a critical thinker, is to be aware of your own biases and of your own rationalization processes, or “the art of thinking about thinking with a view to improving it” as criticalthinking.org puts it. Something we should all try to do.
Of course, if you didn’t come by your opinions through reason, you’re unlikely to change them through reason either, which is why woo beliefs are virtually immune to contradictory evidence. Perhaps with some woos, the “part of the brain normally engaged during reasoning” never gets much of a work-out.
Half the battle, if you want to be a critical thinker, is to be aware of your own biases and of your own rationalization processes
...and then, when you have successfully reached a conclusion in defiance of your native prejudices, you can enjoy the tingly buzz of that reward centre stimulation :)
Posted by: outeast | January 26, 2006 at 04:29 AM
One of the reasons I've been cutting down on the number of political beliefs I hold is that politicians don't like to display data.
Posted by: BronzeDog | January 26, 2006 at 06:58 AM
I remember reading in some compendium on A.I. and human brain behaviors that ultimately ALL human decision-making is based on un-reasoned choices. If memory serves, the model is that people spend some amount of effort "reasoning" and then finally decide emotionally. The book discussed one clinical patient in particular whose ability to make that kind of emotional choice had been damaged. Apparently he would spend an unlimited amount of time "reasoning," for example, whether to make his next appointment for Tuesday or Thursday. If someone else suggested one or the other, he would immediately agree with complete satisfaction. A true "Fish or Cut Bait" tragedy.
If this pathology reflects normal mental reasoning processes, then it seems that humans are actually physiologically incapable of fully reasoning anything. Needless to say, even such bounded reasoning capacity has proved useful <smirk>...
My point here is not whether some people are reasoning and others emotional, or which mechanism is driving this decision or that. Rather, I suggest that when considering whether a decision is "well reasoned" (our own or of others) we should look at the "depth" of reasoning and the "soundness" of the (emotionally based) axioms it is founded upon.
One of the interesting outcomes of this model is that it widens the channels of communication. To wit, rather than limiting a discussion to “reasoned arguments,” we should also include sharing the emotional bases. From my own personal experience (sample size 1), I’ve found myself more effective when I share my own (and elicit others’) emotional bases as part of a conversation.
Note: I’m using vocabulary rather loosely, which is why I feel all the quoted words should be that way. Also, in particular, by “emotional” I hope to evoke the full gamut of mental process other than reasoning. This includes personal experience, body “memory,” internalized culture (family, community, national, religious, etc. etc. etc.).
PS: I’m reminded of an old joke about how mathematicians’ minds work. I don’t remember the joke itself (and I'm sure it wasn’t actually funny…), but I do recall its structural essence. Basically, “true mathematicians” don’t solve problems; they convert problems into other problems that already have known solutions. (Never mind that such a conversion could itself rightly be considered “the problem,” or that some fraction of problems actually require novel solutions.) Perhaps the human reasoning process is too much like that (or rather, that is how it works, and most people don’t embed that understanding into their reasoning process; “Gödel, rescue me!”).
Posted by: Stevel | January 26, 2006 at 10:56 AM
This is probably the joke/story you have in mind, Stevel:
A math professor gives his class a word problem: On the floor is a pan, next to a stove with a lit burner. How would you heat up the pan?
The class agreed the answer would be to put the pan over the lit burner.
He then asked the same question, except this time, the pan was on a table. One student responded that the correct answer was to put the pan on the floor, thereby reducing the problem to one that was already solved.
Posted by: BronzeDog | January 26, 2006 at 11:08 AM
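That “convert it into a problem already solved” strategy is, incidentally, exactly how recursion works in programming. As a minimal sketch (my own illustration, not anything from the study or the thread), Euclid's algorithm computes a greatest common divisor by repeatedly converting gcd(a, b) into a smaller problem until it reaches a case whose answer is already known:

    # Euclid's algorithm: solve gcd(a, b) by converting it into a
    # smaller gcd until reaching the already-solved case gcd(a, 0) = a.
    def gcd(a: int, b: int) -> int:
        if b == 0:
            return a          # the "already solved" problem
        return gcd(b, a % b)  # convert into a strictly smaller instance

    print(gcd(1071, 462))  # -> 21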
I have always thought that humans are rationalizing rather than reasoning beings. The real problem is knowing when you're rationalizing and when you're reasoning. What most people think of as "thinking", that is, essentially the formation of a mental dialogue involving unspoken words, is not really thinking; it's more like an expression of the result of thinking. I think real thinking takes place below (or on the side, or on top) of what we call the conscious mind. Thus the process is not readily accessible to the conscious mind. At least that's what I think.
Posted by: Mark Paris | January 26, 2006 at 03:57 PM
Color me skeptical (of the article, I guess, since to be fair I haven't read the actual study).
I'm no neuroscientist, but I'm pretty sure we don't know enough about how reasoning operates in the brain to make the bold claims that this study does. As Stevel mentions, there is evidence that emotional circuits are involved in reasoning generally. I'm not familiar with the compendium he mentions, but the story about the man who became pathologically indecisive after losing some connections between his emotional circuits and frontal lobe sounds like it comes from Antonio Damasio's "Descartes' Error". Damasio's argument was not that all reasoning was compromised by emotion, but simply that emotional response was integral to reasoning, even when the decision seemed to have no emotional bearing at all.
Also, a notable quote from the article, about the politically neutral control figures (such as Tom Hanks): “…both the Democrats and Republicans reacted to the contradictions of these characters in the same manner.”
Hmmm. So, basically, the "control" showed the same response as the experiment?
Richard, I'm disappointed. You should have titled this, in true Skeptico style, "Tom Hanks Defies Reason".
Posted by: Eric Wallace | January 26, 2006 at 04:30 PM
Damn, that would have been a good title!
btw, by “both the Democrats and Republicans reacted to the contradictions of these characters in the same manner” I think they mean the same as each other, not the same as in the experiment. The wording is ambiguous, though.
Posted by: Skeptico | January 26, 2006 at 09:04 PM
Ah, it's so easy to criticize political opinion. What I wonder is whether we skeptics would also show the emotional response and lack of reasoning as we tread our well-worn mental paths to our conclusions about pseudosciences and the like.
Perhaps only if we had to construct an original argument.
It would also be very interesting to see what's happening with judges and juries along these lines.
Posted by: Mike Huben | January 27, 2006 at 07:29 AM
I try to show my work in dismissing bad arguments. That's why I typically label fallacies and propaganda tactics. Now, if woos would show me properly controlled tests, rather than repeatedly asking me to take their word for it, yes, I might have to put in more thought than usual.
But since our opponents seldom, if ever, do that, I don't see any need to change my defenses. Still need to work on methods of attack, since the defenses around their emotions and ego are quite thick.
Posted by: BronzeDog | January 27, 2006 at 07:46 AM
A draft of the paper is available on request, "The neural basis of motivated reasoning": http://www.psychsystems.net/lab/type4.cfm?id=400§ion=4&source=200&source2=1
Posted by: Ron Zeno | January 27, 2006 at 07:59 AM
Ron Zeno -
You missed one of the amps. The corrected (and linkified) URL is
http://www.psychsystems.net/lab/type4.cfm?id=400&section=4&source=200&source2=1
PS: I don't wanna hafta getta password, can't we just (ab)use yours? <snicker>
Posted by: Stevel | January 27, 2006 at 12:14 PM
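What likely mangled the link, for anyone curious: in HTML, a raw “&sect” inside a query string is read as the named entity for the section sign, so “&section=4” collapses into “§ion=4”; ampersands in URLs embedded in a page need to be written as “&amp;”. A minimal sketch of the effect, my own illustration using only Python's standard library:

    import html

    url = ("http://www.psychsystems.net/lab/type4.cfm"
           "?id=400&section=4&source=200&source2=1")

    # How an HTML parser reads the raw URL: "&sect" is a legacy named
    # entity, so "&section=4" collapses into "§ion=4".
    print(html.unescape(url))

    # Escaping ampersands as "&amp;" keeps the query string intact.
    print(html.escape(url))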
Skeptico says: “I think they mean the same as each other, not the same as in the experiment.”
Ah, you're right, probably they did mean it that way. Sadly, they don't describe any actual differences in the neurologic response. Guess I'll just have to read the real thing.
Posted by: Eric Wallace | January 27, 2006 at 10:57 PM
Just to point out: as several bloggers out there (such as Tara over at Aetiology, Coturnix, and CogDaily's Dave) have noted, this paper has not yet been peer-reviewed or published, and all this comes from perhaps premature press reporting. Yes, it rings true; but we should be wary of it for precisely that reason.
Posted by: outeast | February 02, 2006 at 01:46 AM
Yeah. The conclusions match my own anecdotal experiences, but I'm not going to swallow it whole, at least not yet.
At least it's trying to address one critical problem I see with society: nowadays, bias is bad if it's opposed to you, but "That's just my opinion" is considered the most virtuous of defenses when you're found to be flat-out wrong.
Posted by: BronzeDog | February 02, 2006 at 05:44 AM
There's somewhat more complete (i.e. more technical) information at http://news.emory.edu/Releases/PoliticalBrain1138113163.html
This guy hardly seems like a "hack," considering the volume (and overall dryness) of his peer-reviewed papers.
As an FYI, the blue-state/red-state divide was a hot topic at the Society for Personality and Social Psychology Conference, from which this study was probably released.
Posted by: linmoo | February 02, 2006 at 11:07 AM