Are we rational? How biases could make war with Iran a possibility
Ph.D. Student, Conflict Analysis and Resolution, George Mason University
What are some of the cognitive biases influencing the recent bellicose rhetoric emanating from Jerusalem and Tehran?
It is said that wars begin in the minds of men. Considering the people charged with running Israel and Iran today, this is indeed a frightening prospect. But it’s also a chilling insight into the workings of the human mind in general. Why? Because our minds are filled with biases – unconscious and systematic errors of judgment – that make war with Iran an increasing possibility. We are, as psychologist and Nobel laureate Daniel Kahneman argues, hardwired to find hawkish arguments more convincing than dovish ones.
Kahneman’s lecture was given in 2006 (the English begins at 02:25), but the implications for the current and escalating conflict between Israel and Iran are clear. Below, I have selected a number of cognitive biases (not all mentioned by Kahneman) that I believe are influencing the recent bellicose rhetoric emanating from Jerusalem and Tehran. For the sake of familiarity, I will concentrate on the Israeli hawkish narrative (you can read a recent example here).
Optimistic bias: Studies show that most of us believe we are better looking and smarter than the average person, and that we will have more success in life. Of course, it’s not possible for most people to be better than average. The optimistic bias also leads us to believe that we have more control over our environment than we actually do.
In times of war, that sense of self-efficacy produces unwarrantedly optimistic assessments of the likelihood of meeting one’s martial objectives (Iraq in 2003 and Lebanon in 2006 quickly come to mind). In the narrative in favor of striking Iran, the optimistic bias makes itself apparent in overconfidence in the prowess of the IDF, underestimation of Iranian military and intelligence capabilities, and the ignoring or devaluing of the unintended consequences of such an operation.
Confirmation bias: A tendency to search for, and prefer, information that validates our worldview, and to ignore or undervalue information that contradicts it. Studies show that when presented with information that both supports and contradicts their beliefs, people are far more accepting of, and convinced by, the supportive information, and far more critical of the contradictory. Often, exposure to contradictory information leads to attitude polarization, whereby people become more rigid and extreme in their positions.
In the narrative favoring a strike on Iran, the confirmation bias reveals itself in the sources marshaled in support of military intervention. Pugnacious comments by Iranian leaders, examples of “irrational” religious ideology, and reports about Iran’s capability to develop a bomb are all given serious weight. On the other hand, statements by Iranian officials unequivocally denying that they are developing or acquiring a nuclear bomb, fatwas issued by Iran’s supreme leader (and other clerics) against nuclear weaponry, and intelligence reports questioning Iran’s determination to go nuclear are all ignored or summarily dismissed.
Of course, the confirmation bias does not necessarily favor hawkish arguments. However, since the leaders of the current Knesset are by and large hawkish to begin with, the bias exerts its influence in favor of war. It should also be said that in reading and hearing the pro-strike narrative, especially as articulated by elites, it is not easy to tell when information is being deliberately distorted and when automatic and unconscious biases have reared their heads.
Fundamental Attribution Error (FAE): A tendency to view and explain other people’s (usually negative) behavior by favoring dispositional over situational factors. Hostile behavior, for example, is seen as proof of a person’s aggressive disposition (personality/culture) rather than as a consequence of environmental constraints and pressures. Anybody under the spell of this bias not only fails to see the role of the situation in shaping behavior, but is also blind to the way in which their own actions contribute to the other party’s manifest hostility.
Another interesting quality of the FAE is that in contrast to the “other,” one is usually very much aware of the situational forces that constrain one’s own behavior. Finally, there is what Kahneman calls the “illusion of transparency”: the belief that one’s intentions are transparent to all involved.
Thus, pro-strike Israelis believe that Iranians (or their government), like the rest of the Muslim world, have a deep-rooted, implacable and irrational hatred for the Jewish state/people (dispositional perspective), while they [the Israelis] have to act with hostility because of their rational need to defend themselves (situational perspective). It’s assumed that everyone understands that Israel is not interested in a military confrontation (illusion of transparency), and that Israel is a less hostile country than Iran.
The FAE makes war more likely because when you interpret the other party’s negative behavior as a consequence of a deep-seated and intractable hostility, your possible responses, outside of aggression, are significantly narrowed.
Loss Aversion: Biases not only increase the likelihood of war; they also increase the likelihood of its continuation. Imagine the following: a terrible disease reaches your country and is predicted to kill 600,000 people. In response, you are charged with picking an intervention strategy. You are presented with two programs and must choose one.
A. In the first program you will save 200,000 people for sure.
B. In the second, you have a 33 percent chance that all 600,000 will be saved and a 67 percent chance that none will be saved.
Which program would you choose? In the original version of this experiment (1981), conducted by Amos Tversky and Daniel Kahneman, 72 percent of participants preferred program A. Now imagine the same scenario but with the following conditions:
A. In the first program 400,000 people will die.
B. In the second program, you have a 33 percent probability that no one will die, and a 67 percent probability that all 600,000 will die.
Which would you choose? In Tversky and Kahneman’s experiment, 78 percent of the group presented with this scenario preferred the riskier choice (program B).
Of course, the information in both scenarios is identical, as the quick calculation below shows. The difference is that in the first scenario the outcome is framed as a gain (survival), while in the second it is framed as a loss (mortality).
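For readers who want to see the equivalence worked out, here is a minimal sketch in Python (my own illustration, not part of the original article or experiment), using the exact probabilities of one-third and two-thirds that the percentages above round to:

```python
# Expected-value check for the two framings of the Tversky-Kahneman (1981)
# disease problem as adapted above. My own illustration: the probabilities
# 1/3 and 2/3 are the exact values that "33 percent" and "67 percent" round from.
from fractions import Fraction

POPULATION = 600_000
p = Fraction(1, 3)  # probability of the good outcome in each gamble

# Gain frame: expected number of people saved.
saved_a = 200_000                        # Program A: 200,000 saved for certain
saved_b = p * POPULATION + (1 - p) * 0   # Program B: the gamble

# Loss frame: expected number of deaths.
died_a = 400_000                         # Program A: 400,000 die for certain
died_b = (1 - p) * POPULATION + p * 0    # Program B: the gamble

print(saved_a, saved_b)                  # 200000 200000
print(died_a, died_b)                    # 400000 400000

# Saving 200,000 of 600,000 is the same outcome as losing 400,000:
print(POPULATION - died_a == saved_a)    # True
```

Under either framing, both programs leave an expected 200,000 people alive and 400,000 dead; only the wording changes.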
Studies consistently show that when faced with a certain loss, people become risk-acceptant, choosing to forgo a sure loss for a merely potential one, even at the risk of losing more. In times of war (e.g. Vietnam), when it is clear that one side will not be able to meet its objectives, leaders will often increase their commitments instead of cutting their losses. They do this, as Kahneman points out, not only because they dislike losing, but also because the consequences for them as leaders (as opposed to for their people) will probably be no worse if they refuse to cut their losses.
It should be noted that it’s not necessarily irrational for Israelis to lean in a hawkish direction with respect to Iran: the damage from mistrusting an enemy with benign intentions may pale in comparison to the damage from trusting an enemy with hostile intentions. However, it’s easy to see how a combination of these biases creates a deadly cocktail of cognitive errors, making hawkish arguments sound more convincing than they deserve to be. This “rhetorical advantage,” as Kahneman calls it (or “cognitive dance of death,” as I like to call it), may indeed drag us into a terrible and unnecessary military confrontation.
So what’s to be done? At best, Kahneman says, we can become aware of our own mental habits. Sadly, even then we will not be totally immune to their influence. And those of us with dovish orientations? Well, the cards are stacked against us; it would seem we must labor in a herculean (or is it Sisyphean?) manner just to exercise influence. I guess this is what it sounds like when doves cry.
Roi Ben-Yehuda is a graduate student at Columbia University and a Ph.D. student at the School for Conflict Analysis and Resolution at George Mason University. This article was posted on Roi’s blog and is published here with his permission.