Perception and Misperception in International Politics: New Edition examines and tests, through historical example and precedent, the application of cognitive psychology to political decision making. Dr Robert Jervis describes the process of perception, for example how decision makers learn from history, and then explores common forms of misperception, such as overestimating one’s influence on other decision makers. Originally published in 1976, Perception and Misperception in International Politics: New Edition (2017) includes an extensive preface in which Dr Jervis brings his analysis up to date by discussing the relevant psychological research of the last 40 years, including the rise of behavioural economics.[1]

Dr Jervis examines the ‘question of how [decision makers from] states perceive others, what the most common sources of error are, and how we can do better’. Decision makers ‘not only have to strive to perceive their environments, they must also take account of others’ perceptions’. These perceptions can include whether a state appears ‘menacing or reassuring, weak or strong (both in capabilities and resolve), as consistent and steadfast or changeable’.

Dr Jervis explores how decisions in international politics, including decisions made in error, can be ‘literally life-or-death… for countries’, concluding that ‘misperceptions were a central cause of both world wars’, where ‘each side believed they had the dominant strategy…no matter what the other side did’. He emphasises that ‘accuracy in perception and success in policy have such a strong pull that it is hard to resist the temptation to equate them with rationality’.

As a method of enhancing accurate perception, Dr Jervis recommends that decision makers concentrate on becoming ‘more self-aware… reduce undue confidence, understand how others view evidence differently, and focus our attention on key issues’. We should understand ‘our biases and try to cope with them…to [ensure] more accurate perceptions and better decision making’.

Dr Jervis recommends that decision makers remain mindful of common perceptual errors so they may avoid or compensate for them. Through mindfulness, decision makers can adopt safeguards to decrease their unwarranted confidence in prevailing beliefs.

Dr Jervis concludes that ‘contrary to the older arguments that human nature dooms us to constant violence…more recent studies show that our species is distinguished from other mammals by our facility for cooperation’. Success in ‘international politics, as in much of social life, depends on our ability to cooperate with a wide range of other actors – it is a mistake to believe that evolution has selected us for brutality’.

A book complementing the practical application of cognitive psychology examined in Perception and Misperception in International Politics, and acknowledged by Dr Jervis, is Daniel Kahneman’s Thinking, Fast and Slow (2013). Kahneman’s book is about judgement and choice, proceeding from the observation that it is easier to recognise the mistakes of others than our own. Kahneman seeks to enrich the vocabulary we use when discussing judgements and choices, the viability of new policies, or the appropriateness of decisions.

Both Jervis and Kahneman seek to help us learn to recognise situations in which systematic errors, known as biases, recur predictably in particular circumstances. Both authors emphasise how overconfidence is fed by the illusory certainty of hindsight.[2]

Across the book’s four parts and 12 chapters, Dr Jervis guides the reader through: Setting the Problem; Processes of Perception; Common Misperceptions; and In Lieu of Conclusions. Key ideas within the four parts of Perception and Misperception in International Politics are summarised as follows.

Part 1: Setting the problem

External stimuli, internal processes and intentions:

  • If a person is to decide intelligently how to act, they must also predict how others will behave.
  • People do not ordinarily examine their basic beliefs unless they are confronted with shocks and unpleasant choices.

The Spiral Model and World War I – ‘each side will overestimate the hostility of the other’:

  • The Spiral Model describes how, when one state engages in a military build-up, other states sometimes take this as a sign that it is more aggressive or expansionist than they previously thought. Such increases in mutual suspicion can drive arms races and even lead to war, as in World War I. Psychological bias is often invoked to explain this pattern of growing suspicion leading to hostility (a minimal simulation sketch follows this list).[3]
  • The corollary of the Spiral Model is that ‘once each side loses its unwarranted fear of the other, some level of arms can be maintained that provides both sides with a reasonable measure of security’.
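
The escalation dynamic can be illustrated with a short simulation. The sketch below is not from the book; it is loosely in the spirit of Lewis F. Richardson’s arms-race equations, with illustrative coefficients: each state raises its arms level in proportion to the other’s, offset by a cost (‘fatigue’) term.

```python
# A minimal sketch (not from the book) of the spiral dynamic: each state
# reacts to the other's arms level, restrained only by a fatigue/cost term.

def spiral(reaction=0.5, fatigue=0.2, steps=5):
    """Simulate two states whose build-ups feed each other's fear."""
    a = b = 1.0  # initial arms levels (arbitrary units)
    for t in range(steps):
        da = reaction * b - fatigue * a  # A responds to B's build-up
        db = reaction * a - fatigue * b  # B responds to A's build-up
        a, b = a + da, b + db
        print(f"step {t + 1}: A = {a:.2f}, B = {b:.2f}")

spiral()  # reaction exceeds fatigue, so arms levels escalate without limit
```

When the reaction coefficient falls below the fatigue term, the levels settle rather than escalate, echoing the corollary above: once unwarranted fear subsides, a stable level of arms can provide both sides with a reasonable measure of security.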

Deterrence and World War II – ‘aggressors underestimate the resolve of the defenders’:

  • Deterrence Theory disputes that threats set off self-fulfilling spirals of fear and hostility; it concedes, however, that threats may fail to deter when they are not believed or not credible, as in World War II.
  • Hobbes’s theory of fear has two major implications. First, mutual fear is the source of friction between nations. Second, sovereign power is the source of fear, and sovereign power also uses that fear to govern people.[4]
  • The Prisoner’s Dilemma is that, whatever the other prisoner does, each is better off confessing than remaining silent. Yet the outcome obtained when both confess is worse for each than the outcome they would have obtained had both remained silent (see the sketch after this list).[5]
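
The dilemma’s payoff structure can be made concrete in a few lines. The sentence lengths below are illustrative, not from the book; the logic is the standard game-theoretic one.

```python
# A standard Prisoner's Dilemma payoff table (illustrative sentence lengths):
# entries are years in prison for (row player, column player).
PAYOFFS = {
    ("silent", "silent"):   (1, 1),   # mutual cooperation
    ("silent", "confess"):  (10, 0),  # sucker's payoff vs free rider
    ("confess", "silent"):  (0, 10),
    ("confess", "confess"): (5, 5),   # mutual defection
}

def best_reply(other_choice):
    """Whatever the other prisoner does, confessing yields fewer years."""
    return min(("silent", "confess"),
               key=lambda mine: PAYOFFS[(mine, other_choice)][0])

for other in ("silent", "confess"):
    print(f"if the other stays {other}: best reply is {best_reply(other)}")
# Both players therefore confess and serve 5 years each, although mutual
# silence would have cost only 1 year each -- the dilemma Jervis draws on.
```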

Part 2: Processes of perception

Rational and irrational cognitive consistency:

  • Rational cognitive consistency ‘causes people to fit incoming information into pre-existing beliefs and to perceive what they expect to be there’.
  • Irrational cognitive consistency ‘often leads to a policy that fails to reach any goals because it attempts to reach too many objectives’…where ‘seeking too many objectives often results in making too many enemies’.

Assimilation of information to pre-existing beliefs – ‘impact of expectations on perceptions’:

  • Bayesianism ‘gives an optimal way to combine new information with prior beliefs’. It provides mathematical tools for updating our beliefs about uncertain events when presented with new data or evidence about those events (a worked example follows this list).
  • Failure to recognise the influence of pre-existing beliefs includes, for example, ‘the British plans to force the Dardanelles in 1915 resting on the assumption that the arrival of the fleet at the capital would finish the Turks: there would be a revolution in Constantinople’.
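
As a minimal worked example of Bayesian updating (the scenario and numbers below are illustrative assumptions, not from the book), consider a state revising its belief that a rival is hostile after observing a military build-up:

```python
# A minimal Bayes'-rule update (illustrative numbers, not from the book):
# how strongly should a state believe a rival is hostile after observing
# a military build-up?

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = likelihood_if_true * prior
    evidence = numerator + likelihood_if_false * (1 - prior)
    return numerator / evidence

prior = 0.30                 # prior belief that the rival is hostile
p_buildup_if_hostile = 0.80  # hostile states usually build up
p_buildup_if_benign = 0.40   # but benign states sometimes build up too

posterior = bayes_update(prior, p_buildup_if_hostile, p_buildup_if_benign)
print(f"P(hostile | build-up) = {posterior:.2f}")  # ~0.46
```

Note how the prior anchors the posterior: the same evidence moves a sceptic and an alarmist to quite different conclusions, which is Jervis’s point about the impact of expectations on perceptions.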

Excessive and premature cognitive closure, which leads to:

  • Stability in policy, because perceptions are slow to change.
  • Satisficing, a blend of ‘satisfying’ and ‘sufficing’: accepting a good-enough option rather than optimising (sketched after this list).
  • Reluctance to reorganise evidence into a new theory or image if people are deeply committed to an established view.
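
The contrast between satisficing and optimising can be shown in a few lines. This sketch is illustrative, not from the book; the option names and scores are invented:

```python
# A sketch contrasting satisficing with optimising (illustrative data):
# the satisficer accepts the first option meeting an aspiration level,
# while the optimiser examines every option before choosing.

options = [("plan A", 6), ("plan B", 8), ("plan C", 9), ("plan D", 7)]

def satisfice(options, aspiration):
    for name, score in options:
        if score >= aspiration:
            return name  # good enough: stop searching here

def optimise(options):
    return max(options, key=lambda option: option[1])[0]

print(satisfice(options, aspiration=7))  # 'plan B' -- first acceptable option
print(optimise(options))                 # 'plan C' -- best overall option
```

The satisficer’s early stop mirrors the premature closure Jervis warns about: once an acceptable answer is found, better alternatives go unexamined.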

The impact of the evoked set, or the immediate concerns occupying our thinking:

  • Perceptions as a ‘rational guess’ about an ambiguous stimulus.
  • Misunderstandings within a government through differences in communication, information, perspectives and time lags.
  • Rashomon effect where people express differences in perspective in multiple accounts of a single event.[6]
  • Enabling decision makers to learn from history by paying less attention to what has happened and more attention to why it happened.

Organisational learning:

  • Post hoc ergo propter hoc is the fallacy of assuming that, because one event follows another, the first event caused the second.
  • Underestimating the importance of variables not under a decision maker’s control, leading decision makers to view very different future situations just as they viewed earlier ones.
  • Firsthand experiences: following failure at Gallipoli in 1915, the British ‘overgeneralised…and concluded that amphibious operations were not likely to succeed’. In contrast, the Americans ‘who were not involved [in operations at Gallipoli] saw that new doctrines and technology pointed to possible [amphibious] solutions…and proceeded to develop the tactics that proved so successful against Japan [in World War II]’.
  • Early experiences and generational effects, especially effects enabled by ‘values, ethics and ways of thinking’.
  • Reactions to failure, especially ‘avoiding policies that have failed in the immediate past’.
  • Nothing fails like success, including when leaders apply past success as a template to a ‘range of later, even inappropriate, situations’.

Part 3: Common misperceptions

Perceptions of centralisation:

  • Overestimating the degree to which ‘the behaviour of others [is] centralised, planned and coordinated’, based on our human proclivity to ‘squeeze complex and unrelated topics into a coherent pattern’.

Overestimating one’s importance as influence or target:

  • Overestimating one’s effectiveness, especially in:
    • Separating perceptions of personal efficacy from perceptions of national efficacy.
    • Coalition warfare, where each nation tends to overestimate the percentage of the enemy’s resources that are devoted to fighting them.
  • Belief that the other understands that you are not a threat.
  • Prospect theory, where decision making depends on choosing among options that may themselves rest on biased judgments.[7] Developed by Tversky and Kahneman, the theory ‘holds that a choice will elicit higher risk-taking when it is framed in terms of avoiding losses than when it is described in terms of making gains, although the situations and choices are actually identical’ (see the sketch after this list).
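
The framing effect can be sketched with Tversky and Kahneman’s value function. The sketch below is illustrative and not from the book: it uses their published parameters (diminishing sensitivity of about 0.88, loss aversion of about 2.25), omits probability weighting for simplicity, and applies their classic 600-lives framing problem:

```python
# A sketch of prospect theory's framing effect (illustrative, not from the
# book). Value function: v(x) = x**a for gains, v(x) = -l * (-x)**a for
# losses, with a ~ 0.88 and loss aversion l ~ 2.25; probability weighting
# is omitted for simplicity.

ALPHA, LAMBDA = 0.88, 2.25

def value(x):
    """Prospect-theory value of an outcome relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

# Gain frame: save 200 of 600 for sure, or a 1/3 chance of saving all 600.
sure_gain, gamble_gain = value(200), (1 / 3) * value(600)

# Loss frame: 400 of 600 die for sure, or a 2/3 chance that all 600 die.
sure_loss, gamble_loss = value(-400), (2 / 3) * value(-600)

print(f"gain frame: sure {sure_gain:.1f} > gamble {gamble_gain:.1f}")  # risk averse
print(f"loss frame: gamble {gamble_loss:.1f} > sure {sure_loss:.1f}")  # risk seeking
```

Although every option saves 200 lives in expectation, the concave value curve for gains favours the sure thing, while the convex curve for losses favours the gamble: identical situations, opposite choices.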

Wishful thinking in international relations:

  • Confirmation bias is the tendency to process information by looking for, or interpreting, information that is consistent with one’s existing beliefs (a sketch follows this list). People are especially likely to process information to support their own beliefs when the issue is highly important or self-relevant.[8]
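
One way to make the bias concrete is to contrast an unbiased Bayesian updater with one that discounts disconfirming evidence. This model and its numbers are illustrative assumptions, not from the book:

```python
# A sketch contrasting an unbiased Bayesian with a confirmation-biased
# updater that shrinks disconfirming evidence toward neutrality.

def update(prior, lr, discount=1.0):
    """One Bayes update in odds form, with likelihood ratio
    lr = P(evidence | H) / P(evidence | not H). A biased agent applies
    discount < 1 to weaken contrary evidence (lr < 1) toward lr = 1."""
    if lr < 1:
        lr = lr ** discount
    odds = (prior / (1 - prior)) * lr
    return odds / (1 + odds)

reports = [2.0, 0.5, 0.5, 0.5]  # one confirming, three disconfirming
fair = biased = 0.70            # both start fairly confident H is true
for lr in reports:
    fair = update(fair, lr)
    biased = update(biased, lr, discount=0.3)

print(f"unbiased belief: {fair:.2f}")   # ~0.37: contrary reports bite
print(f"biased belief:   {biased:.2f}")  # ~0.71: belief barely moves
```

Mostly disconfirming evidence pulls the unbiased agent below even odds, while the biased agent ends roughly where it began, illustrating how assimilation to prior beliefs can masquerade as reasoned updating.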

Cognitive dissonance and international relations: when beliefs or assumptions are contradicted by new information, resulting in dissonance-reduction by decision makers through:

  • Justifying their behaviour.
  • Minimising their own internal conflict.
  • Reassuring themselves that they have made the best possible use of all information they had, and believing they have wisely employed their available resources.
  • Rearranging their own beliefs so that they provide increased support for their own actions.
  • Downplaying the advantages of the rejected course of action and the costs of their chosen one.
  • Risking not returning to a rejected goal, even if altered circumstances later permit its attainment.
  • Believing that the high costs of a policy justify the sacrifices leading to its accomplishment.

Boomerang dissonance reduction occurs when:

  • ‘an individual is relatively resistant to changing their own position’ and the ‘dissonance arousing information…is also resistant to change’. With ‘both sets of dissonance elements resistant to change, the individual can do little else to reduce dissonance’ except to, with diminishing success, stubbornly justify their established position.
  • This type of stubborn approach, when beliefs or assumptions are contradicted by new information, is known as ‘anti-learning’.

Conclusion

As outlined above, Perception and Misperception in International Politics: New Edition examines the ‘question of how states perceive others, what the most common sources of error are, and how we can do better’, and how decision makers ‘not only have to strive to perceive their environments, [but] must also take account of others’ perceptions’.

In 'Part 4: In lieu of conclusions', Dr Jervis offers practical advice for decision makers on minimising and compensating for common sources of error in perception. Through mindfulness of common perceptual errors, decision makers can adopt safeguards to decrease their unwarranted confidence in prevailing beliefs, including:

  • Listening to arguments to place decision makers in the best possible position to learn what:
    • perspective they are rejecting.
    • evidence they should examine more closely.
    • assumptions require further thought.
  • Applying time, energy and commitment to see the world the way others see it, through diverse frameworks and a variety of possible perspectives.
  • Responding with curiosity to alternate explanations and ideas.
  • Reducing the amount of discrepant information required to make decision makers re-examine their own views.
  • Empathising with the ‘workings of the others’ argument’, combined with understanding the belief systems and values of other people.
  • Awareness of common misperceptions including the ‘role of emotions and motivated biases’ and the ‘power of expectations and needs’.

Jervis concludes that ‘in politics, as in everyday life, people form their ideas and habits through social processes’ and ‘our world is social in that how we think and act cannot be understood apart from the interactions that have nurtured them’. People ‘learn from, react to, and are formed by what others are saying and doing to them and how others respond to them’.