“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”
– Mark Twain

Operational Art is the “skillful employment of military forces to attain military goals through the design, organisation, sequencing and direction of military actions… Operational Art translates strategic into operational and ultimately tactical actions.”[1] This is an inherently complex concept, requiring practitioners to translate abstract strategic goals into practical activities and actions that can be accomplished by tactical commanders. It is an integrative and generative cognitive function, where it is often easy to describe ‘what’ operational art is, but incredibly difficult to explain ‘how’ to practice it.

Beyond the “how”, the successful execution of Operational Art requires practitioners to have a thorough appreciation of a range of factors, including enemy and friendly capabilities, the operational environment, and strategic objectives. However, the human mind is susceptible to many cognitive biases that can distort the planning process. One such bias is the illusion of explanatory depth (IOED): the tendency of individuals to overestimate their understanding of concepts and capabilities – leading to the development of “facts” that just ain’t so. This article explores the impact of IOED on the conduct of Operational Art, highlighting some of the potential challenges and offering mitigations for them.

The term IOED was introduced by Yale researchers Leonid Rozenblit and Frank Keil in their 2002 paper, The misunderstood limits of folk science: an illusion of explanatory depth, where they argue:

“…that people’s limited knowledge and their misleading intuitive epistemology combine to create an illusion of explanatory depth. Most people feel they understand the world with far greater detail, coherence, and depth than they really do. The illusion for explanatory knowledge–knowledge that involves complex causal patterns—is separate from, and additive with, people’s general overconfidence about their knowledge and skills. We therefore propose that knowledge of complex causal relations is particularly susceptible to illusions of understanding.”

Or, to put it more simply, people often rely on heuristics to simplify complex systems, leading to a false sense of comprehension of those systems. A commonplace example of IOED is the Bluetooth connection between electronic devices.

Most people know that Bluetooth enables a wireless connection between electronic devices – very few understand or can explain how the 79 different radio frequencies are used to establish and maintain that connection. It is only when we are asked to actually explain a concept that we are confronted with our limited understanding of it. This is captured in the adage ‘the best way to learn something is to teach it’ – in teaching, we overcome IOED by forcing ourselves to delve deeper into the subject and move past our superficial understanding.

The resulting illusion can have significant implications for practitioners of Operational Art, leading to over- or underestimating capabilities and risks, whilst also creating unrealistic expectations.

Impacts of IOED on Operational Art include:

  1. Missing the fundamentals: Planners who have a superficial understanding of Operational Art miss its fundamental purpose, which is to link tactical actions to the achievement of strategic goals. This can be described by the ‘illusion of levels’, where the ability to explain one level of a concept inflates our sense of how much we understand the system as a whole – being able to explain how to connect two Bluetooth devices at the user-interface level gives a false sense of understanding at the deeper technological level.

    With Operational Art often simplified to the ‘Ends-Ways-Means’ framework, being able to describe this level can lead planners to have a false sense of their understanding of the fundamental purpose of Operational Art. Consequently, operational plans may pursue tactical success to the detriment of the attainment of operational or strategic objectives.

  2. Incomplete analysis: If planners falsely believe they have a detailed understanding of complex operational concepts, they may be more likely to overlook critical factors or fail to account for the interdependencies between various elements. One of the consequences of IOED is making key deductions or decisions based on limited information, because we believe we have a deeper understanding of the complex system than we actually possess.

    As a result, planning products may be based on incomplete or inaccurate information and false assumptions, increasing the likelihood of operational failures. Moreover, the false confidence instilled by IOED can limit the extent to which planners attempt to verify their conclusions. This impact is amplified in the contemporary environment, where technologies and geo-strategic relationships are becoming more complex and changing more rapidly.

  3. Rigidity during the planning process: The operational environment is a dynamic space, and the proper execution of Operational Art takes time – meaning the situation planners start with is unlikely to remain unchanged throughout the planning process. IOED can limit the flexibility and adaptability of planners, who may become less receptive to new information and other inputs. Ultimately this can inhibit the agility of planners to adjust their operational approach to changing circumstances, resulting in increased vulnerability to system shocks and unexpected challenges.

Methods to mitigate IOED:

  1. Foster a culture of intellectual humility: A culture that values intellectual humility encourages individuals to recognise the limits of their knowledge and expertise. This in turn can reduce individual and collective IOED. Establishing a climate of trust is an essential component of this culture. Trust provides a sense of safety, enabling team members to be more open and honest with themselves and each other about their limitations.
  2. Fully resourced red team exercises: Doctrinally, we are supposed to analyse plans from an adversary perspective. These activities can challenge assumptions and concepts, expose potential vulnerabilities, and encourage critical thinking. Our experience has been that this process generally occurs with a significant asymmetry in resources that heavily favours the blue team.

    To help mitigate the impact of IOED on Operational Art, the red team resourcing should, as close as reasonably possible, mirror that of the blue planning element. Furthermore, given that most of our planning is based on regional operational environments, red teams should be resourced with appropriate cultural experts to provide specific cultural awareness.

  3. Information seeking, sharing, and collaboration: Planners should be encouraged and enabled to seek out relevant information. An effective knowledge management system is essential to this – the harder planners must work to expand the limits of their knowledge, the less likely they will be to do so. Planning timelines should deliberately build space into the process for gathering resources and developing understanding. Information sharing and collaboration can be enabled by encouraging diverse teams and interdisciplinary approaches – particularly the inclusion of subject matter experts not primarily assigned to planning teams. A climate of trust also plays a key role here, as a more open and permissive environment discourages planners from guarding and hoarding information.

    Leaders and teams should also prioritise recognition and rewards for collaboration over individual contribution. These approaches will promote the cross-pollination of ideas and enable a more comprehensive understanding of the operational environment.

  4. The role of the commander: The commander is the ultimate decision maker; they decide if a plan will be put into action, and they also determine acceptable levels of risk. Commanders should have a firm understanding of the potential indicators and impacts of IOED on their staff, as well as mitigation methods to overcome these.

    This is particularly important given they are chiefly responsible for setting the organisation’s culture and climate. General (Retd) Stanley McChrystal proposes a “listen, learn, then lead” approach to optimise the commander’s contribution that is highly transferable to mitigating the impacts of IOED.

    A less formal, more dialectic approach to commander’s back briefs would support this – ensuring that the product is more than “PowerPoint deep”, and creating opportunities to further develop subordinates in the art. This approach does not detract from mission command but supports the “trust but verify” component of effective leadership. It also demonstrates a genuine interest in the process on behalf of the commander, validating the efforts of their staff.

IOED can pose significant challenges to the successful execution of Operational Art. The impacts and mitigations proposed in this article are not limited to the operational level of war; they are transferable up and down the planning chain, from strategic to tactical. Nor are they exhaustive: planners and commanders at all levels would benefit from a deeper understanding of the influence of this particular bias on their thinking.