When I took up a position in a company in Asia, I was taken aside by the person whom I was replacing. ‘You need to understand that this company follows a Chinese business model,’ the Chinese man said. When I pressed for further explanation, he said that initiative and unanticipated information could flow down the strict hierarchy, but not up. I later saw what he meant. In one memorable instance, the person at the top of the hierarchy made a far-reaching decision, which almost everyone lower in the hierarchy knew was wrong, but no one dared to say so, for fear of overstepping their level of authority. The company continues to live with the consequences.
Commenting recently on the Russo-Ukrainian war, Major General (retired) Mick Ryan pondered: ‘Russia’s military is out of its depth in Ukraine. Was Putin kept in the dark about its weaknesses?’ An article in The Guardian similarly draws attention to Putin’s surprising ignorance of the situation on the ground. The article quotes commentator Alexander Gabuev as saying, ‘Only a very small group of generals were informed about the war, and they didn’t ask difficult questions that could help prepare for any scenarios other than a speedy Russian victory.’ The author of a biography of Stalin has assessed, ‘Vladimir Putin in 2022 is as isolated as Stalin was in 1952 and that’s despite the fact that he’s got access to the internet and WhatsApp.’

Self-deception is a phenomenon in which, at some level or in some capacity, we know something to be true, but our sense of self maintains distance from that knowledge for the sake of protection or advantage. The same thing can be seen at an organisational level, when organisational structure allows the most authoritative members to be protected from potentially damaging information that is known by others.
When Mao demanded increased productivity during the Great Leap Forward, people were accused of ‘right-deviationist’ thinking if they advised leaders that there were food shortages. So the reports that went up the hierarchy told falsehoods of bountiful harvests, allowing the leadership to be buffered from the truth, until some 45 million people had died of famine. Something similar happened in Soviet Russia in Stalin’s time.
What should we learn from examples like these of organisational self-deception? Two areas seem worthy of consideration.
First, how can we avoid the pitfalls of organisational self-deception? Perhaps one approach is to avoid the buffering compartmentalisation that allows self-deception to thrive. While it is important to retain distinctions between ranks, roles, and levels of classification, it would be self-destructive to allow these distinctions to hinder the flow of genuine knowledge up the hierarchy. It is important to have forums in which knowledge and ideas can be shared across these distinctions (such as The Cove). It is equally important for leaders to foster cultures in which it is safe to speak the truth.
Second, in our defence of Australia and its interests, how can we exploit tendencies towards self-deception in our adversaries? Sun Tzǔ remarks, ‘You may advance and be absolutely irresistible, if you make for the enemy’s weak points.’ It seems to me that organisational self-deception is a significant weak point. If particular military cultures resist the upward flow of realistic bad news, we should ponder how our use of intelligence and information warfare can exploit this weakness for the benefit of those on the battlefield, and for the pursuit of international justice.