Technology does not advance; it accelerates. It is exponential – one technological advancement facilitates speedier and more expansive future advancements. This rapidly accelerating train is tough to board at any time, but it will be more difficult tomorrow than today.
And those who boarded yesterday have a substantial first-mover advantage.
Throughout history, information has repeatedly been a core element of humankind’s most transformative technological advancements. Looking back from the internet to the personal computer, the radio, the telegraph, the printing press, paper, the written word, the spoken word – all have profoundly impacted all of humanity.[1] Recent developments in generative artificial intelligence (AI) have begun amplifying humankind’s ability to create and spread information, and disinformation, on an unprecedented scale.
And people are not ready for this change.
Research led by the Australian National University revealed that people fail to distinguish AI-generated images from reality about two thirds of the time.[2] As advances in quantum computing take hold, the next leap in processing power will be as transformative as the shift from vacuum tubes to microchips.[3]
And this presents both opportunity and threat.
Most people worldwide now own a smartphone, and the world’s littoral zones concentrate population density at twice the global average.[4] Whether or not we actively engage in the cyber domain, our people are engaged by and within it. A force that isn’t technologically agile risks being outmanoeuvred in this densely populated, data-rich environment.
And we are not technologically agile.
Bringing a Luddite[5] force into this setting is like deploying a blind, three-legged military working dog into combat: he’s a good boy and all, but ill-equipped for the environment or the task. To achieve dominance in this technology-driven domain, we need an approach to cyber operations as broad, adaptable, and agile as our kinetic actions.
And that means people are more important than gadgets.
Many of the resources allocated to the cyber domain focus on the prevention and employment of sophisticated technological intrusions into secure networks for the purpose of stealing data or collapsing critical infrastructure.[6][7] Yet technological hacking[8] is only one half of the cyber domain threat dyad, and it may not even be the most dangerous half.
The human terrain[9] underpins all warfare and has done throughout human history. While battles may be won by kinetic effects, wars are won – and lost – on the human terrain.[10] When most humans in a country at war are no longer willing to support the effort, the war is lost. The information environment[11] influences the human terrain, and the technology-driven cyber domain broadens and amplifies this influence.
And our adversaries are way ahead of us.
The M1 Abrams tank has been the king of the land battlefield for more than 30 years. In the tank-on-tank battles of Operation Desert Storm in 1991, Abrams crews destroyed some 1,600 Iraqi Soviet-bloc tanks without losing a single Abrams in return.[12] But in 2014, a regiment of Abrams tanks was defeated by a hashtag. The hashtag was #AllEyesOnISIS, which swept ahead of the Islamic State advance through Iraq and compelled thousands of heavily armed Iraqi soldiers to surrender or abandon their posts and their state-of-the-art war machines.[13]
We can confidently assert that #AllEyesOnISIS would not achieve the same effect against Australian troops, but this misses the point. An information attack is not a bomb, indiscriminately affecting everyone within its blast radius. Australian soldiers would be unmoved by #AllEyesOnISIS because it would not speak to their cultural landscape, their world view, or their existing uncertainties about whether they stood on the right side of their God and history, as it did for the Iraqi soldiers. But an opponent who identifies similar points of uncertainty and division within Australian culture, then amplifies them over years of subtle information manipulation, could achieve a comparable outcome here.
And our adversaries are already doing this.
For state actors, cyber-domain information operations provide the opportunity to achieve human-terrain and even kinetic outcomes without risking kinetic retaliation. For non-state actors, this domain offers the added benefit of accessibility. There are an estimated 7.2 billion active smartphones in the world.[14] Never before have military forces been able to influence most of the world’s population, while knowing their own and others’ locations to near pinpoint accuracy, accessing every book ever written, every academic paper and news story ever published, and publishing their own to a global audience. This is the capability now available to any individual among about 80 per cent of the world’s population.
Political experts stress the need to build disinformation resilience, making voters more informed and sceptical – not sceptical about the medical benefits of vaccines or the scientific community’s general assumption that birds are not government spy drones, but about dubiously sourced social media posts which seek to nudge a user’s world view steadily towards the extreme.[15] Building similar resilience among soldiers means integrating knowledge-building habits into Army culture from the junior-enlisted level up. As the “internet of things” proliferates, and dozens of household items are connected to cyberspace at large, few places will remain where an ill-considered remark from an unthinking digger will be free from the risk of public broadcast.[16]
And this is just the body armour of the cyber domain.
Although this may provide some defence against information attacks, it’s not a very manoeuvrist[17] defence. To win in cyber warfare, the ADF will also need to employ information offensively. This cannot be achieved by centralising information and counter-disinformation[18] decisions at the highest levels. For starters, those who employ disinformation start by undermining trust in official and reputable sources.[19] Furthermore – and more importantly – centralised information campaigns cannot achieve the necessary tactical agility in a cyber domain where 510,000 comments are posted on Facebook, 350,000 tweets are sent on X, and 625,000 videos are watched on TikTok – every minute.[20]
Australian section commanders in combat have at their disposal enough destructive firepower to cause a major international incident if employed recklessly. They lack, however, the tools, training, or authority to share their daily positive actions with the global community and debunk the enemy’s disinformation. Russia’s ongoing invasion of Ukraine has not only been a war of kinetic military hardware, but an information war. The antagonist’s infamously prolific cyber disinformation capability has been employed in full force against the Ukrainian government and people, both before and since the conventional invasion began. Ukraine’s defence in this space, however, has been asymmetrically robust.[21] When Russia falsely claims a battlefield victory, it is not only the Ukrainian establishment that contradicts this message, but individual Ukrainian troops on the ground in the still-unconquered territory posting to social media. When Russia attempts to hide its forces’ war crimes, these crimes are first revealed, not by journalists, but by the Ukrainian troops who discover them.[22] This grassroots approach has proved one of the few effective weapons against cyber-enabled disinformation attacks, as the global public turns away from traditional media and towards social media as its primary news source.[23]
This is not to suggest that our soldiers should deploy with their own electronic devices and start posting status updates. Cyber domain agility means equipping soldiers with both the physical technology that enables secure cyber connections and the understanding to competently use it. New threats emerge when all soldiers are actively engaged in the cyber domain – this is as complex and dangerous a terrain as any battlefield. But hiding from the technology is like sending troops into battle without weapons because “guns are dangerous”. They are dangerous – that’s why we use them.
And we don’t really have a choice.
Locking the force into a virtual Maginot Line[24] in the cyber-enabled information environment achieves only the appearance of security. Choosing whether to train, equip, and empower our troops to operate in the cyber domain is a question of exercising some control over this space, or surrendering control to whoever wishes to take it. There is no option to exert absolute control.
Although about 7 per cent of Army roles are focused on communications and cyber,[25] all soldiers will need to navigate the cyber domain, counter its threats, and employ its resources to achieve heretofore unrealised asymmetric advantages. These now-basic soldier skills cannot be taught in an online course; they must be trained and rehearsed alongside all other tactical proficiencies. Empowering junior personnel to employ their cyber-domain expertise in operational contexts will require a mindset shift among senior leaders, who must be willing to entrust subordinates with weightier responsibilities and manage the risks associated with this greater autonomy.
And we already do.
Technology has never transformed war into something fundamentally different from what it has always been – an interpersonal clash of human wills. Just as reliable aeroplanes, armoured vehicles, and radio communications increased the tempo and complexity of warfare in 1939, so too does the cyber domain today. Successful armies in the Second World War navigated these new challenges by adopting mission command principles. Addressing the acceleration of technology in warfare now means applying the same approach when training and employing the joint force in the rapidly advancing cyber domain.
End Notes
[1] YN Harari, Sapiens: A Brief History of Humankind, Vintage, London, 2011.
[2] A Dawel et al., ‘Can You Spot the AI Impostors? We Found AI Faces Can Look More Real Than Actual Humans’, The Conversation, The Conversation Media Group, published online 14 November 2023, accessed 1 November 2024.
theconversation.com/can-you-spot-the-ai-impostors-we-found-ai-faces-can-look-more-real-than-actual-humans-215160
[3] T Mann, ‘From Vacuum Tubes to Qubits – Is Quantum Computing Destined to Repeat History?’, The Register, Situation Publishing, published online 3 October 2023, accessed 5 November 2024.
www.theregister.com/2023/10/03/quantum_repeat_history/
[4] L Creel, ‘Ripple Effects: Population and Coastal Regions’, Making the Link, Population Reference Bureau, Washington DC, September 2003, accessed 7 November 2024.
www.prb.org/resources/ripple-effects-population-and-coastal-regions/
[5] Luddite: This term originates in early 19th-century England with the protest movement of workers disenfranchised by the Industrial Revolution – in particular those in the cotton and wool weaving trades, who lost their jobs to semi-automated “power looms”. The “Luddites” were named for “General Ned Ludd”, sometimes “King Ludd”, based on popular folk tales at the time – which most historians regard as apocryphal – depicting a leader of the movement living in Sherwood Forest.
Staff Writer, 'Why Did the Luddites Protest?', National Archives: Education Resources, National Archives, UK, accessed 9 November 2024.
www.nationalarchives.gov.uk/education/resources/why-did-the-luddites-protest/
[6] ADF, Australian Defence Doctrine Publication 3.24 Cyberspace Operations (Draft), 2020.
[7] ADF, Australian Defence Force Publication 6.0.3 Information Assurance, 2016.
[8] Hacking: The use of the term “hacker” in computing lexicon was first defined in a glossary for computer programmers launched in 1975 called The Jargon File. Only one of the eight definitions provided described the “hacker” in a negative light: “a malicious meddler who tries to discover sensitive information by poking around”. This use of the term is believed to have evolved from language adopted by the MIT Tech Model Railroad Club, first seen in the minutes of the club’s April 1955 meeting with the note “Mr. Eccles requests that anyone working or hacking on the electrical system turn the power off to avoid fuse blowing”.
B Yagoda, ‘A Short History of “Hack”’, The New Yorker, published online 6 March 2014, accessed 1 November 2024.
www.newyorker.com/tech/annals-of-technology/a-short-history-of-hack
[9] Human terrain: Although the term is common in modern military vernacular, it is not well defined. Australian Army Land Warfare Doctrine 3-0 Operations suggests the need to recognise that “conflict is conducted amongst and within population masses, and that complex human terrain will be the dominant operating environment within which the population exists as the objective as well as a source of potential threat”. The publication does not, however, include a specific definition of “human terrain”. The UK Ministry of Defence’s now superseded Joint Doctrine Note 4/13 Culture and Human Terrain defines human terrain as “characterising cultural, anthropological and ethnographic information about the human population and interactions within the joint operations area”. Although this definition has not survived the dispersal of JDN 4/13 across two subsequent pieces of UK doctrine (Joint Doctrine Publication 04 Understanding and Decision Making and Joint Doctrine Publication 07 Defence Engagement), it is the definition applied for the purpose of this essay.
[10] R Smith, The Utility of Force, Vintage, London, 2005, p 3.
[11] Information: Data in context; includes documents and papers; electronic data; the software or systems and networks on which the information is stored, processed or communicated; intellectual information acquired by individuals; and physical items from which information regarding design, components or use could be derived.
Information environment: The aggregate of individuals, organisations or systems that collect, process or disseminate information.
ADF, Australian Defence Force Publication 7.0.3 Exercise Planning and Conduct, 2018, p 260.
[12] US General Accounting Office, Operation Desert Storm: Early Performance Assessment of Bradley and Abrams, USA, 1992, p 4, accessed 1 November 2024.
http://archive.gao.gov/d31t10/145879.pdf
[13] P Singer and E Brooking, LikeWar: The Weaponization of Social Media, Houghton Mifflin Harcourt, Boston, 2018, pp 4-7.
[14] P Taylor, ‘Number of Smartphone Mobile Network Subscriptions Worldwide from 2016 to 2023, With Forecasts from 2023 to 2028’, Statista.com, accessed 30 October 2024.
www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/
[15] T Nichols, The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters, Oxford University Press, UK, 2017, p 207.
[16] Ibid., pp 254-255.
[17] Manoeuvrist: A term describing an approach that employs the principles of manoeuvre warfare.
ADF, Australian Defence Force Publication 2.3.1 Rapid Environmental Assessment, 2010, p 187.
[18] Counter-disinformation: Activities that identify, assess, monitor and counter the threat posed by hostile disinformation actions.
This definition is adapted for the purpose of this essay from the Joint Doctrine “counterintelligence” definition, being “activities that identify, assess, monitor and counter the threat posed by hostile intelligence collection”.
ADF, Australian Defence Doctrine Publication 2.1 Counterintelligence and Security, 2015, p 71.
[19] T Helmus et al., Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe, RAND Corporation, 2018, accessed 1 November 2024.
www.rand.org/content/dam/rand/pubs/research_reports/RR2200/RR2237/RAND_RR2237.pdf
[20] S Marino, ‘What Happens in an Internet Minute’, LocaliQ.com, Gannett Co, published online 4 December 2023, accessed 1 November 2024.
localiq.com/blog/what-happens-in-an-internet-minute/
[21] M Karalis, ‘The Information War: Russia-Ukraine Conflict Through the Eyes of Social Media’, Georgetown Journal of International Affairs, Walsh School of Foreign Service, Georgetown University, Washington DC, published online 2 February 2024, accessed 7 November 2024.
gjia.georgetown.edu/2024/02/02/russia-ukraine-through-the-eyes-of-social-media/
[22] L McQuillan, ‘Could Social Media Hold Evidence of Alleged Russian War Crimes?’, CBC News, Canada, published online 7 April 2022, accessed 7 November 2024.
www.cbc.ca/news/world/social-media-war-crimes-investigations-1.6410145
[23] C St. Aubin & J Liedke, ‘Social Media and News Fact Sheet’, Pew Research Center, Washington DC, published online 17 September 2024, accessed 7 November 2024.
www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/
[24] Maginot Line: Built by France in the 1930s to deter German aggression and named after André Maginot, French Minister for War during much of the 1920s and early 1930s (until his death in 1932), the Maginot Line consisted of a series of static obstacles and bunkers stretching across most of France’s border with Germany, Italy, Switzerland and Luxembourg. When German forces did invade France in 1940, they bypassed the Maginot Line rather than assaulting it directly. The term now serves as a common metaphor for costly and overly rigid defensive measures that provide a false sense of security.
[25] Australian Army, The Australian Army Contribution to the National Defence Strategy, Army Headquarters, Australian Defence Force, Canberra, September 2024, p 16.