Introduction
The world as we know it is constantly evolving. With every passing year it becomes more volatile, fractured and turbulent on the one hand, while on the other it brings new possibilities, partnerships and conventions, followed by new and emerging threats.[1] The revolution in emerging technologies has created numerous opportunities, but it has equally raised doubts about its ability to withstand threats to the contentious concept of data security.[2] Emerging technologies have expanded the ways we use data innovatively and efficiently, yet they have also made that same data more vulnerable to theft.[3] That said, the means of securing data are evolving too.[4] It is therefore all the more pertinent for modern democracies to remain up to date and flexible enough to maximise their capacities through rapidly emerging technological innovations while developing potent solutions to counter the risks those innovations carry. Here the role of the military is of profound importance.[5] In this article, the author analyses four potential threats to the operational capability of the land forces component, threats whose potency will escalate with emerging conflicts, and then offers practical, viable solutions for decision-makers to neutralise them effectively.
From Data Streaming to Fake News
With every passing year, society experiences new and emerging innovations in Information and Communication Technologies (ICTs) that alter the perspectives and definitions through which we see life.[6] From the military's viewpoint, four emerging domains are the focus here: streaming analytics across multiple devices, which experts commonly term the Internet of Things (IOT)[7]; Information Technology security[8]; computing analysis with human-like cognitive learning skills, an alternative definition of Artificial Intelligence (AI) laid out by behavioural scientists[9]; and Fake News[10].
The Champion of Streaming Analytics
Every year, numerous devices hit the market with a built-in capability to stream data and monitor analytics: from the Kindle to iSmart automobiles[11], intelligent cooling systems, and remotely piloted fixed- and rotary-wing aircraft with attack capabilities. These are emerging technologies with state-of-the-art systems, yet they remain vulnerable to cyber threats.
In 2014, a report published by the Chief Scientific Adviser to the UK Government on the Internet of Things examined the possibilities of using the IOT in homegrown defence applications. The report analysed the feasibility of connecting domestic infrastructure with indigenously designed automobiles against the backdrop of the 5G network architecture that was then scheduled to be rolled out in the UK. If any device is left unsecured, the interconnection of all devices to a common network – especially wearable devices – poses a grave threat: it gives hackers an open-ended opportunity to exploit personal data, and may even invite terror factions to drastically affect the lives of ordinary citizens by creating havoc. To begin with, self-driving vehicles are hackable[12] and can easily be manoeuvred onto the wrong side of the road, over a pedestrian path or into a high-security area. Another example is a massive 'internet blackout', created for instance by disrupting the undersea transatlantic cables, which would leave countries – especially in Europe – vulnerable to internal and external security threats while losing economic gains by the minute.[13] When decision-makers chose to place such critical yet vulnerable infrastructure at the helm of national economies, they should have taken note of the vulnerabilities it would bring and made concrete efforts to harden it against cyber-security threats. An innovative decision to ensure uninterrupted internet connectivity must be matched with the same innovativeness in adopting the IOT as a means of ensuring data security.[14]
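To make the point concrete, the sketch below is a purely hypothetical illustration, not drawn from the UK report: it shows how a single weak node in an inventory of interconnected devices can be flagged before it exposes the wider network. Every device name and policy check in it is an assumption.

```python
# Illustrative sketch: audit a hypothetical inventory of networked devices
# and flag any node whose weakness would expose the shared network.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    uses_tls: bool             # traffic encrypted in transit
    default_credentials: bool  # factory password never changed
    firmware_current: bool     # patched against known vulnerabilities

def audit(devices: list) -> list:
    """Return the names of devices that weaken the whole network."""
    findings = []
    for d in devices:
        if not d.uses_tls or d.default_credentials or not d.firmware_current:
            findings.append(d.name)
    return findings

inventory = [
    Device("wearable-tracker", uses_tls=False, default_credentials=True, firmware_current=True),
    Device("smart-thermostat", uses_tls=True, default_credentials=False, firmware_current=False),
    Device("vehicle-gateway", uses_tls=True, default_credentials=False, firmware_current=True),
]

# One weak device is enough to put every interconnected system at risk.
print("Devices requiring remediation:", audit(inventory))
```

The value of such an audit lies less in the code than in keeping the inventory complete: a device missing from the list is precisely the unsecured node the paragraph above warns about.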
The collection, assessment and dissemination of critical, time-sensitive intelligence enhances decision-makers' ability to make swift decisions, to enforce security protocols for certain instruments of power, and to prepare and – if need be – deploy defensive manoeuvres against an incoming threat. All of these tasks could be achieved by enabling the use of the IOT for military purposes.
From the above, it is evident how the IOT can pose a grave threat to national security. Today the success of the land forces component in operations relies largely on its ability to view the area of operations (AO) in real time, to interpret the adversary's decision-making, to coordinate its own actions on the battlefield directly, and to anticipate the adversary's tactical posture – all of which is relayed to higher echelons in real time and in high definition (HD). The land forces' ability to maximise their performance during combat operations is largely credited to satellites in outer space that relay information as they view it. Consequently, the land forces' operational capability can be compromised first by rendering them 'blind' – especially during active fire engagements, irregular or stabilisation operations, or post-conflict negotiations – by armed groups, terror factions or rebel forces seeking to create chaos or disarray in decision-making.[15]
Information Technology Security
Cyber is the fifth dimension of warfare, complementing the other four: land, sea, air and outer space.[16] On numerous occasions we have witnessed democracies and violent non-state actors engaging each other on the cyber front, with the latter degrading critical infrastructure of the former – particularly oil and gas pipelines – to inflict massive economic losses and make a point. These attacks are carried out with one or more of the following objectives:
- to mock the inadequate security measures of the targeted state,
- to steal critical information pertaining to a resource,
- to collect data through espionage,
- to incite marginalised communities through false propaganda,
- to destabilise the legitimate government, or
- to demonstrate superiority through offensive action.[17]
Illicit cyber activities are not confined to one particular region or to a single instrument of national power; cyber-attacks are omnipresent, irrespective of time and space, and their variants – and the intensity of coordinated hacks – differ from attack to attack, often proving too complex for military establishments or civilian institutions to deter.[18] This cost-efficient, extremely volatile form of action, compounded by the target nation's red tape and bureaucratic hurdles that increase the potency of such attacks, may in the future prove explosive enough to bring devastating consequences.
Strategic scholars continue to cite military installations as more vulnerable than civilian-administered instruments of national power.[19] On 23 December 2015, hackers from a known Russian military group systematically compromised the information systems of three energy distribution companies in Ukraine and disrupted the electricity supply to the local population, albeit temporarily.[20] Although it took hours to transfer control from automated to manually operated systems, experts were able to restore electricity.[21] After a detailed probe lasting months, investigators identified a known covert Russian hacker group whose attack had been designed to study the reaction time of the Ukrainian authorities in the wake of a full-scale hybrid conflict.[22] Had this happened at the moment of an external aggression, the outcome would have been severely catastrophic, particularly while the target state was struggling to restore its electricity supply or to recalibrate other instruments of national power – notably civilian-administered communications systems. If the aggressor simultaneously initiated a full-scale mobilisation of conventional forces, such an attack could bring any economically prospering, socially thriving nation to its knees.
Computing Machinery with Intelligence
Experts define AI as a technical instrument with the ability to perform auto-generated tasks independently (without remote assistance), to self-direct and course-correct, and to study previous courses of action while identifying new solutions for executing future tasks.[23] Current research on AI points to its vitality for both military-administered and civilian-hosted critical instruments of power.[24] The results of this research appear more than just promising: they are viable, practical and soon to be tested on the battlefield. Some segments of the research are still in the developmental phase, while others have surpassed expectations early enough to be put to the test in advanced phases. On that note, the Defense Advanced Research Projects Agency (DARPA) not long ago organised an open-ended competition in which research institutions and academic research and development labs were invited to create an AI that could carry out, or defend against, a hack on another AI-enabled module. The outcome was fruitful and probably quite valuable for the US military research agency, which was looking to identify computational systems whose programmable logic gave the AI an ability to identify vulnerabilities in a targeted AI-enabled system and to maintain a certain frequency and momentum of attacks until that system collapsed.[25] An AI-enabled system with the right programmable logic would be able to adapt to new ecosystems and mount offensive or defensive actions against an adversary. That said, to ensure correct and timely adaptation within such a system, decision-makers must formulate policy papers and standard guidelines covering appropriate defensive measures and operational capabilities, desired and expected outcomes, and foreseeable threats – with special emphasis on the capacity to switch automated systems to a manually operated design in the event of a technical breach or human error.[26]
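The following is only a conceptual sketch of the switch-to-manual principle described above, not a description of any DARPA or fielded system; every host name, confidence value and threshold is invented for illustration.

```python
# A minimal sketch of an automated defensive loop that hands control back to a
# human operator whenever it is not confident enough to act autonomously.
# All events and thresholds are hypothetical.

from typing import Optional

AUTO_CONFIDENCE_FLOOR = 0.7  # below this, the system must not act on its own

def automated_response(event: dict) -> Optional[str]:
    """Return a countermeasure if the system is confident, otherwise None."""
    if event["confidence"] >= AUTO_CONFIDENCE_FLOOR:
        return f"isolate:{event['host']}"
    return None

def handle(events: list) -> None:
    for event in events:
        action = automated_response(event)
        if action is None:
            # Fall back to manual operation, as the guidance above emphasises.
            print(f"[MANUAL] Operator review required for {event['host']} "
                  f"(confidence {event['confidence']:.2f})")
        else:
            print(f"[AUTO] Executing {action}")

handle([
    {"host": "scada-01", "confidence": 0.92},  # clear signature: automated response
    {"host": "relay-07", "confidence": 0.41},  # ambiguous signal: human decides
])
```

The essential design choice here is that the confidence floor, not the algorithm, is what decision-makers set through policy.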
AI has the ability to alter the way we understand war, to redefine the traditional concepts and peculiarities surrounding modern warfare, and perhaps to alter our perception of, or contention with, the concept of future war.[27] That said, not every scholar perceives AI as a tool or an asset.[28] Once programmed with the right logic, AI can be used to anticipate an adversary's movements in combat and predict the probable decisions of their higher echelons. It can further be used to minimise collateral damage during combat operations by identifying areas where explosives such as IEDs are likely to be emplaced – with such systems deployed on Infantry Combat Vehicles or Mine-Resistant Ambush Protected (MRAP) vehicles – as well as to provide concrete, viable and applicable solutions to decision-makers by using ISR capabilities to collect actionable intelligence.[29] AI can be deployed for both offensive and defensive measures in the fifth dimension of warfare. From a defensive standpoint, information security professionals are always wary of vulnerabilities in the network: they constantly predict future threats, hypothesise about their complexity, carry out thorough damage assessments and formulate defensive manoeuvres against offensive actions. AI will assist information security administrators in identifying and isolating certain threats and even help human operators prioritise threats by severity. With access to vast quantities of data, AI systems can generate analytics that allow administrators to predict likely areas of breach and vulnerability – a task that even the best human minds cannot achieve in a reasonable time. They could further reinforce the analytical capabilities of human administrators and even help financial planners identify surplus expenditure. On the other hand, land force components are not yet ready to exploit AI fully as a tool in combat operations or even in decision-making. According to military theorists, artificial intelligence is artificial in nature and does not possess the cognitive skills, intellect, passion, sensation or sentiment that – among other factors – make a human, human.[30]
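A minimal sketch of the prioritisation task described above follows; the alert fields and weights are invented assumptions rather than any particular product or doctrine.

```python
# Illustrative sketch: rank hypothetical security alerts by a severity score
# so that human administrators see the most pressing items first.

def severity(alert: dict) -> float:
    """Combine invented factors into a single score (weights are assumptions)."""
    return (3.0 * alert["asset_criticality"]
            + 2.0 * alert["exploit_likelihood"]
            + 1.0 * alert["anomaly_score"])

def prioritise(alerts: list) -> list:
    """Return alerts sorted from most to least severe."""
    return sorted(alerts, key=severity, reverse=True)

alerts = [
    {"id": "A-103", "asset_criticality": 0.9, "exploit_likelihood": 0.4, "anomaly_score": 0.7},
    {"id": "A-219", "asset_criticality": 0.3, "exploit_likelihood": 0.9, "anomaly_score": 0.2},
    {"id": "A-054", "asset_criticality": 0.6, "exploit_likelihood": 0.6, "anomaly_score": 0.9},
]

for alert in prioritise(alerts):
    print(alert["id"], round(severity(alert), 2))
```

In practice the weights would come from doctrine and threat assessments rather than the arbitrary constants used here.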
Although AI can prove to be a real asset to modern militaries, it can also be programmed to masquerade as a functionary accepting tasks and may even provide incorrect solutions or false information as part of an adversary's information operations. AI lacks cognitive skill and may prove violent during combat operations, especially in urban settings. If deployed directly in conventional battles, AI systems may be captured by adversary factions or terror groups, particularly at a time when modern warfare has taken an unconventional posture. It is important for military leaders and decision-makers to provide a roadmap for deploying AI in unconventional operations and in areas of high unpredictability or complexity, as well as to identify solutions for deploying weapon systems that are autonomous and possess learning capability.
Fake News
This article would not be complete without analysing the emerging and increasingly prominent (especially during the COVID-19 pandemic) concept of fake news, i.e. false information intentionally propagated to create havoc, confusion or disarray among the masses.[31] In a digital world where propaganda persists, the evolution of ICTs has revolutionised the way information is propagated. Nations have for decades fuelled the sentiments of certain communities to appease them or rally them in their favour – to win votes, sympathy or even support for war efforts – yet the saga of unending, intentionally circulated false information is not as straightforward or identifiable as propaganda.[32] It is every aspect of the argument – twisted arguments, ideology, factionalism, exaggerated notions, rumours, one-sided or incorrect interpretations, and carefully planted theories – all put in a blender and intermixed to produce a notion strong enough to influence even the most educated communities of a nation.[33] Picked up by streaming agencies with like-minded ideologies, the propagators spread their alternative to the truth on social media through various discussion groups, often targeting particularly susceptible sections of the community. This process makes the fabrication seem truer than the truth itself to some, and leaves law enforcement institutions in a quagmire as to where to begin.[34]
COVID-19 has unveiled the scale of fake news and online hate, which previously surfaced mainly around election days.[35] These tactics are not limited to civil society; the military, too, is targeted. Coalition forces that interact directly with local populations are at the nexus of such attacks, especially when they are carried out by rogue factions seeking to create disarray and to plant the seed of mistrust within fractured communities.[36] When coalition forces face a conventional adversary, fake news becomes a tool for spreading deceit and placing stress on the hearts and minds of troops through efforts to intimidate, humiliate and even threaten them, using the local population as intermediaries – where the troops remain exposed.
Maintaining high morale is part of modern military training, but military leaders and decision-makers must also formulate a concrete counter-fake-news policy to reinforce troops during active operations, especially those deployed to win the hearts and minds of fractured local communities.
Conclusion
This article has highlighted significant concerns about emerging technologies that continue to challenge the land forces component's operational capabilities, while identifying solutions to assist decision-makers in creating a concrete policy. The author has highlighted the threats posed by the IOT, AI, fake news, and the various actors intent on exploiting a state's information security measures. That said, AI is flexible and fluid enough to be integrated into systems software and synchronised with human decision-making and cognitive skills; this synchronisation must be extensively studied and evaluated by military leaders.
Above all, the most important factor in the success of the land forces' operational capability is cooperation and coordination between civilian decision-makers and military leaders. Their ability to work in an integrated capacity is one of the critical factors that will determine the success of troops during combat operations.
End Notes
[1] Allenby, Brad. 2013. "The Implications of Emerging Technologies for Just War Theory." Public Affairs Quarterly 27 (1): 49–67.
[2] Nish, Adrian, Saher Naumaan, and James Muir. 2020. “Enduring Cyber Threats and Emerging Challenges to the Financial Sector.” Carnegie Endowment for International Peace.
[3] Krausz, Michael, and John Walker. 2013. The True Cost of Information Security Breaches and Cyber Crime. IT Governance Publishing.
[4] Rastogi, Nidhi, Marie Joan Kristine Gloria, and James Hendler. 2015. “Security and Privacy of Performing Data Analytics in the Cloud: A Three-Way Handshake of Technology, Policy, and Management.” Journal of Information Policy 5 (May): 129–54.
[5] Szpyra, Ryszard. 2014. “Military Security within the Framework of Security Studies.” Connections 13 (3): 59–82.
[6] Pattnaik, Binay Kumar. 2013. "Globalization, ICT Revolution in India and Socio-Cultural Changes: Sociological Explorations." Polish Sociological Review, no. 181 (May): 39–62.
[7] Poudel, Swaroop. 2016. “Internet of Things.” Berkeley Technology Law Journal 31 (2): 997–1022.
[8] Hsu, D Frank, and Dorothy Marinucci, eds. 2013. Advances in Cyber Security. Fordham University Press.
[9] Rubin, Charles T. 2003. “Artificial Intelligence and Human Nature.” The New Atlantis, no. 1 (May): 88–100.
[10] Vasu, Norman, Benjamin Ang, Terri-Anne Teo, Shashi Jayakumar, Muhammad Faizal, and Juhi Ahuja. 2018. "Unpacking Fake News." In Fake News: National Security in the Post-Truth Era. S. Rajaratnam School of International Studies.
[11] Stilgoe, Jack. 2018. "We Need New Rules for Self-Driving Cars." Issues in Science and Technology 34 (3): 52–57.
[12] Kaur, Kanwaldeep, and Giselle Rampersad. 2018. “Trust in Driverless Cars: Investigating Key Factors Influencing the Adoption of Driverless Cars.” Journal of Engineering and Technology Management 48: 87–96.
[13] Winseck, Dwayne. 2017. “The Geopolitical Economy of the Global Internet Infrastructure.” Journal of Information Policy 7 (May): 228–67.
[14] Maple, Carsten. 2017. “Security and Privacy in the Internet of Things.” Journal of Cyber Policy 2 (2): 155–84.
[15] Szymanski, Paul. 2019. “Techniques for Great Power Space War.” Strategic Studies Quarterly 13 (4): 78–104.
[16] McGuffin, Chris, and Paul Mitchell. 2014. “On Domains: Cyber and the Practice of Warfare.” International Journal 69 (3): 394–412.
[17] Gartzke, Erik. 2013. “The Myth of Cyberwar: Bringing War in Cyberspace Back Down to Earth.” International Security 38 (2): 41–73.
[18] Healey, Jason. 2011. “The Spectrum of National Responsibility for Cyberattacks.” The Brown Journal of World Affairs 18 (1): 57–70.
[19] Galinec, Darko, Darko Možnik, and Boris Guberina. 2017. “Cybersecurity and Cyber Defence: National Level Strategic Approach.” Automatika 58 (3): 273–86.
[20] Maliarchuk, Tamara, Yuriy Danyk, and Chad Briggs. 2019. “Hybrid Warfare and Cyber Effects in Energy Infrastructure.” Connections 18 (1/2): 93–110.
[21] Smeets, Max. 2018. “The Strategic Promise of Offensive Cyber Operations.” Strategic Studies Quarterly 12 (3): 90–113.
[22] Kostyuk, Nadiya, Scott Powell, and Matt Skach. 2018. “Determinants of the Cyber Escalation Ladder.” The Cyber Defense Review 3 (1): 123–34.
[23] Ågerfalk, Pär J. 2020. “Artificial Intelligence as Digital Agency.” European Journal of Information Systems 29 (1): 1–8.
[24] Topychkanov, Petr. 2020. "The Impact of Artificial Intelligence on Strategic Stability and Nuclear Risk." Stockholm International Peace Research Institute.
[25] Guyonneau, Rudy, and Arnaud Le Dez. 2019. “Artificial Intelligence in Digital Warfare.” The Cyber Defense Review 4 (2): 103–16.
[26] Wu, Tim. 2019. "Will Artificial Intelligence Eat the Law? The Rise of Hybrid Social-Ordering Systems." Columbia Law Review 119 (7): 2001–28.
[27] Johnson, James S. 2020. “Artificial Intelligence.” Strategic Studies Quarterly 14 (1): 16–39.
[28] Horowitz, Michael C., and Paul Scharre. 2021. "Military Uses of AI." In AI and International Stability: Risks and Confidence-Building Measures. Center for a New American Security.
[29] Austin, G., ed. 2020. National Cyber Emergencies: The Return to Civil Defence. 1st ed.
[30] Scala, N. M., and J. P. Howard II, eds. 2020. Handbook of Military and Defense Operations Research. 1st ed. Chapman and Hall/CRC.
[31] Huhtinen, A-M, and J Rantapelkonen. 2016. “Disinformation in Hybrid Warfare.” Journal of Information Warfare 15 (4): 50–67.
[32] Rogers, Richard, and Sabine Niederer, eds. 2020. The Politics of Social Media Manipulation. Amsterdam University Press.
[33] Yerlikaya, Turgay, and Seca Toker Aslan. 2020. "Social Media and Fake News in the Post-Truth Era." Insight Turkey 22 (2): 177–96.
[34] Sample, C, J McAlaney, J Z Bakdash, and H Thackray. 2018. “A Cultural Exploration of Social Media Manipulators.” Journal of Information Warfare 17 (4): 56–71.
[35] Maréchal, Nathalie, Rebecca MacKinnon, and Jessica Dheere. 2020. "Targeted Advertising and COVID-19 Misinformation." In Getting to the Source of Infodemics: It's the Business Model. A Report from Ranking Digital Rights. New America.
[36] Szymański, Marcin. 2017. "Disturbing Potential." Politeja, no. 50/5 (May): 167–92.