‘Learn to code’ will be the definitive idea for the 2030 Combat Engineer Regiment’s (CER) equipment and training requirements. While that phrase can be tricky[1], that skill just might save a future Sapper’s life. Unmanned ground and aerial vehicles (UGVs and UAVs) with robotic and autonomous systems (RAS), driven by artificial intelligence (AI) and machine learning (ML), will define the 2030 CER. Sappers conduct dangerous, dirty, and demanding tasks that enable the Joint Force to live, move and fight[2]. RAS will let Sappers do those tasks from outside the direct fire zone. The challenge is crafting the computer systems that make this possible. That’s where coding comes in.

In broad terms, coding is how we instruct computers to perform specific tasks through an executable program. Coding uses a variety of computer languages, algorithms, and logic (amongst other things) to ‘automate the performance of a task….’[3] Centuries ago, mechanical programming controlled machines like looms and clocks. Now, digital programming controls a large part of everyday life and military tasks. Coding built the original ARPANET, letting scientists quickly share information with their peers; that network evolved into a medium for your lonely aunt to send you unsolicited cat videos[4], and eventually enabled information to become a fully-fledged domain of war.
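To make that idea concrete, here is a trivial sketch (in Python, with invented item names and quantities) of code ‘automating the performance of a task’ – checking an entire stores list in one pass instead of item by item:

```python
# A trivial example of automating a task: flag every store item that
# has fallen below its restock level. Items and figures are invented.

stock_on_hand = {"pickets": 120, "wire_coils": 40, "sandbags": 900}
restock_level = {"pickets": 200, "wire_coils": 100, "sandbags": 1000}

for item, qty in stock_on_hand.items():
    if qty < restock_level[item]:
        print(f"Reorder {item}: {restock_level[item] - qty} more required")
```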

In the next few years, Land 8160 will give Army an armoured Engineer capability. Sappers will do mobility and counter-mobility tasks shielded by main battle tank-level armour. The CERs will have assault breaching vehicles, assault bridging vehicles and armoured Engineering vehicles, but it will still take Sappers inside those vehicles to operate them. Our enemy will improve and adapt his weapons to defeat our armour, but we can only improve our vehicles so much before we hit hard physical limits. Vehicles that are too large or too heavy cause logistical headaches in movement and employment; overly complex vehicles demand long training programs, which makes trained operators and maintainers hard to replace. The fix is to take the human out of the vehicle, which brings us back to RAS.

Before we go deeper into what RAS for Engineers will look like, consider some history. Military engineering has stayed mostly stable in its core functional areas throughout history and across civilisations: mobility, counter-mobility, survivability, and general engineering. The tools, techniques, and scale change over the years, but the difference is one of degree, not kind. Roman Legions 2,000 years ago and Mongol Hordes 1,000 years ago both used specialist soldiers to build their camps, siege engines and roads. Today’s military engineer works within these same areas and will continue to do so. However, RAS, ML, and AI will bring a major evolution, if not a revolution, in military engineering. Underlying it all is the need to talk (literally) with the computers controlling these devices and applications – coding.

So now, back to RAS. The 2030 CER’s armoured breaching, engineering and bridging vehicles will be either robotic or autonomously operated. Land 8160’s consolidated operational needs (CONs) document directs Army to consider unmanned vehicles in a ‘fitted for, not with’ design, so the groundwork is there. Remote-controlled vehicles aren’t new to military use, but the traditional control system uses the same interface as a remote-controlled toy – the vehicle only does what the controller tells it to do. RAS-enabled vehicles, in contrast, can perform a task free of the operator. What the operator needs to know, however, is how to program the computer to do those tasks. The 2030 CER will use vehicles that combine Land 8160 capabilities with autonomous unmanned vehicles.
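That distinction is worth sketching. Below is a hedged, illustrative Python fragment – the class and method names are invented, and no real vehicle interface is implied – contrasting a remote-controlled vehicle, which executes one command at a time, with a RAS-enabled one that runs an operator-authored task plan on its own:

```python
# Illustrative only: invented classes and method names, no real vehicle API.

class RemoteControlledVehicle:
    """Does only what the controller tells it, one command at a time."""

    def drive(self, heading, speed):
        print(f"Driving on heading {heading} at {speed} km/h")

    def deploy_line_charge(self):
        print("Line charge deployed")


class AutonomousVehicle(RemoteControlledVehicle):
    """Executes a whole operator-programmed task without further input."""

    def execute_task(self, task_plan):
        # The operator writes (or selects) the plan once; the vehicle
        # then sequences the individual actions itself.
        for action, args in task_plan:
            getattr(self, action)(*args)


# Operator-authored plan: move to the obstacle belt, then fire the charge.
plan = [("drive", (45, 20)), ("deploy_line_charge", ())]
AutonomousVehicle().execute_task(plan)
```

The operator’s skill shifts from steering in real time to writing (or assembling) the plan – which is exactly the coding skill this article argues for.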

How do AI and ML work? AI mostly refers to machines that make decisions requiring a human level of skill.[5] For AI in robotics, it means programming a robot to exhibit certain behaviours and actions in a specified environment. With autonomous robotics, AI robots could develop and control themselves autonomously, and adapt to uncertain and incomplete information inputs.[6] Machine learning is a subset of AI that lets a machine learn from experience. Consider an automatic dog-grooming machine. The user programs it with a basic definition of a dog – furry, four legs, has a tail, is a good boy; gives it pictures of various dog breeds; then tells it to groom only dogs. After repeated instances of grooming dogs, the ML software learns that a cat is not a dog, and therefore not to groom the cat if presented with the choice.
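The grooming example can be sketched in a few lines. The toy Python below uses an invented feature vector and a nearest-neighbour rule – one of the simplest learning methods, not any particular product’s algorithm – to show how labelled examples let a machine decide ‘dog or not dog’:

```python
# A toy sketch of learning from labelled examples. The features,
# animals and figures are all invented for illustration.

# Feature vector: (leg_count, tail_length_cm, fur_density_0_to_1)
training_data = [
    ((4, 30, 0.90), "dog"),
    ((4, 35, 0.80), "dog"),
    ((4, 25, 0.95), "cat"),
    ((4, 22, 0.90), "cat"),
]

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(animal):
    """Label a new animal by its single nearest training example."""
    _, label = min(training_data, key=lambda ex: distance(ex[0], animal))
    return label

new_arrival = (4, 24, 0.92)  # suspiciously cat-like
if classify(new_arrival) == "dog":
    print("Groom it: it is a good boy.")
else:
    print("Do not groom: that is a cat.")
```

A real system would learn from thousands of images rather than four hand-typed rows, but the principle is the same: the machine’s decisions improve as labelled experience accumulates.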

The future Sapper will operate UGVs through software and applications on a tablet, in a more indirect manner than today’s remote-controlled vehicles allow. In today’s terms, the operator uses the controller to drive the breaching vehicle to the obstacle belt, and then tells it to deploy the line charges, mine plough, and lane-marking systems in sequence. The current fleet of EOD robots is a useful analogue for this method. The 2030 operator will instead program the breaching vehicle to move on its own, tactically and in sync with other vehicles (manned or not), perform its Engineer tasks, and then prepare for an autonomous UAV resupply. The Engineer UGV does all this because the operator wrote a computer program telling it to do so. If AI and ML develop to an advanced enough stage, it is entirely possible the operator simply opens the ‘options’ drop-down menu on a tablet, selects the ‘BREACH’ option, inputs a start point, and taps ‘enter’. The AI software handles everything else.
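As a thought experiment, that ‘BREACH’ button might compile down to something like the Python sketch below. Every step, function name and grid reference is invented for illustration:

```python
# A speculative sketch of a one-tap 'BREACH' task plan. All names,
# steps and message formats here are invented.

from dataclasses import dataclass


@dataclass
class GridRef:
    easting: int
    northing: int


def breach_mission(start: GridRef):
    """Yield the ordered steps of a hypothetical autonomous breach."""
    yield f"Move tactically to obstacle belt from GR {start.easting} {start.northing}"
    yield "Deploy line charges"
    yield "Deploy mine plough and prove the lane"
    yield "Deploy lane-marking system"
    yield "Report lane open and stage for autonomous UAV resupply"


# What the tablet might do after the operator taps 'enter':
for step in breach_mission(GridRef(easting=345, northing=678)):
    print(step)
```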

While AI and ML programs are the tools used, the actual skill required is coding. Coding underpins many of the applications of modern life, and that reliance will only grow in the years to come. Consider the humble warehouse. It’s a building to store things, basic as far as structures go. Yet the trend now is towards automated (read: human-less) warehouses. Chinese e-commerce company JD.com recently partnered with Mujin, a Japanese robotics and automation company, to open a 40,000 m² facility run entirely by industrial robots and computers. A warehouse that once employed 500 people now has a staff of five – and they only maintain the machines.[7]

So is a fleet of autonomous Engineer vehicles in the next 10 years such a far-fetched idea? Many car makers are already testing self-driving cars. In a more advanced application, the US Air Force’s Skyborg program seeks to pair XQ-58 Valkyrie drones with either F-35 or F-15EX jets. Under Skyborg, a Valkyrie drone is upgraded with new sensors and payloads and networked to a mothership. Ideally, an AI system would allow a single drone or a group of drones to train and learn alongside pilots. For the US Air Force, the desired capability would have an F-35 controlling several networked drones that provide the ‘sensor’ end of the ‘sensor-shooter’ link, and create and share a truly common operational picture. The Skyborg program aims for an initial test in 2023.[8]

Closer to (the Engineering) home, the US Marine Corps is testing automated 3D-printed construction equipment. Recently, Marine Corps Engineers and construction company ICON built a shelter for an M142 HIMARS multiple-launch rocket vehicle using ICON’s Vulcan 3D printer. The Marines loaded the shelter’s plans and specifications onto a mobile tablet linked to the Vulcan, and the printer then laid down successive layers of ICON’s proprietary Lavacrete mix. The result was a concrete shelter rated to 6,000 psi and ready for use in 36 hours – vastly faster and more expeditionary than standard methods.[9]

Given these examples, is it too much to think that in 10 years’ time a Combat Engineer section will have soldiers skilled in coding rather than today’s mobility, counter-mobility and survivability skills? That those soldiers would husband autonomous aerial and ground vehicles instead of doing those tasks themselves? Amazon uses AI and ML software to predict what you want to buy next, so why couldn’t an automated Q-store use AI and ML to predict a Class I, III, and V resupply (in the correct quantities), then dispatch it with an autonomous UAV? Or that a forward-deployed section commander could demand replacement parts for a vehicle repair, have the roboticised RPS 3D-print the replacements, and have an autonomous UGV deliver them?
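None of that demands exotic technology. A hedged sketch of the predictive Q-store – a naive moving-average forecast over invented consumption figures, standing in for whatever production-grade model a real system would use – might look like this:

```python
# A speculative sketch of an automated Q-store: forecast each supply
# class from recent consumption and dispatch a UAV when stocks run
# short. All figures and thresholds are invented.

# Daily consumption for the last five days, by supply class.
consumption = {
    "Class I (rations)": [90, 95, 100, 110, 105],
    "Class III (fuel)":  [400, 420, 390, 450, 430],
    "Class V (ammo)":    [20, 60, 15, 80, 25],
}
stock_on_hand = {
    "Class I (rations)": 150,
    "Class III (fuel)":  900,
    "Class V (ammo)":    200,
}

for supply_class, history in consumption.items():
    # Naive forecast: tomorrow looks like the recent average.
    forecast = sum(history) / len(history)
    days_of_supply = stock_on_hand[supply_class] / forecast
    if days_of_supply < 2:            # resupply trigger, chosen arbitrarily
        qty = round(forecast * 3)     # push three days of supply forward
        print(f"Dispatch UAV: {qty} units of {supply_class}")
    else:
        print(f"{supply_class}: {days_of_supply:.1f} days of supply held")
```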