Future wars will be fought with physical and cyber robots
The “Internet of Intelligent Battle Things” (IOBT) is the emerging reality of warfare as AI and machine learning advance, according to Alexander Kott, chief of the Network Science Division of the US Army Research Laboratory.
He envisions a future where physical robots fly, crawl, walk, or ride into battle. Robots as small as insects could serve as sensors, while ones as big as large vehicles could carry troops and supplies. There will also be “cyber robots” – essentially autonomous programmes – used within computers and networks to protect communications, fact-check, relay information, and shield other electronic devices from enemy malware.
“In order to be effective in performing these functions, battle things will have to collaborate with each other, and also with the human warfighters. This will require a significant degree of autonomous self-organization; and also of accepting a variety of relations between things and humans,” Kott said in a research paper, to be released in the proceedings of the Spring Symposium Series of the Association for the Advancement of Artificial Intelligence (AAAI).
Kott’s ideas rest on the assumption that countries will obey a ban on autonomous weapons “beyond meaningful control”. Humans will therefore still, ultimately, lead and make decisions on the battlefield, whilst IOBT devices will mainly act as aids.
The plan also relies on AI and machine learning advancing beyond what is currently achievable. Neural networks are good at learning patterns in data for tasks such as image recognition or language translation, and can reach high accuracy, but doing so typically requires millions of training examples.
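To make that concrete, here is a minimal sketch – our illustration, not anything from Kott’s paper – using TensorFlow’s Keras API. The small classifier only reaches useful accuracy because the MNIST dataset hands it 60,000 labelled training images:

```python
# Minimal supervised-learning sketch (illustrative only): a tiny classifier
# that works because MNIST supplies 60,000 labelled training examples.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 60,000 labelled images go in; accuracy collapses if only a handful are available.
model.fit(x_train, y_train, epochs=5)
print(model.evaluate(x_test, y_test))
```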
No plan survives the first enemy contact
The battlefield, however, is a volatile and dynamic environment. IOBT systems will have to adapt to changing tactics, and learn from a small number of data samples that will be imperfect, unlabelled, and potentially even deceptive.
For robots, those samples might be video footage or camera feeds showing the local environment. Machine learning algorithms will have to work out which features matter: piles of rubble are probably irrelevant, for example, but such systems should still be able to spot a wounded ally buried among the debris.
Deep learning systems are often inefficient and rely on copious amounts of computing power to crunch the numbers. The hardware for IOBT, meanwhile, will have to be small and light enough to fit into tiny sensors and robots, and consume as little energy as possible.
“One might suggest that a way to overcome such limitations on computing resources available directly on the battlefield is to offload the computations via wireless communications to a powerful computing resource located outside of the battlefield,” the paper said.
“Unfortunately, it isn’t a viable solution, because the enemy’s inevitable interference with friendly networks will limit the opportunities for use of reach-back computational resources.”
The third key area is communication between humans and robots. Question answering is used to test knowledge in natural language processing, and it works well when machines are trained on large amounts of text, as IBM’s Watson was for the Jeopardy! game show. But to survive, IOBT will have to actually understand commands and engage in useful dialogue rather than simply memorising and recalling bits of information.
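For a sense of where today’s systems sit, here is a hedged sketch – our own example, using the open-source Hugging Face transformers library rather than anything cited in the paper – of extractive question answering. The model can only pull an answer span out of the text it is handed, which is closer to recall than to genuinely understanding a command:

```python
# Illustrative extractive QA: the model extracts a span from the supplied
# context; it does not reason about orders or hold a dialogue.
from transformers import pipeline

qa = pipeline("question-answering")  # downloads a default pretrained model

context = ("The squad's supply drone departed the forward operating base at "
           "06:00 and is expected to reach the rally point within two hours.")
result = qa(question="When did the drone leave the base?", context=context)
print(result["answer"], result["score"])
```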
Calling Agent X
Kott explained to The Register that it’ll require agents that are able to follow conversations and have general common sense and reasoning skills – something that today’s chatbots lack.
“There are significant gaps that must be filled in our knowledge of AI, in order to apply them truly broadly and effectively and safely within any complex domain of human endeavors,” he added.
“It is important to make use of the recent advances in AI, in certain types of tasks for which today’s AI is well suited. But it is also important to recognize those tasks where AI is not yet ready for robust applications.”
The US Army has invested in the Distributed and Collaborative Intelligent Systems and Technology (DCIST) Collaborative Research Alliance (CRA), a programme aimed at advancing IOBT research in a joint effort with US universities.
But the recent revelation of Google’s contract with the Pentagon highlights the gaps in the US Army’s AI workforce: it is relying on Google employees to help it use Google’s TensorFlow APIs to analyse drone footage with computer vision.
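For flavour, this is roughly what off-the-shelf TensorFlow computer vision looks like – a sketch of ours, not the Pentagon project’s code, and the frame filename is a placeholder – labelling a single video frame with a model pretrained on ImageNet:

```python
# Illustrative frame classification with a pretrained ImageNet model.
# "frame_0001.jpg" is a placeholder path, not real drone footage.
import numpy as np
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")

img = tf.keras.preprocessing.image.load_img("frame_0001.jpg", target_size=(224, 224))
x = tf.keras.preprocessing.image.img_to_array(img)
x = tf.keras.applications.mobilenet_v2.preprocess_input(x[np.newaxis, ...])

preds = model.predict(x)
for _, label, score in tf.keras.applications.mobilenet_v2.decode_predictions(preds, top=3)[0]:
    print(label, round(float(score), 3))
```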
At a recent Senate hearing, General Paul Nakasone, currently the commander of the United States Army Cyber Command, admitted that AI engineers are scarce.
“Indeed it’s a challenge for the Army to compete for talent with the Silicon Valley, and actually with many other places that have great high-tech entrepreneurship. But, it’s a good challenge to have. It keeps us on our toes in competing for best people,” Kott told El Reg.
“We are lucky in that so many brilliant scientists and engineers are patriotic and proud to serve in defense of our society, and are excited to work on challenging problems of the Army’s technology. And we augment the talents of our Government scientists by working closely with academia and businesses. We find many companies – large and small – very interested in working with us.” ®