*Image: "Killer robots" may seem like something from a sci-fi film, but reality is catching up*
Entire regiments of unmanned tanks; drones that can spot an insurgent in a crowd of civilians; and weapons controlled by computerised "brains" that learn like we do, are all among the "smart" tech being unleashed by an arms industry many believe is now entering a "third revolution in warfare".
"In every sphere of the battlefield - in the air, on the sea, under the sea or on the land - the military around the world are now demonstrating prototype autonomous weapons," says Toby Walsh, professor of artificial intelligence at Sydney's New South Wales University.
"New technologies like deep learning are helping drive this revolution. The tech space is clearly leading the charge, and the military is playing catch-up."
*Image: Russian arms maker Kalashnikov is developing a suite of fully automated weapons*
One such weapon features a 7.62mm machine gun and a camera attached to a computer system that its makers claim can make its own targeting judgements without any human control.
Unlike a conventional computer, which uses pre-programmed instructions to tackle a specific but limited range of predictable possibilities, a neural network is designed to learn from previous examples and then adapt to circumstances it may not have encountered before.
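To make that distinction concrete, here is a minimal Python sketch. Everything in it, from the speed thresholds to the labels, is invented for illustration and drawn from no real system: a pre-programmed rule can only answer the cases its authors wrote down, while a system that learns from examples offers a "best guess" even for inputs it has never seen.

```python
# Illustrative sketch only - all thresholds, labels and data are made up.

def rule_based_classify(speed_mps):
    """Pre-programmed: only handles the cases its authors anticipated."""
    if speed_mps > 300:
        return "missile"
    if speed_mps > 50:
        return "aircraft"
    return "unknown"  # anything outside the rules falls through

def nearest_neighbour_classify(speed_mps, examples):
    """Learned: labels a new input by its closest previously seen example,
    so it produces a 'best guess' even for speeds never encountered."""
    return min(examples, key=lambda ex: abs(ex[0] - speed_mps))[1]

# Hypothetical training examples: (speed in metres per second, label)
examples = [(900, "missile"), (250, "aircraft"), (80, "drone")]

print(rule_based_classify(160))                   # "aircraft" - a fixed rule
print(nearest_neighbour_classify(160, examples))  # "drone" - a best guess
```

Note that the learned version never answers "unknown"; it always guesses, and that behaviour is exactly what worries the experts quoted below.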
*Image: Would robot combat systems make fewer mistakes than humans?*
And it is this supposed ability to make its own decisions that is worrying to many.
"If weapons are using neural networks and advanced artificial intelligence then we wouldn't necessarily know the basis on which they made the decision to attack - and that's very dangerous," says Andrew Nanson, chief technology officer at defence specialist Ultra Electronics.
But he remains sceptical about some of the claims arms manufacturers are making.
Automated defence systems can already make decisions based on an analysis of a threat - the shape, size, speed and trajectory of an incoming missile, for example - and choose an appropriate response much faster than humans can.
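As a rough illustration of that kind of automated decision loop, here is a hypothetical Python sketch. The track properties, thresholds and responses are all made up, not taken from any actual defence system:

```python
# Hypothetical threat assessment - illustrative values only.
from dataclasses import dataclass

@dataclass
class Track:
    size_m: float      # estimated length in metres
    speed_mps: float   # closing speed in metres per second
    descending: bool   # trajectory bearing towards the defended asset

def choose_response(track: Track) -> str:
    # Each branch encodes a threat profile the designers anticipated.
    if track.descending and track.speed_mps > 600:
        return "intercept"  # fast and inbound: likely a missile
    if track.size_m < 2 and track.speed_mps < 50:
        return "jam"        # small and slow: likely a drone
    return "monitor"        # unfamiliar profile: no pre-set answer

print(choose_response(Track(size_m=4.0, speed_mps=850, descending=True)))
# -> "intercept", decided in microseconds, far faster than a human could
```

Every response here is pre-enumerated, and a track that matches no anticipated profile falls through to "monitor". That limitation is precisely the problem the next question raises.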
But what happens when such systems encounter something they have no experience of, but are still given the freedom to act using a "best guess" approach?
Mistakes could be disastrous - the killing of innocent civilians; the destruction of non-military targets; "friendly fire" attacks on your own side.
*Image: Remotely piloted drones have been used to carry out missile attacks since 2001*
And this is what many experts fear: not that AI will become too smart - taking over the world like the Skynet supercomputer from the Terminator films - but that it will be too stupid.
"The current problems are not with super-intelligent robots but with pretty dumb ones that cannot flexibly discriminate between civilian targets and military targets except in very narrowly contained settings," says Noel Sharkey, professor of artificial intelligence and robotics at Sheffield University.
Despite such concerns, Kalashnikov's latest products are not the only autonomous and semi-autonomous weapons being trialled in Russia.
The Uran-9 is an unmanned ground combat vehicle armed with a machine gun and a 30mm cannon, and it can be remotely controlled at distances of up to 10km.
The prospect of autonomous weapons systems inadvertently leading to an escalation in domestic terrorism or cyber-warfare is perhaps another reason to treat this new tech with caution.