The movie portrays a brutal future. A military firm unveils a tiny drone that hunts and kills with ruthless efficiency. But when the technology falls into the wrong hands, no one is safe. Politicians are cut down in broad daylight. The machines descend on a lecture hall and spot activists, who are swiftly dispatched with an explosive to the head.
The short, disturbing film is the latest attempt by campaigners and concerned scientists to highlight the dangers of developing autonomous weapons that can find, track and fire on targets without human supervision. They warn that a preemptive ban on the technology is urgently needed to prevent terrible new weapons of mass destruction.
Stuart Russell, a leading AI scientist at the University of California, Berkeley, and others will show the film on Monday during an event at the United Nations Convention on Conventional Weapons hosted by the Campaign to Stop Killer Robots. The manufacture and use of autonomous weapons, such as drones, tanks and automated machine guns, would be devastating for human security and freedom, and the window to halt their development is closing fast, Russell warned.
“The technology illustrated in the film is simply an integration of existing capabilities. It is not science fiction. In fact, it is easier to achieve than self-driving cars, which require far higher standards of performance,” Russell said.
The military has been one of the largest funders and adopters of artificial intelligence technology. The computing techniques help robots fly, navigate terrain and patrol beneath the seas. Hooked up to a camera feed, image recognition algorithms can scan video footage for targets better than a human can. An automated sentry that guards South Korea’s border with the North draws on the technology to spot and track targets up to 4 km away.
While military drones have long been flown remotely for surveillance and attacks, autonomous weapons armed with explosives and target recognition systems are now within reach and could locate and strike without deferring to a human controller. Opponents believe that handing machines the power over who lives and dies crosses a clear moral line.