Scientists Urgently Call For Ban On Killer Robots

Technocrats build because they can, not because of a pressing social need, and thus autonomous killer robots will soon be a household name. The video below is fictional, but the technology is thoroughly plausible. Since military and police forces are already investing heavily in this type of technology, it is certain to end up in the hands of criminals and terrorists at some point. – TN Editor

The movie portrays a brutal future. A military firm unveils a tiny drone that hunts and kills with ruthless efficiency. But when the technology falls into the wrong hands, no one is safe. Politicians are cut down in broad daylight. The machines descend on a lecture hall and spot activists, who are swiftly dispatched with an explosive to the head.

The short, disturbing film is the latest attempt by campaigners and concerned scientists to highlight the dangers of developing autonomous weapons that can find, track and fire on targets without human supervision. They warn that a preemptive ban on the technology is urgently needed to prevent terrible new weapons of mass destruction.

Stuart Russell, a leading AI scientist at the University of California, Berkeley, and others will show the film on Monday during an event at the United Nations Convention on Conventional Weapons hosted by the Campaign to Stop Killer Robots. The manufacture and use of autonomous weapons, such as drones, tanks and automated machine guns, would be devastating for human security and freedom, and the window to halt their development is closing fast, Russell warned.

“The technology illustrated in the film is simply an integration of existing capabilities. It is not science fiction. In fact, it is easier to achieve than self-driving cars, which require far higher standards of performance,” Russell said.

The military has been one of the largest funders and adopters of artificial intelligence technology. The computing techniques help robots fly, navigate terrain, and patrol territories under the seas. Hooked up to a camera feed, image recognition algorithms can scan video footage for targets better than a human can. An automated sentry that guards South Korea’s border with the North draws on the technology to spot and track targets up to 4 km away.

While military drones have long been flown remotely for surveillance and attacks, autonomous weapons armed with explosives and target recognition systems are now within reach and could locate and strike without deferring to a human controller. Opponents believe that handing machines the power over who lives and dies crosses a clear moral line.

