Pentagon Chiefs Fear Killer Robots Could Potentially Wipe Out Humanity

Pentagon (Wikimedia Commons)

Only six months ago, the Pentagon denied that it would build autonomous killer robots. Today it is clear that it is not only building such machines but is also worried that autonomous robot wars could wipe out humanity. ⁃ TN Editor

Robotic weapons have become so advanced that top military experts in the US fear the plot of the sci-fi film ‘Terminator’ could come true.

Huge technological leaps forward in drones, artificial intelligence and autonomous weapon systems must be addressed before humanity is driven to extinction by mechanical overlords like in the 1984 Arnold Schwarzenegger classic, according to Pentagon chiefs.

Air Force General Paul Selva, the Vice Chairman of the Joint Chiefs of Staff at the US Defense Department, said so-called thinking weapons could lead to “robotic systems to do lethal harm… a Terminator without a conscience.”

When asked about robotic weapons able to make their own decisions, he said: “Our job is to defeat the enemy” but “it is governed by law and by convention.”

He says the military insists on keeping humans in the decision-making process to “inflict violence on the enemy”.

“That ethical boundary is the one we’ve drawn a pretty fine line on. It’s one we must consider in developing these new weapons,” he added.

Selva said the Pentagon must reach out to artificial intelligence tech firms that are not necessarily “military-oriented” to develop new systems of command and leadership models, reports US Naval Institute News.

Read full story here…


2 Comments on "Pentagon Chiefs Fear Killer Robots Could Potentially Wipe Out Humanity"

FreeOregon

General Paul Selva has a conscience?

John Dunlap
This article just strengthens my conviction that these weapons are not intended for conventional battlefields. AI will never make decisions “better” than a human can, but it already makes decisions much faster. These machines have no advantage against conventional forces, simply because opposing forces will field their own systems and countermeasures, but they pose a serious danger. Specifically, because of the speed at which autonomous weapons will make decisions, there is a very real probability that the shape of any conflict, including expansion and escalation, will evolve at a pace which outstrips the ability of decision makers to stay ahead…
