Robogeddon: Autonomous ‘Slaughterbots’ Programmed To Kill

Technocrats invent because they can, not because there is a demonstrated need to do so. When AI, GIS, GPS, facial recognition and military hardware meet, what would you expect other than automated slaughter? It’s faster, easier and far removed from pulling the trigger with an enemy in your sights. ⁃ TN Editor

A superpower arms race to build killer robots could wipe out humanity if left unchecked, experts fear.

The doomsday warning comes after a UN conference failed to agree a ban on Terminator-style “slaughterbots” – which are being developed by China, Russia and the US.

Major powers are investing billions to create advanced AI weapons that can hunt and strike targets with no input from controllers.

Last year a Turkish-made kamikaze drone made the world’s first autonomous kill on human targets in Libya, a UN report revealed.

But experts warn the technology is advancing so fast, governments and societies have not properly considered the dangers.

They say machines making their own decisions are prone to unpredictable and rapidly spreading errors.

These arise from the underlying algorithms – code that even the programmers don’t always understand and cannot stop from going awry.

If AI weapons in the future are armed with biological, chemical or even nuclear warheads, the results could be unintentional Armageddon.

“It is a world where the sort of unavoidable algorithmic errors that plague even tech giants like Amazon and Google can now lead to the elimination of whole cities,” warns Prof James Dawes of Macalester College.

“The world should not repeat the catastrophic mistakes of the nuclear arms race. It should not sleepwalk into dystopia.”

MIT professor Max Tegmark, co-founder of the Future of Life Institute, issued a similarly dire warning this week.

He told Wired: “The technology is developing much faster than the military-political discussion.

“And we’re heading, by default, to the worst possible outcome.”

A potential ban on so-called Lethal Autonomous Weapons Systems (LAWS) was discussed last week at the UN’s five-yearly Convention on Certain Conventional Weapons.

Some of the 120 nations taking part – including Brazil, South Africa and New Zealand – argued LAWS should be restricted by treaty, like landmines and some incendiaries.

A growing list of countries, including France and Germany, supports limits on some autonomous weapons, including those that target humans. China said it supports a narrow set of restrictions.

Other nations, including the US, Russia, India, the UK and Australia, resist a ban, saying that continuing to develop killer robots is essential to avoid being at a strategic disadvantage.

Life and death decisions

Already a terrifying array of AI weapons has been deployed around the world, including self-firing machine guns in the no man’s land of Korea’s Demilitarised Zone.

At least 14 countries have suicide AI drones, including Israel’s devastating Harop unmanned attack plane, which was used to hunt down Hamas terrorists.

Harops also devastated Armenia’s army during the clash with Azerbaijan last year – although it was not revealed if they acted with or without human input.

Russia’s deadly robot weapons include the new Checkmate stealth fighter, which combines AI systems with a human pilot.

Designers say there could be a future version with no need for a pilot.

China began tests over a decade ago on a robot submarine designed to track and destroy enemy ships autonomously.

It has also recently unveiled an anti-submarine drone and truck-launched battlefield drone swarms.

China is also building a robot warship armed with torpedoes, satellite pictures revealed in October.

All these killing machines can be programmed to seek human approval before attacking the targets they find.

But they could also be used as the ultimate “fire and forget” technology, loitering for hours or days ready to fire on targets chosen entirely by a computer.

Unchecked slaughter

Experts say leading militaries are kidding themselves if they believe they can control the spread of these advanced new weapons.

They say rogue states and terrorists will inevitably try to get hold of them to unleash massacres.

And as compact killer robots become as cheap as Kalashnikovs, gangs such as Mexico’s cartels could use them in a terrifying new wave of bloodshed.

“If you can buy slaughterbots for the same price as an AK-47, that’s much preferable for drug cartels, because you’re not going to get caught when you kill someone,” Prof Tegmark told The Sun.

“Even if a judge has lots of bodyguards, you can fly in through a bedroom window while they’re sleeping and kill them.”

And governments will be more likely to go to war in future, experts say, as robot weapons mean fewer soldiers in the firing line.

 

Read full story here…

About the Author

Patrick Wood
Patrick Wood is a leading and critical expert on Sustainable Development, Green Economy, Agenda 21, 2030 Agenda and historic Technocracy. He is the author of Technocracy Rising: The Trojan Horse of Global Transformation (2015) and co-author of Trilaterals Over Washington, Volumes I and II (1978-1980) with the late Antony C. Sutton.
