Omniviolence Is Coming And The World Isn’t Ready

This article raises interesting concerns about the ability of violence to be projected from anywhere in the world toward any target. Psychopaths of the future will use technology to disrupt the functioning of the nation-state.

Technocrats build because they can, not because there is a need to do so. Ethics and morality are not a consideration. ⁃ TN Editor

In The Future of Violence, Benjamin Wittes and Gabriella Blum discuss a disturbing hypothetical scenario. A lone actor in Nigeria, “home to a great deal of spamming and online fraud activity,” tricks women and teenage girls into downloading malware that enables him to monitor and record their activity, for the purposes of blackmail.

The real story involved a California man whom the FBI eventually caught and sent to prison for six years, but if he had been elsewhere in the world he might have gotten away with it. Many countries, as Wittes and Blum note, “have neither the will nor the means to monitor cybercrime, prosecute offenders, or extradite suspects to the United States.”

Technology is, in other words, enabling criminals to target anyone anywhere and, as these capabilities are democratized, increasingly at scale. Emerging bio-, nano-, and cyber-technologies are becoming more and more accessible. The political scientist Daniel Deudney has a word for what can result: “omniviolence.”

The ratio of killers to killed, or “K/K ratio,” is falling. For example, computer scientist Stuart Russell has vividly described how a small group of malicious agents might engage in omniviolence: “A very, very small quadcopter, one inch in diameter, can carry a one- or two-gram shaped charge,” he says. “You can order them from a drone manufacturer in China. You can program the code to say: ‘Here are thousands of photographs of the kinds of things I want to target.’ A one-gram shaped charge can punch a hole in nine millimeters of steel, so presumably you can also punch a hole in someone’s head.

“You can fit about three million of those in a semi-tractor-trailer. You can drive up I-95 with three trucks and have 10 million weapons attacking New York City. They don’t have to be very effective, only 5 or 10% of them have to find the target.” Manufacturers will be producing millions of these drones, available for purchase just as with guns now, Russell points out, “except millions of guns don’t matter unless you have a million soldiers. You need only three guys to write the program and launch.” In this scenario, the K/K ratio could be roughly 3/1,000,000, assuming 10-percent accuracy and only a single one-gram shaped charge per drone.
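To make the arithmetic behind that 3/1,000,000 figure explicit, here is a minimal back-of-the-envelope sketch in Python using only the numbers quoted above; the variable names and the assumption of one fatality per drone that finds its target are illustrative glosses, not anything Russell specifies.

```python
# Back-of-the-envelope arithmetic for the K/K ("killers to killed") ratio in
# Stuart Russell's drone-swarm scenario, using only the figures quoted above.
# Illustrative only: assumes one fatality per drone that finds its target.

attackers = 3               # "You need only three guys to write the program and launch."
total_drones = 10_000_000   # three truckloads, quoted as "10 million weapons"
hit_rate = 0.10             # "only 5 or 10% of them have to find the target" (upper figure)

expected_casualties = int(total_drones * hit_rate)
kk_ratio = attackers / expected_casualties

print(f"expected casualties: {expected_casualties:,}")                      # 1,000,000
print(f"K/K ratio: {attackers}/{expected_casualties:,} = {kk_ratio:.0e}")   # 3e-06
```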

[Embedded video from the Future of Life Institute. Warning: the video is dramatized and depicts events that have not happened.]

That’s completely—and horrifyingly—unprecedented. The terrorist or psychopath of the future, however, will have not just the Internet or drones—called “slaughterbots” in this video from the Future of Life Institute—but also synthetic biology, nanotechnology, and advanced AI systems at their disposal. These tools make wreaking havoc across international borders trivial, which raises the question: Will emerging technologies make the state system obsolete?

It’s hard to see why not. What justifies the existence of the state, English philosopher Thomas Hobbes argued, is a “social contract.” People give up certain freedoms in exchange for state-provided security, whereby the state acts as a neutral “referee” that can intervene when people get into disputes, punish people who steal and murder, and enforce contracts signed by parties with competing interests. 

The trouble is that if anyone anywhere can attack anyone anywhere else, then states will become—and are becoming—unable to satisfy their primary duty as referee. It’s a trend toward anarchy, “the war of all against all,” as Hobbes put it—in other words a condition of everyone living in constant fear of being harmed by their neighbors. Indeed, in a recent paper, “The Vulnerable World Hypothesis,” published in Global Policy, the Oxford philosopher Nick Bostrom argues that the only way to defend against a global catastrophe is to employ a universal and invasive surveillance system, what he calls a “High-tech Panopticon.”

Sound dystopian? It sure does to me. “Creating and operating the High-tech Panopticon would require substantial investment,” Bostrom writes, “but thanks to the falling price of cameras, data transmission, storage, and computing, and the rapid advances in AI-enabled content analysis, it may soon become both technologically feasible and affordable.” Bostrom is well aware of the downsides—corrupt actors in a state could exploit this surveillance for totalitarian ends, or hackers could blackmail unsuspecting victims. Yet the fact is that it may still be a better option than suffering one global catastrophe after another.

How can societies counterattack omniviolence? One strategy could be a superintelligent machine—essentially, an extremely powerful algorithm—that’s specifically designed to govern fairly. We could then put the algorithm in political charge and, insofar as it governs as something like a “Philosopher King,” not worry constantly about the data collected being misused or abused. Of course, this is a fantastical proposal. Even the real-world use of AI in the justice system is fraught with problems. But at this point, do we have a better idea for preventing the collapse of the state system under the weight of widespread technological empowerment?

Read full story here…

4 Comments
leo

Most of the “omniviolence” will be (or already is) programmed terror – like Operation Gladio, the War on Terror, etc. – state/elite/corporate-sponsored terrorism to keep the underclass worldwide in constant fear/control as they maintain their power base. Possibly the technology could be used by small groups of “freedom fighters” or “guerrilla” fighters scattered worldwide against the totalitarian overlords.

Elle

Why would you need to buy charges from China to blow up whomever, when a drone is already equipped with propellers spinning at high speed? Now think about it. A tiny drone with parts that spin? What can you do with that? Don’t overthink it. Typical of Hollywood-style dystopian scenarios, this one is way over-thought, making the idea of needing charges to blow up someone’s head quite silly. One thing was interesting: the ‘philosopher king’ idea of having one huge governing algorithm. It reminded me of a story told by Al Bielek about one of the alleged time travel experiments…

Petrichor

And your point in retelling this story is—?

Jim

Rudolf Steiner spoke of a coming “war of all against all”. Perhaps his perspective would provide helpful insights as well.