Criminal gangs and terrorists could get their hands on murderous artificial intelligence to hunt down enemy targets.
And because they could be controlled from a distance, the culprits could be impossible to trace.
Computer engineer Dr Subhash Kak warned the parts needed to assemble killer robots will prove “not difficult” to buy on the black market.
And before long, he said, the robots themselves will be available to buy, ready to carry out mass murder on instruction.
The Oklahoma State University lecturer told Daily Star Online: “The technology for human-assisted killer robots is already here.
“Bad actors will be able to produce these using parts that are not too difficult to buy on the open market, or just get them on the black market.
“The possibility of untraceable terror attacks will increase.”
His comments come amid furious debate over the use of autonomous killer robots by countries on the battlefield.
Major powers including the US and Russia opposed a treaty banning them, sparking outrage from the pressure group Campaign to Stop Killer Robots.
Noel Sharkey, a roboticist acting as spokesman for Campaign to Stop Killer Robots, said: “The two main options on the table for next year’s work were binding regulations in the form of a political declaration led by Germany and France and negotiations towards a new international law to prohibit the use and development of autonomous weapons systems led by Austria, Brazil and Chile.
“Cuba was particularly stubborn and would not accept any wording that even hinted that there might be any benefits.
“The others conceded in the end with a compromise to take out the word ‘risks’, although the risks themselves remained.
“It is shameful that a handful of states can prevent the majority from moving towards negotiations that would regulate or prevent the use of these morally reprehensible weapons.”
He added: “The campaign strongly objects to permitting the development of weapons systems that, once activated, would be able to select and attack targets without human intervention.”
Dr Kak previously told us he believes AI brains can become “radicalised” for terrorist attacks.