The question of when Russia will deploy giant anthropomorphic battle robots on its borders topped the list of questions put to President Vladimir Putin during his online news conference in 2006.
Putin politely skirted the question. But last March, his special envoy Dmitry Rogozin said that Russia does in fact want lethal robots, though not necessarily giant or even anthropomorphic, and is conducting research in the field.
But Russia also recognizes the implications of creating machines authorized to kill humans, judging by the country's participation in the first United Nations conference on killer robots held in Geneva this week.
"Governments are acknowledging this is a concern," Mary Wareham of Human Rights Watch told The Moscow Times by telephone from Switzerland earlier this week.
The meeting of UN Convention on Certain Conventional Weapons experts on the subject of lethal autonomous weapons systems runs Tuesday through Friday. The event brings together 30 nations, including military powerhouses Britain, France, Germany, India and the U.S. — though, notably, not China.
The whole thing, admittedly, sounds a bit like a sci-fi convention. After all, thanks to Arnold Schwarzenegger and Robert Patrick, everyone knows killer robots look like humans and come in just two varieties, solid and semi-liquid — unless they are giant "mecha machines" from adolescent anime dreams.
Fewer people know, however, about the Samsung SGR-A1 machine-gun-sporting battle towers that guard South Korea's border with its northern neighbor, or even the Predator drones, loaded with Hellfire or Stinger missiles, that the CIA has been using in Iraq and Afghanistan since 2001.
Russia has jumped into the fray as well, according to Rogozin, who is also deputy prime minister responsible for Russia's military-industrial complex.
Projects in development include a remote-controlled android with driving and shooting skills, he said in an interview with the government daily Rossiiskaya Gazeta in March.
Also on the list is a combat system capable of "delivering strikes on its own," he said, without elaborating.
This would be a direct violation of the First Law of Robotics posited by science fiction legend Isaac Asimov in 1942 — that a robot may never harm a human.
So far, no military technology exists that is capable of killing people without direct input by other people, according to experts in the field. But modern robotized weaponry may be edging toward crossing that line, warns the Campaign to Stop Killer Robots, the driving force behind the Geneva conference.
Whether Russia is really getting somewhere with its own battle robots remains open to question. One leading Russian analyst refused to comment on the matter or even have his name cited in connection with it, dismissing the story as nonsense. But Alexander Khramchikhin of the respected Institute for Political and Military Analysis in Moscow said fully autonomous weapons will likely define the warfare of the future.
Rogozin, who in 2012 lobbied for the creation of the Foundation for Advanced Research — known as the "Russian DARPA" after the Pentagon's own cutting-edge research agency — could not be reached for comment for this article.
Russia began working on battle robots as early as the 1930s, fielding at least two battalions of remotely controlled "tele-tanks" at the start of the Nazi invasion in 1941, though those did not survive for long in the bloody mess of the blitzkrieg.
Military officials have said that Moscow is developing its own drones — technology on which it lags significantly behind the U.S. — and an unmanned ground vehicle to guard its nuclear missile silos.
And it is definitely not alone: Most countries aspiring to a state-of-the-art military, from France and China to Israel and the U.S., have dabbled in robotics-based military technologies.
Khramchikhin refused to predict a timeframe for the creation of fully autonomous battle robots, but said Russian prototypes may already be rolling around the Russian DARPA's closed testing grounds.
In 2012, the global forecast was 20 to 30 years before the first killer robots, said Human Rights Watch's Mary Wareham, "But now we think it may be earlier."
There is, however, still time for a pre-emptive strike on lethal machines, according to the Campaign to Stop Killer Robots.
The group, co-founded in 2013 by Human Rights Watch, Amnesty International and the Nobel Women's Initiative, among others, campaigns for a global ban on fully autonomous weapons similar to bans on chemical or biological arms.
For now, most countries across the globe have no clear-cut policy on autonomous battle systems — as if they have never heard of Asimov.
The conference in Geneva is the first step toward a coordinated international policy on the matter, Wareham said. The next round of international talks is set for November. But a global deal is still years away, she added, and "a long hard road" will have to be traveled.
She has reason to hope: Even Putin, who eventually yielded to the flood of questions and commented on battle robots in 2006, said their use "is impossible without human agency." Though that, of course, was eight years ago.