Ban Killer Robots Before They’re a Problem, Say Human Rights Activists

"Autonomous killing machines? What a great idea!" --some mad scientist in a government lab


If we’ve learned anything from a hundred years of science fiction, it’s that handing over a) guns and b) any serious amount of authority to robots is not going to end well. However, the Pentagon doesn’t make R&D decisions based on Battlestar Galactica. Military drones are still controlled by humans, but for how long? Lest arms makers get any ideas, Human Rights Watch has just published a lengthy report titled “Losing Humanity: The Case Against Killer Robots.”


If “killer robots” makes you think “Cylons,” well, you’re pretty much on the same dystopian page as Human Rights Watch and its partners at Harvard’s International Human Rights Clinic. The organizations are concerned about the prospect of fully autonomous weapons, which would have the ability to pick and choose their own targets (based on their masters’ aims, of course). Unfortunately, they wouldn’t have the built-in moral calculus that says to err on the side of not killing civilians.

From the report’s announcement:

“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, Arms Division director at Human Rights Watch. “Human control of robotic warfare is essential to minimizing civilian deaths and injuries.”

Now, bear in mind that these weapons don’t actually exist yet and likely won’t for another two or three decades. But seriously: killer robots.