
WILPF Statement to the 2016 CCW meeting of experts on lethal autonomous weapon systems

This statement was delivered by Ray Acheson, Director of Reaching Critical Will at the 2016 CCW meeting of experts on lethal autonomous weapon systems in Geneva on 12 April 2016. The PDF is available for download.

Writing in the mid-20th century, the philosopher Simone Weil called for the examination of technology and means of warfare rather than just the ends pursued by war. She argued that to understand the consequences of war, we need to analyse the social relations that are implied by our instruments of violence. 

With these discussions at the UN on lethal autonomous weapon systems, we have the opportunity to ask questions about this technology before it is fully developed or deployed. We know that research and development is ongoing. We have seen some precursors tested and used. But we still have the chance to interrogate—and prevent—the consequences of fully mechanising our means of violence and war.

Two aspects of Weil’s approach are relevant for our discussions about autonomous weapons: First, the tools we use to commit violence can determine the form of that violence. Some weapons facilitate certain attacks that might otherwise be impossible. Second, the choice of what we use to kill each other, how we kill each other, has meaning in itself. These choices are affected by and affect our social relations.

We can look to armed drones for examples of how autonomous weapons might determine the form of violence and war. Weapons deployed and operated from a foreign territory keep their operators out of harm’s way. They present a perception of “low risk” and “low cost” to the state deploying the weapon. Weapons that do not require any meaningful human control would only accelerate this seemingly limitless expansion of the battlefield, opening up ever more scope to deploy weapons in situations, and for tasks, that might otherwise not be considered possible.

The use of such weapons also risks the targeting, assassination, death, and injury—without due process or feasible precautions—of people that might not otherwise be subject to such violence. An autonomous weapon, using algorithms and software to determine and engage targets, reduces people to objects, which is an affront to human dignity.

There is something especially cynical and abhorrent in the idea of human beings assigning the act of killing other human beings to a technological creation. Giving machines the power to target and kill human beings crosses a moral line. It cheapens human life. The principle of humanity requires deliberative moral reasoning, a fundamental human capacity, over each individual attack decision. The taking of life requires human accountability, determined by morality and law. Without that, we shirk our responsibilities and betray our common humanity.

There are other ways autonomous weapons affect our social relations. They undermine equality and justice between countries and people. The features that might make autonomous weapons “attractive” to higher income, technologically advanced countries looking to preserve the lives of their soldiers push the burden of risk onto the rest of the world. As with armed drones, the deployment of autonomous weapons would not likely, in any near future, result in an epic battle of robots, where machines fight machines. They would be unleashed upon populations that might not have defenses against them, that might not be able to detect their imminent attack, and that might have no equivalent means with which to fight back. They would be weapons of power, dominance, inequality, and othering.

Autonomous weapons also have several gendered dimensions. With autonomous weapons, algorithms would replace violent masculinities to create some kind of perfect killing machine. Turning men and women into warfighters has tended to require breaking down their sense of ethics and morals and building up a violent masculinity that is lacking in empathy and glorifies strength as violence and physical domination over others portrayed as weaker. Autonomous weapons would be the pinnacle of a fighting force stripped of the empathy, conscience, or emotion that might hold a human soldier back. While armed drones are on this trajectory towards mechanised violence, autonomous weapons would complete the separation of the body and mind from the battlefield and the process of dehumanisation of warfare.

WILPF, a partner organisation of the Campaign to Stop Killer Robots, urges the negotiation of a ban on lethal autonomous weapon systems as an imperative to ensure the retention of meaningful human control over all targeting and attack decisions. At the CCW Review Conference later this year, high contracting parties to the CCW should establish a Group of Governmental Experts to begin negotiation of a legally binding instrument prohibiting lethal autonomous weapon systems.
