WILPF Statement to the 2015 CCW meeting of experts on autonomous weapon systems
In two weeks, the Women’s International League for Peace and Freedom (WILPF) will celebrate its 100th anniversary. For 100 years, our members have sought an end to war and promoted nonviolent solutions to conflict. For 100 years, we have worked to prevent the development of violent technologies and called instead for the development of norms, principles, practices, and institutions of peace and justice.
That is why we have come to this meeting on autonomous weapons here in Geneva. The world around us is embroiled in armed conflict. We are witness to the bombing and shelling of towns and cities, causing the deaths, injuries, and displacement of scores of civilians. We are witness to the rampant international arms trade and profiteering from weapons development and sale. We are witness to the use of armed drones in ways that violate international humanitarian law and international human rights law.
Against this backdrop, we are gravely concerned at the possibility of weapons that may operate without meaningful human control.
Human beings are fallible. We can be violent, we can break laws. But we have something that machines do not have, and likely cannot be programmed to have: moral reasoning. We can value human life—even if we sometimes don’t. Moral decision-making is not just about choosing what action to take, but about choosing the perspective from which to take it and the kind of world we want to live in.
The use of force has already become too disengaged from moral reasoning, effective oversight, or accountability. Developing and deploying autonomous weapon systems that operate without meaningful human control advances a trajectory that we should be reversing.
The laws of war and protection of human rights require human engagement. Under international humanitarian law and human rights law, the legality of an attack is context-dependent. It is generally assessed on a case-by-case basis. Questions of distinction and proportionality cannot be resolved through automated mechanisms, and in any case, the law requires that human commanders make such judgments. While some argue that advances in technology might be able to address these issues in the future, there is no way for technology to address the fact that what gives law meaning is free will.
Beyond the law, giving machines power to target and kill human beings crosses a moral line. It cheapens human life and undermines human dignity. The principles of humanity require deliberative moral reasoning, by humans, over each individual attack decision.
A preemptive ban on autonomous weapons is necessary to ensure the retention of meaningful human control over targeting and attack decisions. Existing international law is not strong or clear enough to prevent the development of autonomous weapons. We encourage delegations here at this meeting to consider how to ensure meaningful human control and to explore options for pursuing negotiations to prohibit autonomous weapons.
WILPF promotes non-military and nonviolent solutions to resolve disputes. But there is something especially abhorrent in the idea of human beings assigning the killing of other human beings to a technological creation. The taking of life requires human accountability for our actions, determined by morality and law. Without that, we shirk our responsibilities and betray our common humanity.