CCW Report, Vol. 7, No. 6

This cannot be kicked down the road any further
21 August 2019

Ray Acheson | Reaching Critical Will of WILPF


On Tuesday morning, governments and activists gathered again in Geneva to resume formal discussions on autonomous weapon systems (AWS). The focus of discussions this week is to finalise a report of the group of governmental experts (GGE) to the Convention on Certain Conventional Weapons (CCW), which gives the group its mandate each year. Informal consultations were held the day before, the week before that, and in May and June, as part of the Chair’s attempt to make up for the days the GGE lost this year after Russia single-handedly cut the mandate from ten days to seven. The crux of the draft report, its conclusions and recommendations, will set up the future work of the GGE and thus, for the moment at least, the future direction of international action on autonomous weapons. In this respect, things look bleak.

The Chair tabled the latest version of the draft conclusions and recommendations on Monday night, upon which delegates commented on Tuesday. The draft recommendations currently suggest that thirty days of meetings will be held over the next two years—fifteen in 2020 and fifteen in 2021. During these meetings, the group is to deal with reaching common understandings of human control and examine legal, technological, and military aspects of autonomous weapons. At the end of 2021, they are to use these deliberations “to continue the clarification and development of normative and operational frameworks on emerging technologies in the area of lethal autonomous weapon systems.” 

If this sounds a bit vague to you, it’s because it purposefully is. The notion of “normative and operational frameworks” is intended to capture whatever delegates want it to. For some, such as the 28 states that have so far called for a ban on autonomous weapons, this could mean the negotiation of a new legally binding instrument. The US delegation is convinced that this is all that it means, and thus objects to this already intensely ambiguous formulation. But others who do not support a legal agreement have argued that a framework could just be the agreement of political guidelines or commitments on how autonomous weapon systems can and cannot be used, or something even weaker than that. The point is, the Chair has gone for “constructive ambiguity”—which as the Campaign to Stop Killer Robots points out, may aid diplomacy at the CCW “but will do little to quell growing public concerns over removing human control from the use of force or meet rising expectations that nations will take strong action on this serious challenge.”

Public concern and expectations are indeed rising. Tech workers in particular have been organising against the weaponisation and militarisation of their computer programmes and other technologies. Employees of big companies like Google, Amazon, and Microsoft have protested specific contracts with militaries and weapon companies; others have started their own tech firms and pledged never to contribute to the development of autonomous weapons. Public opinion polls show again and again that the majority of the world’s citizens are appalled by the idea of machines making life and death decisions. 

Yet public concerns and expectations stand in direct opposition to the positions of many governments at the CCW. First, there is the general lack of concern about setting up another discussion mandate for the next two years. This locks the GGE into continuing discussions it began six years ago, albeit over more days each year. But to what end? Without a guarantee of concrete action at the end of all these years of expert-level conversation, the CCW process is looking more and more like a master class in kicking the can down the road.

And then there is the attempt by a handful of states, including Australia, China, Israel, the Republic of Korea, Russia, the United States, and the United Kingdom, to water down what little progress might be possible if it were up to the majority of countries participating in these discussions. Russia in particular spent Tuesday aggressively undermining the Chair’s attempts to reach consensus on the draft report. The Russian delegation refused to participate in any of the informal consultations and then showed up with a litany of requested changes to the conclusions and recommendations that sought to walk back agreed language from previous outcomes. Russia objected to references to any international law other than international humanitarian law. Despite extensive debate and agreed language from previous GGE sessions, it tried to remove all references to human control, ethics, and morality, and it attempted to redirect the object of discussion to specific weapon systems rather than emerging technologies—even though the latter has been the framework of CCW discussions on autonomous weapons for years. But the other countries in this group of spoilers are not blameless in these efforts to prevent progress in the CCW. It seems very clear that they prioritise leaving options open for the development of weapons that can kill without human control over the ethical, moral, legal, political, technological, and operational dangers that these weapons pose to humanity, peace, and security.

Their collective nonchalance when it comes to the vast majority’s desire to ensure meaningful human control over weapon systems and the use of force is an affront to the diplomatic process. As activists following international disarmament discussions have warned for years, consensus in many of these forums has come to mean unanimity, giving every single government a veto over every single decision or document. This has paralysed action in the CCW and in the Conference on Disarmament, leaving the UN General Assembly or alternative ad hoc forums as the only legitimate spaces where progress is possible, and rendering many UN bodies increasingly irrelevant or obsolete.

Furthermore, the refusal of a handful of governments to permit international negotiations on limits to autonomy in weapon systems means that they put their quest for dominance through violence over the human lives and security interests of the rest of the world. As six years of CCW discussions and previous work in the Human Rights Council have shown time and again, machines cannot and must not be able to select and engage targets on their own. Chile and Austria doubled down on this message on Tuesday, emphasising that human judgements cannot be replaced by machines. As Peter Asaro of the International Committee of Robot Arms Control reiterated at a side event, machines cannot understand humans as humans, no matter how complex the target profile they may be programmed with. Furthermore, as activist organisations such as the Women’s International League for Peace and Freedom and Mines Action Canada have repeatedly argued, bias in the programming of such profiles will inevitably lead to human rights violations, setting human beings up for death on the basis of sex, race, ethnicity, or other discriminatory criteria.

The time for the CCW to prevent a future where machines determine who lives and who dies on the basis of software and sensors is now. Punting concrete work down the road for another two years, only to then face the same prospect of a handful of states refusing to allow the development of laws and regulations, is not responsible behaviour. It is not ethically, politically, or legally sound to allow a few countries to drag us into the dark abyss of autonomous violence. Governments need to step up to match the courage and organisation of tech workers and start taking concrete action against killer robots before it’s too late.
