CCW Report, Vol. 6, No. 5

15 April 2018

Allison Pytlak


Imagine if the catastrophically destructive power of a nuclear weapon had not been developed. Imagine the lives saved not only from the bombings in Japan but also in places poisoned by testing. Imagine if anti-personnel landmines had not been developed and used widely: the lives, limbs, and livelihoods of individuals that would be intact, as well as swathes of land that could be free for farming, building, or transportation routes. Imagine if a lot of things had not been created and were not now being used to smash cities, and everything they contain, into oblivion. 

We cannot turn back the clock and undo what has been done. We instead find ourselves labouring hard to identify ways to remedy the situation in the present, and prevent such things from reoccurring in future. 

With autonomous weapons, there is still a choice. Such weapon systems may be under development by some, but the genie is not yet out of the bottle. Discussions at the Group of Governmental Experts (GGE) meeting this week made it very clear that the concept of human control is firmly at the heart of the debate over what to do about killer robots. No state supports a weapon that operates entirely without human control, particularly in decisions about taking human life. Differences exist among states as to the extent and nature of human control, and the motivations of some are possibly not entirely altruistic. Yet the clear take-away message is that maintaining such control is the right thing to do, for ethical, moral, and legal reasons that have been enumerated by so many voices in government, academia, the military, science, and elsewhere, and were repeated at this GGE meeting.

A prohibition on autonomous weapon systems, which is gaining support from a steadily growing group of countries participating in the GGE, is the best response to that concern. It would be clear, preventive, and forward-looking. Other options that have been presented have merit but do not go far enough. They cannot be as effective, binding, or lasting. This is not to say that negotiating such a prohibition would be easy, but nothing worthwhile ever is. In the context of the Convention on Certain Conventional Weapons (CCW) there is the precedent of the ban on blinding laser weapons to learn from, as some states pointed out.

During the week I spoke on a side event panel about some of the lessons learned from various multilateral discussions on cyber conflict. The two issues are distinct, although similar in that they are both about intangible things and hypothetical scenarios, and both involve dual-use items in which the intent and purpose of use make a significant difference. The international community has made some progress in articulating norms for behaviour in cyberspace, but it is widely agreed that the gap between law and digital capability is growing exponentially. It may never close. There is a very real risk that the same will happen with autonomous weapons if decisive action is not taken quickly. To stay ahead of the curve, the opportunity of this GGE and its next meeting must be maximised in order to set the course for a swift policy response at the end of this year.

We can, of course, meet again in five or ten years to discuss how to react to the use of autonomous weapon systems, their proliferation, and their inevitable “misuse”, as we do with virtually all other weapon types. Wouldn’t it be a welcome change, however, to get it right this time and not have that conversation? This is a unique opportunity to learn from past mistakes and do better in future.
