CCW Report, Vol. 7, No. 3
Will the "insignificant states" please stand up
29 March 2019
On Wednesday morning, as delegates participating in the current round of UN talks on autonomous weapons discussed “possible options” for moving forward on this issue, a mounting swell of voices calling for a ban on these weapons could be heard loud and clear. Numerous diplomats and activists from around the world advocated for work to begin now on developing a treaty to prohibit or regulate autonomous weapon systems in order to ensure the retention of meaningful human control over the use of force, on ethical, legal, political, and technical grounds. Clearly anxious about this, those against a ban or other legally binding solutions attempted to rally the opposition. Quoting from the preamble of the Convention on Certain Conventional Weapons (CCW), Finland stressed the importance of the “militarily significant states” participating in discussions, current and future, on autonomous weapons. By the afternoon, those promoting non-binding declarations, or no action at all, held the floor.
This emphasis on “militarily significant states” is reminiscent of responses faced in the nuclear sphere for most of the last seventy years. It is exactly why it took so long to prohibit the worst weapon of mass destruction humankind has created (so far). The states possessing nuclear weapons, and their allies who perceived benefit from hiding behind their arsenals of terror, held an iron grip on what was considered credible and realistic in diplomatic and academic debate. During that time, nuclear weapons poisoned the earth, land, and water for countless generations in so many places around the globe. They incinerated human beings in Japan, twice, and have kept the world under constant threat of an atomic holocaust. When governments that are apparently considered to be “militarily insignificant states” objected to this state of affairs and worked together with activists and the Red Cross to prohibit nuclear weapons, they were ridiculed by other governments, described as “not really serious states” and “radical dreamers”; they were told that they did not have real security interests and that they were being “emotional”.
The idea that those governments that have the capacity to destroy the world in moments should have more of a say over international security is Kafka-level absurd. It is generally the same countries that consistently prioritise the economic interests of the military-industrial complex or the political interests of dominance-hungry governments over the protection of human beings. They invest more money in weapons and war than they do in the well-being of their own populations, sinking more economic and human capital into designing new ways to kill each other than into solving our common crises of climate change, environmental degradation, and poverty. It is shameful that the CCW, while also purporting to advance disarmament by “putting an end to the production, stockpiling and proliferation of such weapons,” enshrines the importance of “militarily significant states” in its text—and perpetuates it in practice.
Militarism vs international law
These are the same countries that at this meeting proclaim it is “premature” to develop any legal or other measures to prevent the development of weapons that would kill and destroy without human control. Countries like the United States, which is apparently more concerned about the possible “stigmatisation of technology” than with preventing an arms race of automated violence, and thus “cannot accept” any attempts to negotiate a prohibition on autonomous weapons. Russia, too, is against any legal or political measures, moratoriums, codes of conduct, or anything that could possibly constrain its pursuit of killer robots. It says that any justifications for a ban on autonomous weapons that use the moral foundations of international humanitarian law will lead to an artificial division of “good” and “bad” weapons—as if international humanitarian law is not meant to constrain “bad” actions by states and their agents during warfare. Morality, as a system of values and principles of conduct, seeks to prevent the worst possibilities of human behaviour. It is incredibly disturbing to hear representatives of “militarily significant states,” especially those with nuclear weapons, speak out not just against the development of certain weapons but against the frameworks that the human species has developed over time to guide our lives and our collective societies.
Some governments, like the United Kingdom, argued that a legal instrument prohibiting or regulating autonomous weapons “would not have any practical effect”. The UK delegation prefers “continued commitment” under existing national and international law, which it feels is sufficient to govern the development of robots that kill people. The “existing law is sufficient” assertion has been voiced repeatedly by the countries that oppose negotiating a prohibition on autonomous weapons—it is an easy line, one that was also deployed against the prohibitions on landmines and cluster bombs and other attempts to stop the indiscriminate killing of civilians. The Netherlands, which hosts US nuclear weapons on its soil, even claimed that rushing to political or legal agreements “can weaken or hollow out existing humanitarian protections,” apparently suggesting that a ban on killer robots could undermine existing international law.
The idea that a legal treaty would not be “practical, acceptable, and enforceable,” as these countries argue, should be concerning to anyone who cares about the resilience of international law or arms control. Rejecting the attempt to agree to new standards and rules around unprecedented weapons technology threatens the international community’s reliance on the rule of law to govern inter-state relations and conduct on the battlefield. Such arguments against new law can be read as antagonism towards international law as a legitimate constraint on the state’s monopoly on the means and methods of violence, which (not-so-ironically) undermines the existing international law that these same governments claim is sufficient to regulate state behaviour.
A new international legal instrument would not have any negative effects on states, said the Brazilian delegation, if all are adhering to international law already. Unless, it asked, we have moved to a scenario where “might makes right”?
Legality and morality
Preventing such a move is why the “unserious” or “militarily insignificant” states are taking a stand for both law and morality at this meeting. European countries such as Austria and to some extent Belgium, Ireland, and Switzerland, together with Latin American governments including Brazil, Chile, Costa Rica, Cuba, Ecuador, Mexico, Panama, and Peru, as well as others such as Algeria, Pakistan, South Africa, and the Non-Aligned Movement have spent the last few days reiterating their moral, ethical, political, legal, technical, and operational concerns with increasing autonomy in weapon systems. Their concerns about autonomous weapons demonstrate that they do not see any benefits from having sensors and software programmed to find, fix, and fire upon targets. They do not see any benefit from granting machines the ability to kill and destroy, or from distancing humans ever further from the act of committing violence against other humans.
“How much suffering could have been spared if the international diplomatic community had addressed the problems of landmines and cluster bombs sooner than we did?”, asked the delegation of Peru on Wednesday. Right now, we have the chance to prevent human suffering from the further automation of violence, and we must seize this moment. “Diplomacy should not be overtaken by realities on the ground,” cautioned the Austrian ambassador. “Doing nothing while these novel and unique weapons are gaining increasing levels of autonomy is not an option,” said Pakistan.
The message from all of these governments is that the time is right to negotiate a legal instrument on fully autonomous weapon systems. We have enough clarity on the core concepts and technologies to do so, and we have the urgency as well—without new law, the unfettered automation of violence will continue until it overtakes us. Already we can see the emergence of machine warfare and policing on the horizon, from Australia’s investments in trying to build “ethical” autonomous weapon systems to the United States’ “unapologetic” pursuit of “artificial intelligence-enabled weapons”. All of this provides the backdrop for the UN talks, at which these governments seek to stall progress to ensure their tech is built before the rest of the world can stop it. While a handful of governments may wish this for our future, the majority clearly do not. The military allies of the most “significant states” seem hesitant to step up to stop this, suggesting declarations, codes, and principles. So once again it is the “insignificant states” that need to stand up and take the lead for the sake of humanity, as they have done with incredible significance on other humanitarian disarmament initiatives.