
CCW Report, Vol. 9, No. 7

Editorial: Beyond balance and binaries
4 October 2021


Ray Acheson | Women's International League for Peace and Freedom


During the most recent installment of UN talks on autonomous weapon systems (AWS), it became increasingly difficult not to feel that the fate of our collective future is at stake in these discussions. Not just in relation to how governments ultimately decide to deal with AWS, but in a much larger sense. The conversations happening in this Group of Governmental Experts (GGE) provide important insights into the approach that militarised governments take not just to weapons, but to the world itself.

Orwell at the GGE: The claims of “weapons for peace” and “automation for protection”

As participants reviewed the Chair’s revised paper—which contains what might be turned into recommendations that the GGE submits to the CCW Review Conference in December—an alarming pattern developed. A small number of states called for removing references to human rights and international human rights law; to human dignity; to ethical considerations; to algorithmic bias; to the word “obligations;” and to anything they perceived as a “new commitment”. They not only rejected anything they felt could constrain their development and use of AWS, but they also rejected the very idea that ethics, morality, rights, dignity, or bias have anything to do with programming machines to surveil, attack, and kill human beings.

The developers of AWS refuse to engage with these issues, rejecting out of hand the risks and challenges raised by those trying to prevent the automation of violence. When confronted with text in the Chair’s paper about ethics or bias, for example, they assert that there hasn’t been enough consideration of the issue so they can’t accept language on it; and simultaneously argue that these issues are irrelevant and there’s no point in continuing to have conversations about them.

After imposing this catch-22 to prevent sincere deliberation over the likely harms of AWS, they then proceed to go on at length about the imagined benefits of these weapons. Russia, for example, argued that a high level of automation and use of algorithms in weapons will be “more reliable than spontaneous, uncontrolled actions of the human mind.” It rattled off a litany of ways in which machines will “improve” warfighting through increased accuracy and efficiency and lack of human emotion. Russia is not alone in touting these supposed benefits; Australia, India, Israel, Japan, Republic of Korea, Turkey, United Kingdom (UK), and United States (US) are also among those extolling the virtues of AWS.

Each of these states, which are already engaged in developing and in some cases using increasingly autonomous technologies in weapon systems, also asserts that any outcome of the GGE must “strike a balance” between “military necessity” and humanitarian concerns. The US even went so far as to argue that there may at times be a convergence of military and humanitarian interests, and that the GGE’s outcome should reflect this. But as Ireland pointed out, with the support of several other delegations, there is no universal definition of military necessity. It encouraged the GGE to take guidance from the Declaration of Saint Petersburg, which states that the necessities of war should yield to the requirements of humanity, not that they should be balanced or that they can “converge”.

This idea that war and humanitarianism can converge is indicative of the mythologies within which the militarised states live. It’s a worldview that says violence and war are good for humanity. Listening to their interventions, one can almost see the advertisements for these new tools of violence and oppression: Killer Robots Will Save Lives! Automating Violence Will Make Us Safer!

As Cuba noted, the GGE is not a forum for the promotion of new weapons. It is meeting under the Convention on Certain Conventional Weapons (CCW), which has a mandate to prohibit and restrict weapons, not champion them. Likewise, as many delegations have reiterated throughout many GGE sessions, the CCW makes it clear that “the right of the parties to an armed conflict to choose methods or means of warfare is not unlimited,” that it seeks to “continue the codification and progressive development of the rules of international law applicable in armed conflict,” and that one of its key objectives is to end the arms race and facilitate disarmament. This is the framework within which the GGE is meeting to develop recommendations for a normative and operational framework on AWS. This is not an arms fair.

The logics of oppression and power

The pro-AWS branding exercise seems to be more about overriding the concerns of others than an accurate account of the motivations for developing these systems. That is, the arguments put forward by AWS developers are couched in terms of protecting civilians, minimising harm, and complying with international humanitarian law. But these are not the motivations behind the development of these technologies. Fighting war faster; risking fewer human soldiers; sorting, tracking, and killing people more “efficiently”—these are the real motivations behind AWS.

As noted in a previous editorial, if states really wanted to save lives, they would invest in mitigating climate chaos, reducing poverty and inequality, fostering education, housing, and food security. They should be spending their money, time, and ingenuity on pretty much anything else other than weapons. Weapons do not save lives. Weapons are designed to take lives, to destroy infrastructure, to repress and control.

The dominant narrative of the militarised governments of the world is that weapons are for security. They have perpetuated this myth for so long that entire economies have been built around it, and international relations is governed by it. Yet human history has shown this to be false—in fact, the opposite of reality. One needs to look no further than the recent conflagration in Afghanistan to see where militarism and violence lead. And to pull back the curtain on the magical thinking about the benefits of AWS, one needs to look no further than the last two decades of drone warfare. Machine-based remote warfare has already led to thousands of civilian casualties, erroneous targeting, the rise of military operations outside of war, extrajudicial killings, psychological harm, destruction of schools, hospitals, and markets, and so on. One also needs to look no further than the deployment of artificial intelligence (AI) and algorithmic technologies by police or immigration officials, which has already resulted in the wrongful identification, harassment, incarceration, or deportation of people.

“One cannot disentangle tech production and deployment from racial and carceral logics,” noted Dr. Matt Mahmoudi of Amnesty International. In a statement to the GGE, he pointed out that component technologies that some AWS will depend on are already facing prohibitions, with the European Union banning remote biometrics and the UN High Commissioner for Human Rights speaking out against biometric mass surveillance. A growing number of delegations have already articulated their concerns with the perpetuation and amplification of social biases, such as gender and racial bias, through the development and use of AWS. As noted in a previous editorial, these harms are not abstract or theoretical—they are well documented and widely understood.

Yet the AWS developers either reject that bias is a problem or assert there has been “insufficient study”. In reality, however, it’s more likely that bias is desired by those who want to develop AWS. As the UK explained, bias is part of the operational parameters that will need to be programmed into an AWS. States want to be able to program bias into machines; it’s only “unintentional” bias that might raise a concern, the UK noted. The UK likely didn’t mean to suggest that it wants to program a machine to attack people of a certain race or sex (or at least, it probably didn’t mean to say it out loud). But that is precisely one of the concerns raised by AWS: that they will be capable of being programmed to target and attack people based on race, gender, age, ethnicity, disability, or any other physical or social marker that has led to them being deemed a “threat” to a state.

Centring lived reality

Weaponising technologies with autonomous features will not bring security or save lives. But it will offer yet another tool of power and control. This reality is at the core of what needs contesting at this GGE and elsewhere, and many governments know it. “We are acutely aware of our histories, often tainted by the bitter legacies of colonialism, the scars of war, and marked inter alia by a pattern of deployments and testing of experimental weapons technologies against our populations,” noted Palestine. “We cannot but extrapolate that, in all probability, the Global South is where autonomous weapons systems will be initially tested and used by the developers of these systems.”

This lived reality should be at the core of any work on AWS. Instead of building a future where machines fight wars and enforce “order” in societies, governments should be offering reparations for past harms and building structures of care and equality. They should be learning from history about what does and does not lead to peace and well-being. They should be engaging cooperatively to prevent conflict, not wage it more “efficiently”.

Disarmament is the best defence

Many delegations have noted throughout the GGE’s work that autonomy is a spectrum and that binary characterisations of AWS are not helpful. This is not a question of fully versus partially autonomous; this is not about military necessity versus humanitarian concern. The work of this GGE is—should be—about safeguarding the future of humanity in all its complexity. Turning human beings into 1s and 0s in a line of code in a machine does not safeguard human lives, or rights, or dignity.

The binary approach to the world has led militarised countries into a narrative of “might is right,” “good guys and bad guys,” and “dominance or submission”. This binarism renders invisible all the other ways of living in the world, in all their complexities and nuances. It precludes collaboration and cooperation, and it prevents peace and solidarity. Because of their military power, these countries feel they have the right to control the discussion and its outcomes. They try to block the creation of any agreements that do not suit their perceived interests, and deny the legitimacy or validity of any agreements produced without them. But these states are not the only ones that matter. They do not exist in a vacuum, merely competing with, constraining, or killing each other. They exist in a much bigger world, in which other approaches exist.

“We are aware that our absence in these fora would lead to the framing of disarmament norms to reflect mainly the views of powerful highly-militarised States, often to the detriment of our priorities and efforts to protect our civilian populations and the most vulnerable from further harm,” noted Palestine. But countries of the global south “are not merely passive recipients of norms determined by other States but instead pride ourselves with being active norm-makers ourselves.”

Such norms have in the past included the prohibition of weapons that cause harm and suffering. And if the growing momentum in the GGE is any indication, these countries will lead the development of norms against AWS. As Panama said, while the militarised countries focus on building more sophisticated arsenals, the best defence is disarmament.
