
CCW Report, Vol. 9, No. 4

Editorial: Disrupting denialism
14 August 2021


Ray Acheson | Women's International League for Peace and Freedom


During the second week of its work, the Group of Governmental Experts (GGE) reviewed the Chair's paper containing draft elements on possible recommendations for a normative and operational framework (NOF) on autonomous weapon systems (AWS). The group's work petered out at the end: most of the final day's discussions were informal, unavailable via webcast, and seemingly consumed by bickering over process. Despite this, the week's discussions were useful in developing concrete positions on possible elements for the framework, which the GGE has been mandated to develop since 2019.

While the paper's title, "draft elements on possible recommendations," is far removed from the negotiation of an actual NOF, it was structured to provide helpful guidance on what an NOF could eventually look like. This gave participants an opportunity to discuss what elements would be needed in an agreement on AWS, and what more needs to be discussed to get there.

Of course, in keeping with their ongoing objections to doing any work on this portfolio outside of reiterating previously agreed language, a handful of states that are actively pursuing the development of AWS did not care for this approach. They would have preferred to work on a draft report to the CCW Review Conference that merely reiterated the eleven guiding principles adopted in 2019 and perhaps restated the agreement reached even earlier that international law applies to AWS.

Fortunately, they were not able to disrupt discussions on the Chair’s paper too much, and most delegations engaged thoughtfully in crafting recommended elements for an NOF. This offers an important lesson to the GGE moving forward: to get substantive work done, concerned states must forge ahead and not allow those that deny the risks and challenges posed by AWS to prevent the action that is urgently needed to protect humanity and international law.

Deny, defer, derail, disrupt

“Denialism strenuously demands a dialogue as a wedge to open ‘debate’ where there is none, as a tactic to derail and disrupt,” notes historian Sean Carleton. “That’s the strategy. Engagement, in varying ways, can be essential—but so too is non-engagement.”

The tactic of denialism has been on full display during the past two weeks of this GGE session, as it was in all the group's previous meetings. One recent example is the attempt to deny the implications of algorithmic bias for weapon systems.

A growing number of states acknowledge the risks of algorithmic bias and see the dangers this poses for human life and dignity if deployed in weapon systems. Therefore, the Chair included an element on bias in his paper, which acknowledges that “algorithm-based programming relies on data sets that can perpetuate or amplify social biases, including gender and racial bias, and thus have implications for compliance with international law.”

Argentina, Austria, Brazil, Chile, Ireland, Mexico, Palestine, Panama, Peru, Philippines, South Africa, Switzerland, Venezuela, the International Committee of the Red Cross (ICRC), and Campaign to Stop Killer Robots (CSKR) supported this provision.

In addition, last week Austria, Brazil, Chile, Costa Rica, El Salvador, Holy See, Ireland, Luxembourg, Mexico, New Zealand, Palestine, Panama, Peru, Philippines, Sierra Leone, Uruguay, the CSKR, and Red de Seguridad Humana para América Latina y el Caribe (SEHLAC) raised concerns about gender, racial, and other biases that can be perpetuated through algorithms and thus must be considered in any regulations of AWS.

The issue has been raised during side events and in written submissions to the GGE in the past. The Women's International League for Peace and Freedom has published papers offering feminist perspectives on AWS, including analysis of racial and gender biases that may be perpetuated, and turn lethal, when incorporated into weapon systems. The Campaign to Stop Killer Robots has published papers on gender and racism with respect to AWS and maintains web pages with additional resources on both. The United Nations Institute for Disarmament Research has published a report on algorithmic bias and weaponisation. Tech workers have long warned that applying such technologies to weapons will have grave impacts.

Beyond these resources directly related to AWS, there are scores of studies and media reports about racial and gender bias in technology that are relevant to our discussions in the GGE. Wanda Muñoz of SEHLAC helpfully compiled some of these into a Twitter thread for delegates. The Algorithmic Justice League is also a great place to start digging into the social harms of artificial intelligence.

As Panama pointed out, these harms are not abstract or theoretical. We know that algorithms, artificial intelligence, and machine-learning systems reproduce implicit bias—and can be programmed intentionally with discriminatory parameters. We have seen the harms this has caused in policing, migration “enforcement,” biometrics, and more. And as Palestine noted, while victims of data bias in other spheres may be able to seek justice for the harms against them, there will be no justice for someone who has been targeted with lethal force due to data bias.
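To make that mechanism concrete, here is a minimal sketch in Python. It is purely illustrative: the group names, rates, and frequency-count "model" are hypothetical constructions for demonstration, not drawn from any actual weapon system or dataset. It shows how a model trained on biased historical labels reproduces the bias it was fed, even when the true underlying rate is identical across groups.

```python
# Purely illustrative sketch: a model trained on biased historical labels
# reproduces that bias even though the true rate is identical across groups.
# All names and numbers here are hypothetical.
import random

random.seed(0)

TRUE_THREAT_RATE = 0.05                        # identical for both groups
LABEL_BIAS = {"group_a": 1.0, "group_b": 3.0}  # group_b over-flagged 3x

def make_training_data(n=10_000):
    """Historical records whose labels encode human bias, not ground truth."""
    data = []
    for _ in range(n):
        group = random.choice(["group_a", "group_b"])
        # Biased labelling: group_b is flagged three times as often as the
        # (identical) true threat rate would warrant.
        flagged = random.random() < TRUE_THREAT_RATE * LABEL_BIAS[group]
        data.append((group, flagged))
    return data

def train(data):
    """A minimal 'model': the per-group flag frequency learned from labels."""
    totals, flags = {}, {}
    for group, flagged in data:
        totals[group] = totals.get(group, 0) + 1
        flags[group] = flags.get(group, 0) + flagged
    return {g: flags[g] / totals[g] for g in totals}

model = train(make_training_data())
for group, rate in sorted(model.items()):
    print(f"{group}: predicted threat rate {rate:.1%}")
# Prints roughly 5% for group_a and 15% for group_b: the model faithfully
# reproduces the bias baked into its training data.
```

The point is structural: nothing in the training step is malicious, and the code contains no discriminatory rule. The skew in the historical labels alone is enough to make the learned behaviour discriminatory.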

There is thus already a wealth of information, real-world experience, and careful consideration of this issue. As Brazil, Chile, and Mexico warned, "After so much study and proof of the risks shown by ethicists, technological experts, philosophers, and other experts, maybe the George Orwell scenario … is for those who discard these risks and challenges that have been proven without a doubt."

Yet some delegations opposed the inclusion of bias in the paper, including India, Russia, and the United States. Their basic argument is that there is insufficient information on this issue and that it requires further discussion in the GGE; India also argued that if programmed "correctly," datasets can "remove any biases". Others, such as Australia, Finland, France, Japan, the Netherlands, Sweden, and the United Kingdom, were slightly more cautious, suggesting that there may be concerns about data bias in some circumstances but that more study is necessary. They urged changes to the Chair's paper to reflect their hedging on this issue.

This continued denial of a problem, and subsequent deferment of action on it, is classic denialist strategy. It's meant to buy time so that the states developing these weapons can forge ahead before they are constrained by international regulations and norms. The contradiction between insisting that "more information is needed" and continuing weapons development apace was not lost on those supporting urgent action. Palestine, for example, asked whether those waiting for "more studies" or "further discussion" will impose a moratorium on the development of autonomous weapons until new studies are available.

Denialism (and its contradictions) isn’t just related to the question of bias but to all the risks posed by AWS. This is reflected in one of the basic contradictory assertions by these governments: that AWS do not yet exist, so we can’t know if they have risks, thus they cannot be prohibited; but we know for sure that they have real, non-hypothetical benefits, so we should develop them.

This smoke-and-mirrors deflection is part of the denialist strategy, and it’s also employed to disrupt the process for taking action against AWS, such as through the negotiation of a legally binding instrument. These states deny the problem and the solution—and they even deny stronger mandates or new mechanisms to help advance discussions.

Even France expressed frustration with the contradictions of certain states, noting that the Russian delegation calls for more information and insists that agreement can't be reached on certain things because there hasn't been enough discussion, yet opposes establishing new mechanisms that could bring that very information into the GGE for discussion!

Yielding to the requirements of humanity

Which brings us back to the core issue: figuring out how to make progress despite the stalling tactics of AWS developers. Consensus, as described in last week's editorial, is a problem here. But, as delegates demonstrated this past week, the issue can be advanced even over the objections of the denialists. Still, there may be only so far states can get within the CCW, or any other consensus-based forum. Given that various nuclear-armed states have managed to block any progress in the Conference on Disarmament for more than twenty years, the extent to which a handful of states can derail processes and disrupt the collective good is very clear.

It's thus up to the majority to disrupt their ability to disrupt. This could involve taking the autonomous weapon issue outside of the CCW to a forum where states aren't given vetoes. Other weapon systems have been banned through the UN General Assembly and through stand-alone processes. Many governments see the CCW as the most appropriate forum, but what does that mean if the forum prevents the desired action from being taken? If we cannot start, and conclude, negotiations; if we cannot negotiate in good faith, meaning without the will of the majority being taken hostage by a handful of states; and if we cannot secure an agreement that protects humanity, how is the CCW "appropriate" for anything at all? If its high contracting parties prevent it from fulfilling its own mandate to prohibit and restrict weapons, the work to uphold its goals and objectives must be taken up elsewhere.

If the GGE can agree to a negotiating mandate at the CCW Review Conference in December, that is one thing. But simply extending the mandate, or establishing new committees or groups of experts, is insufficient. It’s time to disrupt the status quo, as those behind the Mine Ban Treaty, Convention on Cluster Munitions, and Treaty on the Prohibition of Nuclear Weapons did, by forging new international law and norms against weapons that cross moral, ethical, and legal lines. Even if those developing AWS reject these efforts, normative stigmatisation and prohibition will have an impact. Political, legal, ethical, moral, and technical lines can be drawn that impact industry guidelines, economic incentives, and international law alike.

This is important for all governments to consider as the GGE continues its work in September and December, and as the CCW Review Conference meets in December. As laid out in the Declaration of St. Petersburg of 1868 and reiterated by the ICRC this week, it is a requirement that states fix the limits at which "the necessities of war ought to yield to the requirements of humanity." It's past time for states to live up to this obligation and prohibit AWS through a legally binding instrument.
