
CCW Report, Vol. 12, No. 1

Red Flags of Automating Violence
12 March 2024


Ray Acheson | Reaching Critical Will, Women's International League for Peace and Freedom


Coming out of the first week of discussions for this year in the Group of Governmental Experts (GGE) on autonomous weapon systems, it feels like there is some good news and bad news. The good news is that delegates focused on specific text, drawn from a compilation of submissions from governments in response to guiding questions from the Chair of the GGE ahead of the session. These submissions are in turn based on more than a decade of discussions and proposals in the GGE. Ideally, the deliberations held last week could be a basis from which to begin negotiating an international legal instrument that prohibits and restricts autonomous weapon systems (AWS).

The bad news is that there are still a handful of states that, while participating in this process, clearly have no intention of ever allowing the formal negotiation of such an instrument in the CCW. This is not news, of course; the same states have been blocking meaningful progress at the GGE since its inception. They are mostly the same states carrying out a great deal of violence in the world with a variety of weapon systems already—and they are mostly the same states that have rejected bans on other weapons in the past, and that have ridiculed those working for change. During this session of the GGE, the Russian delegate expressed frustration with those pushing for an agreement with prohibition and restrictions on AWS, saying that these delegations are living in another universe. This is reminiscent of another Russian delegate’s remarks to the First Committee in 2013 about those pursuing a ban on nuclear weapons, whom he described as “radical dreamers” who have “shot off to some other planet or outer space.” 

Even though the positions of the states blocking a ban on autonomous weapons are well known, it’s useful to highlight a few of their comments from this round of talks that raise some red flags:

Russia’s representatives repeatedly suggested that there is nothing special about AWS, that they are no different from any other weapon. They argued that weapon reviews, mandated by Additional Protocol I of the Geneva Conventions, do not provide for the automatic prohibition of weapon systems that are not compatible with IHL, and that such reviews do not need to be made public or the information shared with anyone. Together, this suggests that the Russian government believes software, sensors, algorithms, and artificial intelligence (AI) pose no unique technological, political, humanitarian, or ethical challenges, and that the Russian state can build any weapon system it wants without justifying or explaining itself.

Israel’s representatives asserted that prohibitions on the use of certain weapons and other rules of international humanitarian law (IHL) should not be conflated with how weapons can be used in specific contexts. For example, they argued that “there is no basis in IHL to support the argument that objects meeting the definition of military objectives in accordance with IHL should as a matter of principle be treated differently in the context of employing LAWS.” Israel also argued that it is not possible to define a level of human interaction without which a weapon system would be unlawful per se. Together, this suggests that the Israeli government believes it can violate the laws of war when it wants to, a position borne out by its ongoing genocide of Palestinians, its overwhelming violations of IHL and international human rights law, and its commission of war crimes. It also suggests that Israel believes it can operate weapon systems without any human control if it decides this is in its self-interest.

The United States (US) delegation indicated that the US military is already operating fully autonomous weapon systems, which it believes are lawful. Despite Israel’s ongoing genocide of Palestinians and the supply of US weapons to that end, the US delegation claimed that many weapon systems that some in this forum might consider unlawful, or want to make unlawful through a new treaty, are currently being used “without legal controversy”. The US delegation also repeatedly argued that the problem is not any given weapon system as such, but its use, or rather its potential misuse. It further claimed that it is not necessarily possible to control the effects of any weapon, and that no weapon can distinguish between civilians and combatants. Together, all of this suggests that the US government believes that regardless of the violence it or its allies are committing in the world, and regardless of the weapons used to commit that violence, there are no controls over the development or use of these technologies that could, or should, constrain this violence. It also indicates agreement with Russia’s delegation that there is really nothing “special” about autonomous weapons.

This is a simplified accounting and analysis of the “red flag” comments made during this session of the GGE. But these comments provide key insights into the current tactics of these governments to delay or prevent the progressive development and codification of international law that could constrain the development and use of AWS. These comments also signal, as is clear from all three governments’ engagements in war and violence, that they do not see their actions as incompatible with their interpretation of existing international law. Whether it is Russia’s full-scale invasion of Ukraine and its violations of IHL and human rights; Israel’s occupation of and genocidal campaign against Palestine; or the US’ involvement in countless occupations and wars, both directly and indirectly, the governments in question appear to believe themselves above the law in each and every case.

Of course, plenty of other government delegations, international organisations, and civil society groups pushed back against these misinterpretations and violations of the law. The International Committee for Robot Arms Control (ICRAC), for example, did an excellent job of questioning the US delegation’s assertion that militaries cannot control the effects of a weapon—pointing out that control over weapon effects is a core component of the IHL prohibition on the use of indiscriminate weapons (see the report on Topic 2 for details)—and of the US attempts to erroneously qualify notions like bias and discrimination (see the report on Topic 3).

Many diplomats and organisations also refuted the idea that there is nothing special about AWS. The representative of Switzerland, for example, pointed out that the entire ten years of discussion within this GGE have been based on the understanding that the weapon systems under consideration are ones in which machines will be designed and developed to do something that has so far only been done by human beings: namely, making “decisions” about the life and death of human beings.

The argument put forth by Russia, Israel, and the United States that there is nothing special or unique about AWS is absurd. For example, the US delegation’s argument that no other weapons can distinguish between civilians and combatants, and that autonomous weapons therefore shouldn’t have to either, deliberately obscures the fact that no other weapon system is programmed to attack on its own. Why else would heavily militarised states bother weaponising technologies like AI and algorithms if not to add a new dimension to warfighting (or policing, as many of these weapons will end up being used by police forces as well)?

Weapons that are programmed to identify, select, and engage targets based on sensors and software are not the same as any other weapon that has previously been developed or deployed. The technologies that enable these types of weapons are only now being developed and put to use in other contexts; their weaponisation must be urgently addressed—preferably by preventing their weaponisation altogether.

As the Austrian delegation noted, the prohibition or restriction of these systems might not derive exclusively from existing IHL. But human rights and ethics must be brought to bear on these weapons, and completely new laws and norms might be necessary. The Convention on Certain Conventional Weapons (CCW) is designed to develop new law, not just apply existing law. If, after all this time, high contracting parties to the CCW cannot agree to prohibit and restrict autonomous weapons, then the majority of states must urgently take the issue up in a democratic forum that puts human life and dignity above war profiteering and geopolitical power.
