CCW Report, Vol. 10, No. 6
Potential convergence confronts persistent obstinacy at the GGE on autonomous weapons
25 July 2022
Ray Acheson | Women's International League for Peace and Freedom
The second formal session of the 2022 Group of Governmental Experts (GGE) on autonomous weapons (AWS) began today in Geneva. After Russia blocked the first session in March from operating in a formal mode, the GGE Chair convened three informal, virtual meetings in April and June to discuss the four proposals on the GGE’s table at that time. Ahead of this week’s meeting, delegations put forward four new working papers (China; Russia; jointly Finland, France, Germany, Netherlands, Norway, Spain, and Sweden; and jointly Argentina, Ecuador, Costa Rica, Nigeria, Panama, Philippines, Sierra Leone, and Uruguay). In addition, the Chair tabled a draft report of the GGE, which will be discussed and possibly adopted later this week.
Despite the growing convergence and common elements among many of these proposals, major obstacles remain to the consensus adoption of a report containing recommendations for the Convention on Certain Conventional Weapons (CCW). One of the key blockages is that the delegation of Russia is still refusing to accept that any new rules or regulations, let alone prohibitions, are necessary when it comes to AWS. Other states have supported this position in the past, though did not necessarily participate in today’s discussion; some others who do support action will only accept new measures if they are voluntary. This means that the GGE remains locked in by the lowest common denominator, which it must overcome to make any substantive progress this year.
The insufficiency of the law
During Monday morning’s session, the Russian delegation reiterated its belief that there is no basis under international humanitarian law (IHL), the Martens Clause, or the principles of humanity or human rights for considering AWS so unique as to require the “imposition of restrictive regimes”. Russia also rejects the supposition that IHL needs modernising or adapting to address AWS, and reiterated that the main focus of the GGE should be on the applicability of IHL to lethal AWS. To this end, Russia’s working paper focuses on this issue, as well as espousing the alleged benefits of the use of AWS.
The United Kingdom also emphasised that its proposal is clear that existing IHL is suitable for the regulation of new capabilities in weapon systems. It urged the GGE to increase its understanding of ways that weapon systems with autonomy can be developed and used responsibly, by creating a manual for the application of IHL and best practices.
In contrast, most other delegations support the development of new agreements. Several noted that the GGE has been in a discussion mode for nine years already. To retain any credibility, it must now turn to developing new commitments on AWS. While states differ over whether these should be legally binding or not, or focus on national measures versus international standards, most agree that the time for action is now.
The myths of autonomy’s “advantages”
Other states also raised concerns about Russia’s emphasis on “potential benefits” of AWS, which also appears in earlier statements by the governments actively developing these weapon systems. In one of Russia’s many interventions today, it said that its focus on perceived “advantages” of AWS is a counterpoint to the many “emotional” discussions about the potential risks of such weapons. This highly gendered language is meant to ridicule and belittle those who raise legitimate concerns about AWS. Fortunately, states participating in the GGE pushed back against this patriarchal characterisation of their positions.
Cuba pointed out that the CCW is not a place to look at alleged benefits of weapon systems or virtues of artificial intelligence (AI), but to prohibit and govern certain conventional weapons. Pakistan likewise argued that the CCW is about ending arms races, achieving disarmament, and codifying and developing relevant international law. The human cost and destabilising effects of AWS are too significant to dismiss, it argued, including in relation to lowering the threshold for the use of force, asymmetric warfare, extrajudicial killings, and more. Iraq agreed AWS would be detrimental to international stability and international law, including through arms races that could have repercussions for generations to come. Nigeria pointed out that with the use of AWS, civilians would be targeted based on algorithms, and the weapon systems may be capable of self-evolution, which would hinder human control of their actions.
“War in itself is a human error,” said Cuba. Technology does not and cannot make war better. Furthermore, talking about the advantages that certain technologies like AI could possibly bring to warfare is contradictory to the essence of an arms control and disarmament forum. Nuclear energy is not discussed as a “benefit” of nuclear weapons, it pointed out. Biological weapons are not seen as a benefit of biological systems. “There are no good weapons,” said Cuba, including for legitimate self-defence, because even then, when they are used disproportionately, states are not following principles of responsibility.
Interactive discussions about specific proposals
Throughout the day, some delegations made comments upon or asked clarifying questions about the various proposals currently on the GGE’s table.
Switzerland asked those involved in the US joint proposal to clarify how the principles and practices in the paper should be structured, suggesting that right now they “float in space”. Switzerland also noted that voluntary measures can be useful, but only as a supplement to and not a replacement for legally binding rules. The US delegation argued there is a history of progressive development of IHL through non-binding instruments, such as the Montreux Document. It urged delegations to focus on making progress within the limited time for the GGE, which it sees as lying with the two-track approach.
The vast majority of states participating in the GGE do support a two-track approach, in which some systems would be prohibited, or deemed already unlawful under IHL, and other systems would be regulated to ensure their compliance with IHL. In this context, the new working paper from Finland, France, Germany, the Netherlands, Norway, Spain, and Sweden suggests the GGE should seek consensus on this approach and commit to:
(1) outlaw fully autonomous lethal weapons systems operating completely outside human control and a responsible chain of command; and
(2) regulate other lethal weapons systems featuring autonomy in order to ensure compliance with the rules and principles of international humanitarian law, by preserving human responsibility and accountability, ensuring appropriate human control and implementing risk mitigation measures.
Despite this convergence on two tracks, however, differences remain over what would be prohibited versus regulated.
China’s working paper also distinguishes between “acceptable” and “unacceptable” weapon systems, but has a different approach than some of the other proposals in relation to determining what falls into each category. China’s paper suggests five unacceptable characteristics as a package: lethality, autonomy (absence of human intervention), impossibility of termination, indiscriminate killing, and evolution. Acceptable systems may operate with a high degree of autonomy but always remain under human control.
Canada raised some questions about China’s paper, including its emphasis on lethality, which could open the door to the production and use of non-lethal weapons. China insisted that the five features of unacceptable weapons cannot be singled out and must be taken together; therefore, it does not see a problem with having lethality as one of the characteristics. Russia’s proposal also focuses exclusively on lethal weapon systems, about which Cuba raised concerns. While Russia argued this reflects the mandate of the GGE, Venezuela pointed out that non-lethal weapons also need regulation, because it is not possible to eradicate the risk of indiscriminate use of weapons. It prefers that any agreement commit states to retain full human control over all weapon systems, including those with autonomy. As civil society groups have pointed out in the past, it is not the technology that determines whether a system is lethal or not, but how and where and when it is used.
Canada also made a few comments on the joint paper introduced by Germany. It pointed out that the language in paragraph 3 about de facto prohibition of weapons that cannot comply with IHL is not a commitment but an obligation under IHL. Therefore, the paragraph should say such weapons “must” not be developed or used, rather than “should”. Canada also said its understanding of human control is that retaining such control during the whole lifecycle of a weapon system doesn’t necessarily mean exercising real time human control over all systems in all cases.
Cuba also raised questions about the joint working paper introduced by Germany, in particular its lack of reference to state responsibility and exclusive focus on human accountability. Cuba said the same is true of Russia’s working paper, which Cuba noted is also confusing in its assertion that an AWS is a weapon operating without any human involvement, yet a human would still be accountable for the system’s actions. Cuba also pointed out that the focus on increasing compliance with IHL does not make sense in legal terms, because one is either in compliance or not.
France responded to Cuba’s questions about its joint proposal, noting that paragraph 2(c) proposes policies and measures to ensure the accountability of states alongside humans. Russia also argued that its paper does not evade state responsibility but argued that the responsibility is borne by the individual making the decision to use the AWS. Cuba, however, pointed out that a state is still responsible for what humans under its jurisdiction do, even if the state wanted them to do something different. It also noted that human control is distinct from state responsibility.
Many delegations highlighted the emerging convergence among state positions, which has grown through the past nine years of discussions. Stop Killer Robots noted that across the broad range of proposals currently on the table, there are clear and identifiable areas of common ground:
States have recognised that these discussions concern weapon systems with autonomous functionality in the selection and engagement of targets. There is now widespread acceptance of this formulation within the proposals, which provides a solid foundation for moving forward.
Throughout these proposals, there is clear recognition that the use of autonomous weapon systems entails significant risks across a wide range of areas.
States highlight serious concerns over the application of international humanitarian law. Many flag concerns around international human rights law and criminal law. There is recognition of the challenges to human dignity, ethical principles and the dictates of public conscience, as well as wider concerns on international peace and security.
Most states recognise that these risks are serious, and that a response is urgently needed.
As part of the response, many states recognise that certain autonomous weapons systems are unacceptable and must be prohibited.
These include the need for prohibitions on systems that cannot be sufficiently or meaningfully controlled and on systems operating outside a responsible chain of command. While there are important differences over how such prohibitions are formulated, the recognition of a need for prohibitions, and of the centrality of human control in determining acceptability, is well established across a wide range of proposals.
For the Stop Killer Robots campaign, a prohibition on systems that are designed to target humans is also necessary. We encourage states to further engage with the broad range of legal, ethical and humanitarian concerns around systems that are designed to target humans, and recommend an outright prohibition. Protecting human dignity is fundamental to an effective response to the challenges posed by autonomous weapons.
Regulations for human control
Virtually all states see the need for regulations across the broad scope of autonomous weapon systems that are not explicitly prohibited. Such measures to ensure human control include understanding weapon systems, limiting the types of target that a system can engage, and constraining the duration, geographical scope, and scale of operation. Measures such as these should be recognised as central to our response and should form part of a legally binding instrument, not simply national-level best practices, which would be subject to substantial disparities in national interpretation and implementation.
Taken together, the campaign argued, these elements provide a “shared recognition of the need for a framework to include both prohibitions and regulations over the development and use of autonomous weapons systems that have strong potential to form the basis of a legal response.”
Most states participating in the GGE have made it clear that a legally binding instrument is the best way to address the risks and challenges of AWS. This is important, as Chile and Mexico said, for advancing IHL implementation, but also to go beyond IHL to focus not just on the use of AWS but on the entire lifecycle of the weapon and to ensure the incorporation of ethics and human rights. A legal instrument is also necessary to avoid the fragmented approach of national measures, the two delegations noted.
Likewise, the Philippines, speaking on behalf of the states that have tabled a draft CCW protocol, noted that a legal instrument is the best way forward and that the GGE has enough elements to begin negotiations. It urged that this proposal be considered in coordination with the roadmap the states submitted earlier, which consists of three steps: recalling the level of understanding reached in the GGE; recognising common ground in the GGE report; and recommending the CCW deliver a mandate to take this work forward, which should require the negotiation of a legally binding instrument.
After nine years of talks, amid the rapid development of autonomous technologies, including in weapon systems, this is imperative. Agreeing to national voluntary measures, as some states want, or doing nothing at all, as others prefer, would only ensure that autonomous weapons become inevitable. Stopping the development of these weapons is the only way to secure our collective future.