CCW Report, Vol. 11, No. 1

Report on the GGE Virtual Informal Consultation
27 February 2023

Laura Varella | Reaching Critical Will, Women's International League for Peace and Freedom


On 20 February 2023, states joined the virtual informal consultation of the CCW Group of Governmental Experts (GGE) on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (LAWS). The consultation was convened by the Chair of the Group, Ambassador Flávio S. Damico from Brazil, as a way of providing states with the opportunity to express their views and expectations for the work of the GGE this year.

The Chair announced his intention to structure the first session of the GGE, which will take place on 6–10 March, around three elements. First, delegations would present their proposals for the work of the Group, with time not only to present new proposals but also to recall proposals made last year. After this first round, delegations would engage in a “horizontal discussion,” examining how the proposals relate to each other. Following this, delegations would look at common themes, with the opportunity to comment on issues they believe are relevant. The Chair expressed his wish for a more effective discussion, bearing in mind the mandate of the GGE to intensify deliberations this year.

Mexico, Pakistan, the United Kingdom (UK), and Uruguay expressed support for the Chair in conducting the work of the Group this year. So did Aotearoa-New Zealand, the Philippines, and Switzerland, which agreed with the Chair that the GGE needs to intensify its work in 2023. Algeria called for progress and highlighted the sense of urgency, while Uruguay expressed hope that the GGE will adopt more than a mere procedural report this year.

A repeated call among delegations was for states to focus discussions on finding common ground between the different proposals. The UK, the United States (US), and Japan noted that there is a lot of overlap among the proposals currently on the table. Similarly, Mexico, Switzerland, and the Philippines encouraged delegations to look for commonalities among the proposals. Some delegations, including Japan and India, stated the need for states to focus on substance, rather than on form.

This last point was also raised by Russia, which called on delegations to build bridges and try to find compromise. Russia also expressed hope that the GGE would engage in substantive discussions and use its full potential. However, it is worth recalling that substantive discussions were prevented last year precisely by Russia’s insistence that negotiations during the first session of the Group happen informally. Additionally, Russia’s call for states to “build bridges” contradicts its own behaviour in the GGE, as a country that has repeatedly prevented compromise by using the rule of consensus as a tool to impose its will.

Like Russia, the US also stressed the need to find commonalities among different positions and expressed hope for a substantive outcome. It suggested structuring the discussions around four main themes: characterisation; prohibitions; regulations; and state responsibility/accountability. By separating discussion of AWS that should be prohibited from that of AWS that should be regulated, the US argued, this structure would take into account the two-tiered approach suggested by several states. Surprisingly, the US made no comments regarding its framework for a “Political Declaration on the Responsible Military Use of Artificial Intelligence and Autonomy,” announced only a few days earlier, which contrasts with the position it presented here in that it does not recognise that some AWS should be prohibited.

In line with those delegations that encouraged finding common ground among states' positions, Stop Killer Robots (SKR) highlighted some points of convergence. For example, it said that states have made significant progress in characterising AWS and recognising the risks that they pose. SKR also noted that most delegations agree that certain weapons must be prohibited, although some elements of meaningful human control, such as predictability, understandability, explainability, reliability, and traceability, still need to be further elaborated. SKR stressed that the prohibition on targeting human beings has also been insufficiently incorporated into proposals. Finally, recalling the warning of Professor Stuart Russell that AWS can be systems of mass destruction, SKR reiterated that mere principles and practices will not suffice, and that legally binding rules are the only possible response to the threats posed by AWS.
