CCW Report, Vol. 13, No. 4
Make It or Break It
11 September 2025
Laura Varella | Reaching Critical Will
On 1–5 September 2025, delegations met in Geneva for another session of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS). The GGE organised its discussion around a rolling text put forward by the Chair, Mr. Robert in den Bosch of the Netherlands. The text is divided into boxes, or sections, focusing on different elements of a potential instrument on LAWS. As debates progressed last week, the Chair proposed a revised version of Box I, which was discussed by delegations on Thursday. On the following day, Friday, delegations discussed a new wording suggestion elaborated by the International Committee of the Red Cross (ICRC) on the critical functions of LAWS.
Amidst technical debates on the characterisation of LAWS and on the meaning of the term “context-appropriate human judgment and control,” the meeting had one of its highlights on Friday, when a group of 42 states delivered a statement affirming they are ready to “move ahead towards negotiations within the CCW on the basis of the rolling text.” As said by Stop Killer Robots, this is a real step forward, and states should take this commitment seriously and move to launch negotiations on a legally binding instrument on autonomous weapon systems as soon as possible.
The controversy around the term “identify”
One of the most debated issues of the week was the critical functions that should be included in the characterisation of LAWS. In the rolling text proposed by the Chair, LAWS were characterised as “an integrated combination of one or more weapons and technological components, that can select and engage a target, without intervention by a human user in the execution of these tasks.” While several delegations supported the description of LAWS with only “select and engage” as critical functions, a few others insisted that “identify”—which was present in earlier drafts—should also be considered a critical function.
However, as delegations started to discuss whether “identify” should be put back into the rolling text, it soon became clear that states attributed different meanings to this term. As explained by the Asia Pacific Institute for Law and Security and the University of Utrecht, for some, “identification” as a function seemed to mean the process of developing target profiles. For others, identification seemed to mean the process of recognising persons or objects in the operational environment that match a target profile. They emphasised that if the word “identify” was included, and if it was read in the first sense—the process of developing target profiles—this would radically narrow the application of the elements that the Group has been drafting. The ICRC had the same concern, arguing, “If read this way, it could require all LAWS to have the ability to pre-set their own target profile, which would exclude many autonomous weapons systems that do not have such capability.”
A solution was put forward by the Chair on the last day, with language proposed by the ICRC. The draft proposal included the term “identify” in the rolling text as a critical function but explained the meaning of the term in a subparagraph. While some delegations responded positively to this attempt to find a compromise, others still expressed concern and indicated a need for further discussion (see the report “Revised Box I with Additional Drafting Suggestions” in this edition for details).
France rightly pointed out that the exercise of trying to define “identify” risked falling into the same failed attempt at defining “lethality”. “The characterisation should remain wide reaching,” emphasised France. Several delegations made similar remarks and stressed that just because a system is included in the characterisation, it does not mean it is prohibited; that would be dealt with in the “prohibitions section” of the rolling text (Box III). Nevertheless, it seems that the discussion is not yet over and that states will have to keep working to find convergence on this issue.
The decade-long controversy around the term “lethality”
For the last decade, the GGE has been walking in circles when it comes to the issue of lethality in the characterisation of autonomous weapon systems—the ongoing LAWS versus AWS debate. For some delegations, including China, India, Israel, Japan, the Republic of Korea, Russia, Ukraine, and the United States (US), the word “lethal” needs to be maintained in the phrase “lethal autonomous weapon systems,” as this was established in the mandate of the GGE. However, for many other delegations, including Aotearoa New Zealand, Brazil, Bulgaria, El Salvador, Ireland, Luxembourg, Mexico, Norway, Panama, Peru, the Philippines, Sweden, and Switzerland, among others, the word “lethal” is not needed for several reasons: the existing international humanitarian law (IHL) framework makes no distinction between weapons based on their lethality; lethality is an effect rather than a characteristic of a weapon system; and autonomous weapons can also cause damage or destruction to objects or cause injury or superfluous suffering.
To try to find convergence on this issue, the Chair included the word “lethal” qualifying “autonomous weapons systems” in the rolling text, but included a definition of what lethality would mean in a subsequent paragraph, noting, “Lethal refers to the capacity to cause death to a person. The fact that a LAWS can be used in circumstances that do not result in death, such as to damage or destroy objects or to cause injury, does not exclude it from this characterization.” As the week progressed and new updates were made in the rolling text, the definition changed to “The fact that a LAWS can be used in a way that does not result in loss of life, does not exclude it from this characterization.” Yet, despite the Chair’s best efforts, critical divergences remain.
Many delegations have already stated that a working definition would be enough for the GGE’s work; further details about the characterisation could be decided during negotiations. They also pointed out that the characterisation should be broad, which would not mean that all AWS would be prohibited; that would be decided in the prohibitions section (Box III). If states are still worried about certain weapons being inadvertently included in the scope of the treaty, this could also be clarified in the exclusion clause (Paragraph III of Box I). While some who defend the word “lethal” might be trying to engage in the discussion from a legitimate technical perspective, one cannot but wonder if others’ motivation is to extend debates indefinitely to prevent the start of negotiations. Or even to have an instrument that allows for a restrictive interpretation that would only cover a very limited type of AWS.
This leaves states in the room to decide whether they want a strong instrument, or if they want consensus. Austria warned that if states are “adopting something just for the sake of achieving consensus and it is a low-quality definition, I think we would not do anyone any favour. We would actually damage the discussions on this topic for decades or even centuries.” Meanwhile, the US argued that states need both quality and consensus, and that it is confident the Group could achieve both.
It remains to be seen if having an instrument that covers only some types of autonomous weapons, leaving others completely unregulated, will be something that delegations are willing to risk. In the case of anti-personnel mines, cluster munitions, and nuclear weapons, states decided it was not.
When we think about these debates, it is important to challenge the underlying notion behind arguments to keep the word “lethal”. The states pushing for this are the same ones that claim AWS are necessary for security, and that prohibiting or regulating them might put their populations in danger. A quick look at those supportive of the term “lethal” reveals that they are the most heavily militarised states and their allies, deeply immersed in an artificial intelligence (AI) arms race. Instead of assuming these weapons are necessary for their security, perhaps they should ask: in a world without effective regulation, isn’t it likely that they will proliferate? That many states and non-state actors will have access to them? That civilians and communities worldwide will bear the brunt of these technologies, as they are already suffering from many other weapons? Without a strong prohibition, rapid developments in AI and automation already indicate that the weaponisation of these technologies will lead to harm and destruction, and violations of international law. States are able to put a stop to it now, but will they have the same chance ten years from now? As nuclear weapons illustrate, it seems that the longer it takes for states to prohibit these weapons, the more entrenched they will become in military doctrines and political economies.
The road to negotiations
But it is not time for despair yet, as several states remain committed to agreeing on a treaty prohibiting AWS. On the last day of the meeting, a group of 42 states delivered a joint statement affirming that “while the rolling text remains a work in progress, it is a sufficient basis to fulfil the mandate of this GGE in its current form” and that they “are therefore ready to move ahead towards negotiations within the CCW on the basis of the rolling text.” The statement, which was co-drafted by Brazil, Ireland, Norway, and Switzerland, was endorsed by a total of 39 high contracting parties to the CCW and three observers: Austria, Belgium, Brazil, Bulgaria, Chile, Colombia, Costa Rica, Denmark, Dominican Republic, Ecuador, El Salvador, Finland, France, Germany, Guatemala, Iceland, Ireland, Italy, Kazakhstan, Kiribati (observer), Lesotho, Luxembourg, Malawi, Mexico, Montenegro, Nauru, New Zealand, North Macedonia, Norway, Pakistan, Palestine, Panama, Peru, Portugal, Samoa (observer), Sierra Leone, Slovenia, Spain, Sweden, Switzerland, Thailand (observer), and Uruguay.
Stop Killer Robots welcomed the joint statement, in particular that these states have collectively concluded they are “ready to move ahead towards negotiations.” The mandate of the GGE still stands, with two more sessions scheduled to take place ahead of the CCW Review Conference in 2026. Nevertheless, the joint statement demonstrates that there is wide convergence around the elements that should form the basis of a treaty and that these states are ready to make this happen.
At the end of the meeting, the Chair clarified that he intends to discuss a revised version of the rolling text in the next session of the Group; the discussion on the GGE’s report—which will include the “set of elements of an instrument”—will take place in the final session. “In the end, we are the ones who make it or break it,” he said, reminding states that the “outside world is watching us.”
With heavily militarised states trying to block the adoption of elements that will enable a strong prohibition, it will take “courageous political action,” as put by Stop Killer Robots, to take the next step and start negotiations on a legally binding instrument. In the meantime, states can take the First Committee as an opportunity to advance their perspectives and issue support for a legally binding instrument on autonomous weapon systems.