CCW Report, Vol. 13, No. 3
Report on the virtual informal consultation of the GGE on LAWS
3 June 2025
Laura Varella | Reaching Critical Will, Women's International League for Peace and Freedom
On Wednesday, 28 May, delegations met online for an informal consultation during the intersessional period of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS). Both high contracting parties (HCPs) and observers—which include civil society—participated in the meeting, marking a welcome change from previous informal consultations during this mandate, in which HCPs and observers met separately. The GGE will still meet separately for three other informal consultations in the upcoming months: the consultation with civil society will take place on 17 June, while the schedule for the two consultations with HCPs was not shared with observers. Hopefully the rich and interactive discussions held during the recent consultation will encourage delegations to hold all future informal conversations in an inclusive format, as was done under previous mandates.
The following provides an overview of the discussions on key issues and is not necessarily a comprehensive account of all positions and interventions.
Human control
For this consultation, the Chair, Ambassador Robert in den Bosch of the Netherlands, proposed three guiding questions:
- Is direct control or oversight during selection and engagement stages always needed to comply with international humanitarian law (IHL)?
- Or, by using the term context-appropriate human judgment and control, do we acknowledge that in certain systems or situations, human judgment and control can be exercised in a more indirect manner, for instance before the system autonomously selects and engages a target?
- What specific contextual factors should be taken into account when deciding on the necessary amount and timing of context-appropriate human judgment and control?
Additionally, the Chair put forward a background paper on context-appropriate human judgment and control and the critical functions of LAWS, with the aim of facilitating discussions during the intersessional period. The Chair also referenced the updated version of the rolling text, which contains the term “context-appropriate human judgment and control,” among other changes.
Comments about the term “context-appropriate human judgment and control”
Australia, Austria, Belgium, Brazil, Germany, Ireland, the United Kingdom (UK), the International Committee of the Red Cross (ICRC), and others supported the term “context-appropriate human judgment and control”. Austria and Belgium said they would have preferred strengthened language with the word “meaningful”.
In contrast, Israel, Russia, and the United States (US) opposed the term.
Russia said this term has no relation to international law in general. It also expressed concern regarding the subjective nature of the term, arguing it could be subject to different interpretations.
Similarly, the US questioned the apparent premise in the background paper that context-appropriate human judgment and control is a well-settled concept. It said that this is a new term and not a requirement under IHL. Additionally, the US said previous debates demonstrate that states do not interpret human control and related terminologies (like meaningful human control and context-appropriate human judgment and control) in the same way. It opposed combining the terms “in a single package,” arguing that treating them as a singular standard expands the room for different interpretations. It said that the GGE should focus instead on articulating in detail existing core IHL requirements, as well as specific measures that help implement those requirements in the context of LAWS. “Only this approach will promote a unified understanding of the legal framework governing LAWS, and the measures necessary to implement it, rather than a subjective approach that varies in its implementation depending on the state and its policies,” said the US.
In response to the US, Austria said the term is not going “too far away” from existing IHL concepts, giving the example that Rule 19 and Article 57(2) of Additional Protocol I to the Geneva Conventions link control to other elements of predictability. It reiterated previous arguments that it is necessary to introduce a new concept given that this new technology poses new challenges. Austria noted that this has been done before, both in the Convention on Certain Conventional Weapons (CCW) and elsewhere, explaining that the Convention on Cluster Munitions, among other instruments, sought solutions to address humanitarian concerns, adding to the development of IHL. Austria emphasised that this is not something states are prohibited from doing; on the contrary, it is their job.
Mines Action Canada echoed Austria’s remarks and added that the drafters of the CCW back in the 1980s did not envision weapons with such a high level of autonomy being used on the battlefield or at borders. It encouraged states to be creative and build on existing IHL in a way that makes the future safer for both civilians and those taking part in armed conflict.
Similarly, Article 36 said that terms such as “context-appropriate human judgment and control” or “meaningful human control” would be an “overarching shorthand” for the human element required in the use of these systems. But, it said, to give these concepts substance, they need to be expressed through more specific obligations and rules regarding control of the context and understanding of the weapon systems. Article 36 noted that this is precisely what the rolling text is doing, as it is not asserting “context-appropriate human judgment and control” as a blanket obligation, but fleshing out what that concept should be understood to encompass. Article 36 also reminded participants that the CCW is a lawmaking structure, and that if states are going to prioritise the CCW as a framework for tackling this issue, they should not at the same time diminish its actual capacity and intended purpose as a structure for making new legal rules.
The UK acknowledged the point made by the US that human control is not a separate requirement of IHL. However, it pointed out that human control is essential to ensuring compliance with IHL and that the “context-appropriate nature” of that control is an important component of this. It said that to ensure compliance with IHL, humans must have the capability to make informed judgments regarding the use of LAWS and to control their effects, and that in order to do that, the context must be taken into account.
Brazil echoed the UK’s comments, saying that the concept of “context-appropriate human judgment and control” is an elegant way to reflect that the specific requirements for control will vary depending on the context of use, which includes the nature of the system, the environment in which it is used, and the mission for which it has been deployed. Brazil also noted that the term is not an end in itself, but that it needs to be operationalised. In response to the US’ remarks, Brazil also noted that while there are divergent understandings expressed by delegations about what human control and judgment mean, it is their job to bring clarity to the debate, and that the Chair’s paper and rolling text are helping in this direction.
The US opposed Austria’s comments that “context-appropriate human judgment and control” or “meaningful human control” would be close to an existing IHL rule, arguing that there is neither a conventional rule nor a customary international law rule that has developed in this way. It also objected to Article 36’s comments that this would be an “overarching shorthand,” stating that this would not bring clarity to the debate but would instead confuse it. “What does bring clarity to this debate is achieving a clear articulation of the legal rules and a specific enunciation of measures to implement those obligations in the context of LAWS,” said the US delegation. The US also said that it acknowledges that control of a weapon is one measure to achieve IHL compliance, but that this is not the only measure. The US believes that the most fruitful way to approach this debate would be to continue exploring this range of measures.
Australia said that while it supports the inclusion of “context-appropriate human judgment and control” in the text as a compromise, it shares the view of the US and others that the need to adopt human control in using LAWS is not a separate requirement under international law, either under a treaty or under customary international law. Instead, control is a means to comply with IHL. It agreed that states should be looking at those specific control measures that might be appropriate when using LAWS in order to comply with IHL.
In contrast, Mexico noted that these technologies present new challenges and raise fundamental questions as to how to apply IHL rules. It said it prefers to focus on possible agreements on specific rules that are applicable to these new technologies, rather than imposing a common or consensual understanding of existing rules.
The Republic of Korea said that while it agrees new rules are needed, these rules need to be clearly defined in order to be effective. It shared the concerns expressed by the US that the terms being considered demonstrate a wide range of interpretations that result in confusion. It said that if the term “context-appropriate human judgment and control” is retained in the text, it should be accompanied by a footnote or one-pager explaining its meaning. It expressed preference for focusing the discussion on actual measures rather than the term itself. Austria said that an annex providing examples of best practices could help provide the specificity and clarity for which delegations are asking.
Distinction between judgment and control
South Africa underlined that it would be essential to draw a clear distinction between human control and human judgment, stating that these concepts reflect different levels of responsibility and decision-making. South Africa said context-appropriate human control relates to the direct operation and engagement with the weapon system, to ensure that human oversight is maintained throughout the targeting process. In contrast, it said human judgment involves a higher level of decision-making, including determining when, how, and under what conditions LAWS may be employed. “These are two distinct forms of responsibility: the responsibility of engaging a target, and the responsibility of authorising its use, and both these must be explicitly articulated to ensure accountability,” emphasised South Africa.
Brazil said that it sees control and judgment as different notions that are interlinked. Mexico agreed that these concepts are distinct and interrelated. It said that first, it is necessary for a human to make the judgment that matches legal evaluations and ethical considerations with an expected use of force. This judgment is made in the abstract, for example when determining whether a means or method of warfare would be in all or some circumstances contrary to IHL, for instance due to the nature of the weapon, as well as with regard to its use, which needs to be evaluated in a particular context. However, Mexico also noted that while judgment is an evaluation of the weapon itself with regard to the context, there is also the need for the human to have control over the operational aspects, especially over the weapon system’s effects.
Australia also emphasised that control and judgment are distinct concepts. It said that control measures are applied in order to determine the effects of an autonomous weapon system, including in the design and development phase, through testing and evaluation, and in setting specific parameters. It said that these controls, which are applied to ensure compliance with IHL, will depend on the context. Australia said that judgment, on the other hand, refers to the decision-making of commanders and others in the chain of command, including the decision to use a particular weapon system for a particular operation.
The UK said that it considers judgment the ability to make a decision or form an objective opinion about which action should be taken. In the case of LAWS, this could include decisions over where, when, and how LAWS should be used, the risks for IHL compliance, and what means are needed to control their effects. The UK noted that control would be the means to ensure human intervention across the lifecycle of the weapon system and to control the effects of those systems when in use.
Germany stated that human control implies not only technical control, but also an element of judgment. It said that since the control of algorithms and operations is exercised by humans, human judgment will be applied within the responsible chain of command.
Comments about the first and second questions
Australia, Brazil, France, Germany, and others said that direct control is not always necessary to comply with IHL.
Germany said that control will mostly be exercised before the selection and engagement of targets. Australia also said that contemporaneous human control will not necessarily be required during the engagement and selection phases in order to comply with IHL. Australia said that the levels of control will depend on the context and the type of autonomous weapons that are employed.
France said that control is essential in the earlier phases to guarantee compatibility with IHL. France said that human control and judgment should be exercised in the development and programming of the system, and when defining the system’s operational framework. For the latter, France suggested this could include defining the rules of engagement and the mission parameters assigned to the system, limited in time and space and by specific objectives defined according to the situation and context. France noted that the operator of a LAWS must be able to monitor the reliability of the system during its deployment. “Therefore, the system’s compliance with IHL will depend on the quality of the defined operational framework.” France also emphasised that in addition to this framework, some situations would require more direct human control during use, for example to deactivate the system, if necessary and when feasible.
Russia also stated that human control can be exercised by means other than direct control, for instance by limitations on types of targets, duration of operation, geographical scope, and scale of use, as well as by timely intervention, deactivation, and the testing of weapon systems and military equipment in real operational environments, among others. It noted, however, that specific forms and methods of human control should remain at the discretion of states.
The ICRC emphasised that context-appropriate human control and judgment would not require direct physical control of the weapon system at all times, but requires instead that relevant control is maintained over the effects of the weapon system. The ICRC emphasised that this control should be established through a combination of prohibitions and restrictions, which are included in the rolling text in Box III.
Austria noted that control is always needed, both in the development and engagement phases. It also argued that different contextual factors play an important role in defining the level of human control. It pointed out that indirect control is not necessarily a temporal control, as it is not exclusively conducted before the engagement phase. Rather, it is just a different quality of control. Austria reiterated its position that human control over the use of force should be maintained.
Human Rights Watch pointed out that it is difficult to answer questions about “direct” and “indirect” control without a clearer understanding of what those terms mean. It said that the background paper implies that “kill switches” and pre-set parameters represent direct and indirect forms of human control, respectively, but that it is unclear if they are the only examples. Nevertheless, it emphasised that the “meaningfulness of control provided by a kill switch depends on how much time the operator has to intervene.” Human Rights Watch also emphasised that the effectiveness of mission parameters depends on how far ahead they are pre-set. “To be meaningful, human control should be exercised at the time of attack, not long before. For example, to comply with the proportionality test, the person using the autonomous weapon system needs to determine at the time of a specific attack whether civilian harm is excessive in relation to military advantage in a complex, rapidly changing environment,” said Human Rights Watch.
Comments about the third question
Ireland underlined that there are several contextual factors that must guide decisions on the timing and nature of human control. These include: the operational environment (systems operating in environments with minimal civilian presence, such as the open sea, might require less direct control than those in urban or populated settings); the nature of the target (greater human judgment is required where targets are harder to distinguish, including distinguishing combatants from civilians, or those hors de combat); technological capabilities (the system's ability to reliably comply with distinction and proportionality principles must be critically assessed); predictability and reliability (the systems must behave predictably within the limits of their programming); among others. Ireland emphasised that autonomy is a spectrum, and depending on a system's attributes and its operating environment, different systems will fall at different points along the spectrum. It stressed that ensuring context-appropriate human control and judgment is necessary over all weapon systems; those that fail to meet these criteria—which are clearly defined in Box III of the rolling text—should be prohibited, and the rest should be subject to measures and restrictions that allow for the control and judgment necessary to comply with IHL and other international law.
Both Mexico and Human Rights Watch also said that the operational environment could be considered when defining the level of control, particularly highlighting concerns about use in populated areas. However, Human Rights Watch expressed concern with the argument that autonomous weapon systems could be safely used away from such areas. It stressed that experience shows that once a certain weapon system exists, there is a likelihood that it will proliferate or be misused. “While context may be a factor to consider, it seems that the level of human judgment and control should determine whether a context is appropriate, not the other way around,” it emphasised.
France stated that the specific degree and timing of control required will depend on the system, the context of use, and the mission plan. This would include, for example, the characteristics of the system, its transparency, the type of learning, data, and criteria used to perform its functions, its capabilities, and the functions that it can execute autonomously. It would also depend on the environments and on the targets, as they might be complex and might change over time. Brazil echoed France in saying that the context assessment includes the nature of the system, the environment in which it is used, and the mission in which it is employed.
Mexico noted that control has two dimensions: the control of the weapon itself—the decision about when it should be used and, in certain situations, the ability to deactivate the system or terminate the attack—and the ability to intervene when the situation on the ground has changed. Mexico said control should also include the ability for a human user to influence the system when there are recognised biases or unexpected algorithmic evaluations with regard to the context, the intention, or the character of the target. Mexico emphasised that the level of judgment or control could vary depending on the context.
The Interagency Institute warned that human control can only be meaningful when the tempo of operations remains within human cognitive limits. It said that in warfare, exceeding these limits creates a gap where accountability, judgment, and legal compliance may fail.
Critical functions
Another issue discussed in the online consultation was the critical functions that should be considered in the characterisation of LAWS. In the most recent version of the rolling text, paragraph 1 of Box I establishes that LAWS can be characterised as weapons that can “select and engage a target, without the intervention by a human user in the execution of these tasks.” The Chair’s background paper also provides descriptions, examples, and explanations related to the critical functions of LAWS, particularly in paragraphs 15–20.
Israel, the Republic of Korea, and Singapore supported re-introducing the term “identify” as a critical function, in line with the previous version of the rolling text. Similarly, the US said that the GGE already found consensus in 2019 that target identification is one of the essential characteristics of LAWS.
The US also said that close-in weapon systems and loitering munitions generally have the ability to identify the target; without it, the weapon system cannot select the target for engagement, nor can it direct force against the target. On the other hand, automatic contact mines historically have not had the ability to identify the target, but simply explode on contact. The US said that rudimentary mines that have been in existence for more than 100 years are not the focus of the Group’s work, and that it seems better to exclude these systems given that specific rules already exist for the use of land and naval mines. The Republic of Korea made similar remarks.
Austria reiterated its position that instead of having a restrictive characterisation, which could create loopholes, it is important to have a broad definition. It emphasised that this does not mean that all AWS would be prohibited. Austria supported retaining only the terms “select and engage” in the rolling text, noting that if “identify” was added, only one out of the five examples of loitering munitions in the Chair’s background paper would be considered autonomous weapons.
Article 36, Brazil, the ICRC, and Ireland echoed Austria’s remarks that it would be important to maintain a broad scope at this stage. Ireland stressed that it would be important to cast a wide net; later, in the negotiations of a future instrument, states can agree on which systems should be excluded from the characterisation or fall within one of the two tiers. It also opposed including “identify” as part of the critical functions in the characterisation, as this would create a loophole and be overly restrictive.
The ICRC also pointed out that identification and selection of targets are usually carried out in a single process. It said it would be better to have one concept covering it all. The ICRC emphasised it would be important not to create a risk of confusion or a loophole by using “identify” in addition to “select”. It recommended that the entire process described in paragraph 20 of the Chair’s background paper be referred to as “selection”. It said that if the Group prefers to have two terms, it is critical that the notion of identification focus on the aspect of the system looking for and discovering the object that fits the target, not on pre-setting the targets. It explained that pre-setting a specific target or conditions is mostly done by humans, so such a definition would exclude most, if not all, of the weapons of concern, defeating its purpose.
Article 36 made a similar point, noting that the Chair’s background paper includes several tasks that are mechanically the same. It also expressed concern about comments in favour of keeping identification as a necessary characteristic to limit the scope of the instrument and exclude systems that might operate in a more basic way. It said this could risk reducing the scope to such an extent that only extremely complex, futuristic systems that might be more appropriate for prohibition would be considered. The ICRC also expressed concern about leaving out the most archaic forms of weapons that are triggered by the environment, because they are precisely those that are likely the most indiscriminate.
Australia said that while it believes that LAWS might have the ability to identify a target as well, it is open to the characterisation including “select and engage,” recognising that the identification process may be done, for example, by a separate but not interrelated machine.
Japan noted that it would be useful to examine how delegations understand the tasks of identification and selection, in particular if the determination of whether a specific target is a military objective or not is made at the identification stage or at the selection stage.
Anti-personnel autonomous weapon systems
Human Rights Watch pointed out that neither the Chair’s rolling text nor the background paper mentions that autonomous weapon systems that target people pose serious legal and ethical risks. “While the weapons systems may not have been the background paper’s subject, the rolling text, which contains numerous other restrictions, fails to include a prohibition on antipersonnel autonomous weapons systems despite calls from many states, ICRC, and civil society,” said Human Rights Watch. The Interagency Institute also supported a prohibition on anti-personnel autonomous weapon systems.
Other comments about the background paper
The US expressed concern with references in the background paper to weapon systems distinguishing or applying proportionality. It emphasised that it is not the weapon that needs to comply with the principles of distinction and proportionality, but rather the person who uses the weapon. Similarly, Article 36 expressed concern with paragraph 23 of the background paper, as it suggests that the system is going to be applying the principle of proportionality or undertaking distinction, while those are human responsibilities.
Mines Action Canada expressed concern with the understanding, as written in paragraph 23, that a kill switch would not pose any issues. The organisation suggested it could be useful to revisit the discussions held in the GGE around confirmation bias.
Human Rights Watch noted that the background paper discusses human control only as it applies to compliance with IHL, and that it does not address other relevant bodies of law, notably international human rights law, nor does it consider ethical, humanitarian, and security concerns. The rolling text takes a similarly narrow approach, focusing on IHL. “The broad participation and substantive engagement in this month’s UN General Assembly meeting in New York, where those issues were raised, shows that there is strong interest in dealing with them,” pointed out the organisation. Human Rights Watch also noted that the background paper and rolling text address the use of autonomous weapon systems in armed conflict, but the systems will also likely be used in law enforcement operations and other contexts. It emphasised that a legally binding instrument should prohibit and regulate autonomous weapon systems under any circumstances.