
Urgency of dealing with autonomous weapons is highlighted in Germany’s online forum

Katrin Geyer and Ray Acheson
6 April 2020

On 1–2 April, the government of Germany hosted an online forum in support of the 2020 Group of Governmental Experts (GGE) on lethal autonomous weapon systems (LAWS). This follows a symposium organised by Brazil in February. Due to the COVID-19 outbreak, it is unknown how the schedule of the GGE will be affected this year. In the meantime, this forum was intended to bring governments and civil society together to continue discussions around the guiding principles developed by previous GGEs and to consider the development of an operational and normative framework mandated by the last meeting of high contracting parties to the Convention on Certain Conventional Weapons (CCW).

More than 300 participants from 70 countries registered for the online forum, where human control over weapons and the use of force was the centrepiece of discussions. The forum was interactive, with presentations by invited speakers followed by questions and comments from registered participants. Unfortunately, one session was an all-male panel, which failed to reflect a diversity of participants as well as of views. But overall, the forum offered a welcome exchange of views and helped maintain momentum in this process during a time of uncertainty. The urgency of dealing with autonomous weapons before it’s too late is only mounting, particularly as we see an increasing imposition of surveillance and other emerging technologies on populations around the world in the face of the coronavirus outbreak.

Germany intends to compile key points from the forum in a Chair’s summary, which will be released in due course. In the meantime, this report provides an overview of each session as well as some reflections from the question periods.

Opening session

After introductory remarks by Rüdiger Bohn of the German Federal Foreign Office, Germany’s Foreign Minister Heiko Maas delivered the first opening statement to the Forum. Maas observed that while addressing the global COVID-19 pandemic, work on other pressing issues must not be neglected. He argued that the standstill in nuclear disarmament, the attacks on existing multilateral agreements, and the risks linked to new technologies in warfare require everybody’s full attention. He also asserted that “letting machines decide over life and death of human beings goes against ethical standards and undermines human dignity,” and that their use would constitute “a red line we should never cross.”

Izumi Nakamitsu, High Representative for Disarmament Affairs, warned in the second opening statement of the risk that weapons may soon be used in violation of international humanitarian law (IHL) and contrary to the dictates of public conscience. She repeated the UN Secretary-General’s statement that the development and use of killer robots would be politically unacceptable and morally repugnant, and should be prohibited under international law. She noted that the agreement on the 11 guiding principles at the last GGE should help inform the development of aspects of a normative and operational framework. She welcomed the Forum’s central emphasis on the human role in the use of lethal force.

Lastly, Nakamitsu observed that more can be done to ensure the full and equal participation of women in this process. This remark was particularly relevant as the gender diversity of the Forum’s speakers, especially on the first day, was very low.

Ambassador Janis Karklins of Latvia, Chair of the 2020 GGE on autonomous weapons, outlined his intended approach to the Group’s work this year, based on the programme of work and draft agenda, which are not yet available online. He has requested high contracting parties (HCP) to the CCW to submit national “commentaries” on the operationalisation of the 11 guiding principles in order to identify commonalities in how the principles are being implemented by states. He expressed hope that discussing commonalities among national approaches will lead to discussions about a normative and operational framework at the international level. He proposed to start work this year on a draft final outcome document to be adopted in 2021. In a personal capacity, the Chair encouraged the GGE to “think outside the box.” In light of the lack of a common understanding of the characteristics of autonomous weapon systems, he suggested focusing on the understanding that humans should always remain in control of the application of lethal force. He encouraged the GGE to work towards this as its focus and leave technical discussions of weapon systems aside.

Session I: Defining the human role in the use of lethal force

The first working session looked at the human role in the use of lethal force from the perspectives of military practitioners, industry representatives, and scientific experts.

Speakers addressed related questions, including: how to qualify the concept of human control/supervision; the degree of human judgement, interaction, and intervention necessary for weapon systems to remain in compliance with international law; and how factors such as system design and operational context affect the appropriate type of human involvement in future weapon systems. The all-male panel took an overly narrow legal and technical focus and did not address existing concerns over removing human control from the use of force.

Karl Chang, Associate General Counsel for International Affairs at the US Department of Defense, outlined the United States’ position on the required degree of human control in the use of lethal force, based on Department of Defense (DoD) Directive 3000.09 and the US working paper submitted to the GGE in 2019. Mr. Chang reminded participants that the DoD is opposed to developing a new standard on human control, and argued that new principles should not stigmatise new technology when that technology could be used to increase the protection of civilians in conflict. He asserted that the goal of the GGE should be to “effectuate the intent of officers” to enhance compliance with IHL and the protection of civilians in conflict. He presented three different scenarios that often arise in military practice, and suggested the GGE focus on discussing what IHL would require in these circumstances and develop “common understandings” on IHL interpretation.

Martin Hagström, Deputy Research Director of the Swedish Defence Research Agency, argued in his contribution (delivered in his personal capacity) that states should develop and exchange national directives on the use of increasingly complex technology applications in weapon systems. He welcomed DoD Directive 3000.09 as “top level design criteria” and recommended that these kinds of standards and directives be further developed and exchanged by governments.

Prof. Dr. Wolfgang Koch, Chief Scientist and Head of the Sensor Data and Information Fusion (SDF) department at the Fraunhofer Institute for Communication, Information Processing and Ergonomics (FKIE), elaborated on the technological aspects needed to strengthen the human role in the application of lethal force in order for a machine to act “ethically well”. He explained in detail the interrelationship and flow of information between artificial intelligence applications in weapon systems, the environment where AI is applied, technical automation, and the human decision-maker. He concluded that “the responsible use of autonomous weapon systems requires human involvement, not only by deciding to use [autonomous weapon systems] but by configuring them with technically implemented, ethically justifiable rules defined by humans.”

After the first set of presentations, an interactive discussion between panelists and participants followed. Discussions revolved mostly around existing weapon systems with high degrees of autonomy and whether and how these could help to assess the required degree of human control for future autonomous weapon systems.

The second part of working session I delved further into the question of the human-machine interaction across the weapon’s lifecycle.

Florian Keisinger, Campaign Manager at Airbus Defence and Space GmbH, detailed the company’s approach to developing the “Future Combat Air System (FCAS),” which it projects will be operable around 2040. He explained that in developing this weapon system, Airbus grapples with one fundamental challenge: that the system be “technologically superior” and at the same time fully under human control. To address this challenge, Airbus is bringing together a range of experts from many different disciplines to accompany and advise the development and design of the combat system.

Giacomo Persi Paoli of the Security and Technology division of the UN Institute for Disarmament Research (UNIDIR) discussed key considerations of the military decision-making process leading to the use of force. His contributions were based on the recently published report The human element in the decisions about the use of force. Using an iceberg infographic, Persi Paoli presented a framework of human control at the strategic, operational, and tactical levels of decision-making, offering key legal considerations for decision-makers at various stages in the process. The graphic also demonstrates how critical decisions about the use of force are taken at various levels, and is intended to guide discussions in the GGE on military and legal aspects of human control. Mr. Persi Paoli announced that UNIDIR will host a series of regional table-top exercises to assess what a future normative and operational framework should consider and will share these findings with the GGE.

Col. Jun Yamada, Military Adviser to the Mission of Japan in Geneva, outlined Japan’s understanding of human-machine interaction. He said that it is important to incorporate the views of the private sector, and cited as an example the requirements established by the private sector for personal care robots. Mr. Yamada announced that Japan plans to hold panel discussions with a variety of actors on the topic in Tokyo in December 2020 in order to bridge the gap between this year’s GGE session and the CCW Review Conference in 2021.

Kathleen Lawand, Head of the Arms and Legal Division of the International Committee of the Red Cross (ICRC), was the final speaker in this session. Her presentation was based on the chapter “new technologies of warfare” in the 2019 report International Humanitarian Law and the challenges of contemporary armed conflicts, and a forthcoming joint report on the topic together with the Stockholm International Peace Research Institute (SIPRI). She explained that certain limits on autonomy in weapon systems can be deduced from existing rules on the conduct of hostilities, notably the rules of distinction, proportionality, and precautions. This means that autonomous weapons that are unsupervised, unpredictable, and unconstrained in time and space would be unlawful under IHL. Lawand underscored, however, that existing IHL rules do not provide all the answers to address increasingly autonomous functions in weapon systems. She asserted that three types of constraints are needed to determine the required level of human control to exercise context-specific judgements in line with IHL requirements: 1) constraints on targets and tasks; 2) spatial and temporal limits; and 3) the ability to supervise and intervene. Against this backdrop, Ms. Lawand reiterated the ICRC’s position that there is a need for internationally agreed limits to ensure compliance with IHL. She made an urgent appeal to states to act immediately.

Session II: Developing and elaborating the guiding principles

Working session II allowed states to share their approaches to, and analyses and understandings of, the guiding principles that have been elaborated in the context of earlier GGE sessions. All agreed that principle C on human-machine interaction is the cornerstone of the GGE’s work and that some principles require further unpacking. While some panelists welcomed the GGE Chair’s call for exchange on national approaches to operationalising the principles, others cautioned that this approach would establish the principles as ends in themselves.

After a few opening remarks by Rüdiger Bohn of the German Federal Foreign Office, Murielle Marchand from the Permanent Mission of Belgium welcomed the principles as a first positive result that marks important progress toward developing common understandings of the challenges related to LAWS.

Belgium’s priority for the GGE’s work is to maintain the positive momentum from 2019 and to identify common priorities and objectives for the next two years, avoiding polarisation. While remaining open to the addition of new principles (such as on ethics or algorithmic bias), Marchand advised against adding more principles “for the sake of it,” and instead urged participants to “unpack” existing principles to allow for more concrete thinking. In this vein, she stressed that principle C on human-machine interaction requires much more consideration. In order to circumvent states’ current “blind spot” of not having a common understanding of what autonomous weapon systems are, she suggested determining a set of criteria that would facilitate evaluating whether a weapon system complies with IHL. Such analysis could be based on the joint 2019 paper presented by Luxembourg, Ireland, and Belgium.

Pamela Moraga from the Permanent Mission of Chile in Geneva underscored that the principles are just the “minimum common denominators” that the GGE was able to agree on after years of formal deliberations. In line with Marchand, Moraga suggested that some principles are more important than others, particularly principle C, but also principles A, B, D, and H. Like Marchand, Moraga said the GGE should now focus on “unpacking” the concepts contained in the principles. Yet she also acknowledged that the principles could be further expanded by creating additional principles, including relevant ethical, moral, and humanitarian considerations.

Moraga argued that the three main issues relevant for the development of a legally binding instrument on autonomous weapons can be found in the principles: 1) the reaffirmation that IHL applies to all weapon systems; 2) the assertion that human responsibility for the use of any weapon system must be retained; and 3) the need for meaningful human control.

Moraga then cautioned that the Chair’s approach to the principles risks establishing the principles as ends in themselves, as opposed to being just starting points. She argued that states cannot and should not operationalise principles at the national level that were meant to guide the work of the GGE and asserted that their operationalisation would mean that they are granted binding normative status. Lastly, Moraga underscored that there is a risk that operationalisation at the national level will skew the debate towards those states developing or working on autonomous weapon systems and against those working to prohibit these systems entirely.

Moraga further observed that national measures such as confidence-building measures, codes of conduct, political declarations, or legal weapon reviews are not sufficient to address the distinct challenges of autonomous weapon systems. She reminded participants that there is a large group of countries that support the development of a multilaterally agreed regulation and prohibition of autonomous weapon systems.

Mikaël Griffon, Deputy Director of Arms Control at the French Ministry of Foreign Affairs, concurred with previous panelists that the principles should not be expanded too much, in order to avoid increasing inconsistencies and ambiguity. According to Griffon, the GGE should focus on two main priorities: 1) unpacking existing principles, such as principle C; and 2) operationalising other principles by sharing national experiences and exchanging best practices, for instance in relation to the principle of legal weapon reviews. He concurred with the other panelists that principle C on human-machine interaction is the central issue of the GGE’s work. Griffon shared France’s understanding of the requirements for human-machine interaction, the key criterion being that the decision to use force must always trace back to human intent. He noted that France’s Ministry of Armed Forces established a permanent Ethics Committee in January 2020, tasked with reflecting on challenges related to new weapon systems and proposing guidance to the Ministry.

The final speaker of this session was Alessandro Candeas, Ambassador and Director of the Defence Department of the Ministry of Foreign Affairs of Brazil. He opened his presentation with a comparative historical perspective, contrasting the negotiations of other arms control treaties, such as the Biological Weapons Convention (BWC), the Chemical Weapons Convention (CWC), and the nuclear Non-Proliferation Treaty (NPT), with the process on autonomous weapons. Candeas argued that an instrument on autonomous weapon systems will need to be negotiated differently than those for other types of weapons because these weapons are still under development. Candeas agreed with the other panelists that principle C on human-machine interaction forms the cornerstone of the GGE’s work. He clarified Brazil’s understanding of the concept and stressed the need to distinguish between defensive and offensive scenarios when it comes to establishing the required degree of human control. In terms of the way forward, he called on the GGE to move towards the negotiation of a legally binding protocol. He acknowledged that effective regulation may be achieved by intermediate steps such as political declarations, corporate codes of conduct, and market restrictions. Candeas presented a comprehensive proposal for operationalising the general principles via four different paths, each encompassing different principles and outcomes.

Session III: Possible elements of the normative and operational framework

The third and final working session looked at possible elements for the normative and operational framework that the CCW high contracting parties have agreed to elaborate throughout the next two-year GGE cycle. Speakers discussed various options for moving forward, including a legally binding instrument, and the role of existing international law in relation to a framework on autonomous weapons.

Bonnie Docherty, Associate Director of the Armed Conflict and Civilian Protection Initiative at Harvard Law School, outlined elements for a legally binding instrument on autonomous weapons. She highlighted the treaty elements paper and related frequently asked questions (FAQ) that she and her team developed for the Campaign to Stop Killer Robots, which together set out the scope, prohibitions, and positive obligations for a future treaty. In this formulation, a treaty would cover all systems that select and engage targets on the basis of sensor processing rather than human inputs. It would require meaningful human control (MHC) over the use of force, with the understanding that MHC has decision-making, technological, and operational components:

  • Decision-making components include understanding of the operational environment and how a system functions.
  • Technological components include predictability and reliability; ability to relay information to a human; and ability of human to intervene.
  • Operational components include limits on where and when the system can operate and what it can target, including the time between human assessment and the application of force.

Docherty also noted that the instrument would need to have prohibitions on specific weapon systems that select and engage targets and that by their nature pose fundamental moral or legal problems. 

Amanda Wall, Attorney Advisor for Political Military Affairs at the US State Department, outlined US views on possible elements of the framework mandated by the GGE, which she said her government supports. She also welcomed Ambassador Karklins’ request for states to prepare commentary on the guiding principles and argued there is value in further developing these principles. However, when it comes to the framework, Wall said that form must follow function: CCW states have to “achieve consensus on substance before we can decide what form that substance will take.” That is, states cannot decide whether to develop a treaty, political declaration, or other instrument before knowing what it will say. Wall also argued there are three essential considerations: 1) that states acknowledge there is no gap in existing law just because the GGE is dealing with emerging technologies; 2) that states need to identify good practices for compliance with IHL, for the development of autonomous weapons, and for human-machine interaction; and 3) that the GGE should promote better understandings of risks and opportunities, including of improving compliance with IHL through the development and use of autonomous weapons, arguing that these machines will be better than human soldiers in certain regards.

Lt. Col. David Walker, UK Ministry of Defence, concurred with his US colleague on most points. He urged CCW states to recognise that there are no gaps in the existing law, but argued that autonomous weapons pose unique challenges to the law and thus those developing them will need to constantly reassess their weapon review processes to make sure these weapons remain compliant with the law. However, states should not create new law, he argued, asserting also that nothing outside of the CCW offers a useful template for dealing with these weapons. Previous processes related to landmines or cluster munitions, for example, dealt with weapons that have an established pattern of harm, whereas autonomous weapons are “more conceptual”. Thus, he argued, developing a treaty about these weapons would be “applying prohibitions to a concept”. He did agree with other speakers, however, that the human element in considerations of these “concepts” is key.

Anja Dahlmann, Researcher and Coordinator with the International Panel on the Regulation of Autonomous Weapons (iPRAW), closed out the session with a discussion about elements of a regulatory instrument on autonomous weapons. She agreed that the starting point should be the human element, or human control over the use of force, and explained that control is a process, not a singular event, and that it does not necessarily mean direct manipulation of the weapon system but is rather context dependent. She concurred with the ICRC’s focus on autonomous functions rather than autonomous systems. The core element of any instrument, she argued, would be that weapon systems must abide by IHL; there are both legal and ethical considerations here. Elements of control include design (technical control) and use (operational control). Dahlmann also touched upon the possibility of including verification measures in a legally binding instrument on autonomous weapons, which could focus on human-machine relations and access points at various stages of the life cycle. However, she noted that the main challenge would be that software could be changed after verification.

Conclusion

WILPF, as a member of the Campaign to Stop Killer Robots, has long argued for the development of a prohibition on autonomous weapon systems. Far from being just a “concept,” as the UK representative to the online forum asserted, these weapons are currently in development. Our experiences with other weapon systems, including armed drones and nuclear weapons, should inform our approach to increasing autonomy in the use of force. We have a chance now to prohibit these weapons before they cause the “patterns of harm” that certain governments claim to require before they can take action.

As the Campaign to Stop Killer Robots has written, “The Covid-19 pandemic is an unexpected development that threatens the lives of millions of people. This defining worldwide event could provide the impetus for more coordinated and substantive multilateral efforts to deal with the dangers that certain threats pose to humanity. It should affirm the urgent need for international legal frameworks such as a treaty to ban fully autonomous weapons.”