
CCW Report, Vol. 11, No. 2

Taking on the War-Builders
14 March 2023


Ray Acheson | Reaching Critical Will, Women's International League for Peace and Freedom


The first session of the 2023 Group of Governmental Experts (GGE) on autonomous weapon systems (AWS), organised within the Convention on Certain Conventional Weapons (CCW), met from 6–10 March in Geneva. With a new batch of proposals on the table for consideration and dialogue across a range of key issues, the week saw some notable developments in states’ positions, as outlined by Stop Killer Robots in its closing remarks. However, the heavily militarised states and some of their allies continue to actively block the development of legally binding rules, despite overwhelming support for a new instrument to prohibit and regulate AWS in order to safeguard humanity. It remains to be seen what the GGE will accomplish this year; it still has another five days of formal work and two virtual informal consultations ahead of it. But based on the discussions last week, there is not a clear path forward at the veto-based CCW.

Half-measures and tautologies

While some of the proposals and statements at this GGE session indicated progressive developments in thinking and approaches to AWS, others highlighted once again the lengths to which certain governments will go to prevent any constraints on their weapons development or use.

There is a category of states that plays the middle road, neither blocking all progress nor allowing any meaningful progress. Canada, for example, despite the government mandate to support a ban on autonomous weapons, has chosen instead to support the US-led proposal for voluntary guidelines for IHL implementation in relation to AWS. The Canadian delegation argued last week that such guidelines or political commitments are not mutually exclusive to a legally binding instrument, which could be pursued later, once states have reached agreement on definitions for AWS. However, this argument ignores the fact that the pursuit of common definitions has been intentionally drawn out by certain states to prevent the development of legal rules that could preclude the creation and use of these weapons.

Another middle-of-the-road player, France, continues to insist upon its version of the two-tier approach—which is that fully autonomous weapons should be prohibited while all other AWS should be regulated. However, its conception of a fully autonomous weapon is one that operates completely outside of human control and a responsible chain of command. As the non-governmental organisation Article 36 points out, this is a “fantastical and unhelpful basis for boundary definition.” The types of weapon systems this characterisation could include arguably cannot exist, because a weapon cannot be developed, deployed, or used without some form of human engagement within a chain of command, which means every system would fall outside of the fully AWS category except perhaps for weapons that are built and deployed by other machines. That may be a concern in the dystopian future that some of these states are trying hard to build, but it is not where the problem lies right now.

The problem right now is that it is human beings who are trying to build machines that can autonomously select targets (including people) and engage them (kill or injure people or blow-up infrastructure) on the basis of sensors and software rather than human judgement. A handful of governments, such as Israel, even insist that there is no requirement for any form of human control over weapons. Human control is not an obligation under international humanitarian law (IHL), the Israeli delegation argued last week, asserting that IHL rules do not prescribe how states should comply with the rules.

Of course, it is because of this that states have sought to create guidance for implementing IHL through other treaties and agreements—including by prohibiting and regulating weapons through the CCW. Mexico and Chile pointed out that the assertion that there is no need for additional regulation of AWS cancels out the justification for the CCW or other disarmament and arms control treaties. It leaves the world open to increased death, destruction, and discrimination, as heavily militarised and violent governments pursue the development of means and methods of warfare that take humanity further and further away from itself.

“Are we ready for a world in which decisions related to human lives will be replaced by machine learning?” asked the delegation of Peru last week. It articulated its rejection of the development and use of weapons that “shoot on their own,” or where artificial intelligence software “pulls the trigger” using facial recognition to determine its targets. Ireland similarly said that “ceding control to algorithms is not an option,” pointing out that delegating the decision to take human life to machines is not consistent with the laws, norms, or values that humanity has collectively developed. Ecuador highlighted that AWS will further dehumanise armed conflict and reduce the threshold for war, while Chile and Mexico reiterated that these weapons would exacerbate discrimination, pose a serious threat to international peace and security, facilitate a new arms race, lead to the use of other inhumane, destabilising weapon systems, and undermine commitments to achieving general and complete disarmament.

But those seeking to develop and deploy AWS refuse to acknowledge any of these concerns. Instead, like France, they offer to prohibit weapons that cannot exist. Russia, making a similarly fantastical proposition, suggested that prohibitions are only necessary on weapons for which there is clear evidence that their use would be so destructive that in no case could they abide by the key principles of IHL. But, Russia argued, since many weapons already operate with high levels of autonomy, it is possible that all such weapons could be used in compliance with IHL in certain circumstances.

This is a complicated argument that once again sets an impossibly high standard for the prohibition of weapon systems. This approach also implies that Russia and other militarily violent states always use their weapons in compliance with IHL, which is very clearly not the case, despite their repeated affirmations that they hold IHL in the highest regard. Some delegations such as the United States and Japan even contend that AWS will actually enhance compliance with IHL. Japan reiterated its belief in the benefits of autonomy in weapon systems, arguing that they will enable faster analysis and processing of data, helping to improve distinction, accuracy, and speed of targeting.

Building a world of war

At the core of all these arguments is the vision of a world at war. Instead of working to build a world of peace, these states are actively building a war-making world. They purport that faster processing of data in war is a good thing—it will allow them to kill more and destroy faster. With this as the underlying premise for weapons development, things like national legal reviews and “vigorous field testing” and training of system operators will not make a difference. Weapons do not save lives. Weapons are designed and used to kill and destroy; increasing their autonomy by adding algorithms or machine learning software does not change this basic fact, it only takes control of killing away from humans.

As Venezuela said, it is counterproductive to assert that there may be certain weapon systems that could improve the implementation of IHL, because the purpose of IHL is to limit the human suffering inherent to armed conflict, and unregulated weapons lead to violations of IHL.

The challenge of “trusting the process”

In response to those who continue to argue that existing IHL is sufficient or that legally binding rules are not necessary, Palestine pointed out that all of the conditions that virtually every state participating in the GGE has recognised as being important—such as predictability, explainability, traceability, and more—are not codified in existing IHL, because humans did not have to contend with autonomy when IHL was first formulated. When new technology arises, Palestine argued, it is only natural that new dimensions will arise and need to be codified. IHL didn’t exist before it existed, which is to say, those who authored IHL did not argue that since such rules do not exist we should not develop them. As the world evolves, the rules to live within it cannot stay stagnant.

Palestine also urged all delegations to engage in a thought experiment in which they do not know which country they are representing—large or small, weapon producer or importer, at war or at peace. From such a position, asked the Palestinian delegate, what would each delegation do? AWS must be analysed from this angle: not from narrow national interests but focusing on the interests of humanity as a whole.

In response to Palestine, the United States said states should “trust the process” they are in right now, which, as a multilateral process, means that every delegation is going to come with different perspectives. The vast majority of states, of course, have been trying to trust the process. For ten years they have engaged in discussions in the CCW, even while the preferences of the vast majority are blocked every single year by the narrow interests of a tiny minority. But it is becoming increasingly hard to trust a process that cannot even adopt reports that reflect the majority’s views on key topics, let alone begin work on a concrete outcome despite its mandate to do so.

It is also incredibly hard to trust the states that are actively delaying or preventing action on developing legally binding rules on AWS, when those countries are bombing cities and towns in contravention of IHL, when they have used armed drones to attack weddings and funerals, when they are already deploying facial recognition on drones, and when they have spent the last eight decades building up massive military-industrial complexes, profiting from the production and sale of weapons, and testing and modernising nuclear weapons in contravention of international law.

It is also difficult to “trust the process” when the states developing AWS claim only IHL is applicable to AWS and this forum cannot consider the application of international human rights law—all while they militarise their police forces and introduce autonomous explosive devices, robotic dogs, and discriminatory predictive policing software and facial recognition algorithms into their police departments and border enforcement mechanisms. It is hard to trust the process when the United States is building Cop City, a compound to train police from around the world in urban warfare techniques with the latest military equipment; or while the US military contracts with universities to develop and test autonomous weapons on campuses around the country. It is hard to trust the process when Israel is deploying artificial intelligence (AI) on guns at checkpoints and illegally razing Palestinian neighbourhoods to expand its settler colonial state in contravention of international law; or when Russia has launched a war against Ukraine and has been relentlessly bombing civilians and using prohibited weapons for the past year. Just to give a few examples.

As noted earlier, these countries are building a world of war, for war. They are actively creating both the conditions for unrest, and the tools of violence to repress it. The more these governments invest in this world, the further away we get from international law, ethics, morality, rights, and dignity. The norms for peace, justice, and equality that humanity has collectively built over many years are being destroyed by those seeking power and profit from human suffering.

As the AutoNorms project at the Centre for War Studies, University of Southern Denmark noted last week, “As more autonomous AI technologies become integrated into diverse weapon platforms and these platforms proliferate globally, a certain norm of diminished human control risks spreading silently.” This silent diffusion of norms away from human control is happening outside of the GGE and the public eye. While “the process” in which we are supposed to trust languishes in Geneva, AWS are being deployed in the real world.

In this context, the unambitious half-measures proposed by “middle-ground” states are insufficient, and the refusal by heavily militarised states to acknowledge the need for any measures is unacceptable. The regional meeting held in Costa Rica in February on the social and humanitarian impacts of autonomous weapons demonstrated what is possible and what needs to be done. It is clear that there is a growing convergence around the need to prohibit and regulate AWS through legally binding means. States need to draw a clear moral and legal line against machines killing people, and against further investment in the world of war. The time for a lack of ambition and half measures is over; the time for treaty negotiations is now.
