CCW Report, Vol. 11, No. 3
Time to Leave the CCW Chatbox
20 May 2023
Ray Acheson | Reaching Critical Will, Women's International League for Peace and Freedom
After yet another session of the Group of Governmental Experts (GGE) on autonomous weapon systems (AWS), states are no closer to prohibiting or restricting the development or use of these weapons. The final report, which delegations continued to negotiate late on Friday night, once again falls far short of what is needed to address the urgent risks posed by autonomous weapons.
Despite overwhelming support from most delegations for elaborating clear prohibitions and restrictions on the development, possession, and use of AWS, the final text does little more than assert that states must comply with international humanitarian law (IHL) in their use of AWS. The report contains many caveats that turn commitments into suggestions, using phrases such as “when necessary, states should…”. This is followed by a few possible self-imposed, not mandatory, limitations on the use of AWS, which are extremely vague. One is a suggestion to “limit the types of targets that the system can engage,” without any reference to the widely supported call to prohibit weapon systems that can target human beings. Meanwhile, when a state determines, through a national review process, that a weapon system’s deployment would be prohibited by international law, it is not required to refrain from developing or using the system; it is only encouraged to exchange best practices about it, bearing in mind national security or commercial proprietary concerns, of course.
As is par for the course at this point, the report also punts decision-making about the next GGE mandate to the CCW Meeting of High Contracting Parties, which will meet in November 2023. Inevitably, that will be another fraught discussion about how many days the GGE will meet (most states will call for at least twenty days, but it will probably be ten, because Russia won’t allow more), and about the content of the mandate (most states will call for the negotiation of a new CCW protocol prohibiting and restricting AWS, but a handful of heavily militarised states will object, and the mandate will end up looking more or less, or exactly, like this year’s). After more than a decade of these deliberations, the script writes itself.
Meanwhile, autonomous weapon development is continuing full steam ahead. In complete disregard for the UN deliberations, tech companies in several countries are developing weapon systems with autonomous or artificial intelligence features. One of many examples is Palantir’s Artificial Intelligence Platform (AIP), which the company is integrating into military decision-making systems. Palantir, already notorious for the use of its software in human rights abuses under contracts with the US Department of Homeland Security and US Immigration and Customs Enforcement (ICE), has announced that it is running “a ChatGPT-style chatbot to order drone reconnaissance, generate several plans of attack, and organize the jamming of enemy communications.” Reporting on this new technology, VICE notes, “While there is a ‘human in the loop’ in the AIP demo, they seem to do little more than ask the chatbot what to do and then approve its actions.”
While many delegations have engaged in the GGE in good faith for many years, it’s clear that a handful of governments (including Russia and the United States, each through different tactics) are actively stalling the creation of any restrictions on related technologies, buying time for their military industries and private tech firms to advance research and development of these systems past the point of no return. Among other things, these governments make the circular argument that existing international law is sufficient to address AWS, while also asserting that because existing international law does not mention autonomous weapons, there are no inherent prohibitions on these weapons. On this logic, no evolution of law is possible, even as the evolution of weapons is happening right now.
These governments are insisting upon the inevitability of AWS while creating the exact conditions for that inevitability. But autonomous weapons are not inevitable, no matter how badly the most militarised states in the world want them. What is inevitable is that such weapons, if deployed, will result in horrific human rights abuses and violations of international humanitarian law. States that want to prevent this dystopian future must act now. Staying inside the CCW for endless years of discussion is not just lacking in credibility; it is reckless.
While some might argue that those developing AWS must be involved in negotiating their prohibition, we know from years of practice that this is not how most disarmament has been achieved. Instead, it has been governments concerned with humanitarian risks and human suffering, working in partnership with civil society organisations and the International Committee of the Red Cross, that have brought new international law to fruition, which has in turn impacted the development, possession, and use of various weapon systems. A ban on autonomous weapons is needed now, and the last decade of discussions has set the stage for negotiations, with consensus—not unanimity, but actual consensus—building around a two-tier approach of prohibitions and restrictions.
Of course, it would be great to have all states participating in negotiations, and for no states to develop, possess, or use AWS. As this is so far not the case, it is also up to tech companies, and tech workers, to refuse to develop weapon systems with autonomous and artificial intelligence functions. Many companies and workers, as well as scientists and academics working in related fields, have already taken this stand. As governments struggle to create new international law, we need more of those tasked with building these systems to pledge not to do so.
The world does not need more violent technologies; it needs instead to devote human ingenuity and financial resources towards mitigating ecological crises and diminishing and preventing suffering and inequality. At this point, the GGE is like a chatbox full of error messages, or perhaps even a chatbot sending repetitive messages that get us nowhere. We can see this slow-motion catastrophe speeding up before our eyes as new applications for artificial intelligence and autonomous technologies come online every day. We don’t have much time left to prevent the weaponisation of these technologies. Bold and creative action is needed right now.