A war where the machine decides who to kill! (LAWs wars)

Which country wants to be attacked by an AI-controlled system with no one in command? Which country wants its soldiers to be killed by an autonomous machine, and potentially some civilians by mistake? The answer is evidently no one! No country wants that. But ask which countries intend to possess such weapons, and the answer is more ambiguous. The latest report of the Group of Governmental Experts (GGE) reflects this. After a week (25-29 July) of discussion at the Palais des Nations, UN Geneva, the adopted report is hollow, without meaningful conclusions or commitments.

Lethal autonomous weapons

Lethal autonomous weapons (LAWs) are military systems that can autonomously search for and engage targets based on programmed constraints and descriptions. LAWs are also known as killer robots.

Autonomous weapons have existed for many years; for example, land mines trigger on their own, killing or injuring without any human action. With emerging technology, including AI, we can understand the interest of certain states in building these technologies into weapons to improve their autonomy. Since the 70s, the US has used the Phalanx CIWS, which can autonomously identify and attack incoming missiles. With AI, its capacities are considerably increased! Continuing with the example of mines, Russia’s anti-personnel mines of the POM-3 type are particularly deadly. They are scattered across the area of operations but do not explode immediately. When activated, they rise into the air before exploding, with fragments that can be fatal within a radius of 16 meters. Equipped with sensors and software, they choose their targets, deciding whether or not to explode depending on the identity of the people or equipment that approach. There are, unfortunately, too many other such systems to list here. To conclude this part: in Libya in 2020, a Kargu 2 drone hunted down and attacked a human target, according to a report from the UN Security Council’s Panel of Experts on Libya published in March 2021. This may have been the first time an autonomous killer robot armed with lethal weaponry attacked human beings. [https://en.wikipedia.org/wiki/Military_robot]

We quickly grasp the potential ethical and legal issues. Autonomous systems can make mistakes; who is responsible then? Just as mines have killed millions of civilians, new systems may carry biases and kill indiscriminately, with no one to stop them. The range of potential problems is extensive.

A slowed-down convention

For nine years, the Convention on Certain Conventional Weapons (CCW), also known as the Inhumane Weapons Convention, has tried to regulate these weapons through its GGE. For the most ambitious states, the goal is a treaty, or another international instrument, that would prohibit any weapon from operating autonomously, i.e., without human supervision. Many Latin American and European states are now advocating for this outright ban. The answer is less clear-cut for other states, including the USA. They consent to the prohibition of specific weapon systems, as well as to a certain degree of regulation, but refuse a binding legal framework. Finally, Russia is slowing down all negotiations and watering down their content.

Russia and the game of consensus

A majority of states are now convinced of the need to act significantly, even asking for more days of debate in 2023. But the main problem is the rule of consensus, which blocks any breakthrough in the discussions.

Many small disagreements got in the way: delegations, for instance, wasted time discussing whether the CCW is an appropriate forum, or the only appropriate forum, for dealing with the issue of autonomous weapons.

These discussions even turned theatrical when Russia repeatedly attacked the presence of civil society, seeking to limit its intervention and participation in informal meetings. It was a tool to slow down the discussions by steering the debate toward organizational points. We can also fear that this Russian posture will spread to other GGEs. Meanwhile, some other states, such as Israel and India, stay discreet and do not oppose it. They likely use this situation to their advantage: Russia is doing all the work for them.

Therefore, with the refusal of a few states, all the details about elements and possible measures for an agreement on autonomous weapons were removed. All conclusions about what kinds of control are necessary, and about possible processes to achieve that control, were taken out. The final conclusions section merely outlines the types of proposals discussed, acknowledges ethical perspectives, and restates respect for international humanitarian law. It then confirms that states are responsible for wrongful acts in accordance with international law [link to report], so no new laws.

Not only are the conclusions disappointing, but so was the way the discussion was conducted, and the mandate for 2023 remains uncertain.

We cannot wait for the CCW; the problem is too urgent.

The slow process works to the advantage of countries using these technologies. The Russian POM-3 mines, for instance, have been used in Ukraine, according to Human Rights Watch. Development and deployment by Russia and other countries will continue as long as no agreement is reached. LAWs have to be outlawed! And the CCW no longer seems to be the right platform.

C.F. Da Silva Costa
M.S. in Strategic Protection of the Country’s System & Ph.D. in Physics. Represented Accord University, as civil society, at the 25-29 July Group of Governmental Experts on emerging technologies in the area of Lethal Autonomous Weapons Systems, UN, Geneva.