CCW Report, Vol. 3, No. 2

by Ray Acheson via Reaching Critical Will 

The third UN meeting on lethal autonomous weapon systems (LAWS) opened on Monday morning with a general discussion by states. Once again, the majority of delegates taking the floor agreed that human beings must always be responsible for the use of force, in particular over decisions about life and death. Some states indicated their support for a multilateral instrument to prevent the development and deployment of LAWS that would operate without meaningful human control. A handful of delegations reiterated their well-known arguments against such an instrument. However, a reflection on their policies, consideration of the state of technological development, and recognition of the majority opinion in favour of retaining meaningful human control over individual attacks would seem to highlight problems with these positions rather than critical divisions amongst states within the CCW.

Arguments against action

The following are some of the well-rehearsed arguments heard on Monday against new law prohibiting the development and deployment of LAWS.

The technology is far away

Some states, such as Israel, Japan, Russia, Spain, and the United Kingdom, argue that LAWS are a possibility of the distant future and may never exist at all. Yet the United States has a list of existing weapon systems it considers beyond the remit of LAWS discussions, such as armed drones, the Patriot or Aegis missile defence systems, or torpedoes. The existence of such weapon systems indicates that the development of fully autonomous weapons is not so distant after all.

The UK is already investing in the development of a weapon system, the Taranis, which has included the testing of autonomous capabilities including target location and engagement. Israel operates the Harpy drone, which automatically detects, attacks, and destroys radar emitters. The US Phalanx system for Aegis cruisers automatically detects, tracks, and engages anti-ship missiles and aircraft. Further, as Sierra Leone noted, increasing the autonomy of existing systems could turn them into fully autonomous systems, warranting their inclusion in ongoing talks.

States have “no plans” to develop LAWS

Over the past few years a number of states have been vague in their orientation toward the possible development of LAWS. The US, for example, has a policy that neither encourages nor prohibits the development of LAWS and indicates it will review any applications to develop such technology. Japan says it “has no plans to develop robots out of the loop, which may be capable of committing murder.” Others have been more emphatic, such as the UK, which has declared that it will never deploy weapons without human control.

Yet even where states make such declarations, questions remain about their interpretation of human control. As explained by the UK-based NGO Article 36, “UK policy has not yet provided an explanation of what would constitute human control over weapons systems whilst at the same time suggesting a narrow and futuristic concept of LAWS that appears permissive towards the development of weapons systems that might have the capacity to operate without the necessary levels of human control.”

Existing law is adequate to regulate development and use of LAWS

Some states have suggested they believe all weapons should have meaningful human control yet do not support the development of new law in this direction. The Netherlands indicates it does not support the deployment of weapons without human control, but also does not support a moratorium on the development of specific technologies at this time. Turkey says it supports human control over weapons, but is hesitant about a preemptive prohibition of LAWS because they are “hypothetical”. Canada says it does not support banning LAWS, even while it acknowledges the challenges LAWS would pose to national-level weapon reviews such as those mandated by article 36 of the 1977 Additional Protocol I to the Geneva Conventions, particularly around testing of these systems.

A number of states and civil society actors have pointed out other potential problems with relying on article 36 reviews as a response to LAWS. For example, the NGO that takes its name from the legal provision requiring weapon reviews argues that given the global implications of LAWS, decisions about their development must not reside solely with the states considering their acquisition. In addition, “narrow interpretations and inconsistent outcomes across states … could lead to the introduction of unacceptable technologies.” Furthermore, the development of LAWS would represent “an unprecedented shift in human control over the use of force,” which raises ethical, political, and legal concerns that may go beyond specific weapon systems under review.

Existing law applies

An even less helpful variation of the argument that existing law is adequate to regulate LAWS is that existing international law applies to LAWS. To what weapon system would existing law not apply? Should we really be worried that entire weapons, means, or methods of warfare might somehow be unshackled from the law? If so, we have a bigger problem than how to deal with autonomous weapons.
