Prohibiting autonomous weapons systems

Campaigners against 'killer robots' Noel Sharkey (International Committee on Robot Arms Control), Jody Williams (Nobel Women’s Initiative) and Steve Goose (Human Rights Watch)—plus an unwelcome guest. Flickr / Campaign to Stop Killer Robots. Some rights reserved.

With states continuing to research and develop relevant weapons systems, international action is needed to prevent the emergence of fully autonomous weapons: systems which, once activated and their mission defined, could select people or objects to attack without any further human intervention.

Last week, 89 countries participated in a second informal meeting of experts on ‘lethal autonomous weapons systems’, within the framework of the United Nations’ Convention on Certain Conventional Weapons (CCW). International organisations, NGOs and academics, including roboticists and ethicists, contributed. This issue has now been on the international agenda for two years, in part due to the Campaign to Stop Killer Robots, which is calling for a treaty to ban fully autonomous weapons.

Despite most states rejecting weapon systems that operate without human control, a decisive and co-ordinated response has not been forthcoming. Discussions must now advance towards an international prohibition, before the speed of technological development overtakes diplomatic processes.

Core issue

Whether used in an armed conflict or law-enforcement context, fully autonomous weapons would fundamentally challenge the relationship between human beings and the application of violent force. The core issue remains whether the act of using potentially lethal force should ever be delegated to hardware and software systems, which cannot hold responsibility.

This fundamentally dehumanising possibility was firmly rejected as unacceptable by a large number of states and civil-society organisations last week in Geneva. Experts, including the UN special rapporteur on extrajudicial, summary or arbitrary executions, highlighted the affront to human dignity it would represent.

The discussions, chaired by Germany, were mandated to examine autonomous weapons systems in the context of the purpose of the CCW, which is to ban or restrict the use in armed conflict of specific types of weapons that “trouble the conscience of humanity”. The treaty specifically recognises the need to “continue the codification and progressive development of the rules of international law”, since existing rules will not always be adequate in a changing world.

Since the latest discussions were informal, the only official outcome will be a summary report to the next meeting of the 120 states party to the CCW, in November. States will then decide how to proceed, including whether to negotiate a new protocol on autonomous weapons systems. This makes the next few months crucial for galvanising commitment to action.

Little information

Autonomous weapons systems were discussed at the Human Rights Council in 2013 and at previous meetings in the framework of the CCW in 2014. Some states have presented policy positions to these forums, but little information is available on what weapons technology is being developed, and few states have elaborated comprehensive policies on autonomous weapons systems. States with high-tech militaries, such as China, Israel, Russia, South Korea, the UK and the US, are investing in autonomy for various functions of weapons systems. But no state has explicitly stated that it wishes to obtain ‘lethal autonomous weapons systems’ as discussed at the CCW meeting, or that it is actively pursuing them.

Many states appear still to be determining their position, or considering what they should reveal at this stage, including how they should align with others with whom their interests may coincide but who are not their normal allies. Attendance at the discussions was high, and there was also considerable interest from the media and national parliaments.

A number of state and civil-society delegates, however, noted a serious under-representation of delegations and statements from countries of the global south, which were also less likely to have brought experts from their capitals, given the lack of a sponsorship programme associated with the CCW. Perspectives from potential developer countries thus risked being over-represented.

Prohibition

Only Israel and the US indicated last week that they felt the door should be left open to the development of these weapons. A number of states, including France, Japan and the UK, wished to make clear they had no intention of developing them, though France and the UK do not support their prohibition. No state has put in place a national moratorium, as recommended by the UN special rapporteur to the Human Rights Council.

Those who support a ban include Bolivia, Cuba, Ecuador, Egypt, Ghana, the Holy See, Pakistan and Palestine. Others said prohibition or new legal regulations might have merit, without making firm commitments. A clear, co-ordinated leadership group for a ban has not yet emerged.

Given that there appears to be a consensus within the CCW context that the development of ‘lethal autonomous weapons systems’ would be undesirable, the main point of contention is what the international response should be. A range of critical ethical, technical, legal, strategic and regulatory concerns were discussed over the five days of meetings in Geneva, with different options proposed for next steps as a result.

Measures to increase transparency and encourage national legal reviews of weapons were proposed by some states, led by the US, to deal with the concerns raised by autonomous weapons systems. A better understanding of technologies under development would certainly assist international discussions by making certain issues more concrete. Reviewing the legality of the technologies of violence states could deploy is also an important obligation (originating in Additional Protocol I to the Geneva Conventions and a customary obligation on states not party). But these are procedures which states should be undertaking in any case, and they are insufficient as a specific response to autonomous weapons systems.

National-level weapons reviews cannot give a global answer to the ethical questions raised by such systems. Autonomous weapons would represent an unprecedented development, which could not have been foreseen when current laws were made. Concerted consideration by the international community is therefore required. Weapons reviews are conducted by individual states and their results are not made open to scrutiny. They also examine only strictly legal questions relating to the conduct of armed conflict, yet autonomous weapons systems should not be subject only to legal reasoning, given the fundamental moral issues raised.

Meaningful human control

Members of civil society and some others have advanced, to structure this debate, the principle of requiring meaningful human control over every individual attack. Certain weapons systems would be prohibited as a result. In Geneva this concept was aired and debated more than ever before, in both states’ statements and expert-panel discussions. Numerous states asserted that work on autonomous weapons should include a focus on this issue, and on reaching consensus on where the acceptable limits of meaningful human control lie.

This offers the most promising avenue towards a collective international response on autonomous weapons systems. More in-depth discussion is needed, proceeding from how states ensure sufficient control over existing weapons systems. This will help to operationalise the principle, and can help lay the foundations for a legal instrument prohibiting systems that do not comply.

A small number of states doubted the utility of discussing meaningful human control, pointing to the lack of an accepted definition, or referred to the issue using slightly different formulations. This represents a stalling tactic. Rather than posing a problem, the ambiguity gives states an opportunity to build a common concept to deal with autonomous weapons systems, just as the concept of ‘unacceptable harm’ helped frame discussion on the prohibition of cluster munitions.

A formal process of work to prohibit fully autonomous weapons should be set in train. This should home in on elaborating and agreeing the key elements of ‘meaningful human control’. The 2016 review conference of the CCW will no doubt be a significant marker. Sri Lanka, which will chair the next meeting of states party to the CCW, has already pointed to the precedent of the pre-emptive prohibition of blinding laser weapons by a CCW protocol. Given the relevance of autonomous weapons systems to law enforcement as well as armed conflict, they should also be considered in the Human Rights Council, as a number of states have pointed out.

Some states argued in Geneva that both the technology and the debate on autonomous weapons systems were in their early stages. The urgency of swift consideration was highlighted by roboticists, who pointed out that the debate concerns not only advanced artificial intelligence decades away but also systems using sensor technology which could be built now or in the near future. If the consensus approach associated with the CCW impedes swift action, the process to prohibit fully autonomous weapons will need to be taken forward elsewhere.
