UN Human Rights Expert Calls For a Moratorium On Lethal Autonomous Robots
GENEVA (30 May 2013) – The United Nations Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, today called for a global pause in the development and deployment of lethal autonomous robots (LARs), to allow “serious and meaningful international engagement on this issue before we proceed to a world where machines are given the power to kill humans.”
“While drones still have a ‘human in the loop’ who takes the decision to use lethal force, LARs have on-board computers that decide who should be targeted,” he stressed.
“The possible introduction of LARs raises far-reaching concerns about the protection of life during war and peace,” Mr. Heyns noted during the presentation of his latest report* to the UN Human Rights Council. “If this is done, machines, and not humans, will take the decision on who lives or dies,” he said.
“This may make it easier for States to go to war, and it raises the question of whether LARs can be programmed to comply with the requirements of international humanitarian law, especially the distinction between combatants and civilians and the limits on collateral damage,” he explained.
“Beyond this, their deployment may be unacceptable because no adequate system of legal accountability can be devised for the actions of machines,” the independent human rights expert noted, based on his analysis of potential violations of the rights to life and human dignity should the use of LARs materialize.
In his report, Mr. Heyns urges the UN Human Rights Council to call on all States “to declare and implement national moratoria on the production, assembly, transfer, acquisition, deployment and use of LARs, until a framework on the future of LARs has been established.” He also invites the UN High Commissioner for Human Rights to convene or to work with other UN bodies to convene a High Level Panel on LARs to articulate this framework.
“War without reflection is mechanical slaughter,” the UN expert on summary executions said. “In the same way that the taking of any human life deserves as a minimum some deliberation, a decision to allow machines to be deployed to kill human beings deserves a collective pause worldwide.”
For the Special Rapporteur, the time is ripe for a thorough and cool-headed global reflection in order “to ensure that not only life itself but also the value of life and human dignity is protected in the long run.”
“If deployed, LARs will take humans ‘out of the loop,’” Mr. Heyns warned. In his view, “States find this technology attractive because human decision-making is often much slower than that of robots, and human thinking can be clouded by emotion.”
“At the same time, humans may in some cases, unlike robots, be able to act out of compassion or grace and can, based on their understanding of the bigger picture, know that a more lenient approach is called for in a specific situation,” he underscored.
The Special Rapporteur stressed that there is now an opportunity to pause collectively, and to engage with the risks posed by LARs in a proactive way, in contrast to other revolutions in military affairs, where serious reflection mostly began after the emergence of new methods of warfare. “The current moment may be the best we will have to address these concerns,” he said.
The new report provides specific recommendations to the UN system, regional and inter-governmental organizations, and States, as well as to developers of robotics systems, NGOs, civil society, human rights groups and the International Committee of the Red Cross.
(*) Check the full report: http://daccess-dds-ny.un.org/doc/UNDOC/GEN/G13/127/76/PDF/G1312776.pdf?OpenElement or http://ap.ohchr.org/documents/dpage_e.aspx?si=A/HRC/23/47
ENDS