Holy See Statement on Lethal Autonomous Weapons and Drones, November 14, 2013
Statement by H.E. Archbishop Silvano M. Tomasi, Permanent Representative of the Holy See to the United Nations and Other International Organizations in Geneva at the Annual Meeting of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (CCW)
Geneva, 14 November 2013
Mr. President,
The delegation of the Holy See wishes to express its satisfaction at your accession to the presidency of our meeting, and to thank you for the excellent work done in preparing our proceedings.
Mr. President,
Lethal autonomous weapons and drones, although distinct, share much the same humanitarian implications and raise several questions of grave ethical concern. Most critical is the inability of pre-programmed, automated technical systems to make moral judgments over life and death, to respect human rights, and to comply with the principle of humanity. These questions will grow in relevance and urgency as robotic technology continues to develop and to be deployed. With this concern in mind, I take the opportunity to express our support for your initiative, Mr. President, which envisions the adoption of a mandate to start thinking about these important and urgent matters. Indeed, advantage should be taken of all relevant contributions from all fields, particularly those of international humanitarian law and human rights law.
On this occasion, allow me to address the issue of drones and to propose some reflections related to the ethical dimension of these systems.
The use of weaponised drones in armed conflicts and other international hostile actions has increased exponentially in the last several years. Social, political, economic and military factors may have changed the equation for some decision-makers regarding the use of weaponised drones, but the ethical and humanitarian concerns remain relevant, and in fact have become more compelling as their use increases.
The development of weaponised drone technology and its more frequent military application represent a notable change in the conduct of hostile action. From a user’s standpoint, the ability to operate remotely, even from a computer halfway around the world, greatly reduces risks to the user’s own military personnel and extends strategic reach to the point of enabling the user to address perceived threats anywhere in the world.
Consideration must also be given, beyond international law and the law of war, to the humanitarian and ethical implications of the use of weaponised drones, as well as to other questions related to human rights law. Armed drones, like any other weapon, are and should always be subject to the rules and moral principles these juridical instruments impose.
It is difficult to assess the precise impact on civilians of the use of weaponised drones, due in part to the lack of transparency in reporting, but it is indisputable that large populations live in constant fear of their strikes. Credible sources report a high number of casualties among the civilian population. Thus, while the economics of drones may make budgetary sense, it is ethically imperative that those savings not be the only costs considered. Costs to civilian life and property, as well as the psychological and economic cost of living in constant fear of future mistaken strikes, should not be ignored.
Some additional pressing questions should worry the international community. When a weaponised drone is piloted from thousands of miles away, who bears the responsibility for humanitarian violations in its use? When vital data related to the use of weaponised drones is withheld from scrutiny, how can compliance with international law, international humanitarian law and ethical standards be verified?
Weaponised drones are useful precisely because they take a number of important functions out of the hands of human beings, increasing accuracy and decreasing risks to life and limb for military personnel. Yet the increasing involvement of a pre-programmed machine in several steps of the targeting and attacking process further blurs the question of who is accountable when something goes wrong. Clear accountability is essential to upholding the laws and norms of international humanitarian law.
Furthermore, it is essential to understand and lay out the criteria for identifying legitimate targets and distinguishing them from innocent civilians. The lack of military risk and the supposed accuracy of surveillance and targeting by weaponised drones may make operators and commanders more willing to execute strikes that carry greater risk to civilians: greater transparency and clearer accountability in their use are therefore critical.
Decisions over life and death are uniquely difficult, a heavy responsibility for any human being, and fraught with challenges. Yet they are decisions for which a person, capable of moral reasoning, is uniquely suited. An automated system, pre-programmed to respond to given data inputs, ultimately relies on its programming rather than on an innate capacity to tell right from wrong. Thus any trend toward greater automation of warfare should be treated with great caution. But even in the limited automation of “human-in-the-loop” drone systems, there lies the potential for removing the essential human component from the process. The human decision-makers involved should be trained, well informed, and given reasonable and sufficient time, so as to be in a position to make sound ethical decisions.
The emerging class of remote operators of robotic weapons systems such as drones has not necessarily been given such training or adequate time to deliberate as they make decisions on a screen that affect life and death thousands of kilometres away. This procedure has ethical implications for the civilian cost at the receiving end of the drones, but it also adversely affects the operator. One study showed that “nearly 30 percent of drone pilots experience what the military calls ‘burnout,’ defined by what the military describes … as ‘an existential crisis.’”
In this context of dehumanised warfare, with remotely operated weapons and low risk on one side, a key ethical question is whether this lowers the threshold for conflict, making it seem more attractive to enter into war. Considering this question alongside the near inevitability of massive civilian casualties in modern warfare should give pause.
A final ethical consideration to explore briefly is the threat of proliferation of sophisticated drone technology. The need to account for ethical considerations and set a strong precedent for restricting their use becomes much more urgent when considered in light of the ongoing and accelerating proliferation of these weapons around the world. Any precedent set by failing to account now for all humanitarian and ethical considerations in the use of drones becomes an increasing danger as drone technology proliferates further.
Mr. President,
As we enter this new era of technology in warfare, it is essential that all actors stop to consider all relevant questions related to the use of drones. Respect for life, respect for human rights and the avoidance of dehumanisation are our collective challenge.