
In particular, the latter group believe that there are factors that prevent, or make it extraordinarily difficult for, these autonomous weapons systems to heed both (A) the IHL standards that distinguish between targets that are lawful under this legal framework and those that are not, and (B) the standards that embody the principle of proportionality. The first set of standards, as is well known, seeks to protect the civilian population from indiscriminate attacks, while the second requires a prior assessment of the potential damage to the non-combatant or protected population, weighed against the military advantage that the attack is intended to achieve.

A) With regard to the principle of distinction, it has been argued that, owing to the technological inadequacy of currently available sensors, the inability to understand context, the difficulty of applying non-combatant status in practice (by means of a computer program) and the inability to interpret intentions and emotions, it would be extremely difficult for an autonomous robot to comply with the IHL provisions governing its own use, let alone to identify, in situations of asymmetric conflict, who is and who is not a combatant23.

Noel E. Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield (United Kingdom), is particularly categorical on the matter. This British expert considers that today's robots lack the main elements required to guarantee compliance with the principle of distinction:

• They have neither the sensors nor the vision systems needed to distinguish between combatants and civilians, particularly in asymmetric or assimilated conflicts, or to recognise combatants who are wounded, who have surrendered or who are in a mental state to which the principle of distinction applies.

• The vagueness of the legal definitions contained in the Geneva Conventions of 1949 and in Additional Protocol I of 1977 concerning international armed conflicts (terms such as "civilians", or necessary notions such as "common sense") makes it impossible to translate the essence of the principle of distinction into the programming language of a computer.

23  Vid. ad ex. SHARKEY, Noel E.: "Grounds for Discrimination: Autonomous Robot Weapons", RUSI Defence Systems, October 2008, pp. 86-89, at pp. 88-89 (http://www.rusi.org/downloads/assets/23sharkey.pdf; accessed 9 September 2013); ASARO, Peter: "On banning Autonomous Weapons Systems: Human rights, automation and the dehumanisation of lethal decision-making", International Review of the Red Cross, vol. 94 (2012), no. 886, pp. 687-709, at pp. 696 ff.; DINSTEIN, Yoram: "The Principle of Distinction and Cyber War in International Armed Conflicts", Journal of Conflict and Security Law, vol. 17 (2012), no. 2, pp. 261-277; and HUMAN RIGHTS WATCH: Losing humanity... cit., p. 31.

