The IDF uses AI in military intelligence and telecommunications, which helps to improve its warning systems. AI is also used in operational learning and planning; it can reach conclusions which were previously impossible to reach through human effort alone, given the difficulties of handling and analysing vast amounts of data. 867

Apart from artificial intelligence, several aspects of the above-cited text might be found problematic in connection with the general concept of the protection of human rights. One of them is the absence of any mention of the minorities living in the State of Israel. This omission might raise questions as to whether such an approach is truly democratic. With respect to artificial intelligence, a concern can be raised as to whether these technologies operate fully in accordance with the prohibition of discrimination, a question with multiple aspects, relating not only to the wide content of this human right but also to the broad variety of the technologies involved. Several examples can be given to convey a general idea of the problem. These technologies can be applied differently in relation to Jews and Arabs. Antebi noted that “the AI system is only as good as the data it accepts. When the data used to train the machine is not sufficiently diverse, biases may arise.” 868 Yet even when “the data is perfect”, it still reflects social biases, such as gender and ethnic differences. This is potentially very dangerous in the use of military AWS.

This is exceptionally relevant for the IDF, given that Palestinian terrorists are often women and, not exceptionally, children (persons under the age of 18). The terrorists very often purposely wear no outer sign which would allow them to be distinguished from civilians. On top of that, terrorists from Hamas and Palestinian Islamic Jihad also often dress up as Jewish people, sometimes even wearing Israeli police uniforms. 869 Fully autonomous AI weapon systems might face the challenge of distinguishing between civilians and combatants, which is further complicated by the fact that Palestinian terrorists do not have the status of combatants but are rather armed civilians using weapons or suicide bombs to commit a terrorist attack. Semi-autonomous AI weapon systems might be somewhat less problematic insofar as they are operated by a human being who makes the final decision to activate the weapon and neutralise or injure the perpetrator. One way of eliminating the risk of a violation of the international law of armed conflict might be to stipulate that the developers of AI should consult their inventions and the results of testing with lawyers specialised in this area of law, which, after all, could be suggested with regard to many other types of weapons. 870

867 ‘Israel Defence Force Strategy Document’ (Harvard Kennedy School Belfer Centre for Science and International Affairs, 2015)