Facial Recognition Technology and the Death of an Iranian Nuclear Scientist: An International Humanitarian Law Perspective

On 27 November 2020, Mohsen Fakhrizadeh was killed while traveling in his car just east of Tehran. Fakhrizadeh is understood to have been Iran’s foremost nuclear scientist and to have led the country’s efforts to develop a nuclear bomb. The details of the attack remain in dispute, with Iran claiming that Israel was the perpetrator and Israel refusing to comment. Initially, the Iranian regime stated that there had been a gunfight between Fakhrizadeh’s bodyguards and several gunmen; indeed, local media reported that three or four of the gunmen had been killed at the scene.

However, on 7 December 2020, Iran changed its account and claimed that facial recognition technology was used to facilitate the killing via an unmanned, vehicle-mounted machine gun ‘equipped with an intelligent satellite system’ which zoomed in on Fakhrizadeh and shot him. Ali Shamkhani, secretary of the Supreme National Security Council, emphasized that ‘no individual was present at the site’ at the time. This new ‘high-tech’ account of events has not been independently verified, and it should be borne in mind that Iran may have a vested interest in suggesting that sophisticated technology was used in the strike. After all, Israel is one of the few States known to be developing technology with such capabilities. Nonetheless, it is possible that the Iranian account is correct and that this was the first instance of facial recognition technology-enabled weapons targeting. If so, what does international law – and in particular international humanitarian law (IHL) – have to say about such an attack?

If the Iranian account is true, IHL certainly governs the incident. One way to trigger the application of the regime is for there to be an ‘armed conflict’ between two or more States under Article 2 of Convention (I) for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field. According to the International Criminal Tribunal for the Former Yugoslavia in Tadic, this simply requires that ‘armed force’ be used by one State against another, with no express minimum intensity requirement. Given that Fakhrizadeh was shot, and given that he held the rank of Brigadier-General in the Iranian military, the incident amounted to an attack against Iran for the purposes of IHL. The regime regulates different categories of conflict, and the rules for this category, international armed conflict (IAC), are manifold and complex. However, the preeminent concept is ‘distinction’ – the notion that one must distinguish between civilians and combatants and direct attacks only against the latter.

The interface between facial recognition technology and this concept of distinction is a matter the author considered in a recent article. That article concerned ‘autonomous weapons’ – weapons that can be deployed to the battlefield and thereafter function without any further human intervention. Interestingly, the Iranian account leaves open the possibility that the weapon used to kill Fakhrizadeh was autonomous – although admittedly the terms ‘unmanned’ and ‘autonomous’ are not synonymous, and it may be that a human controlled the weapon, albeit remotely, much like a drone. Whatever technology was used, it was established in the article that distinction involves three principal stages: observation, recognition, and judgment.

In terms of observation abilities, machines are just as capable as – or indeed more capable than – humans. For example, the defense manufacturer Raytheon, working with Exyn Technologies, has developed ‘mapping autonomous drones’ that are able to fully perceive their surroundings. According to Airsoc, they can ‘operate in GPS-denied environments to map dense urban environments in 3-D [and] can dig deep to reveal tunnels, urban underground, and natural cave networks’. The technology works by using a combination of sensors, including cameras and lidar, ‘[which is] similar to radar, but using pulsed, infrared laser light’. The company boasts that the system collects 300,000 data points per second in order to map its environment and that it is sensitive enough to detect even dangling wires. In the commercial sector, the company Ascot Resources, which is considering exploitation of the long-abandoned Big Missouri Ridge mine, intends to explore uses of Exyn’s system because initial tests produced more complete and timely maps than human surveyors. In short, machines can ‘see’ well enough for the purposes of the observation stage of distinction.

Regarding recognition capabilities, facial recognition technology is advanced too. It is used as a convenient security feature for unlocking phones and laptops. It is also used in passports and by payment applications. Soon, we are likely to see it rolled out to enable targeted advertising in shopping centers, where characteristics such as age and gender are used to determine which advertisements are presented to which customers. In terms of more security-oriented scenarios, Israel has shown keen interest in the technology. Israeli firm AnyVision is at the forefront of development efforts, with suggestions that facial recognition ‘is used by the Israeli military at border crossing checkpoints, where it logs the faces of Palestinians crossing into Israel’ and that it is ‘secretly used … throughout the West Bank … to monitor the movement of Palestinian residents’. Of course, no human would be able to cross-match thousands of faces against a database with this degree of accuracy or speed, and thus machines have proven to be compatible with the recognition stage of distinction.
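The cross-matching described above can be illustrated in miniature. A typical recognition pipeline reduces each face to a numerical ‘embedding’ and compares it against a database of stored embeddings by similarity score. The sketch below is purely illustrative: the vectors, names, and threshold are invented, and real systems such as those discussed here are proprietary and far more sophisticated.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two feature vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.9):
    """Return the best-matching identity above the threshold, or None."""
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, best_score
    return None, best_score

# Toy database of pre-computed face embeddings (a real system holds thousands)
database = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.4],
}
probe = [0.88, 0.12, 0.21]  # embedding of the face observed by the camera
print(match_face(probe, database))  # best match: person_a
```

The point of the sketch is the scale advantage: the same loop runs unchanged over thousands of stored embeddings at machine speed, which is precisely the cross-matching feat no human could replicate.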

Regarding judgment, however, the position is different. Judgment requires an appreciation of the impact that context can have on the status of persons and objects. This is particularly true in armed conflicts, where an individual can shift between being ‘targetable’ and ‘untargetable’. For example, an otherwise targetable soldier may become hors de combat (out of action) because ‘he is in the power of an adverse Party’, ‘he clearly expresses an intention to surrender’ or ‘he has been rendered unconscious or is otherwise incapacitated by wounds or sickness’. The trouble is that machines struggle with contextual considerations of this sort. For example, the Danish agricultural technology company ‘Agrointelli’ develops new systems to make arable farming more profitable. One of its research projects, ‘RoboWeedMaPS’, uses artificial intelligence-powered machines to remove weeds while leaving crops undisturbed. Although the system has been equipped to ‘observe’ its environment and trained in detail to ‘recognize’ different weed types, it still struggles to ‘judge’ what is a plant and what is a weed – i.e., to distinguish one from the other. As a senior researcher on the project noted, ‘it only takes a small beetle to eat a leaf and the plant doesn’t look like the one in the image at all [or] the stems can be so thin that … it looks as though the leaves aren’t connected [or] if it’s cold in spring, some weeds turn completely purple even though they’re normally green’. Similar contextual difficulties arise when machine judgment is used for other purposes, such as computer gaming and healthcare. The same problems would likely be even more pervasive in a setting as dynamic as a battlefield. For this reason, autonomous weapons fall at the last hurdle – they are not currently compatible with the judgment stage of distinction, and so humans must remain responsible for weapons targeting.
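The fragility the Agrointelli researcher describes has a simple mechanical explanation: a recognizer that compares inputs against stored templates with a fixed threshold fails as soon as context perturbs the input, even though the underlying object is unchanged. The toy sketch below makes the point; the feature vectors, the ‘contextual shift’, and the threshold are all invented for illustration and stand in for far richer real-world features.

```python
import math

def similarity(a, b):
    # Cosine similarity between a stored template and an observed feature vector
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

TEMPLATE = [0.8, 0.5, 0.3]   # a 'weed' as it appeared in training images
THRESHOLD = 0.95             # fixed recognition threshold

# The same weed under normal conditions: recognized
normal = [0.78, 0.52, 0.31]
print(similarity(TEMPLATE, normal) >= THRESHOLD)   # True

# The same weed after a contextual shift (e.g. a cold spring turns it
# purple, moving its color features): no longer recognized
shifted = [0.3, 0.5, 0.8]
print(similarity(TEMPLATE, shifted) >= THRESHOLD)  # False
```

The analogy to distinction is direct: a soldier who surrenders or is wounded is still the same person, but the contextual shift should change the system’s conclusion – and a template-matching approach has no mechanism for registering that the shift matters.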

Returning to Fakhrizadeh, the result of the above analysis is that the attack against him was probably compatible with distinction if facial recognition technology was used solely to assist a human operator in target selection. This is because the human combatant would have been required to approve the attack or, indeed, to cancel it if Fakhrizadeh became hors de combat by virtue of capture, surrender, or being wounded. The human operator would have had the benefit of the heightened accuracy and speed afforded by the system – indeed, no shots appear to have struck the scientist’s wife, who was sitting only inches away from him – while retaining control of the attack. On the other hand, if facial recognition technology was used in a system that acted autonomously – without human control after deployment – then the strike was probably not compatible with distinction, as the system would not have been capable of detecting possible changes in the target’s status owing to contextual shifts.

Again, the facts surrounding Fakhrizadeh’s death are not clear and will likely remain shrouded in mystery. Israel may not have been involved. The weapon, or weapons, used to kill Fakhrizadeh may have been utterly conventional. The law, too, is complex: it has more facets than distinction alone, and other rules or principles may have been violated in this attack. Nonetheless, it should be of at least some reassurance that IHL has answers for situations in which facial recognition technology is used to facilitate weapons targeting.


Elliot Winter is a Lecturer at Newcastle Law School in the UK specializing in the Law of Armed Conflict and a recent visiting lecturer at the University of Pittsburgh School of Law.


Suggested Citation: Elliot Winter, Facial Recognition Technology and the Death of an Iranian Nuclear Scientist: An International Humanitarian Law Perspective, JURIST – Academic Commentary, December 21, 2020, https://www.jurist.org/commentary/2020/12/elliot-winter-iran-facial-recognition-ihl/.

This article was prepared for publication by Vishwajeet Deshmukh, a JURIST staff editor. Please direct any questions or comments to him at commentary@jurist.org.

Opinions expressed in JURIST Commentary are the sole responsibility of the author and do not necessarily reflect the views of JURIST's editors, staff, donors or the University of Pittsburgh.