Addressing Constitutional Challenges in Use of Facial Recognition Technology by Indian Law Enforcement Agencies

Facial Recognition Technology (FRT) has become a buzzword globally and within India, in debates centering on AI ethics and high-tech surveillance. FRT is an umbrella term for any system that performs 1:1 verification, or 1:n monitoring and identification, of individuals using facial mapping and other sensitive biometric data. While the technology can conceptually serve an array of use cases, its growing footprint in law enforcement and surveillance has spurred vociferous contention. The arguments center on violations of constitutional and legal rights, technological flaws leading to inaccurate outcomes, and regulatory and governance concerns.

In India, numerous states have been pursuing FRT in different public sector applications. Almost twenty states have some form of FRT system deployed by their local police forces. There are several concerns with this unregulated and non-transparent bolstering of the state surveillance apparatus. The most obvious is individual privacy. Since the Supreme Court of India’s landmark 2017 Puttaswamy judgment, informational privacy has been read as part of the fundamental right to life and liberty guaranteed by the Indian Constitution. This entails an individual’s autonomy over when her personal information may be collected, for what purpose, and how it is processed or shared. However, the fundamental design of FRT violates this informational autonomy: FRT involves sophisticated algorithms trained on extensive datasets, and in its application it is interfaced with databases held by the government. In both instances, the data principal, whether she is being monitored or her biometric data forms part of such datasets, loses control over that information.

While recognizing the right to privacy, the Supreme Court also held that it is not an absolute right, carving out an exception for matters of public order and national security. However, the Court created this exemption with checks and balances, rather than affording the State carte blanche. By laying down a three-pronged test of legality, legitimate aim, and proportionality, the Supreme Court ensured that arbitrary acts of surveillance remain open to challenge.

Beyond privacy, there is the concern of due process, which is a sine qua non for any deprivation of an individual’s life or liberty. Any curtailment of the Article 21 protections of the Indian Constitution, which now encompass the right to privacy, must be backed by a procedure established by law. The objective is to thwart arbitrary executive overreach. The unregulated use of FRT by Indian law enforcement contravenes this very tenet of due process.

FRT also has risks and flaws in its inherent design that can yield inaccurate outcomes. A seminal study by researchers in the United States found that FRT algorithms tend to be biased and inaccurate in identifying people with darker skin, especially women. Accuracy, and how an algorithm arrives at its result, is a concern running through all AI ethics discourse. What exacerbates the issue in FRT’s case is the consequence: someone wrongfully arrested, charged, or even prosecuted. Such outcomes are not hypothetical; instances of wrongful arrests have already been reported. Our own study showed that when FRT is introduced into a policing system that is biased against certain groups of people, it is likely to exacerbate those biases. We found that the use of FRT by Delhi Police would disproportionately affect Muslims, because Muslim areas were over-represented among the relatively over-policed areas of Delhi. When AI errors meet non-AI biases, the two interact to compound harm for people at risk.

This context explains why a recent challenge in an Indian court is a watershed moment. In December 2021, a civil rights activist residing in Hyderabad filed a public interest writ petition in the High Court of Telangana. The petition challenges the unregulated and arbitrary use of FRT by the Hyderabad city police, and by other public authorities and entities across the state. It is the first constitutional challenge in India to the use of FRT, and it is likely to shape important principles for the responsible use of AI in the country.

The primary contention in the petition hinges on the Puttaswamy judgment and its three-pronged test for determining when the State may encroach upon the right to privacy for national security or law enforcement purposes. The petition flags the lack of legislative backing, and the absence of any regulatory norms, as serious red flags in the ongoing application of FRT in the state. It also cites the design flaws of FRT, and its potentially inaccurate outcomes, as violative of the right to equality under Article 14 of the Constitution.

The petition does, however, fail to raise two significant issues. First, the ongoing application of FRT in law enforcement has implications for the larger criminal justice system. Even assuming the accuracy and constitutionality of such technologies, it is unclear how inputs they provide in a live investigation or surveillance exercise can be used as acceptable evidence in a court of law. Electronic evidence is governed by two provisions of the Indian Evidence Act, 1872, namely sections 65A and 65B. However, these provisions are limited to deeming “electronic records” admissible in law, as proxies in lieu of physical documents. FRT is an algorithmic technology that is more sophisticated than, and distinct from, electronic records. This being the case, it is arguable that the current iteration of the Evidence Act fails to address how emerging technologies will be inducted into the criminal justice system, and what precisely their evidentiary value is.

Second, FRT applications contravene India’s commitment to deploying responsible and safe AI. Crucial to responsible AI is the idea of trust, yet the opaque manner in which FRT is currently deployed by Indian law enforcement runs counter to this objective. Though this is not a constitutional argument, it makes a compelling point: the use of FRT needs to align with the larger vision and policy guiding AI deployment in India.

The relief prayed for in the petition, that the use of FRT be declared unconstitutional, is contingent on the absence of any governing statute. The High Court has an unprecedented opportunity to give substantial teeth to the principles enunciated by the Supreme Court in Puttaswamy, and to reinforce ideas of responsible AI usage. Though the case is a constitutional challenge before a single state High Court, its outcome is likely to serve as a benchmark for similar applications of FRT across Indian states. It also presents an opportune moment for the state government to consider meaningful legislative reform to regulate the use of FRT within Telangana. However the case culminates, it brings an overdue debate on state surveillance, and on reforming it, to the forefront.


Ameen Jauhar & Jai Vipra are senior resident fellows at the Vidhi Centre for Legal Policy. Ameen leads the Centre for Applied Law & Technology Research (ALTR) with a focus on law and policy issues around data governance, internet and cyber regulation, and AI ethics in India. Jai works with the ALTR team with a focus on digitalisation and development in the Global South.


Suggested citation: Ameen Jauhar and Jai Vipra, Addressing Constitutional Challenges in Use of Facial Recognition Technology by Indian Law Enforcement Agencies, JURIST – Professional Commentary, February 11, 2022,
https://www.jurist.org/commentary/2022/02/jauhar-vipra-FRT-constitutional-challenges-law-enforcement/.



This article was prepared for publication by Ananaya Agrawal, JURIST Tech (Beat) editor. Please direct any questions or comments to her at commentary@jurist.org.


Opinions expressed in JURIST Commentary are the sole responsibility of the author and do not necessarily reflect the views of JURIST's editors, staff, donors or the University of Pittsburgh.