Watch the Watchmen Series Part 4: The National Automated Facial Recognition System

In the fourth part of our "Watch the Watchmen" series, we examine the National Automated Facial Recognition System (AFRS), a project to develop and use a national database of photographs in conjunction with a facial recognition technology system, to be used by Central and State security agencies to swiftly identify criminals.

07 October, 2020
9 min read

tl;dr

The National Automated Facial Recognition System (AFRS) is being developed by the National Crime Records Bureau (NCRB) under the Ministry of Home Affairs. The project aims to develop and use a national database of photographs which is to be used in conjunction with a facial recognition technology system by Central and State security agencies.

History

The NCRB first released the Request for Proposals (RFP) (Document Reference: 02/001) calling for bids for the creation of AFRS on June 28, 2019. Initially, the deadline for submission of bids was August 16, 2019 (read our first post on the subject here). However, the deadline was extended multiple times for reported administrative reasons. Finally, the RFP was recalled and cancelled, and a revised version [Document Reference: 02/001 (Revised)] was issued on June 22, 2020.

The current deadline for the submission of bids is October 8, 2020.

The estimated budget of the project is INR 308 crore.

What is AFRS?

The RFP invites bids for the creation of a National Automated Facial Recognition System (AFRS) along with a national database of photographs. According to the RFP, this database is to be used to swiftly identify criminals by gathering existing data from various other databases, such as:

  1. Passport database under the Ministry of External Affairs,
  2. Crime and Criminal Tracking Network and Systems (CCTNS) by the National Crime Records Bureau (NCRB) under the Ministry of Home Affairs (MHA),
  3. Interoperable Criminal Justice System (ICJS) by the NCRB under the MHA,
  4. Women and Child Development Ministry's KhoyaPaya Portal,
  5. Automated Fingerprint Identification System (AFIS) by the NCRB under the MHA, and
  6. Any other image database available with police/other entities.

To identify criminals, scene of crime (SOC) images and videos will be matched with the abovementioned databases by using facial recognition technology. Face recognition systems use computer algorithms to pick out specific, distinctive details about a person’s face. These details, such as distance between the eyes or shape of the chin, are then converted into a mathematical representation and compared to data on other faces collected in a face recognition database. The data about a particular face is often called a face template and is distinct from a photograph because it’s designed to only include certain details that can be used to distinguish one face from another.
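To make the idea of a face template concrete, here is a minimal sketch in Python. It is purely illustrative and does not reflect AFRS's actual implementation: the `extract_template` function is a hypothetical stand-in for the deep-learning model a real system would use, and the match threshold is arbitrary.

```python
import numpy as np

TEMPLATE_SIZE = 128  # face templates are typically fixed-length vectors

def extract_template(face_image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained model that distils a face
    image into a fixed-length numerical template. Real systems use deep
    neural networks here; we derive a deterministic vector from the
    image bytes purely so the example runs."""
    seed = int.from_bytes(face_image.tobytes()[:4], "little")
    rng = np.random.default_rng(seed)
    vec = rng.standard_normal(TEMPLATE_SIZE)
    return vec / np.linalg.norm(vec)  # L2-normalise for comparable distances

def template_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between templates: smaller means more similar."""
    return float(np.linalg.norm(a - b))

# Two stand-in 'photographs' of different people
photo_a = np.zeros((64, 64), dtype=np.uint8)
photo_b = np.full((64, 64), 255, dtype=np.uint8)

d = template_distance(extract_template(photo_a), extract_template(photo_b))
print(f"distance = {d:.3f} -> {'match' if d < 1.1 else 'no match'}")
```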

What could go wrong?

Identification happens when FRT is used to pick out an individual from a pool of many people. This kind of 1:many FRT-based identification is typically used for security and surveillance purposes. Verification, on the other hand, occurs when an image is compared to another image from an existing database in order to verify that the individual is who he/she claims to be, i.e., authentication of identity on a 1:1 basis.
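The two modes can be sketched in code as well. In this illustrative example, templates are assumed to be compared by Euclidean distance and the threshold is arbitrary; nothing in the RFP specifies how AFRS would implement either operation.

```python
import numpy as np

THRESHOLD = 1.0  # arbitrary for illustration; real systems tune this

def verify(probe, enrolled):
    """1:1 verification: is the probe the person they claim to be?"""
    return float(np.linalg.norm(probe - enrolled)) < THRESHOLD

def identify(probe, gallery):
    """1:many identification: who in the gallery, if anyone, is this?
    Returns the closest enrolled identity under the threshold, else None."""
    best_id, best_dist = None, THRESHOLD
    for person_id, template in gallery.items():
        dist = float(np.linalg.norm(probe - template))
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id

# Toy gallery of enrolled templates, plus a noisy re-capture of person_1
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.standard_normal(128) for i in range(3)}
probe = gallery["person_1"] + rng.normal(0, 0.05, 128)

print(verify(probe, gallery["person_1"]))  # True  (1:1 check passes)
print(identify(probe, gallery))            # person_1 (1:N search)
```

Note the asymmetry: a 1:1 check makes a single comparison against a claimed identity, while a 1:many search compares the probe against every enrolled person, which is why identification is the mode suited to surveillance.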

Implementation of a Faulty FRT System

Claims relating to accuracy of FRT systems are routinely exaggerated and the real numbers leave much to be desired. The implementation of such faulty FRT systems would lead to high rates of false positives and false negatives in this recognition process.

  1. A false positive occurs when FRT gives an incorrect positive result wherein the system misidentifies an individual as someone he/she is not. This may lead to discrimination and the strengthening of existing biases. For instance, the ACLU found that Amazon's facial recognition software "Rekognition" incorrectly matched 28 members of the US Congress with people who had been arrested for committing a crime. Of the members who were incorrectly matched, 40% were people of color, a disproportionate number since people of color make up only 20% of the US Congress.
  2. A false negative occurs when the FRT system gives an incorrect negative result wherein it is unable to recognize an individual or authenticate his/her identity. This may lead to the arbitrary exclusion of an individual from government schemes and benefits. Failure of biometric-based authentication under Aadhaar has already excluded many people from receiving essential government services and has even led to starvation deaths. Such problems will be further aggravated by an inaccurate FRT system. (A sketch after this list shows how these two error rates trade off against each other.)
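As flagged above, the sketch below shows, with made-up numbers, how these two error rates are computed and why they pull in opposite directions as the match threshold moves. For real figures, NIST's published FRVT results are the place to look.

```python
import numpy as np

# Made-up ground truth: 1 = the pair really is the same person, 0 = not
same_person = np.array([1, 1, 1, 0, 0, 0, 0, 1])
# Made-up matcher similarity scores for each pair (higher = more alike)
scores = np.array([0.9, 0.8, 0.4, 0.7, 0.2, 0.1, 0.3, 0.95])

def error_rates(threshold):
    """False positive and false negative rates at a given threshold."""
    predicted_match = scores >= threshold
    fp = np.sum(predicted_match & (same_person == 0))   # wrongly matched
    fn = np.sum(~predicted_match & (same_person == 1))  # wrongly missed
    return fp / np.sum(same_person == 0), fn / np.sum(same_person == 1)

for t in (0.25, 0.5, 0.75):
    fpr, fnr = error_rates(t)
    print(f"threshold={t:.2f}  FPR={fpr:.2f}  FNR={fnr:.2f}")
```

Raising the threshold drives false positives down but pushes false negatives up, and vice versa: any real deployment must pick a point on this trade-off, and both kinds of error fall on real people.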

The AFRS is being developed and deployed by the government without any technical standards in place, which may lead to faulty systems being implemented. Once in place, such a system would be very difficult to reconcile with future technical standards, and damage like discrimination and exclusion would be impossible to undo. The revised RFP makes no mention of the international standards which the original RFP required compliance with as a technical requirement. The reason behind the exclusion of these standards is unclear and raises the question of why necessary technical standards are being diluted.

Implementation of an Accurate FRT System

While there have been claims of fully accurate FRT systems, none of these claims have been corroborated by independent review and audit. The National Institute of Standards and Technology (NIST) has extensively tested FRT systems for 1:1 verification and 1:many identification, and how the accuracy of these systems varies across demographic groups. These independent studies have concluded that, currently, no FRT system has 100% accuracy.

An accurate FRT system would hypothetically have a 100% success rate in 1:1 verification and/or 1:many identification. However, such a system would come with its own ominous implications, the most problematic of which may be state-led mass surveillance and the difficulty outside actors would face in challenging government decisions.

Probe images for FRT systems are often collected by the police through CCTV cameras installed in public spaces. Individuals in a CCTV-surveilled area may be aware that they are under surveillance, but the assumption is that this surveillance is temporary. Use of CCTV in conjunction with FRT would mean their images are stored for a longer period of time, if not permanently. This data would also be used to extract particular data points, such as facial features and other biometrics, which the individual has not consented to sharing by entering a CCTV-surveilled zone, and these data points can be used to track the person's future movements. Therefore, integrating FRT with a network of CCTV cameras would make real-time surveillance extremely easy.
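To see why this combination is so potent, consider a deliberately simplified, hypothetical sketch of what becomes possible once per-frame face matching exists across a camera network. Every identifier and structure below is invented for illustration; the RFP describes none of AFRS's internals.

```python
from dataclasses import dataclass

@dataclass
class Sighting:
    camera_id: str  # which camera produced the matched frame
    timestamp: str  # when the frame was captured
    person_id: str  # identity assigned by the face matcher

# Invented data: frames the matcher has already tagged with identities
sightings = [
    Sighting("cam_market", "09:02", "person_42"),
    Sighting("cam_metro",  "09:31", "person_42"),
    Sighting("cam_market", "09:40", "person_17"),
    Sighting("cam_office", "10:05", "person_42"),
]

def movement_trail(person_id, sightings):
    """One person's sightings in time order: a location history."""
    hits = [s for s in sightings if s.person_id == person_id]
    return sorted(hits, key=lambda s: s.timestamp)

for s in movement_trail("person_42", sightings):
    print(f"{s.timestamp}: seen at {s.camera_id}")
```

Once matching is automated, assembling a person's citywide location history reduces to the trivial query above, which is what turns a camera network into a real-time tracking system.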

One of the most important changes made in the revised RFP is that it now states that the project "does not involve the installation of CCTV camera nor will it connect to any existing CCTV camera anywhere". This is a departure from the original RFP, wherein CCTV integration had been included as a functional requirement.

However, the revised RFP introduces a new data source, "Scene of Crime images/videos", as input into the AFRS database. This inclusion is at odds with the RFP's own assertion that integration of CCTVs will not take place. Deletion of CCTV camera footage as a data source leaves a gap in the functional architecture of AFRS, and the RFP fails to satisfactorily account for its replacement. The RFP also fails to mention how the data and subsequent analysis/information obtained through AFRS will be presented and utilized in a court of law, i.e., the nature of the evidence obtained from AFRS and its admissibility as it pertains to its reliability in courts.

Deployment of AFRS without adequate legal safeguards is deeply troubling. Specific laws regarding FRT and personal data protection do not currently exist in India. While a flawed Personal Data Protection Bill has been introduced in Parliament (read more here), there is uncertainty about when it will be enacted and implemented and how effective it will be in protecting the personal data of individuals. Under the current version of the Bill, wide exceptions have been provided to the Government for surveillance-related activities. Strong data protection legislation is needed to hold these FRT systems accountable for the collection, storage and usage of data, including the sharing of data across government agencies and with third parties.

The implementation of an accurate FRT system would also violate fundamental rights by facilitating mass surveillance. For instance, there would be a chilling effect on the right to freedom of speech and expression because people would be wary of being prosecuted if they express anti-government sentiments. Further, the right to freedom of movement would be hampered, as mass surveillance would allow the government to track the movements of individuals in real time across the country. Finally, the right to privacy would be violated, as sensitive personal data collected by these FRT systems would be used by the Government without the informed consent of the individual. This would also prevent individuals from exercising the liberty to share their information in some contexts and remain anonymous in others, according to their own choice.

As per the Hon’ble Supreme Court's decision in Justice K.S. Puttaswamy vs Union of India, (2017) 10 SCC 1, any justifiable intrusion by the State into people's right to privacy protected under Article 21 of the Constitution must conform to certain thresholds. These thresholds are:

A. Legality

Where the intrusion must take place under a defined regime of law, i.e., there must be an anchoring legislation with a clear set of provisions. As pointed out in our previous legal notices as well, there is no anchoring legislation which allows for and regulates the use of AFRS. In the absence of such a framework and safeguards, the first requirement for a lawful restriction on the right to privacy is not met.

B. Necessity

Which requires that the restriction on people's privacy (in this case, data collection and sharing) is needed in a democratic society to fulfill a legitimate state aim. The RFP states that the need for AFRS arises because it will enable automatic identification and verification through criminal databases, which would help in the investigation of crime and the tracking and detection of criminals.

This characterisation is based on the faulty assumption that facial recognition technology is accurate and would provide speedy and correct results. However, ongoing research in the field has shown that completely accurate facial recognition technology has not yet been developed. Use of such inaccurate technology, especially in criminal prosecution, could thus result in a false positive, i.e., the misidentification of an innocent individual as a suspect in a crime. Thus, AFRS fails to meet the requirement of necessity as laid down by the Supreme Court in the Puttaswamy judgment.

C. Proportionality

Where the Government must show, among other things, that the measure being undertaken has a rational nexus with the objective. The AFRS contemplates collecting sensitive, intimate personal information of all individuals, in the absence of any reasonable suspicion, by collecting images and videos from a scene of crime. This could cast a presumption of criminality on a broad set of people. In K.S. Puttaswamy v. Union of India, (2019) 1 SCC 1, the Aadhaar judgment, the Hon’ble Supreme Court held that:

“[u]nder the garb of prevention of money laundering or black money, there cannot be such a sweeping provision which targets every resident of the country as a suspicious person”

While this statement was made in the context of rejecting the mandatory linkage of Aadhaar with bank accounts to counter money laundering, it clearly shows that imposition of such a restriction on the entire population, without any evidence of wrongdoing on their part, would constitute a disproportionate response. Similarly, collecting sensitive personal information of all individuals who were present at the scene of crime creates a presumption of criminality which is disproportionate to the objective it aims to achieve.

D. Procedural safeguards

Where there is an appropriate independent institutional mechanism, with in-built procedural safeguards aligned with standards of procedure established by law which are just, fair and reasonable, to prevent abuse. At present, the RFP provides for no independent institutional mechanism or procedural safeguards to regulate the proposed project. In the absence of any checks and balances, function creep becomes an immediate concern, with AFRS liable to be used for functions beyond its stated purpose. Use of AFRS without safeguards could result in illegal state-sponsored mass surveillance, which would have a chilling effect on fundamental rights such as the freedom of expression, freedom of movement and freedom of association guaranteed in the Constitution. Fear of identification and retaliation by the state would deter individuals from exercising their fundamental right to protest, which is included in the freedom of speech and expression.

Project Panoptic

IFF has been tracking the RFP for AFRS since last year and has sent two legal notices to the NCRB pertaining to its illegality (read here and here). Additionally, under our Project Panoptic, we have also been tracking every other instance of facial recognition use in the country that we have come across. Our main aim with this project is to increase transparency and accountability on the part of the authorities implementing these projects.

Important Documents

  1. IFF's Legal Notice to the NCRB on the Revised RFP for the National Automated Facial Recognition System dated July 15, 2020 (link)
  2. We might be in the market for a new kind of face mask dated July 18, 2019 (link)
  3. We wrote to NCRB and MHA requesting them to halt their ongoing National Automated Facial Recognition System (AFRS) project dated April 22, 2020 (link)
  4. Problems with Facial Recognition Technology Operating in a Legal Vacuum dated February 20, 2020 (link)
  5. Update on IFF's #ProjectPanoptic dated July 21, 2020 (link)
