IFF's Legal Notice to the NCRB on the Revised RFP for the National Automated Facial Recognition System #ProjectPanoptic

IFF has sent the NCRB a legal notice asking it to recall the revised RFP for the AFRS, citing concerns that the project violates the fundamental rights to privacy and to freedom of speech and expression.

15 July, 2020
7 min read

tl;dr

On June 22, 2020, the National Crime Records Bureau (NCRB) released a revised Request for Proposals (RFP) for the procurement of the National Automated Facial Recognition System (AFRS). IFF has sent the NCRB a legal notice asking it to recall the RFP, citing concerns that the project violates the fundamental rights to privacy and to freedom of speech and expression.

The Revised RFP

On June 22, 2020, the NCRB recalled and cancelled the original RFP [Document Reference: 02/001] that it had issued on July 3, 2019, and issued a revised RFP [Document Reference: 02/001 (Revised)] in its place. We had previously sent the NCRB a legal notice, ref. no. IFF/2019/115 dated July 18, 2019, asking it to recall the original RFP. However, the current recall and cancellation fails to resolve the concerns about the AFRS project raised in that notice. Additionally, having gone through the revised RFP, we have come across changes which raise new concerns.

A. Functional Scope of the Project

One of the most important changes in the revised RFP is that it now states that the project “does not involve the installation of CCTV camera nor will it connect to any existing CCTV camera anywhere”. This is a departure from the original RFP, in which CCTV integration had been included as a functional requirement.

B. Functional Requirements

In the functional requirements for the AFRS, the revised RFP introduces a requirement that the technology be able to carry out N:N combination searches. The original RFP only required the technology to carry out 1:1 (verification) and 1:N (identification) searches. The revised RFP gives no insight into the nature of the searches that would be carried out under the N:N mode, and since N:N searches have various definitions in the foundational literature, there is no clarity about the use case for which they will be utilised. Such vagueness of purpose is harmful because it opens the door to function creep.
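To make the distinction between these modes concrete, here is a minimal, purely illustrative sketch. It assumes that faces are first reduced to fixed-length embedding vectors and compared using cosine similarity; the function names, the threshold, and the reading of N:N as all-against-all linkage are our assumptions, not terms drawn from the RFP.

```python
# Illustrative only: the threshold, names and the N:N interpretation are assumptions.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_1_to_1(probe, claimed_reference, threshold=0.6):
    """1:1 (verification): does the probe face match one claimed identity?"""
    return cosine_similarity(probe, claimed_reference) >= threshold

def identify_1_to_n(probe, gallery, threshold=0.6):
    """1:N (identification): search one probe face against an entire gallery."""
    return [name for name, reference in gallery.items()
            if cosine_similarity(probe, reference) >= threshold]

def link_n_to_n(embeddings, threshold=0.6):
    """One possible reading of N:N: compare every face against every other face,
    e.g. to link or cluster people across whole datasets."""
    pairs = []
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            if cosine_similarity(embeddings[i], embeddings[j]) >= threshold:
                pairs.append((i, j))
    return pairs
```

Even in this toy form the ambiguity is visible: N:N could equally mean batch identification, deduplication of records, or population-scale linkage, and each of these uses carries very different privacy implications.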

C. Technical Requirements

1. Integration with existing crime analytics solutions

In the revised RFP, a new technical requirement mandates that the AFRS platform being built be able to integrate with existing crime analytics solutions used by the Police, for providing unique attributes on images/visuals. The term “crime analytics solution” is not defined in the RFP and could thus be interpreted broadly to include CCTV footage, which would contradict the RFP’s earlier statement ruling out integration with CCTV cameras. It would therefore be useful if a clarification were issued defining “existing crime analytics solutions”, with emphasis on their scope.

Integration with existing crime analytics solutions could thus mean integrating the AFRS with private vendor solutions, which raises questions about data access and sharing. The scope of these existing solutions is another open question: since different states currently use different solutions, will integration mean automatically scaling a solution up for use at the national level, or will it first be vetted at the national level? Another interpretation could be that an existing state-level solution will only be integrated with the AFRS in that particular state.

2. Dilution of technical requirements

The revised RFP makes no mention of the international standards that the original RFP required the system to comply with as a technical requirement. The reason for excluding these standards is unclear, and it raises the question of why necessary technical standards are being diluted.

D. Functional architecture

1. Scene of Crime (SOC) images/videos included as a data source

In the revised RFP, a new data source, “Scene of Crime images/videos”, has been introduced as an input into the AFRS database. This inclusion is at odds with the RFP’s earlier assertion that integration with CCTVs will not take place, since scene-of-crime footage will often originate from CCTV cameras. At the same time, the deletion of CCTV camera footage as an explicit data source leaves a gap in the functional architecture of the AFRS, and the RFP fails to satisfactorily account for its replacement. The RFP also fails to explain how the data, and the analysis and information obtained through the AFRS, will be presented and utilized in a court of law, i.e., the nature of the evidence obtained from the AFRS and its admissibility and reliability in courts.

2. List of databases which would be integrated with AFRS changed

The original RFP provided a list of databases from which data was to be gathered to create the AFRS database. This list has now been removed from the revised RFP and replaced with the term “dynamic police databases”. The lack of definitional clarity and the broad scope of this term set the AFRS up for function creep and an open-ended data sharing/mining endeavor, which is untenable in law.

The AFRS violates the right to privacy

As per the Hon’ble Supreme Court’s decision in Justice K.S. Puttaswamy vs Union of India (2017) 10 SCC 1, any justifiable intrusion by the State into people’s right to privacy, protected under Article 21 of the Constitution, must conform to certain thresholds. These thresholds are:

1. Legality

Where the intrusion must take place within a defined regime of law, i.e., there must be an anchoring legislation with a clear set of provisions. As pointed out in our previous legal notice, there is no anchoring legislation which allows for and regulates the use of the AFRS. In the absence of such a framework and safeguards, the first requirement for a lawful restriction on the right to privacy is not met.

2. Necessity

Which requires that the restriction on people’s privacy (in this case, data collection and sharing) is needed in a democratic society to fulfill a legitimate state aim. The RFP states that the need for the AFRS arises because it will enable automatic identification and verification through criminal databases, which would help the investigation of crime and the tracking and detection of criminals.

This characterisation rests on the faulty assumption that facial recognition technology is accurate and would provide speedy and correct results. However, ongoing research in the field has shown that completely accurate facial recognition technology has not yet been developed. Use of such inaccurate technology, especially for criminal prosecution, could result in false positives, i.e., the misidentification of innocent individuals as suspects in a crime. The AFRS thus fails to meet the requirement of necessity as laid down by the Supreme Court in the Puttaswamy judgment.
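The scale of the problem can be seen with a rough back-of-the-envelope calculation; the false match rate and database size below are assumptions chosen purely for illustration, not figures from the RFP or any vendor.

```python
# Purely illustrative arithmetic: both numbers below are assumptions.
false_match_rate = 0.001       # assumed 0.1% chance of a wrong match per comparison
gallery_size = 1_000_000       # assumed number of faces in the searched database

# Expected number of people wrongly flagged for a single probe image,
# treating each comparison as an independent chance of a false match.
expected_false_matches = false_match_rate * gallery_size
print(expected_false_matches)  # 1000.0
```

At that assumed scale, a single search could surface on the order of a thousand wrong candidates, which is why claims of accuracy cannot be evaluated without knowing the size and quality of the databases being searched.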

3. Proportionality

Where the Government must show, among other things, that the measure being undertaken has a rational nexus with the objective. The AFRS contemplates collecting sensitive and intimate personal information of all individuals, in the absence of any reasonable suspicion, by collecting images and videos from a scene of crime. This could cast a presumption of criminality on a broad set of people. In K.S. Puttaswamy v. Union of India (2019) 1 SCC 1, the Aadhaar judgment, the Hon’ble Supreme Court held that:

[u]nder the garb of prevention of money laundering or black money, there cannot be such a sweeping provision which targets every resident of the country as a suspicious person

While this statement was made in the context of rejecting the mandatory linkage of Aadhaar with bank accounts to counter money laundering, it clearly shows that imposing such a restriction on the entire population, without any evidence of wrongdoing on their part, would constitute a disproportionate response. Similarly, collecting sensitive personal information of all individuals who were present at the scene of a crime creates a presumption of criminality which is disproportionate to the objective it aims to achieve.

4. Procedural safeguards

Where there is an appropriate independent institutional mechanism, with in-built procedural safeguards aligned with standards of procedure established by law that are just, fair and reasonable, to prevent abuse. At present, the RFP provides for no independent institutional mechanism that would put procedural safeguards in place to regulate the proposed project. In the absence of any checks and balances, function creep becomes an immediate problem, with the AFRS liable to be used for functions beyond its stated purpose. Use of the AFRS without safeguards could result in illegal state-sponsored mass surveillance, which would have a chilling effect on fundamental rights such as the freedoms of expression, movement and association guaranteed in the Constitution. Fear of identification and retaliation by the state would deter individuals from exercising their fundamental right to protest, which is included in the freedom of speech and expression.

Thus, our legal notice asks the NCRB to recall the revised RFP, as it fails to satisfy the thresholds laid down in the Puttaswamy judgment and violates the fundamental rights to privacy, freedom of speech and expression, and freedom of movement. This legal notice has been drafted with the help and insights of Smriti Parsheera and Vidushi Marda.

Project Panoptic

IFF has been tracking the RFP for AFRS since last year. Additionally, under our Project Panoptic, we have also been tracking all other instances of facial recognition use in the country that we have come across. Our main aim behind this project is to increase transparency and accountability from the authorities who are implementing these projects.

One of the ways in which we demand transparency is through filing Right to Information requests. If you want to file a request or need guidance with filing, please reach out to us at [email protected]

Important Documents

  1. Legal Notice to NCRB on the Revised RFP for AFRS dated July 14, 2020 (link)
  2. Revised RFP for AFRS issued by NCRB dated June 22, 2020 (link)
  3. We might be in the market for a new kind of face mask dated July 18, 2019 (link)
  4. We wrote to NCRB and MHA requesting them to halt their ongoing National Automated Facial Recognition System (AFRS) project dated April 22, 2020 (link)
  5. IFF proposes a three year moratorium on the use of Facial Recognition Technology in India dated March 23, 2020 (link)


