We might be in the market for a new kind of face mask.

The NCRB has invited bids for the implementation of Automated Facial Recognition Systems. We sent a legal notice to the NCRB because concerns such as discriminatory profiling and privacy violations are just the tip of the iceberg of problems this technology will bring about.

18 July, 2019

Highlights

  • Background: On June 28, 2019, the National Crime Records Bureau (NCRB) invited bids from reputed turnkey solution providers for the implementation of a centralised Automated Facial Recognition System (AFRS).
  • Need for established safeguards: The proposal to introduce facial recognition into society comes with a variety of concerns that need to be addressed first. We have sent a legal notice to the NCRB, along with a covering letter and a copy of the notice to the Home Minister, Shri Amit Shah, and the Home Secretary, Shri Rajiv Gauba, highlighting the features and scope of the AFRS and the immense harm it could cause to Indians if implemented.

Know your enemy

It's important to understand the characteristics the Request for Proposals (RFP) requires the system to possess, in order to understand how problematic it will be. Our legal notice highlights these features, which are covered here in brief to provide basic context.

  1. Function: The AFRS is intended to be a repository of all crime- and criminal-related facial data, and should be able to identify or verify a person from a variety of inputs ranging from images to videos.
  2. Integration: The system should be capable of integration with various other databases such as the ICJS, CCTNS, IVFRT, existing state police software, or any others. The integration does not stop there: the RFP also requires the system to be compatible with other biometric solutions such as iris recognition and AFIS, but does not specify what these databases are. Might we have yet another Aadhaar worry to add to our frown lines?
  3. Identification: As per the requirements of the RFP, the system should not only be able to match images from a variety of databases, it should also be equipped to capture images from CCTV footage and public or private video feeds. As concerning as this sounds, there's more. It should also be able to tag images uploaded from newspapers, raids, sketches, etc. with identifiers based on sex, age, scars and tattoos; consider facial landmarks, features and contours in identifying individuals; and account for plastic surgery and make-up to remain accurate in identification.
  4. Technical requirements: The aspects that stuck out in the RFP largely relate to the need for the system to be compatible with biometric solutions such as iris and fingerprint identification systems. It also requires that there be security in storage, user access and authentication.
  5. Security requirements: Beyond the ISO standards prescribed in the RFP, the bidder is largely responsible for additional measures to maintain the integrity, confidentiality and availability of the data that will be stored.

Here's why we aren't quite ready for sci-fi

We take a closer look at these not-so-frightening features and identify concerns with the technology itself.

  • Absence of legality: The proposed facial recognition system does not stem from any statutory basis, nor is it an exercise of the executive power of the Government; it clearly lacks any sort of legal backing. On top of this, it flagrantly undermines the right to privacy, as it fails to satisfy any of the tests laid down in Puttaswamy v. Union of India (2017 (10) SCALE 1) for a permissible restriction of privacy.
  • Manifest arbitrariness: Beginning to picture the massive invasion of privacy the AFRS could bring about? Especially with its all-seeing eye looking at not only databases upon databases of images but also strategically located CCTV footage, images from newspaper clippings, raids, sketches, etc., all most likely without your knowledge. This kind of en-masse surveillance is bound to cause a high degree of damage. Studies by MIT and Georgetown, and trials conducted by the London Metropolitan Police, acknowledge that the pervasive biases that currently exist within our societies are likely to be mimicked by the algorithms within these systems. Misidentification and discriminatory profiling are the results we're looking at if these systems are implemented. Apparently, the already existing discriminatory practices perpetuated by human beings are no longer enough; we must look to AI to continue our dirty work.
  • Absence of safeguards and accountability: In light of the above, there are currently no legal restrictions or limitations on this technology to ensure its proportionate use or afford protection to those it interacts with. Add facial recognition to the ongoing debate on CCTVs, and we have ourselves a full-fledged mix of India and China that is no longer restricted to Indo-Chinese cuisine.

Considering the trajectory India appears to be on with mass surveillance and the technological perpetuation of discrimination, a scarier version of the Orwellian dystopia seems to be right up our alley. We urge the NCRB to take a step back and recall the bidding process for the Automated Facial Recognition System until adequate safeguards that address these various concerns are put in place. This request comes with the caveat that failure to do so may lead us to seek remedy in accordance with the law.

Important Documents:

  1. Request for Proposals by the National Crime Records Bureau for Automated Facial Recognition Systems [link]
  2. Legal notice to the National Crime Records Bureau [link]
  3. Covering letter to the Home Minister and the Home Secretary, Ministry of Home Affairs [link]

All your Black Mirror nightmares coming true? Support us as we fight to keep dystopias restricted to TV shows and books. Become an IFF member today!

