We will not be tracked! Indian Railways’ plan to introduce FRT surveillance in train coaches is a stark departure from the right to privacy

Indian Railways has floated a tender for the installation of 3.3 lakh FRT-enabled CCTV cameras in train coaches, which will be used alongside face-matching servers to surveil and identify passengers and ‘curb crime’. We write to the authorities pushing back against this massive privacy violation.

12 March, 2024
10 min read

tl;dr

A tender floated by the Centre for Railway Information Systems, a functionary of the Ministry of Railways, invites technology providers to install facial recognition-enabled CCTV cameras inside train coaches across India and to use face-cropping tools and face-matching servers to surveil and identify individuals with the objective of ‘curbing crime’. We believe that this plan raises a number of concerns about the data privacy of individuals, which will not be remedied even after data protection legislation comes into force. The plan also fails the foundational Puttaswamy test laid down by the Hon’ble Supreme Court, lacks legal safeguards against misuse of vulnerable facial data, and may well fail to meet its stated objective while adding resource burdens for authorities.

Background

The Centre for Railway Information Systems (“CRIS”), a functionary of the Ministry of Railways (“Ministry”), recently floated a tender for the installation of a cumulative 3.3 lakh facial recognition-enabled CCTV cameras inside 44,038 train coaches across India. There are plans to equip 38,255 coaches with 8 cameras, 2,744 coaches with 5 cameras, 2,079 coaches with 4 cameras and 960 coaches with 6 cameras. Of these, the Central Railways division will have a total of 3,018 coaches under CCTV surveillance, Western Railways 3,408 coaches, and East Central Railways 2,533 coaches.
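
For readers who want to check the arithmetic, the per-coach figures quoted in the tender add up to the headline numbers. A quick sketch in Python, using only the counts stated above:

```python
# Coaches grouped by the number of cameras each will carry, per the tender
camera_plan = {8: 38255, 5: 2744, 4: 2079, 6: 960}  # cameras per coach: coach count

total_coaches = sum(camera_plan.values())
total_cameras = sum(cams * count for cams, count in camera_plan.items())

print(total_coaches)  # 44038 coaches
print(total_cameras)  # 333836 cameras, i.e. roughly 3.3 lakh
```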

The CCTV surveillance systems will be enabled with ‘video analytics’ and facial recognition technology (“FRT”), and apart from the cameras inside the coaches, 4 will be installed at the entry/exit points of each train. A face image cropping tool built into the 4 entry/exit CCTV cameras will identify passengers’ faces from the camera’s live feed and send the metadata to a central ‘face matching server’ in real time. This ecosystem will collect and store the facial data of all passengers entering and exiting a train, adults and children alike.
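
The tender does not publish the vendor’s implementation, but the pipeline it describes (detect a face in the live feed, crop it, and push it with metadata to a central server in real time) can be illustrated with a minimal sketch. Everything here, including the OpenCV Haar-cascade detector, the endpoint URL, and the field names, is an assumption for illustration, not the actual system:

```python
# Minimal sketch of the kind of pipeline the tender describes. The detector,
# endpoint, and metadata fields below are hypothetical placeholders.
import time
import cv2
import requests

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
MATCH_SERVER = "https://example.invalid/face-match"  # hypothetical endpoint

cap = cv2.VideoCapture(0)  # stand-in for an entry/exit camera's live feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        face_crop = frame[y:y + h, x:x + w]       # the 'face cropping' step
        _, jpeg = cv2.imencode(".jpg", face_crop)
        # Crop plus metadata is pushed to the central server in real time
        requests.post(MATCH_SERVER, files={"face": jpeg.tobytes()},
                      data={"camera_id": "coach-door-4", "ts": time.time()})
```

Even this toy version makes the privacy problem visible: every face in frame is captured and transmitted, with no point at which consent is sought.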

We believe that this plan raises a number of concerns about privacy, surveillance, and data security. Collecting, processing, and storing the sensitive facial data of adults and children without their explicit consent at such a large scale violates their right to privacy. This affects not only passengers, but everyone who enters a train coach, including local vendors and coolies who frequently move in and out of trains, passengers’ families, railway staff, sanitation workers, and so on. We wrote to officials at CRIS and the Ministry detailing our concerns, which are summarised below:

Legality and efficacy of CCTV and FRT-based surveillance

The Hon’ble Supreme Court of India, in its landmark decision in K.S. Puttaswamy v. Union of India [(2017) 10 SCC 1] (“Puttaswamy”), articulated the right to privacy as an inalienable part of an individual’s right to life under Article 21 of the Indian Constitution. Article 21 is clear in noting that the right can only be taken away or infringed upon by procedure established by law. Further, the three-pronged test laid down by the nine-judge bench in Puttaswamy requires that any state action infringing upon individuals’ privacy must 1) have a legislative basis, 2) be necessary to meet a legitimate state aim, and 3) be proportionate to that aim. We believe that CCTV surveillance inside trains and at entry/exit points infringes upon the privacy of a large number of individuals without satisfying any of the Puttaswamy criteria.

Fails the test of legality

CCTV cameras, FRT, and other surveillance tools operate in a regulatory vacuum in India and lack an anchoring law. They also collect biometric data, i.e. facial datapoints, of unsuspecting individuals without their consent and without an operational data protection law. The Digital Personal Data Protection (“DPDP”) Act, 2023 is not in force yet, and even once it is brought into operation and its procedural rules are notified, it may not adequately regulate CCTV cameras and FRT. Section 17 of the Act empowers the government to exempt government and other public authorities from the Act’s very application at any given time. If such an exemption is notified, the data protection provisions will not apply to the Indian Railways and the other entities finalised through the tender.

Further, the DPDP Act also does not classify ‘sensitive personal data’ as a distinct category needing additional safeguards and caution, unlike its earlier versions or even the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011. Global instruments, including the Council of Europe’s ‘Guidelines on facial recognition’, recognise the sensitive nature of biometric information such as facial data and the vulnerable position that the processing of such data may leave data principals in. Therefore, until specific Rules under the DPDP Act prescribe higher standards for processing sensitive information such as facial biometric data, surveillance tools like CCTV cameras and FRT will continue to operate without appropriate privacy safeguards.

Additionally, the tender lacks any data protection considerations. It is not clear what the Indian Railways will do with such a large volume of sensitive personal data, and the current plan of storing it centrally in a ‘face matching server’ is worrying given the lack of proper data protection and security safeguards. CCTV feeds are especially prone to misuse in the absence of strict and clear guidelines. In Delhi, for instance, feeds from CCTV cameras placed in public spaces can be accessed by Resident Welfare Associations, market associations, the local police, and the Public Works Department pursuant to an arbitrary SOP which has no basis in law. This SOP permits CCTV feeds to be made available to anyone who, after approval from the local MLA, is provided a password to log in and access the feeds. Since it is not tethered to any law and has no legislative basis, the plan in the tender fails the first test of legality.

Fails the test of necessity and proportionality

The aim of this move, as stated in the tender, is to curb crime and identify suspects in trains, so one must examine whether CCTV surveillance and FRT face matching are the right and proportionate tools for the job. We submit that both these tools are not only unsuccessful and ineffective in ensuring safety and curbing crime, but also highly inaccurate in identifying suspects. To this end, they fail to meet the second and third criteria laid down in Puttaswamy.

First, it is important to highlight the false equivalence between CCTV surveillance and safety. Multiple studies (here, here, here, for instance) have shown that CCTV surveillance has little to no effect on the reduction of crime in the surveilled area. On the contrary, CCTV cameras may themselves facilitate serious violations of the right to privacy by enabling surveillance by unauthorised persons, voyeurism, or stalking.

Second, several studies show that FRT is inaccurate, especially for people of colour (which includes Indians) and women. Components of FRT, such as computer vision systems, are inherently non-transparent, and their decisions are not easy to understand even for the people who built them. When such a system makes an error, the developers or operators deploying it cannot tell what reasoning led the machine to that error, let alone correct it. Deploying opaque, arbitrary, and inexplicable measures in this way, without safeguards, can severely jeopardise the right to privacy. FRT is thus unlikely to be a reliable tool for catching miscreants and criminals through the face matching servers.

FRT is a tool riddled with bias

In fact, FRT as a tool can be easily influenced by externalities and the biases of the Railway staff operating it. FRT systems generate a probability match score, or confidence score, between the face to be identified and each entry in a database of identified criminals (in this case, a database that any police force or Railway staff may already have access to). Multiple possible matches are generated and listed on the basis of their likelihood of being the correct match, with corresponding confidence scores. The final identification, however, is made by a human analyst (here, the Railway staff), who selects one match from the list of matches generated by the technology.
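
To make the mechanics concrete, here is a minimal sketch of how such a system ranks candidates by confidence score. The embeddings, database size, and cosine-similarity scoring are assumptions for illustration; deployed systems use proprietary models and thresholds:

```python
# Illustrative sketch of confidence-score ranking in an FRT system.
# The embeddings and records below are randomly generated placeholders.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
gallery = {f"record_{i}": rng.normal(size=128) for i in range(1000)}  # enrolled faces
probe = rng.normal(size=128)  # embedding of the face seen on camera

# Score every gallery record and list the top candidates with their scores
scores = sorted(((cosine_similarity(probe, emb), name)
                 for name, emb in gallery.items()), reverse=True)
for score, name in scores[:5]:
    print(f"{name}: confidence {score:.2f}")
# The software only produces this ranked list; a human picks the 'match'.
</antml_code_stub>
```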

This identification procedure creates fertile ground for misidentification: while the software produces several possible matches, the analyst conducting the search makes the final identification. This also opens the door for the staff’s own biases to creep into the final result, since an analyst prejudiced against a certain race, religion, or community may let that prejudice affect their decision making. We have written in detail about the threats of FRT systems across various use cases in IFF’s submissions on NITI Aayog’s draft discussion paper titled “Responsible AI for All: Adopting the Framework – A use case approach on Facial Recognition Technology”. FRT-enabled CCTV cameras are unnecessary, ineffective, and disproportionate tools and will not help the Ministry curb crime or identify criminals.

Administrative burden

Moreover, the Indian Railways is the largest railway network in Asia, running up to 11,000 trains every day, of which 7,000 are passenger trains, across track lengths of 1,08,706 kilometres. Given the magnitude and vastness of the Indian railway network, and the frequency of plying trains, there will be lakhs of CCTV feeds running simultaneously if the tender is implemented, with an expectation of real-time monitoring of all the feeds to catch crime and intervene. This seems highly unlikely and unreasonable. To make it possible, the Indian Railways may have to invest capital in hiring additional staff and divert resources from other projects. This would be difficult to justify in light of the high inaccuracy rates of CCTV and FRT surveillance.

The plan will violate the right to privacy

By capturing and storing facial data, CCTV cameras can prove to be highly intrusive, facilitate real-time surveillance, and threaten individual privacy. The right to privacy extends beyond private spaces to public spaces, where individuals must be accorded the same safeguards for their privacy and dignity as in their homes. The standard of “reasonable expectation of privacy” discussed in the Puttaswamy decision is not exclusionary and links to the autonomy, liberty, and dignity of individuals, which are components of fundamental rights. As Justice Chandrachud noted,

“While the legitimate expectation of privacy may vary from intimate zone to the private zone and from the private to the public arena, it is important to underscore that privacy is not lost or surrendered merely because the individual is in a public place.”

Justice Bobde added, “the entitlement to such a condition is not confined only to intimate spaces such as the bedroom or the washroom but goes with a person wherever he is, even in a public place.” In light of these considerations, CCTV surveillance in public spaces impedes the exercise of one’s right to privacy and jeopardises their vulnerable facial data.

In Indian Hotel and Restaurant Association and Anr v. State of Maharashtra, the installation of CCTVs in a restaurant was held to be a violation of the right to privacy. The Court examined the condition requiring the installation of CCTV cameras in the rooms where, according to the facts of the case, dancers would perform for their clients, and deemed it a “totally inappropriate” and excessive infringement of the right to privacy. Reliance was placed on Anita Allen’s work on “unpopular” privacy, which propounds that governments must design “unpopular” privacy laws and duties to protect the common good, “even if privacy is being forced on individuals who may not want it.” This includes one’s right to physical or spatial privacy. The same can be said of train coaches, which are also closed spaces that passengers share with each other, akin to restaurants. Jurisprudence suggests that such a measure is unconstitutional and violative of privacy.

A tool for surveillance

Further, FRT is innately invasive. It involves processing digital images of individuals’ faces for verification or identification, by extracting data points from a face and comparing them with a pre-existing image. It creates unique risks because our faces are our most prominent identifiers. Unlike fingerprints, our faces appear on most, if not all, of our identity cards, such as passports, Aadhaar cards, PAN cards, and driving licences. The facial data stored by FRT-based surveillance systems is far more vulnerable than any other biometric identifier, and when surveillance tools are deployed in transit, it becomes automatically linked with passengers’ movement and location data. This real-time personal information feeds central ‘face matching servers’ and repositories, which may eventually enable government entities to create accurate 360° profiles of citizens and result in “dragnet surveillance”. The use of any digital identifiers, especially facial biometric data, may lead to unauthorised profiling of individuals through the correlation of identities across multiple application domains.
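
The link between face matching and movement tracking is mechanical: every match event carries a camera identifier, a location, and a timestamp, so simply grouping events by face yields a travel history per person. A minimal sketch with invented records:

```python
# Sketch of how match events become movement profiles. All records below
# are hypothetical examples of what a face matching server might log.
from collections import defaultdict

events = [
    {"face_id": "F-102", "camera": "coach-S4-door", "station": "Mumbai CSMT", "ts": "09:02"},
    {"face_id": "F-102", "camera": "coach-S4-door", "station": "Pune Jn",     "ts": "12:31"},
    {"face_id": "F-077", "camera": "coach-B1-door", "station": "Mumbai CSMT", "ts": "09:05"},
]

trails: dict[str, list[tuple[str, str]]] = defaultdict(list)
for e in events:
    trails[e["face_id"]].append((e["ts"], e["station"]))

for face_id, trail in trails.items():
    print(face_id, "->", trail)  # a per-person location trail emerges
```

No extra technology is needed for this profiling; it falls out of the data the tender already proposes to collect.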

Privacy is dignity

Article 21 broadly recognises a person’s right to life, which is also read as the right to live with dignity. The Supreme Court stated in Puttaswamy that a person’s dignity is connected with their privacy. FRT systems used in public spaces (such as inside train coaches) are per se violative of privacy and dignity: studies have shown that making FRT ubiquitous as an ecosystem reduces the identity of citizens to a rapidly shared and transacted datapoint, imposes a continuous and involuntary visibility, and constricts people’s behaviour. Real-time monitoring can also make passengers uncomfortable and conscious of their conduct, appearance, or mannerisms, especially given that Indian railway passengers come from unequal and diverse socio-economic backgrounds.

Data security concerns

The tender envisions building a central ‘face matching server’ which will receive facial data from the FRT-enabled CCTV cameras and match it against existing biometric records. The extensive data collected in such a central database will instantly become vulnerable to breaches and potential misuse. The risk of unauthorised access or malicious intent poses a serious threat to the privacy of passengers and many others. Another grave concern the tender raises is that it will bring child passengers within its CCTV and FRT-based surveillance. It is internationally understood that the personal data of children requires additional insulation from privacy breaches and misuse; in Puttaswamy, the Supreme Court underscored that an added burden of care must apply while handling and processing children’s data.

The use of institutionalised FRT has historically been a topic of controversy globally. Yet the use of FRT by law enforcement agencies and government bodies in India continues without any checks in place. We urge the Ministry to carefully evaluate the plans proposed in the tender, to withdraw the tender, and to opt for less privacy-infringing methods of curbing crime in trains. The risks and potential for misuse associated with surveillance tools like CCTV cameras and FRT far outweigh their benefits, if any.

Important documents

  1. IFF’s Letter to Indian Railways on its tender on CCTV installation in train coaches dated 11.03.2024 (link)
  2. Tender for Selection of Back-End Partner for Implementation of Supply, Installation & Commissioning of IP based Closed Circuit Television Surveillance System (CCTVSS) in Coaches, dated 18.01.2024 (link)
