Resist Surveillance Tech, Reject Digi Yatra

Digi Yatra continues to raise privacy, surveillance, exclusion and transparency concerns, made worse by the highly disturbing manner in which it is currently being deployed at airports – with coercion and deception, at the cost of passengers’ dignity, privacy, and autonomy.

16 January, 2024


We wrote to the Ministry of Civil Aviation, NITI Aayog, the Airports Authority of India, the Digi Yatra Foundation, and the Delhi, Bengaluru, Mumbai, Cochin and Hyderabad airports, drawing their attention to the worrisome implementation of the Digi Yatra service across airports in India. We urged them to completely withdraw Digi Yatra from Indian airports owing to its wide gamut of concerns relating to privacy, surveillance, exclusion errors, and lack of institutional accountability and transparency, coupled with the highly disturbing manner in which it is currently being deployed at airports – with reports of coercion and deception, at the cost of passengers’ dignity, privacy, and autonomy.


Digi Yatra is an opt-in service at Indian airports launched by the Ministry of Civil Aviation (“Ministry”) on June 8, 2017, with the aim of making air travel “seamless, contact-less, hassle-free and paperless” for all passengers in India. The service facilitates digital processing of passengers at airports by using facial recognition technology (“FRT”) and Aadhaar-linked credentials to authenticate passengers, in place of traditional boarding passes, at terminal entry, check-in, self-bag drop, security check, and aircraft boarding. We have previously written to the Ministry outlining our concerns with the data collection, storage, and processing mechanisms adopted by Digi Yatra, and its worrying use of FRT without safeguards.

More recently, however, these concerns have been greatly exacerbated by the unlawful and undignified manner in which Digi Yatra is being deployed at airports. Airline passengers across India are being ambushed and coerced into signing up for the “voluntary” Digi Yatra service and scanning their faces at multiple airport checkpoints, through deception and false information supplied by private airport personnel and CISF staff.

Digi Yatra is…

Not mandatory

Staff are reportedly claiming that the service is mandatory by law, or are disallowing passengers from proceeding without signing up for it. This is patently against the information furnished by the Ministry in response to our Right to Information (“RTI”) Applications Ref. MOCAVIR/E/23/00434 dated 07.08.2023 and Ref. MOCAVIR/EO0438 and MOCAV/RIE/00439 dated 08.08.2023, which states, “Digi Yatra gates exists in airports for passengers who voluntarily choose to use this facility (sic)…Digi Yatra is not mandatory. The other manual process e-gates continue to be available for passengers.” The Ministry’s Digi Yatra Biometric Boarding System (“DYBBS”) Policy is equally clear: “Creation and use of the Digi Yatra ID Travel Credential by a passenger will be completely voluntary”. On the ground, this commitment is being regularly flouted, undermining the consent and dignity of passengers. Passenger consent is paramount because Digi Yatra carries a number of privacy and surveillance concerns, and relies on technology that is not always accurate.

Not anchored in law

While the DYBBS Policy states that airports using the service will conform and adhere to the “data protection laws as applicable and mandated by the Government Of India”, India’s data protection legislation has not yet been brought into force or substantiated through Rules, which will prescribe all the relevant procedures, especially for taking “free, specific, informed, unconditional and unambiguous” consent from passengers. The DYBBS Policy itself, which envisages the formation, composition, data flow ecosystem and other functional attributes of Digi Yatra, does not have the force of law, being untethered to any legal framework. As a result, even the ‘High Level Data Privacy Guidelines’ included in the Policy cannot be directly enforced against any private or public authority contravening them. At the moment, therefore, Digi Yatra hangs in suspended animation as a privacy-infringing measure without legal backing.

Backed by a weak data policy

Even in the event that the DYBBS Policy, and with it the Privacy Guidelines, were made enforceable, they lack the fundamental privacy principles necessary for airtight protection of personal data, especially data as sensitive as facial biometrics. The Guidelines make ostentatious references to standard principles such as lawfulness of processing, purpose limitation, data minimisation, accuracy, and storage limitation, but go on to state that “BBS shall have an ability to change the data purge settings based on security requirements on a need basis” and “Any Security Agency, BOI or other Govt. Agency may be given access to the Passenger Data based on the current/ existing Protocols prevalent at that time”. They also contain a wide and vague exemption for sharing passenger data with government agencies, which may eventually lead to abuse of access. Sharing biometric data with government agencies without consent may also violate specific fundamental rights, such as the right to move freely within the territory of India enshrined in Article 19(1)(d) of the Indian Constitution, since it could result in additional screening measures for those groups of people who historically register lower FRT accuracy rates.

Has a shady data ecosystem

We know that Digi Yatra stands on weak law-policy foundations. The kind and magnitude of data it collects on those weak foundations is alarming. In Digi Yatra’s Privacy Policy, the categories of data listed for collection include, but are not limited to: identity and contact data, biometric data, business information, and technical data such as passwords, images, and video. These are provided or captured, with consent, through the mobile app, kiosk systems, or e-gates at airport checkpoints when passengers visit the airport, or at the premises of the Digi Yatra Foundation (“Foundation”), the entity responsible for operating the service. The Privacy Policy fails to specify why this volume of data needs to be collected. It goes on to say that the collected data may also be used for other purposes, such as “improvement of products, contacting for surveys, and to process user/customer requests”. In reality, data such as contact or business information or audio-visual data has no reasonable nexus to the objective of the service, which is simply to authenticate a passenger against their facial biometric data. Digi Yatra thus deviates from the privacy principle of data minimisation, i.e. the expectation that a data fiduciary will only collect information that is “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.”

Further, some clauses pose a mystery. The Privacy Policy allows for collecting, storing, processing, transferring, and sharing a passenger/user’s personal information (including sensitive personal information) with third parties or service providers for the purposes set out in the policy (which include marketing, events, programmes and promotions), but on the other hand states that the data collected under Digi Yatra “cannot be used by any other entity since it is encrypted.” The contradiction between these two clauses has never been addressed by the Ministry or the Foundation, which leaves one uncertain whether any of this data is in fact encrypted.

Second, there is ambiguity as to what kind of personal information is shared from a passenger’s smartphone and how it is used by the Digi Yatra mobile application. In an April 2023 press statement, the Ministry claimed:

“Under Digi Yatra, passengers’ data is stored in their own device and not in centralized (sic) storage… In the Digi Yatra process, there is no central storage of passenger's Personally Identifiable Information (PII) data. All the passengers’ data is encrypted and stored in the wallet of their smartphone. It is shared only between the passenger and the airport of travel origin, where passenger's (sic) Digi Yatra ID needs to be validated. The data is purged from the airport’s system within 24 hours of departure of flight”

This appears to contradict the DYBBS Policy, which states that “The Airport operator [DYBBS] will retain the Travel Data including the Digi Yatra ID Travel Credential for a duration of 30 days from the date of travel after the Passenger’s Flight departs… (sic)”, implying that data is stored, and that Union government functionaries have access to it when required. The press statement is also at odds with an interview given by Avinash Komireddy, the founder and CEO of Dataevolve, the company that designed the Digi Yatra ecosystem, wherein he states that “data authentication takes place on the Amazon Web Services cloud platform.” This authentication flow has not been referenced by the Ministry in any of its statements about the exchange of information between a passenger’s smartphone and the origin airport, nor is it mentioned in the DYBBS Policy. Overall, how data is stored and authenticated in the Digi Yatra ecosystem has not been made transparent, which raises concerns about whether privacy standards are being complied with.

Flunks the Puttaswamy test

The Hon’ble Supreme Court of India, in its decision in K. S. Puttaswamy v. Union of India [(2017) 10 SCC 1], laid down certain thresholds which must be fulfilled to justify state intrusion into the right to privacy guaranteed to citizens. These thresholds are: legality, necessity, proportionality and procedural safeguards. Digi Yatra a) demonstrably fails the legality threshold for want of an anchoring legislation or operative data protection rules; b) fails to meet the requirements of necessity and proportionality, because mere convenience cannot justify a restriction on privacy as necessary; and c) fails on procedural safeguards, prescribing no streamlined grievance redress mechanisms, no penalties for contraventions by institutional actors, and no rights for data principals in case of privacy violations. In toto, Digi Yatra fails to meet the standards requisite to infringe upon passengers’ fundamental right to privacy.

Is not saved by the DPDPA

Finally, even when the DPDPA comes into play with its procedural Rules notified, it still may not be able to adequately protect the sensitive personal data of Digi Yatra users, specifically facial biometric data. First, Section 17 of the Act empowers the government to exempt Digi Yatra and its data processing authorities from the Act’s very application at any given time. If such an exemption is notified, the valuable provisions on seeking informed and verifiable consent before data sharing, among others, will not apply to the service, and the existing Privacy Policies may not be able to provide adequate safeguards for Digi Yatra’s data processing and sharing practices.

Second, the DPDPA does not classify ‘sensitive personal data’ as a distinct category needing additional safeguards and caution, unlike its earlier versions or even the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011. Global instruments such as the Council of Europe’s ‘Guidelines on facial recognition’ recognise the sensitive nature of biometric information such as facial data and the vulnerable position that processing such data may leave data principals in. Therefore, until specific Rules under the DPDPA prescribe higher standards for processing sensitive information such as facial biometric data, Digi Yatra will continue to operate without appropriate privacy safeguards.

Is completely non-transparent

The Right to Information Act, 2005 (“RTI Act”) was enacted to promote transparency and accountability in the operations of Indian public authorities, and to prevent them from acting in private interest or otherwise undermining democratic processes. Digi Yatra, being a Union government-backed initiative affecting thousands of airline passengers and their privacy, should be transparent in its operation, and the nodal authorities should be accountable for its ripple effects. We see that this is not the case.

Digi Yatra is helmed by a special purpose vehicle, the Digi Yatra Foundation, which is tasked with implementing the Digi Yatra Central Ecosystem pursuant to the DYBBS Policy issued by the Ministry. In essence, this is the nodal body responsible for running the service, and therefore the data fiduciary in this instance. The Foundation, a private company incorporated under Section 8 of the Companies Act, 2013 on February 20, 2019, does not fall within the purview of the RTI Act. Even though 26% of its shareholding lies with public institutions such as the Ministry and the Airports Authority of India, the Foundation itself is absolved of any accountability to Indian citizens. This is a worrying circumvention of the RTI Act, and the Foundation is able to stonewall any effort to make its operation transparent.

Further, there is a total lack of public audits to ascertain the data security of the sensitive biometric information collected by Digi Yatra, barring periodic vulnerability audits and certifications conducted by the Standardisation Testing and Quality Certification Directorate and the Indian Computer Emergency Response Team (“CERT-In”). Regrettably, even these reports and audits are not publicly accessible, as CERT-In has recently been exempted from the ambit of the RTI Act (see our statement on the exemption here). Digi Yatra is a worryingly opaque ecosystem, which cannot be made transparent even by invoking the hail-mary RTI legislation.

Relies on dangerous FRT

The use of institutionalised FRT has historically been a topic of controversy globally. In the aftermath of the Black Lives Matter movement in 2020, multiple companies including Microsoft, IBM, and Amazon announced that they would cease the sale of FRT to American police departments in the short term, and make a final decision after re-evaluating their stance or once a law regulating FRT came into force. In India, however, use of FRT by the executive continues without any checks in place. We have written in detail about the threats of FRT systems across various use cases, including in our submissions on NITI Aayog’s draft discussion paper on the use of AI and FRT.

The Indian Constitution recognises the right to life, which has been read to include the right to live with dignity. The Supreme Court in Puttaswamy recognised that dignity is connected with privacy. FRT systems used in public spaces (such as airports) are per se violative of privacy and dignity: studies have shown that making FRT ubiquitous as an ecosystem reduces the identity of citizens to a rapidly shared and transacted datapoint, imposes a continuous and involuntary visibility, and constricts people’s behaviour.

FRT creates unique risks because our faces are our most prominent identifiers. Unlike fingerprints, our faces appear on most, if not all, of our identity documents, such as passports, Aadhaar cards, PAN cards, and driving licences. Facial data stored by FRT-based authentication systems is therefore far more vulnerable than any other biometric identifier: it can facilitate the creation of 360-degree profiles of citizens and can result in “dragnet surveillance”. Dragnet surveillance refers to the collection and analysis of information on entire populations or communities, rather than only on those suspected of committing a crime. A 360-degree profile is institutionally created when a single ID number links together different data sources across datasets. The use of any digital identifier, especially facial biometric data, may lead to unauthorised profiling of individuals through the correlation of identities across multiple application domains.
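The linking mechanism described above can be sketched in a few lines of code. This is purely an illustrative toy, with hypothetical datasets and a made-up identifier, showing how a single shared ID lets otherwise separate records be correlated into one profile:

```python
# Illustrative sketch with entirely hypothetical data: how one shared
# identifier lets separate datasets be joined into a "360-degree profile".
travel_db = {"ID123": {"flights": ["DEL->BLR", "BLR->BOM"]}}
telecom_db = {"ID123": {"phone": "98xxxxxx01"}}
finance_db = {"ID123": {"bank": "XYZ Bank"}}

def build_profile(uid, *datasets):
    """Correlate records across datasets via the common identifier."""
    profile = {}
    for db in datasets:
        # Each dataset contributes whatever it holds against this ID.
        profile.update(db.get(uid, {}))
    return profile

print(build_profile("ID123", travel_db, telecom_db, finance_db))
```

The point of the sketch is that no single database needs to be comprehensive: the identifier alone is enough to stitch travel, telecom, and financial records into one linked profile.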

Further, several studies show that FRT is inaccurate, especially for people of colour (which includes Indians) and women. FRT systems are particularly error-prone when encountering new faces. Additionally, components of FRT, such as computer vision systems, are inherently non-transparent, and their decisions are not easy to understand even for the people who built them. When such a system makes an error, the user cannot tell what reasoning led to it, let alone correct it. Opaque and inexplicable measures cannot be used to infringe on privacy.

Is not even effective!

It is highly unlikely that Digi Yatra will satisfactorily deliver on its main claim, which is to “enhance passenger experience and provide a simple and easy experience to all air travellers”. Consider a busy airport where a passenger’s image is not captured properly and does not match their government-issued ID. Even assuming that the FRT adopted under Digi Yatra has an error rate as low as 2%, thousands of passengers would fail to be correctly verified. Not only would this waste administrative time, including delaying flight departures, but it could also pose a security threat. We have written about these inefficiencies in more detail here.
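A quick back-of-the-envelope calculation makes the scale concrete. Both numbers below are illustrative assumptions (a 2% error rate, as hypothesised above, and a daily passenger count typical of a large Indian airport), not official statistics:

```python
# Back-of-the-envelope estimate of daily FRT verification failures.
# Both inputs are illustrative assumptions, not official figures.
daily_passengers = 200_000   # assumed throughput of a large, busy airport
error_rate = 0.02            # assumed 2% verification error rate

failed_verifications = int(daily_passengers * error_rate)
print(f"Passengers failing FRT verification per day: {failed_verifications}")
```

Under these assumptions, roughly 4,000 passengers a day at a single busy airport would need manual fallback processing, which is where the claimed “seamless” experience breaks down.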

A service similar to Digi Yatra was implemented in the United States by the US Department of Homeland Security (“DHS”), and was shown to suffer from multiple legal, technical and privacy problems: exclusion errors, weak privacy safeguards, and function creep into surveillance. Similar legal concerns have been raised by the Electronic Frontier Foundation, which states, “(w)e cannot overstate how big a change this will be in how the federal government regulates and tracks our movements or the huge impact this will have on privacy and on our constitutional “right to travel” and right to anonymous association with others.” It also highlights how such systems will end up discriminating against minorities due to technical problems.

The American Civil Liberties Union ultimately sued the DHS and other implicated agencies for records related to the US government’s use of FRT, which the group said could pose “grave risks to privacy”. On the Indian front, a Kanpur-based professor moved the Allahabad High Court on the ground that the use of FRT and biometric scans for attendance-recording purposes is antithetical to the right to privacy under Article 21. The use of FRT for surveillance across Tamil Nadu, and specifically within the city of Chennai, was also challenged in the Madras High Court in August 2023; the Court has issued notice, and hearings are pending.

Our request

Our primary request to the Ministry is to completely withdraw Digi Yatra and revert to traditional physical means of passenger verification. Digi Yatra’s threats and challenges far outweigh its benefits. It has already become a tool of coercion and deception wielded by airport staff against unsuspecting passengers, and its growing use will only add further legal, technical and privacy complications into the mix. As usual, the common passenger will bear the brunt of it all, while failing to receive any of the promised “convenience”.

Important documents:

  1. IFF’s letter to Ministry of Civil Aviation on Digi Yatra dated 12.01.2024 (link)
  2. IFF’s letter to NITI Aayog on Digi Yatra dated 12.01.2024 (link)
  3. IFF’s letter to Airports Authority of India on Digi Yatra dated 12.01.2024 (link)
  4. IFF’s letter to Digi Yatra Foundation on Digi Yatra dated 12.01.2024 (link)
  5. IFF’s letter to Delhi International Airport Limited on Digi Yatra dated 12.01.2024 (link)
  6. IFF’s letter to Bengaluru International Airport Limited on Digi Yatra dated 12.01.2024 (link)
  7. IFF’s letter to Cochin International Airport Limited on Digi Yatra dated 12.01.2024 (link)
  8. IFF’s letter to Mumbai International Airport Limited on Digi Yatra dated 12.01.2024 (link)
  9. IFF’s letter to Hyderabad International Airport Limited on Digi Yatra dated 12.01.2024 (link)
  10. IFF’s past publications on Digi Yatra (link)
