#PrivacyofthePeople: The harms of biometric attendance apps

Read our analysis of the harms of biometric attendance apps and whether the draft Digital Personal Data Protection Bill, 2022 will resolve these issues.

07 March, 2023
5 min read

tl;dr

The use of apps for marking the attendance of students, teachers, and government employees has increased in the past few years. However, these apps raise concerns about the privacy of the biometric information they collect, as well as the exclusion they cause by being error-prone. In this post, we discuss these concerns and assess whether the draft Digital Personal Data Protection Bill, 2022 will provide any respite to individuals whose data is collected through these attendance apps.

Background

Attendance apps use biometric data such as facial features, retina, iris, or fingerprints to log the attendance of individuals. In some cases, the biometric attendance systems are also linked to salary payments. The use of such apps has been on the rise:

  • In a recent order, the Government of Andhra Pradesh has mandated the use of a facial recognition enabled app named ‘APFRS’ for recording the attendance of all government employees, including contract and outsourced workers. (December, 2022)
  • In Ongole, sanitation workers have been mandated to mark their attendance through a facial recognition based attendance system.
  • In Telangana, the School Education Department has mandated teachers and non-teaching staff in government schools to record “geo-attendance” through a mobile app. (November, 2022)
  • In Jharkhand, doctors working in government hospitals have been asked to mark attendance through a biometric attendance system.
  • In Ludhiana, the Municipal Corporation has issued an order making it mandatory for the staff to mark their attendance through facial recognition. (March, 2023)

Our concerns

  1. Privacy: In the absence of a data protection law, any data collection exercise can result in harmful breaches of privacy. Here, the data being collected is biometric in nature, which increases the harm that may result from any breach or misuse. Breached biometric data may be used to steal financial information and identities, just like a breached password. Biometric data may also be used for other types of fraud, such as forged documentation. However, unlike a password, biometric data cannot be changed, and thus, once breached, the harms may be irrevocable. Biometric data may also be used for surveillance, which could result in severe harms, as when the Andhra Pradesh Police used facial recognition to identify and penalise protesting teachers. In addition to biometric data, such attendance systems also collect other information, such as geo-location data, which may lead to further harm. The extreme harms of collecting biometric and geo-location data were seen in Afghanistan, where biometric databases were abandoned to the Taliban after the fall of the government in 2021.
  2. Exclusion: The use of such apps may also exclude individuals by failing to mark their attendance. This could have serious consequences, especially if attendance is linked to the calculation of wages or salaries. Exclusion could occur for various reasons, such as error-prone apps or a lack of access to the internet or mobile devices. For example, sanitation workers in Nagpur, who have been mandated to wear GPS-enabled watches, have alleged that the watches do not give accurate readings.

Will the draft Digital Personal Data Protection Bill, 2022 resolve these issues?

The draft Digital Personal Data Protection Bill, 2022 (DPDPB) was released on November 18, 2022 for public consultation. The statement of objects and purpose of the DPDPB states that it is to “provide for the processing of digital personal data in a manner that recognizes both the right of individuals to protect their personal data and the need to process personal data for lawful purposes, and for matters connected therewith or incidental thereto”. Here, the emphasis seems to be on operationalising data processing for Data Fiduciaries instead of giving primacy to the interests of the Data Principal. This interpretation is also borne out, in the context of biometric attendance apps, by an analysis of the provisions of the DPDPB, which allow for overbroad processing of personal data and dilute protections for Data Principals. Two clauses are specifically relevant: Clause 8, which relates to deemed consent, and Clause 18, which relates to exemptions.

💡
Clause 8(7) of the DPDPB: A Data Principal is deemed to have given consent to the processing of her personal data if such processing is necessary for the purposes related to employment, including prevention of corporate espionage, maintenance of confidentiality of trade secrets, intellectual property, classified information, recruitment, termination of employment, provision of any service or benefit sought by a Data Principal who is an employee, verification of attendance and assessment of performance

Clause 8 of the DPDPB pertains to “deemed consent”, wherein a Data Principal is deemed to have given consent to the processing of their personal data in certain situations listed in the Clause. Clause 8(7) specifically pertains to deemed consent for purposes related to employment, including verification of attendance and assessment of performance. However, once collected, the data may also be processed for prevention of corporate espionage, maintenance of confidentiality of trade secrets, intellectual property, classified information, recruitment, termination of employment, and provision of any service or benefit sought by a Data Principal who is an employee under Sub-clause (7), as well as under any of the other sub-clauses of Clause 8. Such non-consensual processing is a clear instance of function creep, wherein data collected for one specific purpose is then processed for other purposes of which the Data Principal has no knowledge and for which they have not provided informed consent.

💡
Illustration accompanying Clause 8(7): ‘A’ shares her biometric data with her employer ‘B’ for the purpose of marking A’s attendance in the biometric attendance system installed at A’s workplace. ‘A’ shall be deemed to have given her consent to the processing of her biometric data for the purpose of verification of her attendance. 

Further, Clause 18(2)(a) empowers the Union Government to exempt any instrumentality of the State from the application of the provisions of the bill in the “interests of sovereignty and integrity of India, security of the State, friendly relations with foreign States, maintenance of public order or preventing incitement to any cognizable offence relating to any of these”. The interests stated in the provision are excessively vague and thus open to misuse through overbroad application, which could result in a large number of government instrumentalities being exempted from the application of the law. Further, the exemption granted is itself overbroad, as it effectively excludes all activities of an agency from the purview of the bill.

💡
Clause 18(2)(a) of the DPDPB: The Central Government may, by notification, exempt from the application of provisions of this Act, the processing of personal data by any instrumentality of the State in the interests of sovereignty and integrity of India, security of the State, friendly relations with foreign States, maintenance of public order or preventing incitement to any cognizable offence relating to any of these.

In light of these provisions, it is unlikely that the DPDPB will be able to satisfactorily safeguard the privacy of citizens whose biometric data is collected through biometric attendance apps.

Our suggestions

Our specific suggestions pertaining to the clauses of deemed consent and exemptions in the DPDPB are:

  1. Deemed consent: While certain exceptions are necessary in order to facilitate a functional data protection regime, these exceptions, if not worded clearly, could lead to more harm. Therefore, any exception should be worded clearly, limited in purpose, necessary and proportionate to its aim, and accompanied by sufficient procedural safeguards.
  2. Exemptions: Any exemptions sought by government agencies should be granted only if they fulfil the standards of legality, necessity, and proportionality. It is essential that government collection and processing of citizen data is regulated to prevent misuse.

With regard to biometric attendance apps, we suggest that the use of these apps be ceased immediately in order to safeguard against privacy and exclusion harms.

Important documents

  1. A Public Brief of the Digital Personal Data Protection Bill, 2022 dated February 16, 2023 (link)

