Why NIC’s proposed polling station surveillance during the 2024 elections is a terrible idea

It’s an election year, and NIC is already planning to deploy facial recognition technology (FRT) and drones at polling stations to surveil voters and “maintain law and order”. We write to NIC and ECI outlining why this is undemocratic, unconstitutional, and overall a terrible idea.

18 January, 2024
4 min read

tl;dr

NIC has released a tender inviting bids from providers of surveillance equipment, including drones and facial recognition technologies, for monitoring the 2024 union and state elections. We wrote to the NIC and ECI outlining the various harms of election surveillance, including voter intimidation, overbroad surveillance and profiling, exclusion errors with facial recognition, and privacy concerns.

Background

In December 2023, the National Informatics Centre (“NIC”) released a tender for the procurement and deployment of surveillance equipment, including drones and facial recognition technologies (“FRT”), for monitoring election processes during union and state elections. 

The tender lays out plans for live-webcasting the voting and counting processes during elections, and outlines the setting up of a “centralised command and control centre” to monitor activities in real time, in order to “prevent unfair practices and maintain law and order at polling stations during elections”. NIC intends to deploy field surveillance vehicles, drones, FRT-based systems for extracting voter data, IP-based cameras, LED TVs for viewing the live feeds, and web-based audio and video streaming software at polling stations and counting halls.

The Election Commission of India (“ECI”), endorsing this plan, has asked the NIC to live-cast from as many polling stations as possible – but at the moment, NIC plans to select at least 5% to 10% of polling stations and counting halls for webcasting. The tender does not include a list of the selected polling stations, or the criteria for their selection.

We wrote to the NIC and the ECI contending that the use of FRT and drones for surveillance during election processes gravely injures voters’ right to privacy and hampers the conduct of fair democratic elections.

Voter surveillance is voter intimidation

The proposed use of monitoring and surveillance technologies is antithetical to a free and fair election. The extensive deployment of video surveillance equipment will harm individuals’ fundamental rights, notably the rights to privacy and dignity. Citizens have a legitimate expectation that their voting activities remain confidential and free from unwarranted scrutiny. The omnipresence of surveillance cameras may not only compromise this right, but also be perceived as voter intimidation, deterring voters from exercising their franchise without fear or coercion. Additionally, the right to freedom of expression may be affected, as individuals might self-censor in a monitored setting, hindering open discussions and the free exchange of ideas.

Preliminary research on the relationship between FRT and voter turnout suggests that polling stations with FRT have lower turnout compared to those without. Earlier experimentation with implementing FRT in the polling process in India has revealed that such an operation can be plagued with logistical issues and inaccuracies. These factors can have a “chilling effect” on enfranchisement in the country, causing a decrease in voter turnout as a whole. 

Surveillance violates voter privacy

Overbroad surveillance through the use of FRT in voting is per se violative of the fundamental rights to privacy and dignity. Studies have shown that making FRT ubiquitous as an ecosystem reduces the identity of citizens to a rapidly shared and transacted data point, and imposes a form of continuous, involuntary visibility that constricts their behaviour. Facial data stored in FRT systems is also more vulnerable than other biometric identifiers, as it can facilitate the creation of 360-degree profiles of citizens and result in “dragnet surveillance”. A 360-degree profile is institutionally created when a single ID number links together different data sources across datasets. The use of any digital identifiers, especially facial biometric data, may lead to unauthorised profiling of individuals through the correlation of their identities across multiple application domains.
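
To illustrate how such linkage works, here is a minimal, purely hypothetical sketch in Python: it assumes three unrelated datasets – an electoral roll extract, FRT camera sightings, and a welfare database – that happen to share a common identifier, and shows how a simple join collapses them into a single profile. The dataset names, fields, and identifier are illustrative and are not drawn from the tender.

```python
from collections import defaultdict

# Hypothetical records from three unrelated systems, each keyed by the
# same identifier (illustrative only -- not taken from the NIC tender).
electoral_roll = [{"id": "XX123", "name": "A. Voter", "booth": "PS-041"}]
frt_sightings = [{"id": "XX123", "camera": "PS-041-CAM2", "time": "2024-04-19T10:42"}]
welfare_db = [{"id": "XX123", "scheme": "ration-card", "district": "North"}]

def link_profiles(*datasets):
    """Join any number of datasets on their shared 'id' field."""
    profiles = defaultdict(dict)
    for dataset in datasets:
        for record in dataset:
            profiles[record["id"]].update(record)
    return dict(profiles)

# A single join is enough to turn three narrow-purpose datasets into one
# 360-degree profile: who the person is, where and when they were seen
# at a polling booth, and which welfare schemes they use.
print(link_profiles(electoral_roll, frt_sightings, welfare_db))
```

The point of the sketch is that no sophisticated tooling is needed: once a common identifier exists across databases, profiling is a trivial join.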

FRT is a flawed tool

If the intent of the ECI and NIC is to ensure free and fair elections, FRT is a flawed tool to rely on. Studies show that FRT is inaccurate, especially for people of colour – which includes Indians. Beyond ethnicity, these systems are also more likely to misidentify people on the basis of gender, age, and complexion, yielding even lower accuracy rates across India’s diverse population. It is globally understood that about one in every seven searches on facial recognition databases fails to turn up an accurate match. The Delhi Police has reported a match rate of around 80% on its FRT systems, treating results at this threshold as “positive”. A 20% inaccuracy rate for a process as vital to India’s democracy as the general and state elections is far from desirable. Using FRT in the polling process risks wrongly excluding voters at the point of identification – both disenfranchising voters from marginalised communities and introducing degrees of marginalisation that do not exist presently.
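
To put that error rate in perspective, the back-of-the-envelope calculation below estimates how many voters could face an erroneous FRT result. The only figure taken from the reporting above is the roughly 80% match rate; the turnout and booth numbers are assumed round figures, not statistics from the tender or the ECI.

```python
# Rough, illustrative arithmetic only: the accuracy figure reflects the
# reported ~80% Delhi Police match rate cited above; turnout and booth
# counts are assumed round numbers, not official statistics.
accuracy = 0.80            # reported FRT match rate (~80%)
voters_per_booth = 1_000   # assumed voters at one polling station
booths_webcast = 50_000    # assumed 5-10% slice of roughly 1 million booths

failed_per_booth = voters_per_booth * (1 - accuracy)
failed_total = failed_per_booth * booths_webcast

print(f"Expected mismatches per booth: {failed_per_booth:.0f}")
print(f"Expected mismatches across {booths_webcast:,} booths: {failed_total:,.0f}")
# Even under these generous assumptions, millions of voters could face an
# erroneous FRT result at the point of identification.
```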

Risks of centrally storing voter biometric data

Additionally, the tender states that alongside the FRT-enabled cameras used for voter verification, a web portal will collate the data and generate MIS (management information system) reports from it. The extensive data collected on such a portal will instantly become vulnerable to breaches and potential misuse. The risk of unauthorised access, manipulation, or malicious use poses a serious threat to the confidentiality of individual voting behaviour, and the unchecked collection and analysis of voter data can be used to influence voters undemocratically.

We urge the NIC and ECI to carefully reevaluate the implementation of surveillance technologies in the electoral process. We recommend against using the listed tools and urge a thorough privacy impact assessment to gauge their impact on the fundamental rights of voters.

This post was drafted with assistance from Medha Garg, Policy Intern, Internet Freedom Foundation

Important documents:

  1. IFF’s letter to the NIC and ECI on election surveillance dated 17.01.2024 (link)
