#PrivacyofThePeople: Society and community management apps

What are the privacy implications of society and community management apps that resident associations are increasingly adopting in Tier 1 and Tier 2 cities across India? IFF analyses user rights from harm and consent-based frameworks, and how the upcoming data protection law may protect users.

24 November, 2021
5 min read




In our last #PrivacyOfThePeople post, we explored how digital lending apps, purportedly aimed at increasing financial inclusion, are predatory in practice and can lead to devastating consequences for the most vulnerable. Before that, we wrote about the effects of the proposed data protection bill on gig and app-based workers, ASHA and Anganwadi workers, farmers, social media users, medical patients, students, and dating app users. This post explores the potential harms and consent issues in society and community management applications such as MyGate, Adda, and Lockated, and how these might intersect with India’s upcoming data protection regime.

The issue

The challenge of unauthorised access to a gated society is nothing new. The concept behind society and community management applications is simple: residents of a gated colony register on the client-side application, while the security guards use the manager side of the application (“Guard App”). When a visitor, for example a delivery person or one’s house-help, turns up at the gated premises, the gate-keeper on duty alerts the resident through the application, who can then ‘approve’ or ‘decline’ the request to let the visitor enter, much as the guard would earlier have rung the resident on the landline intercom, except that the medium is now a digital application.

While this idea may seem quite convenient, several pressing issues arise, including workplace and peer surveillance and potential function creep, affecting residents and visitors of the society alike. According to the privacy policies of several of these applications, the data categories collected by them include:

  - for residents: name, email address, mobile number, apartment/villa number, and vehicle number (if any);
  - for guests: name and the resident flat to be visited;
  - for service providers: name, phone number, resident flat to be visited (if any), vehicle number (if any), entry/exit time, visit purpose, and a photograph;
  - for all users: IP address, web pages visited, browser, device and operating system information (including device ID, Android ID, Apple IDFA, etc.), mobile network information, and the date, time, and referrer URL of each request.
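To see why these categories matter in combination, consider a purely illustrative sketch of what a single gate-entry record might look like. None of this is drawn from any app's actual code; the class and field names are assumptions based only on the data categories listed in the privacy policies above.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class VisitorEntry:
    """One hypothetical gate-entry record, mirroring the data
    categories a community management app says it collects."""
    name: str
    phone: str
    flat_to_visit: str
    purpose: str                           # e.g. "delivery", "service provider"
    vehicle_number: Optional[str] = None
    photograph_ref: Optional[str] = None   # reference to a captured photo
    entry_time: datetime = field(default_factory=datetime.now)
    exit_time: Optional[datetime] = None

# Even one record ties an identifiable person to a specific flat at a
# specific time; a log of such records is a ready-made movement profile.
entry = VisitorEntry(name="visitor", phone="9999999999",
                     flat_to_visit="B-204", purpose="service provider")
```

The point of the sketch is that no single field is especially sensitive on its own; it is the linkage of identity, destination, and timestamp in one record, accumulated over months, that enables profiling.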

Applications such as MyGate and Adda categorically claim to be compliant with the GDPR and the ISO 27001 security standard, and state that they use strong encryption with purpose limitation and data minimisation built in. However, the challenge that emerges from using these applications is often not just data loss or breach, but workplace and peer surveillance.

The applications are designed and marketed to be highly convenient for users and resident welfare associations. Claiming to mitigate the problem of house-helps skipping work, the application sends a push notification about the arrival of maids, nannies, and other daily helpers to every resident who employs that particular staff member. This is achieved by first creating a profile of the staff member and then linking it to the various residents whose apartments they work at. In this way, the application also logs their attendance and records whether or not they went to a particular apartment each day. This amounts to overbroad surveillance of house-helps and violates their right to freedom of movement.
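The linking mechanism described above can be sketched as follows. This is a hypothetical illustration, not any app's actual implementation; the function name, identifiers, and data structures are all assumptions made only to show how one gate entry fans out into per-household attendance records.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical attendance log: staff_id -> list of (flat, timestamp) records.
attendance_log = defaultdict(list)

# A staff member's profile is linked to every household employing them.
staff_households = {"staff-17": ["A-101", "B-204", "C-310"]}

def record_gate_entry(staff_id: str, ts: datetime) -> int:
    """On a single gate entry, log a visit record against every linked
    household (this is where each resident would get a push notification).
    Returns the number of households notified."""
    households = staff_households.get(staff_id, [])
    for flat in households:
        attendance_log[staff_id].append((flat, ts))
    return len(households)

# One person walking through one gate produces three household-level records.
n = record_gate_entry("staff-17", datetime(2021, 11, 1, 8, 30))
```

The design choice worth noticing is that the worker is the key of the log, not a user of the system: a complete, timestamped record of their working day accumulates without them ever installing the application.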

The adoption of society and community management applications closely parallels the rise of community policing, which took root in western countries in the 1970s. Such neighbourhood vigilance encouraged residents of a community to “take the responsibility of detecting and preventing crime through surveillance of their neighbourhoods”, which they often achieved by documenting each other's movements and visitors.

The society and community management application, in this case, acts as a digital ledger of every visitor a resident or tenant receives. Since the application requires guests to furnish their name, contact details, and the apartment number to be visited, these digital ledgers would be a prime target for neighbourhood and Resident Welfare Association busybodies, allowing them to monitor the guests of each resident and potentially resulting in unnecessary moral policing of residents, especially women.

The PDPB and community management apps

The draft Personal Data Protection Bill, 2019 allows employers to process the personal data of employees without their explicit consent, which means that, despite the rights granted under the Bill and the Supreme Court having declared the right to privacy a fundamental right, employee consent can effectively be overridden by employers. Clause 13 of the draft Bill allows employers to process employees' data that is not classified as sensitive personal data for recruitment, for provision of any service or benefit to the employee, for verification of their attendance, and for any other activity relating to the assessment of the employee's performance. While non-sensitive personal data excludes sensitive information such as biometric information, health data, or address, it still contains identifiable characteristics which can be linked together to profile an individual.

While Clause 11 of the proposed Personal Data Protection Bill, 2019 calls for free, informed, clear, and specific consent from data principals, its enforceability becomes blurred under the conditions and circumstances in which domestic workers work. Another challenge stemming from the lack of free and informed consent is the absence of any mechanism to check or enforce the rights of data principals. The enforceability of ‘user rights’ under the applications' privacy policies is also murky, as the domestic workers themselves do not use the application, yet are profiled by it.

These applications also allow residents to ‘rate and review’ their domestic workers, claiming to help others find a better cook. This not only creates undue pressure on the workers but also means they must comply with any additional requests from residents in order to earn better reviews; in this way, the application fosters unhealthy competition among domestic workers. This often leads to workers complying with any data sharing requested of them, including additional identity proofs such as the Aadhaar number, proof of a local contact, and, in post-pandemic times, vaccination details. As a result, the ‘free’ nature of consent specified by Clause 11 may be further impinged upon.


In order to ensure that the recent adoption spree of society and community management applications is strictly in accordance with the rights of users and residents, and to minimise intrusion into the personal lives of people, we recommend that the following changes be made to the Personal Data Protection Bill, 2019:

  1. Compliance with the right to privacy decision: In the absence of a data protection regime, we recommend that these applications maintain strict adherence to the test of necessity and proportionality propounded by the Supreme Court in the right to privacy decision (Puttaswamy). Further, the principles of necessity and proportionality should be included under Clause 13 of the Bill.
  2. Amend Clause 13: We recommend that it must be mandatory for employers to consult employees and seek their views prior to processing personal data under Clause 13, PDP Bill 2019.
  3. Issue a Code of Practice: The Data Protection Authority should be entrusted with the responsibility of issuing a Code of Practice for workplace data collection under Clause 50, PDP Bill 2019.

Important documents:

  1. The Personal Data Protection Bill, 2019 as introduced by the Minister for Electronics and Information Technology, Mr Ravi Shankar Prasad (link)
  2. IFF's Public Brief and Analysis of the Personal Data Protection Bill, 2019 (link)
  3. IFF’s Public Brief on Impact of Personal Data Protection Bill, 2019 on Worker Surveillance (link)
  4. The previous blog post titled ‘#PrivacyofThePeople - ASHA Workers and Employee Surveillance’ dated 23rd June 2021 (link)
  5. The SaveOurPrivacy Campaign (link)

This blogpost has been authored by IFF intern Gyan Tripathi, a student of law at Symbiosis International (Deemed University), Pune and reviewed by IFF staff.
