Jago online market entities jago! A look at how the revised Dark Patterns Guidelines attempt digital consumer protection

The Ministry of Consumer Affairs has swung for a boundary in its attempt to regulate and penalise dark patterns in the Indian market. We examine what this new innings means for user rights, and whether the regulation is robust enough to bowl out nefarious dark patterns.

06 December, 2023

tl;dr

The Ministry of Consumer Affairs has notified a revised version of the Guidelines for Prevention and Regulation of Dark Patterns. The new additions to the draft are welcome, and set a strong foundation for the implementation of these Guidelines, but the omission of penalty clauses creates uncertainty about how they may actually be enforced. Though the revised version reflects many recommendations made by stakeholders like IFF, we cast a critical eye on how the Guidelines will impact digital consumer rights.

Introducing the revised Guidelines  

The draft Guidelines for Prevention and Regulation of Dark Patterns (“Guidelines”) were published by the Ministry of Consumer Affairs, Food and Public Distribution (“Ministry”) for public consultation from September 7 to October 5, 2023. We submitted comments highlighting five broad areas of concern: striking a balance between multi-sectoral regulation and standard-setting by nodal entities on the one hand, and self-regulation by market players on the other; the current categorisation of the listed dark patterns being too rigid to accommodate rapidly evolving trends; the lack of privacy-affirming safeguards in the Guidelines; a vague contraventions clause; and the need to strengthen grievance redress and reporting mechanisms to best implement the Guidelines. Additionally, we conducted a community survey on identifying dark patterns in our daily lives (which is still live and accepting responses!), the results of which were compiled and sent to the Ministry to demonstrate the vast and ever-changing world of such deceptive designs. Our detailed submission and the dark patterns tracker can be accessed here.

Overall, the draft Guidelines set a satisfactory preliminary framework to start monitoring and regulating dark patterns, but there was certainly scope for improvement. On December 1, 2023, the Ministry published a revised set of Guidelines which address such deficiencies, including some we pointed out. This post examines the revisions made to the draft and how they affect consumer interest and digital rights. We have also conveyed these concerns to the Department, including the fact that the comments sent on the draft Guidelines by stakeholders, some of which seem to have been incorporated, were never published for public viewing. We urge the Department to publish these comments, keeping the personal details of individual respondents redacted.

And why should you care?

Dark patterns are deceptive UI/UX designs that benefit an online service, such as a website or app, by influencing users into making decisions they might not otherwise make. They are all around us – from your preferred food delivery app to popular news websites, nearly every online service uses some form of deception or user manipulation to expand its profits. These subtle (or in-your-face) design or advertising tricks subvert or impair user autonomy, influence decision making, and work to the detriment of the user. Dark patterns can amount to misleading advertisements, unfair trade practices, or a violation of your consumer rights – so to this end, the Ministry has decided to regulate them in consumer interest. The Guidelines aim to protect and enhance consumer experience on apps and websites by deeming dark patterns contraventions under the Consumer Protection Act, 2019.

So what has changed?

The edits

A major change in the draft comes by way of Annexure 1. What was previously deemed an exhaustive list of punishable dark patterns (called “specified dark patterns”) has been made illustrative. The revised version reads that ‘the dark pattern practices and illustrations specified provide only guidance and shall not be construed as an interpretation of law or as a binding opinion or decision as different facts or conditions may entail different interpretations.’ Per Clause 2(i), the Central Consumer Protection Authority (“CCPA”) will also notify new additions to the Annexure from time to time, reflecting emerging trends in the market. This is a welcome change, for reasons we explored in our submission in detail.

We contended that dark patterns as a phenomenon are generally difficult to detect and rapidly evolving in form, design, and tactics, and that any attempt to create an exhaustive list, especially at this preliminary stage, might be an exercise in futility for the Ministry. The categories prescribed in Annexure 1 are undoubtedly helpful for the consumer in identifying deceptive designs, but the Guidelines should not treat them as exhaustive. There is a plethora of other universally known dark patterns, like obstruction, social proofs, psychological pricing, growth hacking, linguistic dead-ends, roach motels, privacy zuckering, and so on – the list is wide and ever-expanding. Our community survey also points to the vast range of creative but deceptive designs that may be seen as dark patterns. Treating Annexure 1 as illustrative, and notifying newly emerging dark patterns from time to time, is a good exercise in consumer awareness and timely regulation.

The additions

The draft Guidelines defined ten categories of dark patterns in Annexure 1: false urgency, basket sneaking, confirm shaming, forced action, subscription trap, interface interference, bait and switch, drip pricing, disguised advertisement, and nagging. The revised version adds three more.

  1. Trick Question, or the deliberate use of confusing or vague language to “misguide or misdirect” a user from taking decisions based on their understanding (or lack thereof). For example, an app interface asking, “Do you wish to opt out of receiving updates on our collection and discounts forever?” and then giving the user options like “Yes. I would like to receive updates” and “Not Now”, instead of a simple “Yes”.
  2. SaaS billing, or the process of “generating and collecting payments from consumers on a recurring basis in a software as a service (SaaS) business model by exploiting positive acquisition loops in recurring subscriptions to get money from users as surreptitiously as possible”. This happens when the platform chooses not to notify the user when a free trial transitions to a paid subscription, processes “silent recurring transactions” without informing the user, and so on.
  3. Rogue Malwares, or using a “ransomware or scareware” to mislead or trick a user into believing that there is a virus on their computer, with an objective to convince them to pay for a fake malware removal tool that in turn installs malware on their computer.

In addition, the Guidelines add more use cases and illustrations to the existing dark pattern categories. For instance, forced action now includes cases where excessive information or personal details may be demanded of the user to access a certain service or product. Notably, the Guidelines deem the following as dark patterns:

  1. Forcing a user to share personal information linked with Aadhaar or credit card, even when such details are not necessary for making the intended purchase;
  2. Forcing a user to share details of their contacts or social networks in order to access products or services purchased or intended to be purchased by the user;
  3. Making it difficult for consumers to understand and alter their privacy settings, thereby encouraging them to give more personal information than they mean to while making the intended purchase.

These additions are significant in shedding light on privacy concerns associated with dark patterns. In our submission, we pointed out that though comprehensive in other regards, the draft Guidelines did not address associated privacy risks. Consent obtained by deceptive design is not informed consent. Collection of excessive and unnecessary personal data without the informed consent of consumers poses a grave threat to consumer interest. This risk may persist even after the notification of the requisite rules under the Digital Personal Data Protection Act, 2023 (“DPDPA, 2023”), as the Act does not govern the interfaces or designs through which personal data is collected. One of our recommendations was to include additional categories of dark patterns that compromise consumer privacy, and to provide illustrations of such patterns (such as privacy zuckering, privacy maze and linguistic dead-ends) in Annexure 1, so as to penalise entities for excessively collecting information and to empower consumers against giving away personal data.

Another illustration was added to the bait and switch category: a seller may falsely show an unavailable product as available in order to lure the customer into moving it to the shopping cart, only for the user to then be notified that the product is “out of stock” and offered a higher-priced alternative.

The omissions

Clause 8 from the draft Guidelines, stating “the provisions of the Act [Consumer Protection Act, 2019] shall apply to any contravention of these guidelines,” has been removed. We had expressed concerns about the vague language of this Clause in our submission. To apply the Guidelines justifiably in the domain, and equitably for small and big entities alike, we urged the Ministry to set clear parameters for penalising dark patterns. Owing to their deceptive nature, it may be tricky to establish whether a dark pattern was employed by an entity with the intention to deceive – so we encouraged the Ministry to opt for a case-by-case assessment on merits to establish intent, and to penalise the entity accordingly.

While vague penalty provisions are dangerous, removing them from the draft creates even more uncertainty. Use of dark patterns is ‘prohibited’ under Clause 4, yet there is no provision prescribing the consequences of contravening the Guidelines. The Consumer Protection Act, 2019 may be employed on a case-by-case basis by the CCPA, but the Guidelines must immediately provide clarity on this front.

It doesn’t end here...

Consumer protection extends to physical spaces as well, where consumer consent and autonomy must be guarded against covert coercion and deceptive marketing by public and private players. Take the Ministry of Civil Aviation’s Digi Yatra initiative, for instance, which is (supposedly) an expedited check-in service at airports, deploying facial recognition technology to scan passengers’ faces and their boarding passes. Digi Yatra already raises a number of concerns in relation to data collection, storage, and processing with weak or non-existent privacy safeguards. On top of this, passengers are being pushed into opting for it over a regular airport check-in through deception and coercion by airport staff (see tweets here, here, here, here and here, to name a few).

While adjacent practices like misleading advertisements and unfair trade practices are penalised in other laws in force in India, dark patterns are notorious for being more covert and difficult to detect. A misleading advertisement may be easier to identify and report based on its language and features, and therefore easier to regulate, than dark patterns in physical spaces. The Ministry is encouraged to keep up the momentum and expand its mandate to such prevalent but nefarious physical dark patterns as well.

Important documents:

  1. IFF's letter to the Department of Consumer Affairs on the revised Guidelines on Prevention and Regulation of Dark Patterns (link)
  2. IFF’s comments on the draft Guidelines on Prevention and Regulation of Dark Patterns (link)
  3. Ministry of Consumer Affairs’ Guidelines for Prevention and Regulation of Dark Patterns, 2023 (link)
  4. Ministry of Consumer Affairs’ draft Guidelines for Prevention and Regulation of Dark Patterns, 2023 (link)
