Muazzam Nasir & Ashish Kumar

Distorted Light at the End of the Dark Pattern Tunnel: Assessing Nagging Patterns from a Privacy Prism

Introduction to the Patterned Darkness of the Online Ecosystem

In 2020, Instagram updated its direct messaging service to introduce vanish mode. Upon selecting vanish mode, an update dialogue appears that does not explicitly tell users that they are about to link their Instagram account to Facebook Messenger. Following this exercise, Facebook reported that more than 60% (sixty percent) of users adopted interoperability between Instagram and Messenger. The design of the apps downplays key information and uses confusing language to deceive or manipulate users into bypassing consent.


Various online companies, including e-commerce sites, platforms, and apps, deploy manipulative techniques known as “dark patterns”. Harry Brignull, the user experience specialist who coined the term in 2010, describes dark patterns as user interface (“UI”) choices carefully designed to manipulate users into doing things that they would abstain from in the absence of such patterns. These designs are built on a broad understanding of human psychology but without user interests in mind, and they exploit cognitive, heuristic, and emotional vulnerabilities to seize attention or misdirect users. The practices are driven by the motive of impairing the user’s rational decision-making process in order to maximize the collection of personal information. Dark patterns manipulate consumers by altering the choice architecture: they employ strategies that cover up the mechanism of influence, deceive users through acts or omissions, and hide relevant information, making it possible but asymmetrically difficult for users to act according to their preferences.


There are various categories of dark patterns – interface interference, sneaking, obstruction, privacy zuckering, and nagging. Interface interference manipulates the interface by facilitating specific actions that limit the user’s ability to discover other possibilities. It has three subtypes: hidden information, aesthetic manipulation, and preselection. Hidden information conceals relevant information and options from the user. Aesthetic manipulation creates design choices that direct the user to a specific choice, distracting them from other available options. Preselection is a situation where an option is selected by default before any user interaction takes place. Sneaking involves including hidden costs and delaying the presentation of information to a stage where consent revocation is impossible. Obstruction is built on the “roach motel” model, which is easy to enter but difficult to exit. For instance, Amazon deploys a roach motel to prevent users from deleting their accounts: the path to the delete option is a maze of help pages and drop-down menus that leads to a dead end where users have to convince a customer-service representative to delete their account. Similarly, Facebook’s weapon of choice for manipulating user consent is “privacy zuckering” – named after its founder, Mark Zuckerberg. Privacy zuckering leverages the hidden data-broker industry through opaquely worded terms and conditions to establish a marketplace for user data. Thus, the design of the online ecosystem is built to trap users, coaxing them through various dark patterns into a closed room and making it difficult for them to perform their intended actions.


This essay aims to shine a light on “nagging” dark patterns – an extreme within the basket of dark patterns. Nagging poses unique legal issues because its design uses timed interruptions to misdirect the user towards an unwanted action. We argue that this violates the user’s decisional privacy and collects excessive personal information. We further establish that a consent-based framework cannot combat the privacy harms caused by nagging dark patterns, and advance a privacy-by-design model to combat the recurrent threat of nagging.


The Distinct Nature of Nagging Dark Patterns: Forced Consent

Nagging dark patterns are repeated interruptions during a user’s digital interaction. In a systematic manner, the user’s online experience is interrupted by pop-ups or audio notices. The UI design creates repeated interruptions that redirect user attention from the desired task and compel the user to perform actions they are unaware of. For instance, Instagram repeatedly prompts the user to turn on notifications and provides a choice between “not now” and “ok”. There is a careful exclusion of a “never” option, which allows Instagram to field the pop-up again at a later stage. Similarly, Google Location Services repeatedly prompts users to permit the use of location data; it only displays a “don’t show me again” option and does not provide an actual decline option. In both cases users can decline, but not permanently. The UI design bombards the user with pop-ups until they wear down and enable notifications to negate future interruption. A user who gives consent after repeated interruptions cannot contend that they were tricked; in practice, nagging wears users down into giving lawful consent.
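To make the mechanics concrete, the sketch below models the nagging logic described above in Swift. It is purely illustrative: the type names, the three-session re-prompt cadence, and the structure are our own assumptions, not Instagram’s or Google’s actual implementation.

```swift
// Hypothetical sketch of a nagging prompt: the dialog offers only "OK" and
// "Not now" – there is no "Never" – so a refusal merely defers the next pop-up.
enum PromptChoice {
    case ok      // enables notifications and ends the interruptions
    case notNow  // postpones, but never permanently records, the refusal
}

struct NotificationNag {
    var notificationsEnabled = false
    var sessionsSinceLastPrompt = 0

    // Called on every app launch; the dialog keeps reappearing until the
    // user finally taps "OK".
    mutating func onSessionStart(userChoice: () -> PromptChoice) {
        guard !notificationsEnabled else { return }
        sessionsSinceLastPrompt += 1
        guard sessionsSinceLastPrompt >= 3 else { return }  // assumed re-prompt cadence

        switch userChoice() {
        case .ok:
            notificationsEnabled = true   // consent, once given, is permanent
        case .notNow:
            sessionsSinceLastPrompt = 0   // decline only resets the countdown
        }
    }
}
```

Because the only state that ends the loop is consent, every other interaction feeds back into another interruption – which is precisely why declining is never a lasting choice.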


Unlike traditional dark patterns, which exploit cognitive vulnerabilities (emotional or decision-making), nagging relies on continuous repetition, subtle persuasion, and cognitive load to obtain the consent of a user. According to cognitive load theory, a person’s information-processing capacity can be affected by manipulating external circumstances and variables. Nagging exploits extraneous load, a kind of cognitive load caused by the design of the instruction or by irrelevant cognitive activities. A repeated interruption design without an actual option to decline is an irrelevant activity because it is not associated with improving the user experience. The primary motive behind nagging is to seize the attention of the user in a non-consensual manner. Tim Wu describes such non-consensual attention seizure as “attentional theft”. The combination of forced consent and attentional theft is a minefield of privacy harm.


Evaluating Privacy Harms of Nagging Dark Patterns

The Twin Privacy Harms – Decisional Autonomy and Data Minimization

Nagging creates privacy harms in a two-fold manner. First, the repeated interruptions create cognitive overload, which eventually invades the user’s decisional autonomy. In the landmark Puttaswamy II judgment, the Indian Supreme Court (“court”) stated that “decisional privacy” is a facet of privacy. According to the court’s interpretation, decisional privacy is an individual’s ability to make choices or the freedom to exercise one’s mind. It also overlaps with the right of self-determination, the ability to conduct one’s life in a manner of one’s choosing. Justice Chelameswar’s characterization of “freedom from unwarranted stimuli” as a grundnorm of privacy in his concurring opinion in Puttaswamy II weighs heavily against nagging dark patterns. Second, cognitive overload forces users to surrender excessive personal information, which violates the “data minimization principle”. Data minimization is the collection of adequate, relevant, and limited data that is necessary to fulfil a specific purpose. Google deploys nagging dark patterns in its UI to access users’ location data. The repeated interruption forces users to share excessive personal information which they would not have disclosed under normal circumstances; the UI architecture is designed to collect more personal information. According to Solove’s theory of privacy harms, the insights from this personal information might be used by online companies to influence users.


The Dichotomy of Consent under India’s Proposed Data Protection Law

Currently, India lacks an established legal framework to address these privacy harms. A glance at foreign jurisdictions indicates a legislative nudge to outlaw dark patterns. In the United States, the Colorado Privacy Act, 2021 (“CPA”) places a ban on consent obtained through dark patterns. The CPA defines dark patterns as a UI designed or manipulated with the substantial effect of subverting or impairing user autonomy or decision making. Similarly, the California Consumer Privacy Act, 2018 prohibits companies from preventing consumers from opting out and from complicating the language of privacy policies. In India, the latest development is the Joint Parliamentary Committee’s report on the Personal Data Protection Bill, 2021 (“Data Protection Law”). The present model of consent establishes a distinction between the extent of consent required by data fiduciaries (here: companies that can create nagging dark patterns by processing data) for “personal data” and “sensitive personal data”. The proposed Data Protection Law states that user consent for the processing of “personal data” should be informed, free, specific, and clear. Section 11 takes a leap for the processing of “sensitive personal data”, where user consent is put to the test of significant harm and must be expressed in clear terms, without recourse to inference drawn from conduct or context. Harm has been defined to include psychological manipulation which impairs the autonomy of the individual under section 2(20) of the Data Protection Law (emphasis supplied). Consequently, nagging dark patterns would face the axe of the law only to the extent that “sensitive personal data” is at stake. Since harm and manipulation are not criteria for processing “personal data”, this creates a dichotomy of consent, as nagging dark patterns do not distinguish between personal and sensitive personal data. Thus, a mandatory “privacy-by-design” policy, as we shall argue next, would remedy the privacy harm to a significant extent.


Creating a Legal Obligation Driven “Privacy-by-Design” Policy

Section 22 of the proposed Data Protection Bill directs data fiduciaries to adopt a privacy-by-design framework. The General Data Protection Regulation describes privacy-by-design (“PbD”) as “data protection through technology design”. It significantly contributes to fostering a rights-based mindset that implements privacy law principles within a legal framework. PbD directs data fiduciaries to embed data protection principles such as data minimization and transparency into the design of the data processing system, to anticipate, identify, and avoid harm to the data principal. The Data Protection Law embeds the privacy-by-design policy under section 22, where a set of obligations is cast on the data fiduciary. The data fiduciary has to adopt technical systems which avoid harm to the data principal, coupled with transparency in the processing of personal data and operation within the ambit of the data principal’s interests. Data minimization, as explained above, limits the processing of data to a specific purpose. Transparency involves giving the data principal prior clarity on the technology or business practice utilized by the data fiduciary. PbD dissolves the dichotomy of consent and creates a single roof of obligations strictly linked to harm. However, the Data Protection Bill creates only a limited obligation on the data fiduciary to prepare a PbD policy and does not create an obligation upon the Data Protection Authority to strictly examine or certify the technology design beyond the assurance of the data fiduciary. Unless the Indian model of PbD confers a legal obligation on data fiduciaries to have their PbD policy mandatorily examined and certified, it cannot move towards the strict elimination of nagging dark patterns. While a consent framework under section 11 would be adjudged ex post facto, i.e., once the harm has been caused, PbD under section 22 would eliminate manipulation ex ante, i.e., before the harm is caused. For instance, Apple recently launched App Tracking Transparency, imbibed as PbD, which forces apps to request user permission to track activity across other apps. More than 96% of iPhone users in the United States initially opted out of app tracking, thus disallowing apps from displaying repeated pop-ups at a later stage. This visibly indicates that the implementation of PbD restricts nagging.
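By way of contrast with the nagging sketch above, a minimal sketch of the App Tracking Transparency flow (assuming the standard iOS 14+ AppTrackingTransparency API; the wrapper function name is ours) shows how a PbD control moves the consent dialog out of the app’s hands: the operating system records the user’s answer, so the app cannot keep re-prompting.

```swift
import AppTrackingTransparency

// Request tracking permission through the system-owned dialog. The OS shows
// the prompt at most once; subsequent calls simply return the stored status.
func requestTrackingConsent(onResult: @escaping (Bool) -> Void) {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            onResult(true)    // explicit, one-time opt-in
        case .denied, .restricted, .notDetermined:
            onResult(false)   // a refusal is remembered by the system, not the app
        @unknown default:
            onResult(false)
        }
    }
}
```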


Conclusion

Due to rapid technological innovation, data fiduciaries can micro-target users with relative ease. Interface design plays a crucial role in increasing digital interaction, and an interface design that influences the consumer decision-making process through manipulation has attracted the attention of lawmakers and scholars. However, nagging, which relies on repeated interruption to influence user decisions, has evaded such scrutiny. It creates a unique legal situation by imposing a cognitive load to bypass consent. The Data Protection Law, in its current form, is not able to address the privacy harms created by nagging: it permits a mirage of consent that easily circumvents a consent-based framework. A stricter model of PbD reverses the burden onto data fiduciaries, removing the outlet of operation for nagging. The Data Protection Law will have to bend the distorted light of nagging at the end of the dark pattern tunnel to smoothen India’s privacy journey.

 

This article has been co-authored by Muazzam Nasir and Ashish Kumar, students at Hidayatullah National Law University, Chhattisgarh. This article is a part of RSRR’s Blog Series in collaboration with The Centre for Internet and Society.

