
Dark pattern

From Consumer Rights Wiki
Revision as of 01:21, 23 March 2026 by Sojourna (talk | contribs) (Gallery: Grammar.)

⚠️ Article status notice: This article has been marked as incomplete

This article needs additional sourcing and verification to meet the wiki's Content Guidelines and align with our Mission Statement for comprehensive coverage of consumer protection issues.

This notice will be removed once sufficient documentation has been added to establish the systemic nature of these issues. Once you believe the article is ready to have its notice removed, please visit the Moderator's noticeboard or post to the #appeals channel on our Discord server.


🔧 Article status notice: This article may rely heavily on AI/LLMs

This article has been marked because it may rely heavily on LLM-generated text, which affects its perceived or actual reliability and credibility.


To contact a moderator for removal of this notice once the article's issues have been resolved, or if this notice was applied in error, please use either the Moderator's noticeboard or the #appeals channel on our Discord server.




A dark pattern is a manipulative design practice that tricks or influences users into making decisions that may not align with their true preferences or interests. These techniques exploit cognitive biases and behavioral psychology to benefit businesses, often at the expense of user autonomy. Initially coined by user experience (UX) designer Harry Brignull in 2010, the concept has evolved into a significant focus of regulatory scrutiny and academic research.[1][2]

Dark patterns are remarkably widespread and represent a growing concern in digital interfaces. A 2019 study of roughly 11,000 e-commerce websites found that about 10% employed deceptive practices,[3] while a 2022 European Commission report indicated that 97% of the most popular apps used by EU consumers contained at least one dark pattern.[4]

Definition and terminology

The term dark patterns was originally defined by Harry Brignull as "design tricks that manipulate users into taking actions they didn't intend to". The Federal Trade Commission (FTC) describes them as "design practices that trick or manipulate users into making choices they would not otherwise have made and that may cause harm".[1][2]

There is ongoing discussion regarding the most appropriate terminology. Alternative labels include deceptive design, manipulative UX, coercive design, or anti-patterns. Some advocates argue for terms like deceptive patterns to more accurately describe the intentional nature of these designs and avoid potential racial connotations. Brignull himself has transitioned to using deceptive.design.[2]

What distinguishes dark patterns from merely persuasive design is their exploitative nature: rather than creating value for users, they benefit the service provider through manipulation and deception.

Common types and examples

Research has identified numerous specific dark patterns, with one comprehensive study proposing a taxonomy comprising 68 distinct types. These manifest across various industries and digital contexts.[5]

Obstruction patterns

These designs make desired actions (like rejecting tracking) significantly more difficult than accepting alternatives. A classic example is the Roach Motel pattern, where signing up for a service is straightforward but cancellation is excessively difficult. The FTC highlighted this pattern in their case against ABCmouse, where cancellation was made "extremely difficult" despite promising "Easy Cancellation".[6]

Interface interference

This category includes designs that manipulate interface elements to steer user behavior. Misdirection focuses user attention on one element to obscure another critical detail. Disguised ads blend advertisements with genuine interface elements, like fake "Download" buttons on software websites.[1]

Forced action

These patterns require users to complete unnecessary actions to access desired functionality. Forced registration demands that users create an account to complete a task. Forced continuity involves automatically transitioning users from free trials to paid subscriptions without adequate notification. The FTC alleged that Adobe violated regulations by "tricking customers into enrolling in subscription plans without proper disclosure".[1][7]

Sneaking and information hiding

These practices involve concealing or obscuring material information from users. Hidden costs reveal unexpected fees only at checkout, a practice employed by ticketing platforms. Drip pricing advertises only part of a product's total price initially and then imposes other mandatory charges later.[1]
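A small worked example (all prices hypothetical) makes the drip-pricing mechanic concrete: the figure used to attract the buyer diverges from the amount actually charged once mandatory fees appear at checkout.

```python
# Hypothetical ticket purchase illustrating drip pricing: the advertised
# price omits mandatory fees that are only revealed at checkout.
advertised_price = 50.00
mandatory_fees = {
    "service fee": 12.50,
    "facility charge": 4.00,
    "order processing": 3.75,
}

total = advertised_price + sum(mandatory_fees.values())
markup = (total - advertised_price) / advertised_price

print(f"Advertised: ${advertised_price:.2f}, charged at checkout: ${total:.2f}")
print(f"Mandatory fees add {markup:.1%} on top")  # 40.5%
```

In this illustration the buyer compares options against a $50 price, but pays $70.25; because the extra charges are mandatory, the advertised figure was never actually available.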

Social proof and urgency

These patterns exploit social influence and time pressure to manipulate decisions. False activity messages misrepresent site activity or product popularity. False scarcity creates pressure to buy immediately by claiming limited inventory. Baseless countdown timers display fake countdown clocks that reset when expired.

Mind tricks and business incentives

Cognitive biases exploitation

Dark patterns exploit unconscious cognitive biases. In many cookie banners, for example, the "Accept All" option is listed first and given a green background: people tend to choose the first option presented, and green signals approval in interface design. Declining, by contrast, usually requires opening a "Manage my choices" screen and opting out of each data-collection category or website one at a time. The net effect is that accepting all cookies is far easier than declining them.[8][9]
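The asymmetry described above can be made concrete by comparing the interaction cost of each path. The sketch below is a minimal, hypothetical model (all class names, fields, and numbers are illustrative, not drawn from any cited study): a balanced design would score close to 1.0, while a typical manipulative banner scores much higher.

```python
from dataclasses import dataclass

@dataclass
class ConsentPath:
    """One way of resolving a cookie banner (hypothetical model)."""
    label: str
    clicks_required: int   # taps/clicks needed to complete the choice
    screens_shown: int     # separate screens the user must pass through

def asymmetry_ratio(accept: ConsentPath, decline: ConsentPath) -> float:
    """Ratio of decline effort to accept effort; 1.0 means a balanced design."""
    accept_cost = accept.clicks_required + accept.screens_shown
    decline_cost = decline.clicks_required + decline.screens_shown
    return decline_cost / accept_cost

# A typical manipulative banner: one click to accept, but declining means
# opening "Manage my choices" and toggling several categories off.
accept = ConsentPath("Accept All", clicks_required=1, screens_shown=1)
decline = ConsentPath("Manage my choices", clicks_required=6, screens_shown=2)

print(asymmetry_ratio(accept, decline))  # 4.0 -> declining costs 4x the effort
```

Regulators and researchers use more elaborate measures, but the underlying idea is the same: when the cost of refusing greatly exceeds the cost of accepting, the "choice" is not neutral.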

Incentives and short-term gains

Dark patterns persist because they work in the short term: they raise conversion rates, sign-ups, and data collection. Competitive pressure also encourages copycat behavior, as companies imitate their rivals' tactics.

Research suggests these short-term gains often carry long-term costs. Studies indicate that users who feel manipulated tend to avoid not just a product's settings but the brand itself, and that this erosion of trust can have significant business consequences.

United States framework

In the United States, regulation occurs primarily through existing consumer protection statutes. The FTC Act empowers the Federal Trade Commission to take action against "unfair or deceptive acts or practices in or affecting commerce".[10]

In October 2024, the FTC amended its Negative Option Rule to include specific requirements for cancellation mechanisms, implementing a "Click-to-Cancel" provision.[11] The FTC later voted on 9 May 2025 to extend the original 14 May 2025 compliance deadline by sixty days.[12][13]

On 8 July 2025, the Eighth Circuit Court of Appeals vacated the entire 2024 amendment to the Negative Option Rule on procedural grounds in Custom Communications, Inc. v. Federal Trade Commission.[14][15] Despite this setback, the FTC pursued enforcement actions against Match.com, Chegg Inc., Cleo AI, and Amazon under the Restore Online Shoppers' Confidence Act (ROSCA) and Section 5 of the FTC Act.[16]

On 30 January 2026, the FTC signaled renewed interest in updating the Negative Option Rule by submitting a draft Advance Notice of Proposed Rulemaking (ANPRM) to the Office of Management and Budget (OMB) for review.[17][18] The draft was opened for public comment on 11 March 2026.[19]

European Union's approach

The European approach combines general consumer protection laws with data privacy-specific regulations. While the General Data Protection Regulation (GDPR) doesn't explicitly mention dark patterns, its requirements for valid consent effectively prohibit many deceptive designs.[20]

The Digital Services Act (DSA) and Digital Markets Act (DMA) further address dark patterns by prohibiting practices that "deceive or manipulate" users.[21]

Enforcement cases and penalties

Recent years have seen significant enforcement actions:

  • Epic Games paid $245 million to settle charges related to deceptive patterns in Fortnite.[22]
  • Noom paid $62 million to settle charges regarding deceptive subscription practices.[23]
  • TikTok received a €345 million fine for failing to protect children's data through manipulative consent practices.[24]

Impact on consumers and businesses


Consumer harms

Dark patterns create multiple forms of harm for consumers, ranging from financial losses to privacy violations and emotional distress. Privacy harms occur when users are manipulated into sharing more personal data than they intended. Emotional and psychological harms include frustration, stress, and feelings of betrayal.[1][20]

Vulnerable groups are disproportionately affected: people with low digital literacy, cognitive impairments, or disabilities often struggle to recognize manipulative designs.

Business implications

While dark patterns may deliver short-term benefits, they often create long-term risks for businesses. The erosion of consumer trust can have lasting negative impacts on customer retention and brand reputation. Businesses also face increasing regulatory risks as enforcement actions become more common and severe.[1]

Detection, avoidance and mitigation


Technical detection and tools

Efforts to automatically detect dark patterns are evolving but face significant challenges. A comprehensive study found that existing tools could only identify 31 of 68 identified dark pattern types, a coverage rate of just 45.5%.[5] The study proposed a Dark Pattern Analysis Framework (DPAF) to address existing gaps.
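The limits of automated detection are easy to see in a toy example. The sketch below uses hypothetical rules and names (it is not the DPAF or any real tool) to flag a few dark patterns that leave textual traces, such as scarcity and urgency messages. Patterns that live in layout, color, or interaction flow are invisible to a scanner like this, which is one reason coverage of catalogued pattern types remains below half.

```python
import re

# Hypothetical rule-based scanner for text-visible dark patterns.
# Each rule maps a pattern name to a trigger-phrase regex.
RULES = {
    "false scarcity": re.compile(r"\bonly \d+ left\b", re.IGNORECASE),
    "urgency timer": re.compile(r"\b(offer|sale|deal) ends in\b", re.IGNORECASE),
    "activity message": re.compile(r"\b\d+ people are (viewing|looking at)\b",
                                   re.IGNORECASE),
}

def scan(page_text: str) -> list[str]:
    """Return the names of patterns whose trigger phrases appear in the text."""
    return [name for name, rx in RULES.items() if rx.search(page_text)]

hits = scan("Hurry! Only 2 left in stock - offer ends in 10 minutes!")
print(hits)  # ['false scarcity', 'urgency timer']
```

A keyword rule cannot tell whether the countdown is genuine or the inventory claim is true, and it misses visual patterns entirely, so real detection research combines text analysis with screenshot and interaction-flow analysis.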

Ethical design alternatives

Companies can implement ethical alternatives that respect user autonomy. A balanced choice architecture, in which users can decline as easily as they accept, is the ethical counterpart to obstruction patterns. Designers should also implement neutral default settings that do not assume consent.[9]

Transparency and clear communication are essential. Companies should provide honest explanations of data practices and costs in clear, understandable language.

Consumer protection and advocacy

Consumer education plays a crucial role. Initiatives like the Dark Patterns Tip Line allow users to report deceptive designs they encounter. Advocacy organizations provide resources to help identify and avoid dark patterns.[2]

Further reading

Examples of dark patterns, with notes.

References

  1. "Bringing Dark Patterns to Light". Federal Trade Commission. Sep 2022. Archived from the original on 9 Dec 2025. Retrieved 22 Mar 2026.
  2. Brignull, Harry; Leiser, Mark; et al. (25 Apr 2023). "Dark Patterns: inside the interfaces designed to trick you". Deceptive.Design. Archived from the original on 22 Mar 2026. Retrieved 22 Mar 2026.
  3. Cimpanu, Catalin (11 Nov 2019). "Study of over 11,000 online stores finds 'dark patterns' on 1,254 sites". ZDNET. Archived from the original on 14 Nov 2025. Retrieved 8 Nov 2025.
  4. Lupiáñez-Villanueva, Francisco; Boluda, Alba; et al. (Apr 2022). "Behavioural study on unfair commercial practices in the digital environment". Publications Office of the EU. doi:10.2838/859030. ISBN 978-92-76-52316-1. Archived from the original on 18 Jan 2026. Retrieved 22 Mar 2026.
  5. Li, Meng; Wang, Xiang; Nei, Liming; Li, Chenglin; Liu, Yang; Zhao, Yangyang; Xue, Lei; Kabir Sulaiman, Said (12 Dec 2024). "A Comprehensive Study on Dark Patterns". arXiv:2412.09147. doi:10.48550/arXiv.2412.09147. Archived from the original on 9 Nov 2025. Retrieved 8 Nov 2025.
  6. Keller and Heckman LLP (28 Sep 2020). "FTC Targets Negative Option Schemes in Two Multimillion Dollar Settlements". Lexology. Archived from the original on 14 Nov 2025. Retrieved 28 Nov 2025.
  7. "FTC Charges Adobe". Federal Trade Commission. 17 Jun 2024. Archived from the original on 17 Jun 2024. Retrieved 22 Mar 2026.
  8. Stroink-Skillrud, Donata (2 Feb 2023). "Your Cookie Consent Banner is (Probably) Not Compliant". MainWP. Archived from the original on 16 Feb 2026. Retrieved 22 Mar 2026.
  9. Keyser, Robert (5 Oct 2023). "Cookie Consent Dark Patterns: How to Identify and Fix Them". Ethyca. Archived from the original on 12 Dec 2025. Retrieved 11 Aug 2025.
  10. "FTC Act". Federal Trade Commission. Archived from the original on 27 Jan 2026. Retrieved 22 Mar 2026.
  11. "Federal Trade Commission Announces Final "Click-to-Cancel" Rule Making It Easier for Consumers to End Recurring Subscriptions and Memberships". Federal Trade Commission. 16 Oct 2024. Archived from the original on 17 Oct 2024. Retrieved 22 Mar 2026.
  12. "FTC Votes on Negative Option Rule Deadline". Federal Trade Commission. 9 May 2025. Archived from the original on 10 May 2025. Retrieved 22 Mar 2026.
  13. Ferguson, Andrew N.; Holyoak, Melissa; Meador, Mark R. (9 May 2025). "Statement of the Commission Regarding the Negative Option Rule". Federal Trade Commission. Archived from the original on 10 May 2025. Retrieved 22 Mar 2026.
  14. "Click to Cancel Just Got Cancelled: Eighth Circuit Vacates Entirety of FTC's Negative Option Rule". Cooley. 11 Jul 2025. Archived from the original on 25 Jul 2025. Retrieved 22 Mar 2026.
  15. Conkle, Brooke; Cover, Jason; et al. (10 Jul 2025). "Eighth Circuit Vacates FTC's Negative Option Rule for Procedural Violations". Consumer Financial Services Law Monitor. Archived from the original on 19 Jul 2025. Retrieved 22 Mar 2026.
  16. Goodrich, Brian J.; Genn, Benjamin; et al. (25 Sep 2025). "FTC Steps Up Subscription Enforcement After "Click to Cancel" Rule Struck Down". Holland & Knight. Archived from the original on 26 Sep 2025. Retrieved 22 Mar 2026.
  17. "FTC Submits Draft ANPRM Related to Negative Option Plans to OMB for Review". Federal Trade Commission. 30 Jan 2026. Archived from the original on 31 Jan 2026. Retrieved 22 Mar 2026.
  18. "U.S. FTC Signals Renewed Interest in "Click-to-Cancel" Rulemaking". Sidley. 9 Feb 2026. Archived from the original on 22 Mar 2026. Retrieved 22 Mar 2026.
  19. "FTC Seeks Public Comment in Response to Advance Notice of Proposed Rulemaking Regarding Negative Option Marketing Practices". Federal Trade Commission. 11 Mar 2026. Archived from the original on 11 Mar 2026. Retrieved 22 Mar 2026.
  20. "Guidelines on Dark Patterns in Social Media Platform Interfaces". European Data Protection Board. 14 Feb 2023. Archived from the original on 26 Feb 2023. Retrieved 22 Mar 2026.
  21. "Digital Services Act". European Commission. Archived from the original on 16 Feb 2026. Retrieved 22 Mar 2026.
  22. "Epic Games to Pay $245 Million". Federal Trade Commission. 19 Dec 2022. Archived from the original on 19 Dec 2022. Retrieved 22 Mar 2026.
  23. Davis, Ayumi (14 Feb 2022). "Noom to Pay $62M to Customers Forced Into Renewals They Didn't Want". Newsweek. Archived from the original on 14 Feb 2022. Retrieved 22 Mar 2026.
  24. "Irish Data Protection Commission announces €345 million fine of TikTok". Data Protection Commission. 15 Sep 2023. Archived from the original on 1 Feb 2026. Retrieved 22 Mar 2026.