Dark pattern
Dark patterns are manipulative design practices in digital interfaces that trick or influence users into making decisions that may not align with their true preferences or interests. These techniques exploit cognitive biases and behavioral psychology to benefit businesses, often at the expense of user autonomy. The term was coined by UX designer Harry Brignull in 2010, and the concept has since become a significant focus of regulatory scrutiny and academic research.[1][2]
Dark patterns are remarkably widespread. A 2019 study examining roughly 11,000 e-commerce websites found that approximately 10% employed deceptive practices, while a 2022 European Commission report indicated that 97% of popular apps used by EU consumers deployed at least one dark pattern.
Definition and terminology
The term dark patterns was originally defined by Harry Brignull as "design tricks that manipulate users into taking actions they didn't intend to." The Federal Trade Commission (FTC) describes them as "design practices that trick or manipulate users into making choices they would not otherwise have made and that may cause harm."[1][2]
There is ongoing discussion about the most appropriate terminology. Alternative labels include deceptive design, manipulative UX, coercive design, and anti-patterns. Some advocates argue for terms like deceptive patterns, both to describe the intentional nature of these designs more accurately and to avoid potential racial connotations of the word "dark." Brignull himself has adopted this framing, rebranding his project as deceptive.design.[2]
What distinguishes dark patterns from merely persuasive design is their exploitative nature – they are not about creating value for users but about benefiting the service provider through manipulation and deception.
Common types and examples
Research has identified numerous specific dark patterns, with one comprehensive study proposing a taxonomy comprising 68 distinct types. These manifest across various industries and digital contexts.
Obstruction patterns
These designs make actions users actually want to take (like rejecting tracking) significantly more difficult than the alternatives the business prefers. A classic example is the Roach Motel pattern, where signing up for a service is straightforward but cancellation is excessively difficult. The FTC highlighted this pattern in its case against ABCmouse, where cancellation was made "extremely difficult" despite a promise of "Easy Cancellation."
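The asymmetry at the heart of this pattern can be made concrete by comparing flow lengths. The following is a minimal sketch; the step names are hypothetical and not taken from any specific service:

```typescript
// Hypothetical illustration of the Roach Motel pattern: enrollment takes
// one step, while cancellation buries the user under extra hurdles.
const signupFlow: string[] = ["Click 'Subscribe'"];

const cancellationFlow: string[] = [
  "Log in",
  "Open account settings",
  "Locate the buried 'Manage subscription' link",
  "Dismiss a retention offer",
  "Call customer support during business hours",
];

// The asymmetry itself is the measurable red flag for a design review.
console.log(signupFlow.length);       // 1
console.log(cancellationFlow.length); // 5
```

An auditor can apply the same test informally: if leaving a service takes materially more effort than joining it, the design warrants scrutiny.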
Interface interference
This category includes designs that manipulate interface elements to steer user behavior. Misdirection focuses user attention on one element to obscure another critical detail. Disguised ads blend advertisements with genuine interface elements, like fake "Download" buttons on software websites.[1]
Forced action
These patterns require users to complete unnecessary actions to access desired functionality. Forced registration demands that users create an account to complete a task. Forced continuity involves automatically transitioning users from free trials to paid subscriptions without adequate notification. The FTC alleged that Adobe violated regulations by "tricking customers into enrolling in subscription plans without proper disclosure."[1][3]
Sneaking and information hiding
These practices involve concealing or obscuring material information from users. Hidden costs reveal unexpected fees only at checkout, a practice employed by ticketing platforms. Drip pricing advertises only part of a product's total price initially and then imposes other mandatory charges later.[1]
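A minimal sketch of the arithmetic behind drip pricing follows; the fee names and amounts are hypothetical:

```typescript
// Hypothetical illustration of drip pricing: the advertised price omits
// mandatory fees that only surface at checkout.
interface Fee {
  label: string;
  amount: number; // same currency as the base price
}

function advertisedPrice(basePrice: number): number {
  // Only the base price appears on the product page.
  return basePrice;
}

function checkoutTotal(basePrice: number, mandatoryFees: Fee[]): number {
  // Unavoidable fees are added only at the final step.
  return mandatoryFees.reduce((total, fee) => total + fee.amount, basePrice);
}

const ticket = 50.0;
const fees: Fee[] = [
  { label: "Service fee", amount: 12.5 },
  { label: "Processing fee", amount: 4.99 },
];

console.log(advertisedPrice(ticket));     // 50    (what the user sees first)
console.log(checkoutTotal(ticket, fees)); // 67.49 (what the user actually pays)
```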
Social proof and urgency
These patterns exploit social influence and time pressure to manipulate decisions. False activity messages misrepresent site activity or product popularity. False scarcity creates pressure to buy immediately by claiming limited inventory. Baseless countdown timers display fake deadlines that simply reset once they expire.
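The reset behavior of a baseless countdown timer takes only a few lines to express. The sketch below is a hypothetical illustration of the mechanism, not code from any real site:

```typescript
// Hypothetical baseless countdown: the deadline is not tied to any real
// offer expiry, and it silently restarts once it reaches zero.
const COUNTDOWN_SECONDS = 15 * 60; // an arbitrary 15-minute "window"

let remaining = COUNTDOWN_SECONDS;

setInterval(() => {
  remaining -= 1;
  if (remaining <= 0) {
    // Nothing actually expires: the "deal" simply begins again.
    remaining = COUNTDOWN_SECONDS;
  }
  render(remaining);
}, 1000);

function render(seconds: number): void {
  const mm = Math.floor(seconds / 60);
  const ss = String(seconds % 60).padStart(2, "0");
  console.log(`Offer ends in ${mm}:${ss}`); // stand-in for a UI update
}
```

The tell-tale sign for users is exactly what the code shows: the timer restarts rather than the offer ending.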
Mind tricks and business incentives
Cognitive bias exploitation
Dark patterns manipulate users effectively by leveraging well-established cognitive biases. Default bias describes the tendency to stick with pre-selected options, exploited through pre-ticked checkboxes. Inertia makes users more likely to choose the path of least resistance. Loss aversion, the tendency to prefer avoiding losses, is triggered through messages suggesting users may lose functionality if they decline certain options.
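Default bias in particular can be baked directly into a form's initial state. The following is a hedged sketch with hypothetical option names, contrasting a pre-ticked form with a neutral one:

```typescript
// Hypothetical signup options showing how default bias is exploited:
// inertia works in the company's favor when consent is pre-selected.
interface SignupOptions {
  marketingEmails: boolean;
  shareDataWithPartners: boolean;
}

// Dark pattern: consent is assumed; most users never untick the boxes.
const darkDefaults: SignupOptions = {
  marketingEmails: true,
  shareDataWithPartners: true,
};

// Neutral alternative: the user must actively opt in to each option.
const neutralDefaults: SignupOptions = {
  marketingEmails: false,
  shareDataWithPartners: false,
};
```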
The effectiveness of these techniques is enhanced through A/B testing and data analytics, which allow companies to refine dark patterns against actual user behavior. This data-driven approach represents a significant evolution from earlier deceptive practices.
Incentives and short-term gains
The persistence of dark patterns is driven by their effectiveness in achieving short-term business objectives like increased conversion rates. Additionally, the competitive landscape fosters copycat behavior, as companies mimic their rivals' strategies.
Research suggests these short-term gains often come with long-term consequences. Studies indicate that "once users feel manipulated, they don't just avoid your settings—they avoid your brand." The erosion of trust can have significant business implications.
Legal and regulatory landscape
[edit | edit source]United States framework
In the United States, regulation occurs primarily through existing consumer protection statutes. The FTC Act empowers the Federal Trade Commission to take action against "unfair or deceptive acts or practices in or affecting commerce."[4]
In October 2024, the FTC amended its Negative Option Rule to include specific requirements for cancellation mechanisms, implementing a "Click-to-Cancel" provision.[5]
European Union's approach
The European approach combines general consumer protection laws with data privacy-specific regulations. While the General Data Protection Regulation (GDPR) doesn't explicitly mention dark patterns, its requirements for valid consent effectively prohibit many deceptive designs.[6]
The Digital Services Act (DSA) and Digital Markets Act (DMA) further address dark patterns by prohibiting practices that "deceive or manipulate" users.[7]
Enforcement cases and penalties
Recent years have seen significant enforcement actions:
- Epic Games paid $245 million to settle charges related to deceptive patterns in Fortnite.[8]
- Noom paid $62 million to settle charges regarding deceptive subscription practices.[9]
- TikTok received multimillion-euro fines over manipulative consent practices that failed to protect children's data.
Impact on consumers and businesses
Consumer harms
Dark patterns create multiple forms of harm for consumers, ranging from financial losses to privacy violations and emotional distress. Privacy harms occur when users are manipulated into sharing more personal data than they intended. Emotional and psychological harms include frustration, stress, and feelings of betrayal.[1][6]
Vulnerable groups are disproportionately affected: people with low digital literacy, cognitive impairments, or disabilities often struggle to recognize manipulative designs.
Business implications
While dark patterns may deliver short-term benefits, they often create long-term risks for businesses. The erosion of consumer trust can have lasting negative impacts on customer retention and brand reputation. Businesses also face increasing regulatory risks as enforcement actions become more common and severe.[1]
Detection, avoidance, and mitigation
[edit | edit source]Technical detection and tools
Efforts to automatically detect dark patterns are evolving but face significant challenges. A comprehensive study found that existing tools could detect only 31 of the 68 catalogued dark pattern types, a coverage rate of just 45.5%. The study proposed a Dark Pattern Analysis Framework (DPAF) to address the gaps.
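One reason coverage is limited is that many detectors lean on textual heuristics, which cannot see structural patterns such as obstruction flows. The sketch below is a deliberately simplified, hypothetical keyword heuristic; it is not the DPAF and is shown only to illustrate the approach and its blind spots:

```typescript
// Simplified, hypothetical keyword heuristic for flagging possible
// dark-pattern text in a UI. It catches urgency and confirmshaming cues
// but is blind to non-textual patterns like buried cancellation flows.
const URGENCY_CUES = ["only", "left in stock", "ends in", "hurry"];
const CONFIRMSHAMING_CUES = ["no thanks, i", "i don't want to save"];

function flagSuspiciousText(uiStrings: string[]): string[] {
  const cues = [...URGENCY_CUES, ...CONFIRMSHAMING_CUES];
  return uiStrings.filter((s) =>
    cues.some((cue) => s.toLowerCase().includes(cue))
  );
}

console.log(
  flagSuspiciousText([
    "Only 2 left in stock!",
    "No thanks, I hate saving money",
    "Add to cart",
  ])
); // flags the first two strings; a structural dark pattern would pass silently
```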
Ethical design alternatives
Companies can implement ethical alternatives that respect user autonomy. Providing balanced choice architecture, where users can decline as easily as they accept, is the ethical counterpart to obstruction patterns. Designers should implement neutral default settings that do not assume consent.
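These principles can be expressed as a simple invariant a design review might check. The dialog model and field names below are hypothetical:

```typescript
// Hypothetical consent-dialog model illustrating balanced choice
// architecture: accepting and declining are symmetric, single-step
// actions, and nothing is pre-selected on the user's behalf.
interface ConsentDialog {
  question: string;
  defaultSelection: "accept" | "decline" | null; // null = no assumed consent
  stepsToAccept: number;
  stepsToDecline: number;
}

const dialog: ConsentDialog = {
  question: "Allow analytics cookies?",
  defaultSelection: null, // neutral default: the user must choose
  stepsToAccept: 1,
  stepsToDecline: 1, // declining is exactly as easy as accepting
};

// The review invariant: symmetric effort and no assumed consent.
const isBalanced =
  dialog.stepsToAccept === dialog.stepsToDecline &&
  dialog.defaultSelection === null;
console.log(isBalanced); // true
```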
Transparency and clear communication are essential. Companies should provide honest explanations of data practices and costs in clear, understandable language.
Consumer protection and advocacy
Consumer education plays a crucial role. Initiatives like the Dark Patterns Tip Line allow users to report deceptive designs they encounter. Advocacy organizations provide resources to help identify and avoid dark patterns.[2]
References
1. "Bringing Dark Patterns to Light". Federal Trade Commission. September 2022. Archived from the original on September 16, 2025.
2. Brignull, Harry. "Dark Patterns: inside the interfaces designed to trick you". Deceptive.Design.
3. "FTC Charges Adobe". Federal Trade Commission. June 17, 2024.
4. "FTC Act". Federal Trade Commission.
5. "FTC Strengthens Negative Option Rule". Federal Trade Commission. October 11, 2024.
6. "Guidelines on Dark Patterns in Social Media Platform Interfaces". European Data Protection Board. 2022.
7. "Digital Services Act". European Commission.
8. "Epic Games to Pay $245 Million". Federal Trade Commission. December 19, 2022.
9. "Noom to Pay $62 Million". Federal Trade Commission. March 7, 2024.