
How Do Dark Patterns Affect Your Data Privacy?

Dark patterns are one way apps and websites steer consumers into making the choice that’s right for the app or website but wrong for the consumer. Dark patterns make it harder for you to select higher privacy or security settings, so that the companies behind them can collect more of your data and monetize your engagement.


These practices on websites, forms, emails, apps, and elsewhere are intended to trick or manipulate visitors into taking actions that an entity wants them to take, that the user didn’t intend, or sometimes that the user actively didn’t want to take. They eschew transparent communication and more benign persuasion. Instead, they aim to reduce the chance that users will limit access to their data, cancel a subscription, leave the website, or take other actions the entity would rather they avoid.


The term “dark patterns” was coined by UK-based user experience researcher Harry Brignull.


Some examples of ways dark patterns can manifest include:

  • Buttons or other user interface elements that encourage selecting one option over others via color, size, placement or text format

  • Necessary text that is intentionally made hard to notice via size, color, or placement

  • Interactive elements (like a toggle) that are extremely difficult to select or deselect

  • Making the entity’s preferred action the default selection (see the sketch after this list)

  • A sign-up form that uses complex or confusing language and obscures what the user is really agreeing to

  • Hiding or obscuring information, like pricing

  • Omitting or downplaying potentially negative information

  • Requiring the user to actively decline or remove options (like purchase add-ons) they don’t want and didn’t select

  • Making it difficult to cancel a subscription
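
To make the default-selection example above concrete, here is a minimal sketch in browser TypeScript. The function name, form field, and label are all hypothetical, and real consent flows are more involved, but the only difference between the dark pattern and the neutral design is the checkbox’s initial state.

    // A single builder; the dark pattern and the neutral design differ only in
    // whether the marketing-consent checkbox starts out checked.
    function buildConsentCheckbox(form: HTMLFormElement, preselected: boolean): void {
      const box = document.createElement("input");
      box.type = "checkbox";
      box.name = "marketingConsent";
      box.id = "marketingConsent";
      box.checked = preselected;

      const label = document.createElement("label");
      label.htmlFor = "marketingConsent";
      label.textContent = "Send me personalized offers";

      form.append(box, label);
    }

    const form = document.createElement("form");
    buildConsentCheckbox(form, true);   // dark pattern: inaction becomes "consent"
    // buildConsentCheckbox(form, false); // neutral: consent requires a deliberate act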


Types of dark patterns

  1. Trick questions: A question on a form or a banner with options may appear at first glance to say one thing, but if you read more closely, you’re answering or agreeing to something else entirely. Double negatives and similar tricks show up here, e.g. “Uncheck this box if you do not want to stop receiving emails.”

  2. Roach motels: This user experience tactic obfuscates navigation: it’s easy to get into a situation, but much harder to get out. You often run into it when you want to stop doing something you initially agreed to, like withdrawing your consent for data collection or canceling a subscription. Under many privacy laws this is illegal: it must be as easy to withdraw consent as it was to give it initially (see the first sketch after this list).

  3. Privacy “Zuckering”: Named after the Facebook CEO, this involves using tricks to get users to share more information about themselves than they wanted or meant to. In data privacy terms, it can show up where users are not given equal ability to consent to or refuse data collection, or where the most permissive consent settings are made the default.

  4. Misdirection: This tactic distracts your attention from one thing by focusing it on another, usually while something happens with the thing you’re not supposed to be paying attention to. It can tie in to nudging, where design elements encourage you to select one option over another, like consent over rejection. Or the text may go on about how the company needs your consent for the best customer experience, or to make the website work correctly, which is probably true, but at the same time you’re being nudged toward giving consent for marketing data collection, which isn’t essential (see the second sketch after this list).

  5. Confirmshaming: Trying to convince you that the company can’t deliver products or services (or won’t remain in business) without your consent to collect your data is an example of this. Telling you that bad things will result from saying no or cancelling a service or subscription is also common, as is implying you’re doing something stupid if you decline the “offer”. This is also referred to as “negative opt-out”.

  6. Nudging: Nudging itself is not a dark pattern, but depending on how it’s used, it can become one. Nudging refers to the use of user interface or design elements, sometimes referred to as “choice architecture”, to guide user behavior. Some nudges you don’t even notice, and they’re innocuous, even beneficial. Others are more manipulative, possibly even illegal, and that’s where they move into the category of dark patterns.
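
As a rough illustration of the roach motel asymmetry described in item 2 above, the TypeScript sketch below (with invented names and a simple in-memory consent store) lets users opt in with one call but routes withdrawal through extra hurdles. The compliant version mirrors the one-step grant, in line with the GDPR’s rule that withdrawing consent must be as easy as giving it.

    type UserId = string;
    const consents = new Map<UserId, boolean>();

    // One step to opt in...
    function grantConsent(userId: UserId): void {
      consents.set(userId, true);
    }

    // ...but the roach-motel withdrawal path piles on conditions the grant
    // path never had: re-confirmation, an exit survey, and so on.
    function withdrawConsentRoachMotel(
      userId: UserId,
      confirmedTwice: boolean,
      exitSurveyCompleted: boolean,
    ): void {
      if (!confirmedTwice || !exitSurveyCompleted) {
        throw new Error("Please complete the exit survey and confirm again.");
      }
      consents.set(userId, false);
    }

    // Compliant alternative: withdrawal mirrors the grant, a single call with
    // no extra conditions attached.
    function withdrawConsent(userId: UserId): void {
      consents.set(userId, false);
    }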
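
And as a sketch of the misdirection and nudging items, the fragment below shows how purely visual choices can tilt a consent banner: the accept control is a large, high-contrast button, while rejection is demoted to a small, low-contrast link behind an extra step. The styling values and copy are made up for illustration; the point is the asymmetry, not any specific site.

    function buildNudgedBanner(): HTMLDivElement {
      const banner = document.createElement("div");

      const accept = document.createElement("button");
      accept.textContent = "Accept all cookies";
      // Prominent: large, colorful, and first in reading and tab order.
      accept.style.cssText = "background:#2e7d32;color:#fff;font-size:1.1rem;padding:12px 24px;";

      const reject = document.createElement("a");
      reject.textContent = "manage preferences"; // rejection hidden behind extra steps
      reject.href = "#preferences";
      // Demoted: small and low-contrast, easy to miss entirely.
      reject.style.cssText = "color:#999;font-size:0.7rem;";

      banner.append(accept, reject);
      return banner;
    }

    // A neutral banner would style "Accept" and "Reject" identically and make
    // both a single click.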

How do the GDPR and CCPA address dark patterns?

To date, dark patterns are not explicitly mentioned in the GDPR. A study published in January 2020, “Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence”, found that dark patterns remain common practice, ranging from mildly questionable gray-area tactics to outright illegal, noncompliant activity.

Some EU enforcement authorities are taking more aggressive action. French data protection authority CNIL released a report in April 2019 called “Shaping Choices in the Digital World”, with the subtitle “From dark patterns to data protection: the influence of ux/ui design on user empowerment”. The CNIL’s €50 million fine against Google was also due in part to dark pattern usage around privacy settings.

Several bills have been introduced in the US Senate to address dark pattern usage, like the Deceptive Experiences to Online Users Reduction Act (DETOUR Act), which would lean on the Federal Trade Commission’s powers to curb dark pattern usage. None of the introduced legislation has been passed or become law, however.

The requirement for transparency and informed, voluntary consent is central to the current California Consumer Privacy Act (CCPA), which in theory guards against the kinds of activities that dark pattern usage encompasses. Modifications to the CCPA were proposed in October 2020, among them:

  • Limiting the number of steps required for a consumer to opt out of the sale of their personal information (opting out cannot require more steps than opting in)

  • Prohibiting businesses from using confusing language to prevent consumer opt out

  • Prohibiting businesses from requesting personal information from consumers trying to opt out when it is not necessary to complete the request

  • Prohibiting businesses from forcing the consumer to read or listen to a list of reasons not to opt out while they are trying to opt out

  • Prohibiting businesses from requiring consumers to search or scroll through a privacy policy, web page, etc. to find how to submit an opt out request when they have clicked “Do Not Sell My Personal Information”

Advertising and marketing associations were not fans of these proposed amendments, particularly the prohibition on forcing consumers to read or listen to a list of reasons not to opt out. They claimed the amendments would unduly hinder consumers’ receipt of “factual, critical information about the nature of the ad-supported Internet”, thus “undermining a consumer’s ability to make an informed decision.”

California’s upcoming privacy law, the California Privacy Rights Act (CPRA), explicitly addresses dark patterns, listing them in its definitions as: “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision‐making, or choice, as further defined by regulation”. Dark patterns are also referenced in the CPRA section about consent, which notes that “agreement obtained through use of dark patterns does not constitute consent.”

It is reasonable to expect that dark patterns and comparable tactics will be addressed in new privacy legislation tabled in various countries, and likely added to updates of existing laws, especially as technologies change and new applications become possible.
