Dark Patterns

From SI410

Dark patterns are intentional design strategies used in websites and applications to steer users into actions they might not otherwise take, such as buying a product that was automatically added to their shopping cart, signing up for a subscription service, or posting a social media status without giving explicit permission. The term, coined in 2010 to raise awareness of these practices, is closely related to the field of user experience (UX) design, the process of making a product usable, accessible, and pleasurable for customers. Dark patterns prioritize company goals (e.g. generating revenue) over the goals of users or customers. There are currently eleven recognized types of dark patterns, along with eight additional patterns related specifically to privacy.


History

The term “dark patterns” was coined in 2010 by London-based UX designer Harry Brignull.[1] He created darkpatterns.org to publicly shame websites that employ dark patterns and to raise awareness about these practices. Brignull uses Twitter to let the public submit dark patterns by sharing a screenshot of the dark pattern and tagging @darkpatterns or using the hashtag #darkpattern.

Other user experience professionals have joined the conversation about dark patterns. Gary Bunker, a user experience strategist, has proposed a standard code of conduct, or “moral compass,” that would stop UX professionals from working on projects that employ dark patterns.[2] Paul Brooks, Global UX Design Authority at PGRX, adds that there are other ways to drive revenue than dark patterns.[3]

However, Chris Nodder, author of Evil by Design and previously a Senior User Researcher at Microsoft, believes that the morality of dark patterns depends on the designer's intentions: it is acceptable to trick users if the deception benefits them or if they gave implicit permission, much as audience members at a magic show implicitly consent to being deceived.[4]

Types of Dark Patterns

As of March 2017, eleven types of dark patterns have been identified. Many of the examples below combine more than one pattern.

Bait and Switch

The bait and switch occurs when a user sets out to do one thing, but a different, undesirable thing happens instead.

An example is Microsoft's use of pop-up prompts asking users to upgrade to the Windows 10 operating system. Over time, these pop-ups became increasingly forceful and began employing dark patterns such as the bait and switch: the “X” in the upper right corner of the pop-up window, which conventionally means close or cancel, was changed so that clicking it started the Windows 10 update without the user intending to do so.[5]

Disguised Ads

Example of the disguised ads dark pattern on Softpedia.com, where advertisements (highlighted in pink) look like download buttons, so users accidentally click on them.

Advertisements are disguised to look like part of the website or application, typically as part of navigation or content, to encourage users to click on them.

An example is Softpedia, a software download site that generates revenue through advertising. Advertisements on the site often look like download buttons, causing users to click the ads instead of the file they actually wanted to download.[6]

Forced Continuity

Forced continuity occurs when a free trial of a service ends and a user's credit card is charged without a reminder. In some cases, the membership is also hard to cancel.

An example is Affinion Group, an international loyalty program company that handles customer relations for businesses. When buying a train ticket through Trainline (a partner ecommerce business), users are confronted with a disguised ad at the top of the confirmation page featuring a large “Continue” button, fooling them into clicking it. They are then taken to an external website that tries to convince them to enter payment information in exchange for a voucher off their purchase. In doing so, the user is actually signing up for a Complete Savings membership, which is not clearly explained on the page. Users who do sign up may not realize it is a subscription service until the charges appear on their bank statements.[7]

Friend Spam

The website or application asks for a user's email or social media permissions, ostensibly to find the user's friends, and then sends messages to the user's contacts.

For example, when signing up for a LinkedIn account, the website encourages users to grant LinkedIn access to their email contacts in order to grow their network. If a user accepts, LinkedIn sends emails to all of the user's contacts that appear to come from the user.[8]

Hidden Costs

The hidden costs pattern typically occurs in the checkout process, where some costs are not shown upfront. A user may encounter unexpected charges, such as delivery or tax fees, at the very end of the checkout process, just before confirming an order. Because these costs appear only at the end, a user who has already invested time in the checkout process is less inclined to look for another service.

For example, ProFlowers, a flower retailer in the United States, has a six-step checkout process. When a user adds an item to their shopping cart, its cost is shown upfront. But after the user completes the fifth step, entering payment information, the final confirmation step displays delivery and “care and handling” costs that were never mentioned before.[9]
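The mechanics of this pattern can be sketched in a few lines of code. This is a minimal, hypothetical illustration; all item names, prices, and fees below are invented for the example and are not taken from any real retailer.

```python
# Hypothetical sketch of the hidden-costs pattern: the price shown when an
# item is added to the cart omits fees that are only revealed at the final
# confirmation step, after the user has invested time in checkout.

ITEM_PRICE = 29.99  # price displayed when the item is added to the cart

HIDDEN_FEES = {
    "delivery": 9.99,          # shown only on the final step
    "care_and_handling": 2.99,  # shown only on the final step
}

def displayed_price() -> float:
    """Price the user sees during the early checkout steps."""
    return ITEM_PRICE

def final_total() -> float:
    """Total revealed only after payment details have been entered."""
    return round(ITEM_PRICE + sum(HIDDEN_FEES.values()), 2)

print(displayed_price())  # 29.99
print(final_total())      # 42.97
```

The gap between `displayed_price()` and `final_total()` is the pattern: the user commits based on the first number and only sees the second after several steps of effort.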


Misdirection

Misdirection purposefully focuses a user's attention on one part of the page or website to distract them from another part of the interface.

For example, the Australian low-cost airline jetstar.com by default pre-selects an airplane seat for the customer during the flight-booking process, at a cost of $5. A user can skip seat selection at no extra cost, but this option is buried at the bottom of the page, misdirecting the customer into thinking they must choose a seat.[10]

Price Comparison Prevention

The price comparison prevention dark pattern makes it hard for a user to compare the price of one item with another, impeding informed decisions. This is common with online retailers that sell packaged products without showing the price per weight, making it hard to work out the unit price of items within a bundle.

An example is sainsburys.co.uk, an online grocery service. The site hides the price per weight, so users cannot calculate whether it is cheaper to buy loose items or a packaged bundle.[11]

Roach Motel

The roach motel dark pattern makes it easy for a user to get into a particular situation but difficult to get out of it later.

An example is a ticket sales site like Ticketmaster, which by default sneaks a magazine subscription into a user's shopping cart. If the user doesn't notice and proceeds, they are automatically charged. To unsubscribe and get a refund, the user has to download a form and send the request by postal mail.[12]

Sneak Into Basket

The dark pattern “sneak into basket,” also known as “negative option billing” and “inertia selling,” appears when a website sneaks an additional item into a user's shopping cart. This occurs most often via an opt-out radio button or checkbox on a preceding page.

An example of this dark pattern is GoDaddy.com, a domain name hosting site. When a user buys a domain name bundle (e.g. three domains at a 69% saving), the site automatically adds privacy protection at $7.99 per domain per year. When the user proceeds to checkout, it is revealed that two years of domain name registration for all purchased domains have been added to the basket.[13]
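The underlying logic can be sketched as an opt-out default: inaction adds the extra charge. This is a hypothetical illustration; the function, field names, and prices below are invented for the example and do not reflect any real site's code or pricing.

```python
# Hypothetical sketch of "sneak into basket": an opt-out checkbox on a
# preceding page defaults to checked, so an extra paid item lands in the
# basket unless the user notices and unticks it.

def build_basket(domain_years: int, privacy_opt_out: bool = False) -> dict:
    """Return basket contents; the add-on is included by default (opt-OUT)."""
    basket = {"domain_registration": 11.99 * domain_years}
    if not privacy_opt_out:  # doing nothing adds the charge
        basket["privacy_protection"] = 7.99 * domain_years
    return basket

# A user who never saw the pre-checked box pays for the extra item:
print(build_basket(domain_years=2))
# A user who noticed and unticked it does not:
print(build_basket(domain_years=2, privacy_opt_out=True))
```

The ethically relevant detail is the default value of `privacy_opt_out`: flipping it to an opt-in (`False` meaning "not added") would make the same add-on a legitimate upsell rather than a dark pattern.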

Trick Questions

The trick questions dark pattern is when a user is asked a question that at first glance appears to ask one thing but, on closer reading, asks another thing entirely. It often appears when registering or signing up for a service.

An example is Currys PC World's registration form, which presents two checkboxes: one reading “Please do not send me details of products and offers from Currys.co.uk” and, below it, “Please send me details of products and offers from third party organisations...” Because the two questions have opposite polarity, a user who doesn't read carefully may tick both boxes, assuming both mean unsubscribe, and thereby opt in to third-party marketing.[14]
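The trap is the inverted polarity of adjacent checkboxes, which can be made explicit in a short sketch. This is a hypothetical illustration; the function and field names are invented for the example.

```python
# Hypothetical sketch of the trick-questions pattern: two adjacent checkboxes
# with opposite polarity, so treating them the same does the opposite of what
# a skimming user expects.

def marketing_consent(tick_first_box: bool, tick_second_box: bool) -> dict:
    # Box 1: "Please do NOT send me offers from this retailer"  (tick = opt OUT)
    # Box 2: "Please send me offers from third parties"         (tick = opt IN)
    return {
        "retailer_emails": not tick_first_box,
        "third_party_emails": tick_second_box,
    }

# A user who ticks both boxes, assuming both mean "unsubscribe":
result = marketing_consent(tick_first_box=True, tick_second_box=True)
print(result)  # retailer emails stop, but third-party emails start
```

A consistent form would give both checkboxes the same polarity (both opt-in or both opt-out); the inconsistency is what makes this a trick question.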

Privacy Zuckering

Privacy zuckering is a design pattern that gets users to publicly share more information about themselves than they intend to. It is named after Facebook CEO Mark Zuckerberg, because older versions of Facebook made it easy for users to overshare their data by making privacy settings hard to find or access.

Today, this pattern typically appears in the small print of a service's terms and conditions, which grants the service permission to sell a user's personal data, such as physical and mental health information, to data brokers, who then resell the information to third parties.[15]

Additional Privacy-related Dark Patterns

In 2016, Bösch et al. published a research paper outlining eight more dark patterns focused specifically on privacy violations.[16]

  • Maximize – This dark pattern occurs when a website or application requests and collects more data than is needed for the task. An example is a sign-up form requesting more personal information from a user than is necessary.
  • Publish – Personal data is made public or is not hidden from plain view. An example is a website that collects sensitive personal data but does not encrypt it or control access to it.
  • Centralize – Personal data is collected, stored, or processed by a central entity rather than in a distributed way that would prevent interrelationships in personal data from forming. An example is flash cookies: cookies stored centrally by the Flash plug-in on a user's file system, and therefore not restricted to a specific web browser.
  • Preserve – Personal data is kept in its original, detailed state after processing rather than being aggregated or deleted, so interrelationships within the data remain recoverable. An example is telecommunications data retention, where traffic analysis of retained records can reveal relationships between people.
  • Obscure – It is difficult or impossible for users to learn how their personal data is collected, stored, and used. This is related to Privacy Zuckering (above).
  • Deny – This occurs when users are denied control over their personal data. An example is when WhatsApp, a mobile messaging application, automatically shared users' online statuses with everyone subscribed to the user's phone number, and didn't provide an option to control who could see the status.
  • Violate – This dark pattern occurs when a privacy policy presented to the user is intentionally violated. Users are often unaware of the violation, so it does not harm their trust in the company or service. This strategy is illegal and is never advertised publicly by companies.
  • Fake – This occurs when an entity that collects, stores, or processes personal data claims to have strong privacy protection but actually doesn't. Examples include fake, self-designed privacy seals on websites that make users feel more secure even though the site holds no real certification.

Ethical Implications

Misinformation (Floridi 78-79)

(See privacy-related dark patterns above)

References


  1. Grauer, Y. (2016). Dark Patterns are designed to trick you (and they're all over the Web). Ars Technica. Retrieved 16 March 2017, from https://arstechnica.com/security/2016/07/dark-patterns-are-designed-to-trick-you-and-theyre-all-over-the-web/
  2. Bunker, G. (2013). The ethical line in user experience research. Mumbrella. Retrieved 16 March 2017, from https://mumbrella.com.au/the-ethical-line-in-user-experience-research-163114
  3. Brooks, P. (2017). Has the Recession Taken Your Experience to the Dark Side?. Uxbooth.com. Retrieved 16 March 2017, from http://www.uxbooth.com/articles/has-the-recession-taken-your-experience-to-the-dark-side
  4. Nodder, C. (2013). How Deceptive Is Your Persuasive Design? UX Magazine. Retrieved 16 March 2017, from https://uxmag.com/articles/how-deceptive-is-your-persuasive-design
  5. Bait and Switch. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/bait-and-switch
  6. Disguised Ads. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/disguised-ads
  7. Forced Continuity. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/forced-continuity
  8. Friend Spam. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/friend-spam
  9. Hidden Costs. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/hidden-costs
  10. Misdirection. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/misdirection
  11. Price Comparison Prevention. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/price-comparison-prevention
  12. Roach Motel. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/roach-motel
  13. Sneak Into Basket. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/sneak-into-basket
  14. Trick Questions. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/trick-questions
  15. Privacy Zuckering. (2017). Darkpatterns.org. Retrieved 15 February 2017, from https://darkpatterns.org/types-of-dark-pattern/privacy-zuckering
  16. Bösch, C., et al. (2016). Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns. Proceedings on Privacy Enhancing Technologies, 4, pp. 237-254.