Destination ‘dark patterns’: On the EU (digital) legislative train and line-drawing

by C. Rosca, in Law

The IMCO Committee is trying to amend the Unfair Commercial Practices Directive (2005/29) to include a ban on dark patterns. The proposed amendments are part of the EU’s plans to empower consumers for the green transition and the Parliament is expected to address the proposed amendments on 17 April 2023.


Dark patterns are user interface design choices that ‘benefit an online service by coercing, steering, or deceiving users into making unintended and potentially harmful decisions’, often by way of exploiting our cognitive biases. While computer scientists are familiar with dark patterns and have been raising awareness of their use in digital environments since 2010, it is only in recent years, with the emergence of studies measuring the prevalence of dark patterns and some incipient evidence on their effectiveness, that EU policy-makers have started taking note and trying to regulate the use of dark patterns. This may not always be a good idea.

‘Dark patterns’ are a catch-all concept in the field of Human-Computer Interaction to describe undesirable user interface design choices which may impair users’ autonomy, and could cause financial detriment and privacy harms. Dark pattern scholarship is yet to agree on a single definition. A 2021 study comparing 19 definitions of dark patterns found in prior studies and government materials showed that the definitions differed in terms of the UI characteristics that can affect users, the mechanism of effect for influencing users, the role of the interface designer (although most definitions require intention), as well as the benefits and harms resulting from a user interface design. This analysis also revealed within-study and across-studies discrepancies between the proposed definitions and the types of dark patterns described in prior work. 

That the concept of dark patterns is still ill-conceptualized is clear in the Digital Services Act (DSA, Regulation 2022/2065), the first EU legislative act to tackle dark patterns head-on. Art. 25(1) DSA requires online platforms not to ‘design, organise or operate their online interfaces in a way that deceives or manipulates (emphasis added) the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions’. Art. 25(2) goes on to state that the Commission may issue guidelines on how the first paragraph applies to three practices: ‘(a) giving more prominence to certain choices when asking the recipient of the service for a decision; (b) repeatedly requesting that the recipient of the service make a choice where that choice has already been made, especially by presenting pop-ups that interfere with the user experience; (c) making the procedure for terminating a service more difficult than subscribing to it’. A literal reading of Art. 25(2) suggests that the listed practices are not caught by the first paragraph in all circumstances. This is a wise legislative choice. Making certain courses of action more prominent is a value-neutral UI design choice. What should interest us is the end being pursued: does it benefit the consumer or the online platform? Consider cookie banners, for example: either the button to accept or the button to reject cookies can be made more prominent, yet clearly one of these options will benefit the consumer, and the other the service provider. The Commission’s guidelines can help draw dividing lines for such practices. The DSA Preamble points us in the same direction.
Recital 67, which refers directly to dark patterns, clarifies the legislator’s intent to target user interface design choices that ‘direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests’. Line-drawing is also an important exercise for the other practices listed in Art. 25(2): Nagging (subparagraph (b)) and Roach Motel/Hard to Cancel (subparagraph (c)). Nagging a consumer towards a certain choice may, for example, cause consumer detriment where that choice entails succumbing to privacy-intrusive settings; in other instances, nagging may be merely annoying. Introducing some friction into a service termination procedure (Roach Motel/Hard to Cancel) may prevent accidental termination, yet introducing a lot of friction may keep users trapped in recurring subscriptions or prevent them from deleting their accounts.

There are, however, some more problematic aspects to the DSA prohibition of dark patterns, notably its reference to manipulation as a source of consumer harm, as also noted by Martini and Drews. ‘Manipulation’ is not defined in the regulation, and is a new EU legal term. Many philosophers have reflected on the meaning of manipulation and how it manifests in digital environments, and the jury is still out on this question. The problem with regulating manipulation, and the reason why manipulation has so far not featured in legislative acts, is that, however conceived, it is a very broad term, and drawing the boundaries between immoral manipulation and acceptable persuasion is a difficult task. As Sunstein put it, ‘it [manipulation] has at least fifty shades’. Which one of these does the DSA refer to? We will likely only find out once the CJEU gets to express its views on the matter. However, it may not be too early to state that this legislative approach of embracing vague legal terms that are not given concrete meaning risks watering down the effectiveness of the DSA prohibition, much like the general fairness test under the UCPD has lost much of its bite by referring to ‘professional diligence’.

The amendments to the UCPD proposed by the IMCO Committee raise the same issues as the DSA’s conceptualization of dark patterns, and some additional ones. The proposed prohibitions touch upon two levels of the protective regime established by the UCPD: the list of per se prohibited practices in Annex I, and one of the ‘small’ general fairness clauses, Art. 6, which deals with misleading actions. Let us unpack the proposal in this order.

The proposal introduces the following points, largely inspired by the DSA, to Annex I: ‘(i) giving more prominence to certain choices when asking the recipient of an online service for a decision’ and ‘(ii) making the procedure of terminating a service significantly more burdensome than signing up to it’. The issue with the first point is that introducing it into Annex I UCPD would ban the, as seen above, value-neutral design choice of making some options more prominent, regardless of whose interests it serves. The insertion of the second point is less controversial. The UCPD has already been relied on by consumer authorities to tackle the Hard to Cancel/Roach Motel dark pattern. In July 2022, following a dialogue with the European Commission and national consumer authorities, Amazon had to adjust its Amazon Prime subscription cancellation process. While users could sign up for Amazon Prime with only a couple of clicks, they had to jump through numerous user interface design hoops in order to cancel their subscription. However, describing the prohibited practice in terms of making it ‘significantly more burdensome’ to terminate a service introduces a degree of uncertainty. How many extra clicks amount to a burdensome cancellation process: one, five, or ten? Would requiring a consumer to call or e-mail customer service in order to cancel a subscription amount to a slightly or significantly more burdensome termination process? What if termination via an app is not possible, but can easily be done via a desktop or mobile browser? Again, lines need to be drawn. The list of per se prohibitions in Annex I is meant to provide traders with legal certainty. In my opinion, the proposed amendments could do more to serve that purpose.
In Germany, the Fair Consumer Contracts Act amended the German Civil Code to require businesses that offer online subscription options to implement a two-step cancellation procedure with a prominent ‘cancellation button’ on their websites. The ‘cancellation button’ ought to be labeled with the words ‘cancel now’ or similar wording to that effect. While we may ask ourselves how this requirement aligns with the maximum harmonization character of the UCPD, which entails that Member States may not impose categorical prohibitions of B2C commercial practices, this legislative approach is commendable for acknowledging that fair user interface design is a technological matter, and that lawmakers may need to grapple with technological artifacts and their potential harms beyond imposing vague prohibitions.

Moving on to the amendments to be made to Article 6 UCPD, the IMCO Committee envisages a new paragraph in this provision, which could render the following misleading and unfair: ‘(ea) practices with the effect or likely effect of distorting or impairing the autonomy, decision-making or choice of the recipients of the service, on purpose or in effect, via the structure, design, or functionalities of an online interface or a part thereof’. The first problem with this proposed amendment is that it would likely circumscribe the scope of prohibited dark patterns to those that are misleading. Some dark patterns do indeed operate on an informational dimension, but other user interface design choices may effectively coerce consumers into making detrimental choices they did not intend to make by exploiting their cognitive biases, and therefore bear a closer resemblance to aggressive commercial practices. That is the case for Sneak into Basket, a dark pattern that entails adding optional products to a consumer’s virtual shopping cart without their consent; this practice exploits the very strong default effect - our tendency to stick to the status quo. The second problem is that including a prohibition of dark patterns under one of the general UCPD clauses could subject them to the average consumer test. The average consumer bears a close resemblance to the ‘homo economicus’ of neoclassical rational choice theory, and, as such, is not biased or susceptible to the exploitation of their cognitive biases. Including a prohibition of dark patterns as a sub-paragraph of Art. 5 UCPD, the general unfairness clause, could remedy the first problem, but not the second; addressing the second problem would likely require an overhaul of the entire consumer-empowerment-via-the-information-paradigm approach to consumer protection that permeates the EU consumer acquis. Such a change is much awaited by some scholars.
Another problem with this envisaged amendment is, yet again, the use of vague legal terminology. What is ‘autonomy’? The proposed provisions do not contain a definition. Many philosophers have reflected on this question as well, and they are yet to agree on a single definition. 

All this is to say that dark patterns may indeed be very harmful for consumers in some cases, and not in others. In a regime where evidence is deemed central to good law-making, we may want to evaluate the available evidence on consumer harms, or wait for evidence to emerge, in order to draw clear dividing lines; vague prohibitions may have the ill fate of neither serving the interests of those they are supposed to protect nor providing legal certainty to those who have to comply with them. The Commission’s ongoing Digital Fairness Fitness Check seems like a good site for gathering and evaluating evidence.



  • C. Rosca

    Currently I am a PhD researcher in Digital Legal Studies. My PhD research combines consumer law and computer science insights to explore legal and technical solutions for the interpretation and enforcement of regulation on dark patterns. I also regularly advise national and European policy makers on dark patterns, particularly in the context of the European Commission’s E-Enforcement Academy, where I provide training on digital investigations on the internal market.