You Are Probably Doing Privacy UX Wrong

Luiza Jarovsky
Jan 19, 2023
Photo by Daniel Korpai

Dark patterns in privacy are a trending topic. There are various academic papers (including mine) addressing them, the EU Digital Services Act (DSA) and the California Privacy Rights Act (CPRA) explicitly prohibit dark patterns, and fines punishing dark patterns continue to accumulate on both sides of the Atlantic. Nevertheless, there is still uncertainty regarding where the line between dark patterns and other design practices lies. In today’s newsletter, I discuss the latest guidelines issued by the European Data Protection Board for cookie banners and my proposed Privacy-Enhancing Design framework, showing that there is still a long way to go for the law to meet Privacy UX practices on the ground.

Despite recent fines, such as the record-breaking penalty the FTC imposed on Epic Games (the developer of Fortnite) or the smaller but conceptually meaningful penalty that the French Data Protection Authority (CNIL) issued to Discord, the line between what exactly will be considered a dark pattern in privacy (and punished as such) and what will not remains blurred.

Yesterday, the European Data Protection Board (EDPB) issued a “Draft Report of the work undertaken by the Cookie Banner Taskforce,” clarifying various types of dark patterns in the specific context of cookie banners. Before we review them, it is important to keep in mind the EDPB’s disclaimer regarding the practical meaning of its interpretation.

The report presents various types of cookie banner dark patterns, which will help the community understand what they look like in practice. It would have been even more helpful, especially to non-legal professionals, if the EDPB had incorporated visual representations of the dark patterns in its draft report. Let’s go through the main ones:

Type A: “No reject button on the first layer”

“When authorities were asked whether they would consider that a banner which does not provide for accept and refuse/reject/not consent options on any layer with a consent button is an infringement of the ePrivacy Directive, a vast majority of authorities considered that the absence of refuse/reject/not consent options on any layer with a consent button of the cookie consent banner is not in line with the requirements for a valid consent and thus constitutes an infringement.”
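To make this concrete, here is a minimal sketch in browser-side TypeScript of a first layer that offers rejection as prominently as acceptance. The labels are invented and the loadAnalytics() call is a hypothetical loader, not part of any real library:

```typescript
// Minimal first-layer cookie banner sketch.
// The compliance point illustrated: "Reject all" sits next to "Accept all"
// on the very first layer, with no extra clicks required to refuse.

type ConsentDecision = "accepted" | "rejected";

function renderFirstLayerBanner(onDecision: (d: ConsentDecision) => void): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.setAttribute("aria-label", "Cookie consent");

  const message = document.createElement("p");
  message.textContent =
    "We use cookies for analytics and advertising. You can accept or reject them.";

  const accept = document.createElement("button");
  accept.textContent = "Accept all";

  const reject = document.createElement("button");
  reject.textContent = "Reject all"; // present on the FIRST layer, not hidden behind "Settings"

  const pairs: Array<[HTMLButtonElement, ConsentDecision]> = [
    [accept, "accepted"],
    [reject, "rejected"],
  ];
  for (const [button, decision] of pairs) {
    button.addEventListener("click", () => {
      banner.remove();
      onDecision(decision);
    });
  }

  banner.append(message, accept, reject);
  document.body.append(banner);
}

// Usage: non-essential scripts run only after an explicit opt-in.
renderFirstLayerBanner((decision) => {
  if (decision === "accepted") {
    // loadAnalytics(); // hypothetical loader, called only on opt-in
  }
});
```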

Type B: “Pre-ticked boxes”

“The taskforce members confirmed that pre-ticked boxes to opt-in do not lead to valid consent as referred to either in the ePrivacy Directive or in the GDPR.”
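For illustration, here is a small sketch of a second layer whose purpose checkboxes default to unchecked. The purpose names are hypothetical; the point is that only purposes the user actively ticks count as consented:

```typescript
// Sketch: purpose checkboxes default to unchecked, so consent is only
// recorded for purposes the user deliberately opts into.

const PURPOSES = ["analytics", "advertising", "personalisation"] as const;
type Purpose = (typeof PURPOSES)[number];

function renderPurposeList(container: HTMLElement): Map<Purpose, HTMLInputElement> {
  const boxes = new Map<Purpose, HTMLInputElement>();
  for (const purpose of PURPOSES) {
    const label = document.createElement("label");
    const box = document.createElement("input");
    box.type = "checkbox";
    box.checked = false; // never pre-ticked: opting in requires a positive action
    label.append(box, ` ${purpose}`);
    container.append(label);
    boxes.set(purpose, box);
  }
  return boxes;
}

function collectConsent(boxes: Map<Purpose, HTMLInputElement>): Purpose[] {
  // Only purposes the user ticked themselves are recorded as consented.
  return [...boxes.entries()]
    .filter(([, box]) => box.checked)
    .map(([purpose]) => purpose);
}
```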

Type C: “Deceptive link design”

“In order for a valid consent to be freely given, the taskforce members agreed that in any case a website owner must not design cookie banners in a way that gives users the impression that they have to give a consent to access the website content, nor that clearly pushes the user to give consent”

Type D: “Deceptive button colors” & type E: “Deceptive button contrast”

“It appears that the configuration of some cookie banners in terms of colours and contrasts of the buttons (…) could lead to a clear highlight of the ‘accept all’ button over the available options.”
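One straightforward way to avoid this pattern is to style the options identically. The sketch below applies the same visual weight to every consent button; the specific colors and values are placeholders, not recommendations:

```typescript
// Sketch: "Accept all" and "Reject all" get identical styling, so neither
// option is visually privileged over the other.

const NEUTRAL_BUTTON_STYLE: Partial<CSSStyleDeclaration> = {
  backgroundColor: "#1a5fb4", // same color for both buttons
  color: "#ffffff",
  border: "none",
  padding: "0.5rem 1rem",
  fontSize: "1rem", // same size and contrast for both buttons
};

function styleEqually(...buttons: HTMLButtonElement[]): void {
  for (const button of buttons) {
    Object.assign(button.style, NEUTRAL_BUTTON_STYLE);
  }
}
```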

Type H: “Legitimate interest claimed”

“(…). The controller relied on legitimate interests under article 6(1)(f) GDPR for different processing activities as, for example, ‘Create a personalised content profile’ or ‘Select personalised ads’ whereas it could be considered that no overriding legitimate interest would exist for such processing activities.”

Type I: “Inaccurately classified essential”

“It appears that some controllers classify as ‘essential’ or ‘strictly necessary’ cookies and processing operations which use personal data and serve purposes which would not be considered as ‘strictly necessary’ within the meaning of Article 5(3) ePrivacy Directive or the ordinary meaning of ‘strictly necessary’ or ‘essential’ under the GDPR.”
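One way to keep this classification honest is to make it explicit in code. Below is a sketch of a hypothetical cookie registry in which only genuinely essential cookies bypass consent; the cookie names and categories are invented for illustration:

```typescript
// Sketch: an explicit registry forces an honest "essential vs. consent-based"
// classification, instead of labelling everything "strictly necessary".

type CookieCategory = "strictly-necessary" | "analytics" | "advertising";

interface CookieSpec {
  name: string;
  category: CookieCategory;
}

const COOKIE_REGISTRY: CookieSpec[] = [
  { name: "session_id", category: "strictly-necessary" }, // login session: genuinely essential
  { name: "page_metrics", category: "analytics" },        // NOT essential, requires consent
  { name: "ad_profile", category: "advertising" },        // NOT essential, requires consent
];

function allowedCookies(consented: CookieCategory[]): CookieSpec[] {
  // Essential cookies are always allowed; everything else needs explicit consent.
  return COOKIE_REGISTRY.filter(
    (c) => c.category === "strictly-necessary" || consented.includes(c.category)
  );
}
```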

Type K: “No withdraw icon”

“It appears that where controllers provide an option allowing to withdraw consent, different forms of options are displayed. In particular, some controllers have not chosen to use the possibility to show a small hovering and permanently visible icon on all pages of the website that allows data subjects to return to their privacy settings, where they can withdraw their consent.”
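A possible implementation of such an icon, sketched in the same browser-side TypeScript as above. The openPrivacySettings callback is a hypothetical hook into the site’s consent settings:

```typescript
// Sketch: a small, permanently visible icon on every page that reopens the
// consent settings, so withdrawing consent is as easy as giving it.

function mountWithdrawIcon(openPrivacySettings: () => void): void {
  const icon = document.createElement("button");
  icon.textContent = "🍪"; // placeholder glyph; a real site would use a proper icon
  icon.setAttribute("aria-label", "Cookie settings");
  Object.assign(icon.style, {
    position: "fixed", // stays visible while the user scrolls
    bottom: "1rem",
    left: "1rem",
  });
  icon.addEventListener("click", openPrivacySettings);
  document.body.append(icon);
}
```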

As someone who researches dark patterns, I find it very helpful that the EDPB has brought more clarity to the matter. However, we still need much more detailed official guidelines and best-practice recommendations on how to do Privacy UX properly.

In this newsletter, a few months ago, I proposed a framework called Privacy-Enhancing Design, with the goal of drawing attention to this important interaction between the fields of privacy and UX design.

The central idea behind Privacy-Enhancing Design is that users are vulnerable, manipulable, and easily influenced by cognitive biases. UX designers can maliciously exploit cognitive biases through deceptive design (i.e., dark patterns), negatively affecting user privacy.

Privacy-Enhancing Design proposes that UX designers must acknowledge the existence of cognitive biases and human errors and create interfaces that respect user autonomy and prioritize choices that preserve user privacy.

A privacy-enhancing UX design practice is a UX practice that acknowledges cognitive biases and human errors, respects user autonomy and prioritizes choices that preserve user privacy

To implement Privacy-Enhancing Design correctly, UX designers and product managers must have some understanding of privacy and data protection law. In my view, the subject should be taught in design, computer science, marketing, and business schools (and also in law schools as part of data protection law).

The seven principles (or heuristics, as UX designers prefer) of Privacy-Enhancing Design are:

  1. Autonomy and Human Dignity are Central. User autonomy and human dignity are fundamental rights and must be respected throughout the UX design, which must allow users to exercise their choices and preferences freely, autonomously, and in an informed way.
  2. Transparency. UX design practices should foster transparency and accessibility so that users are aware of ongoing data transactions. Every new data transaction (collection, processing, or use) should be clearly signaled in an accessible way, so that users realize their personal data is being collected, processed, and used. Symbols, colors, and other design features can be used to convey this information.
  3. No Previous Data Protection Knowledge. UX design should presuppose that users have no prior knowledge of data protection. Interfaces that involve data collection, processing, and use should be clear and accessible, with simple and user-friendly indications of the scope and extent of the data transaction, including possible risks (even if they seem obvious to the designer).
  4. Acknowledgment of Cognitive Biases. Cognitive biases must be broadly recognized and acknowledged. The exploitation of cognitive biases to collect more — or more sensitive — personal data (i.e., through dark patterns in privacy) must be stopped throughout the UX design process. Users should be seen as vulnerable and manipulable, and it is the organization’s responsibility to shield them from manipulation.
  5. The Burden on Organizations. Organizations should be responsible for designing UX interfaces that do not exploit users’ cognitive biases. Organizations should be able to prove, at any time, that their UX design practices are privacy-enhancing (and not privacy-harming). If users are making errors, it is the organization’s responsibility to detect and correct the design practice that is fostering those errors.
  6. Design Accountability. Organizations should be held accountable for their design practices. They should publicly document their privacy-design practices (perhaps through a Privacy Design Policy, similar to a Privacy Policy but focused on UX design practices), and it should be possible to legally challenge an organization’s UX design practices.
  7. Holistic Implementation. The principles above should be implemented throughout the UX design and be present in every interaction between users and organizations (i.e., not restricted to privacy settings). Privacy and data protection should be an integral part of the interaction between the organization and the user.

Below is a non-exhaustive list of practices that can be considered aligned with Privacy-Enhancing Design:

  • any default setting that favors zero data sharing;
  • building default settings that favor the most privacy-protective option (see the sketch after this list);
  • using colors, fonts, sizes, or contrasts to prioritize the most “privacy-fostering” option in a menu;
  • building an interface that does not force or pressure users to constantly share more data;
  • transmitting any privacy-related information in a concise, usable, user-friendly, and user-centered manner;
  • communicating a product or service’s privacy features (and possible risks) in a proactive and straightforward way;
  • not using pressuring language or terminology to induce users to share more, or more sensitive, data;
  • making it easier for users to choose a privacy-protective option;
  • making the privacy-protective option faster or more prominent;
  • offering prompt help (e.g., online chat, 24/7 customer service, email with quick answers by a human) to support users in navigating privacy settings and choices;
  • doing user experience research to check, in practice, whether users understand and can navigate the available privacy options and settings properly;
  • constantly conducting user research to check for privacy weaknesses in the UX design or additional privacy risks that users might be experiencing.
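To illustrate the first two items, here is a sketch of what “privacy-protective by default” can mean in code. The settings object and option names are hypothetical:

```typescript
// Sketch: encoding "most privacy-protective by default" as the literal
// default settings object of a hypothetical product.

interface PrivacySettings {
  shareUsageData: boolean;
  personalisedAds: boolean;
  locationTracking: boolean;
  publicProfile: boolean;
}

// Every data-sharing option starts OFF; users opt in, never opt out.
const DEFAULT_SETTINGS: PrivacySettings = {
  shareUsageData: false,
  personalisedAds: false,
  locationTracking: false,
  publicProfile: false,
};

function settingsFor(userChoices: Partial<PrivacySettings>): PrivacySettings {
  // User choices can only ever *add* sharing on top of the protective baseline.
  return { ...DEFAULT_SETTINGS, ...userChoices };
}
```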

What are your views on the topic? Do you think that Privacy-Enhancing Design could be an effective framework to help curb dark patterns and make tech professionals aware of their privacy impact? What are the next steps to improve Privacy UX?

💡 I would love to hear your opinion. I am sharing this article on Twitter and on LinkedIn; you are welcome to join the discussion there.

-

📌 Privacy & Data Protection Careers

If you are looking for a job in the fields of privacy and data protection, explore the links below and use the platforms’ filters to personalize results. Wishing you the best of luck!

LinkedIn Job Search:

Indeed Job Search:

-

📅 Upcoming Privacy Events & Conferences

IAPP Global Privacy Summit — April 4th-5th, 2023 — in Washington, DC, United States.

Computers, Privacy and Data Protection (CPDP) — May 24th-26th, 2023 — in Brussels, Belgium.

Privacy Law Scholars Conference (PLSC) — June 1st-2nd, 2023 — in Boulder, CO, United States.

Annual Privacy Forum — June 1st-2nd, 2023 — in Lyon, France.

*To submit your privacy and data protection event, get in touch.

-

📢 Privacy Solutions for Businesses (sponsored)

There are various companies offering privacy solutions for businesses. In today’s newsletter, the featured privacy partner is Simple Analytics, an EU-based analytics service. To learn more about their privacy features and get your first month for free, use my referral link.

✅ Before you go:

See you next week. All the best, Luiza Jarovsky
