Hidden privacy settings, pre-set cookie banners, needlessly complicated account-deletion processes: the ease with which online providers can manipulate their users through design tricks has long been a thorn in the side of data protection and consumer protection groups. The European data protection authorities have now published guidelines intended to help identify and prevent such unethical design, using social networks as an example.
Such manipulative patterns in the design of user interfaces and decision-making processes on the internet are called “dark patterns” and are used to steer users toward a specific choice. The designers of online environments exploit the fact that many people on the internet exhibit similar patterns of behavior. In the rush of everyday life, for example, they often choose the option that requires the fewest clicks, or let certain design tricks deter them from choosing a particular option. Because providers can draw on behavioral research and observe the behavior of many thousands of people every day, there is a clear imbalance of power.
Only in mid-March did Business Insider reporters reveal that Amazon had deliberately used design tricks to make cancelling its Amazon Prime subscription service more difficult. Apparently with success: “Project Iliad” is said to have reduced the cancellation rate by 14 percent. Businesses also rely on dark patterns to fend off data protection efforts. In 2021, for example, the Austrian NGO NOYB filed hundreds of complaints about cookie banners that, through their color scheme or other tricks, try to tempt people into simply accepting all surveillance measures.
Typology of design tricks
In the 60-page guidelines, the European data protection authorities make clear that the phenomenon is by no means limited to cookie banners. Following the life cycle of a social media account – from creation to deletion – they illustrate how varied the attempts to influence users on the internet are.
They also show that many of these patterns are incompatible with the General Data Protection Regulation. According to the paper, dark patterns can violate quite different articles of the GDPR depending on their characteristics – for example, the principles of fairness and transparency, or the requirements of data protection by design and by privacy-friendly default settings. In addition, consent may be invalid because providers violate the requirements that consent be voluntary and informed when they use design tricks to surreptitiously obtain it.
To make dark patterns easier to recognize and understand, the data protection authorities also provide a useful typology. They divide the design tricks into six categories:
- “Overloading”: With this method, users are confronted with so many pieces of information, requests, or choices that they are deliberately overwhelmed. The “privacy maze” trick, for example, falls into this category: privacy settings are so convoluted and hidden that adjusting them as desired becomes extremely time-consuming.
- “Skipping”: Here, selection menus and processes are designed in such a complicated way that users lose track of relevant aspects or of their original intent and therefore skip over them. One example is a trick the authorities call “deceptive snugness”, where pre-selected defaults are meant to nudge users toward the less privacy-friendly option.
- “Stirring”: Attempts to entice users with verbal or visual stimuli into performing certain actions fall into this category. For example, when colors or other graphic design choices highlight certain options and obscure others, or when emotional appeals make users feel guilty for choosing stronger privacy settings. Or the reverse: when users are tempted to permanently share their location by a text that presents doing so as a particularly social practice that can make the world a better place.
- “Hindering”: This covers tricks that providers use to actively prevent users from exercising their data protection rights – for example, through factually incorrect information, dead links, or processes that take longer than they should. One example: when creating a social media account, users who want to withhold certain data are unsettled and held up by the repeated question “Are you sure?”.
- “Fickle”: This refers to methods with a deliberately inconsistent design. It includes, for example, an inadequate information hierarchy, where users encounter seemingly identical or similar information and options in different places, making it impossible for them to understand exactly how their data is being processed. Another example is privacy decisions presented to users without adequate explanation and contextual information.
- “Left in the dark”: This category covers information and setting options that are deliberately obscured, e.g. by using another language or vague and ambiguous terms instead of clear descriptions. A concrete example from practice (which the data protection authorities do not mention, however) is WhatsApp’s terms and conditions and privacy policy, which were not available in German until consumer advocacy centers filed a complaint a few years ago. And as the recently changed general terms and conditions show, even the German version still leaves it far from clear what they actually mean.
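To make the categories above a little more concrete, here is a purely illustrative sketch that is not part of the guidelines: a toy model of a cookie banner whose fields loosely map onto some of the tricks described (a missing reject button, a visually privileged accept button, pre-ticked purposes, buried settings). All names, fields, and thresholds are the author's own assumptions, not an official tool or checklist.

```python
# Purely illustrative sketch: a toy cookie-banner model with heuristics
# loosely inspired by the dark-pattern categories described above.
# All field names and thresholds are assumptions, not an official tool.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Banner:
    accept_label: str
    reject_label: Optional[str]     # None: no direct reject button
    accept_prominent: bool          # accept button visually highlighted
    prechecked_purposes: List[str]  # processing purposes already ticked
    settings_click_depth: int       # clicks needed to reach the settings


def dark_pattern_signals(b: Banner) -> List[str]:
    """Return human-readable warnings for suspicious banner design."""
    signals = []
    if b.reject_label is None:
        signals.append("hindering: no direct reject option")
    if b.accept_prominent:
        signals.append("stirring: accept button visually privileged")
    if b.prechecked_purposes:
        signals.append("deceptive snugness: pre-ticked purposes "
                       + ", ".join(b.prechecked_purposes))
    if b.settings_click_depth > 1:
        signals.append("overloading: settings buried "
                       f"{b.settings_click_depth} clicks deep")
    return signals


if __name__ == "__main__":
    banner = Banner("Accept all", None, True, ["ads", "analytics"], 3)
    for signal in dark_pattern_signals(banner):
        print("-", signal)
```

A privacy-friendly banner, by contrast – one with a reject button as prominent as the accept button, no pre-ticked boxes, and settings reachable in a single click – would produce no warnings at all in this sketch.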
Feedback wanted
One particularly ugly design trick is so far missing from the data protection authorities’ list: it is not uncommon for users to be confronted with specific decisions at moments when their usage behavior suggests they are tired or emotionally stressed – and therefore in no condition to make considered decisions.
Anyone who spots dark patterns missing from the guidelines, or has other input on the subject, can give the data protection authorities feedback via their website until the beginning of May.