As the new General Data Protection Regulation (GDPR) is implemented across Europe, users of digital services have been confronted with new privacy settings through numerous pop-up messages. Unfortunately, the Norwegian Consumer Council's just-published analysis demonstrates that companies appear to have little intention of giving users actual choices.
– These companies manipulate us into sharing information about ourselves. This shows a lack of respect for their users and circumvents the notion of giving consumers control of their personal data, says Finn Myrstad, director of digital services in the Norwegian Consumer Council.
The Norwegian Consumer Council and several other consumer and privacy groups in Europe and the US are now asking European data protection authorities and their US counterparts to investigate whether the companies are acting in accordance with the GDPR and US rules.
Sharing by default
Through the Consumer Council’s analysis of the companies’ privacy pop-ups, it becomes evident that consumers are pushed into sharing through:
- Standard Settings – Research has shown that users rarely change pre-selected settings. In many cases, both Facebook and Google have set the least privacy-friendly choice as the default (see the sketch after this list).
- Cunning design choices – Sharing of personal data and the use of targeted advertising are presented as exclusively beneficial through wording and design, often in combination with threats of lost functionality if users decline.
- Confusing layout – The privacy-friendly choices require significantly more clicks to reach and are often hidden away.
- Illusion of choice – In many cases, the services obscure the fact that users have very few actual choices, and that comprehensive data sharing is accepted just by using the service. The feeling of control may also convince users to share more information.
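To make the first of these techniques concrete, here is a minimal, hypothetical sketch of why pre-selected defaults matter so much. The setting names are invented for illustration and do not come from the report; only the underlying point does, namely that the default object effectively decides the outcome for the majority of users who click straight through.

```typescript
// Hypothetical illustration of the "sharing by default" dark pattern.
// The setting names below are invented, not taken from the report.

interface ConsentSettings {
  targetedAds: boolean;
  faceRecognition: boolean;
  dataSharing: boolean;
}

// A dark-pattern default: every data-sharing option starts enabled,
// so a user who clicks "Accept and continue" consents to everything.
const darkPatternDefaults: ConsentSettings = {
  targetedAds: true,
  faceRecognition: true,
  dataSharing: true,
};

// A privacy-by-default configuration, closer to what the GDPR's
// "data protection by default" principle envisions: nothing is
// shared until the user actively opts in.
const privacyByDefault: ConsentSettings = {
  targetedAds: false,
  faceRecognition: false,
  dataSharing: false,
};

// Since research shows users rarely change pre-selected settings,
// whoever chooses the defaults chooses the outcome for most people.
function acceptAll(defaults: ConsentSettings): ConsentSettings {
  return { ...defaults }; // clicking "Accept" simply keeps every default
}

console.log(acceptAll(darkPatternDefaults)); // everything shared
console.log(acceptAll(privacyByDefault));    // nothing shared
```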
– Data protection law requires that companies make it easier for users to make clear and informed choices, and that they let users take control of their own personal data. Unfortunately, this is not the case, which is at odds with the expectations of consumers and the intention of the new Regulation, says Finn Myrstad.
Comments from BEUC and Privacy International
Monique Goyens, Director General of The European Consumer Organisation (BEUC):
– Companies have to respect the letter and the spirit of the GDPR. This report demonstrates that many global digital household names still have a long way to go to do just that. European consumer organisations will continue to be vigilant, expose misconduct and work together with regulators to improve the system.
Ailidh Callander, Legal Officer at Privacy International:
– We welcome this analysis by the Norwegian Consumer Council. As GDPR is implemented by companies, it is important to test how companies are making changes to their products and services to ensure that users’ privacy is protected. We call on regulators to further investigate the dark patterns that the NCC’s analysis suggests companies are deploying.
In going through a set of privacy pop-ups put out in May by Facebook, Google, and Microsoft, the researchers found that the first two especially feature “dark patterns, techniques and features of interface design meant to manipulate users… used to nudge users towards privacy intrusive options.”
It’s not big, obvious things. In fact, that’s the point of these “dark patterns”: they are small and subtle yet effective ways of guiding people toward the outcome preferred by the designers.
For instance, in Facebook and Google’s privacy settings process, the more private options are simply disabled by default, and users not paying close attention will not know that there was a choice to begin with. You’re always opting out of things, not in. To enable these options is also a considerably longer process: 13 clicks or taps versus 4 in Facebook’s case.
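As a rough illustration of that asymmetry, the two flows might be modeled like this. The screen names are invented for the example; only the click counts (4 to accept, 13 to opt out, in Facebook's case) come from the report.

```typescript
// Hypothetical model of the click-count asymmetry described above.
// Screen names are invented; only the 4-vs-13 counts are reported.

type Screen = string;

// Accepting the defaults is a short, frictionless path.
const acceptPath: Screen[] = [
  "privacy-popup", "accept-and-continue", "confirm", "done",
];

// Reaching the privacy-friendly settings takes more than three
// times as many steps, each one an opportunity to give up.
const declinePath: Screen[] = [
  "privacy-popup", "manage-settings", "ads-settings",
  "ads-based-on-partners", "confirm-ads-choice",
  "face-recognition", "confirm-face-choice",
  "profile-info", "confirm-profile-choice",
  "review", "confirm-review", "final-confirm", "done",
];

console.log(acceptPath.length, declinePath.length); // 4 13
```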
That’s especially troubling when the companies are also forcing this action to take place at a time of their choosing, not yours. And Facebook added a cherry on top, almost literally, with the fake red dots that appeared behind the privacy popup, suggesting users had messages and notifications waiting for them even if that wasn’t the case.
When choosing the privacy-enhancing option, such as disabling face recognition, users are presented with a tailored set of consequences: “we won’t be able to use this technology if a stranger uses your photo to impersonate you,” for instance, to scare the user into enabling it. But nothing is said about what you will be opting into, such as how your likeness could be used in ad targeting or automatically matched to photos taken by others.
Disabling ad targeting on Google, meanwhile, warns you that you will not be able to mute some ads going forward. People who don’t understand the mechanism of muting being referred to here will be scared of the possibility — what if an ad pops up at work or during a show and I can’t mute it? So they agree to share their data.
In this way users are punished for choosing privacy over sharing, and are always presented only with a carefully curated set of pros and cons intended to cue the user to decide in favor of sharing. “You’re in control,” the user is constantly told, though those controls are deliberately designed to undermine what control you do have and exert.
Microsoft, while guilty of some of the same biased phrasing, received much better marks in the report. Its privacy setup process put the less and more private options right next to each other, presenting them as equally valid choices rather than as some tedious configuration tool that might break something if you’re not careful. Subtle cues do push users towards sharing more data or enabling voice recognition, but users aren’t punished or deceived the way they are elsewhere.
You may already have been aware of some of these tactics, as I was, but it makes for interesting reading nevertheless. We tend to discount these things when it’s just one screen here or there, but seeing them all together along with a calm explanation of why they are the way they are makes it rather obvious that there’s something insidious at play here.