User data privacy decisions can be easily manipulated
Data privacy is an important topic in the digitalized economy. Recent policy changes have aimed to strengthen users’ control over their own data. Yet new research from Copenhagen Business School finds that designers of cookie banners can affect users’ privacy choices by manipulating the choice architecture, and that simple design changes can increase absolute consent by 17%.
A website cookie banner is the consent management tool that allows users to consent to the processing of their personal data. Under the current legal framework, users must actively provide consent.
Manipulations of the banner design can therefore affect both whether users make an active choice at all and what the outcome of that choice is: accepting or declining consent. The findings provide empirical evidence that people’s data privacy decisions can be easily manipulated.
“Choice architecture should be designed to help users make more informed decisions, which are essential for free markets to work efficiently. Exploiting psychological mechanisms in design to manipulate users to the benefit of the website owner is problematic,” says Associate Professor Jan Michael Bauer from the Department of Management, Society and Communication, Copenhagen Business School.
“Detailed user data has become valuable because it allows companies to better understand customer behaviour and improve the targeting of advertisements. Users and customers deserve, and should demand, a choice environment that serves their own needs rather than one that benefits the website owner,” adds Bauer.
The research highlights that the ability of website owners to manipulate the outcome of user privacy decisions is at odds with the ideals of the ePrivacy Directive and GDPR.
Privacy manipulation
When the researchers started the project in 2019, there was very little academic research on how cookie banner design elements affect acceptance rates, and few guides or rules were available beyond a court ruling on the use of pre-ticked boxes in cookie banners.
The empirical evidence supporting the study’s conclusions was gathered through an experiment that tested different banner designs on a public website. The researchers analysed how these manipulations affected 1,493 user interactions with the cookie banner and the resulting privacy choice, i.e., whether consent was given or declined.
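To make this kind of comparison concrete, the following is a minimal, hypothetical sketch in Python of how consent outcomes per banner variant could be tabulated from an interaction log and expressed as absolute consent rates. The variant names, outcome labels and records here are illustrative assumptions, not the study’s actual data or analysis code.

```python
# Illustrative sketch only: tabulate consent outcomes per cookie-banner
# variant from an interaction log and report absolute consent rates.
from collections import Counter

# Each record: (banner_variant, outcome), where outcome is one of
# "accept", "decline", or "no_choice" (user ignored or dismissed the banner).
# These records are hypothetical placeholders.
interactions = [
    ("neutral", "accept"), ("neutral", "no_choice"), ("neutral", "decline"),
    ("highlight_accept", "accept"), ("highlight_accept", "accept"),
    ("highlight_accept", "no_choice"),
]

counts = Counter(interactions)
variants = sorted({variant for variant, _ in interactions})

for variant in variants:
    total = sum(n for (v, _), n in counts.items() if v == variant)
    accepted = counts[(variant, "accept")]
    # "Absolute consent": share of ALL banner exposures that ended in consent,
    # including users who made no active choice at all.
    print(f"{variant}: {accepted}/{total} = {accepted / total:.1%} absolute consent")
```

Comparing the rates of two variants in percentage points is then a simple subtraction; any statistical testing of such differences is beyond this sketch.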
While several official guides on banner design have been published since the experiment was conducted, the researchers argue that website owners remain in a privileged position.
“If they used their expertise and design skills to elicit their users’ privacy preferences in a neutral way, we would potentially welcome this and not have a problem. Nudging users to make a privacy choice is potentially a good thing; manipulating them into providing consent is not and should be opposed,” states Bauer.
Protecting user data privacy
From the outset, the researchers wanted to raise awareness and prompt action by policy makers, and they acknowledge that the problems of manipulative choice architecture in the digital space – also called dark patterns – remain important topics for debate. They introduce a conceptual distinction between choice-making architecture and choice outcome architecture that may help structure that debate.
“We see this distinction between a choice-making architecture and a choice outcome architecture as a helpful deconstruction of the privacy decision when it comes to protecting user data,” says Jan Michael Bauer.
The choice-making architecture captures all elements of the choice environment that might deter people from making a decision or encourage them to make one – e.g., the complexity of the choice or the effort required. The researchers argue there are many cases in which it might be beneficial to nudge people to decide without affecting the outcome (e.g., organ donation and elections). Encouraging choice-making is, however, not the same as nudging people towards one particular choice outcome.
“In some cases, we might be more confident that selecting one specific option is likely to make users better off and therefore target the choice outcome itself (e.g., cigarettes or unhealthy foods). However, interventions that favour a specific outcome are susceptible to manipulation and warrant more scrutiny,” says Bauer.
Learning about dark patterns
Until regulators catch up with the digital world, the researchers conclude, it will be up to consumers to detect, avoid and resist manipulative choice architecture. “One way forward for users and consumers could be to learn about the broader issues surrounding dark patterns and the tricks used on websites and in apps, and hopefully become less responsive to these manipulations. Even though these manipulations are often subtle, they should be called out,” adds Bauer.
“One helpful approach can be to treat aggressive prompts and design elements that favour a specific choice outcome as a warning sign to pause and reflect: do I really want to share my data? This issue is not limited to data privacy, as many websites and online shops gear up with dark patterns in the fight for user attention and to increase sales,” concludes Bauer.