The default settings on our devices wield significant influence over our data privacy. From the staggering sums tech giants spend to secure default positions to the insidious tactics of dark patterns, this piece examines the complexities of consumer choice and manipulation, and the urgent need for a paradigm shift in data privacy practices.
Princeton programmer-turned-sociologist Zeynep Tufekci had an important piece in the New York Times last year highlighting the power of the default settings that come pre-installed on our phones, apps, and the countless other digital tools upon which we now depend.
As Tufekci wrote, these default settings carry enormous stakes. The sums companies pay to control them show the real value of users’ data, and the futility of relying on consumers themselves to make smart choices about how their data is used.
“If it were all as simple as people changing their settings, Google wouldn’t be forking over a sum larger than the G.D.P. of entire countries to have Apple users start with one setting rather than another,” Tufekci writes. “The default way the technology industry does business needs to change now.”
I couldn’t agree more. Anyone who works in tech knows that while consumers do care about their data and their privacy, it’s an open question as to how much effort they’ll actually put into enforcing the rights they say matter to them.
This has been digital marketers’ dirty secret since the industry began.
Cookies, after all, depend on people’s passivity, and their readiness to click away their rights in order to clear a “consent” message and access the content they’re actually looking for.
Privacy policies are much the same: everyone knows that consumers don’t actually read them — and let’s face it, they aren’t designed to be read by anyone who doesn’t have a JD. From there it’s a short step to full-blown dark patterns and UX that actively manipulates users into giving up their data rights.
A dark pattern is a deceptive, manipulative design technique a company uses to get users to do something on its website or app that they would not otherwise do, tricking them into actions that don’t align with their privacy preferences.
A few examples could include pre-ticked consent boxes, guilt-laden “confirmshaming” prompts, opt-out controls buried several menus deep, and cancellation flows that take far more steps than signing up did.
When I bought a new iPhone recently, I was able to complete the transaction in just a few clicks. Not long afterward, I started getting new “pre-screened” credit card offers based on the transaction I’d made, and the small print said I’d need to personally call or write to the consumer credit bureaus to avoid such offers in the future. That asymmetry of effort, requiring so much of me to assert my rights, is a perfect example of a dark pattern in action.
The legality of dark patterns varies globally. Some practices may breach consumer protection laws, but enforcement and interpretation differ across jurisdictions. Certain regimes, such as the European Union’s GDPR (General Data Protection Regulation), include provisions against deceptive design practices that infringe on users’ rights to privacy and informed consent, yet not every dark pattern is explicitly illegal. Even so, ethical considerations and consumer backlash are increasingly pressuring companies to adopt more transparent, user-friendly design practices.
Regulators have been pushing back against dark patterns, but as Tufekci points out, it isn’t enough to stop marketers from putting their thumbs on the scales. As long as we allow privacy settings to be turned off by default — or hidden behind “opt out” options that consumers are never meant to actually use — we’ll continue to see consumers’ data being used in ways to which they would never have explicitly consented.
The entrenched use of dark patterns, and subtler forms of consent manipulation like default settings, is bad enough in a world of web tracking and targeted digital advertising. But it’s about to get a whole lot worse as we move into a world of ubiquitous AI and immersive technologies: gaze and pupil-dilation tracking, body-language cameras, neural interfaces, and more.
The reality is that we’re using data in transformative new ways, and we need new ways to control and manage data effectively. That starts with building out programmatic data control and data permissioning systems that allow consumers to express their preferences clearly and then have them reflected fluidly across the entire data ecosystem.
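To make that concrete, here is a minimal sketch of what a machine-readable data-permission record could look like. This is an illustration only: the names ConsentRecord, Purpose, and is_permitted are hypothetical rather than any existing standard or API, but the key design choice is that every non-essential use of data defaults to denied until the consumer explicitly grants it.

```python
# Hypothetical sketch of a programmatic data-permissioning record.
# ConsentRecord, Purpose, and is_permitted are illustrative names,
# not an existing standard or library API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class Purpose(Enum):
    ESSENTIAL = "essential"          # needed to deliver the service itself
    ANALYTICS = "analytics"          # product analytics
    TARGETED_ADS = "targeted_ads"    # cross-site ad targeting
    BIOMETRIC = "biometric"          # gaze, pupil, body-language data


@dataclass
class ConsentRecord:
    """A consumer's machine-readable privacy preferences.

    Everything defaults to denied; the consumer opts in, not out.
    """
    user_id: str
    granted: set[Purpose] = field(default_factory=set)  # empty set = nothing granted
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_permitted(self, purpose: Purpose) -> bool:
        # Essential processing is allowed; anything else needs an explicit grant.
        return purpose is Purpose.ESSENTIAL or purpose in self.granted


# Downstream systems consult the record before touching the data.
record = ConsentRecord(user_id="u-123", granted={Purpose.ANALYTICS})
print(record.is_permitted(Purpose.ANALYTICS))     # True: explicitly granted
print(record.is_permitted(Purpose.TARGETED_ADS))  # False: denied by default
```

A real permissioning system would also need to propagate these grants to every downstream processor and keep an auditable history of changes, which is exactly where today’s opt-out regimes fall short.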
The goal can’t simply be to make privacy technically attainable for consumers who care enough to dig through endless lists of preferences and settings. To prepare for a world powered by AI and immersive tech, we need to empower consumers — and that starts with a new approach to data privacy. Using data responsibly and making new technologies safe, sustainable, and consumer-friendly needs to start with making strong, effective privacy protection the default setting.