Modular manipulation: Towards radical user choice

Michael Linares
8 min read · Apr 10, 2021


Ways to defuse weapons of mass persuasion

Opt in, opt out, turn yourself about

Software is manipulative and everyone knows it. Companies use various techniques to manipulate users — techniques like gamification, social pressure, ranked content optimized for engagement, etc. — and set these as defaults. Users are given some semblance of control in the form of service agreements and preferences, masking what is essentially an asymmetric power dynamic. In other words, companies create an irresistible product and users make a binary choice: they submit to it or they don’t. What if a user could radically opt in or out of manipulative tactics? And how could products support this kind of modularized UX/UI?

In a world where consumers are becoming smarter on ethical issues (see: the recent WhatsApp exodus to Signal over privacy concerns) software needs to become radically transparent.

This is an admittedly speculative post. My hope is that this exercise in imagination can help us start to build products that are more transparent, ethical, and honest. Today we’ll look at:

  • The sad state of user choice
  • The addictive additives in software (e.g., gamification, social pressures)
  • The current tools we have to exercise choice (e.g. email prefs, privacy prefs)
  • Five imagined case studies — dreams of a Duolingo without gamification, a Peloton w/out leaderboards, etc.
  • And the business case for user choice

The farce of user choice
Apple recently rolled out privacy labels in its App Store, to the chagrin of startups everywhere. The privacy labels are akin to nutrition labels — they tell you, the consumer, what a company does with your data. It’s a neat idea, and a step in the right direction. But the labels provide limited value in practice. For one, they’re long, hard to parse, and inconsistent. Those issues can be ironed out in time, and I hope they are. More importantly, the labels assume a level of user choice that doesn’t really exist.

Uber & Lyft’s privacy labels: An app’s data practices by any other name…

Users have little choice between tech products, especially in the more competitive categories like mapping or social networking. Tech CEOs will tell you — during antitrust hearings especially — that switching costs are low. That if users don’t like Google Maps, they can use Apple Maps, or vice versa. Sure, that’s technically true. But what if you fundamentally disagree with how both companies treat your data? (The recent WhatsApp-Signal kerfuffle is a relevant counterexample; I’d argue that the exodus was driven by mistrust and misinformation rather than privacy labels.)

I dream of a world where users can opt in to a series of agreements and experiences — a future where users are empowered to tailor an app’s every aspect.

The nicotine in apps
Before we go on, it’s worth specifying what I mean when I say “manipulative tactics.” Here’s a short list of levers that companies use to get what they want out of you, the user:

  • Gamification: Badges, levels, leaderboards, rewards, animations. Gamification is used to motivate users and to keep them engaged.
  • Social pressure: Referrals and invitations, social shares, your friends’ baby photos.
  • Notifications and nudges: Emails, push notifications, inboxes. All designed to bring you back to the app.
  • Engagement-optimized content: Facebook’s News Feed, Twitter’s Timeline, IG’s Explore page. These apps display content they predict will get a rise out of you and keep you online.
  • Personalized features based on your data: Email ads, display ads, tailored content, social recommendations. Companies use your own data to tailor your experience. Some of these things — like your email or private conversations — you might not want as fodder for targeted ads.
Behold: A PM’s weapons of mass persuasion

In the beginning, God glossed over user choice
A VC would say we’re in ‘early innings’ with regard to user choice. Consumers have some tools at their disposal (covered below), and they’re bound to gain more as antitrust sentiment against Big Tech grows around the globe. Apple, for example, is sprinting to differentiate itself from the other FANGs on privacy — it’s this strategy that’s fueling privacy labels, Screen Time, and the upcoming (Facebook-shanking) advertising opt-in.

Decent tools from Apple to help consumers make informed choices

Right now, users have tools like:

  • Email and notification controls
  • Privacy, location data, ads preferences
  • Terms of Service & Privacy Policy agreements
  • OS-level controls like Screen Time

That’s a start, though these features are nowhere near universally adopted. And even then, a user’s default experience is like fast food: they’re served the least nutritious version by default, additives and all. There’s a chasm between the impoverished users of today and the empowered users of tomorrow. Let’s get into it.

Imagined futures — five case studies on user choice

1/Gamification and Duolingo — Duolingo is notorious for its gamification elements. And gamification is a notoriously fraught topic: what’s the line between motivation and manipulation? I’m dreaming of a Duolingo where users can opt out of gamification features altogether. During onboarding, you could give users a transparent choice between “learning styles.” One would be the traditional gamified approach; the other would be more like a traditional course, relying on a user’s intrinsic motivation. The latter would have all the same content — minus the owl animations and badges and confetti. I know from my own experience at Crisis Text Line (CTL) that some volunteers love our gamification elements, and others resent them for minimizing the commitment. What if we could keep both groups happy?

Cut the games, deadass
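
Mechanically, this could be as simple as one preference checked at every gamified touchpoint. Here’s a minimal sketch in TypeScript, assuming a hypothetical UserPrefs object and render function (nothing here is Duolingo’s actual code):

```typescript
// Hypothetical learning-style preference, chosen during onboarding.
type LearningStyle = "gamified" | "classic";

interface UserPrefs {
  learningStyle: LearningStyle;
}

// One gate, consulted wherever a gamification element would render.
function gamificationEnabled(prefs: UserPrefs): boolean {
  return prefs.learningStyle === "gamified";
}

function renderLessonComplete(prefs: UserPrefs): string {
  // The lesson content is identical either way; only the dressing changes.
  const base = "Lesson complete. Next lesson unlocked.";
  return gamificationEnabled(prefs)
    ? base + " +10 XP! You're on a 7-day streak!"
    : base;
}
```

The point of the gate is that the course content never forks; only the motivational dressing around it does.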

2/Social features and Peloton — Similarly, what if we tore the social features out of Peloton? Community is Peloton’s key differentiator, and the company would probably argue that its social features help users build healthier habits. But researchers might disagree: leaderboards in learning systems can skew sexist, privileging competitive (stereotypically male) impulses over cooperative (stereotypically female) ones. And anyway, some people might prefer to work out alone, eschewing extrinsic motivators. What if users could simply go to a class, ride their bike, and not have to see anyone else’s avatars or high fives or stats?

Leave Britney alone!

3/Recommendation algos and Twitter — Users should have a right to control the content they see: how much of it, for how long, and from whom. Today, Twitter users can exercise some control: they can view tweets chronologically or ranked by an algorithm. The algorithmic feed actually caused a small uproar when it launched: users liked seeing what they wanted, as it came in (especially when livetweeting events like the Super Bowl).

Choose your own algorithmic adventure

Let’s take this concept a few steps further. What if a user could decide to only see content from their contacts? Or what if a user could limit their feed to only display X items at a time? Ideally we would have controlled feed experiences that reverse the TikTok dynamic. TikTok is successful in part because it pumps algorithmic content into its users’ veins, content that “reinforces the average biases and tastes in society.” Let’s not do that. Let’s let users decide whether they drink by firehose or by straw. [Edit: Twitter CEO Jack Dorsey suggested building “an app store for social media algorithms” as part of a decentralized social network — a clever idea that helps shift power towards consumers.]
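
To make that concrete, here’s a rough sketch of how those knobs might compose, assuming a hypothetical FeedSettings object and a generic tweet shape rather than Twitter’s real API:

```typescript
interface Tweet {
  authorId: string;
  timestamp: number;
  text: string;
}

// Hypothetical user-controlled knobs: from whom, in what order, and how much.
interface FeedSettings {
  contactsOnly: boolean; // only show accounts the user follows
  order: "chronological" | "ranked";
  maxItems: number; // drink by straw instead of firehose
}

function buildFeed(
  tweets: Tweet[],
  contacts: Set<string>,
  rankScore: (t: Tweet) => number,
  settings: FeedSettings
): Tweet[] {
  const pool = settings.contactsOnly
    ? tweets.filter((t) => contacts.has(t.authorId))
    : tweets;

  const ordered = [...pool].sort((a, b) =>
    settings.order === "chronological"
      ? b.timestamp - a.timestamp
      : rankScore(b) - rankScore(a)
  );

  // The user, not the product, decides where the feed ends.
  return ordered.slice(0, settings.maxItems);
}
```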

4/Autoplay and infinite time on YouTube — Building on the last point: users should be able to control the rate at which they see content, and set limits around consumption. YouTube has been found to radicalize users not just because of its recommendation algos, but because of autoplay, which essentially puts users on a roller coaster they can’t escape (similar to TikTok). Imagine a product where autoplay is disabled by default, and where users can set a time limit for themselves. I’d love to see a Screen Time-style feature in every product to stem the vampiric draw of infinite feeds.

Pupils yearning to be free
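
One way to invert the defaults, sketched with hypothetical player settings (not YouTube’s actual configuration): autoplay is opt-in, and a self-imposed daily limit can cut it off.

```typescript
// Hypothetical consumption settings. Note the inverted defaults:
// the user opts *in* to autoplay rather than out of it.
interface ConsumptionSettings {
  autoplay: boolean;
  dailyLimitMinutes: number | null; // null = no limit set
}

const DEFAULTS: ConsumptionSettings = {
  autoplay: false,
  dailyLimitMinutes: null,
};

// Called when a video ends, to decide whether the next one starts.
function shouldPlayNext(
  settings: ConsumptionSettings,
  minutesWatchedToday: number
): boolean {
  if (!settings.autoplay) return false; // no roller coaster by default
  const limit = settings.dailyLimitMinutes;
  if (limit !== null && minutesWatchedToday >= limit) {
    return false; // the user's own Screen Time-style cutoff
  }
  return true;
}
```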

5/Granular data controls and Instagram — Data controls are a little off-topic — they go beyond pure UI changes — but they’re another instructive example. Start with the assumption that users own their data. In this world, they could sell it or lend it to a company like Instagram. Users could stipulate that their browsing data be wiped every 30 days or so. Or they could sell that browsing data to Instagram for $10 a month (which IG would use to fuel its ads products). Further still, they could sell their data to a specific advertiser, like Target, in exchange for a discount (or whatever terms they set). This example gets capitalist-dystopic fast — but the status quo is scarier. In 2021, tech companies peddle addictive products and use the data those products generate to keep users coming back, in a vicious cycle. And all the while, the companies sell and resell that data to a vast web of third parties. Tech companies have a monopoly over attention and data, exploiting both much as their forebears exploited fossil fuels.
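
As a thought experiment, the user’s terms could even be expressed as a simple data structure the platform is obligated to enforce. This is purely hypothetical; no such contract exists in any real API today:

```typescript
// Hypothetical terms a user attaches to their own data; the
// platform would be obligated to honor them.
interface DataTerms {
  retentionDays: number;          // e.g., wipe browsing data after 30 days
  licensedTo: string[];           // who may use the data (no resale)
  monthlyPriceUsd: number | null; // what the platform pays for access
}

const myTerms: DataTerms = {
  retentionDays: 30,
  licensedTo: ["instagram.com"],
  monthlyPriceUsd: 10,
};

// The retention check a platform would run against each stored record.
function mustDelete(terms: DataTerms, recordAgeDays: number): boolean {
  return recordAgeDays > terms.retentionDays;
}
```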

The business case for user choice
The business case for radical user choice makes sense in the long term. Consumers are becoming increasingly aware of how they’re manipulated, and they’re demanding more rights over data, privacy, and user experience. I can tell you as a product leader that implementing some of the examples above would be a nightmare, because they fly in the face of modern software development. Modularized UX adds exponential complexity — to product development, research, testing, data analysis, etc. — making it harder to create a cohesive experience. After all, businesses generally like streamlined, focused products that are easier to sell.

But the world is moving in the direction of choice. By adding more choice, and modularizing experiences, companies can regain and maintain user trust while unlocking new users. There’s a technical advantage, for example, to modularizing UI in this way. A Duolingo without gamification features is lighter and faster, requiring less storage and processing. A stripped-down version would allow Duolingo to work in more places, on more devices, and at lower speeds. It’s more inclusive, more accessible, and, perhaps, an all-around better experience.
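
For instance, with standard code-splitting, the gamification bundle could be loaded only for users who opted in. A sketch, assuming a hypothetical ./gamification module:

```typescript
// Hypothetical app entry point. The gamification bundle is only
// fetched for users who opted in, so the stripped-down experience
// ships less code and runs lighter on low-end devices.
async function initLearningApp(prefs: {
  learningStyle: "gamified" | "classic";
}) {
  if (prefs.learningStyle === "gamified") {
    // Dynamic import: badges, streaks, and confetti live in a
    // separate chunk that bundlers like webpack can code-split.
    const { initGamification } = await import("./gamification");
    initGamification();
  }
  // Core lesson content loads either way.
}
```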

It’s incumbent on companies to start providing these features before they’re required further “up the stack.” Take the example of time limits. Apps weren’t providing the feature, so Apple implemented it at the OS level with Screen Time. Apple could also require time limits at the App Store level. Or, zooming out even further, regulators could require limits on every app built in the U.S. It’s just a matter of time.

Tech nonprofits have a unique advantage in that they can pioneer new models. Signal, for example, has pioneered encrypted messaging, which is now being embraced by the big companies. A Facebook exec recently declared (via internal memo): “We should become the undisputed leaders in providing privacy aware software.”

Businesses will be increasingly judged by their ethical decisions — and the best ones will differentiate themselves by putting power back in the hands of users. Radical user choice is in fact a conservative business strategy. — XML

Written by Michael Linares
Product Director @ Crisis Text Line