“Most fears about AI and technology are best understood as fears about capitalism… How capitalism will use technology against us.”
— Ted Chiang
This quote, from sci-fi writer Ted Chiang (via Ezra Klein’s podcast), gets to the crux of it. The ills we see in tech are a direct result of the capitalist impulse — a worldview that renders everything “an optimization problem.” It’s this impulse that drives us to ship code without fully considering the consequences.
Growth is a dangerous ideology. It’s one that’s taken hold across the globe — in our financial systems, in business, in the internet’s architecture, and in software development. Endless growth is neither possible nor desirable, and it’s killing us — emissions related to the internet and its data centers account for 3.7% of global greenhouse gas emissions, as much as the airline industry. In today’s short post, I’ll unpack the idea of growth, offer a few alternatives for a more sustainable internet, and end with a note on growth and social impact.
You can’t change what you don’t measure — and measurement is a core challenge of building inclusive products.
In this post, we’ll cover micro-level quantitative frameworks for measuring product inclusion and product equity — areas like representativeness, product outcomes, and subjective experience — as well as more macro-level mitigating methods like boosting negative signals and Red Teaming. To build truly inclusive products you need to safeguard your marginalized users on both fronts. …
Software is manipulative and everyone knows it. Companies use various techniques to manipulate users — techniques like gamification, social pressure, ranked content optimized for engagement, etc. — and set these as defaults. Users are given some semblance of control in the form of service agreements and preferences, masking what is essentially an asymmetric power dynamic. In other words, companies create an irresistible product and users make a binary choice: they submit to it or they don’t. What if a user could radically opt in or out of manipulative tactics? And how could products support this kind of modularized UX/UI?
Marginalized people are often marginalized in datasets. It’s an inversion of what happens IRL: in real life, minorities can’t help but stick out, but in data they’re lumped in with the rest. In data, marginalized people, if they are encoded at all, are often hidden in averages and other aggregate measures; finding those signals requires a mix of commitment and creativity. In this longish post, I’ll (attempt to!) use a case study to illustrate how exactly to do that, which I hope is useful for people who regularly play with data (PMs, marketers, strategists, etc.).
Here’s a rundown of all…
Traversing the internet is inherently lonely — it’s just you and a device — and things are doubly lonely during COVID. More and more, organizations are realizing that community can be a big differentiator for their cause or business. Indeed, “community” has become a bit of a buzzword across the industry, resulting in more than a few ham-handed or inauthentic efforts. I have quite a bit of experience building community products at Lean In, Option B, and Crisis Text Line. When done well, community can differentiate your org/product/experience and drive growth (new users), engagement (more activity), and retention (longer relationships).
I can’t stop thinking about this story from December by Documented NY (a nonprofit journalism organization) about how Latinx immigrants are scammed on WhatsApp. It makes for a good case study in how we might apply product inclusion principles. Oftentimes, talking to your most vulnerable users can reveal larger, systemic issues. The case study is timely, as WhatsApp users are fleeing the platform for more secure alternatives like Signal — which is being driven, ironically, by misinformation across WhatsApp 🥴.
Noom is one of my favorite apps at the moment. It helps people lose weight and eat better, a Weight Watchers for the millennial set. Noom’s effective techniques are backed by research and behavioral psychology. Today I’d like to focus specifically on how Noom uses expectations to shape their user journey, and in a way that feels ethical.
What to expect?
Expectation setting is a valuable technique used in a variety of domains, from education to sales, to help people reach their goals. Any UX designer will tell you that it’s core to what they do, i.e. communicate why a product…
Morals ain’t cheap — especially when they require engineering.
I love this read on the “Privacy Tax” from The Markup’s editor-in-chief, Julia Angwin. She defines it as “the time and money orgs spend building or customizing tools that would be off-the-shelf for other websites.” The Markup is deeply committed to user privacy, as well as user safety and accessibility. Over the last year, they built cookie- and tracking-free alternatives to products from YouTube, Stripe, Eventbrite, and others. Here are a few of their products and what they cost:
This is a web log about how ~tech can change the world~. I’m not talking about ads. I’m talking about the growing movement of tech nonprofits — organizations that seek to improve society, rather than raise profits, with novel technologies.
It’s a fascinating and diverse space. There are orgs that provide direct services (Trevor Project), mobilize digital communities (Swing Left), democratize knowledge (Wikimedia), expand journalism (The Markup), operate sustainable marketplaces (Etsy), and even modernize gov infrastructure (Code for America).
Tech has a bad rap right now. Rightly so. But there’s a lot of goodness out there, too. I’m more interested…