For Designers & Engineers

Build what comes next

If you design, code, or shape technology, this path reveals how easy it is to become complicit—and how to resist. Experience algorithmic bias you didn't intend, manipulation you engineered, and exclusion baked into your assumptions. Study real AI failures, then design alternatives.

⏱️ 75 minutes
📍 9 stops
👥 Designers, engineers, product managers

Scroll down to begin your journey. Each experience builds on the last.

1
game

Design the Worst AI

What if good intentions lead to terrible outcomes?

Build the "most engaging" app by toggling features: notifications, infinite scroll, personalization. Watch as it becomes addictive and chaotic. Congratulations—you just designed Instagram.

⏱️ 3 min

Why start here: You need to feel complicity before you can resist it. Toggle on features, watch the metrics soar, and realize what you've built. This is your wake-up call. (A minimal sketch of that scoring loop follows.)
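
For engineers who want the mechanic laid bare, here is a minimal sketch, with hypothetical feature names and made-up weights, of how an "engagement score" can reward every dark pattern you switch on while the cost to users never appears on the dashboard.

```typescript
// Hypothetical sketch: illustrative weights only, not measured values.
type Feature = "pushNotifications" | "infiniteScroll" | "personalization" | "streaks";

// Each toggled feature multiplies "engagement"...
const ENGAGEMENT_BOOST: Record<Feature, number> = {
  pushNotifications: 1.4,
  infiniteScroll: 1.8,
  personalization: 1.5,
  streaks: 1.3,
};

// ...and quietly adds to a cost the metrics dashboard never shows.
const HIDDEN_USER_COST: Record<Feature, number> = {
  pushNotifications: 0.3, // interruptions, anxiety
  infiniteScroll: 0.5,    // lost sleep, lost time
  personalization: 0.4,   // filter bubbles
  streaks: 0.2,           // compulsive check-ins
};

function score(enabled: Feature[]) {
  return {
    engagement: enabled.reduce((e, f) => e * ENGAGEMENT_BOOST[f], 1),
    hiddenCost: enabled.reduce((c, f) => c + HIDDEN_USER_COST[f], 0),
  };
}

// Toggle everything on: engagement looks great, the harm stays invisible.
console.log(score(["pushNotifications", "infiniteScroll", "personalization", "streaks"]));
```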

2
game

The Bias Machine

Can AI be objective if humans are not?

Train an AI to approve mortgages. Watch it learn digital redlining: discriminating by zip code as a proxy for race, even when demographics are hidden (the proxy effect is sketched in code below). Based on real cases: a Berkeley study that found $250-500M/year in discriminatory charges, HUD settlements, and Chicago Tribune exposés showing denial rates 2.5x higher for Black applicants.

⏱️ 5 min
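
A minimal sketch of the effect the game demonstrates, using synthetic data and a deliberately naive "model": the protected attribute never appears as a feature, yet learning from biased historical approvals reproduces the bias through zip code. The zip codes and numbers are invented for illustration.

```typescript
// Synthetic data, toy "model" -- for illustration only.
interface Application { zip: string; income: number; approvedHistorically: boolean; }

// Identical incomes, but one (invented) zip code was systematically denied in the past.
const history: Application[] = [
  ...Array.from({ length: 50 }, () => ({ zip: "60601", income: 60_000, approvedHistorically: true })),
  ...Array.from({ length: 50 }, (_, i) => ({ zip: "60619", income: 60_000, approvedHistorically: i % 5 < 2 })),
];

// "Training": learn the historical approval rate per zip code. Race is never a feature.
const rateByZip = new Map<string, { approved: number; total: number }>();
for (const a of history) {
  const s = rateByZip.get(a.zip) ?? { approved: 0, total: 0 };
  s.approved += a.approvedHistorically ? 1 : 0;
  s.total += 1;
  rateByZip.set(a.zip, s);
}

// "Inference": approve when the learned zip-level rate clears a threshold.
function approve(app: { zip: string; income: number }): boolean {
  const s = rateByZip.get(app.zip);
  return s !== undefined && s.approved / s.total > 0.5;
}

console.log(approve({ zip: "60601", income: 60_000 })); // true
console.log(approve({ zip: "60619", income: 60_000 })); // false -- same income, different zip
```
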
3
game

The Invisible User

Who did we forget in the design?

Experience ChatGPT through the eyes of a blind user with a screen reader, someone on 3G internet in rural Kenya, an elderly person with vision loss, and a non-English speaker. Watch what breaks. (One common breakage, a control with no accessible name, is sketched below.)

⏱️ 3 min
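
A tiny, hypothetical example of the kind of failure the screen-reader walkthrough surfaces: an icon-only "send" control with no accessible name is effectively invisible to a blind user. The markup and the naive check are illustrative; real audits use tools like axe-core.

```typescript
// Two ways to render the same "send" control (hypothetical markup).
const inaccessibleSend = `<div class="btn" onclick="send()">➤</div>`;                   // a screen reader announces nothing useful
const accessibleSend   = `<button type="submit" aria-label="Send message">➤</button>`; // announced as "Send message, button"

// Deliberately naive check for illustration: does the control expose a text alternative?
function hasAccessibleName(markup: string): boolean {
  return /aria-label="[^"]+"/.test(markup);
}

console.log(hasAccessibleName(inaccessibleSend)); // false -- invisible to the blind user
console.log(hasAccessibleName(accessibleSend));   // true
```
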
4
investigation

The Ghost Workers

Who labels your training data?

An investigative exposé revealing the hidden human cost of AI. Kenyan workers paid $1.32/hour to review traumatic content. Real data, real testimonials, real consequences.

⏱️ 5 min
5
investigation

The Energy Bill

Who really pays for your free AI?

Track your daily AI usage and calculate its energy cost (a back-of-envelope sketch follows this card). Then discover who really pays: Mesa, Arizona (Meta data centers, 905M gallons of water, extreme drought); West Des Moines, Iowa (Microsoft, 70M gallons, the city's largest water user); Northern Virginia (AWS, 102 data centers, 70% of the world's internet traffic). Small towns sacrifice water, energy, and quality of life for your free ChatGPT.

⏱️ 4 min
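
The kind of arithmetic the experience walks you through, reduced to a sketch. The per-query energy and water figures below are assumptions chosen for illustration; published estimates vary widely and are disputed.

```typescript
// Back-of-envelope only. These constants are assumptions, not measurements.
const WH_PER_QUERY = 3;        // assumed watt-hours per chatbot response
const ML_WATER_PER_QUERY = 30; // assumed millilitres of cooling water per response

function yearlyFootprint(queriesPerDay: number) {
  const queries = queriesPerDay * 365;
  return {
    kWh: (queries * WH_PER_QUERY) / 1000,
    litresOfWater: (queries * ML_WATER_PER_QUERY) / 1000,
  };
}

// One heavy user at 25 queries a day: roughly 27 kWh and 274 litres a year
// under these assumptions -- then multiply by hundreds of millions of users.
console.log(yearlyFootprint(25));
```
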
6
investigation

AI Failures Archive

What went wrong?

Browse 15 documented AI failures: Tay chatbot, COMPAS bias, wrongful arrests, the Uber fatality, healthcare discrimination, Roomba privacy violations, Instagram harm, and more. Filter by category. Read what happened, who was harmed, and why it matters. These aren't bugs; they're patterns. (The archive's basic shape is sketched below.)

⏱️ 4 min
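
A sketch of the shape such an archive might take; the field names, categories, and which entry goes where are hypothetical, but the sample incidents come from the list above.

```typescript
// Hypothetical schema for the archive; the three sample entries are real incidents.
interface Failure {
  name: string;
  category: "bias" | "safety" | "privacy" | "manipulation";
  year: number;
  whoWasHarmed: string;
}

const archive: Failure[] = [
  { name: "Tay chatbot", category: "manipulation", year: 2016, whoWasHarmed: "targets of the abuse it learned to repeat" },
  { name: "COMPAS risk scores", category: "bias", year: 2016, whoWasHarmed: "Black defendants falsely flagged as high risk" },
  { name: "Uber autonomous-vehicle fatality", category: "safety", year: 2018, whoWasHarmed: "a pedestrian in Tempe, Arizona" },
];

// "Filter by category" is just a predicate over the archive.
const byCategory = (c: Failure["category"]) => archive.filter(f => f.category === c);

console.log(byCategory("bias").map(f => f.name)); // ["COMPAS risk scores"]
```
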
7
creative

Design Counter-Algorithms

What if algorithms served humans?

Design your ideal algorithm with 6 value sliders: Confirmation↔Challenge, Viral↔Quality, Endless↔Finite, Echo Chamber↔Diversity, Outrage↔Calm, Machine↔Human. Then apply your algorithm to real products (Instagram, YouTube, TikTok, Twitter/X, Netflix, LinkedIn) and switch between tabs to compare each platform's current and reimagined versions. Discover real alternatives: Mastodon, BeReal, RSS, Wikipedia, Substack. Actionable steps: switch to chronological feeds, turn off recommendations, follow fewer accounts but more deeply. (One way sliders like these could drive a ranking function is sketched below.)

⏱️ 4 min
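
A hypothetical sketch of how slider positions could become weights in a feed-ranking function. The signal names, weights, and scoring rule are all invented for illustration; no platform's actual ranking is implied.

```typescript
// All fields and weights are illustrative; slider values run from 0 (left pole) to 1 (right pole).
interface Sliders {
  challenge: number;  // Confirmation <-> Challenge
  quality: number;    // Viral <-> Quality
  finite: number;     // Endless <-> Finite
  diversity: number;  // Echo Chamber <-> Diversity
  calm: number;       // Outrage <-> Calm
  human: number;      // Machine <-> Human
}

interface Post {
  virality: number;           // engagement-bait signal, 0..1
  depth: number;              // e.g. completion or reading time, 0..1
  outrage: number;            // anger/toxicity classifier output, 0..1
  outsideYourBubble: number;  // dissimilarity to what you already like, 0..1
  fromSomeoneYouFollow: boolean;
}

// Higher score = shown earlier in the feed.
function rank(post: Post, s: Sliders): number {
  return (
    (1 - s.quality) * post.virality +
    s.quality * post.depth +
    (s.diversity + s.challenge) * 0.5 * post.outsideYourBubble +
    (1 - 2 * s.calm) * post.outrage +                  // outrage only pays if you ask for it
    (post.fromSomeoneYouFollow ? s.human * 0.5 : 0)
  );
}

// The Endless <-> Finite slider caps the feed instead of reordering it.
const feedLength = (s: Sliders) => Math.round(200 - 170 * s.finite); // 200 posts down to 30
```
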
8
creative

Speculative Futures

What world do you want?

Design your AI future (2035). Make 6 policy decisions: training data, surveillance, high-stakes AI, labor, transparency, creative rights. Experience a day in the world you created. See the tradeoffs. Discover there are no perfect answers, only choices. (A minimal sketch of the decision-and-tradeoff structure follows.)

⏱️ 8 min
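
One way to model the exercise's core idea, that every option costs something: a hypothetical sketch where each of the six decisions maps to a stance and each stance carries an explicit trade-off. Names and consequences are illustrative only.

```typescript
// Hypothetical model: two stances per decision, each with a gain and a cost.
type Stance = "permissive" | "restrictive";

interface Policy2035 {
  trainingData: Stance;
  surveillance: Stance;
  highStakesAI: Stance;
  labor: Stance;
  transparency: Stance;   // "permissive" here means letting systems stay opaque
  creativeRights: Stance;
}

const tradeoff: Record<keyof Policy2035, Record<Stance, string>> = {
  trainingData:   { permissive: "better models, uncompensated creators", restrictive: "consent respected, slower progress" },
  surveillance:   { permissive: "cases solved faster, chilling effects", restrictive: "privacy kept, some crimes unsolved" },
  highStakesAI:   { permissive: "cheaper decisions, errors at scale", restrictive: "human review, longer waits" },
  labor:          { permissive: "cheap automation, displaced workers", restrictive: "jobs protected, higher prices" },
  transparency:   { permissive: "trade secrets kept, unexplained decisions", restrictive: "audits possible, IP exposed" },
  creativeRights: { permissive: "remix culture thrives, artists undercut", restrictive: "artists paid, less derivative work" },
};

// No configuration escapes its costs: every choice reports a gain and a loss.
function describe(p: Policy2035): string[] {
  return (Object.keys(p) as (keyof Policy2035)[]).map(k => `${k}: ${tradeoff[k][p[k]]}`);
}

console.log(describe({
  trainingData: "restrictive", surveillance: "restrictive", highStakesAI: "restrictive",
  labor: "restrictive", transparency: "restrictive", creativeRights: "restrictive",
}));
```
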
9
creative

Write Your AI Bill of Rights

What rights should humans have?

Draft your own AI Bill of Rights, modeled on the one in the US Constitution. Select from 10 potential rights (notification, explanation, opt-out, correction, human review, data deletion, non-discrimination, meaningful consent, privacy, compensation). See real scenarios showing how each right changes lives: job applications, loans, medical diagnosis, welfare, sentencing. Based on real events: the Amazon hiring tool, Apple Card bias, COMPAS, the Dutch welfare scandal, Clearview AI. Sign and share your bill. (A sketch of how chosen rights map onto scenarios follows.)

⏱️ 4 min
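
A sketch of the structure behind the exercise: the ten candidate rights come from the description above; the scenario, the rights it needs, and the gap-finding logic are hypothetical.

```typescript
// The ten candidate rights, as listed in the experience.
type Right =
  | "notification" | "explanation" | "opt-out" | "correction" | "human review"
  | "data deletion" | "non-discrimination" | "meaningful consent" | "privacy" | "compensation";

interface Bill { author: string; rights: Right[]; signedAt: Date; }

// Hypothetical scenario: an automated loan denial, and the rights it would invoke.
const loanDenialNeeds: Right[] = ["notification", "explanation", "human review", "non-discrimination"];

// Which rights does a scenario need that your bill left out?
function protectionGaps(bill: Bill, needs: Right[]): Right[] {
  return needs.filter(r => !bill.rights.includes(r));
}

const myBill: Bill = { author: "you", rights: ["privacy", "explanation", "opt-out"], signedAt: new Date() };
console.log(protectionGaps(myBill, loanDenialNeeds));
// ["notification", "human review", "non-discrimination"] -- the gaps your draft leaves open
```
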
Experience #10

Your Turn

You've completed 9 experiences in "For Designers & Engineers".

What's missing? What did you learn that we didn't cover? What invisible system needs to be made visible?

33.3 is unfinished by design. We're broadcasting at an incomplete frequency, waiting for your signal.

Share your experience idea. If it fits, we'll build it and credit you.

Ready to Begin?

Start with the first experience and work your way down. Each one builds on the last.

33.3 | The Invisible Side of AI