Mission

To make invisible technology systems visible through age-appropriate conversations and activities

Why did I start 33.3?

I have two boys, 13 and 10 years old, who deeply care about the world and its people. They think wars are pointless. They want the artists they love to be happy, well paid, and able to create more. They want oceans and fish to thrive, and ice caps to survive. They want privacy and respect. But they are also incredibly trusting of anything the world throws at them.

The thing is, I don't want them to stop caring as they get older—to get desensitized as adults the way so many of us have.

This is not unique to my kids. I teach AI and Ethics at my son's middle school, and I've learned that every child is this way. Until they're not.

The question is, how do we show them, with kindness, the world they're inheriting?

I believe technology is a great tool to advance humanity, and I'm generally optimistic about all the problems we can solve in the world. But the reality is that many products and platforms today are built for engagement and revenue, methodically desensitizing their users to the harm they cause in favor of convenience.

This project is an honest attempt at unveiling what happens behind the curtain. I started with a simple question: how do we make the invisible infrastructure visible, the human labor, the exclusion, the environmental costs that AI rests on? Mostly, I built this for my kids, with the hope that once they learn to see these things, really see them, they won't have to choose between caring and participating. Several months of design, research, and conversations went into building it.

The world is very simple for me: in a few years, my children and their friends will inherit everything in this world, the good and the bad. They need to understand how to reshape it.

The technology we have isn't the technology we're stuck with. It's just the technology we've chosen so far.

Can all our rich imagination be focused on making the world better? I believe it can.

This project is meant to be a catalyst for dinner and classroom conversations. A seed.

Why the name 33.3?

Like a radio station broadcasting on frequencies you can't see, AI infrastructure operates on channels most people never tune into. You hear the music—the recommendations, the chatbots, the filters—but you don't see the transmission towers, the power grid, or the workers keeping the signal alive.

I hope 33.3 can be the station that makes the invisible visible. A curated broadcast for parents, educators, and anyone raising kids who will inherit this world.

What I Want My Children to See

When my boys use ChatGPT, Claude, or Gemini for homework help, I want them to know about the Kenyan content moderators earning $2 an hour, required to view 150 to 200 pieces of traumatic content daily (beheadings, child abuse, suicides) to train the safety systems that keep their experience "clean." These workers receive no mental health support, and when they develop PTSD, they're easily replaced. The AI that helps with algebra rests on someone else's trauma.

I want them to understand that "universal" AI excludes the 2.2 billion people with vision loss who can't use these systems because the interfaces require mouse interaction that screen readers can't navigate. And that the 3.7 billion people still offline, mostly in the Global South, will never access the tools my children take for granted. Universal means "for people like us."

When we talk about climate change, something they already care deeply about, I want them to know that training GPT-3 produced 552 tons of CO2, the equivalent of driving 1.5 million miles. But the real cost isn't abstract. It's the 905 million gallons of water projected for Meta's data centers in Mesa, Arizona in 2025, while the Southwest faces historic drought. Silicon Valley gets the innovation; rural communities get the water bills.

I want them to understand that features designed to keep them scrolling aren't accidents—they're psychological manipulation by design. Snapchat's streaks trigger FOMO and anxiety. Instagram's "double-tap" uses slot machine psychology. TikTok's infinite scroll exploits the same dopamine pathways as gambling. Engineers with PhDs in behavioral psychology designed these features knowing they would be addictive. Knowing kids would get anxious. And they shipped them anyway.

And when they're old enough to apply for jobs or mortgages, I want them to recognize when AI is screening them—and to know that the $25.6 million stolen from Arup via a deepfake CFO on a Zoom call isn't science fiction. It's Tuesday.

These aren't merely side effects of technology; they are often design choices.

The invisibility is sophisticated and, sadly, intentional. Platforms describe themselves as "powered by AI" rather than "powered by underpaid data labelers in the Global South." They say "free to use" instead of "funded by the commodification of your attention and intimate data." The language itself obscures the human cost. And it's important to talk about that.

But that doesn't mean we have to maintain or respect this status quo.

How can you use this?

Use the curated paths and lesson plans, which are constantly evolving. Have conversations with the young people in your life about these ideas. There's really no script, only intention and learning.


If this resonates with you or helps in any way, please drop a line. I would be happy to know that I'm not alone in thinking this way.

Designed and created by Saranyan Vigraham

All experiences, concepts, research, and fact-checking done by a human.

Used Claude for coding assistance.