The Ghost Workers

Who labels your training data?

An investigation into the hidden human cost of artificial intelligence.

$1.32/hour

What OpenAI paid Kenyan workers to make ChatGPT "safe"

Source: TIME Magazine, January 18, 2023

What the work involves

Before ChatGPT could become "safe" for you to use, thousands of workers in Kenya, the Philippines, Venezuela, and India had to read and label the most disturbing content imaginable.

Child sexual abuse material (CSAM)

Graphic descriptions of bestiality, torture, and murder

Detailed accounts of suicide and self-harm

Extreme hate speech and incitement to violence

The Quota

150-250 passages of this content. Every day. For 9 hours.

Source: TIME Magazine

The workers

"That was torture... I couldn't even think anymore. My brain was just stuck."

— Kenyan data labeler, after reading a graphic description of child abuse

"The work's traumatic nature eventually led Sama to cancel all its work for OpenAI eight months earlier than planned."

— TIME Magazine investigation

"I couldn't sleep. Every time I closed my eyes, I saw it again."

— Content moderator, describing recurring visions

Mental Health Impact

47.6%

of content moderators scored at levels associated with clinical depression

34.6%

reported moderate-to-severe psychological distress

Source: PubMed Study, 2024

The pay disparity

Kenyan Data Labeler

$1.32 - $2.00

per hour

vs

OpenAI Engineer (SF)

$500,000+

per year

How long to afford basic needs at $2/hour?

Meal for a family of 4: ~3 hours
Monthly rent (Nairobi): ~100 hours
Trauma therapy session: ~25 hours

Where the work happens

AI companies outsource this work to the Global South, where labor is cheap and regulations are weak.

🇰🇪 Kenya

Major hub for OpenAI, Meta, Google

Avg. wage: $1.32-2/hour

🇵🇭 Philippines

Content moderation for social media

Avg. wage: $2-3/hour

🇻🇪 Venezuela

Data labeling under strict timers

Often no bathroom breaks

🇮🇳 India

Large-scale annotation work

Algorithmic surveillance

The race to the bottom: As AI companies compete on cost, wages fall and working conditions worsen. Workers in these countries often don't know which company they're working for or what their labor will be used to build.

Why "ghost" workers?

They're called "ghost workers" because they're invisible to you. When you use ChatGPT, you don't see them. When you read that AI is "safe," you don't think about them. When tech companies talk about "AI safety," they rarely mention them.

They can't speak to managers directly

They get no feedback on their work

They have no labor protections

They're paid through third-party contractors to avoid liability

They're bound by NDAs that prevent them from speaking out

The AI industry is built on a foundation of invisible, exploited labor. Without these workers, ChatGPT would be unusable. But they're kept hidden—because if you knew the human cost, you might ask uncomfortable questions.

What the companies say

OpenAI's response

OpenAI stated that it requires partners to pay workers a living wage and provide mental health support. However, workers reported earning far below a living wage and receiving inadequate mental health resources.

The contract was terminated early due to the "traumatic nature" of the work.

Meta's legal battles

In 2024, a Spanish court ruled that a content moderator's mental health condition was work-related. Meta reached settlements in cases accusing them of failing to protect moderators from psychological injuries.

Source: The Japan Times, July 2025

The contradiction

AI companies market their systems as "autonomous" and "intelligent," yet those systems depend entirely on massive human labor forces working under conditions that would be illegal in Silicon Valley.

What you can do

Support labor organizing

Workers are organizing. Follow and support groups like the Communications Workers of America's Ghost Workers campaign.

Demand transparency

Ask AI companies: Who labeled your data? What were they paid? What mental health support did you provide?

Support regulation

Advocate for laws protecting content moderators and data workers, including mental health support and living wages.

Recognize the labor

Every time you use AI, remember: there's a human behind it who may have been traumatized to make it work.

Share this story

The only way to change this is to make it visible. Share what you learned.

"There is no such thing as fully autonomous AI. Behind every system is human labor—exploited and hidden from view."

Every "AI safety" feature comes at a human cost.
