Whistle-blower sues Facebook’s owner Meta over ‘poor’ working conditions

The first video Daniel Motaung had to watch while working as a Facebook content moderator in Kenya was of a beheading. After just six months in the job, his mental health was spiralling.

“I was actually dysfunctional. I wasn’t able to think properly,” Motaung, 31, said.

Today, he is suing Facebook’s owner Meta over the working conditions he faced, and he has helped set up the first African union for content moderators, whose daily job involves reviewing graphic content so that social media users are spared from seeing it.

‘Respect we deserve’

“The entire [social media] business model is actually dependent on content moderation. It’s high time they recognise that and treat us with the respect we deserve,” he said following the union’s launch last week during a meeting of Facebook, TikTok and ChatGPT moderators in Nairobi.

Globally, thousands of moderators review posts containing graphic content – keeping harmful material from appearing on users’ feeds.

Many work for third-party contractors rather than directly for tech companies.


Motaung’s lawsuit, which was also filed against Meta’s local outsourcing company Sama, seeks financial compensation, an order that outsourced moderators have the same healthcare and pay scale as Meta employees, union protection and an independent human rights audit of the office.

Asked to comment on the lawsuit’s allegations, a Sama spokesperson said the company cared “deeply about the health and emotional well-being of our team” and disputed accusations of low pay and poor conditions.

A Meta spokesperson said the company does not comment on ongoing litigation, but that it works only with contractors that provide above-average pay and well-being support.

Facebook must honour moderation work

Motaung’s lawsuit could have implications for how the firm works with moderators worldwide, said Cori Crider, a director at Foxglove, a London-based tech-justice advocacy group supporting the case.

“That’s the ambition of this case: to force Facebook to honour moderation work and make it safe and fair,” she said.

Motaung, who is South African, said he did not expect his university studies on workers’ rights to come in handy when he accepted the content moderation job in Nairobi in 2019, but soon felt impelled to speak up.


He recounted seeing colleagues fainting, crying and binge-drinking and hearing others complain of flashbacks and insomnia.

Most of Sama’s moderators were paid less than $2 (about R38) per hour, but many stuck with the job because they were made to feel “expendable” and desperately needed the work, Motaung said.

Mandatory work breaks

The Sama spokesperson detailed well-being policies including mandatory work breaks, limits on weekly hours and counselling sessions, and said wages were high in local terms.

“We recognise that content moderation is a tough job and have paid wages that are four times the minimum wage and 2.5 times the living wage in Kenya as a recognition of that,” the spokesperson said.

Motaung, who was fired by Sama when he banded together with fellow moderators to challenge working conditions, said he now saw his main job as speaking out about “the politics of content moderation”.

His activism has spurred further legal action from content moderators and global support from rights groups. In March, 43 applicants said they were also suing Meta, alleging they lost their jobs after organising a union.


His lawsuit is also seen as significant because a labour court ruled that the US company could be sued in the East African country – a judgment that Meta swiftly appealed.

“At the core of the case is a fundamental question: can the world’s richest tech firms operate in Kenya and profit from workers in Kenya, while refusing to answer to Kenyan justice?” Crider said.

“The answer will set the terms of tech accountability in the region for years to come.”

Motaung, who was diagnosed with post-traumatic stress disorder, urged other content moderators to speak up.

“There are support systems out there that can help,” he said. “Or find me, send me a message. Don’t die in silence.”

– Context is powered by the Thomson Reuters Foundation newsroom
