Job cuts by Elon Musk decimated Twitter team tackling child sexual abuse
Elon Musk has dramatically reduced the size of the Twitter team devoted to tackling child sexual exploitation on the platform, cutting the global team of experts in half and leaving behind an overwhelmed skeleton crew, people familiar with the matter said.
The team now has fewer than 10 specialists to review and escalate reports of child sexual exploitation, three people familiar with the matter said, asking not to be identified for fear of retaliation. At the beginning of the year, Twitter had a team of about 20, they said.
The change comes as lawmakers in the European Union and the U.K. are planning broad-reaching online safety rules that will require social media platforms to better protect children or face significant fines.
Twitter didn’t respond to a request for comment.
The team — a mix of former law enforcement officers and child safety experts based in the U.S., Ireland and Singapore — was stretched before the cuts, working long hours to respond to user reports and legal requests, the people said. They were responsible for stopping the distribution of child sexual abuse material, instances of online grooming, and media that promoted attraction to minors as an identity or sexual orientation.
Last week, Musk tweeted that “removing child exploitation is priority #1” and called on people to “reply in the comments if you see anything that Twitter needs to address.”
Some prominent hashtags associated with child sexual exploitation have been removed since Musk took over, changes that had been in the works before he joined, the people said. Still, combating this type of messaging isn’t always as simple as removing tweets containing the offending hashtags since many have other, innocuous purposes, they said. Offenders also constantly change the terms they use to evade detection.
Although artificial intelligence-based tools can be useful for identifying images that have already been reviewed and categorized as child sexual exploitation material by law enforcement, human review is particularly important for recognizing the nuances of grooming and other exploitative behaviors, identifying previously unknown abusive images and videos, and understanding regional differences in the law, the people said. Humans are also required to respond to requests from law enforcement as part of criminal investigations.
Losing specialists in Europe and Singapore will make policing non-English-speaking markets a particular challenge, the people said.
These specialists worked closely with dedicated product managers and engineers to build tools and automation to stop the spread of the material, as well as with third-party contractors who helped triage posts that users reported. Only a few employees were cut in the first round of layoffs, but the team was decimated when Musk called on Twitter’s workers to commit to a “hardcore” culture or lose their jobs, the people said. Musk didn’t create an environment where the team wanted to stay, the people said.
The defections were part of a broader exodus at Twitter’s trust and safety team, whose members left after Musk sent the ultimatum this month, people familiar with the matter said previously. The company also lost a significant number of its employees who block foreign disinformation campaigns on the platform, and entire swaths of Twitter’s audience have been left without content moderation, one of the people said. In the Asia-Pacific region, just one contractor hired to help with spam in the Korean market remained, the person said.
Twitter also has cut a number of contractors who helped moderate content, Axios has reported. Social media platforms including Facebook, TikTok and Twitter use third-party moderators to help sift through flagged posts for violations.
Unlike other types of egregious content that violates Twitter’s rules, child sexual abuse material is illegal to host on the platform, and, depending on the country, there are requirements to take down and report material within specific time limits.
In the U.K., the Online Safety Bill gives regulators the power to fine platforms hosting user-generated content as much as 10% of their revenue if they fail to police the content effectively.
The EU is also planning regulation that would require tech companies to take a more aggressive approach to detecting sexual abuse material.
The European Commission’s controversial proposal would give courts the power to require companies to scan for material in messages, even if they are end-to-end encrypted. The commission also wants companies to detect grooming via artificial intelligence and use age verification to find minors on their platforms.
“Elon Musk has been very vocal about his commitment to tackling online child sexual abuse,” said Ylva Johansson, the EU commissioner in charge of the proposal. “I fully expect him to follow through on these public commitments.”
“Having experienced experts and teams in place, as well as those familiar with EU legislation, seems to me an obvious baseline from which to scale up this fight,” she said.
Bloomberg writers Davey Alba, Jack Gillum and Margi Murphy contributed to this report.