Introduction
On August 22, 2025, news broke that TikTok — owned by ByteDance — is planning to lay off hundreds of employees from its London-based Trust & Safety team. This move is the latest chapter in a sweeping global restructuring, with TikTok pivoting heavily toward AI-driven content moderation. Although automation promises scalability and efficiency, critics warn that this approach may compromise user safety, transparency, and ethical accountability.
The What: Details of the Layoffs
- TikTok is preparing to phase out moderation and quality assurance tasks at its London site. The Communication Workers Union (CWU) estimates that around 300 staff in the Trust & Safety department are expected to be affected.
- An internal email informed employees of the shifts, emphasizing that moderation would no longer be carried out at the London office. This marks the start of a collective consultation process.
- TikTok's broader regional restructuring includes centralizing moderation operations in Lisbon and Dublin, while shutting down its Berlin trust and safety operation.
The Online Safety Act and Regulatory Imperatives
The UK’s Online Safety Act has recently come into effect, imposing strict requirements on tech platforms to prevent harmful and illegal content and implement age checks.
- Companies failing to adhere to these rules risk punitive measures: fines of up to £18 million or 10% of global turnover, whichever is greater. (Financial Times, The Sun)
- TikTok has already rolled out AI-based age assurance systems, designed to infer a user's age based on their behavior and interactions. These systems remain under review by the regulator Ofcom, which has not yet endorsed them. (Financial Times, The Sun, City AM)
The AI Play: Efficiency vs. Accountability
TikTok’s Perspective
- TikTok claims that over 85% of content removed for violating community guidelines is flagged and taken down by automation. (City AM, The Guardian)
- In its global filings, the company highlighted strong financials: 38% year-on-year revenue growth in Europe, reaching $6.3 billion in 2024, and a significant reduction in pre-tax losses from $1.4 billion to $485 million. (Financial Times, The Sun)
- TikTok argues this reorganization is meant to streamline operations, harness technological advances, and "strengthen our global operating model for Trust and Safety."
Critics’ View: Risk and Displacement
- The CWU cautioned that TikTok's focus on AI might be less about innovation and more about offshoring jobs to lower-cost regions, arguing: "AI makes them sound smart … but they're actually just going to offshore it." (Financial Times, The Sun)
- On The Guardian's business live feed, union representatives warned that the cuts could endanger millions of UK TikTok users, raising concerns over "hastily developed, immature AI alternatives." (The Guardian)
- The situation echoes in Germany, where TikTok replaced nearly 40% of staff in its Berlin trust & safety team, about 150 employees, with AI and outsourced contractors. The move led to strikes, driven by concerns over misclassification errors and the welfare of outsourced workers. (The Guardian)
The Bureau of Investigative Journalism (TBIJ) previously documented that in October 2024, at least 125 UK moderators were told their roles were at risk, part of a longer pattern of increasing reliance on automated systems. (TBIJ)
TikTok’s Global Context: A History of Moderation Restructuring
Historical Restructuring Trends
- In October 2024, TikTok laid off hundreds of content moderation staff globally, particularly in Malaysia (up to 500 roles), as it shifted toward automation. (Reuters)
- The company claimed that its investment in trust & safety initiatives exceeded $2 billion, leveraging AI to remove up to 80% of guideline-violating content automatically. (TBIJ, Silicon UK)
- Additional global restructuring occurred in February 2025, when TikTok initiated layoffs across its trust & safety teams in Asia, Europe, the Middle East, and Africa, amid pressure from a US mandate requiring ByteDance to divest or face a ban.