Popular video content platform TikTok is laying off global staff at its trust and safety unit as part of a restructuring process. The unit handles content moderation and enforces the platform's community guidelines.
It was reported that the platform's operations head, who also oversees the unit, sent a memo to staff on Thursday notifying them of the development. The layoffs were believed to commence the same day for teams in Asia, Europe, the Middle East, and Africa.
Technext reached out to TikTok about the layoffs, but the company declined to comment.
“We do not have a comment at this stage. Thanks for reaching out. Appreciate it,” said Keagile Makgoba, Head of Communications for Sub-Saharan Africa at TikTok.
The safety teams work to protect TikTok's community and monitor the platform for safety, allowing users to explore entertaining content and share their creativity. Their activities cut across creating and updating community guidelines, detecting potential harms, maintaining various TikTok policies, and focusing on employee well-being.
In October last year, the platform also laid off hundreds of employees from its global workforce, including about 400 staff in Malaysia, as it shifted towards greater use of AI in content moderation. Most of the affected employees were in TikTok's content moderation operations.
According to TikTok, its global trust and safety team consists of 40,000 professionals.

The development comes at a time when the fate of its U.S. subsidiary remains undecided.
Recall that on January 20, United States President Donald Trump signed an executive order delaying enforcement of TikTok's ban for 75 days. The order, which restored TikTok's service in the country, came a day after the video content platform resumed operations in the U.S., having shut down over a "sell or ban" law.
The platform stopped working for users in the U.S. shortly before the law banning it on national security grounds took effect on January 19.
TikTok has been battling for months with a bill signed into law by President Biden in April 2024. The law mandates ByteDance to divest its U.S. operations to another owner by January 19 or face a ban that would halt its download on app stores.
Similar Read: “I don’t have plans to acquire TikTok”- Elon Musk responds to purchase claims.
TikTok’s content moderation
In a moderation case last January, TikTok CEO Shou Chew testified before Congress alongside Meta chief Mark Zuckerberg and other tech and media heads in a hearing where lawmakers accused the companies of failing to protect children from escalating threats of sexual predation on their platforms.
The hearing was an effort by lawmakers to address concerns from parents and mental health experts that social media companies put profits over ensuring their platforms do not harm children.
TikTok's CEO had mentioned then that the company would spend more than $2 billion on trust and safety efforts.

In the Community Guidelines Enforcement Report, the company also pledged its commitment to strengthening content moderation systems to safeguard its diverse community.
“As TikTok continues to invest in cutting-edge moderation technologies, its commitment to transparency and platform safety remains at the forefront, ensuring a secure environment for its diverse user base across Nigeria and globally”, the report reads.
The platform removed over 2 million videos posted by users in Nigeria between July and September 2024, an action that was in line with its community guidelines and reaffirmed its dedication to online safety.
According to its Community Guidelines Enforcement Report for Q3 2024, which details the platform's proactive approach to content moderation, TikTok indicated that 99.1 per cent of those videos were removed within 24 hours of posting.
Its proactive detection rate has also improved, reaching 98.2 per cent globally. This shows that TikTok is becoming more efficient at addressing harmful content before users encounter it.

In another African country, TikTok indicated that it removed over 360,000 videos uploaded by users in Kenya in Q2 2024. The platform said the removed content constituted 0.3 per cent of the total volume of videos uploaded in the country within the period.
In June 2024, the platform removed over 178 million videos globally, of which 144 million were removed through automation. Likewise, between July and September 2024, the platform removed over 147 million videos, of which 118 million were removed through automation.
These technical advancements reduce the volume of content that moderators must review, helping limit human exposure to violent videos.