TikTok removed 7,097,236 videos from its platform in Türkiye in the first half of 2024 for violating community guidelines, according to the platform's report. Of these, 98.67% were removed before being viewed by users, and 88.66% were deleted within 24 hours.
Emir Gelen, TikTok's Director of Public Policy for META (Türkiye, Middle East, Africa, Pakistan, South Asia), emphasized the need for a multi-faceted strategy to ensure safety on a platform where more than 1 billion pieces of content are shared every month. TikTok combines AI-based automatic detection of rule violations with manual moderation carried out by a team of more than 40,000 people.
Gelen also reported that over 427 million TikTok accounts were removed globally between January and July 2024, including 379.7 million fake accounts and 41.8 million accounts suspected of belonging to users under 13. He added that TikTok is developing automated systems to detect false age claims.
TikTok's proactive content moderation rate reached a record high of 98.2%, while the share of videos removed for rule violations rose from 62% to 80% compared with the previous year. In addition, the rate at which removed videos were re-uploaded fell by half.
Gelen also highlighted TikTok's efforts to enhance safety, such as testing advanced AI-based age verification systems, and emphasized the importance of the "Family Pairing" feature, which allows parents to control their children's use of the platform.
Looking ahead to 2025, TikTok plans to continue prioritizing user safety, especially for children, and strengthening its community-focused approach while redefining entertainment, e-commerce, and talent discovery.