TikTok bans beauty filters for under-18s, implements age verification
TikTok is banning beauty filters for users under 18 to protect mental health, while also introducing AI-based age verification to block access for those under 13. These measures are part of broader efforts to comply with stricter regulations and ensure user safety.
- Tech
- Agencies and A News
- Published Date: 10:55 | 27 November 2024
- Modified Date: 10:58 | 27 November 2024
TikTok is implementing measures to protect mental health by banning beauty filters for young users and introducing AI-based age verification systems to block access for users under 13. These steps are part of a series of actions to comply with stricter regulations.
In response to rising anxiety and low self-esteem among young people, TikTok is introducing new limits on beauty filters. Under the new rules, users under 18 will not be able to use filters that enlarge eyes, plump lips, or alter skin tones. However, fun filters, such as those adding rabbit ears or dog noses, will not be affected by these restrictions.
Beauty filters have been linked to pressure to achieve a flawless physical appearance, especially among young girls, and to mental health problems in which users come to see their unfiltered faces as unattractive.
New steps to enforce age limits
TikTok is also taking measures to prevent users under 13 from accessing the platform. The company is testing AI-based age verification systems and will implement automated systems to detect false age declarations by the end of the year. Users who are wrongly blocked will be able to appeal.
These steps are part of a broader effort by social media platforms to increase safety ahead of strict regulations such as the UK's Online Safety Act, whose child-safety duties are due to take effect in 2025. Heavy fines will be imposed for non-compliance.
Experts are calling for more transparency and quick action
Andy Burrows, CEO of the Molly Rose Foundation, said the changes aim to comply with regulations and called on TikTok to be transparent about how effective its age verification measures actually are. The NSPCC, while describing the measures as "encouraging," argues that more needs to be done.
Beyond TikTok, other platforms are also tightening safety measures: Roblox is limiting young users' access to violent content, while Instagram has introduced new features allowing parents to monitor their children's accounts.