The platform will block access to filters that alter facial features, such as "Bold Glamour," which smooths skin, plumps lips, and reshapes eyes. However, filters used for comedic purposes will still be accessible to teens.
The move follows a report by Internet Matters, which found that these filters contribute to a "distorted worldview" by normalizing perfected images. Many teenagers reported feeling pressure to conform to these altered appearances.
The effectiveness of these restrictions will rely on accurate age verification. To that end, TikTok plans to roll out automated systems that use machine learning to detect users misrepresenting their age. The platform has recently been removing approximately 20 million accounts every quarter for violating its age policy.
TikTok's Safety and Well-being Public Policy Head, Dr. Nikki Soo, confirmed the introduction of the new restrictions, stating that the platform aims to "reduce social pressure on young users" and promote healthier online habits.
The policy changes coincide with tighter regulations in the UK under the upcoming Online Safety Act, which requires social media platforms to implement "highly effective" age checks.
Richard Collard, head of policy for child safety online at the NSPCC, welcomed TikTok's move but urged swift enforcement. "This is a positive step, but it's just the tip of the iceberg. Other social media platforms must follow suit and implement robust age-checking systems," he stated.
Chloe Setter said that TikTok would continue to refine its safety measures as it faces increased scrutiny from regulators. The platform also plans to tighten its restrictions on under-13 users, a demographic that has historically been difficult to monitor.
Meanwhile, other platforms are taking similar initiatives. Roblox recently restricted younger users from accessing violent or explicit content. Instagram, owned by Meta, has launched "teen accounts" that allow parents to monitor and control their children's activity on the app.