TikTok to Ban Teenagers from Using Beauty Filters Amid Growing Mental Health Concerns



LONDON: TikTok has announced that it will soon restrict teenagers under the age of 18 from using beauty filters that alter their facial features. The move comes as the social media platform responds to growing concerns about the impact of these filters on the mental health and self-esteem of young users. The restrictions will target filters such as "Bold Glamour," which smooth skin, reshape facial features, and produce effects that are often hard to distinguish from real life. However, filters designed for comedic purposes, such as adding animal ears or exaggerated facial features, will remain accessible to teens.

The decision follows a report by Internet Matters, a children's online safety non-profit, which found that beauty filters contribute to a "distorted worldview" by normalizing unrealistic beauty standards. Many teenage users, especially girls, reported feeling pressure to conform to these altered appearances, and some admitted that after prolonged use of the filters they began to view their unfiltered faces as "ugly."

TikTok's Response to Concerns: Dr. Nikki Soo, TikTok's Safety and Well-being Public Policy Lead for Europe, confirmed the new policy, stating that the platform aims to "reduce the social pressure on young users" and encourage healthier online habits. The platform will also introduce automated systems that use machine learning to detect underage users who misrepresent their age. TikTok currently removes around 20 million accounts every quarter for violating its minimum age policy, and the effectiveness of the new checks will depend on how accurately they identify such users.

Global Implications and New Regulations: This policy change aligns with upcoming regulations in the UK under the Online Safety Act, which requires social media platforms to implement more effective age checks. Ofcom, the UK’s communications regulator, had previously raised concerns about the effectiveness of TikTok’s age verification measures. Richard Collard, associate head of policy for child safety online at the NSPCC, welcomed the move but stressed that more must be done. "This is a positive step, but other social media platforms must follow suit and implement strong age-checking systems," he stated.

A Global Push for Safer Social Media: TikTok's decision is part of a broader trend of increasing scrutiny on social media platforms to protect younger users. Other platforms, such as Instagram and Roblox, have already introduced measures to shield underage users from harmful content. As TikTok enforces these new rules, the company is expected to face further scrutiny from regulators. The platform will continue refining its safety measures, with a focus on providing a safer, more age-appropriate online experience for all users.
