The Malaysian government is pushing TikTok to introduce a stronger age-verification system for users, citing concern over the platform’s “weak” enforcement of its own rules against harmful content and the impact on children’s mental health.

The move follows a meeting between Datuk Fahmi Fadzil, Malaysia’s Minister of Communications, and TikTok’s senior management in Kuala Lumpur. Fahmi said he was “very dissatisfied” with the steps TikTok had taken to curb activity such as cyberbullying, scams and deepfakes, warning that children younger than the app’s minimum age of 13 were still widely present on the platform.

“In many of my visits to schools … I noticed children as young as Year One already have TikTok accounts,” Fahmi said in a statement.

According to data gathered earlier this year by DataReportal, TikTok had 19.3 million users aged 18 and above in Malaysia. There are no reliable figures for how many of the app’s users are under 18.

Malaysia has tightened its stance on social media in recent months. Last year, the government urged social media platforms to commit to tackling cybercrimes such as scams and harmful content. It also reported that TikTok had a compliance rate of just 76% when it came to removing harmful content at the Malaysian government’s request.

Since January, Malaysia has required social media platforms with more than eight million users to apply for a licence, in an attempt to combat cyber offences.

A global trend

In recent years, TikTok has become one of the most popular social media platforms among younger users. According to data compiled by Magnet, 25% of TikTok’s global users are aged between 10 and 19.

It has also rapidly become a major platform for advertising and selling to children and teens. In July, TikTok’s e-commerce arm, TikTok Shop, recorded over 100 million product searches daily in Malaysia.

Malaysia’s move reflects widening global momentum for stricter social media regulation to protect young users. Starting in December, Australia will ban under-16s from using social media entirely. Meanwhile, France, Spain, Italy, Denmark and Greece are jointly testing a common age-verification app for similar reasons.

EU law requires parental consent for under-16s, although member states may lower that to 13. In the US, more than 20 states have enacted some form of age verification law, although many face legal challenges. Tennessee’s law is already in force, and Mississippi’s was upheld by the courts in August 2025.

While the UK does not legally require social media platforms to carry out age checks, a new law requires them to ensure a “child-appropriate” experience, for example by turning on sensitive content filters by default.

Research shows that self-declared age limits on social media can often be bypassed through disposable email addresses, fake birth dates and other methods, highlighting the need for more robust age-verification processes.