10:23 PM / Friday, 03 October 2025

Disturbing report: TikTok recommends pornography to children

TikTok's algorithm recommends pornography and highly sexualized content to children's accounts, according to a new report from a human rights activist group.

The researchers created fake children's accounts and enabled safety settings, but still received search suggestions for sexually explicit content, the BBC reports, as relayed by Klankosova.tv.

Suggested search terms led to sexualized material, including explicit videos of penetrative sex.

The platform says it is committed to safe and age-appropriate experiences and took immediate action as soon as it learned of the problem.

In late July and early August of this year, researchers from the campaign group Global Witness created four TikTok accounts posing as 13-year-olds. They used fake birth dates and were not asked to provide any other information to confirm their identity.

They also activated the platform's "restricted mode," which TikTok says prevents users from viewing "adult or complex themes, such as... sexually suggestive content."

Without searching for anything themselves, the investigators found sexualized search terms suggested in the app's "you might like" section.

These search terms led to content of women simulating masturbation.

Other videos showed women showing their underwear in public places or exposing their breasts.

In the most extreme case, the content included explicit pornographic films with penetrative sex.

These videos were embedded among otherwise innocuous content, apparently in an effort to evade content moderation.

Global Witness is a campaign group that typically investigates how big tech influences discussions about human rights, democracy and climate change.

The researchers first stumbled upon this problem while conducting unrelated research in April of this year.

They informed TikTok, which said it had taken immediate action to resolve the problem.

But when the group repeated the exercise in late July and August of this year, it once again found the app recommending sexual content, klankosova.tv reports.

The platform says it removes nine out of ten videos that violate its guidelines before they are ever viewed.

When informed by Global Witness of its findings, TikTok said it took action to “remove content that violated our policies and initiate improvements to our search suggestions feature.”