The Popular Video Platform Allegedly Leads Child Accounts to Explicit Material Within a Few Clicks
According to a recent investigation, TikTok directed accounts registered to minors toward pornographic content within a small number of clicks.
How the Study Was Conducted
An advocacy group created test profiles using a minor's date of birth and turned on the app's "restricted mode", a setting designed to limit exposure to adult-oriented content.
The investigators found that TikTok suggested sexualized and adult-themed search terms to seven test accounts set up on clean phones with no search history.
Alarming Recommendation Features
Search phrases surfaced by the "suggested searches" feature included "extremely revealing clothing" and "very rude babes", and later escalated to terms such as "hardcore pawn [sic] clips".
For three of the accounts, the sexualized searches were suggested immediately.
Fast Track to Adult Material
After a "small number of clicks", the investigators encountered explicit material including women flashing to explicit intercourse.
The organization reported that the content was designed to evade moderation, typically by embedding the explicit clip within an otherwise innocuous image or video.
In one instance, it took just two clicks after signing in: one on the search bar and another on a suggested term.
Legal Framework
The campaign group, whose remit includes investigating technology companies' influence on societal welfare, said it carried out several rounds of testing.
The first round took place before the child safety rules under the UK's Online Safety Act came into force on 25 July, and a second round after the regulations took effect.
Concerning Discoveries
Researchers said two of the videos appeared to show someone under the age of 16 and had been reported to a child protection organization that tracks exploitative content.
The campaign group argued that TikTok was in breach of the act, which requires platforms to prevent children from encountering harmful content such as pornography.
Regulatory Response
A spokesperson for Ofcom, the UK communications regulator responsible for enforcing the law, said: "We appreciate the research behind this investigation and will review its results."
Ofcom's codes of practice for complying with the act state that platforms carrying a medium or high risk of showing harmful content must "modify their programming" to filter such material out of children's feeds.
The platform's rules forbid explicit material.
Platform Response
TikTok said that, after being notified by Global Witness, it had removed the offending material and made changes to its search suggestion feature.
"Upon learning of these assertions, we acted promptly to look into the matter, take down videos that violated our policies, and launch improvements to our recommendation system," said a official speaker.