An investigation found accounts with massive followings using TikTok as a bridge to illegal child abuse material
An investigation raised alarms about TikTok's role as a possible gateway to child sexual abuse content, both real and generated with artificial intelligence. The report warns that the social network not only hosts sexualized videos of minors but also facilitates access to illegal material that is later traded on other platforms.
The investigation was conducted by the Spanish outlet Maldita.es and once again puts the platforms' responsibility for protecting children at the center of the debate.
The detected accounts and the reach of the content
According to the investigation, 40 accounts were identified that together exceed 1.5 million followers. From those profiles, thousands of videos featuring sexualized minors were published.
Of the total analyzed, more than 5,200 pieces of content were created with AI, while another 3,600 involve real minors. In many cases, the videos retain TikTok's watermark, which makes it possible to trace the original profile from which they were taken.
Repeated patterns and sexualized aesthetics
In the artificially generated videos, scenes with school uniforms, bikinis, or tight clothing recur. The shots are often focused on intimate areas, with a clearly sexualized aesthetic.
This type of content is published systematically, without the platform acting quickly to stop it.
Comments, links, and the bridge to Telegram
The report notes that comments function as a contact channel. There, terms such as "buy," "trade," or "tlg" appear, common codes used to redirect to Telegram groups.
In those spaces, illegal material is exchanged or sold, both real and generated with AI. Several profiles also featured direct links to external sites.
To verify the mechanism, the researchers contacted 14 accounts promoted on Telegram. Eleven replied with an automatic menu of "packs," and seven sent images of child sexual abuse without being asked.
The role of the algorithm and the lack of response
The investigation also exposed how TikTok's algorithm behaves. After interacting with just a few videos of this type, a test account began to receive more similar recommendations in the For You feed, in search results, and in profile suggestions.
Despite the reports, 14 of the 15 flagged accounts remained active. Of 60 reported videos, only seven were removed.