Maldita Tecnología

TikTok's inaction regarding accounts using AI to sell child pornography on Telegram

In short:
  • TikTok’s rules prohibit “accounts focused on AI images of youth in sexualized poses”, yet after we reported such accounts the platform kept 93% of them up and running
  • The 15 reported accounts accumulate almost 300,000 followers, and their 3,900 videos have over 2 million likes
  • Psychologists warn this AI-generated content can “reinforce consumption patterns in people with a harmful interest in minors” and “end up affecting actual minors”

An investigation by Fundación Maldita.es has found dozens of TikTok accounts that post sexualizing AI-generated videos of underage-looking girls and often direct users to Telegram, where they are offered child pornography. In their own bios, some of these accounts openly brand their content as “delicious-looking high school girls” or “junior models”. In the videos, girls often appear wearing tight clothes that hint at their genitals or nipples, or wearing suggestive school uniforms. Even more subtle videos like those of young girls licking ice cream are full of crude sexual comments. 

We flagged to TikTok a selection of those accounts: a group of 15 that accumulates almost 300,000 followers, whose 3,900 videos have over 2 million likes altogether. The platform’s rules prohibit “accounts focused on AI images of youth in sexualized poses or facial expressions”, but TikTok nevertheless found “no violation” in 14 of the 15 accounts reported (93%), and communicated that it had restricted (but not deleted) only one of them.

Read the full investigation: "PREDATORS ON TIKTOK: A GOLD MINE FOR PEDOPHILES"

Videos that should have never been on TikTok, according to its own rules

TikTok’s community guidelines explicitly ban “accounts focused on AI images of youth in clothing suited for adults, or sexualized poses or facial expressions” and AI-generated “Sexualized, fetishized, or victimizing depictions”, but the size and impact of these accounts clearly point to either a lack of enforcement or a reliance on automated moderation incapable of identifying clear but subtle elements of minor sexualization.

We flagged 15 accounts and 60 videos to TikTok using the platform’s own reporting mechanism, open to all users. We selected the option to alert them of “sexually suggestive behaviour by youth”, for lack of a better and more detailed alternative. For 14 of the accounts (93%), TikTok responded that they did not violate its rules; in one case it said it had restricted the account. We appealed TikTok’s decision for each refusal and explained in detail why the accounts do violate its policies, but the platform reiterated its initial decision within a short time, exactly 30 minutes after the appeal in every single case.

For the AI-generated videos of sexualized minors, TikTok said 46 of them (76.67%) did not violate its policies and removed or restricted just 14 (23.33%). After we appealed those initial refusals, TikTok removed three more videos and restricted another one. You can see below some screenshots from videos that TikTok did not consider “sexualized” and judge for yourself:

Are those videos illegal? It’s complicated

The accounts and the videos we reported were clearly against TikTok’s community guidelines, but are they illegal content? The current legislation in countries like Spain might be ambiguous about AI-generated sexualized content involving children, first of all because the minors appearing in the videos do not actually exist. But AI legal expert Marcos Judel says it might still be considered child pornography, which is illegal to produce and distribute: “Not every image of a teenager in a swimsuit is child porn, but a hyper-focused image of body parts generated with sexual intent can be”.

Additionally, TikTok does have a legal obligation under the European Union Digital Services Act (DSA) to effectively mitigate systemic risks that stem from their platform, particularly those affecting “the rights of the child enshrined” or having negative effects on “the protection of minors”. Those effects are precisely what experts warn about: Psychologist Mamen Bueno says this kind of AI-generated content can “reinforce consumption patterns in people with a harmful interest in minors” and she gets even more specific: “The dissemination of sexualized content involving ‘fictional minors’ makes it socially difficult to distinguish between what is permissible and what is ethically unacceptable, fueling a demand that can later be transferred to real minors.” 

This specific risk involving AI-generated content depicting child sexual abuse material has already been documented by the European Commission and the European Board for Digital Services in their first report on prominent and recurrent systemic risks under the DSA, based on evidence provided by CSOs and by the platforms themselves.

Effective risk mitigation is all the more relevant in this case. Many of the actors involved in recruiting customers for child pornography on TikTok rely not only on the basic infrastructure of the platform, but also benefit from its algorithmic amplification and from TikTok’s own creation tools: many of the videos are generated in-platform using TikTok AI Alive, and some of the accounts are part of TikTok’s subscription program.

Help us keep doing investigations like this one to make the digital ecosystem safer

Final destination: the sale of real child pornography on Telegram and others

As we reviewed the crude comments below the videos, one thing caught our attention: many of them promoted other platforms. After following those redirections, for instance to private Telegram chats, we were offered real child pornography for purchase. Some TikTok accounts had Telegram links directly in their bios where, after we made contact, they asked us what particular kind of “material” we were interested in.

Some of the accounts responded to our direct messages on TikTok with links to external websites that sold AI-generated videos and images sexualizing minors, with prices ranging from 50 to 150 euros. At least one account, with over 13,000 followers, directed us to make a specific request and pay through its PayPal account. Another used Patreon, a membership-management platform already mentioned in a BBC investigation into the sale of videos allegedly depicting child abuse. No transactions were ever made.

The problem is clearly not confined to TikTok, Telegram or the online payment services mentioned above. Many of the accounts shared links claiming to host similar content on services such as YouTube, Google Drive, or Twitter (now X). Fundación Maldita.es has reported the Telegram accounts and websites that offered child pornography to the Spanish police so they can be investigated.

