Australia implemented one of the world's most sweeping digital safety laws on Wednesday, banning minors under 16 from holding accounts on social networks such as TikTok, Instagram, Facebook, X, YouTube, Snapchat, Reddit, Threads, Twitch, and Kick.
The measure, part of the Online Safety Amendment (Social Media Minimum Age) Act passed in 2024, requires platforms to delete existing accounts belonging to minors, prevent new sign-ups, and apply age verification using technologies such as official identification, facial recognition, or age inference systems. Companies that fail to comply will face fines of up to 49.5 million Australian dollars.
The Labor government of Prime Minister Anthony Albanese defended the regulation, arguing that “it will reduce young people's exposure to harmful content” and mitigate the impact of algorithms that encourage compulsive use.

According to a 2025 government study, 96% of minors between 10 and 15 years old use social networks, and seven out of ten have seen content described as “violent, misogynistic, or related to suicide and eating disorders.” In addition, one in seven reported grooming experiences, and more than half are believed to have suffered cyberbullying.
Despite the official arguments, the law has drawn sharp criticism from experts in technology, digital safety, and privacy. One of the main concerns is the limited scope of the measure.
Platforms such as Discord and Roblox, widely used by minors and where safety risks have also been reported, were left out of the ban because the government did not classify them as social networks. For many analysts, this exclusion creates a significant protection gap.