Australia’s eSafety Commissioner, Julie Inman Grant, has fined X (formerly Twitter) A$610,500 for serious non-compliance with a transparency notice concerning child sex abuse material. Inman Grant had issued transparency notices to several online service providers, including Google and X, in February under the Online Safety Act 2021. While Google was warned for providing generic responses, X’s non-compliance was deemed more severe. X failed to adequately disclose key information: its response time to reports of child sexual exploitation material, its measures for detecting such material in live streams, the tools and technologies it uses for detection, and the teams and resources it dedicates to safety.
A recent report by the Stanford Internet Observatory revealed that X (then Twitter) and Instagram are popular platforms for advertising the sale of self-generated child sex abuse material. The researchers identified 405 accounts on Instagram and 128 on Twitter advertising such material. The report also found that Instagram’s recommendation algorithms actively promote child sex abuse material once a user has accessed it, and that some platforms implicitly tolerate the trading of such material in private channels. X was even found to allow the public posting of known child sex abuse material.
The creation and trading of child sex abuse material is widely regarded as one of the most harmful abuses of online services. Major platforms, including X, have policies banning such material from their public services, and Elon Musk, who took over X last year, declared removing child exploitation material a priority. Moderating this content is challenging and cannot rely solely on user reporting: platforms must work out not only whether the person depicted in the content is a minor, but also whether the person sharing it is. Musk’s decision to fire content moderation employees may have hindered X’s ability to respond both to harmful material and to compliance notices.
Governments are increasingly demanding accountability from social media platforms over content, data privacy, and child protection, and non-compliance now attracts significant fines in many jurisdictions. For example, X (then Twitter) was fined US$150 million last year for misleadingly using email addresses and phone numbers for targeted advertising. This year, Meta, Facebook’s parent company, was fined €1.2 billion for mishandling user information. While the A$610,500 fine imposed on X may seem small in comparison, it compounds the reputational damage the company has already suffered from poor content moderation and the reinstatement of banned accounts.
X has 28 days to pay the fine, after which eSafety can initiate civil penalty proceedings. Failure to pay could result in a cumulative fine of up to A$780,000 per day, backdated to the initial non-compliance in March. The fine’s impact extends beyond its financial cost: it could prompt advertisers to withdraw their ads or encourage other governments to take similar action. India’s Ministry of Electronics and IT recently sent notices to X, YouTube, and Telegram instructing them to remove child sex abuse material or face heavy fines and penalties. To address the issue and regain trust, X will need to drastically improve its approach to content moderation, particularly where minors are harmed.