Bluesky — Content Moderation Crisis
Executive Summary
Brazilian investigative outlet Nucleo reported that Bluesky was failing to moderate known Portuguese-language terms for child sexual abuse material (CSAM), identifying over 125 profiles sharing or selling such material. The crisis was triggered by Brazil's ban of X in late August 2024, which drove 2.5 million new Brazilian users to Bluesky within a week and created a moderation backlog, with reports spiking to 50,000 per day. In response, Bluesky expanded its Portuguese-language moderation team and hired external contractors.
What Happened
In September 2024, Brazilian investigative outlet Nucleo identified over 125 Portuguese-language profiles on Bluesky that were sharing or selling child sexual abuse material unchecked. The crisis emerged after Brazil banned X in late August 2024, driving 2.5 million new Brazilian users to Bluesky within a week and overwhelming the platform's moderation capacity, with reports spiking to 50,000 per day. On September 5, 2024, Bluesky's head of Trust and Safety reported a tenfold week-over-week increase in reports of illegal content. In response, Bluesky expanded its Portuguese-language moderation team and hired external contractors, eventually quadrupling its contract moderation workforce from 25 to 100 moderators.
Who Is Affected
All Bluesky users were potentially exposed to unmoderated illegal content, particularly Portuguese-speaking users, who made up the bulk of new signups during this period. The moderation failure affected the platform's entire user community, as the system was unable to block even known Portuguese-language keywords and terms associated with child sexual abuse material. Children depicted in the material distributed on the platform were directly harmed by the moderation backlog.
Why It Matters
This incident demonstrates how rapid platform growth can overwhelm content moderation systems, particularly for non-English languages, creating dangerous gaps in user safety. The failure to block explicit Portuguese-language terms for illegal content reveals structural weaknesses in how emerging social platforms prepare for international expansion. With only two confirmed CSAM cases in all of 2023, compared to eight in a single day during the crisis, the incident shows how quickly safety infrastructure can become inadequate when a user base grows exponentially.
What You Should Do
Report any suspicious content or accounts to Bluesky's moderation team immediately through the platform's reporting tools. Consider using Bluesky's blocking and muting features extensively to filter your experience until moderation systems stabilize. Parents and guardians should monitor children's use of the platform during this period of rapid growth and moderation challenges. Users concerned about content safety may want to temporarily limit their use of the platform or adjust content filters until the expanded moderation team is fully operational.
AI-Assisted
Event summaries are generated by Claude AI from verified sources and reviewed by humans before publication.
Sources