Bluesky — Policy Change
Executive Summary
On February 24, 2025, Bluesky publicly announced a partnership with the UK-based Internet Watch Foundation (IWF) to combat child sexual abuse material (CSAM). Bluesky had become an IWF member on December 1, 2024, gaining access to the IWF's hash database of known CSAM images for automated detection. The partnership followed the September 2024 Portuguese-language CSAM moderation crisis and came amid rapid user growth past 30 million accounts.
What Happened
Bluesky announced on February 24, 2025, that it had partnered with the UK-based Internet Watch Foundation to combat child sexual abuse material on its platform. Bluesky became an IWF member on December 1, 2024, gaining access to the organization's hash database and tools for detecting known CSAM images through automated matching without directly viewing user content. The partnership followed a November 2024 announcement that Bluesky would quadruple its moderation team to 100 people due to increased harmful content accompanying rapid user growth to over 30 million accounts.
Who Is Affected
All Bluesky users are affected: the platform now employs hash-matching technology to scan content uploaded, downloaded, viewed, or shared on the network. The IWF's system creates digital fingerprints of confirmed child sexual abuse imagery, enabling automated detection of known material without moderators directly viewing user content. Any content posted or shared on Bluesky is checked against the IWF's database of illegal imagery.
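The matching flow described above can be sketched in a few lines. This is a simplified, hypothetical illustration only: production systems such as those built on IWF hash lists use perceptual hashes (e.g., Microsoft PhotoDNA or Meta's PDQ) that tolerate re-encoding and resizing, whereas this sketch substitutes a cryptographic SHA-256 hash purely to show the compare-against-a-blocklist structure. All names and sample values below are invented for illustration.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint of an image.

    SHA-256 is a stand-in here; real CSAM scanning uses perceptual
    hashing so that near-duplicates of known images still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder hash list standing in for the IWF database of
# confirmed illegal imagery (these bytes are harmless samples).
known_hashes = {fingerprint(b"known-listed-sample")}

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a listed hash and should be flagged."""
    return fingerprint(image_bytes) in known_hashes

print(check_upload(b"known-listed-sample"))  # True: fingerprint is on the list
print(check_upload(b"ordinary-photo"))       # False: no match, content passes
```

The key property this illustrates is that the platform compares fingerprints, not images: moderators never need to view user content to detect known material, though a perceptual hash is required in practice because a cryptographic hash changes completely if even one pixel differs.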
Why It Matters
This partnership marks Bluesky's adoption of the industry-standard content-scanning tools that major technology companies use to detect illegal material, and a shift toward more automated content monitoring as the decentralized platform scales. The IWF reported record levels of child sexual abuse material online, with over 291,000 web pages removed in 2024, a 5 percent increase from the previous year. As Bluesky grows past 30 million users, its implementation of automated hash matching sets a precedent for how the platform will balance user privacy against its content moderation obligations.
AI-Assisted
Event summaries are generated by Claude AI from verified sources and reviewed by humans before publication.
Sources