
Bluesky | Policy Change

Tags: Minor, Pro-Privacy, Policy Change

Executive Summary

Bluesky published its first annual moderation report covering 2023, documenting the growth from a small beta to over 3 million registered accounts. The report detailed the hiring of a full-time moderation team, the launch of community moderation features, and the development of internal Trust and Safety infrastructure built from scratch.

What Happened

On January 16, 2024, Bluesky published its first annual moderation report, covering 2023. The report documented the platform's growth from a small beta to over 3 million registered accounts and detailed its moderation infrastructure: a full-time moderation team hired during the year, newly launched community moderation features, and 358,165 individual reports reviewed by human moderators rather than automated systems. The company described the report as a first step toward full transparency, with plans to publish more granular data in the future.

Who Is Affected

All users of the Bluesky platform are affected by the transparency practices outlined in the report. Approximately 5.6 percent of active users filed at least one report in 2023, while 3.4 percent of active users received at least one report on their account or content.

Why It Matters

The report represents an early effort at transparency in content moderation from a platform built on decentralized infrastructure. Bluesky's commitment to human review of all reports, and its direct employment of moderators rather than reliance on third-party vendors, departs from common industry practice. Publishing moderation data lets users and researchers see how community guidelines are enforced and provides accountability for platform governance decisions.

What You Should Do

Users can review the published moderation report to understand how Bluesky enforces its Community Guidelines and Terms of Service. Those interested in platform accountability can access the raw moderation data that Bluesky has made available for independent analysis. Users should familiarize themselves with the reporting mechanisms available in the app to address content that violates community standards.

AI-Assisted

Event summaries are generated by Claude AI from verified sources and reviewed by humans before publication.

Source: Bluesky | PrivacyWire