
Bluesky | Policy Change | Minor | Pro-Privacy

Executive Summary

Bluesky published its 2024 moderation report, revealing a 17x increase in moderation reports (6.48 million, up from 358,000 in 2023) as users grew from 2.89 million to 25.94 million. Moderators removed 66,308 accounts and automated systems removed another 35,842. The platform applied 5.5 million content labels and fielded 238 law-enforcement requests from Germany, the U.S., Brazil, and Japan, complying with 146 of them.

What Happened

In its report, published January 17, 2025, Bluesky disclosed that user reports rose roughly 17-fold, from 358,000 in 2023 to 6.48 million in 2024, as the platform grew from 2.89 million to 25.94 million users. To keep pace, the company expanded its moderation team to roughly 100 staff members working 24/7 and, in September 2024, began offering them psychological counseling. Human moderators removed 66,308 accounts, automated systems removed another 35,842, and the platform applied 5.5 million content labels over the course of the year. Bluesky also received 238 law enforcement requests from Germany, the U.S., Brazil, and Japan, complying with 146 of them.

Who Is Affected

All Bluesky users are affected by the moderation practices detailed in the report, with particular impact on users in Brazil, Germany, the U.S., and Japan, where law enforcement made data requests. The report noted that harassment, trolling, and intolerance generated the largest number of user reports. Over 4,000 users running their own Personal Data Servers are also part of the platform's ecosystem.

Why It Matters

The report provides transparency into how a rapidly growing social platform handles content moderation and responds to law enforcement requests for user data. The 61 percent compliance rate with law enforcement requests and the removal of over 100,000 accounts demonstrate the scale at which user data and content are being evaluated by both human moderators and automated systems. Starting in 2025, Bluesky will accept moderation reports directly in-app and later support appeals, changing how users interact with the moderation process.
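The derived figures above follow directly from the numbers in the report; a quick arithmetic check (a minimal sketch, using only values stated in the report) confirms them:

```python
# Figures from Bluesky's 2024 moderation report
law_enforcement_requests = 238
requests_complied = 146
accounts_removed_by_moderators = 66_308
accounts_removed_automated = 35_842

# Compliance rate: 146 of 238 requests honored
compliance_rate = requests_complied / law_enforcement_requests * 100
print(f"Compliance rate: {compliance_rate:.0f}%")  # prints "Compliance rate: 61%"

# Total account removals across human and automated moderation
total_removed = accounts_removed_by_moderators + accounts_removed_automated
print(f"Total accounts removed: {total_removed:,}")  # prints "Total accounts removed: 102,150"
```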

What You Should Do

Review Bluesky's community guidelines to understand what content may be flagged or removed and how moderation decisions are made. When Bluesky launches in-app reporting in 2025, use that feature to track the status of your reports and any actions taken on content you've flagged. Watch for the draft of updated Guidelines that Bluesky plans to share in Q1 2025 to understand how policies may change.

AI-Assisted

Event summaries are generated by Claude AI from verified sources and reviewed by humans before publication.

Bluesky | PrivacyWire