Google - Policy Change

Tags: Moderate · Pro-Privacy · Policy Change

Executive Summary

The European Parliament blocked the extension of a temporary law that allowed tech companies to scan their platforms for child sexual abuse material, creating a legal gap that makes such scanning currently illegal in the EU. Google, Meta, Snap, and Microsoft criticized the decision as an "irresponsible failure" and stated they would continue voluntary scanning despite the regulatory uncertainty. Child safety experts warn this could sharply reduce abuse reports, similar to the 58% drop that occurred during a comparable legal gap in 2021.

What Happened

On April 3, 2026, a temporary European Union law permitting tech companies to scan their platforms for child sexual abuse material expired after the European Parliament blocked its extension amid privacy concerns from some lawmakers. This created a legal gap: automated scanning for such material is now illegal under EU privacy law, even though companies remain obligated to remove illegal content under the separate Digital Services Act. Google, Meta, Snap, and Microsoft issued a joint statement calling the decision an "irresponsible failure" and announced they would continue voluntary scanning despite the regulatory uncertainty.

Who Is Affected

EU-based users of major platforms including Google, Meta, Snap, and Microsoft services are affected, as these companies must now navigate conflicting legal obligations around content scanning and removal. Child safety advocates warn that vulnerable children across the EU face increased risk, pointing to a previous legal gap in 2021 during which reports of child sexual abuse material from EU accounts dropped 58% over 18 weeks. Tech companies operating in the EU now face legal uncertainty over how to comply with privacy protections and content moderation requirements simultaneously.

Why It Matters

This creates a direct conflict between user privacy protections and child safety enforcement in one of the world's largest digital markets. The regulatory gap demonstrates the ongoing tension between encrypted communications privacy and law enforcement access, with the EU Parliament prioritizing privacy concerns over extending surveillance capabilities even for detecting serious crimes. The 2021 precedent suggests this legal uncertainty could substantially reduce detection and reporting of child exploitation, potentially allowing abuse to continue undetected while permanent legislation remains under negotiation with no announced timeline.

What You Should Do

EU users concerned about privacy should understand that major tech companies have stated they will continue voluntary scanning despite the legal uncertainty, so platform monitoring has not stopped. Parents and caregivers should maintain existing online safety practices including monitoring children's online activities, using parental controls, and discussing online safety regardless of platform detection capabilities. Users who want stronger privacy protections should note that Google has simultaneously rolled out end-to-end encryption for Gmail enterprise users on mobile devices, though this feature currently requires Enterprise Plus licenses and is not available to personal Gmail users.

AI-Assisted

Event summaries are generated by Claude AI from verified sources and reviewed by humans before publication.

Source: PrivacyWire