
Apple: Policy Change (severity: critical; tags: Anti-Privacy, Policy Change)

Executive Summary

Apple announced plans to scan iCloud Photos for known child sexual abuse material (CSAM) using on-device hash matching before upload, along with Communication Safety features for Messages. The CSAM scanning proposal drew immediate backlash from privacy advocates, security researchers, and civil liberties organizations who warned it could be repurposed for government surveillance.
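The announced mechanism, comparing each photo's hash against a database of known-CSAM digests on the device before upload, can be sketched roughly as below. This is a simplified illustration, not Apple's implementation: the actual proposal paired a perceptual hash (NeuralHash) with a private set intersection protocol and threshold secret sharing, whereas this sketch substitutes an exact SHA-256 match and a plain in-memory hash set, and every name in it is hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of pre-upload, on-device hash matching.
// NOT Apple's implementation: Apple's proposal used a perceptual hash
// (NeuralHash) plus private set intersection, so the server never learns
// about non-matches and the device never learns the hash list's contents.
// SHA-256 over raw bytes is a stand-in so this example stays self-contained.

/// Known-bad digests, distributed to the device as an opaque list.
/// (Hypothetical: stands in for Apple's encrypted hash database.)
let knownBadDigests: Set<Data> = []

/// Returns true if the image's digest appears in the known-bad set.
func matchesKnownDigest(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    return Data(digest).withUnsafeBytes { _ in
        knownBadDigests.contains(Data(digest))
    }
}

/// Run the check before upload. The announced design did not block the
/// upload on a match; it attached a cryptographic "safety voucher" that
/// only became readable after a threshold number of matches.
func prepareForUpload(_ imageData: Data) -> (payload: Data, flagged: Bool) {
    (imageData, matchesKnownDigest(imageData))
}
```

The design choice at the heart of the backlash is visible even in this toy version: the device matches against an opaque digest list it cannot inspect, so nothing in the mechanism itself constrains the list to CSAM, which is precisely the repurposing risk critics raised.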
