
08 Dec 2022

Apple is reportedly abandoning its controversial plan to scan users' photos stored in iCloud for child sexual abuse material, or CSAM, amid an ongoing privacy push.

The child-safety tools, announced in August 2021, were meant to flag illicit content while preserving user privacy. But the plan drew widespread criticism from digital rights groups, who argued that the scanning capability was ripe for abuse.
