Apple stalls CSAM auto-scan on devices after ‘feedback’ from everyone on Earth

Apple said it intends to delay the introduction of its plan to commandeer customers’ own devices to scan their iCloud-bound photos for illegal child exploitation imagery, a concession to the broad backlash the initiative provoked. “Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” the company said in a statement posted to its child safety webpage.

This follows earlier coverage of Apple’s proposed CSAM monitoring, which looks suspiciously like an end-run around the corporate surveillance model the company has already implemented for China. It seems very likely that the tools and services are already built into iOS 15 regardless of whether they are “turned on”. If only we could view the code.
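
To make the concern concrete, here is a minimal sketch of what an on-device, pre-upload check against a set of known-bad image digests could look like. This is purely illustrative and is not Apple’s system: the announced design relies on a perceptual NeuralHash plus threshold secret sharing and server-side safety vouchers, none of which is modeled here. The SHA-256 stand-in, the `knownBadDigests` set, and the function names are all hypothetical.

```swift
import CryptoKit
import Foundation

// SHA-256 stand-in for a perceptual hash; hex-encode the digest bytes.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical on-device blocklist of known-bad digests.
// (The single entry is the SHA-256 of empty data, used only for the demo below.)
let knownBadDigests: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// The kind of check that would run on-device before a photo is uploaded to iCloud.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    knownBadDigests.contains(digest(of: imageData))
}

print(shouldFlagBeforeUpload(Data()))  // true: empty data matches the demo entry
```

The point of the sketch is simply that the matching logic is small and self-contained; whether such a check ships in the OS and is merely toggled off is exactly the question we cannot answer without seeing the code.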

Link: Apple stalls CSAM auto-scan on devices after ‘feedback’ from everyone on Earth
via www.theregister.com