Apple sued for not implementing 'NeuralHash' CSAM detection in iCloud.

It’s been two years since Apple dropped its plan to detect child abuse imagery using client-side iCloud scanning.

Now, the New York Times reports on a class-action lawsuit filed in California alleging that Apple harmed a group of 2,680 victims by failing to "implement those designs or take any measures to detect and limit" CSAM, for instance by using Microsoft's PhotoDNA.

Under the law, victims of child sexual abuse are entitled to a minimum of $150,000 in damages, which means the total award...could exceed $1.2 billion.