Apple Faces Lawsuit Over Decision to Abandon CSAM Detection in iCloud

Apple is being sued for its decision not to implement a system designed to scan iCloud photos for child sexual abuse material (CSAM).

The lawsuit, filed by a 27-year-old woman under a pseudonym, claims Apple’s inaction forces victims to continually relive their trauma. According to The New York Times, the suit accuses Apple of announcing “a widely touted improved design aimed at protecting children” but ultimately failing to implement measures to detect or limit CSAM.

In 2021, Apple announced plans to use digital signatures from organizations such as the National Center for Missing and Exploited Children to identify known CSAM in users’ iCloud photo libraries. However, the company shelved the system after backlash from security and privacy advocates, who warned it could open a backdoor for government surveillance.

The plaintiff alleges she was molested as an infant by a relative who distributed images of the abuse online. She says she still receives law enforcement notices almost daily about individuals being charged with possessing those images.

Attorney James Marsh, who represents the plaintiff, said a group of roughly 2,680 victims could be eligible for compensation if the case succeeds.

A spokesperson for Apple told The Times that the company is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”

This is not the first time Apple has faced scrutiny over this issue. In August, a 9-year-old girl and her guardian filed a lawsuit against the company, accusing it of failing to address CSAM on iCloud.


Control F5 Team
Blog Editor