Tim Hardwick
Apple is facing a $1.2 billion damages lawsuit over its decision to scrap plans to scan iCloud Photos for child sexual abuse material (CSAM), according to a report from The New York Times.
The lawsuit, filed in Northern California on Saturday, represents a potential class of 2,680 victims and alleges that Apple's failure to implement previously announced child safety tools allowed harmful content to continue to circulate, causing ongoing harm to victims.
In 2021, Apple announced plans to bring CSAM detection to iCloud Photos, along with other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and political groups who argued that the technology could create potential loopholes for government surveillance. Apple subsequently delayed and then abandoned the initiative.
In explaining its decision at the time, Apple said that implementing universal scanning of users’ personal iCloud storage would introduce serious security vulnerabilities that could potentially be exploited by attackers. Apple also expressed concern that such a system could set a problematic precedent, since once a content scanning infrastructure exists for one purpose, it could face pressure to expand to broader surveillance applications across different types of content and messaging platforms, including those that use encryption.
The lead plaintiff, who filed under a pseudonym, said she continues to receive notifications from law enforcement about individuals accused of storing abuse images of her taken when she was an infant. The lawsuit alleges that Apple's failure to implement the safety measures it had announced forces victims to repeatedly relive their trauma.
In response to the lawsuit, Apple spokesman Fred Sainz emphasized the company's commitment to combating child exploitation, saying that Apple is “urgently and aggressively innovating to combat these crimes without compromising the safety and privacy of all of our users.” Apple pointed to existing features like Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing efforts to protect children.