A lawsuit filed against Apple in Northern California seeks $1.2 billion in damages over the company's decision to abandon planned child sexual abuse material (CSAM) detection tools for iCloud.
The legal action, filed by a 27-year-old abuse survivor, seeks to represent a group of approximately 2,000 affected victims. The plaintiff says she regularly receives law enforcement notices when images of her abuse are found on Apple devices or stored in iCloud.
In 2021, Apple announced plans to implement tools that would scan iCloud photos for CSAM and notify the National Center for Missing and Exploited Children when such material was detected. However, the company later dropped these plans following intense criticism from privacy advocates who warned the technology could enable government surveillance.
The lawsuit claims that by failing to follow through on the promised safety measures, Apple has allowed harmful material to continue circulating, causing ongoing harm to victims. Under federal law, each victim could be entitled to minimum damages of $150,000 if Apple is found liable.
Apple maintains its commitment to fighting child exploitation while protecting user privacy. "We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users," said company spokesperson Fred Sainz. He highlighted existing features like Communication Safety that warn children about potentially inappropriate image sharing.
The case follows recent accusations from the UK's National Society for the Prevention of Cruelty to Children that Apple has been underreporting CSAM on its platforms. Through her attorney, the plaintiff argues that Apple's current approach leaves victims vulnerable by not implementing stronger protective measures.
The lawsuit seeks both monetary compensation for victims and changes to Apple's practices regarding CSAM detection and prevention. The outcome could influence how tech companies balance user privacy with child safety measures going forward.