Apple Faces $1.2B Lawsuit Over Abandoned Child Safety Scanning System
A lawsuit seeking $1.2 billion in damages has been filed against Apple for abandoning planned CSAM detection tools on iCloud. The case, brought by an abuse survivor representing thousands of victims, challenges Apple's balance between user privacy and child protection measures.
AI-Generated Fake Nudes Crisis Exposes Legal System's Shortcomings
AI tools that generate realistic fake nude images have proliferated into a widespread problem, with one such website receiving 14 million monthly visits. Victims and experts warn that current laws are inadequate and that police struggle to act on cases, while the UK government has promised new legislation in 2024.