Apple is accused of “privacy-washing” its responsibilities regarding child sexual abuse material (CSAM) on its iCloud platform. The legal complaint alleges that Apple failed to implement adequate measures to detect, report, and remove CSAM, effectively allowing the material to proliferate. Source.
The lawsuit was filed in the US District Court for the Northern District of California on behalf of a nine-year-old plaintiff identified only as “Jane Doe.” Source
The lawsuit seeks over $5 million in damages for each member of the proposed class, asserting that Apple neglected to adopt industry standards for CSAM detection. It also demands comprehensive protection measures, accessible reporting mechanisms, and quarterly third-party monitoring to prevent further incidents. Source.
Apple allowed the storage and distribution of child sexual abuse material on its iMessage and iCloud products under the pretense of privacy protections, according to a proposed class action. https://t.co/sm0rHvlkma
— Bloomberg Law (@BLaw) August 15, 2024
Details from the Legal Complaint and Apple’s Responsibility
The complaint accuses Apple of violating sex trafficking and consumer protection laws, as well as breach of contract, misrepresentation, and unjust enrichment. Source. Apple’s earlier attempt to introduce stricter CSAM detection tools was abandoned in 2022 over concerns about user surveillance; critics argue that in prioritizing these “privacy features,” Apple neglected children’s safety. Source.
“The system meant to protect Apple’s youngest users…[is] an afterthought.” Source
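For context, the detection approach Apple announced in 2021 and shelved in 2022 relied on matching image fingerprints against databases of known CSAM compiled by child-safety organizations, the same general idea behind industry tools such as Microsoft’s PhotoDNA. The Swift sketch below is only a simplified illustration of that hash-matching concept, not Apple’s implementation: the fingerprint list and file path are hypothetical, and it uses a cryptographic SHA-256 hash where real systems use perceptual hashes that tolerate resizing and re-encoding.

```swift
import Foundation
import CryptoKit

// Hypothetical database of fingerprints supplied by child-safety organizations.
// Real databases hold millions of vetted perceptual hashes; this entry is a placeholder.
let knownAbuseFingerprints: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Returns true if the image's fingerprint matches a known entry.
/// SHA-256 is a stand-in here so the sketch stays runnable; production systems
/// use perceptual hashing so near-duplicates of known material still match.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
    return knownAbuseFingerprints.contains(fingerprint)
}

// Example: check a file before it is uploaded to cloud storage.
if let data = FileManager.default.contents(atPath: "/tmp/upload.jpg"),
   matchesKnownMaterial(data) {
    print("Match against known-material database — flag for human review.")
}
```

The point of contention in the lawsuit is not the feasibility of this kind of matching, which other cloud providers already use, but Apple’s decision not to deploy it on iCloud.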
Furthermore, Apple’s policies may set a dangerous precedent for other tech companies. Reports indicate that the App Store continues to host apps used for sextortion and deepfake pornography, and that Apple does not enforce its own Developer Guidelines, leaving teens vulnerable to inappropriate content. Source.
Support and Criticism from Experts
Apple’s approach has faced substantial criticism. Sarah Gardner, founder and chief executive of Heat Initiative, stated, “Apple could easily prevent known child sexual abuse images and videos from being stored and shared on iCloud. Yet the company has chosen to allow it,” highlighting the devastating real-life consequences for survivors of childhood sexual abuse. Source
“Trust is given, not earned. Distrust is earned.” Source
Meanwhile, privacy advocates point to the inherent risks of overreaching surveillance. Scanning for one type of content could create a slippery slope toward bulk surveillance of all users and introduce new attack vectors for data thieves to exploit. The balance between child safety and privacy remains delicate and challenging for tech firms. Source