Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM).
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material.
It claims that, after Apple showed off its planned child safety tools, the company “failed to implement those designs or take ...
Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material.
A second suit says Apple isn't doing enough to stop the spread of harmful images and videos and that it is revictimizing the victims.
Thousands of CSAM victims are suing Apple for dropping plans to scan devices for the presence of child sexual abuse materials. In addition to facing more than $1.2B in penalties, the company could be ...
Thousands of victims have banded together in a proposed class action lawsuit against Apple, with the company now ...
The controversial CSAM scanning plan never won full support, but it keeps coming back to lawmakers' ...
Announced in 2021, the plan was for Apple to scan images on iCloud for child abuse material using on-device technology. While ...
Apple faces a $1.2 billion lawsuit for failing to address child sexual abuse material (CSAM) after cancelling a planned detection tool.