News
Apple has published a FAQ titled "Expanded Protections for Children" which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages ...
Child sex abuse survivors who are suing Apple have accused the company of using a cybersecurity defense to ignore its mandatory CSAM reporting duties. If they win over a jury, Apple could face more ...
Apple details reasons to abandon CSAM-scanning tool, more controversy ensues. Safety groups remain concerned about child sexual abuse material scanning and user reporting. Lily Hay Newman, Wired ...
Apple has drawn backlash from privacy advocates over its new plans to try to prevent the spread of child sexual abuse material (CSAM), by scanning for images on iOS devices that match with images ...
Apple wants to prevent child sexual abuse material (CSAM) from spreading on iCloud and iMessages. But it could go the way of NSO Group's spyware on your iPhone.
In December, Apple said that it was killing an effort to design a privacy-preserving iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally ...
Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) with the iOS 15.2 release. Despite its imperfections and Apple's silence about it, the technology appears inevitable.
In the wake of reports that dozens of teen boys have died by suicide since 2021 after being victimized by blackmailers, Apple is relying on its Communication Safety features to help protect ...
Apple has been accused of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child ...
If you Google my name, the phrase “revenge porn” pops up. As five tech CEOs testify about CSAM before the Senate today, Apple CEO Tim Cook's absence is glaring.