
Apple sued for failing to implement tools that would detect CSAM in iCloud

Apple is being sued by victims of child sexual abuse over its failure to follow through with plans to scan iCloud for child sexual abuse materials (CSAM), The New York Times reports. In 2021, Apple announced it was working on a tool to detect CSAM that would flag images showing such abuse and notify the National Center for Missing and Exploited Children. But the company was hit with immediate backlash over the privacy implications of the technology, and ultimately abandoned the plan.

The lawsuit, which was filed on Saturday in Northern California, seeks damages upwards of $1.2 billion for a potential group of 2,680 victims, according to NYT. It claims that, after Apple showed off its planned child safety tools, the company “failed to implement those designs or take any measures to detect and limit” CSAM on its devices, leading to the victims’ harm as the images continued to circulate.

In a statement shared with Engadget, Apple spokesperson Fred Sainz said, “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts.”

The lawsuit comes just a few months after Apple was accused of underreporting CSAM by the UK’s National Society for the Prevention of Cruelty to Children (NSPCC).
