US State Sues Apple Over Alleged Use Of iCloud For Storing Child Sex Abuse Material

The US state of West Virginia on Thursday said it is suing Apple over child sex abuse material that people tuck away in its iCloud online storage service.

The suit filed by Attorney General John Bohen McCuskey in a state court accuses the iPhone maker of letting iCloud be used to store and distribute such content, often referred to as child pornography.

Apple has been under pressure from some child safety advocates over its decision several years ago not to allow scanning of data stored in iCloud, a step the company said would jeopardize user privacy and security.

"Preserving the privacy of child predators is absolutely inexcusable," McCuskey said in a release.

McCuskey argued that Apple has "shirked their responsibility to protect children under the guise of user privacy."

In response to an AFP query, Apple countered that safety is a priority for the tech firm, which is constantly innovating to combat threats to users, especially children.

"All of our industry-leading parental controls and features are designed with the safety, security, and privacy of our users at their core," an Apple spokesperson told AFP.

For example, a Communication Safety feature automatically intervenes on children's devices when nudity is detected in Messages, shared Photos, AirDrop or live FaceTime video calls, the spokesperson added.

Apple opted not to allow scanning of photos stored in iCloud for child sex abuse material, saying in a 2023 letter to an advocacy group that such a move carried the potential for unintended consequences.

"Scanning for one type of content, for instance, opens the door for bulk surveillance," Apple said in the letter.

"How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution?"

(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)
