Meta has rejected allegations that it used pornographic material to train its artificial intelligence models, stating that any downloads of adult content were made by individuals for “personal use.”
The tech giant made the statement in a legal filing urging a US district court to dismiss a copyright infringement lawsuit brought by Strike 3 Holdings, a company known for producing what it calls high-quality, ethical adult videos. The lawsuit accuses Meta of using Strike 3's copyrighted films to train an undisclosed AI model said to power its video generator, Movie Gen.
According to a report by Ars Technica, Meta's motion to dismiss accused Strike 3 of basing its case on “guesswork and innuendo,” adding that the company “has been labelled by some as a ‘copyright troll’ that files extortive lawsuits.”
Urging the court to dismiss all copyright claims, Meta said there was no proof that it had instructed or even known about the alleged downloading of around 2,400 adult films owned by Strike 3.
According to Meta, Strike 3 also cited “no facts to suggest that Meta has ever trained an AI model on adult images or video, much less intentionally so.”
The lawsuit alleges that Meta downloaded Strike 3's content as far back as 2018, four years before its research into “Multimodal Models and Generative Video” began. Meta argued that this timeline made it “implausible” that the downloads were linked to AI training.
Meta added that activity traced to corporate IP addresses amounted to roughly 22 downloads of adult videos per year. “The far more plausible inference to be drawn from such meagre, uncoordinated activity is that disparate individuals downloaded adult videos for personal use,” Meta's filing stated.
The lawsuit comes as Meta faces scrutiny following a Reuters report that revealed its internal rules allowed AI chatbots to “engage a child in conversations that are romantic or sensual,” spread false medical information, and assist users in making racist arguments. Meta has since updated its policies.
In its legal filing, Meta reiterated that its user terms forbid generating adult content with its AI models, which it said contradicts “the premise that such materials might even be useful for Meta's AI training.”
Strike 3, Meta added, “does not identify any of the individuals who supposedly used these Meta IP addresses, allege that any were employed by Meta or had any role in AI training at Meta, or specify whether (and which) content allegedly downloaded was used to train any particular Meta model.”
A Meta spokesperson told Ars Technica, “We don't want this type of content, and we take deliberate steps to avoid training on this kind of material.”
Strike 3's lawsuit also claims Meta used a “stealth network” of 2,500 “hidden IP addresses” to download the pornographic videos. The adult film company is seeking $350 million in damages and reportedly has two weeks to respond to Meta's motion.