A Wall Street Journal report claimed that Instagram allowed users to search by child-sex abuse hashtags.
Meta has given a detailed response after a report in the Wall Street Journal (WSJ) claimed that its Instagram platform is being used by paedophile networks to promote and sell content showing child sexual abuse. In a statement to NDTV, a Meta spokesperson said the company is committed to protecting teens and is working in that direction. The WSJ report, based on an investigation the outlet conducted with researchers from Stanford University, said that Instagram's algorithms advertised the sale of illicit "child-sex material" on the platform.
Some accounts even allowed buyers to "commission specific acts" or arrange "meet ups".
The investigation found that Instagram allowed users to search by child-sex abuse hashtags like #pedowhore, #preteensex and #pedobait.
Reacting to the report, the Meta spokesperson noted that it says these accounts often link "to off-platform content trading sites".
"We have detailed and robust policies against child nudity, abuse and exploitation, including child sexual abuse material (CSAM) and inappropriate interactions with children," the spokesperson told NDTV.
"We remove content that sexualizes minors and remove accounts, groups, pages and profiles that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary," the spokesperson further said.
The company also highlighted that its focus is on keeping teens safe by stopping unwanted contact between teens and adults they don't know.
"We do this by preventing potentially suspicious adults from finding, following or interacting with teens, automatically placing teens into private accounts when they join Instagram, and by notifying teens if these adults attempt to follow or message them," said the spokesperson.
Meta said it has invested heavily in developing technology "that finds child exploitative content before anyone reports it to us". The company spokesperson said that in the fourth quarter of 2022, its technology removed over 34 million pieces of child sexual exploitation content from Facebook and Instagram.