Meta CEO Mark Zuckerberg and the chief executives of TikTok, X, Discord and Snap faced a grilling by hostile US lawmakers on Wednesday over the dangers that children and teens face on social media platforms.
The tech chiefs were summoned before the US Senate Judiciary Committee, where they were taken to task over the effects of social media in a session titled "Big Tech and the Online Child Sexual Exploitation Crisis."
The executives are confronting a torrent of political anger for not doing enough to thwart online dangers facing children, including sexual predation and teen suicide.
During one round of particularly heated questioning, Zuckerberg was made to stand up and apologize to the families of victims who had packed the committee room.
"Mister Zuckerberg, you and the companies before us, I know you don't mean it to be so, but you have blood on your hands. You have a product that's killing people," Senator Lindsey Graham told the chief executives.
Testifying to senators were Zuckerberg, X's Linda Yaccarino, Shou Zi Chew of TikTok, Evan Spiegel of Snap and Discord's Jason Citron.
"We work hard to provide parents and teens support and controls to reduce potential harms," Meta's Zuckerberg told the committee in his opening statement.
"Keeping young people safe online has been a challenge since the internet began and as criminals evolve their tactics, we have to evolve our defenses too," he added.
Zuckerberg also told the lawmakers that according to research, "on balance" social media was not harmful to the mental health of young people.
TikTok's Chew said "as a father of three young children myself I know that the issues that we're discussing today are horrific and the nightmare of every parent."
"I intend to invest more than $2 billion in trust and safety. This year alone, we have 40,000 safety professionals working on this topic," Chew said.
Meta likewise said that 40,000 of its employees work on online safety and that it has invested $20 billion since 2016 to make its platforms safer.
Ahead of their testimony, Meta and X, formerly Twitter, announced new measures in anticipation of the heated session.
Meta, which owns the world's leading platforms, Facebook and Instagram, said it would block direct messages sent to young teens by strangers.
By default, teens under age 16 can now only be messaged or added to group chats by people they already follow or are connected to.
Meta also tightened content restrictions for teens on Instagram and Facebook, making it harder for them to view posts that discuss suicide, self-harm or eating disorders.
Multi-state lawsuit
Singling out Meta, senators pointed to internal company documents that show that Zuckerberg declined to strengthen the teams devoted to tracking online dangers to teens.
"The hypocrisy is mind-boggling," Senator Richard Blumenthal told the New York Times.
Those documents are part of a major lawsuit brought by about 40 states jointly suing Meta over alleged failures with children.
Under US law, web platforms are largely shielded from legal liability in relation to content that is shared on their site.
While lawmakers would like to set up more rules to increase online safety, new laws have been stymied by a politically divided Washington and intense lobbying by big tech.
One existing proposal is the Kids Online Safety Act, or KOSA, which aims to protect children from algorithms that might trigger anxiety or depression.
Another idea would require social media platforms to verify the age of account holders and completely bar children under the age of 13.
"I don't think you're gonna solve the problem. Congress is gonna have to help you," Senator John Neely Kennedy told the executives.
(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)