Federal judge hints that Big Tech companies may have to face consumer allegations of mental health harm

A federal judge in California hinted Friday that Google, Meta, Snap and TikTok will likely have to face allegations by consumers that the social media companies harmed young Americans’ mental health with addictive features built into their respective platforms, and that Big Tech’s signature liability shield, known as Section 230, may not be enough to deflect those claims.

The judge overseeing the litigation — which includes nearly 200 individual cases against the social media companies — said repeatedly that the tech companies may be unable to escape liability for what the consumer plaintiffs allege are vast harms to America’s children posed by algorithmic rabbit holes, image filters that encourage eating disorders or limitless content feeds.

Should the claims be allowed to proceed, it could mark a significant blow to the tech industry, which is currently fending off a nationwide legal assault on its services linked to mental health allegations. And it could mark a turning point for how courts have interpreted Section 230, a sweeping 1996 law that has exempted websites from a wide range of suits targeting their content moderation decisions.

This week, dozens of states filed a virtually identical federal lawsuit against Meta alleging the company knew that the design of its social media platforms had been harmful to kids. Eight additional states filed similar suits in their respective state courts. (In response, Meta has said it’s committed to providing safe experiences online.)

Addressing lawyers for both the consumer plaintiffs as well as the tech companies on Friday, District Judge Yvonne Gonzalez Rogers of the US District Court for the Northern District of California said she was unpersuaded by arguments that either all of the claims should be thrown out, or none of them.

She also expressed skepticism in response to claims by industry lawyers that tech companies have no legal obligation to ensure their platforms are safe for children.

Gonzalez Rogers criticized the consumer plaintiffs for making a disorganized grab-bag of allegations, and faulted them for appearing to make numerous complaints about the content that appears on social media platforms, as opposed to focusing on the design decisions that serve that content to users.

Still, she said, the burden falls on the tech platforms to prove why she should throw out the cases at an early stage in litigation.

And she pointed to the potential limits of Section 230 in two critical exchanges. In one, she said there are “more objective functionality decisions” being litigated than simple content moderation decisions that would be protected by Section 230.

“It doesn’t seem to me that you can escape that,” Gonzalez Rogers said.

Later, she suggested that “it’s not clear to me that the entire thing is thrown out” due to Section 230, implying that some claims could be tossed while others survive.

The more than four-hour hearing saw attorneys sparring over numerous legal theories of liability, and it is still possible that Gonzalez Rogers may throw out some claims based on factors other than Section 230.

But one thing is certain, Gonzalez Rogers said: “Your billing fees today exceed my annual salary.”
