Meta, TikTok and other social media companies must still face some claims of liability over the harm a class action says their platforms cause to young users.

Meta and other powerful social media companies must face parts of a sweeping class action after failing to convince a federal judge to throw out all claims that they are liable for the harmful effects of their apps on young users.

The companies, including TikTok and Snap, tried to get a judge to toss the massive litigation, which accuses them of profiting by making their apps addictive and harmful to children. But U.S. District Judge Yvonne Gonzalez Rogers ruled Tuesday to dismiss only some of the plaintiffs' five priority claims under Section 230 of the Communications Decency Act of 1996 and the First Amendment.

Hundreds of plaintiffs claim that Meta, Google and others knowingly design their apps to be addictive to children, accusing the companies in a 288-page complaint of making it impossible for parents to ensure that their children avoid harmful or dangerous content.

Attorneys for the defendants made arguments Oct. 27 in court that social media companies are protected from liability for harms to users under the Communications Decency Act, which allows web operators to moderate user speech and content as they see fit.

In her 52-page order, Rogers said the defendants know that young users are core to their platforms, which are in turn demonstrably harmful to the children using them. The question that remains in the case is how liable the social media companies are for those harms.

The judge did not dismiss the plaintiffs' claims of design defect product liability, saying their accusations are not barred under Section 230 because they do not implicate publishing or monitoring of third-party content.

Rogers found that Section 230 does not grant immunity from a negligence per se claim, and that claims of failure to warn and negligence plausibly assert that defendants are liable for conduct other than publishing of third-party content.

However, the judge dismissed certain product defect claims on Section 230 grounds. She said the section immunizes defendants from assertions that they recommend adult accounts to adolescents, and from claims that the products are defective because they provide short-form content that plaintiffs find problematic.

"Nothing in Section 230 or existing case law indicates that Section 230 only applies to publishing where a defendant's only intent is to convey or curate information," Rogers said. "To hold otherwise would essentially be to hold that any website that generates revenue by maintaining the interest of users and publishes content with the intent of meeting this goal, would no longer be entitled to Section 230 immunity."

The judge also declined to throw out liability claims on First Amendment grounds, saying that much of the conduct the defendants are accused of is not based on their speech or expression. She said some remedies plaintiffs suggest, like giving users tools to limit time spent on a platform, do not alter what the platforms are able to publish. However, she dismissed claims based on how the defendants time notifications about third-party content sent to users, finding that conduct is protected by the First Amendment.

Rogers denied Snap's request to dismiss specific claims about Snapchat, along with the motion to dismiss for failure to adequately plead causation. However, she granted, with leave to amend, the motion to dismiss claims that the defendants breached a duty to protect users from harm by third parties, such as adult predators.

"Plaintiffs only allege that defendants sought to increase minors' use of their platforms while 'knowing or having reason to know' that adult predators also used the sites and therefore increased the risk to the minors," the judge said. "This generality of the allegations is insufficient to show misfeasance."

Rogers found both parties' arguments on whether social media platforms should be classified as services, not products, "wanting." She chastised both sides for taking an "all or nothing approach," which she called "overly simplistic and misguided," accusing both sides of downplaying nuances in the case law and the facts.

"Cases exist on both sides of the questions posed by this litigation precisely because it is the functionalities of the alleged products that must be analyzed," she said. "The cases generally concern a specific product defect and the determination of whether a specific technology is a product hinges on the specifics of that defect."

Attorneys for both sides did not respond to requests for comment before press time. The litigation now enters the discovery phase.

California Attorney General Rob Bonta is among the state attorneys general who filed similar claims against Meta, saying the company deliberately made its social media platforms addictive to children and teens and arguing that it has a duty to prevent harm.

The Golden State has had Assembly Bill 2273 on the books since 2022, requiring web platforms to implement safety settings to protect kids' data and prohibiting companies that provide online services from collecting, retaining or using a child's personal information or geolocation.
Natalie Hanson reports on legal and general news in Oakland and the East Bay Area for Courthouse News Service. She previously wrote about Marin County news for the Marin Independent Journal and covered city government, housing and homelessness and other Butte County news for the Chico Enterprise-Record. She has also been published in the San Jose Spotlight, Ethnic Media Services, The Mercury News and The Washington Post. Follow @nhanson_reports