
This Lawsuit Could Make Social Media Safer For Your Kids

A social media lawsuit alleging harm to kids mostly advances despite objections from tech giants.

  • A judge allowed a lawsuit against social media companies to move forward.
  • The lawsuit says social media platforms harm kids.
  • The judge allowed some claims about platform design to proceed but dismissed claims about content recommendations.

A lawsuit accusing social media companies of intentionally harming children through addictive platform designs can move forward, a federal court ruled today.

Brought on behalf of hundreds of minors across the U.S., the consolidated lawsuit alleges companies including Meta, Snap, TikTok, and YouTube specifically designed their platforms to “hook” young users, leading to mental health issues like anxiety and depression.

The case consolidates over 100 individual lawsuits filed beginning in early 2022, soon after Facebook whistleblower Frances Haugen leaked internal research about Instagram’s adverse effects on teen mental health.

Judge Allows Bulk Of Lawsuit To Proceed

“Defendants are alleged to target children as a core market and designed their platforms to appeal to and addict them,” wrote Judge Yvonne Gonzalez Rogers in the order issued in California.

The companies sought to dismiss the lawsuit, arguing they are shielded from liability under Section 230 of the Communications Decency Act, a 1996 law that protects online platforms from lawsuits over user-generated content.

But after a detailed analysis, Rogers wrote, “the parties’ ‘all or nothing’ approach to the motions to dismiss does not sufficiently address the complexity of the issues facing this litigation.”

At the center of the lawsuit are allegations around the platforms’ never-ending feeds, push notifications, algorithmic recommendations, and other design features that make them addictive. The plaintiffs say these intentional design choices are to blame for mental health harms, not the content itself.

Rogers agreed in part. Section 230, she ruled, does not bar product liability claims targeting design defects such as ineffective parental controls, inadequate age verification, missing time limits, and barriers to account deletion. However, she dismissed claims about using algorithms to recommend accounts, along with some notification features.

“The same applies here. The Court must consider the specific conduct through which the defendants allegedly violated their duties to plaintiffs,” Rogers wrote. “It is these detailed, conduct-specific allegations that require analysis.”

Potentially Damaging Discovery Ahead

With the case moving into discovery, internal documents and data from the tech companies related to their knowledge of potential harms could come to light. The plaintiffs argue the companies were aware of the mental health effects on children but did little to address safety concerns.

“The parties’ all or nothing approach does not fairly or accurately represent the Ninth Circuit’s application of Section 230 immunity,” wrote Rogers. “Rather, the Court has conducted an analysis of the actual functionality defects alleged in the complaint.”

Lawsuit Seeks Design Changes, Damages

The lawsuit seeks to apply product liability law to social media, treating platforms as defective products that require improved designs and warnings.

Tech companies have long maintained they are immune from liability for user-generated content. Still, this case and others signal a new legal threat focused on algorithms, recommendation systems, and other operational choices embedded in platforms.

Attorneys for the companies have not indicated whether they will appeal the decision, which allows significant parts of the lawsuit to advance. With billions of dollars and the future design of social media at stake, the case is being closely watched in tech and legal circles.

If the class action is certified, the plaintiffs will seek damages and push for platform changes such as default time limits, age verification, and algorithmic transparency. However, the case still faces hurdles, including proving the platforms directly caused the mental health harms suffered by individual minors.

Ultimately, the lawsuit takes aim at the core advertising business model of social media companies that depends on maximizing user engagement time through design choices like endless scrolling feeds. Any changes forced upon the companies could impact their profits.

Looking Ahead

While still at an early stage, the mixed ruling keeps the lawsuit alive and paves the way for internal documents to potentially expose how far Facebook, Snapchat, TikTok, and YouTube may have gone to hook young users at the expense of their well-being.


Featured Image: DavideAngelini/Shutterstock
