
Google & Other Tech Giants Not Liable For Terrorist Content

Supreme Court rules tech giants not liable for terrorist content, sending Gonzalez v. Google case back for reconsideration.

  • The Supreme Court ruled tech companies aren't liable for terrorist content on their platforms.
  • The Gonzalez v. Google case was remanded for reconsideration in light of the Twitter ruling.
  • The scope of Section 230, shielding tech companies from liability for user content, remains to be defined by Congress or in future cases.
The Supreme Court decided today that tech companies aren’t liable for terrorist content posted on their platforms.

The lawsuit, initiated by the family of a victim of a 2017 ISIS attack, argued that Twitter, Facebook, and Google should be held accountable for allowing the terrorist organization to utilize their platforms in its terrorism efforts.

However, the court unanimously decided that the lawsuit couldn’t proceed.

Justice Clarence Thomas, writing for the unanimous court in Twitter v. Taamneh, explained that social media platforms aren’t liable, even when nefarious actors use those platforms for illegal and sometimes terrible ends.

The victim’s family argued that tech companies should be held liable for allegedly failing to stop ISIS from using their platforms. The court found this argument lacked the necessary link between the tech companies and the terrorist attack to establish liability.

Justice Ketanji Brown Jackson, in a brief concurring opinion, underscored that the court’s opinion was narrow in significant respects. She suggested that other cases with different allegations and records could lead to different conclusions.

Impact Of The Gonzalez v. Google Case

Following the Twitter ruling, the Supreme Court addressed the case of Gonzalez v. Google, a lawsuit filed by the family of Nohemi Gonzalez, a 23-year-old American woman killed in the 2015 ISIS attack on a Parisian cafe.

The Gonzalez family argued that Google, through its ownership of YouTube, aided ISIS’s recruitment by allowing the terrorist group to post videos on YouTube that incited violence and sought to recruit potential ISIS members.

The family also claimed that Google’s algorithms recommended ISIS videos to users.

The U.S. Court of Appeals for the 9th Circuit previously held that Section 230 of the Communications Decency Act of 1996, which shields tech companies from liability for user-published content, protected such recommendations.

However, in light of the Twitter decision, the Supreme Court vacated this judgment and remanded the case for reconsideration.

The court refrained from deciding on the scope of Section 230, suggesting that this issue is best left to Congress or a future case.

Certain members of Congress feel strongly about reforming Section 230, believing it offers tech giants too much protection.

Senator Mark Warner, a vocal critic of Section 230 and an advocate for its reform, provided a statement to Search Engine Journal through his office regarding the decision in the Gonzalez v. Google case.

He characterizes Section 230 as antiquated, arguing it’s a “Get Out of Jail Free Card” for large companies.

“For years now, I have been saying that Congress needs to take action to address the sweeping protection that Section 230 gives technology companies. This antiquated statute has outlived its usefulness and provided the largest platform companies with a ‘Get Out of Jail Free Card,’ as their sites are used by scam artists, harassers, and violent extremists.”

Senator Warner, however, makes it clear that he doesn’t view reforming Section 230 as opening the floodgates for massive liability claims against platform companies.

“Reforming Section 230 doesn’t mean that platforms will automatically be subject to massive liability claims; victims will still have to prove their case in court.”

In Summary

These cases collectively highlight the ongoing debate surrounding the responsibility of tech companies in moderating user-generated content and the extent to which they can be held liable for harmful content shared on their platforms.

The Supreme Court’s decisions indicate that, at least for now, a direct connection between the actions of tech companies and specific acts of terrorism is necessary to establish legal liability.

Nonetheless, the court’s comments suggest that different circumstances could result in different outcomes.


Sources: SCOTUS

Featured image generated by the author using Midjourney.

SEJ STAFF Matt G. Southern Senior News Writer at Search Engine Journal

Matt G. Southern, Senior News Writer, has been with Search Engine Journal since 2013. With a bachelor’s degree in communications, ...