
U.S. Supreme Court Rules For Twitter, Google, And Facebook On Terrorism Liability And Declines Calls To Reshape Liability For Internet Platforms Under Section 230.

May 19, 2023
Firm Memoranda


Yesterday the U.S. Supreme Court released its highly anticipated decisions in two companion cases, Twitter, Inc. v. Taamneh and Gonzalez v. Google LLC.  In doing so, it handed a significant win to social media companies by holding that they generally are not liable for terror-related content appearing on their respective platforms.  At the same time, however, the Court punted on the arguably more significant question of whether Section 230 of the Communications Decency Act shields tech platforms from liability arising out of their so-called “recommendations” of content to specific users.

Taamneh and Gonzalez arose from lawsuits filed by family members of victims of ISIS terrorist attacks.  In both cases, the plaintiffs sought damages, not against ISIS itself, but against the social media companies—Twitter, Facebook, and Google (which owns YouTube)—that had hosted ISIS-related content among the billions of other pieces of content that appear daily on their websites.  According to the plaintiffs, by hosting such content, the companies had aided and abetted the terrorist attacks.  The plaintiffs relied on a 2016 enactment, the Justice Against Sponsors of Terrorism Act (“JASTA”), that imposes civil liability on anyone “who aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism.”

I. In Twitter v. Taamneh, The Court Ruled That The Social Media Companies Did Not Aid And Abet ISIS Terrorist Attacks.

In Taamneh, the Court held unanimously that the plaintiffs’ allegations were insufficient to impose liability on the companies under JASTA, reversing the contrary ruling of the United States Court of Appeals for the Ninth Circuit.  After exhaustively reviewing the scope of aiding-and-abetting liability at common law, the Court held that “aiding and abetting” under JASTA requires that the defendant “gave such knowing and substantial assistance” to the terrorist attack that the defendant could be said to have “culpably participated” in the attack.  The Court concluded that the plaintiffs had failed to meet that standard, noting that it was not alleged that the companies had given “any special treatment or words of encouragement” to ISIS or “selected or took any action at all with respect to ISIS’ content (except, perhaps, blocking some of it).”

The Court further explained that the “mere creation” of the social-media platforms “is not culpable.”  Placing those platforms in the broader context of communications platforms generally, the Court stated that while “it might be that bad actors like ISIS are able to use platforms like defendants’ for illegal—and sometimes terrible—ends,” “the same could be said of cell phones, email, or the internet generally,” yet “we generally do not think that internet or cell service providers incur culpability merely for providing their services to the public writ large.”  Elaborating on the point, the Court observed that “such providers would [not] normally be described as aiding and abetting, for example, illegal drug deals brokered over cell phones—even if the provider’s conference-call or video-call features made the sale easier.” 

The Court added that, to the extent that the plaintiffs’ theory of aiding and abetting was based on the tech platforms’ alleged failure to remove ISIS content, the common law generally does not impose liability for mere inaction, nor does it impose a generalized duty to rescue.  “[A] contrary holding,” the Court explained, “would effectively hold any sort of communication provider liable for any sort of wrongdoing merely for knowing that the wrongdoers were using its services and failing to stop them.”

II. In Gonzalez v. Google, The Court Declined To Rule On Whether Section 230 Of The Communications Decency Act Provides Separate Immunity For So-Called “Recommendations.”

The Gonzalez case, while involving underlying facts similar to Taamneh, raised a separate question with potentially far-reaching consequences: whether Section 230 of the Communications Decency Act—the statute at the heart of the modern Internet—shields social media companies from liability arising out of their algorithmic “recommendations” of content to specific users.

When enacted in 1996, Section 230 addressed concerns that interactive websites could be held liable for content uploaded by third parties—such as through defamation actions.  To foreclose such liability, Section 230 provides that websites may not be “treated as the publisher or speaker of any information provided by another information content provider.”  In Gonzalez, the plaintiffs argued that YouTube’s recommendation algorithm, which selected content for individual users based on prior viewing habits and which had allegedly recommended ISIS videos to specific users, is not encompassed by Section 230’s immunity.  In arguing in favor of Section 230 immunity, the companies contended that sorting and grouping videos (i.e., recommending via algorithm) is simply a necessary component of organizing the vast quantity of information hosted on social-media platforms, a traditional publisher function protected by Section 230. 

The Supreme Court, however, ultimately opted not to reach the Section 230 issue.  Instead, the Court held that under the construction of JASTA’s aiding-and-abetting provision adopted in Taamneh, the plaintiffs in Gonzalez likely could not succeed on their lawsuit regardless of the applicability of Section 230.  The Court thus effectively punted on the arguably more consequential question presented by the cases, leaving in place the Ninth Circuit’s interpretation of Section 230, which generally favors Internet platforms.  The Court instead vacated the Ninth Circuit’s decision in the case, remanding it with instructions that the plaintiffs’ complaint be considered in light of the Court’s decision in Taamneh.  

Still, the Court did leave some breadcrumbs in Taamneh as to how it might ultimately resolve the Section 230 issue.  In particular, the Court stated that it “disagree[d]” with the plaintiffs that the platforms’ “‘recommendation’ algorithms go beyond passive aid and constitute active, substantial assistance.”  To the contrary, the Court explained that “the algorithms appear agnostic as to the nature of the content, matching any content (including ISIS’ content) with any user who is more likely to view that content” and that “[o]nce the platform and sorting-tool algorithms were up and running, defendants at most allegedly stood back and watched.”  The Court thus seemed to view a recommendation algorithm as a mere passive organizing tool—and thus arguably a mere part of the infrastructure of publishing third-party content on the Internet.

III. The Decisions Represent Major Victories For Social Media Companies.

Taamneh and Gonzalez represent substantial victories for social media platforms.  By rejecting the claim that social media companies can be subject to aiding-and-abetting liability for terrorist attacks committed by groups with a social media following, Taamneh shields the companies from the considerable liability that could otherwise have arisen from nearly every terrorist atrocity.  And though the Court punted on the Section 230 issue in Gonzalez, tech companies retained favorable precedent in the lower courts of appeals and avoided a potentially disastrous reduction in the scope of Section 230 immunity, at least for the time being.  That was more than a moral victory, because earlier individual opinions (including from Justice Thomas, the author of Taamneh) had appeared to signal that the Court was inclined to adopt a much narrower reading of Section 230.  Still, the risk remains that the Court will take up the Section 230 question in a future case.

Quinn Emanuel Urquhart & Sullivan LLP filed an amicus brief in the Supreme Court in Gonzalez v. Google LLC in support of Google on behalf of several Internet companies.


If you have any questions about the issues addressed in this memorandum, or if you would like a copy of any of the materials mentioned in it, please do not hesitate to reach out to:

John F. Bash
Phone: 713-221-7006

Margret Caruso
Phone: 650-801-5101

Rachel Herrick Kassabian
Phone: 650-801-5005

Andrew H. Schapiro
Phone: 312-705-7403
