The U.S. Supreme Court recently released its decision in Moody v. NetChoice, providing much-needed guidance to lower courts on the application of the First Amendment to laws regulating the content moderation practices of websites, social media platforms, and other online services (“online platforms”). The Court held that online platforms’ selection, ordering, and ranking of third-party content is expressive and thus protected by the First Amendment, but it also held that the lower courts had failed to properly analyze whether the challenged laws are facially unconstitutional.
The decision left much unresolved. Most prominently, the lower courts must now determine on remand whether the challenged laws are facially unconstitutional. The majority opinion also left open the prospect that non-expressive algorithmic curation may receive different constitutional treatment.
Below are six takeaways from the opinion and things to watch for as policymakers and lower courts decide what to do next.
Florida and Texas enacted statutes curtailing online platforms’ ability to engage in content moderation (such as filtering, prioritizing, and labeling content) and requiring certain disclosures. Trade associations challenged these laws, arguing that they impermissibly restrict speech by infringing on online platforms’ First Amendment-protected right to organize the content they publish. These cases were brought as “facial challenges,” which seek to invalidate a law in its entirety on the theory that a substantial number of its applications are unconstitutional relative to the law’s plainly legitimate sweep. This differs from an “as-applied” challenge, in which the plaintiff argues that a particular application of a statute is unconstitutional.
In an opinion penned by Justice Kagan, the Court concluded that the lower courts failed to grapple with the high burden a facial challenge imposes, because neither the courts nor the parties fully considered the range of online platforms and activities to which the laws conceivably apply. The Court noted, for example, that the analysis may differ when applied to a platform’s news feed versus its direct messaging services.
Five justices signed on to the opinion in full, and Justice Jackson joined most but not all of it. Justices Barrett and Jackson filed concurring opinions, while Justices Thomas and Alito filed opinions concurring only in the judgment.
The Court vacated the judgments and remanded the cases to the Fifth and Eleventh Circuits to conduct the proper facial analysis.
The Court recognized that the editorial decisions made by online platforms regarding content moderation are a form of protected speech under the First Amendment. This protected speech includes decisions about what content to allow, remove, or promote on their platforms. The Court emphasized that government regulations that force private entities to host or promote speech against their will can violate the First Amendment.
The Court concluded that online platforms’ algorithmic decisions to sort, rank, and remove third-party content are a form of protected speech. Government regulation of expressive algorithmic curation is thus subject to First Amendment scrutiny.
The Court also clarified that it was not addressing the potential for certain types of algorithmic sorting to not be expressive. For example, the Court noted that it was not addressing “algorithms [that] respond solely to how users act online—giving them the content they appear to want, without any regard to independent content standards.” That said, it remains to be seen whether it is workable for courts to decouple such algorithms from expressive, First Amendment-protected algorithmic curation.
State legislatures across the country and on both sides of the political aisle have been eager to enact laws regulating online platforms. The Court’s decision is unlikely to dissuade legislators from continuing these efforts, even if the laws are ultimately held unconstitutional. The Court leaves many questions unresolved, such as whether the Florida and Texas laws are constitutional as applied to any particular online services. And given the Court’s explanation of how high the bar is for facial challenges, legislators may see an opportunity to enact laws that remain in force longer before courts can review whether particular applications are unconstitutional.
This ruling may lead to an onslaught of as-applied challenges against online platform regulation because the Court has made facial challenges more difficult. This could increase the amount of litigation needed to address the constitutionality of these laws, clogging courts and potentially leading to conflicting district and circuit decisions in the coming years as litigation plays out. For companies potentially impacted by these laws, it will be important to follow these cases closely.
Section 230 of the Communications Act has long been relied on by online platforms to protect themselves from liability arising from third-party content. Although the parties also litigated whether Section 230 preempted the Florida and Texas laws, the majority opinion did not address this issue.
The Court concluded that the laws’ content-moderation provisions, as applied to platform news feeds, would fail even under lower standards of First Amendment scrutiny because Florida and Texas explicitly sought to suppress speech. But the Court noted that states might be able to sustain regulations of platforms based on other “possible interests” unrelated to speech suppression.
The Court left unresolved how to analyze the laws’ disclosure requirements and how the content-moderation provisions apply to services other than news feeds, such as direct messaging, event management, email filtering, customer reviews, payments, and ride-sharing.
Authored by Mark Brennan, Ryan Thompson, and Thomas Veitch.
Zaid Umar, summer associate in our Washington, D.C. office, contributed to this article.