Bates College Undergraduate Law Review
Abstract
In 2024, the Supreme Court of the United States jointly heard Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton, cases challenging Florida and Texas statutes that restricted the capacity of social media platforms to moderate content. While remanding the cases, the Court reaffirmed that content moderation and curation qualify as protected expression under the First Amendment. This paper critically assesses the Court's majority opinion and its nonbinding dicta regarding the expressive nature of algorithmic moderation.
Specifically, drawing on Justice Alito's opinion concurring in the judgment, this paper argues that the Court fails to account for fundamental differences between traditional media and social media in their use of algorithmic content moderation. By assuming these tools accurately capture a platform's expressive intent, the Court establishes a troublesome precedent of limited accountability for platforms while subverting the very principles underlying free expression and potentially facilitating the spread of misinformation on social media. This paper concludes by exploring alternative regulatory paths, suggesting that the government may invoke compelling interests, as it did in TikTok v. Garland, to address the societal impacts of poor moderation through other means.
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.
Recommended Citation
Merkatz, Sam (2026) "A Constitutional Black Box: Critically Assessing the Constitutional and Algorithmic Implications of the NetChoice Cases," Bates College Undergraduate Law Review: Vol. 3: Iss. 1, Article 7.
Available at: https://scarab.bates.edu/bulr/vol3/iss1/7