Abstract

The proliferation of online disinformation and the rise of private censorship are paradigmatic examples of the challenges to traditional First Amendment jurisprudence in an algorithmic society. The limitations of traditional First Amendment jurisprudence have been amplified by the COVID-19 pandemic in two ways. On the one hand, in the wake of the pandemic, we have entered an “infodemic” era in which the volume of disinformation, as well as the harm it causes, has reached unprecedented levels. For example, health disinformation has contributed to vaccine hesitancy. On the other hand, even though the proliferation of online disinformation seems to suggest that content moderation should be enforced more rigorously, the pandemic has also revealed the importance of access to online information, raising concerns about censorship imposed by private platforms on social media users. Furthermore, the high degree of opacity and unpredictability in content moderation poses great danger to users’ First Amendment rights. In light of these considerations, this Note proposes a legal framework that would curtail online disinformation while ensuring users’ right to access online platforms. To achieve this goal, this Note argues that the First Amendment should be interpreted not merely as a negative right but also as a positive right. That is, the traditional laissez-faire First Amendment jurisprudence, which considers public actors the sole threat to freedom of speech and neglects the power asymmetry between private platforms and their users, should be rejected. The underlying principle of the positive approach is to design a regulatory regime that is least restrictive and that fosters accountability and transparency in content moderation by introducing procedural requirements.
In this regard, the recently introduced Digital Services Act in the European Union—which represents a paradigmatic shift from interpreting freedom of speech as a “negative right” (i.e., protecting users from government interference) to a “positive right” (i.e., ensuring the government provides users with sufficient procedural safeguards to check private platforms)—could offer important lessons for the U.S. as it reconstructs its online platform regulation in the era of an algorithmic society.
