Abstract

This Article addresses a critical but underexplored aspect of content moderation: if a user’s online content or actions violate an Internet service’s rules, what should happen next? The longstanding expectation is that Internet services should remove violative content or accounts from their services as quickly as possible, and many laws mandate that result. However, Internet services have a wide range of other options—what I call “remedies”—they can use to redress content or accounts that violate the applicable rules. This Article describes dozens of remedies that Internet services have actually imposed. It then provides a normative framework to help Internet services and regulators navigate these remedial options to address the many difficult tradeoffs involved in content moderation. By moving past the binary remove-or-not remedy framework that dominates the current discourse about content moderation, this Article helps to improve the efficacy of content moderation, promote free expression, promote competition among Internet services, and improve Internet services’ community-building functions.
