The Privacy Paradox at the Heart of Content Moderation
This seminar introduces students to the tension between a comprehensive and responsive content moderation scheme and user privacy. It also exposes students to the world of “Trust & Safety,” a burgeoning field that may fit the career interests of students already intrigued by privacy law.
SEMINAR DESCRIPTION & GOALS
Content moderation has become an increasingly controversial and necessary aspect
of platform governance. Trust & Safety teams within social media platforms
have developed increasingly sophisticated technical and procedural mechanisms to
enforce their platforms’ community standards. Efforts to improve the enforcement
of such standards, though, often require the collection of more data from users.
The trade-off between robust content moderation and respect for user privacy has
received inadequate attention from platforms, policymakers, and legal scholars.
This seminar provides students with the tools to assess the trade-offs between
content moderation and user privacy. Students will gain enough familiarity with the
technical aspects of content moderation to “speak engineer,” enough knowledge of
content moderation adjudication systems to recommend reforms that respect user
privacy while furthering the platform’s aims, and enough familiarity with applicable
content moderation laws and policies to theorize and advocate for new regulations.
Pre-Reading Materials
Required: Gillespie, Tarleton. 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. New Haven: Yale University Press.
Recommended: Suzor, Nicolas. 2019. Lawless: The Secret Rules That Govern Our Digital Lives. Cambridge: Cambridge University Press.
Other readings provided via PDF.
Assessment
Technical quiz and short-answer exam (20%). Students will complete a multiple-choice exam on the ins and outs of content moderation, specifically its technical aspects. Students will also answer several short-answer questions about the challenges of conducting content moderation at scale.
Privacy Assessment of a Content Moderation Intervention (80%). Students will select a content moderation intervention from the Prosocial Design Network database and perform a privacy assessment on that intervention. The assessment must include the following aspects:
- A brief overview of the technical feasibility of the intervention
- An analysis of the privacy concerns raised by that intervention
- A series of recommendations for how best to mitigate those concerns
- A recommendation for whether platforms should adopt the intervention
The assessment must be at least 2,500 words and may not exceed 3,500 words. We will discuss this assignment in more detail during the last class.
This book, and all H2O books, are Creative Commons licensed for sharing and re-use. Material included from the American Law Institute is reproduced with permission and is exempted from the open license.