
Dispute Resolution and Content Moderation: Fair, Accountable, Independent, Transparent, and Effective

14 Jan 2020

This paper offers suggestions for how to improve one specific aspect of content moderation: resolving disputes over takedowns. Dispute resolution may seem like a narrow area, but it encapsulates the legitimacy problems behind content moderation decisions. The mechanisms behind those decisions appear arbitrary to outside observers; the decisions are subject to no public scrutiny or accountability; and appeals can only be directed to the companies themselves, if appeals mechanisms exist at all. This paper examines the problem of legitimacy in content moderation decisions and proposes new institutions to rebalance the private and public interests in resolving disputes. David Kaye, the U.N. Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, has noted that appeal mechanisms and remedies are “limited or untimely to the point of nonexistence.”7 He has recommended that “[g]iven their impact on the public sphere,” social media companies “open themselves up to public accountability.”8 Our recommendations thus contribute to a broader conversation about rebalancing the relationship between companies, users, and governments, often referred to as platform governance.9 Following the particular concerns of the Transatlantic Working Group, we seek solutions that safeguard freedom of expression.
Topics

dispute resolution, content regulation

Authors

Heidi Tworek, Ronan Ó Fathaigh, Lisanne Bruggeman, Chris Tenove

Published in
Netherlands