A Guide to Content Moderation for Policymakers

21 May 2024

Content moderation refers to the policies and practices that companies use to express their own preferences and to create the kind of online space that best serves their interests. Government policies that interfere with these content decisions not only harm the rights of private actors but are also likely to cause harmful unintended consequences and chill innovation. While prominent social media platforms may be biased and imperfect, the government cannot solve these problems and will only make them worse.

Policymakers worldwide are increasingly advancing policies related to content moderation. From the left come efforts to stop hate speech and misinformation, as seen in New York's Online Hate Speech Law and the European Union's Digital Services Act. From the right come efforts to force social media companies to host content from certain political speakers or viewpoints, as seen in legislation in Texas and Florida. Despite the intensity of these concerns, some of which may be valid, efforts to regulate content moderation often reflect a lack of understanding of how content moderation works.


David Inserra

Published in
United States of America