How can we build a fairer, safer internet?
From misinformation to hate speech, moderating content on social media platforms requires a tremendously tricky balance between liberty and safety.
The harms are real: More than a third of adults report having been subjected to online harassment. False rumors and misinformation spread on platforms have led to mob violence and cast elections into doubt. Conspiracy theories that spread online endanger everyone, fueling radicalization and mass violence.
Companies like Facebook, Twitter, and Google currently bear sole responsibility for deciding what can or cannot stay up. AI-based algorithms speed up the process, but who decides the rules they enforce? Human moderators can be hired, but at what cost to their mental health?
The internet has made our world more connected than ever before, but we now risk losing it to the tragedy of the “digital commons”.
Join a “Digital Jury”
We are researchers building a new system for deciding how content moderation rules are formed. Like serving jury duty, you will join a panel of “netizens” to decide hard questions about content moderation using real cases.
Sorry, we are not taking participants right now.
Thank you for participating in this experiment! Our results and final paper have been accepted for publication at ACM CHI (Conference on Human Factors in Computing Systems), April 2020.
Abstract
As concerns have grown regarding harmful content spread on social media, platform mechanisms for content moderation have become increasingly significant. However, many existing platform governance structures lack formal processes for democratic participation by users of the platform. Drawing inspiration from constitutional jury trials in many legal systems, this paper proposes digital juries as a civics-oriented approach for adjudicating content moderation cases. Building on existing theoretical models of jury decision-making, we outline a 5-stage model characterizing the space of design considerations in a digital jury process. We implement two examples of jury designs involving blind-voting and deliberation. From users who participate in our jury implementations, we gather informed judgments of the democratic legitimacy of a jury process for content moderation. We find that digital juries are perceived as more procedurally just than existing common platform moderation practices, but also find disagreement over whether jury decisions should be enforced or used as recommendations.
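To make the blind-voting design concrete, here is a minimal sketch of how a jury verdict might be aggregated: jurors vote independently (without seeing one another's votes), and a simple majority decides the case. This is an illustration only; the `Vote` enum, the `blind_vote_verdict` function, and the majority rule are our assumptions, not the exact procedure specified in the paper.

```python
from collections import Counter
from enum import Enum

class Vote(Enum):
    KEEP = "keep"      # content stays up
    REMOVE = "remove"  # content is taken down

def blind_vote_verdict(votes: list[Vote]) -> Vote | None:
    """Aggregate independent (blind) juror votes by simple majority.

    Returns None on a tie, which a platform might treat as "no
    verdict" or escalate to a deliberation stage instead.
    """
    tally = Counter(votes)
    keep, remove = tally[Vote.KEEP], tally[Vote.REMOVE]
    if keep == remove:
        return None
    return Vote.KEEP if keep > remove else Vote.REMOVE

# Example: a five-person jury adjudicating one moderation case.
jury = [Vote.KEEP, Vote.REMOVE, Vote.REMOVE, Vote.REMOVE, Vote.KEEP]
print(blind_vote_verdict(jury))  # Vote.REMOVE
```

The deliberation variant studied in the paper would differ in that jurors discuss the case before any decision is reached, rather than voting in isolation.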
Presentation
The video recording of the presentation, delivered remotely during the CambridgeCHI presentations on May 14, 2020, is below.