The Use of Restorative Justice to Address the Rise in Content Moderation Disputes
By Elad Michael
In February 2022, Spotify apologized for the controversy surrounding Joe Rogan, who posted racially insensitive content on the company’s platform.[1] Spotify’s CEO commented that the company would consult outside advisors on those issues, a move reminiscent of Facebook’s introduction of its Oversight Board.[2] While the Board provides quasi-judicial review of content moderation issues submitted by Facebook or its users, it can also serve as a model for restorative justice in the content moderation area.[3]
The Spotify incident reflects a steady trend of demanding that platforms do better and take more content down.[4] This demand, coupled with companies’ failure to regulate harmful but not extreme content, drove users to find recourse in muting, unfollowing, and blocking the posting user.[5] Because of the technical and operational difficulties moderation raises, companies have hybridized existing models with user-led models that rest on users’ reactions and feedback.[6] Twitter, for instance, recently added a downvote feature reportedly aimed at monitoring low-quality content based on downvote volume.[7] Facebook introduced another feature that lays out an array of message templates for expressing objections and enables users to voice how a post makes them feel.[8] To some extent, these features embody a practice of restorative justice rooted in bringing injured parties to speak about their harm with wrongdoers, who listen and may present their own side of the story.[9]
Restorative justice usually involves direct communication between victims and offenders to provide a setting for acknowledgment of fault by the offender, restitution of some sort to the victim, including apologies and payments, and often new mutual understandings.[10] While restorative justice practices have proliferated in traditional, discrete crimes over decades,[11] they have only scarcely expanded into the corporate context, in the form of supplemental sentences for environmental violations[12] and non-trial resolutions in foreign bribery enforcement.[13] In these instances, corporations’ remedial actions included purchasing assets for the public benefit, supporting public infrastructure initiatives, donating money to charities, creating a foundation, and performing community service.[14] Yet restorative justice has not broken into content moderation practices, despite the potential of criminal liability for illegal content distribution.[15]
A framework that synthesizes restorative justice and content moderation might lie in Facebook’s Oversight Board model, which was established to review the company’s content moderation decisions. The Board has the power to allow or remove disputed content; to uphold or reverse a designation that led to an enforcement outcome, such as a determination that content depicts graphic violence and should therefore display a warning screen; and to grant additional resolutions or other technical remedies permitted by Facebook.[16] This authority opens the door to restoration through the remedy of an apology from the offender, who may acknowledge the conduct and its wrongfulness, to the victim, who may come to a deeper understanding of its causes and, potentially, to forgiveness.[17] Decisions are typically made within 90 days, a period sufficient to preserve the possibility of a substantive remedy by the time the request is decided.[18]
Although the process involves the posting user, the reporting user, and other individuals and groups, who submit written statements,[19] it does not provide participants with the opportunity to express their needs and shape the determination of the best way to seek reparation.[20] Nor does it charge the community with any responsibility to contribute to the process.[21] The Board further falls short of restoring users because it selects for review and decision only a narrow set of requests involving the most challenging content matters.[22] Thus, the vast majority of users are left without any recourse. Nonetheless, Facebook can redesign a more restorative justice-conscious framework with certain adaptations. For instance, it can adopt procedures such as victim-offender mediation, circle sentencing, and peacemaking circles.[23] Through such procedures, users can gain a stronger voice that goes beyond the factual scope of the dispute.
In the face of the reality that Facebook’s content moderation will never be perfect, the need for user-led moderation becomes clear.[24] This need calls for a restorative justice framework that, by its nature, explores options to rectify the damage with the injured party while achieving greater party satisfaction and lower recidivism rates.[25] Accordingly, companies that do not have effective content complaint intake systems in place can build on Facebook’s model with reference to restorative justice practices.
____________________________
[1] Jason Abbruzzese, Spotify CEO apologizes but backs Rogan after racial slur episodes are removed, NBC (Feb. 6, 2022, 10:32 PM), https://www.nbcnews.com/tech/tech-news/spotify-ceo-apologizes-backs-joe-rogan-rcna15106 [https://perma.cc/R4NM-PCWB].
[2] Id.
[3] Mark MacCarthy, The Facebook Oversight Board’s failed decision distracts from lasting social media regulation, Brookings (May 11, 2021), https://www.brookings.edu/blog/techtank/2021/05/11/the-facebook-oversight-boards-failed-decision-distracts-from-lasting-social-media-regulation/ [https://perma.cc/H4ZQ-PWJX] (noting that the Board sees itself not as an external review board dedicated to resolving disputes between Facebook and its users, but rather as a form of judicial review).
[4] Evelyn Douek, More Content Moderation Is Not Always Better, Wired (June 2, 2021, 8:00 AM), https://www.wired.com/story/more-content-moderation-not-always-better/ [https://perma.cc/MTU4-Q3L6].
[5] Ali Tehrani, Why User-Led Moderation Could Be The Answer To The Content Moderation Problem, Forbes (May 21, 2021, 8:20 AM), https://www.forbes.com/sites/forbestechcouncil/2021/05/21/why-user-led-moderation-could-be-the-answer-to-the-content-moderation-problem/?sh=56bb4aa34c4d [https://perma.cc/R3UC-DQWL].
[6] Id.
[7] Sarah Sarder, Twitter's new downvote feature is nothing like Reddit, YouTube. Here's why., USA Today (Feb. 4, 2022, 8:26 PM), https://www.usatoday.com/story/tech/2022/02/04/twitter-downvote-button-replies/6670356001/ [https://perma.cc/L946-QDTV].
[8] Katie Shonk, Dispute Resolution on Facebook: Using a Negotiation Approach to Resolve a Conflict, Disp. Resol. Harv. L. School (Jan. 25, 2022), https://www.pon.harvard.edu/daily/dispute-resolution/on-facebook-dispute-resolution-goes-live/ [https://perma.cc/63PH-WAC6].
[9] Andrew Brady Spalding, Restorative Justice for Multinational Corporations, 76 Ohio St. L.J. 357, 387 (2015).
[10] Carrie Menkel-Meadow, Restorative Justice: What Is It and Does It Work?, 3 Ann. Rev. L. & Soc. Sci. 161 (2007).
[11] Thalia Gonzalez, Reorienting Restorative Justice: Initiating a New Dialogue of Rights Consciousness, Community Empowerment and Politicization, 16 Cardozo J. Conflict Resol. 457, 474 (2015).
[12] Spalding, supra note 9, at 388–91.
[13] Samuel J. Hickey, Remediation in Foreign Bribery Settlements: The Foundations of a New Approach, 21 Chi. J. Int'l L. 367, 388, 390–91 (2021).
[14] Id. at 411; Spalding, supra note 9, at 389–90.
[15] Can a social media post result in criminal charges?, Nathan Akamine, https://www.akamine-law.com/can-a-social-media-post-could-result-in-criminal-charges/ [https://perma.cc/8J3A-RZM7] (last visited Feb. 14, 2022).
[16] Oversight Board Charter, Facebook 6, https://about.fb.com/wp-content/uploads/2019/09/oversight_board_charter.pdf [https://perma.cc/86H3-ES7F] (last visited Feb. 14, 2022).
[17] Spalding, supra note 9, at 387.
[18] Appeals process, Oversight Board, https://oversightboard.com/appeals-process/ [https://perma.cc/T95A-6C9N] (last visited Feb. 14, 2022).
[19] Oversight Board Charter, supra note 16, at 6.
[20] Spalding, supra note 9, at 385.
[21] Id.
[22] Id.; Facebook Oversight Board for Content Decisions: What to Know, Meta (Aug. 22, 2019), https://www.facebook.com/journalismproject/facebook-oversight-board-for-content-decisions-overview [https://perma.cc/WXD9-FRVZ].
[23] Spalding, supra note 9, at 387.
[24] Mark Zuckerberg, A Blueprint for Content Governance and Enforcement, Facebook (May 5, 2021), https://www.facebook.com/notes/751449002072082/ [https://perma.cc/FL9G-95DC].
[25] Menkel-Meadow, supra note 10, at 4.