Content moderation in Reddit

From SI410
Revision as of 19:22, 27 March 2020

Content moderation on Reddit is unique in the social media industry. Reddit is governed using a model similar to a democracy, in which users share responsibility for the content posted and for deciding what is and is not acceptable. The first layer of rules consists of the rules specific to an individual community, known as a subreddit. Subreddits are run by volunteer moderators who write and enforce rules customized for each community, wholly without direction or commands from Reddit. The second layer is Reddit's Content Policy[link], which is enforced both by subreddit moderators and by Reddit administrators. These site-wide rules include bans on harassment, involuntary pornography, promoting violence, sharing personal information (doxxing)[link], and other behavior deemed unacceptable for Reddit. Ethical issues include freedom of speech; bots; bias from algorithms, moderators, and administrators; the effects on volunteer moderators; the decentralized model of moderation; lack of moderation by moderators and administrators; and abuse by moderators and administrators.


Current Logo of Reddit

Overview

Unlike other social media websites, Reddit uses a highly decentralized model of moderation which, according to its official Transparency Report, "is the most scalable solution we’ve seen to the challenges of moderating content online."[link] Moderators have the freedom to enforce their subreddit's rules as they see fit, but they must also enforce Reddit's Content Policy. This model of moderation can be understood in five parts:

  • Auto-Moderator is a bot built into Reddit that moderators can use to enforce rules that are tedious or repetitive to apply by hand, as well as to create recurring community discussion posts. It can flag, detect, and/or remove rule-breaking or inappropriate content on a subreddit. Auto-Moderator and similar bots are often the first line of moderation and help keep subreddits free of spam and the most obvious rule-breaking content. Approximately 2,767,257,085 pieces of content have been removed from Reddit by bots.[1]
  • User Reports occur when a user uses Reddit's report feature to bring possibly rule-breaking content to the attention of moderators, who review it and take action if necessary. These reports are anonymous. If inappropriate content gets past Auto-Moderator and no reports are filed, it may remain visible on a subreddit for an unknown amount of time until a report is filed or a moderator notices and removes it. [2]
  • Moderator Action is a moderator or moderation team removing content considered rule-breaking for their subreddit and issuing suspensions or bans if necessary. This process can be slow, as reports wait in a queue to be reviewed and moderators are not online at all times to check it. This type of moderation makes up the vast majority of moderation action on Reddit, with 84,140,588 instances of content removal.
  • Administrator Action is Reddit administrators taking moderation action against content considered to be in violation of the site-wide Content Policy, or against subreddit moderators who have not been enforcing it. This type of moderation is a far smaller portion of moderation actions, with 222,309 instances of content removal. Reddit administrators also have the ability to quarantine a community, preventing its content from appearing anywhere but on the subreddit itself.
  • No Moderation is the absence of any moderation whatsoever. It occurs most often on abandoned subreddits where moderators are inactive or fail to moderate their subreddit according to the site-wide Content Policy.
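The layered model above can be sketched as a simple pipeline. This is purely illustrative: all rule names, phrases, and thresholds here are hypothetical, and it does not reflect Reddit's actual implementation (real Auto-Moderator rules are configured by moderators, not hard-coded).

```python
# Illustrative sketch of Reddit's layered moderation model.
# All rules and thresholds are hypothetical examples, not Reddit's actual code.

BANNED_PHRASES = {"spam-link.example", "buy followers"}  # hypothetical automod rules


def auto_moderator(post: str) -> bool:
    """First layer: a bot removes content matching simple, repetitive rules."""
    return any(phrase in post.lower() for phrase in BANNED_PHRASES)


def needs_moderator_review(report_count: int, threshold: int = 3) -> bool:
    """Second layer: anonymous user reports queue content for human review
    once enough of them accumulate."""
    return report_count >= threshold


def moderate(post: str, report_count: int = 0) -> str:
    """Run a post through the layers: automod first, then report-driven
    human (moderator) review; otherwise the post stays visible."""
    if auto_moderator(post):
        return "removed_by_bot"
    if needs_moderator_review(report_count):
        return "queued_for_moderators"
    return "visible"
```

In this toy version, a post that trips no automated rule and draws no reports simply remains visible, mirroring the "No Moderation" case described above.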

Notable Instances of Moderation in Reddit

Since its founding in 2005, Reddit has had several notable instances of content moderation, often in reaction to current events, incidents, or the prevailing climate.

2005 to 2010

June 2005: Reddit is created.

December 2005: Comments are introduced.

March 2008: Subreddit creation is opened to users.

2011 to 2015

May 2011: Posting any personal information is prohibited.

October 2011: r/jailbait is banned after a CNN report questioning whether its content constituted child pornography.

April-May 2013: In the aftermath of the Boston Marathon bombing, several Reddit users attempt to identify the bomber, fueling a witch hunt that wrongly targeted Sunil Tripathi, who was found dead by suicide. Reddit later apologized and promised to enforce its rule on personal information more strictly to prevent a recurrence. [3]

October 2013: r/politics bans submissions from several websites to raise the quality of content on the subreddit. [4]

September 2014: Reddit affirms its commitment to free speech after banning r/TheFappening, a subreddit featuring iCloud leaks of celebrity nudes.

June 2015: Criticism of CEO Ellen Pao begins.

June 2015: Reddit bans five subreddits under its harassment policy: r/fatpeoplehate, r/hamplanethatred, r/neofag, r/transfags, and r/shitniggerssay.

August 2015: Many controversial subreddits are banned under a new policy introduced by new CEO Steve Huffman.

2016 to Present

November 2016: r/Pizzagate is banned for doxxing.

November 2016: CEO Steve Huffman edits user comments on r/The_Donald.

February 2017: r/altright is banned.

February 2019: Chinese tech giant Tencent invests in Reddit, sparking concern over its influence on site moderation.

June 2019: r/The_Donald is quarantined for promoting violence against police and other public officials.

February 2020: Reddit administrators remove several moderators from r/The_Donald.

Ethical Issues

The ethical issues involving content moderation on Reddit generally revolve around the actions, or inaction, of moderators. Bots help with moderation, but the majority of moderation is subjective and thus raises ethical issues.

Freedom of Speech

Freedom of speech has a long history on Reddit. In 2011, Reddit general manager Erik Martin stated, "We're a free speech site with very few exceptions (mostly personal info) and having to stomach occasional troll reddit like picsofdeadkids or morally questionable reddits like jailbait are part of the price of free speech on a site like this." [5]

Bias

  • Algorithms

Algorithms can be programmed incorrectly, intentionally or not, causing issues that may not become apparent to moderators until they or their communities notice a problem.

  • Moderators

Moderators are expected to be impartial in all matters, but their actions as moderators can be heavily influenced by their personal biases, often unknowingly.

Effects on Volunteer Moderators

The psychological toll of reviewing rule-breaking content cannot be overstated. Moreover, unlike on most websites, banned users on Reddit know which moderator banned them. This allows banned users to harass moderators over their actions and to create new accounts to continue the harassment.

Lack of Moderation by Moderators

Some subreddits have moderators who take a hands-off approach to controlling their communities. Many subreddits have been quarantined partly because of this and partly because their moderators failed to uphold site-wide rules.

Lack of Moderation by Administrators

Many of the more controversial subreddits have been removed only after pressure from the media, which raises the question of whether they should have been banned earlier rather than only after bad press.

Moderation Abuse

Under the decentralized model of subreddits, abusive moderators can remain in their positions despite user complaints to administrators and public posts about the moderator in question. Reddit rarely acts to remove troublesome moderators.

Report Bombing

See Also

References

  1. [1]
  2. [2]