Content moderation in Reddit
Content moderation in Reddit is the process of filtering and removing posts or comments that do not conform to the rules of Reddit. Nearly all content moderation is done by volunteer moderators who enforce their subreddit’s rules in addition to the site-wide rules. [1] Reddit moderators have almost complete control over their subreddits. However, there have been several cases in which the Reddit administrators have stepped in to moderate, remove moderators, or even ban entire subreddits. Ethical issues include freedom of speech, bots, bias of algorithms and moderators, effects on volunteer moderators, lack of moderation by moderators, lack of moderation by administrators, and moderation abuse.
Overview
Reddit uses a highly decentralized model of moderation. Subreddit moderators, in general, have the freedom to enforce their own subreddit rules as they see fit, with administrators only becoming involved in extreme circumstances.
- AutoModerator is a bot built into Reddit that moderators can configure to enforce rules that would be tedious or repetitive to apply by hand, as well as to post recurring community discussion threads. It can detect and flag rule-breaking or inappropriate content on a subreddit (a brief sketch of this style of rule appears after this list). [2]
- User Reports are submitted by Reddit users through the report feature to bring potentially rule-breaking comments or posts to the attention of moderators, who review them and take action if necessary. These reports are anonymous. [3]
- Moderator Action is a moderator or moderation team removing content considered rule-breaking for their subreddit and issuing suspensions or bans if necessary.
- Administrator Action is the Reddit administrators taking action either because they have deemed that the moderators did not properly moderate their subreddit or because an entire subreddit is deemed to be breaking the site-wide content policy.
- Lack of Moderation is when either the moderators or the administrators fail to properly moderate their subreddits, leading to blatant rule violations and dangerous, even life-threatening, consequences.
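To give a concrete sense of how such automated rules work, below is a minimal Python sketch of the kind of keyword rule an AutoModerator-style bot might apply. The rule fields, keyword list, and function name are illustrative assumptions for this article, not Reddit's actual AutoModerator configuration (which moderators write in YAML).

import re

# Illustrative keyword rule in the spirit of AutoModerator. The field names
# ("title_keywords", "action", "removal_comment") are assumptions made for
# this sketch, not Reddit's real configuration schema.
SPAM_RULE = {
    "title_keywords": ["free giveaway", "click here", "buy now"],
    "action": "remove",
    "removal_comment": "Your post was removed automatically: spam-like title.",
}

def apply_rule(post_title, rule):
    """Return the rule's action if any keyword appears in the title, else None."""
    for keyword in rule["title_keywords"]:
        # Case-insensitive phrase match, the typical behavior of a keyword filter.
        if re.search(re.escape(keyword), post_title, re.IGNORECASE):
            return rule["action"]
    return None

print(apply_rule("FREE GIVEAWAY for the first 100 commenters!", SPAM_RULE))  # -> remove
print(apply_rule("Weekly discussion thread", SPAM_RULE))                     # -> None

A real AutoModerator setup layers many such rules, and a rule can take softer actions than removal, such as reporting a post for human review or leaving an explanatory comment.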
History
Reddit proclaims itself as the "front page of the internet".
2005 to 2010
June 2005: Reddit is created.
December 2005: Commenting is added.
March 2008: Subreddit creation is opened to users.
2011 to 2015
May 2011: Posting any personal information is banned.
April-May 2013: Boston Marathon bombing witch hunt.
October 2013: r/politics bans several websites.
September 2014: Reddit affirms freedom of speech in the shadow of "The Fappening".
June 2015: Criticism of CEO Ellen Pao begins.
June 2015: r/FatPeopleHate is banned.
August 2015: Many controversial subreddits are banned under a new content policy introduced by new CEO Steve Huffman.
2016 to Present
September 2016: Steve Huffman edits comments on r/The_Donald
November 2016: r/PizzaGate is banned.
February 2017: r/altright is banned.
February 2019: Chinese tech giant Tencent invests in Reddit, sparking concern over its influence on site moderation.
June 2019: r/The_Donald is quarantined.
February 2020: Reddit administrators remove several moderators from r/The_Donald.
Ethical Issues
The ethical issues involving content moderation in Reddit generally revolve around the actions, or lack thereof, of moderators. Bots are used to help with moderation, but most moderation decisions are subjective and therefore raise ethical issues.
Freedom of Speech
Freedom of speech has a long history as an issue on Reddit. In 2011, Reddit general manager Erik Martin stated, "We're a free speech site with very few exceptions (mostly personal info) and having to stomach occasional troll reddit like picsofdeadkids or morally questionable reddits like jailbait are part of the price of free speech on a site like this." [4]
Bias
- Algorithms
Algorithms can be incorrectly programmed, intentionally or not, and cause problems that may not be apparent until moderators or their communities notice them (a toy example follows this list).
- Moderators
Moderators are expected to be impartial in all matters, but their actions as moderators can be heavily influenced, often unknowingly, by their personal biases.
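As a toy illustration of how an automated rule can misfire without anyone noticing at first, the Python sketch below flags an innocuous post because a banned substring happens to appear inside longer words. The word list and function names are hypothetical, invented for this example rather than taken from any real subreddit's configuration.

import re

# Hypothetical ban list for a toy filter. Matching raw substrings instead of
# whole words is the (deliberate) bug this sketch demonstrates.
BANNED_TERMS = ["ass", "hell"]

def naive_filter(text):
    """Flag the text if any banned term appears anywhere, even inside a word."""
    lowered = text.lower()
    return any(term in lowered for term in BANNED_TERMS)

def whole_word_filter(text):
    """Flag the text only when a banned term appears as a standalone word."""
    return any(re.search(rf"\b{re.escape(term)}\b", text, re.IGNORECASE)
               for term in BANNED_TERMS)

post = "The class assignment on Hellenistic art is due Friday."
print(naive_filter(post))       # True  -- false positive on "class", "assignment", "Hellenistic"
print(whole_word_filter(post))  # False -- whole-word matching avoids the misfire

The same pattern applies to subtler problems: a filter tuned on one community's vocabulary can disproportionately flag dialects, non-English phrases, or terms used differently by another community.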
Effects on Volunteer Moderators
The psychological effects of reviewing or reading rule-breaking content cannot be overstated. Moreover, unlike on most websites, banned users on Reddit know which moderators banned them. This allows banned users to harass moderators over their actions and to make new accounts to keep up the harassment.
Lack of Moderation by Moderators
Some subreddits have moderators who take a hands-off approach to controlling their communities. Many subreddits have been quarantined in part because of this, but also because moderators have failed to uphold site-wide rules.
Lack of Moderation by Administrators
Many of the more controversial subreddits have been removed only after pressure from the media, which raises the question of whether they should have been banned earlier rather than only after bad press.
Moderation Abuse
With the decentralized model of subreddits, abusive moderators often remain in their positions despite complaints from users to the administrators or public posts about the moderator in question. Reddit rarely acts to remove troublesome moderators.