Difference between revisions of "Talk:Content Moderation in Facebook"

 

Latest revision as of 01:34, 4 February 2023

The article definitely meets the minimum word requirement of 1,000 words and is structured in a way that lets readers dive deep into how Facebook moderates its users’ content and the ethics behind the process. The article also covers the three major components well: both the opening paragraph and the first paragraph of the “Overview/Background” section do a good job explaining what content moderation is, how Facebook organizes its content moderation system, and the ethical concerns that have arisen with it. The body of the article is split into many sections; these later sections also do a great job elaborating on the details mentioned in the opening paragraphs, such as how Facebook uses both AI moderators and human moderators to filter posts. Although the “Instances” section appears to be an essential addition, its subsection titles suggest a greater focus on unsuccessful instances of Facebook’s content moderation system than on successful ones. That makes me wonder whether there have been any successful instances, especially since the opening paragraph mentions that the tactics have often succeeded. Adding such successful instances is the only improvement I would suggest, as it would give readers a wider perspective on the topic.

Moreover, the “References” section shows that the article is backed by many reliable sources, which are embedded in the article in their respective spots. The ethical issues at stake in Facebook’s content moderation process are clear as well, as indicated in the opening paragraphs and by the subsections of the “Ethical Concerns” section. The “Mental Health & Unfair Treatment of Human Moderators” subsection in particular makes it easy to see why the issue is a major concern. The article reports on the ethical issues objectively, with no personal opinions when discussing them or in the article overall. Good work!

- Jennifer Lee