Content Moderation in Facebook

 
[[File:fb-moderators.png|500px|thumb|right|Workers at a Facebook content moderation center]]
 
  
Content moderation is the process of screening content that users post online by applying a set of pre-set rules or guidelines to determine whether it is appropriate.<ref>Roberts, S. T. (2017). Content moderation. In ''Encyclopedia of Big Data''. UCLA. https://escholarship.org/uc/item/7371c1hf</ref> Facebook is a social media platform that allows people to connect with friends, family, and communities of people who share common interests(2). Like many other popular social media platforms, Facebook has developed an approach to moderating and controlling the type of content users see and engage with. Facebook takes two main approaches to content moderation, employing both AI moderators and human moderators. Facebook moderates its content through its Community Standards, which lay out the rules each post must follow(3). There have been many instances where Facebook's content moderation tactics have succeeded, but also many where they have failed. The ethics of Facebook's approach have also been widely controversial, from the mental health struggles human moderators are forced to endure to questions about how the AI is trained to flag inappropriate content.
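
To make the two-tier approach concrete, the sketch below shows how a platform might combine an automated classifier with escalation to a human review queue. This is a minimal, hypothetical illustration: the rule set, scoring function, thresholds, and all names are invented for this example and do not describe Facebook's actual systems.

<syntaxhighlight lang="python">
# Hypothetical two-tier moderation pipeline: an automated check runs first,
# and uncertain cases are escalated to human moderators. Everything here
# (rules, scores, thresholds) is illustrative, not Facebook's real system.
from dataclasses import dataclass

BANNED_PHRASES = {"buy followers", "graphic violence"}  # toy rule set

@dataclass
class Decision:
    action: str   # "allow", "remove", or "human_review"
    reason: str

def ai_score(post: str) -> float:
    """Stand-in for a trained classifier: returns the estimated probability
    that a post violates the rules (here, a crude keyword heuristic)."""
    return 0.9 if any(p in post.lower() for p in BANNED_PHRASES) else 0.1

def moderate(post: str, remove_at: float = 0.8, review_at: float = 0.5) -> Decision:
    score = ai_score(post)
    if score >= remove_at:    # confident violation: remove automatically
        return Decision("remove", f"score {score:.2f} >= {remove_at}")
    if score >= review_at:    # uncertain: send to a human review queue
        return Decision("human_review", f"score {score:.2f} in gray zone")
    return Decision("allow", f"score {score:.2f} < {review_at}")

print(moderate("Come buy followers here!"))   # -> action='remove'
print(moderate("Happy birthday, Grandma!"))   # -> action='allow'
</syntaxhighlight>

In a design like this, the gray-zone threshold controls the trade-off between automation and human workload: lowering it routes more borderline posts to human moderators, which raises accuracy on hard cases at the cost of reviewer time.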
  
 
==Overview/Background==



==Notable Instances==

==Ethical Concerns==

==Human Moderators==

==Conclusion==

==References==
<references/>