Content Moderation on Facebook


Content moderation is the practice of reviewing user-generated content and removing or restricting material that violates a platform's rules. Facebook is a social media platform that allows people to connect with friends, family, and communities of people who share common interests (1). Like many other popular social media platforms, Facebook has developed an approach to moderate and control the type of content users see. Facebook relies on two main approaches to content moderation: AI moderators and human moderators. Both enforce Facebook's community standards, which define what content is and is not acceptable (2). Facebook's content moderation tactics have succeeded in many instances but have also failed in many others. The ethics of Facebook's approach have likewise been widely debated, from the mental health struggles that human moderators are left to deal with to the way the AI is trained to flag inappropriate content.
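As a rough illustration of this hybrid setup, the Python sketch below scores each post against a rule list, removes content the automated check is confident about, and escalates ambiguous cases to a human moderator. It is a minimal toy example, not Facebook's actual system: the banned-term list, thresholds, and function names are all hypothetical, and a real AI moderator would use a trained classifier rather than keyword matching.

 from dataclasses import dataclass
 
 # Hypothetical rule list standing in for a platform's community standards.
 BANNED_TERMS = {"spam-link", "hate-term"}
 
 @dataclass
 class Decision:
     action: str  # "remove", "allow", or "human_review"
     reason: str
 
 def score_post(text: str) -> float:
     """Toy stand-in for an AI moderator: the fraction of tokens
     that match a banned term. A real system would use a trained
     classifier instead of keyword matching."""
     tokens = text.lower().split()
     if not tokens:
         return 0.0
     return sum(t in BANNED_TERMS for t in tokens) / len(tokens)
 
 def moderate(text: str, remove_threshold: float = 0.5,
              review_threshold: float = 0.1) -> Decision:
     """Hybrid pipeline: confident cases are handled automatically,
     ambiguous ones are escalated to a human moderator."""
     score = score_post(text)
     if score >= remove_threshold:
         return Decision("remove", f"score {score:.2f} >= {remove_threshold}")
     if score >= review_threshold:
         return Decision("human_review", f"score {score:.2f} is ambiguous")
     return Decision("allow", f"score {score:.2f} below thresholds")
 
 if __name__ == "__main__":
     for post in ["hello friends",
                  "hate-term hate-term",
                  "check this spam-link out today"]:
         print(post, "->", moderate(post))

In this sketch the first post is allowed, the second is removed automatically, and the third falls into the ambiguous middle band and is routed to human review, mirroring the division of labor between AI and human moderators described above.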

Overview/Background

Notable Instances

Ethical Concerns

Human Moderators

Conclusion

Notes