Advertising on Facebook
Advertising on Facebook gives businesses and organizations a chance to appear in users' News Feeds on Facebook's desktop and mobile apps. Advertising was integrated into Facebook's platform in 2007 as a way to monetize the service, and as a result Facebook turned cash-flow positive for the first time in 2009. Advertising remains Facebook's main source of revenue, earning the company US$16.6 billion in profit as of 2019. Facebook's advertising platform has developed into a prominent part of its information-sharing service, making ads an organic element of the News Feed itself. As the platform continues to develop, ethical issues regarding information transparency have raised concerns over how Facebook's large advertising scope affects marginalized communities and how its ad targeting system has blurred the line between relevant and invasive as it continues to gather information about its users.
Facebook ads are pages built by businesses and other organizations on the social network. Advertisers serve these pages straight into the News Feeds of Facebook users. As the pages appear, people can comment on them and "like" them. If people click the like button, the pages will continue to show up in their feeds and in the feeds of their Facebook friends. The ad content displayed to a user is determined through an ad targeting and matching process based on Facebook's inferred perception of the user's interests and behavior. Facebook infers users' interests and behaviors by collecting data through the following methods:
- Information from activity on Facebook and Facebook products includes any action taken on, or information entered into, the platform. This covers content engagement and the profile information of users and their friends, and it is used to determine users' interests in order to display relevant ad content. For example, if a bike shop wanted to reach female cyclists in Atlanta, Facebook could display its ad to women in Atlanta who liked a Page about bikes. Businesses do not know who these users are: Facebook provides advertisers with reports about the kinds of people seeing their ads and how the ads are performing, but it does not share information that personally identifies users.
- Information from advertisers shared with Facebook is customer information that advertisers bring to Facebook so they can reach their target audiences on the platform. This includes email addresses from past purchases or other data sources available to the advertiser. Facebook finds accounts that match the given data, but it does not tell the advertiser which accounts matched. In ad preferences, users can see which advertisers holding their contact information are currently running campaigns.
- Information sent to Facebook by websites and apps comes from the websites and apps users visit that use Facebook's business tools. For example, an online retailer using the Facebook Pixel can ask Facebook to show ads to people who looked at a certain style of shoe or put a pair of shoes into their shopping cart.
With this information, Facebook can infer personal attributes such as gender, behavior, interests, and location, which it uses to pair users with relevant ad content.
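The targeting process described above can be illustrated with a short sketch. This is a hypothetical simplification: the field names, audience criteria, and matching rules below are illustrative assumptions, not Facebook's actual system, which is not public.

```python
# Hypothetical sketch of interest-based ad matching. The profile fields and
# matching rules are made up for illustration; Facebook's real system is opaque.

def matches_audience(user, audience):
    """Return True if a user profile satisfies every targeting criterion."""
    if audience.get("location") and user["location"] != audience["location"]:
        return False
    if audience.get("gender") and user["gender"] != audience["gender"]:
        return False
    # The ad matches if the user shares at least one inferred interest,
    # or if the advertiser specified no interest criteria at all.
    required = audience.get("interests", set())
    return not required or bool(required & user["interests"])

user = {
    "location": "Atlanta",
    "gender": "female",
    # Interests inferred from liked Pages, engagement, and profile data.
    "interests": {"cycling", "running"},
}
# The bike-shop example from the text: women in Atlanta interested in cycling.
bike_shop_ad = {"location": "Atlanta", "gender": "female", "interests": {"cycling"}}

print(matches_audience(user, bike_shop_ad))  # True
```

Note that in this sketch, as in Facebook's description of its system, the advertiser supplies only the audience criteria and never sees which individual users matched.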
Users cannot opt out of receiving ad content because, according to the company, ads are what keep Facebook free to users. Instead, Facebook users can control their ad experience through the Ad Preferences menu, where they can edit Facebook's generated interests and categories, hide ads from particular advertisers, and turn off data sharing with outside advertisers.
Revenue from Advertisers
The majority of Facebook's revenue comes from advertising. As of 2017, Facebook earns US$20.21 per user from advertising, with mobile advertising accounting for most of that revenue. Facebook introduced its mobile advertising model in 2012, modeled after the Vickrey-Clarke-Groves auction. Prices for advertising follow a variable pricing model based on ad auction bids and the potential engagement level of the advertisement itself. An advertiser might bid $2 to place an ad in a particular situation; Facebook then weighs that bid against how relevant the ad is to the situation. If an ad is relevant, the advertiser need not bid as high to win the auction. In addition, Facebook considers how relevant an ad is next to all other content, choosing ads based on how well they compete for attention against organic posts. As of June 2014, mobile accounted for 62% of advertising revenue, an increase of 21% from the previous year.
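The relevance-weighted auction described above can be sketched in a few lines. This is a deliberately simplified, hypothetical single-slot model: the scoring formula (bid times relevance) and the second-price-style payment rule are illustrative stand-ins for Facebook's actual, unpublished formulas.

```python
# Simplified, hypothetical single-slot ad auction in the spirit of the
# relevance-weighted auctions described above. The scoring and pricing
# rules are illustrative assumptions, not Facebook's actual formulas.

def run_auction(ads):
    """ads: list of (name, bid_dollars, relevance in [0, 1]) tuples."""
    # Rank by bid weighted by relevance, so a cheaper but more relevant
    # ad can beat a higher bid -- the advertiser "need not bid as high".
    ranked = sorted(ads, key=lambda a: a[1] * a[2], reverse=True)
    winner = ranked[0]
    if len(ranked) == 1:
        return winner[0], winner[1]
    runner_up = ranked[1]
    # Second-price style: the winner pays just enough that its weighted
    # score would still beat the runner-up's weighted score.
    price = runner_up[1] * runner_up[2] / winner[2]
    return winner[0], round(price, 2)

ads = [
    ("high_bid_low_relevance", 2.00, 0.3),   # weighted score 0.60
    ("low_bid_high_relevance", 1.00, 0.9),   # weighted score 0.90
]
print(run_auction(ads))  # ('low_bid_high_relevance', 0.67)
```

Here the $1 bid with high relevance beats the $2 bid with low relevance, and the winner pays less than its own bid, mirroring the incentive structure the text describes.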
Facebook has been criticized by users for keeping its ad system's workings private and downplaying the amount of user data it collects for advertising. In 2018, journalist Louise Matsakis criticized a blog post by Facebook's vice president for ads, Rob Goldman, about the data Facebook shares with advertisers, because it left out the realities of what Facebook can do with users' data. The only information Facebook provides on its advertising methods is an infographic located under a subdomain, which critics consider too brief to cover the complexities of an advanced ad system. Matsakis also pointed out Facebook's lack of a definition for a "relevant" ad and the dangers this could pose for Facebook's large user base: "A 'relevant' ad to a marketer might target a specific personality type, or perceived emotional state. It might also be designed to take advantage of an already vulnerable population. That can quickly get a lot more involved than just people who like bikes." In his blog post, Goldman defended Facebook's ad system by implying users have always had the ability to control their ad experience under ad preferences. However, Facebook has been accused of designing its user interface to make these settings difficult to find so that it can keep collecting user information. This is deemed unethical in terms of information transparency, as Facebook remains opaque about the data it shares and how users can control their data.
Users have felt that targeted ads on Facebook overreach certain personal boundaries, owing to its large collection of personal information and fewer ad restrictions than other popular social media sites. This includes allowing advertisers to target users based on sensitive interests, including political beliefs, sexuality, and religion, categories protected under European data protection law. A joint investigation found that Facebook's platform had made sensitive inferences about users, allowing advertisers to target people based on inferred interests including communism, social democracy, Hinduism, and Christianity, all of which would be classed as sensitive personal data under EU rules. The criticism stems from Facebook not asking users whether they consent to being categorized by sensitive labels. Facebook argues otherwise, claiming that the information it gathers about people's affinities and interests, even when it entails sensitive categories such as sexuality and religion, is not personal data.
Targeted ads have been labeled "creepy" by some Facebook users. Specific and intimate targeted ads act as a reminder of how much data Facebook stores about its users and how much it knows about them, data that is sold to third-party advertisers. This suggests a lack of user control over the ownership of personal information, which can be seen as a violation of privacy and personal identity. Facebook further endangers the privacy of users by creating tools to predict future behavior from the data it already has and using those predictions for advertising purposes. For example, Facebook's AI division has turned to several different machine learning techniques, among them "gradient boosted decision trees," or GBDT, which, according to sources, is used for advertising purposes.
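To make the named technique concrete, the following is a minimal, self-contained illustration of gradient boosting over decision stumps for squared loss. The toy data and feature (treating a count of Page likes as a predictor of click rate) are invented for illustration; production systems use far richer features and tree depths.

```python
# Minimal gradient-boosted decision stumps for squared loss, illustrating
# the "gradient boosted decision trees" (GBDT) technique named above.
# The dataset and feature are hypothetical, not real advertising data.

def fit_stump(xs, residuals):
    """Find the single threshold split minimizing squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def fit_gbdt(xs, ys, n_rounds=20, lr=0.3):
    """Repeatedly fit stumps to the residuals (the negative gradient
    of squared loss) of the running prediction."""
    base = sum(ys) / len(ys)
    stumps, preds = [], [base] * len(ys)
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + sum(lr * s(x) for s in stumps)

# Toy example: x = number of liked Pages in a category, y = observed click rate.
xs = [0, 1, 2, 3, 8, 9, 10, 11]
ys = [0.0, 0.0, 0.1, 0.1, 0.8, 0.9, 0.9, 1.0]
model = fit_gbdt(xs, ys)
print(round(model(1), 2), round(model(10), 2))  # low vs. high predicted rate
```

Each boosting round fits a weak learner to what the current ensemble still gets wrong, which is why such models can extract predictive signal from large behavioral datasets.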
Some research studies have pointed out how Facebook's ad algorithm perpetuates gender biases, including showing certain jobs more often to men than women and vice versa. One study led by the University of Southern California discovered that Facebook's algorithm was more likely to present a specific job ad to a user if the user's gender identity aligns with the concentration of that gender in the job's industry. For example, the study revealed that a Domino's ad recruiting delivery drivers would more likely be shown to men, while an ad recruiting shoppers for a grocery-delivery service would more likely be shown to women. This imbalance also applies to higher-skill jobs: Facebook's ad algorithm would more likely show an ad for a technical job at Netflix to a woman, because Netflix has a relatively high level of female employment compared to other companies in the technical industry, while it would more likely show an ad for a job at Nvidia Corp to a man, because that company already employs a higher percentage of men than women. This shows how Facebook's algorithm can create a self-fulfilling prophecy; the USC study describes Facebook as "a platform whose algorithm learns and perpetuates the existing difference in employee demographics". Although Facebook says it is working to find and eliminate these discriminatory biases from its algorithm, many people think the ad-targeting system simply does not work. Additionally, many consider it problematic that Facebook's algorithms are kept secret within the company, with little transparency about how Facebook designs them to personalize ads for specific individuals.
Studies have also shown that Facebook's ad algorithms can sustain racial biases. Facebook researchers found that Instagram users whose activity suggested they were Black were 50% more likely to have their accounts automatically disabled by the moderation system than users whose activity suggested they were white. Employees allege that Facebook management has frequently set this type of research aside; when asked, Facebook said it told researchers to stop looking into racial bias because the methodology they used was flawed. Several employees have said they quit their jobs at Facebook because they wanted to create social change and help the platform fight racism, only to be ignored by those in leadership roles. While Facebook says it is now creating new teams to study racial bias on the platform, many believe Facebook management is not taking issues of racial discrimination seriously enough.
- Rob Goldman, "Hard Questions: What Information Do Facebook Advertisers Know About Me?", Facebook, April 2018. Retrieved March 2020.
- Martina Frascona Sochurkova, "Facebook’s revenue exceeded $55 billion in 2018", NewsFeed.org, July 2019. Retrieved March 2020.
- Audrey Schomer, "Facebook's ad revenue growth slowed in Q2 — here's where its future growth will come from", Business Insider, July 2019. Retrieved March 2020.
- Andrew Perrin, "Share of U.S. adults using social media, including Facebook", PewResearch.org, February 2019. Retrieved April 2019.
- John Gramlich, "Facts about Americans and Facebook", PewResearch.org, April 2018. Retrieved May 2019.
- BBC News, "Facebook flush with advertising money", bbc.com. Retrieved July 2017.
- Maddy Osman, "Steps to Conduct Deep Facebook Analysis", Sprout Social. Retrieved August 2020.
- Cade Metz, "How Facebook’s Ad System Works", New York Times, October 2017. Retrieved March 2020.
- Facebook, "About Facebook Ads", Facebook. Retrieved March 2020.
- Daniel Malloy, "Too Big Not to Fail?", Ozy, May 2019. Retrieved March 2020.
- Cade Metz, "Facebook Doesn't Make as Much Money as It Could—On Purpose", Wired, September 2015. Retrieved March 2020.
- Lewis DVorkin, "Inside Forbes: Mobile Part II, Or 4 More Charts That Offer a Peek Into the Future of Journalism", Forbes, July 2014. Retrieved March 2020.
- Louise Matsakis, "Facebook's Targeted Ads Are More Complex Than It Lets On", Wired, April 2018. Retrieved March 2020.
- Natasha Lomas, "Facebook faces fresh criticism over ad targeting of sensitive interests", TechCrunch, May 2018. Retrieved March 2020.
- Rebecca Jennings, "Why targeted ads are the most brutal owns", Vox, September 2018. Retrieved March 2020.
- Sam Biddle, "Facebook Uses Artificial Intelligence to Predict Your Future Actions for Advertisers, Says Confidential Document", The Intercept, April 2018. Retrieved March 2020.
- Karen Hao, "Facebook's Ad Algorithms Are Still Excluding Women from Seeing Jobs", MIT Technology Review, April 2021, https://technologyreview.com/2021/04/09/1022217/facebook-ad-algorithm-sex-discrimination/.
- Jeff Horwitz, "Facebook Algorithm Shows Gender Bias in Job Ads, Study Finds", The Wall Street Journal, April 2021, https://wsj.com/articles/facebook-shows-men-and-women-different-job-ads-study-finds-11617969600.
- Sam Biddle, "Research Says Facebook's Ad Algorithm Perpetuates Gender Bias", The Intercept, April 2021, https://theintercept.com/2021/04/09/facebook-algorithm-gender-discrimination/.
- Olivia Solon, "Facebook Ignored Racial Bias Research, Employees Say", NBC News, July 2020, https://nbcnews.com/tech/tech-news/facebook-management-ignored-internal-research-showing-racial-bias-current-former-n1234746.