Gender bias in Wikipedia


Wikipedia, the online encyclopedia, makes knowledge sharing easy and accessible to a wide audience. Because anyone can access and edit Wikipedia with little training, the site offers a neutral setting in which to contribute one's personal expertise to a collective knowledge base. However, among other biases, a stark gender bias is visible in both Wikipedia's content and its editing culture.

Evidence of the Gender Gap

Differences in Wikipedia Editors

According to a study performed by Wikipedia, 15% of Wikipedia contributors in the United States are women, and only 13% of contributors worldwide are. [1][2] Other studies have found the proportion of editors identifying as female to be as high as 23%, though moderators and contributors remain predominantly male. [3] This lack of female voices can result in the erasure of women's perspectives in Wikipedia's content. Wikipedia has also struggled to retain women editors and contributors because of complications in its culture: when few women contribute, both the editable content and the editor culture can become hostile toward women, potentially driving away other female editors. [4]

Content Imbalances

Wikipedia's content reflects the perspectives and interests of its editors, and female and male editors differ in the interests, preferences, and topics on which they choose to spend their time. With a smaller share of contributions coming from women, attention to topics tied to minority perspectives can be imbalanced and unevenly reported. [5] Biographies of notable women are one area where Wikipedia lacks representation equal to that of their male counterparts. In 2014, the Wikimedia Foundation, Wikipedia's parent organization, evaluated all biographies on the English Wikipedia and found that only 15% were about women, across all fields of study and profession. [6] Many initiatives aim to combat this imbalance by organizing the mass editing and creation of biographies of women across many fields. Because many readers treat Wikipedia as an unbiased representation of knowledge, the lack of content on topics specific to women reveals an implicit bias in how Wikipedia and its editors perceive and prioritize these topics. [7]

Women-specific Characterization

Wikipedia pages of notable women are more likely than those of men to include gender-specific characterization through both language and links to other Wikipedia pages. A study investigating gender bias in Wikipedia through language used natural language processing to find that the words most associated with male Wikipedia profiles were mainly about sports, while the words most associated with female Wikipedia profiles were mainly about the arts, gender, or family. Other common themes on the pages of notable women are references to a spouse, primarily husbands, and to being "the first woman" in her field.[8] These patterns highlight the prevalence of stereotyping on Wikipedia. Women's pages are often characterized at one of two extremes: norm-breaking, as the exception to a common narrative, or existing within common, predefined roles. This dichotomy leaves little room for robust knowledge about successful women beyond those two narratives.[8]
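
The cited analysis is not reproduced here, but a minimal sketch of the kind of word-association measure such a study relies on, assuming a simple smoothed log-odds comparison over two toy sets of biography snippets (the sample sentences and scoring are illustrative only, not the study's actual method or data), might look like this:

    # Toy sketch: smoothed log-odds of word frequencies between biography
    # snippets about men and about women. Positive scores lean toward the
    # "male" snippets, negative scores toward the "female" snippets.
    from collections import Counter
    import math
    import re

    male_bios = [
        "He played professional football and won the league championship.",
        "He coached the national team and set a scoring record.",
    ]
    female_bios = [
        "She was the first woman elected to the council and raised a family.",
        "She exhibited her paintings and wrote about gender and art.",
    ]

    def word_counts(texts):
        """Lowercase, tokenize on letters, and count word occurrences."""
        counts = Counter()
        for text in texts:
            counts.update(re.findall(r"[a-z']+", text.lower()))
        return counts

    male_counts = word_counts(male_bios)
    female_counts = word_counts(female_bios)
    vocab = set(male_counts) | set(female_counts)
    male_total = sum(male_counts.values())
    female_total = sum(female_counts.values())

    def log_odds(word):
        """Add-one smoothed log ratio of a word's rate in each corpus."""
        p_male = (male_counts[word] + 1) / (male_total + len(vocab))
        p_female = (female_counts[word] + 1) / (female_total + len(vocab))
        return math.log(p_male / p_female)

    # Print the vocabulary from most "female-associated" to most "male-associated".
    for word in sorted(vocab, key=log_odds):
        print(f"{word:12s} {log_odds(word):+.2f}")

On a real corpus, sports terms would surface with strongly positive scores and family- or gender-related terms with strongly negative ones, which is the shape of the imbalance the study reports; the study itself analyzes the full text and link structure of actual Wikipedia biographies rather than toy sentences.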

Why Don't More Women Edit Wikipedia?

Foundational Issues in Wikipedia

The online encyclopedia Wikipedia was originally founded to reflect a culture that encourages honest, diplomatic thought and a neutral point of view. [9] Its foundational structure allows the editing of nearly any page with little policing, though a select group of editors keeps a close watch on popular, well-visited pages. Because Wikipedia was meant to reflect the neutrality of an encyclopedia, one must also look at the foundations of encyclopedias themselves. Encyclopedias were originally developed to create a collective, foundational body of knowledge among educated men. [10] The internet, likewise, emerged at a time when the intersection of male-dominated pursuits - government, the military, academia, and engineering - sat at the center of its culture.[11]

This argument raises the question: for whom was this knowledge-sharing space created, and who has inhabited it the longest? If encyclopedias and Wikipedia were created with a male audience in mind, the content and culture of the site will carry those values and reflect that bias. Bias against women can take transparent forms, like blatant online harassment, or opaque forms, like the lack of content regarding notable women. [12]

Harassment Online

  • “In Wikimedia’s 2011 survey, more than half of editors reported getting into an argument with other editors on discussion pages; 12 percent of female editors reported someone leaving inappropriate messages about or for them.” [13]
  • “…when women express anger, they tend to be penalized more than men would be; when they assert themselves, they face more backlash; and they tend to be judged more harshly for their mistakes.” [1]

Policing in a Male-Dominated Space

  • "these gatekeepers apply their own judgment and prejudices,” [14]
  • “As a website built on the idea that anyone can access and add information, user bias is ingrained within its self-policing wiki model.” [15]
  • "Governance on the site relies heavily on community-generated social norms, which are articulated in artifacts of governance called “policy.” ... "norms are a more powerful governing mechanism for supporting collective action than rules and laws becomes complicated when the rules are explicitly intended to reflect norms." [16]

Survey Responses

  • “Women reported feeling less confident about their expertise, less comfortable with editing others’ work (a process which often involves conflict), and reacting more negatively to critical feedback than men.” [1]
  • “In Gardner’s informal survey of women who use Wikipedia, she found that many women found the interface complicated. Others lacked the time or confidence in their expertise to write for a community site, or found the site’s culture too sexualized, misogynistic, and aggressive.” [17]

Responsive Measures

To close the gender gap, Wikipedia needs a broader population of editors to widen the topics and knowledge covered on its site. [18] This is no easy task. To attract more women contributors and editors, Wikipedia could rely on both internal and external efforts.

Internally, Wikipedia's current self-policing system could be altered to maintain a welcoming space for women editors. This could be accomplished by implementing a fuller security system that gauges a contributor's expertise, monitors for harassment or vandalism, and encourages training in incorporating multiple perspectives into knowledge sharing. [1]

While explicit language and personal perspectives come from editors themselves, the gender bias found on Wikipedia also reflects bias in the secondary sources that Wikipedia relies on.[19] Because external, independent sources in other fields of academia and research under-cover notable women and women-related topics, Wikipedia's coverage is correspondingly thin. To account for this difference, other fields also need to recognize and address their own gender biases, so that Wikipedia editors have a richer body of research to draw from.

Example Efforts from the Wikimedia Foundation

Other External Efforts

  • Wikid GRRLS: This project teaches online and research skills and encourages teenage girls in the Detroit area to participate in online discussion.

References

  1. Nicole Torres “Why Do So Few Women Edit Wikipedia?” (Harvard Business Review, Gender, Jun 02 2016)
  2. Noam Cohen “Define Gender Gap? Look Up Wikipedia’s Contributor List” (New York Times, Media, Jan 30 2011)
  3. Aysha Khan “The slow and steady battle to close Wikipedia’s dangerous gender gap” (Think Progress, Dec 15 2016)
  4. Emma Paling “Wikipedia’s Hostility to Women”, (The Atlantic, Technology, Oct 21 2015)
  5. Julia Bear, Benjamin Collier "Where are the Women in Wikipedia? Understanding the Different Psychological Experiences of Men and Women in Wikipedia", (Sex Roles: A Journal of Research, (2016) 74: 254)
  6. Katherine Maher "Wikipedia is a mirror of the world’s gender biases", (Wikimedia Foundation, News, Oct 18 2018)
  7. Joseph Reagle, Lauren Rhue "Gender Bias in Wikipedia and Britannica", (International Journal of Communication, 5 (2011), 1138–1158)
  8. Eduardo Graells-Garrido et al. "First Women, Second Sex: Gender Bias in Wikipedia" (Cornell University, Computer Science, Social and Information Networks, 2015)
  9. Emma Paling “Wikipedia’s Hostility to Women”, (The Atlantic, Technology, Oct 21 2015)
  10. Emma Paling “Wikipedia’s Hostility to Women”, (The Atlantic, Technology, Oct 21 2015)
  11. Aysha Khan “The slow and steady battle to close Wikipedia’s dangerous gender gap” (Think Progress, Dec 15 2016)
  12. Philip Brey "Values in technology and disclosive computer ethics" (2010)
  13. Aysha Khan “The slow and steady battle to close Wikipedia’s dangerous gender gap” (Think Progress, Dec 15 2016)
  14. Monica Hesse “History has a massive gender bias. We’ll settle for fixing Wikipedia.” (The Washington Post, Perspective, Feb 17 2019)
  15. Aysha Khan “The slow and steady battle to close Wikipedia’s dangerous gender gap” (Think Progress, Dec 15 2016)
  16. Andrea Forte, Vanesa Larco & Amy Bruckman "Decentralization in Wikipedia Governance" (Journal of Management Information Systems, 26:1, 49-72, 2009)
  17. Aysha Khan “The slow and steady battle to close Wikipedia’s dangerous gender gap” (Think Progress, Dec 15 2016)
  18. Katherine Maher “Wikipedia mirrors the world’s gender biases, it doesn’t cause them”, (The Los Angeles Times, Op-Ed, Oct 18 2018)
  19. Eduardo Graells-Garrido et al. "First Women, Second Sex: Gender Bias in Wikipedia" (Cornell University, Computer Science, Social and Information Networks, 2015)