Automatic gender recognition

Automatic gender recognition (AGR) is the use of computer algorithms to attempt to identify a person’s gender based on pictures, video, or audio recordings. Although a relatively new form of artificial intelligence, it has been lauded for its potential to improve consumer marketing and interactive devices. At the same time, AGR has received a great deal of skepticism, as the algorithmic methods it relies on have a studied history of misidentifying and misgendering users, especially those from already marginalized communities such as transgender and non-binary people and people of color. There have also been concerns about user privacy, compounded by the fact that in most applications users cannot know how AGR has gendered them or for what reasons. These issues have made AGR one of the most highly contested and controversial forms of artificial intelligence to date.

Background

Automatic gender recognition works by using other forms of machine learning, such as facial recognition and computer vision, to infer gender: the system looks for key features, such as facial hair or the tone of a person’s voice, and compares them to data in which gender has been correctly labeled[1].
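
To make this supervised-learning pattern concrete, the following is a minimal, hypothetical sketch in Python using scikit-learn. The feature names, values, and labels are fabricated for illustration; real AGR systems extract far richer features from images or audio, but the structure, fitting a classifier to pre-labeled examples and then predicting a label for a new user, is the same.

  # A toy classifier trained on fabricated, pre-labeled examples.
  import numpy as np
  from sklearn.linear_model import LogisticRegression

  # Hypothetical features: [facial-hair score, voice pitch in Hz]
  X_train = np.array([[0.8, 110.0], [0.1, 210.0], [0.7, 125.0], [0.2, 195.0]])
  y_train = np.array(["male", "female", "male", "female"])  # labeled data

  model = LogisticRegression().fit(X_train, y_train)

  # A new user is forced into one of the two labels seen during training.
  print(model.predict(np.array([[0.4, 160.0]])))

Note that a classifier like this can only ever output a label it saw in its training data, a limitation that becomes important in the ethical discussion below.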

Much like other forms of artificial intelligence, automatic gender recognition is increasingly used to try to improve the user experience of services such as shopping and interactive devices, with the inferred gender helping to build up a persona of what a user might like or want.

Targeted Marketing

One of the ways automatic gender recognition is currently used is targeted marketing. If a company or platform can determine the gender of a user, it can use that information to serve advertisements deemed popular with that gender. For example, a digital advertisement in a Norwegian restaurant crashed and revealed its code, showing that it had been displaying advertisements for salad to women and for pizza to men[2]. This type of targeted advertising is often used on social media as well, where platforms ask users for their gender to fill out their profiles but also use that data to serve them specific advertisements[3].
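
The leaked code from the Norwegian billboard is not reproduced in the source, but the decision rule it implies is simple enough to sketch. The function and file names below are invented for illustration; the point is only that a single classifier output can silently switch which advertisement a person sees.

  # Hypothetical reconstruction of gender-conditional ad selection; this is
  # not the actual leaked code, only the kind of rule the incident revealed.
  def pick_ad(inferred_gender: str) -> str:
      """Choose an ad based solely on a classifier's gender guess."""
      ads = {"male": "pizza_promo.png", "female": "salad_promo.png"}
      # Anyone the classifier cannot fit into its two labels gets a default.
      return ads.get(inferred_gender, "generic_promo.png")

  print(pick_ad("female"))  # salad_promo.png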

Ethical Issues

As with many new technologies, automatic gender recognition raises a number of ethical issues, largely centered on user privacy and the misgendering of users.

Misgendering

To misgender a person is to identify them as a gender that does not match their gender identity, usually by way of pronouns. For example, a user who identifies as male but is referred to as “she” is being misgendered.

Transgender and non-binary people

Images: Trans Pride Flag; Non-Binary Pride Flag

Automatic gender recognition specifically places transgender (trans) people, those whose gender identity differs from the sex they were assigned at birth, and non-binary people, those whose gender identity exists outside of the gender binary, at risk.

The gender binary is the classification of gender as either male or female, and the belief that the sex a person is assigned at birth will align with the societally accepted corresponding gender identity, e.g. a person born with male genitalia will identify as a man[4].

Although inclusivity of all genders has increased in recent years, many companies and platforms still follow only the gender binary, including most applications of automatic gender recognition. This leaves trans users at a much higher risk of being misgendered, and if a user’s gender does not exist within the given options, as is the case for non-binary users, they are guaranteed to be misgendered.
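
This structural point, that a binary label set misgenders some users by construction rather than by accident, can be shown in a few lines. The snippet below is an illustration invented for this article, not code from any real AGR system.

  # A binary classifier's label set can never contain "non-binary",
  # so any user outside the set is misgendered no matter how well the
  # model performs; accuracy improvements cannot fix this.
  BINARY_LABELS = {"male", "female"}

  def necessarily_misgendered(gender_identity: str) -> bool:
      return gender_identity not in BINARY_LABELS

  for identity in ["male", "female", "non-binary", "genderfluid"]:
      print(identity, "->", necessarily_misgendered(identity))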

Studies have shown that misgendering of trans people increases feelings of social exclusion and oppression, creating an overall negative effect on their mental and physical well-being[5].

Where trans and non-binary people previously had to battle being misgendered only in person, they now must also combat it online, including in cases where they may not even be able to tell how they are being gendered, as with advertising.

People of color

Certain components of automatic gender recognition, such as computer vision and facial recognition, have been found to frequently mislabel the gender of people of color, especially women of color. One study showed that lighter-skinned individuals were labeled correctly more often than darker-skinned individuals, and that males were labeled correctly more often than females[6]. None of the services tested provided gender options outside the gender binary. In the same study, the darker a woman’s skin tone, the closer her chance of being correctly gendered came to a coin toss[7]. This type of gender bias in facial recognition perpetuates harmful racial stereotypes that darker-skinned people are more masculine.
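
The kind of intersectional audit Gender Shades performs, computing accuracy separately for each skin-tone and gender subgroup rather than reporting a single overall number, can be sketched as follows. The records here are fabricated placeholders, not the study’s data; the point is the grouping logic that exposes disparities an aggregate accuracy figure would hide.

  # Per-subgroup accuracy audit on fabricated records:
  # each record is (skin_tone, gender, was_classified_correctly).
  from collections import defaultdict

  results = [
      ("lighter", "male", True), ("lighter", "male", True),
      ("lighter", "female", True), ("lighter", "female", False),
      ("darker", "male", True), ("darker", "male", False),
      ("darker", "female", False), ("darker", "female", False),
  ]

  totals = defaultdict(lambda: [0, 0])  # subgroup -> [correct, total]
  for tone, gender, correct in results:
      totals[(tone, gender)][0] += int(correct)
      totals[(tone, gender)][1] += 1

  for (tone, gender), (correct, total) in sorted(totals.items()):
      print(f"{tone} {gender}: {correct / total:.0%} accuracy")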

Privacy

Another major concern regarding automatic gender recognition is privacy. Many users turn to the internet to escape aspects of their offline lives, and having a computer assume their gender can be a harmful reinforcement of the very thing they may be trying to escape. This type of “informational privacy” is what philosophy professor David Shoemaker describes as the “control over the access to and presentation of information about one’s self-identity.”[8] Not having this kind of control over what information a given algorithm knows about you can be considered a breach of that privacy, and can be dangerous for users.

References

  1. Hamidi, Foad, and Stacy M. Branham. “Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems.” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, 1 Apr. 2018, dl.acm.org/doi/10.1145/3173574.3173582.
  2. Turton, William. “A Restaurant Used Facial Recognition to Show Men Ads for Pizza and Women Ads for Salad.” The Outline, 12 May 2017, theoutline.com/post/1528/this-pizza-billboard-used-facial-recognition-tech-to-show-women-ads-for-salad?zd=1&zi=kcmponzy.
  3. Bivens, Rena, and Oliver L. Haimson. “Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers.” Social Media + Society, Oct. 2016, doi:10.1177/2056305116672486.
  4. Newman, Tim. “Sex and Gender: Meanings, Definition, Identity, and Expression.” Medical News Today, MediLexicon International, 7 Feb. 2018, www.medicalnewstoday.com/articles/232363.
  5. Hamidi, Foad, and Stacy M. Branham. “Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems.” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, 1 Apr. 2018, dl.acm.org/doi/10.1145/3173574.3173582.
  6. Buolamwini, Joy. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Gender Shades, 2018, gendershades.org.
  7. Buolamwini, Joy. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Gender Shades, 2018, gendershades.org.
  8. Shoemaker, David W. “Self-Exposure and Exposure of the Self: Informational Privacy and the Presentation of Identity.” Ethics and Information Technology, vol. 12, no. 1, 2009, pp. 3–15., doi:10.1007/s10676-009-9186-x.