Automatic gender recognition

From SI410

Revision as of 17:49, 12 March 2020

Automatic gender recognition (AGR) is the use of computer algorithms to attempt to identify a person’s gender based on pictures, video, or audio recordings. As a relatively new form of artificial intelligence, little research has been done on its societal implications when put into use.

Background

Automatic gender recognition builds on other forms of machine learning, such as facial recognition and computer vision, to infer gender: the system looks for key features such as facial hair or the tone of a person’s voice and compares them to data in which gender has been correctly labeled[1].
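The comparison step described above, matching extracted features against correctly labeled examples, can be sketched as a toy nearest-neighbor classifier. This is a minimal illustration only; the feature names, values, and labels below are hypothetical and not taken from any real AGR system:

```python
import math

# Hypothetical labeled examples: (voice_pitch_hz, facial_hair_score) -> label.
# A real system would extract many more features from images or audio.
LABELED_DATA = [
    ((210.0, 0.05), "female"),
    ((120.0, 0.80), "male"),
    ((225.0, 0.02), "female"),
    ((110.0, 0.60), "male"),
]

def infer_gender(features):
    """Return the label of the closest labeled example (1-nearest neighbor)."""
    return min(LABELED_DATA,
               key=lambda ex: math.dist(ex[0], features))[1]
```

This also illustrates why such systems misgender people: any input is forced onto whichever binary label its features happen to sit closest to.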

Much like other forms of artificial intelligence, automatic gender recognition is increasingly used to try to improve the user experience of things like shopping and interactive devices, using gender to help build a persona of what a user might like or want.

Targeted Marketing

One current use of automatic gender recognition is targeted marketing. If a company or platform can determine the gender of a user, it can serve that user advertisements deemed popular with that gender. For example, at a restaurant in Norway, an advertising display crashed and revealed code that showed advertisements for salad to women and for pizza to men[2]. This type of targeted advertising is also common on social media, where platforms often ask users for their gender to fill out a profile but then use that data to serve them gender-specific advertisements[3].
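The logic revealed by the crashed Norwegian display amounts to a lookup keyed on the recognized gender. A hedged sketch of that pattern, with a hypothetical ad mapping and fallback, might look like:

```python
# Hypothetical mapping from recognized gender to ad, loosely modeled on
# the behavior reported for the Norwegian restaurant billboard[2].
ADS_BY_GENDER = {"female": "salad promotion", "male": "pizza promotion"}
DEFAULT_AD = "general promotion"  # shown when recognition fails or is ambiguous

def pick_ad(recognized_gender):
    """Select an advertisement based on the gender label the AGR system emits."""
    return ADS_BY_GENDER.get(recognized_gender, DEFAULT_AD)
```

Note that the ad chosen depends entirely on the label the recognition system produces, so any recognition error propagates directly into what the person is shown.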

Interactive/Adaptable Devices

Ethical Issues

As with many new technologies, automatic gender recognition raises a number of ethical issues, largely centered on the consequences of misgendering a user.

Misgendering

To misgender a person is to identify them as a gender that does not match their gender identity, usually by way of pronouns (CITATION OF WIKTIONARY PAGE). For example, if a user identifies as male but is referred to as “she,” that user is being misgendered.

Transgender people

People of color

See Also

References

  1. Hamidi, Foad, and Stacy M. Branham. “Gender Recognition or Gender Reductionism? The Social Implications of Embedded Gender Recognition Systems.” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, 1 Apr. 2018, dl.acm.org/doi/10.1145/3173574.3173582.
  2. Turton, William. “A Restaurant Used Facial Recognition to Show Men Ads for Pizza and Women Ads for Salad.” The Outline, 12 May 2017, theoutline.com/post/1528/this-pizza-billboard-used-facial-recognition-tech-to-show-women-ads-for-salad?zd=1&zi=kcmponzy.
  3. Bivens, Rena, and Oliver L. Haimson. “Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers.” Social Media + Society, Oct. 2016, doi:10.1177/2056305116672486.