Automatic gender recognition

From SI410
Revision as of 16:49, 12 March 2020

Automatic gender recognition (AGR) is the use of computer algorithms to attempt to identify a person's gender from pictures, video, or audio recordings. Because it is a relatively new form of artificial intelligence, little research has been done on its implications for society when it is put into use.

Background

Automatic gender recognition builds on other forms of machine learning, such as facial recognition and computer vision. A system infers gender by looking for key features, such as facial hair or the tone of a person's voice, and comparing them to data in which gender has been correctly labeled[1].
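The comparison step described above can be sketched as a minimal nearest-neighbor classifier: a new feature vector is matched against labeled examples and assigned the label of the closest one. The feature names (voice pitch, jaw width) and all numeric values below are invented for illustration and are not drawn from any real AGR system.

```python
import math

# Illustrative labeled examples: (voice pitch in Hz, jaw width in cm) -> label.
# These numbers are made up for this sketch, not real training data.
LABELED_DATA = [
    ((120.0, 12.0), "male"),
    ((210.0, 10.5), "female"),
    ((130.0, 11.8), "male"),
    ((200.0, 10.8), "female"),
]

def predict_gender(features):
    """Return the label of the closest labeled example (1-nearest neighbor)."""
    _, label = min(
        LABELED_DATA,
        key=lambda item: math.dist(item[0], features),  # Euclidean distance
    )
    return label

print(predict_gender((125.0, 11.9)))  # closest examples are labeled "male"
```

Real systems use far richer features and learned models rather than raw distance, but the underlying logic is the same: the prediction can only reflect the labels present in the training data, which is the root of the misgendering concerns discussed below.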

Much like other forms of artificial intelligence, automatic gender recognition is increasingly used to try to improve the user experience of services such as shopping platforms and interactive devices, using gender to help build a persona of what a user might like or want.
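As a sketch of how an inferred gender label might feed such a persona, the lookup below selects promoted categories from a predicted label. The mapping and category names are hypothetical, chosen only to illustrate the mechanism (and its reductive assumptions); they do not come from any real advertising system.

```python
# Hypothetical mapping from an inferred gender label to promoted categories.
# The categories are invented for this sketch.
PERSONA_ADS = {
    "male": ["razors", "sports gear"],
    "female": ["cosmetics", "handbags"],
}

def targeted_ads(predicted_gender):
    # Fall back to a generic persona when the label is missing or unrecognized.
    return PERSONA_ADS.get(predicted_gender, ["general offers"])

print(targeted_ads("female"))   # ['cosmetics', 'handbags']
print(targeted_ads("unknown"))  # ['general offers']
```

Note that a hard two-category mapping like this is exactly the reductionism critiqued in the reference below: anyone the classifier mislabels, or who falls outside its binary labels, receives a persona that does not describe them.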

Targeted Marketing

Interactive/Adaptable Devices

Ethical Issues

Misgendering

Transgender people

People of color

See Also

    Computer Vision: https://en.wikipedia.org/wiki/Computer_vision
    Facial Recognition: https://en.wikipedia.org/wiki/Facial_recognition_system
    Gender: https://en.wikipedia.org/wiki/Gender
    Transgender: https://en.wikipedia.org/wiki/Transgender

References

  1. Hamidi, Foad, and Stacy M. Branham. “Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems.” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, Apr. 2018, dl.acm.org/doi/10.1145/3173574.3173582.