Emotion Recognition Algorithms (ERAs)

Emotion recognition algorithms (ERAs) or emotion artificial intelligence (EAIs) are a form of technology that recognizes, infers, and harvests "emotions using data sources such as social media behavior, streaming service use, voice, facial expressions, and biometrics" [1]. ERAs are artificial intelligence technologies used in many different situations, such as employment [2]. These algorithms can be useful in identifying information pertaining to different contexts, but they may also be invasive to individuals who feel uncomfortable being closely observed [1]. The history of emotion recognition draws on literature about the importance of emotions, such as Charles Darwin's The Expression of the Emotions in Man and Animals [3], and on the work of Paul Ekman [4] and Wallace V. Friesen [5], who identified distinct but salient human emotions that came to be used throughout emotion recognition technologies. As technology has advanced over time, ERAs have been implemented into everyday life, including social media. Discussions have emerged on the benefits and harms of emotion recognition technologies and the implications they can have for technology users.

History

Understanding human emotions is crucial to understanding emotion recognition technologies. Charles Darwin, in 1872, described the importance of human emotion in his book, The Expression of the Emotions in Man and Animals [3]. The roots of emotion recognition can be traced back to Paul Ekman [4] and Wallace V. Friesen [5], who, in 1975, identified what they found to be "basic" and "universal" emotions: anger, disgust, fear, joy, sadness, and surprise [6]. As technology has advanced, emotion recognition technologies such as emotion artificial intelligence algorithms have emerged, drawing heavily on Ekman's work. These technologies are iteratively redesigned to improve how they function [7]. Major companies now use emotion recognition technologies to make decisions and curate content [8].

ERAs on Social Media

Companies such as Google [9] have begun to use emotion recognition technologies, protected by patents [10], to analyze user data through emotions and then curate content for users. Companies that gather emotion recognition data from their users control how that data is used, a situation referred to as information asymmetry [11]. Emotion recognition technology can also help determine the state of users' wellbeing; for instance, emotion recognition algorithms on Facebook can identify individuals in states of deep distress when they post about their emotions and then curate helpful and supportive content for those individuals [12]. Emotion recognition on social media can, overall, enhance the user experience by optimizing the content and interactions exposed to users based on emotion recognition data [13].
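
As a rough illustration of how such text-based emotion inference might work, the following sketch scores a post against a hand-written emotion lexicon and returns the best-matching label from Ekman's six basic emotions. It is a minimal, hypothetical example: the lexicon, the scoring rule, and the function names are assumptions made for illustration, not the approach used by Facebook, Google, or the cited constrained-optimization method [13].

# A minimal, hypothetical sketch of mapping social media post text to one of
# Ekman's six basic emotions. The lexicon and scoring rule are illustrative
# assumptions, not any platform's actual system.
from collections import Counter

# Toy emotion lexicon (hypothetical); real systems learn these associations
# from large labeled datasets rather than hand-written word lists.
EMOTION_LEXICON = {
    "anger":    {"furious", "angry", "hate", "outraged"},
    "disgust":  {"gross", "disgusting", "awful"},
    "fear":     {"scared", "afraid", "terrified", "worried"},
    "joy":      {"happy", "great", "love", "excited"},
    "sadness":  {"sad", "lonely", "hopeless", "crying"},
    "surprise": {"wow", "unexpected", "shocked"},
}

def infer_emotion(post_text: str) -> str:
    """Return the emotion whose lexicon words appear most often in the post."""
    words = [w.strip(".,!?") for w in post_text.lower().split()]
    scores = Counter({emotion: sum(w in vocab for w in words)
                      for emotion, vocab in EMOTION_LEXICON.items()})
    if not any(scores.values()):
        return "neutral"          # no emotion words found
    return scores.most_common(1)[0][0]

print(infer_emotion("I feel so lonely and hopeless today"))   # sadness
print(infer_emotion("Wow, I love this and I am so excited"))  # joy

Production systems combine many more signals than individual words, but the basic pattern of turning user-generated content into an emotion label is the same.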

Implications of ERAs

Benefits of ERAs

ERAs can be used in a plethora of ways. Emotion recognition can provide social media users with curated content that enhances their experience, such as protecting users' wellbeing by tracking statements pertaining to negative mental and emotional states [12] and improving interactions on interfaces through curation and design [13]. Much of the discussion surrounding the benefits of ERAs pertains to the detection of mood instability and wellbeing [14]. Being able to detect negative wellbeing can help users improve their mental and emotional state and, if necessary, find help and resources for those in need, as in the sketch below.
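
Building on the classifier sketch above, a wellbeing-oriented feature might aggregate the emotion labels assigned to a user's recent posts and surface supportive resources once distress-related labels pass some threshold. The sketch below is purely illustrative: the label set, the threshold, and the function name are assumptions, not any platform's actual policy.

# A hypothetical sketch of flagging a user for supportive resources based on
# the emotion labels of their recent posts. The labels treated as "distress",
# the threshold, and the function name are illustrative assumptions.
from typing import List

DISTRESS_LABELS = {"sadness", "fear"}   # assumed subset of emotion labels
DISTRESS_THRESHOLD = 3                  # hypothetical cutoff

def should_surface_resources(recent_labels: List[str]) -> bool:
    """Flag the user when enough recent posts carry distress-related labels."""
    distress_count = sum(label in DISTRESS_LABELS for label in recent_labels)
    return distress_count >= DISTRESS_THRESHOLD

# Example: labels produced by an upstream emotion classifier for recent posts.
recent = ["joy", "sadness", "sadness", "neutral", "fear"]
if should_surface_resources(recent):
    print("Show supportive content and links to mental health resources.")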

Harms of ERAs

Although emotion recognition technologies can help identify potentially negative situations, ERAs come with harms, as do all other technologies. Emotion recognition often requires artificial intelligence bias mitigation methods; for example, auditing an artificial intelligence system and highlighting disparities in its data can influence companies to make changes [15]. Bias can become integrated into the technology itself by its creators, intentionally or unintentionally, and it comes in many forms, such as bias by age [16], race [17], and gender [17]. Inaccurate interpretations of an individual's emotions cannot be generalized to larger populations, and acting on inaccurate interpretations can harm users by presenting them with incorrect and potentially harmful information.
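
One simple form of the auditing described above is to compare a classifier's accuracy across demographic groups and report the gap between the best- and worst-served groups. The sketch below is a minimal, hypothetical illustration of that step; the group names and records are invented, and real audits such as Raji and Buolamwini's actionable auditing [15] are far more extensive.

# A minimal, hypothetical bias-audit step: per-group accuracy and the gap
# between the best- and worst-served groups. Group names and records are
# invented for illustration only.
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def accuracy_by_group(records: Iterable[Tuple[str, str, str]]) -> Dict[str, float]:
    """records holds (group, true_label, predicted_label) triples."""
    correct, total = defaultdict(int), defaultdict(int)
    for group, true_label, predicted in records:
        total[group] += 1
        correct[group] += int(true_label == predicted)
    return {group: correct[group] / total[group] for group in total}

# Hypothetical audit records: (age group, true emotion, predicted emotion).
audit_records = [
    ("young", "joy", "joy"), ("young", "sadness", "sadness"),
    ("older", "joy", "surprise"), ("older", "sadness", "sadness"),
]
per_group = accuracy_by_group(audit_records)
gap = max(per_group.values()) - min(per_group.values())
print(per_group, f"accuracy gap: {gap:.2f}")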

See also

References

  1. Andalibi, Nazanin and Buss, Justin. (2020). The Human in Emotion Recognition on Social Media: Attitudes, Outcomes, Risks. CHI 2020.
  2. Robert, L. P., Pierce, C., Marquis, L., Kim, S., & Alahmad, R. (2020). Designing fair AI for managing employees in organizations: a review, critique, and design agenda. Human–Computer Interaction, 35(5-6), 545-575.
  3. Darwin, Charles. (2009). The Expression of the Emotions in Man and Animals, Anniversary Edition (4th ed.). Oxford University Press.
  4. https://en.wikipedia.org/wiki/Paul_Ekman
  5. https://en.wikipedia.org/wiki/Wallace_V._Friesen
  6. Ekman, Paul and Friesen, Wallace V. (1975). Unmasking the Face: A Guide to Recognizing Emotions from Facial Cues.
  7. Fasel, Beat and Luettin, Juergen. (January 2003). Automatic facial expression analysis: a survey. Pattern Recognition 36, Issue 1. https://doi.org/10.1016/S0031-3203(02)00052-3
  8. https://patents.google.com/patent/WO2017078960A1/en
  9. Agrawal, Puneet. (2017). Emotionally connected responses from a digital assistant. Retrieved June 26, 2019 from https://patents.google.com/patent/WO2017078960A1/en
  10. https://en.wikipedia.org/wiki/Patent
  11. https://en.wikipedia.org/wiki/Information_asymmetry
  12. https://engineering.fb.com/2018/02/21/ml-applications/under-the-hood-suicide-prevention-tools-powered-by-ai/
  13. Wang, Yichen and Pal, Aditya. (2015). Detecting Emotions in Social Media: A Constrained Optimization Approach. In IJCAI.
  14. Saha, Koustuv, Chan, Larry, De Barbaro, Kaya, Abowd, Gregory D., and De Choudhury, Munmun. (2017). Inferring Mood Instability on Social Media by Leveraging Ecological Momentary Assessments. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 3: 95:1–95:27. https://doi.org/10.1145/3130960
  15. Raji, Inioluwa D. and Buolamwini, Joy. (2019). Actionable Auditing. Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society. https://doi.org/10.1145/3306618.3314244
  16. Kim, E., Bryant, D. A., Srikanth, D., & Howard, A. (2021, July). Age bias in emotion detection: an analysis of facial emotion recognition performance on young, middle-aged, and older adults. In Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society (pp. 638-644).
  17. Klare, B. F., Burge, M. J., Klontz, J. C., Vorder Bruegge, R. W., and Jain, A. K. (2012). Face Recognition Performance: Role of Demographic Information. IEEE Transactions on Information Forensics and Security 7 (2012), 1789–1801, Issue 6. https://doi.org/10.1109/TIFS.2012.2214212