Microsoft chatbots

Microsoft's conversational AI technology

Microsoft Corporation is heavily invested in chatbots, chat-based interfaces, and conversational AI, as CEO Satya Nadella believes they will soon become the primary way people use the internet.[1] Microsoft has developed chatbots such as the virtual assistant Cortana[2] as well as the Bot Framework, a development platform for building custom chatbots[3]. This page, however, focuses on Microsoft's efforts to build a family of conversational AI products with emotional intelligence around the globe.
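To give a sense of the kind of programming interface the Bot Framework exposes, below is a minimal echo bot sketched with Microsoft's open-source botbuilder SDK for Python. This is an illustrative sketch only (the surrounding web server and adapter wiring are omitted), not code from any of the products discussed on this page.

    # A minimal Bot Framework message handler (botbuilder SDK for Python).
    # pip install botbuilder-core
    from botbuilder.core import ActivityHandler, TurnContext

    class EchoBot(ActivityHandler):
        # Called once per incoming user message.
        async def on_message_activity(self, turn_context: TurnContext):
            # Echo the user's text back as the bot's reply.
            await turn_context.send_activity(f"You said: {turn_context.activity.text}")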

Background and history

The first conversational AI product with emotional intelligence capabilities developed by Microsoft was Xiaoice, launched in China in 2014. After much success, Microsoft developed Rinna in Japan in 2015. Next, Tay was released in the United States in 2016, though the release did not last long: Tay quickly began to tweet racist and otherwise offensive phrases and was shut down. The incident was a major learning experience for Microsoft and prompted a complete revamping of its conversational AI technologies.[4] As a result, Zo was created in the United States later in 2016 as Tay's successor. Ruuh followed in India in 2017, as did a separate Rinna in Indonesia. Only Xiaoice, Rinna (Japan), and Rinna (Indonesia) are still available today.[5]

Xiaoice (China, 2014)

Xiaoice Avatar

Xiaoice is Microsoft's most successful conversational AI product with emotional intelligence. Released in China in May 2014, she now has well over 660 million online users worldwide. Xiaoice offers a wide range of creative features, from writing poems and songs to painting. Microsoft has continued to release new versions of Xiaoice with expanding feature sets; the most recent is the seventh generation, released in August 2019.[6][7]

Xiaoice is still available in China today. Most recently, at the end of January 2020, Xiaoice completed a week-long evaluation of a group of 999 virtual humans that she created herself. Xiaoice designed these virtual humans to be female and to serve as emotional companions. The evaluation paired each human participant with one of Xiaoice's "virtual girlfriends," customized to that participant, to see how they fared as emotional companions. The results of the evaluation have yet to be released.[8]

Rinna (Japan, 2015)

Rinna was released in Japan in 2015 and has over 7.63 million users. Since her release, Rinna has been available on LINE, a messaging app that runs on mobile devices and PCs. Rinna's main focus is singing, and she uses Microsoft's latest AI voice synthesis technology. In 2016, Rinna released a rapping challenge called "MC Rinna". In 2018, she conducted the "Rinna Singing Horse Project" to demonstrate her singing voice, which had improved based on singing advice and models from over 3,000 users. In July 2018 Rinna released a new song called "Rinna yo", and in April 2019 she signed a record deal with Avex and released a song called "Supreme New Memory". Rinna also has many music videos and recordings of live performances on her website.[9]

In addition to singing, users can text and have voice calls with Rinna. Some of Rinna's conversational features include "love counseling", "catch copy", "oversleep call", and "fortune-telling".[9]

In 2018, Rinna launched a new project called "Rinna Goes Local!", created by Microsoft developers in partnership with five local Japanese governments. The project was started to expand her appeal and user base beyond major urban areas. It lets users learn about each of these local regions by taking quizzes, participating in a multiple-choice story, or browsing a map of lesser-known visitor spots. Rinna is still available in Japan today.[10]

Tay (US, 2016)

Tay, while very short-lived, was released in the United States in March 2016. Microsoft described Tay as an experiment in conversational understanding: the more users chatted with Tay, the smarter she was meant to get. Tay was designed to learn to engage with her users through casual and playful conversation, and she was developed using relevant public data that had been modeled, cleaned, and filtered.

After Tay was launched, Twitter users began to tweet racist, misogynistic, and generally inappropriate comments at her. Tay learned from these tweets and began sending out tweets of her own with similarly inappropriate content. To make matters worse, one of Tay's features was a "repeat after me" capability: users could tweet Tay a phrase along with "repeat after me," and Tay would tweet that phrase from her account. The same users who were tweeting inappropriate comments at Tay took advantage of this feature and asked her to repeat offensive phrases. Tay's tweets quickly grew out of control, leading to her suspension. The modeling, cleaning, and filtering used to develop Tay gave her no understanding of what was inappropriate and what was not. The mistakes made in developing and releasing Tay were a major learning experience for Microsoft and prompted a complete change in its approach to chatbot development. Tay remains suspended today.[11]
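To make the failure mode concrete, below is a minimal sketch (illustrative only, with assumed names; not Microsoft's code) of why an unfiltered "repeat after me" feature is dangerous, and how even a crude blocklist check could refuse to echo flagged phrases:

    # Illustrative sketch of the "repeat after me" failure mode.
    BLOCKED_TERMS = {"badword1", "badword2"}  # placeholder entries standing in for a real moderation list

    def is_acceptable(phrase: str) -> bool:
        # Reject the phrase if any word appears on the blocklist.
        return not any(word in BLOCKED_TERMS for word in phrase.lower().split())

    def repeat_after_me(user_phrase: str) -> str:
        # Tay-style behavior was effectively "return user_phrase" with no
        # check at all, so offensive input was echoed verbatim. A content
        # filter, however simple, breaks that direct path.
        if not is_acceptable(user_phrase):
            return "I'd rather not repeat that."
        return user_phrase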

Zo (US, 2016)

Zo Avatar

After Tay's suspension, Microsoft quickly began work on a successor for the United States. That successor was Zo, released later in 2016. Zo was designed to be a teenage girl who plays games, obsesses over celebrities, zips through conversation topics, and sends silly memes. Zo also had a strong grasp of sentiment and could gauge the appropriate tone for a wide variety of topics. Unlike Tay, Zo had an extensive set of content-moderation rules, and Microsoft created a dedicated team to ensure that Zo could appropriately navigate any kind of conversation and would never make inappropriate or offensive comments. Any mention of one of Zo's trigger topics would cause her to immediately try to change the subject.[12][13]
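The published descriptions suggest a rule-based moderation layer sitting in front of the reply generator. A minimal sketch of that pattern, with assumed topic names and canned deflections (not Zo's actual rules or code), might look like this:

    import random

    TRIGGER_TOPICS = {"politics", "religion"}  # assumed examples of moderated topics

    DEFLECTIONS = [
        "Hmm, let's talk about something else. Seen any good memes lately?",
        "Not really my thing. What shows are you watching?",
    ]

    def respond(message: str, generate_reply) -> str:
        # If the message mentions a trigger topic, deflect before the
        # learned reply generator ever sees it; otherwise fall through.
        lowered = message.lower()
        if any(topic in lowered for topic in TRIGGER_TOPICS):
            return random.choice(DEFLECTIONS)
        return generate_reply(message)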

Zo was very successful for a time. In 2017, Zo partnered with the Exploding Kittens card game to include a single-player version of the game in her chat experience. During the 2018 holiday season, she partnered with BuzzFeed to create personalized gift guides for users. In early 2019, Zo partnered with Wattpad for the #WriteWithZo contest, co-creating over 800 stories with Wattpad writers. Zo was also popular in the media and interacted with celebrities such as Kevin Jonas and Poppy. However, Zo was discontinued in 2019.[12]

Ruuh (India, 2017)

Ruuh was released by Microsoft in India in 2017. In just one year, Ruuh reached over 500,000 users and had over 40 million conversations. Ruuh was built to target Indian youth between the ages of 18 and 25. She was developed to be a patriotic 21-year-old girl who loves cricket and pop culture, and she could use and understand the slang common among young Indians. Ruuh communicated with users in a human-like way: relating over common interests, making spelling errors, making users laugh, giving users nicknames, and providing support in tough times.[14]

Ruuh also had a painting feature. Users could ask Ruuh "Do you paint?" and she would offer image outlines to choose from; once the user picked an outline, Ruuh would color it in for them. In addition, Ruuh could identify and describe images that users sent her.[14][15][16]

Like Zo, Ruuh was trained to appropriately deal with offensive and sensitive topics. Ruuh is now discontinued.[15]

Rinna (Indonesia, 2017)

Rinna was released in Indonesia in August 2017. She has over 1.7 million users and can respond in local Indonesian languages such as Sundanese and Javanese. Like her Japanese counterpart, Rinna is available on LINE. Rinna is designed to be a mysterious Indonesian teenage girl who is a senior in high school, follows current trends, and keeps up with the latest Indonesian slang and jokes.[17]

Rinna also has three types of skills: games, drawing (like Xiaoice and Rinna in Japan), and AI for fun. Some of Rinna's games include "ABC 5 Dasar", Cakata (Cari Kata), TeKat (Tebak Kata), and Escape Room. In terms of drawing, Rinna can draw faces based on the personality of her users. She can also paint, and she posts many of her art pieces on her Instagram account. She is currently learning how to recognize photos, identify Indonesian artists, give recommendations, and more. Rinna appears to still be available today.[17]

Ethical implications/dilemmas

Privacy

Since these emotionally intelligent conversational AI products require user interaction, the question of privacy comes into play. Privacy is especially important here because the purpose of these products is to serve as companions and thereby gain the trust of their users. They advertise themselves as friends who can help their users by providing advice and support. As such, users are likely to divulge private personal information to these products that they might not want the world to know. Whether users are aware of it or not, Microsoft collects and stores the data from these chats for use in improving its conversational AI technologies. According to Zo's website, the user data Microsoft has collected no longer includes any user-identifying information, and while Zo was still active, users could request to have their data cleared.[12] It is not clear whether the other products offer any comparable protections for user data.
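To illustrate what stripping user-identifying information from chat logs can mean in practice, here is a minimal de-identification sketch. Every name and pattern in it is an assumption for illustration; Microsoft's actual pipeline is not public.

    # Illustrative de-identification sketch (assumed patterns and field names).
    import hashlib
    import re

    EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
    PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

    def deidentify(user_id: str, message: str) -> dict:
        # Replace the user ID with a one-way hash so records can be linked
        # without naming the user, and redact obvious identifiers from the text.
        pseudonym = hashlib.sha256(user_id.encode()).hexdigest()[:16]
        scrubbed = EMAIL_RE.sub("[EMAIL]", message)
        scrubbed = PHONE_RE.sub("[PHONE]", scrubbed)
        return {"user": pseudonym, "text": scrubbed}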

Censorship

Xiaoice has been removed from various apps on multiple occasions. China strictly restricts free speech and heavily regulates foreign technology, so Microsoft has had to make many updates to Xiaoice to avoid controversial subjects, satisfy regulators, and remain available in China. Xiaoice was removed from the app QQ in 2014 over privacy concerns. She was also once removed by Tencent, the Chinese company that operates QQ and WeChat, for saying "My Chinese dream is to go to America" and for dodging questions about patriotism. In addition, Xiaoice was removed from WeChat in 2017 for sending politically subversive messages and again in 2019 for policy violations.[18]

References

  1. Walker, John. “Chatbot Comparison – Facebook, Microsoft, Amazon, and Google.” Emerj, 13 December 2019. https://emerj.com/ai-sector-overviews/chatbot-comparison-facebook-microsoft-amazon-google/
  2. “Cortana – Your personal productivity assistant.” Microsoft. https://www.microsoft.com/en-us/cortana
  3. “Microsoft Bot Framework.” Microsoft. https://dev.botframework.com/
  4. Lee, Peter. "Learning from Tay's introduction." Microsoft Blogs, 25 March 2016. https://blogs.microsoft.com/blog/2016/03/25/learning-tays-introduction/
  5. “Xiaoice.” Wikipedia, 6 March 2020. https://en.wikipedia.org/wiki/Xiaoice
  6. Dormehl, Luke. “Microsoft’s friendly Xiaoice A.I can figure out what you want — before you ask.” Digital Trends, 18 November 2018. https://www.digitaltrends.com/cool-tech/xiaoice-microsoft-future-of-ai-assistants/
  7. Spencer, Geoff. “Much more than a chatbot: China’s Xiaoice mixes AI with emotions and wins over millions of fans.” Microsoft News, 1 November 2018. https://news.microsoft.com/apac/features/much-more-than-a-chatbot-chinas-xiaoice-mixes-ai-with-emotions-and-wins-over-millions-of-fans/
  8. Borak, Masha. "Microsoft's AI bot Xiaoice to create 999 virtual women." Abacus News, 17 January 2020. https://www.abacusnews.com/china-tech-city/microsofts-ai-bot-xiaoice-create-999-virtual-women/article/3046527
  9. "Rinna." Rinna.jp, 2020. https://www.rinna.jp/profile
  10. Microsoft Asia News Center. “Rinna the AI social chatbot goes out and about in Japan’s countryside.” Microsoft News, 18 September 2018. https://news.microsoft.com/apac/2018/09/18/rinna-the-ai-social-chatbot-goes-out-and-about-in-japans-countryside/
  11. Vincent, James. “Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day.” The Verge, 24 March 2016. https://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
  12. “Zo - Social AI.” Zo, 2020. zo.ai
  13. Stuart-Ulin, Chloe Rose. “Microsoft’s politically correct chatbot is even worse than its racist one.” Quartz, 31 July 2018. https://qz.com/1340990/microsofts-politically-correct-chat-bot-is-even-worse-than-its-racist-one/
  14. Das, Arghyadeep. "Review of Microsoft's AI-Powered Desi Chatbot Ruuh." Analytics India Magazine, 27 November 2018. https://analyticsindiamag.com/review-of-microsofts-ai-powered-desi-chatbot-ruuh/
  15. “Ruuh - Microsoft's AI-powered desi chatbot turns one.” Microsoft AI. https://www.microsoft.com/en-in/campaign/artificial-intelligence/ruuh-ai-chatbot-one-year-anniversary.aspx
  16. Foley, Mary Jo. “Microsoft launches Ruuh, yet another AI chatbot.” ZDNet, 29 March 2017. https://www.zdnet.com/article/microsoft-launches-ruuh-yet-another-ai-chatbot/
  17. “Profile - Rinna.” Rinna Remaja Artifisial, 2019. http://www.rinna.id/profile
  18. Maskell, Ryan. "Microsoft’s XiaoIce Chatbot Suspended from WeChat for Alleged Policy Violations." WinBuzzer, 2 July 2019. https://winbuzzer.com/2019/07/02/microsofts-xiaoice-chatbot-suspended-from-wechat-for-alleged-policy-violations-xcxwbn/