Virtual Assistants
Revision as of 16:37, 15 April 2019
Virtual assistants are artificial intelligence systems that perform human-like tasks for users, similar to a personal assistant.[1] Some popular virtual assistants include Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google Home. Many virtual assistants can carry out a number of tasks following a voice command, such as giving the weather, playing music, or controlling smart home devices.
Popular Virtual Assistants
Amazon Alexa Voice Service
Often known simply as Alexa, the Alexa Voice Service is a virtual assistant created by Amazon. It works on many different Amazon devices, such as the Amazon Echo, the Echo Dot, and the Amazon Tap, among other Alexa smart home devices. Upon saying the wake word, which is "Alexa" by default, the user can give a command, which the assistant will then carry out.
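The wake-word flow described above can be sketched as a simple transcript check. This is an illustrative simplification: real devices spot the wake word in the raw audio stream on-device, and the function name here is invented.

```python
# Illustrative sketch only: real assistants detect the wake word in raw
# audio on-device; here we approximate with a text transcript.
WAKE_WORD = "alexa"

def extract_command(transcript: str):
    """Return the command that follows the wake word, or None."""
    words = transcript.strip().lower().split()
    if words and words[0].strip(",.!?") == WAKE_WORD:
        command = " ".join(words[1:])
        return command or None  # the wake word alone carries no command
    return None
```

For example, `extract_command("Alexa, play music")` yields `"play music"`, while a transcript without the wake word is ignored.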
Some of the many tasks Amazon Alexa can do include answering questions, reading headlines from your favorite news outlets, setting timers and alarms, controlling certain smart home devices, streaming music, and relaying weather or traffic information.[2] Among the things Alexa cannot yet do are distinguishing between different voices, sending audio messages, and saying swear words.[3] In addition to Alexa's built-in functionality, the Smart Home Skill API allows third-party developers to create apps called skills. By February 2017, 10,000 skills had been created for Alexa.[4] A majority of these skills relate to games, news, and education, but many have low ratings, if any.[4]
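As a concrete illustration of the skill model, a custom skill is essentially a web service (often an AWS Lambda function) that receives a JSON request describing the user's intent and returns a JSON response containing the speech to play back. The sketch below is a minimal, hypothetical handler: the intent name is invented, and a real skill built with the Alexa developer tools involves considerably more setup.

```python
# Minimal sketch of a custom-skill handler. The intent name
# "GetGreetingIntent" is made up for illustration; the envelope shape
# follows Alexa's documented request/response format, but this is not
# a complete or production handler.
def handle_request(event: dict) -> dict:
    request = event.get("request", {})
    if (request.get("type") == "IntentRequest"
            and request.get("intent", {}).get("name") == "GetGreetingIntent"):
        speech = "Hello from a custom skill."
    else:
        speech = "Sorry, I don't know how to help with that."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

The service receives one such event per user utterance and replies with the text the device should speak.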
Apple's Siri
Siri can be used on the iPhone, iPad, iPod touch, and some newer versions of the MacBook. To ask Siri a question, you can either press and hold the home button on your device or say "Hey Siri", followed by your question or command.[5]
Google Home
Google Home is a device powered by Google Assistant that can answer questions, follow commands, and control smart devices. Google Home can also distinguish between voices and can therefore support multiple users for more personalization.[6]
Google Home Features [6]
- Calendar: get your schedule for the day or information about an event
- Alarm & Timer
- Calculator
- Dictionary
- Multi-room audio
- Games
- Music: play music from various music services such as Spotify and Pandora
- Travel Information: Check flight status, find and track flights
- Weather & Traffic
- Translation
- Unit Conversions
- TV streaming
- Thermostat control: can be used with supported smart thermostats
- My Day: gives a brief overview of your day including weather, reminders, calendar, news, and commute
- Light control
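One way to picture how a feature list like the one above gets wired up internally is a simple intent router: the transcribed command is matched to a feature handler. Real assistants use trained language-understanding models rather than keyword matching, and every name below is invented for illustration.

```python
# Toy intent router: maps keywords in a transcribed command to feature
# handlers. Purely illustrative; real assistants use trained NLU models.
def weather(cmd: str) -> str:
    return "Today: sunny, 72F (placeholder forecast)"

def set_alarm(cmd: str) -> str:
    return f"Alarm noted: {cmd}"

def convert_units(cmd: str) -> str:
    return f"Conversion requested: {cmd}"

ROUTES = [
    ("weather", weather),
    ("alarm", set_alarm),
    ("convert", convert_units),
]

def dispatch(command: str) -> str:
    """Send the command to the first handler whose keyword matches."""
    lowered = command.lower()
    for keyword, handler in ROUTES:
        if keyword in lowered:
            return handler(command)
    return "Sorry, I can't help with that yet."
```

Adding a feature then amounts to registering one more `(keyword, handler)` pair, which is roughly how third-party integrations extend an assistant's capabilities.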
Performance
In 2016, Brian X. Chen, a technology writer for The New York Times, put together a test of 16 tasks and compared Google, Siri, Cortana, and Alexa on how well each carried them out.[7] Google scored the highest, followed by Siri, Cortana, and lastly Alexa. Chen found that Alexa was the best at tasks related to music, and Siri was best at calendar and email tasks.[7] Chen also found small differences among the four virtual assistants: Siri could read email messages aloud, compose emails, and create calendar events, while Google could not read email aloud, and Alexa was unable to create calendar events or compose emails.[7]
Ethical Implications
Google Duplex Deceptive Misrepresentation
In the spring of 2018, Google introduced "Duplex", a feature aimed at making Google Assistant sound more human when interacting with users [8]. While this technology displayed an innovative form of communication unlike anything the world had seen so far, it also raised a number of questions surrounding ethics. Immediately following the demo of Google Duplex, critics were quick to point out that there was no indication from Duplex's end that it was a bot -- yet the conversational, realistic quality of the phone call is precisely what made the interaction so striking. Whether Google was responsible for letting the person on the other end know that they were talking to a bot was debated after the conference. Researcher Dr. Thomas King calls into question the ethics of Google Duplex by examining the intent behind the new technology. While the ultimate goal of Duplex is to create a better user experience, King questions whether one needs to be deceitful and appear human to deliver a "better" user experience. In his essay on the subject, Harry Frankfurt addresses the amount of bullshit prevalent in our lives today and notes that the success of lies depends on the actor's ability to deceive us [9]. The main issue with Google's Duplex is deceptive misrepresentation, which is exactly what Frankfurt points to as a contribution to creating bullshit. Regardless of whether Google Duplex's intention was to deceive, the technology is designed in a way that is inherently deceptive, as the application is meant to mimic humanlike conversation. In an age where information is easily manipulated and falsified, the truth becomes increasingly important, especially in the new frontier of human-AI interactions.
References
- ↑ Gray, Phil. "The Rise of Intelligent Virtual Assistants" (1 June 2016. Retrieved 24 April 2017).
- ↑ Crist, Ry. "Amazon Alexa: Device compatibility, how-tos and much more" CNET (8 August 2016. Retrieved 24 April 2017).
- ↑ Martin, Taylor. "9 things Alexa can't do yet" CNET (10 January 2017. Retrieved 24 April 2017).
- ↑ 4.0 4.1 Crist, Ry. "Alexa just hit 10,000 skills, but does anyone care?" CNET (23 February 2017. Retrieved 24 April 2017).
- ↑ [1](Retrieved 24 April 2017).
- ↑ 6.0 6.1 [2](Retrieved 24 April 2017).
- ↑ 7.0 7.1 7.2 Chen, Brian. "Siri, Alexa and Other Virtual Assistants Put to the Test" The New York Times (27 January 2016. Retrieved 24 April 2017).
- ↑ Lomas, Natasha. “Duplex Shows Google Failing at Ethical and Creative AI Design.” TechCrunch, TechCrunch, 10 May 2018, techcrunch.com/2018/05/10/duplex-shows-google-failing-at-ethical-and-creative-ai-design/.
- ↑ Kellman, Steven G. The Georgia Review, vol. 59, no. 2, 2005, pp. 431–433. JSTOR, www.jstor.org/stable/41402610.