Artificial Intelligence in the Music Industry

Artificial Intelligence in the Music Industry is a recent phenomenon that has already had a significant impact on the way we conceive, produce, and consume music. Artificial intelligence (AI) is a class of systems that learn and respond in ways that mimic human thought processes. Major corporations in the music industry have developed applications that use AI's simulation of human intelligence to create, perform, and/or curate music [1]. This corner of the music industry is relatively new but growing rapidly. Although its advancements have been impressive, the use of AI in music may create problems for artists, who could find themselves competing against machines for recognition. Potential legal disputes over copyright infringement are also a concern.

[Image: The Illinois Automatic Computer (ILLIAC)]

History

A Look Into AI

Understanding the history of AI helps frame the ethics involved in any discussion of AI in music. AI as a concept entered popular consciousness in the 1950s with British mathematician Alan Turing's paper on how to build an 'intelligent machine' and how to test its intelligence [2]. Since then, interest in AI as a field of study has waxed and waned. Books and films featuring futuristic AI raised awareness of the idea in the mainstream as well as in scientific circles. Since its inception, AI has seen a rapid increase in everyday implementation, from virtual assistants like Alexa and Siri to fraud detection in personal banking.

AI In Music

The first recorded instance of computer-generated music dates to 1951, when a BBC outside-broadcast unit recorded melodies produced by the computer at Turing's Computing Machine Laboratory in Manchester, England. The melodies were generated by programming individual musical notes into the machine. Artificial intelligence was first used to compose music in 1957 by Lejaren Hiller and Leonard Isaacson of the University of Illinois at Urbana–Champaign [3]. Hiller and Isaacson programmed the ILLIAC (Illinois Automatic Computer) to generate music written start-to-finish by artificial intelligence. Around the same time, Russian researcher R. Kh. Zaripov published the first widely available paper on algorithmic music composition, carried out on the historical Russian computer URAL-1 [4] [5].


Since those milestones, research and software for AI-generated music have flourished. In 1974, the first International Computer Music Conference (ICMC) was hosted at Michigan State University in East Lansing, Michigan. The ICMC is now an annual event hosted by the International Computer Music Association (ICMA) for AI composers and researchers alike [6].

Applications

Music Production

One of the biggest breakthroughs in computer-generated music was the Experiments in Musical Intelligence (EMI) system. Developed by David Cope, an American composer and scientist at the University of California, Santa Cruz, EMI was able to analyze different types of music and create unique compositions by genre. It has since produced more than a thousand musical works in the styles of over 30 different composers [7] [8].
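Cope's actual EMI system is far more elaborate, but the core idea of analyzing a body of existing music and recomposing new material in a similar style can be illustrated with a toy sketch. The Python example below learns a first-order Markov chain of pitch transitions from a few invented melodies and samples a new melody from it; the corpus, pitch numbers, and function names are purely illustrative and are not drawn from EMI itself.

    import random
    from collections import defaultdict

    # Toy corpus: melodies written as lists of MIDI pitch numbers.
    # These melodies are invented for illustration only.
    corpus = [
        [60, 62, 64, 65, 67, 65, 64, 62, 60],
        [60, 64, 67, 72, 67, 64, 60],
        [62, 64, 66, 67, 69, 67, 66, 64, 62],
    ]

    # "Analysis" step: record which pitch tends to follow which.
    transitions = defaultdict(list)
    for melody in corpus:
        for current, nxt in zip(melody, melody[1:]):
            transitions[current].append(nxt)

    def generate(start_pitch=60, length=16):
        """Sample a new melody from the learned pitch transitions."""
        melody = [start_pitch]
        for _ in range(length - 1):
            options = transitions.get(melody[-1]) or [start_pitch]  # fall back on dead ends
            melody.append(random.choice(options))
        return melody

    print(generate())

Systems like EMI additionally model rhythm, harmony, and larger-scale structure, which is what lets them imitate particular composers rather than just local note tendencies.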

In 2016, Google, a leader in AI technology, released Magenta. Magenta's mission statement describes it as “an open-source research project exploring the role of machine learning as a tool in the creative process” [9]. Instead of learning from hard-coded rules, Magenta learns by example from human-made music. In this way, it acts as an assistant to humans in the creative process rather than a replacement for them [10].
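Magenta ships trained models such as melody generators, but its workflows typically revolve around representing music as NoteSequence protocol buffers via the companion note_seq library. The sketch below follows the pattern of Magenta's introductory tutorials to build a tiny two-note sequence and export it to MIDI; exact module paths and function names may vary between library versions, so treat it as an approximation rather than canonical usage.

    # A minimal sketch assuming Magenta's companion note_seq library;
    # names follow Magenta's introductory tutorials and may differ
    # between library versions.
    import note_seq
    from note_seq.protobuf import music_pb2

    # Build a short two-note melody as a NoteSequence protocol buffer,
    # the data structure Magenta's models read and write.
    melody = music_pb2.NoteSequence()
    melody.notes.add(pitch=60, start_time=0.0, end_time=0.5, velocity=80)  # C4
    melody.notes.add(pitch=62, start_time=0.5, end_time=1.0, velocity=80)  # D4
    melody.tempos.add(qpm=120)
    melody.total_time = 1.0

    # Export to MIDI so the sequence can be opened in a DAW
    # or handed to a Magenta model as a primer.
    note_seq.sequence_proto_to_midi_file(melody, 'sketch.mid')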

Recommendation Models

Another way AI is used in the music industry is to build recommendation models. One of the most prominent users of this technology today is Spotify, which applies a host of machine learning techniques to predict and customize playlists for its users. One example is the Discover Weekly playlist, a collection of 30 songs curated for each user every week based on search history, listening patterns, and predictive models. One technique behind this is collaborative filtering: Spotify compares users with similar behaviors to predict what a given user might enjoy or want to listen to next. Another tool is natural language processing (NLP), which analyzes human language in text. The system accumulates words associated with different artists by scanning the internet for articles and posts written about them; artists with similar "cultural vectors," or top terms, can then be linked to one another and recommended to the same listeners. A final tool Spotify uses to curate user-specific playlists is audio models. Audio models are most useful when an artist is new and does not yet have many listeners or much written about them online: they analyze the raw audio of a track and group it with similar songs, allowing lesser-known artists to be recommended alongside popular tracks [11].
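Spotify's production recommender is proprietary, but the collaborative-filtering idea described above can be sketched with a small example: represent each user as a vector of play counts, find the users whose vectors are most similar, and recommend tracks they played that the target user has not heard. The play-count matrix and track indices below are invented for illustration.

    import numpy as np

    # Hypothetical play counts: rows are users, columns are tracks.
    # A zero means the user has not listened to that track yet.
    plays = np.array([
        [5, 0, 3, 0, 2],   # user 0
        [4, 1, 3, 0, 1],   # user 1
        [0, 4, 0, 5, 0],   # user 2
        [1, 3, 0, 4, 0],   # user 3
    ], dtype=float)

    def cosine_sim(a, b):
        """Cosine similarity between two users' listening vectors."""
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b) / denom if denom else 0.0

    def recommend(user, k=2):
        """Suggest an unheard track based on the k most similar users."""
        sims = np.array([cosine_sim(plays[user], plays[other])
                         for other in range(len(plays))])
        sims[user] = -1.0                            # never compare a user with themselves
        neighbors = np.argsort(sims)[-k:]            # indices of the k most similar users
        scores = sims[neighbors] @ plays[neighbors]  # similarity-weighted play counts
        scores[plays[user] > 0] = -np.inf            # only consider unheard tracks
        return int(np.argmax(scores))

    print("Recommend track", recommend(user=0))

Real services operate on matrices with millions of users and tracks and rely on matrix factorization or learned embeddings rather than direct pairwise comparison, but the underlying signal is the same: users with similar listening histories are likely to enjoy similar music.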

Ethical Implications

Commodification of Music

Some ethical concerns about the future of music and AI have arisen in recent years. One of these is the commodification of music. With the popularization of AI-generated music on the horizon, there is concern that music will be, or already is being, made and sold solely for profit. Music has always been an essentially human art form that draws on emotion and experience, something one would assume no machine could recreate. However, emotion and experience might not be key ingredients of a chart-topping song these days: with the power to analyze and remix music that is already popular, AI technology can readily generate songs that are trendy and catchy. This also poses the threat of music becoming homogenous and lacking variety. Ideally, people will continue to prefer human-made music, but as algorithms continue to advance and more data on our musical preferences is collected, it is impossible to say what the future holds [12].

[Image: Computers generating new music through AI]


Musicians

Since its beginnings, AI has carried the negative association of driving up unemployment. As AI is increasingly implemented in the music industry, it has begun to take away jobs and will likely continue to do so. While it makes some jobs redundant, it can also create new opportunities [13]. Like many aspects of AI's future in music, exactly how its effect on employment and opportunity will play out is unclear.

Copyright

One of the biggest ethical dilemmas facing AI-generated music is copyright infringement. Allowing an AI to listen to copyrighted music and then generate similar songs without compensating or crediting the original artist could result in serious legal complications. As the law currently stands, copyright infringement occurs only if the AI creates a song that sounds similar to an existing song and claims it as its own. This is a hard line to draw: what does "similar" mean, and can it be defined in a legal context? Copyright law was written without AI systems in mind, so lawmakers likely never considered a machine listening to and drawing on an artist's entire discography to create a single song. From an artist's point of view, it may seem unfair not to be credited or compensated in such a case [14] [15].

Conclusion

Artificial intelligence's history in music now spans more than 60 years, yet the technology is still in the early stages of its development. It is impossible to predict how the future music industry will compare to the one we know today. Perhaps in 20 years the chart-topping songs will be computer-generated, or our favorite streaming services will suggest music based on our mood. We will just have to wait and see.

See Also

References

  1. Freeman, Jeremy. “Artificial Intelligence and Music — What the Future Holds?” Medium, 24 Feb. 2020, medium.com/@jeremy.freeman_53491/artificial-intelligence-and-music-what-the-future-holds-79005bba7e7d.
  2. Anyoha, Rockwell. “The History of Artificial Intelligence.” Harvard University, 28 Aug. 2017, https://sitn.hms.harvard.edu/flash/2017/history-artificial-intelligence/.
  3. Li, Chong. “A Retrospective of AI + Music - Prototypr.” Medium, 25 Sept. 2019, blog.prototypr.io/a-retrospective-of-ai-music-95bfa9b38531.
  4. Li, Chong. “A Retrospective of AI + Music - Prototypr.” Medium, 25 Sept. 2019, blog.prototypr.io/a-retrospective-of-ai-music-95bfa9b38531.
  5. Freeman, Jeremy. “Artificial Intelligence and Music — What the Future Holds?” Medium, 24 Feb. 2020, medium.com/@jeremy.freeman_53491/artificial-intelligence-and-music-what-the-future-holds-79005bba7e7d.
  6. Freeman, Jeremy. “Artificial Intelligence and Music — What the Future Holds?” Medium, 24 Feb. 2020, medium.com/@jeremy.freeman_53491/artificial-intelligence-and-music-what-the-future-holds-79005bba7e7d.
  7. Li, Chong. “A Retrospective of AI + Music - Prototypr.” Medium, 25 Sept. 2019, blog.prototypr.io/a-retrospective-of-ai-music-95bfa9b38531.
  8. Freeman, Jeremy. “Artificial Intelligence and Music — What the Future Holds?” Medium, 24 Feb. 2020, medium.com/@jeremy.freeman_53491/artificial-intelligence-and-music-what-the-future-holds-79005bba7e7d.
  9. Magenta. Google, magenta.tensorflow.org.
  10. “How Google Is Making Music with Artificial Intelligence.” Science | AAAS, 8 Dec. 2017, www.sciencemag.org/news/2017/08/how-google-making-music-artificial-intelligence.
  11. Sen, Ipshita. “How AI Helps Spotify Win in the Music Streaming World.” Outside Insight, 26 Nov. 2018, outsideinsight.com/insights/how-ai-helps-spotify-win-in-the-music-streaming-world.
  12. Staff. “What Does Commodification Mean for Modern Musicians?” Dorico, 13 Mar. 2018, blog.dorico.com/2018/01/commodification-music-mean-modern-musicians.
  13. Thomas, Mike. “Artificial Intelligence's Impact on the Future of Jobs.” builtin, 8 Apr. 2020, https://builtin.com/artificial-intelligence/ai-replacing-jobs-creating-jobs.
  14. “How AI Is Benefiting The Music Industry?” Tech Stunt, 20 Aug. 2020, techstunt.com/how-ai-is-benefiting-the-music-industry.
  15. “We’ve Been Warned About AI and Music for Over 50 Years, but No One’s Prepared.” The Verge, 17 Apr. 2019, www.theverge.com/2019/4/17/18299563/ai-algorithm-music-law-copyright-human.