
Researchers explore impact of AI on music and culture

Updated: June 7, 2022

Source: University of Toronto

Global access to art, culture, and entertainment products – music, movies, books, and more – has undergone fundamental changes over the past 20 years in light of groundbreaking developments in artificial intelligence.


For example, streaming services like Netflix and Spotify use algorithms to collect and analyze user data and model streaming habits, producing recommendations that cater to each user's tastes. But this is only one of the many ways in which AI tools are transforming the arts and culture industries. AI is also being used in the production of music and other art, with algorithms generating images or writing songs on their own.
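
For readers curious about the mechanics behind such recommendations, the sketch below shows one simple textbook approach, item-based collaborative filtering: tracks a user has already played are matched to similar tracks based on other users' listening. The play-count matrix, function names, and scoring rule here are invented for illustration only; real platforms such as Netflix and Spotify rely on far richer signals and proprietary models.

```python
# Minimal, illustrative item-based collaborative filtering sketch.
# All data and parameters below are hypothetical.
import numpy as np

# Rows = users, columns = tracks; entries = play counts (invented data).
plays = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def cosine_similarity(matrix: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between the columns (tracks) of `matrix`."""
    norms = np.linalg.norm(matrix, axis=0, keepdims=True)
    norms[norms == 0] = 1.0  # avoid division by zero for unplayed tracks
    normalized = matrix / norms
    return normalized.T @ normalized

def recommend(user_index: int, top_n: int = 2) -> list[int]:
    """Score unheard tracks for a user by similarity to tracks they played."""
    sim = cosine_similarity(plays)
    user_plays = plays[user_index]
    scores = sim @ user_plays          # weight similar tracks by listening history
    scores[user_plays > 0] = -np.inf   # exclude tracks the user already played
    return list(np.argsort(scores)[::-1][:top_n])

print(recommend(user_index=1))  # indices of tracks to suggest to user 1
```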


Warner Music even “signed” an algorithm to a record deal in 2019.


Yet, while AI is drastically reshaping cultural industries around the world, we have yet to fully understand the consequences.


“The societal impacts these algorithmic developments are having on the production, circulation, and consumption of culture remain largely unknown,” says Ashton Anderson, an assistant professor in the Department of Computer Science in the University of Toronto’s Faculty of Arts & Science and a faculty affiliate at the Schwartz Reisman Institute for Technology and Society.


Anderson’s research aims to bridge the divide between computer science and the social sciences. He uses computation to study online well-being – for example, studying the impact of “echo chambers” on social media. Such interdisciplinary work is a key component of the Schwartz Reisman Institute’s mission to re-conceptualize common notions of the ways technology, systems and society interact.



In October of 2019, Anderson and his collaborators – Georgina Born, a professor of music and anthropology at Oxford University; Jeremy Morris, an associate professor of media and cultural studies at the University of Wisconsin-Madison; and Fernando Diaz, a research scientist at Google in Montreal and a Canada CIFAR AI chair – convened a CIFAR AI & Society workshop to explore the effects of AI on the curation of culture, with a particular focus on the music industry.


“We deliberately brought together communications and computer science scholars, musicians, industry members and users to map out the major issues we could foresee now that cultural products are largely being distributed via algorithms on global platforms,” says Anderson.


The researchers’ findings and recommendations were published in a report titled “Artificial Intelligence, Music Recommendation and the Curation of Culture.”


“This report is the first major document to recognize and describe the societal effects of the algorithmic revolution in cultural industries,” says Anderson. “Existing journals are virtually entirely aligned with only one of the many stakeholder groups that took part in this interdisciplinary effort so, unfortunately, publishing this report in one of these journals would be next to impossible.


“We’re very happy to have Schwartz Reisman publish this report, as our methodology and the cross-disciplinary expertise we convened is well-aligned with SRI’s mission to straddle traditional academic boundaries in the pursuit of understanding how powerful new technologies shape the world around us.”


The report contains three overarching themes. First, participants agreed that AI-driven technologies will have major long-term impacts on cultural consumption and creation. For example, if algorithms decide what to distribute and recommend – and to whom and when – then arts and culture creators, and the organizations that fund them, may be incentivized to produce content that is more likely to gain exposure and reach fans, based on how those algorithms connect audiences with content.


Participants also saw “a clear need to enrich existing AI-driven technologies so they can better serve diverse communities and genres of culture, art, and music,” says Anderson. If algorithms overly generalize information about certain groups, subcultures or communities – whether by race, gender, or other identity markers – Anderson notes that “we risk reinforcing rigid and potentially harmful social boundaries.”


A third theme emerging from the workshop is that the curation of culture always has involved, and always will involve, balancing competing objectives.


“The extraction of personal data has been privatized and corporatized by curation platforms, but as yet without any public debate or intervention for accountability and transparency,” says Anderson.


In other words, how should we weigh the convenience and increased accessibility that streaming platforms provide against the fact that they threaten important public ideals, such as the right to privacy and cultural sovereignty?


The workshop’s report considers almost every step of the music industry’s processes, from the ways in which algorithms manipulate the existing variety of content itself, to the ways in which content is produced, distributed, valued, understood and consumed around the world – including the ways in which artists and creators are remunerated.


Important questions explored in the report include:


  • What assumptions are built into media recommendation systems?

  • What happens to the crucial role of social and community relationships at the heart of the experience of music?

  • How can we ensure appropriate cultural expertise is represented in algorithmic and technological design?

  • AI-based classifications of music may be highly efficient, but do they reflect a truly intelligent analysis of music? Can they ever have any real understanding of what music is?

  • Will music and musical tastes become increasingly homogenized due to AI-based systems of production, promotion and distribution? Do we risk misrepresenting or underrepresenting marginalized communities or their agency to represent themselves?

  • Is the personalization of algorithms too seductive? Do we risk no longer “thinking for ourselves”?

“I’m delighted to see us producing this report at SRI,” says Professor Gillian K. Hadfield, the director of the Schwartz Reisman Institute for Technology and Society. “This project is aligned with the work that SRI Engineering Lead Ron Bodkin is leading to improve AI systems’ objectives and recommender systems, surveying research in this area and building new techniques. Recommendation is a huge part of the daily role AI plays in our lives, and ensuring it’s aligned with human values is a key part of SRI’s mission.”


“I'm glad to see SRI contributing a thoughtful interdisciplinary perspective to the considerations of how AI is affecting media and culture,” adds Bodkin. “This report raises important topics for how various stakeholders should be able to participate and how to allow for more autonomy and diversity.


“I believe that incorporating a wider range of values and increasing agency is a critical direction for recommendation systems and algorithmically curated media,” says Bodkin. “The report's call for giving stakeholders meaningful controls over recommendation systems is important and it's an area where we're exploring how AI research can contribute.”



