In August, Capitol Records dropped FN Meka, a fictitious musical artist partly powered by artificial intelligence, after the project drew backlash over racial stereotypes. His look, lyrics, and persona seemed inspired by real-life artists such as Lil Pump, 6ix9ine, and Travis Scott.
The debate over cultural appropriation, and over the robot rapper’s songs, which were actually written and sung by humans, has raised questions among observers of technology in music.
Also in August, an AI-generated artwork won a prize in Colorado. With DALL-E 2, an AI system that creates visual art, and Hatsune Miku, Japanese software that does something similar for music, technology is fast revolutionizing the arts world.
It is now common practice to consume culture through digital avatars like FN Meka. In 2020, Travis Scott performed a concert through his avatar in the video game Fortnite; in 2012, a hologram of the late rapper Tupac Shakur performed at a music festival. Just last month, Eminem and Snoop Dogg performed as their digital selves in a metaverse-themed performance at the MTV Video Music Awards.
The question now is whether the humans responsible for creating the lyrics should be held accountable. How does cultural appropriation work when the one doing the appropriating is a fictitious character backed by an anonymous, multiracial collective?
‘A lot of our moral intuitions and codes as humans may have evolved for a context where we have discrete human actors,’ said Ziv Epstein, Ph.D. candidate at the MIT Media Lab. ‘These emerging technologies require new legal frameworks and research to understand how we reason about them.’
Critics of FN Meka might have been satisfied had people of color been involved in designing and promoting the character in a way that guarded against the negative stereotypes associated with it. Industry Blackout, a nonprofit advocacy group, said that FN Meka had ‘insulted’ Black culture and stolen the sounds and looks of real-life Black artists. Capitol apologized in a statement for its ‘insensitivity’.
‘There are humans behind technology,’ said Sinead Bovell, founder of WAYE, an organization that educates people about technology. ‘When we disconnect the two, that’s where we could potentially risk harm for different marginalized groups.’
‘What concerns me about the world of avatars is we have a situation where people can create and profit off the ethnic group an avatar represents without being a part of that ethnic group,’ she added.
Imani Mosley, a professor of musicology at the University of Florida, said that the culture most likely to be exploited in pop music and hip-hop is Black culture.
‘There’s so much overlap between digital culture and Gen Z culture and Black culture, to the point where a lot of people don’t necessarily recognize that a lot of things Gen Z says are pulled from African American vernacular,’ she said. ‘To interact with that culture, to be a part of that discourse, is to use certain digital and cultural markers, and if you don’t have access to that discourse because you’re not Black, one way to do that is to hide one’s own ethnicity behind the curtain of the internet.’
For some, painting the creators of FN Meka as villains raised concerns about artistic censorship.
James Young, a professor of philosophy at the University of Victoria, acknowledged that music has a long tradition of placing a premium on the artist’s lived experience. He quoted a line attributed to jazz legend Charlie Parker: ‘If you didn’t live it, it won’t come out of your horn.’
Young argued that a growing consensus that endorses only art arising from lived experience comes at the expense of both art and political solidarity. He pointed to an incident years ago in which a white artist was heavily criticized for painting Emmett Till, a Black civil rights martyr.
‘One of the claims is, “This is digital blackface,”’ Young said of FN Meka. ‘Maybe it is. You’ve got to be very careful. I don’t think you want to claim that all representations of Black people are somehow morally offensive.’ He advocated a balanced examination.
Aaron Hertzmann, a scientist at Adobe Research, argued in a paper titled ‘Can Computers Make Art?’ that only humans can make art, because computers are not capable of interacting socially with humans. In this view, machine learning is the tool, and the person who gives the software instructions is the artist.
‘Someday, better AI could come to be viewed as true social agents,’ said Hertzmann.
Epstein said that some art is now the result of ‘a complex and diffuse system of many human actors and computational processes interacting. If you generate a DALL-E 2 image, is that your artwork? Can you be the social agent of that? Or are they scaffolded by other humans?’
Anthony Martini, a co-founder of Factory New, the virtual music company that created FN Meka, said: ‘If you’re mad about the lyrical content because it supposedly was AI, why not be mad about the lyrical content in general?’
By Marvellous Iwendi.
Source: The New York Times