It's not usually a problem when new songs by well-known musicians go viral on social media, but it can be when a voice clone is doing the singing. Deepfake music has become a reality, despite vocals long seeming like one of the last creative domains AI would reach.
It works by training an AI model on a singer's voice, and the results aren't necessarily bad. Deepfake songs are often produced with real ingenuity, but the problem extends beyond the end result. Deepfake music raises several issues, including situations where permission isn't obtained, questions about who gets paid, and whether it's morally acceptable at all.
1. Unauthorized Datasets
To make a deepfake track of a famous performer, you need to compile a dataset of audio recordings of that performer's singing voice. As with any data that belongs to someone else, you should usually seek their permission before using it. Yet a deepfake song built on an unauthorised dataset can still end up published on a major music platform like YouTube, Spotify, or TikTok.
This is what happened when a user going by the name Ghostwriter made the song "Heart on My Sleeve" using the voices of Drake and The Weeknd. The popular track featured AI-generated versions of both artists' voices, and its lyrics mentioned The Weeknd's ex-girlfriend.
Fans genuinely enjoyed the song, and Ghostwriter received praise for using the deepfake vocals in a creative way. However, not everyone felt the same way. As The Seattle Times reported, a representative for Universal Music Group, the major record label representing both performers, publicly objected to the track.
2. Outdated Copyright Law
If you are an artist who doesn't want your voice replicated, copyright law may not be able to help you right now. Our copyright rules were written at a time when this kind of AI technology didn't exist, so it's hardly surprising that the law is still catching up.
The earliest known US court case over "voice theft" dates back to the 1990s. According to the Los Angeles Times, a jury ordered Frito-Lay Inc., the powerful snack company behind Doritos, Cheetos, Cracker Jack, and other brands, to pay the artist Tom Waits $2.475 million in damages.
The offending commercial featured a voice similar enough to Tom Waits' that some listeners mistook it for the real musician. The same could be said of today's popular deepfake tracks, but AI-generated music has yet to be tested in court.
While we wait for the legal system to catch up, it's worth remembering that not everyone finds voice cloning problematic. Take Holly Herndon: to get ahead of the approaching tsunami of AI music apps, she chose to license Holly+, her official voice double, together with a system that compensates her fairly.
Whichever position you take, the problem remains: no copyright law specifically requires anyone to obtain an artist's consent before using their voice. Until that changes, artists may find themselves in the lawless frontier of AI technology, with no rules to guide them.
3. Who Gets Paid?
Is it OK to make money from someone else's voice? It's a complicated question, and one that could grow thornier as more deepfake music is uploaded to streaming services and sold on social media.
As we all know, it's fine to perform a cover of a well-known song and upload it to services like YouTube or Spotify. In that case, the lyrics, song structure, melody, rhythm, and other elements are reproduced. Vocal clones are very different: deepfake music doesn't riff on an existing song at all, but it does use another person's voice to create a brand-new one.
In other words, voice clones wouldn't exist without AI tools and unauthorised datasets. Artists devote their entire lives to developing the voice they were given and crafting a distinctive tone. Taking someone else's voice and using it to make money may simply be going too far.
4. A Gray-Area Genre
Deepfake music can be difficult to categorise because not everyone finds it wholly objectionable. Unlike deepfake photos or videos that you might glance at for a moment before scrolling past on your phone, deepfake music is turning into a genre of its own.
Some people liken it to fanfiction: a fun and inventive way to honour an artist. It's a more optimistic perspective, and one that makes it hard to dismiss deepfake music as simply off-limits. Holly Herndon's approach to AI voice cloning is a good example of that spirit.
Not everyone agrees that this genre of music should be allowed, though. According to the Financial Times, Universal Music Group, a prominent record label, has sought to have lower-quality songs, including some produced by AI, removed from streaming services. Ultimately, it will be up to platforms like Spotify, Apple Music, and Tidal to decide whether this type of music belongs on their services.
Deepfake music is rekindling arguments already playing out in the art world over whether AI-generated work can be considered art. The debate over AI music is only just beginning.
5. Ethical Concerns Around Race and Identity
Much of the music to emerge in the deepfake era imitates rap. That worries some people on grounds of race and identity, because the genre's historical roots lie with African American teenagers growing up in the Bronx, New York City, in the early 1970s.
Writer Lauren Chanel is one person who considers deepfake music a serious problem, as they explained in an interview cited by The New York Times.
This isn't the first time AI-generated music has crossed an ethical line. According to Rolling Stone, a virtual rapper named FN Meka was signed to a label but quickly dropped after an online civil rights organisation complained that the project promoted "gross stereotypes" about Black culture.
If anything, deepfake music is a reminder that the history of music-making and AI technology are inextricably linked, and that ignoring that history makes the harms of AI deepfakes more likely.
6. Causing Harm to Artists
A vocal clone can sing about events that never happened, convey feelings that aren't the artist's own, and feign interest in topics the artist may not care about. The emotional toll this takes on the artist shouldn't be disregarded.
The fake Drake song featuring The Weeknd mentioned Selena Gomez, whom The Weeknd used to date. Mixing real-life details with made-up lyrics created a strange distortion of reality, one that could be distressing for the artist who actually lived through the experience.
Similarly, the use of AI text generators to write songs "in the style of" well-known artists has left some of them despondent about the technology. In the words of Nick Cave, "What ChatGPT is, in this instance, is replication as travesty."
Deepfake music can make an artist appear to sing or say things they never did. On a personal level, the artist may suffer, with no power to have the work removed short of legal action.