The music business is trying to stop the theft and exploitation of art by generative AI on platforms, in court, and with lawmakers, but the fight is far from over. The scale of the problem is illustrated by Sony Music's recent statement that it has already requested the removal of 75,000 deepfakes: simulated images, songs, or videos that are easily mistaken for the real thing.
Artificial intelligence-generated music carries "telltale signs" and is easy to detect, according to the voice-analysis and security firm Pindrop, yet it appears to be everywhere. As Pindrop puts it, "Artificial intelligence-generated songs, even when they sound realistic, often have subtle irregularities in frequency variation, rhythm, and digital patterns that aren't present in human performances."
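Pindrop has not published how its detection works, so the following Python sketch is only a loose illustration of the kind of signal statistics such a detector might inspect, using the open-source librosa audio library; the feature choices, the function name, and any thresholds a real classifier would apply are assumptions, not Pindrop's method.

```python
import numpy as np
import librosa


def irregularity_features(path: str) -> dict:
    """Rough per-track statistics a detector might look at (illustrative only)."""
    y, sr = librosa.load(path, sr=None, mono=True)

    # Pitch track: oddly uniform (or implausibly low) variation can be a red flag.
    f0 = librosa.yin(y, fmin=65.0, fmax=2093.0, sr=sr)
    pitch_variation = float(np.nanstd(f0))

    # Beat timing: human players drift slightly; perfectly even inter-beat
    # intervals are suspicious.
    _, beats = librosa.beat.beat_track(y=y, sr=sr)
    intervals = np.diff(librosa.frames_to_time(beats, sr=sr))
    beat_jitter = float(np.std(intervals)) if len(intervals) > 1 else 0.0

    # Spectral flatness: a crude proxy for synthetic "digital" texture.
    flatness = float(np.mean(librosa.feature.spectral_flatness(y=y)))

    return {
        "pitch_variation": pitch_variation,
        "beat_jitter": beat_jitter,
        "spectral_flatness": flatness,
    }
```

A production detector would combine many more features with a trained model rather than rely on hand-picked statistics like these.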
However, it only takes a few minutes on YouTube or Spotify, two of the most popular music-streaming websites, to identify a phony 2Pac rap about pizzas or an Ariana Grande version of a K-pop song that she never sang.
Sam Duboff, Spotify's lead on policy, stated, "We take that really seriously, and we're trying to work on new tools in that space to make that even better." According to YouTube, it is "improving" its own detection of artificial intelligence scammers and may reveal the results in the coming weeks.
According to Jeremy Goldman, an analyst at the company Emarketer, “the bad actors were a little bit more aware sooner,” leaving musicians, labels, and other industry participants “operating from a position of reactivity.”
Goldman stated, “YouTube, with a multiple of billions of dollars per year, has a strong vested interest to solve this,” and he expressed confidence that they are making a sincere effort to do so.
“If you’re at YouTube, you don’t want the platform itself to turn into an artificial intelligence nightmare,” he stated.
Court Cases
The industry's concern goes beyond deepfakes, however: it is especially worried about the unauthorized use of its content to train generative artificial intelligence models such as Suno, Udio, and Mubert. Last year, a number of major labels sued Udio's parent company in federal court in New York, alleging that it developed its technology using "copyrighted sound recordings for the ultimate purpose of poaching the listeners, fans, and potential licensees of the sound recordings it copied."
More than nine months later, proceedings have yet to begin in earnest. The same is true of a related case brought against Suno in Massachusetts.
At the heart of the legal dispute is the concept of fair use, which permits limited use of copyrighted material without prior authorization and can therefore narrow the reach of intellectual property rights.
"It's an area of genuine uncertainty," said Vanderbilt University law professor Joseph Fishman. Any preliminary decisions will not necessarily be final, since diverging opinions from different judges could ultimately push the matter to the Supreme Court.
In the meantime, the main players in artificial intelligence-generated music continue to train their models on copyrighted content, which raises the question of whether the battle is already lost.
Fishman said it may be too early to conclude that: although many models are already trained on protected material, new versions of those models are released constantly, and it is unclear whether any court rulings would create licensing problems for them.
Deregulation
Labels, artists, and producers have had little success in the legislative arena. Numerous bills have been introduced in the US Congress, but no real progress has been made.
Some US states have passed protective laws, particularly concerning deepfakes. One of them is Tennessee, home to much of the influential country music business. Another possible obstacle is Donald Trump, the Republican president who has positioned himself as a champion of deregulation, particularly in artificial intelligence.
A number of AI giants, including Meta, have asked the administration to "clarify that the use of publicly available data to train models is unequivocally fair use." The courts are supposed to have the final say, but if Trump's White House follows that guidance, it could tip the scales against musicians.
The situation is not much better in Britain, where the Labour government is considering a change in the law that would allow AI companies to use creators' online content to build their models unless the rights holders object.
In February, more than a thousand performers, including Kate Bush and Annie Lennox, released Is This What We Want?, an album of silence recorded in multiple studios, in protest at such proposals.
According to analyst Goldman, artificial intelligence will probably keep plaguing the music industry as long as its response remains disorganized. "The music industry is so fragmented," he remarked. "In my opinion, that ultimately does more harm than good in terms of resolving this issue."
A new app aims to empower artists in the face of artificial intelligence.
Scriptwriter Ed Bennett-Coles claimed to have had a “death moment” in 2008 after reading a story about artificial intelligence writing its first screenplay.
Almost twenty years later, he and his friend, musician Jamie Hartman, have built a blockchain-based app that they believe will let writers, artists, and others take ownership of and safeguard their work. "Artificial intelligence is swooping in and taking so many people's jobs," Hartman said. Their app, he said, answers back: "No, this is our work. This is human, and since we own it, we determine its value." AI poses an increasingly serious threat to livelihoods and intellectual property across the creative industries.
The developers told AFP that their app, ARK, aims to track ownership of ideas and work from the original concept to the final product. A song demo, for instance, can be registered simply by uploading the file. The file is then tied to the artist who uploaded it through features including non-disclosure agreements, blockchain-based verification, and biometric security measures, and collaborators can log their own contributions as the creative process unfolds.
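ARK's internals have not been made public, so the sketch below is purely a hypothetical illustration, in Python, of the registration flow the developers describe: fingerprint the uploaded demo, tie it to the uploading artist's identity, and let collaborators append their contributions to an append-only, hash-chained record. Every class, field, and identifier here is invented for illustration.

```python
import hashlib
import json
import time
from dataclasses import dataclass, asdict


def fingerprint(path: str) -> str:
    """SHA-256 digest of the raw file bytes, used as a content fingerprint."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


@dataclass
class ProvenanceRecord:
    file_hash: str         # fingerprint of the uploaded demo
    contributor_id: str    # e.g. a verified artist identity
    role: str              # "author", "co-writer", "producer", ...
    timestamp: float
    prev_record_hash: str  # links this record to the previous one (hash chain)

    def record_hash(self) -> str:
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()


class ProvenanceLedger:
    """Append-only, hash-chained log of contributions to a work."""

    def __init__(self):
        self.records: list[ProvenanceRecord] = []

    def register(self, path: str, contributor_id: str, role: str = "author") -> ProvenanceRecord:
        prev = self.records[-1].record_hash() if self.records else "GENESIS"
        rec = ProvenanceRecord(fingerprint(path), contributor_id, role, time.time(), prev)
        self.records.append(rec)
        return rec


# Usage: the original artist registers a demo, then a collaborator logs a contribution.
# ledger = ProvenanceLedger()
# ledger.register("demo_v1.wav", "artist:jamie", role="author")
# ledger.register("demo_v2.wav", "artist:collaborator", role="co-writer")
```

Biometric checks, non-disclosure agreements, and the actual blockchain storage layer that ARK mentions would sit around a structure like this; none of them is modeled here.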
While his colleague nodded in agreement, Bennett-Coles stated that ARK “challenges the notion that the end product is the only thing worthy of value.” Maintaining “a process of human ingenuity and creativity, ring-fencing it so that you can actually still earn a living off it” is the aim, according to Hartman.
Checks and balances
The venture capital firm Claritas Capital has provided funding for ARK, which is scheduled to launch fully in the summer of 2025. ARK also has a strategic agreement with the performing rights organization BMI.
Its development has also involved a great deal of existential soul-searching for Hartman and Bennett-Coles. Bennett-Coles remarked, “I came across a quote yesterday that encapsulates it: the cancer cell’s philosophy is growth for growth’s sake.” “And artificial intelligence is that.”
“The sales justification gets faster and faster, but we really need to rediscover the love for process.” He compared the difference between artificial intelligence content and human-made art to a toddler going to the butcher with his grandfather as opposed to getting a chunk of meat from an online delivery service.
“As important as the actual purchase,” he continued, are the family moments spent together, such as the walk to and from the store and the chats in between errands. Similarly, “the journey Jamie takes in his car on his way to the studio may be just as significant to the composition of that song as the actual studio activities.”
They believe ARK can reclaim that creative process, which artificial intelligence devalues. According to Hartman, it serves as “a check and balance on behalf of the human being.”
Rising from the ashes
The developers of ARK said they concluded the software had to be decentralized, and therefore built on blockchain, with data kept on a kind of distributed digital ledger.
According to Bennett-Coles, “it must be decentralized in order to give the creator autonomy and sovereignty over their IP and control over their destiny.”
According to them, users would pay for ARK in tiers, with prices based on how much storage they need.
The scriptwriter explained that they want ARK registrations, whether as a "recording on the blockchain" or a "smart contract," to be usable in a court of law, describing the system as "a consensus mechanism."
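Assuming records like those in the earlier hypothetical sketch, serialized as dictionaries with an added record_hash field, verifying such a "recording on the blockchain" as evidence could look roughly like this: recompute every hash to confirm the chain has not been tampered with, then find the earliest registration of a given file fingerprint. This, again, is an assumed mechanism, not ARK's actual implementation.

```python
import hashlib
import json


def verify_chain(records: list[dict]) -> bool:
    """Recompute each record's hash and check that it matches the next record's
    prev_record_hash, so tampering with any earlier entry is detectable."""
    prev = "GENESIS"
    for rec in records:
        if rec["prev_record_hash"] != prev:
            return False
        body = {k: v for k, v in rec.items() if k != "record_hash"}
        prev = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if prev != rec["record_hash"]:
            return False
    return True


def earliest_claim(records: list[dict], file_hash: str):
    """Return the first verified registration of a given file fingerprint, if any."""
    if not verify_chain(records):
        raise ValueError("ledger integrity check failed")
    return next((r for r in records if r["file_hash"] == file_hash), None)
```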
"The process of registering has been fairly archaic for a long time, but the principle of copyright is pretty good as long as you can prove it and stand behind it," Hartman added. "Why not advance copyright, as far as how it's proven? We think we have a breakthrough." According to both artists, their industries have been too slow to adapt to the tremendous advances in artificial intelligence.
A large portion of the answer, according to Bennett-Coles, must begin with the artists experiencing similar “death moments” to his own years ago. He stated, “They can then decide what can be done after rising from the ashes.”
“What is important to us and what do we enjoy doing? How can we keep it that way?”
Q1: What is generative AI in the context of the music industry?
Generative AI refers to artificial intelligence systems capable of creating music, images, and videos that closely mimic real human creations, often without explicit permission from original artists.
Q2: Why is the music industry concerned about generative AI?
The industry worries that generative AI exploits artists’ works without permission, leading to lost income, unauthorized use of copyrighted material, and potential damage to the authenticity of their art.
Q3: How extensive is the issue of AI-generated deepfakes in music?
Sony Music alone has requested the removal of around 75,000 AI-generated deepfake songs, indicating the scale of the issue.
Q4: Can AI-generated songs be distinguished from human-created music?
Yes, according to experts like Pindrop, AI-generated music typically contains subtle irregularities in frequency, rhythm, and digital patterns not found in human performances.
Q5: What legal actions have been taken against AI companies?
Major record labels have sued AI companies like Udio and Suno, accusing them of illegally using copyrighted recordings to train AI models.
Q6: How are platforms like Spotify and YouTube responding to AI-generated content?
Spotify and YouTube claim to take the issue seriously and are working on advanced tools to detect and remove AI-generated content.