
Dead artists will be able ‘to continue reaching fans for generations to come’ thanks to AI, says CAA’s top strategist


Soon you may be able to watch artists like The Beatles come back from the dead to play “Hey Jude” live at Glastonbury or Coachella, thanks to AI. In fact, it’s already happening, sort of.

“A world where you think about the legacy of somebody, (and) their ability to continue making impact years and years and years to come, is an interesting one,” Alexandra Shannon, head of strategic development at talent agency Creative Artists Agency (CAA), said on a panel at Fortune’s Brainstorm AI conference in London when asked whether legendary bands like The Beatles could one day ‘work forever’.

“We’re seeing versions of that here in the U.K. with Abba Voyage.”

The Swedish pop group has been on tour in London since 2022—but in virtual form, with avatars of the four band members as they appeared in 1979.

“I think those sorts of experiences and ways to continue reaching fans for generations to come is a powerful opportunity,” Shannon, a Harvard Business School alum who started her career at Lehman Brothers in 2007 before its infamous collapse, predicted. 

“They are still able to reach fans and engage with fans in the right way,” Shannon added, with the caveat that “they were in control of that.”

It’s perhaps why CAA is already getting ahead of the curve (and those planning to clone performers without their consent) by creating “digital doubles” of its clients. 

“We are scanning their image, we’re scanning their voice, we’re scanning likeness and we are then storing that on their behalf,” Shannon, who joined the agency behind Aerosmith and Cardi B in 2021, revealed.

“We know that the law is going to take time to catch up and so this is a mechanism for our clients to actually own and have permissions around their digital identity.”

“This provides a way for us to help set a precedent for anyone who wants to work with one of our clients in their digital identity,” she added. “There’s a mechanism to have them be compensated.”

But if you want to use digital lookalikes of celebrities—alive or dead—don’t expect to get a discount just because it’s not the real deal.

“If you’re going to work with somebody’s digital self, you aren’t working with that business because you think you can work with that person in a cheaper way that is creating some big cost efficiency for you,” Shannon warned. 

“At the end of the day, you’re working with somebody—the value is still in that person representing your brand.”

‘This assault on human creativity must be stopped’

Shannon’s comments come as music industry titans like Nicki Minaj, Katy Perry, and Billie Eilish have thrown their weight behind an open letter calling for a crackdown on their material being used to train AI without their permission.

Their open letter—posted to the online community Medium earlier this month—calls on tech companies and AI developers to “cease the use of AI to infringe upon and devalue the rights of human artists.”

“These efforts are directly aimed at replacing the work of human artists with massive quantities of AI-created ‘sounds’ and ‘images’ that substantially dilute the royalty pools that are paid out to artists,” the letter, posted by the Artists Rights Alliance, continues.

“Unchecked, AI will set in motion a race to the bottom that will degrade the value of our work and prevent us from being fairly compensated for it,” it adds.

Among the group of 200 signatories are the estates of artists who are no longer with us, like that of Bob Marley.

The letter finishes: “This assault on human creativity must be stopped. We must protect against the predatory use of AI to steal professional artists’ voices and likenesses, violate creators’ rights, and destroy the music ecosystem.”

The artists also demand direct action, asking tech giants, digital music platforms, and AI developers to pledge not to develop or deploy AI-generated tools or music that undermines their work or uses it without compensation.

Representatives for many of the artists, reached via record labels Universal Music Group, Interscope Records, and Glassnote Music, confirmed to Fortune that their talent had signed the letter.

The letter is far from the first to land on the desks of up-and-coming AI CEOs.

In July last year a handful of writers, including comedian Sarah Silverman, brought class-action complaints against Meta and ChatGPT maker OpenAI for “remixing the copyrighted works of thousands of book authors—and many others—without consent, compensation, or credit.”

The plaintiffs, who include authors Paul Tremblay and Christopher Golden, are represented by Joseph Saveri and Matthew Butterick, who said they are standing up on behalf of authors to continue a “vital conversation about how A.I. will coexist with human culture and creativity.”

The case was partially dismissed in February, per Reuters, with Judge Araceli Martínez-Olguín rejecting claims of copyright infringement as well as claims that the businesses unjustly enriched themselves through other people’s work.

Hollywood has also ground to a halt over the issue.

In June 2023, Rolling Stone obtained a letter signed by prominent members of the Screen Actors Guild threatening to strike if their negotiating committee couldn’t reach a deal with major studios over issues such as streaming and AI.

Actors including Meryl Streep, Jennifer Lawrence and Ben Stiller reportedly wrote: “We do not believe that SAG-AFTRA members can afford to make halfway gains in anticipation that more will be coming in three years, and we think it is absolutely vital that this negotiation protects not just our likenesses, but makes sure we are well compensated when any of our work is used to train AI.”

A raft of agreements have since been made—though some voice actors believe the promises still don’t go far enough to protect them.




