August 5, 2024

When Adobe promised not to train AI on artists’ content, the creative community reacted with skepticism


When users discovered Adobe’s new terms of service (quietly updated in February), a wave of outrage followed. Adobe had informed users that it could access their content “through both automated and manual methods” and use “techniques such as machine learning to improve [its] Services and Software.” Many interpreted the update as an attempt by the company to force users to grant unlimited access to their work in order to train Adobe’s generative AI model, Firefly.

The creative community’s reaction

After artists protested, Adobe issued a clarification: in a new version of its terms of service, the company pledged not to train AI on users’ content stored locally or in the cloud and gave users the option to opt out of content analysis. However, many artists remain skeptical about Adobe’s real intentions.

Jon Lam, a senior storyboard artist at Riot Games, says, “They have already betrayed our trust,” referring to the case of award-winning artist Brian Kesinger, who discovered that AI-generated images in the style of his work were being sold on Adobe’s stock image platform without his consent. The estate of photographer Ansel Adams has also publicly rebuked Adobe for allegedly selling AI-generated imitations of his work.

Adobe’s position and artists’ concerns

Scott Belsky, Adobe’s chief strategy officer, sought to reassure artists by clarifying that the “machine learning” in the terms refers to Adobe’s non-generative AI features, such as Photoshop’s Content-Aware Fill. Despite these reassurances, artists like Lam remain convinced that the company will use work created on its platform to train Firefly without the creators’ consent.

The issue of intellectual property and generative AI

Concerns about generative AI models using and monetizing copyrighted works without consent are not new. Artist Karla Ortiz found that various generative AI models could produce images of her work when prompted with her name, which led to a class action lawsuit against Midjourney, DeviantArt, and Stability AI. Polish fantasy artist Greg Rutkowski likewise discovered that his name was one of the most commonly used prompts in Stable Diffusion when the tool launched in 2022.

Adobe’s monopoly and the challenges for artists

As the owner of Photoshop and the creator of the PDF format, Adobe has dominated the creative industry for more than 30 years. Its attempted acquisition of product design company Figma was abandoned in 2023 amid antitrust objections, a testament to its market power.

Concerns about Firefly’s training

Adobe claims that Firefly is “ethically trained” on Adobe Stock, but Eric Urquhart, a longtime stock image contributor, insists that “there was nothing ethical about the way Adobe trained the AI for Firefly,” pointing out that Adobe does not own the rights to individual contributors’ images. Urquhart originally uploaded his images to Fotolia, a stock photography site, under license terms that said nothing about generative AI. Adobe acquired Fotolia in 2015 and quietly changed its terms of service, which later allowed the company to train Firefly on Urquhart’s photos without his explicit consent.

The impact on artists and regulatory initiatives

Since Firefly’s introduction, some artists have made the difficult decision to cancel their Adobe subscriptions, switching to tools such as Affinity and Clip Studio. Others feel locked into the software. “Professionally, I can’t give up Adobe,” says Urquhart.

Adobe has acknowledged its responsibility to the creative community before. In September 2023, the company announced a legislative proposal, the Federal Anti-Impersonation Right (FAIR) Act, intended to protect artists from the misappropriation of their work. However, the proposal’s effectiveness has been questioned, since it would not cover works “accidentally” generated in an artist’s style.

Beyond Adobe, other organizations are finding new ways to authenticate works and prevent intellectual property theft. A team of researchers at the University of Chicago has developed Nightshade, a tool that “poisons” training data and corrupts future iterations of image-generating AI models, and Glaze, a tool that helps artists “mask” their signature styles from AI companies.
