NEW YORK — Artists under siege from artificial intelligence that studies their work, then replicates their styles, have teamed up with university researchers to stymie such copycat activity.
Paloma McClain, an illustrator in the United States, went into defense mode after learning that several AI models had been “trained” using her art, with no credit or compensation sent her way.
“It bothered me,” McClain told Agence France-Presse. “I believe truly meaningful technological advancement is done ethically and elevates all people instead of functioning at the expense of others.”
The artist turned to free software called “Glaze,” created by researchers at the University of Chicago.
Glaze essentially outthinks AI models when it comes to how they train, tweaking pixels in ways indiscernible to human viewers but that make a digitized piece of art appear dramatically different to AI.
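Glaze’s exact method is the Chicago team’s own; purely as a rough illustration of the general cloaking idea, the sketch below perturbs an image within a small pixel budget so that a feature extractor reads it differently while a human viewer would notice little. The tiny encoder, the budget, and the step counts are illustrative assumptions, not Glaze’s actual components.

```python
# A minimal sketch of the general style-cloaking idea, NOT Glaze's actual
# algorithm: nudge pixels within a small budget so a feature extractor
# "sees" the image differently, while the change stays visually negligible.
# The tiny CNN below is an illustrative stand-in for a real style encoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

encoder = nn.Sequential(                      # stand-in feature extractor
    nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(),
)
for p in encoder.parameters():
    p.requires_grad_(False)

def cloak(image, target, budget=0.03, steps=50, lr=0.01):
    """Perturb `image` (1x3xHxW in [0,1]) so its features move toward
    `target`'s, keeping every pixel change within +/- `budget`."""
    delta = torch.zeros_like(image, requires_grad=True)
    target_feat = encoder(target)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        loss = F.mse_loss(encoder(image + delta), target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)                     # imperceptibility bound
            delta.copy_((image + delta).clamp(0, 1) - image)  # keep pixels valid
    return (image + delta).detach()

art = torch.rand(1, 3, 64, 64)    # the artist's original image
decoy = torch.rand(1, 3, 64, 64)  # an image in an unrelated style
protected = cloak(art, decoy)
print(f"max pixel change: {(protected - art).abs().max():.3f}")  # <= 0.03
```

The same projected-gradient pattern underlies many adversarial-perturbation tools: optimize the change for the machine’s perception while clamping it below what humans notice.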
“We’re basically providing technical tools to help protect human creators against invasive and abusive AI models,” said professor of computer science Ben Zhao of the Glaze team.
Developed in just four months, Glaze was spun off from technology used to disrupt facial recognition systems.
“We were working at super fast speed because we knew the problem was serious,” Zhao said of the urgency to defend artists from software imitators. “A lot of people were in pain.”
Generative AI giants have agreements to use data for training in some cases, but the majority of digital images, audio and text used to shape the way super smart software thinks have been scraped from the internet without explicit consent.
Since its release in March, Glaze has been downloaded more than 1.6 million times, Zhao said.
Zhao’s team is working on a Glaze enhancement called “Nightshade” that strengthens defenses by confusing AI, say, by getting it to interpret a dog as a cat.
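Nightshade’s mechanics are likewise the team’s own; to convey only the general data-poisoning intuition, here is a crude, self-contained toy in which mislabeled samples drag a model’s notion of “dog” into cat territory. Every name and number in it is an assumption made for illustration.

```python
# A crude toy of the data-poisoning idea, NOT Nightshade's actual algorithm.
# A trivial "generative model" produces a dog by averaging all training
# samples labeled "dog"; poisoned samples that carry the "dog" label but sit
# in cat feature space drag that prototype toward the cats.
import numpy as np

rng = np.random.default_rng(0)

# 2-D stand-in features: dogs cluster near (0, 0), cats near (5, 5).
clean_dogs = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(100, 2))
cats = rng.normal(loc=(5.0, 5.0), scale=0.5, size=(100, 2))

# Poison: cat-like features mislabeled "dog", as a cloaked image would be.
poison = rng.normal(loc=(5.0, 5.0), scale=0.5, size=(400, 2))

dog_prototype_clean = clean_dogs.mean(axis=0)
dog_prototype_poisoned = np.vstack([clean_dogs, poison]).mean(axis=0)

print("clean 'dog' prototype:   ", dog_prototype_clean.round(2))     # ~ [0. 0.]
print("poisoned 'dog' prototype:", dog_prototype_poisoned.round(2))  # ~ [4. 4.], cat-like
```

The toy needs a large share of poison because its “model” is a simple average; the point is only the direction of the effect, a model asked for a dog that produces something cat-like.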
The team has been approached by several companies that want to use Nightshade, Zhao said.
“The goal is for people to be able to protect their content, whether it’s individual artists or companies with a lot of intellectual property,” Zhao said.
Startup Spawning has developed Kudurru, software that detects attempts to harvest large numbers of images from an online venue.
An artist can then block access or send images that do not match what is being requested, Spawning co-founder Jordan Meyer said.
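Spawning has not detailed Kudurru’s internals in this report; as a hedged sketch of one common way such detection can work, the code below does sliding-window rate counting per client and answers heavy hitters with a decoy. The window, threshold and decoy policy are assumptions, not Spawning’s code.

```python
# A minimal sketch of scraper detection in the spirit of Kudurru, NOT
# Spawning's actual implementation. Image requests are counted per client
# over a sliding window; clients that exceed the threshold get a decoy
# instead of the requested artwork.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60.0
MAX_REQUESTS = 100          # assumed "bulk harvesting" threshold

_history: defaultdict[str, deque] = defaultdict(deque)

def is_scraper(client_ip: str) -> bool:
    """True if this client has requested too many images recently."""
    now = time.monotonic()
    hits = _history[client_ip]
    hits.append(now)
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()      # forget requests outside the window
    return len(hits) > MAX_REQUESTS

def serve_image(client_ip: str, image_bytes: bytes, decoy_bytes: bytes) -> bytes:
    # Scraper-like clients receive a decoy that does not match the request;
    # a site could also simply deny access (e.g., with an HTTP 403).
    return decoy_bytes if is_scraper(client_ip) else image_bytes
```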
More than a thousand websites have already been integrated into the Kudurru network.
“The best solution would be a world in which all data used for AI is subject to consent and payment,” Meyer said. “We hope to push developers in this direction.”
Agencies via Xinhua