How we should regulate AI is the trillion-dollar question


The explosion of generative artificial intelligence (GenAI) is stretching copyright principles to breaking point. Major GenAI platforms are being accused of copyright infringement, and the level of damages sought is eye-watering. In January 2023, Getty Images accused Stability AI of scraping 12 million copyrighted images to train its image-generation tool, and is demanding approximately $1.8 trillion in statutory damages (a figure consistent with the US statutory maximum of $150,000 per wilfully infringed work, multiplied across 12 million images).

In December 2023, The New York Times brought a case against OpenAI and Microsoft, alleging that millions of its articles had been scraped by the platforms. Though the figures have not been officially released, the damages sought are expected to run to billions of dollars.

It is not just major companies whose content is being hoovered up to feed GenAI’s insatiable appetite for data. All creators whose work exists online are also at risk, including artists, writers and musicians. A recently leaked 24-page document from the GenAI platform Midjourney revealed that around 16,000 artists had had their work scraped without their consent. Artists are taking matters into their own hands: the visual artists Sarah Andersen, Kelly McKernan and Karla Ortiz took Stability AI to court in California for downloading or otherwise acquiring copies of billions of copyrighted images, without the permission of their creators, to be used as “training images” forming a “software library” for a variety of visual GenAI platforms.

Litigation is spreading like wildfire, not just in the US but also in Europe. In another David-and-Goliath tale, a German photographer, Robert Kneschke, has taken on LAION, the open network known for releasing huge image-training datasets for prominent GenAI models. When Kneschke asked LAION to remove his images from its training datasets, the organisation replied that this was an “unjustified copyright claim”. In response, the photographer filed a lawsuit against LAION for copyright infringement in the German courts.

Restrict, or open up?

While the courts battle with copyright questions around fair use and exemptions, there are equally important policy questions at stake: will society benefit more by restricting, or by opening up, the public data sources available to train AI systems? Should copyright law provide a remedy to authors who, having made their works publicly available online, find that they have been swallowed up to create datasets that, in turn, are used to train GenAI systems? Should content creators have the right to authorise AI systems to collect and use their content as training data, or to opt out and block them from doing so?

Inevitably, as with any new technology, GenAI presents considerable challenges. Content creators have become the face of campaigns protesting against the treatment of artists by GenAI platforms; among them is the Polish artist Greg Rutkowski, known for his fantasy landscapes. Rutkowski has complained that typing a prompt such as “wizard with sword and a glowing orb of magic fire fights a fierce dragon Greg Rutkowski” will generate an image that looks very similar to his original work, thereby threatening his income.

Uncertainty creates risks

Is there a need for AI regulation? We say yes, absolutely. The current lack of regulation creates huge uncertainty. That, in turn, means huge risks. At the same time, too much regulation, or bad regulation, stifles innovation. Regulation is needed—but not too much and not too fast.

We certainly need mandatory disclosures about how AI models are trained and where their training data comes from, along with metrics to help benchmark the accuracy of a given model’s output. Further disclosure may be needed to help differentiate models by quality and legitimacy.

Another issue with regulation is that AI is evolving daily. By the time regulation is adopted, it is obsolete. What is the answer? For now, there is merit in introducing primary legislation setting out broad principles that can withstand the test of time, at least in the short term (say, three to five years). If more detail is needed, it can be introduced by way of secondary legislation. That secondary legislation can be fine-tuned over time, perhaps by an ad hoc regulator, operating under parliamentary scrutiny.

Finally, we need an international framework. The courts will struggle to apply national regulations to GenAI platforms that operate across borders. Global standards are needed, and possibly an independent intergovernmental body that can call itself the voice of the international AI community. Artists and the art world deserve it.

Pierre Valentin is a solicitor specialising in art law, a co-founder of Artistate and a consultant at Fieldfisher LLP. Eloise Calder is an admitted barrister and solicitor of the High Court of New Zealand.


