August 5, 2024

Artists flee Instagram for new app Cara in protest of Meta AI scraping


Painters, photographers and other artists have flocked to Instagram for years to share their portfolios and gain visibility. Now, many say they are leaving to prevent the app’s parent company Meta from using their art to train AI models.

Visual artists are resharing messages and templates on their accounts in protest, with many saying they are moving to Cara, a portfolio app for artists that bans AI posts and training. They are upset because a Meta executive stated in May that the company considers public Instagram posts part of its training data. A few weeks later, the company notified users in Europe that their posts would be used to train AI starting June 26. There is no way to opt out, though some places such as the European Union allow people to dispute Meta's use of their personal data.

Tension is mounting between online creators and AI companies. Right now, almost everything posted publicly on the internet is considered fair game for AI training. The end product has the potential to replace the very people who created the training data, including authors, musicians and visual artists.

Artists said they feel powerless — they need Meta apps to market themselves but can’t prevent their work from becoming fodder for AI. Some say they are already on the verge of losing their livelihoods.

Cara founder Jingna Zhang said the app has grown from about 40,000 users to 650,000 in the past week. At one point, it was the fifth most-downloaded social app in Apple’s App Store rankings. Whether the flight will make an impression on Meta is unclear.


“I haven’t slept,” said Zhang, a photographer and artists’ rights advocate. “We were not expecting this.”

Artists including Zhang have filed multiple lawsuits against AI companies such as Google and Stability AI. They say the companies are training their generators on material scraped from the internet, some of which is under copyright. Authors and publishers including George R.R. Martin and the New York Times have filed similar suits. The companies have argued that the training material is covered by the “fair use” doctrine, which allows for remixes and interpretations of existing content.

For now, many artists feel their only real power is to try to protect future work, and that means trying untested alternatives.

Zhang said the free Cara app, which launched in January 2023, is still in development and has crashed multiple times this week because of the overwhelming interest. Available on iOS, Android and the web, its home tab is an Instagram-esque feed of images with like, comment and repost buttons.

Artist Eva Redamonti said that she has seen “four or five” Instagram alternatives marketed to artists, but that it’s tough to assess which apps have her best interests in mind. Ben Zhao, a professor of computer science at the University of Chicago, said he has seen multiple apps attract users with promises they don’t keep. Some platforms intended for artists have already devolved into “AI farms,” he said. Zhao and fellow professor Heather Zheng co-created the tool Glaze, which helps protect artists’ work from AI mimicry and is available on Cara.

Artists are not allowed to share AI-generated work until “rampant ethical and data privacy issues” are resolved, Cara’s FAQ page says. It uses detection technology from AI company Hive to scan for rule-breakers and labels each uploaded image with a “NoAI” tag intended to discourage scraping. However, there is no way to prevent AI companies from taking the images anyway.

Some artists say AI has already affected their bottom lines.

When Kelly McKernan — an artist and illustrator from Nashville — joined Facebook and Instagram over a decade ago, the apps quickly became the best place to find clients. But from 2022 to 2023, their income dropped 30 percent as AI-generated images ballooned across the internet, they said. One day last year they Googled their own name, and the first result was an AI image in the style of their work. Meta’s AI scraping policy is the “last straw,” they said.

McKernan, along with two other artists, is now suing AI companies including Midjourney and Stability AI.

Allie Sullberg, a freelance illustrator, downloaded the Cara app this week after seeing many of her artist friends post on Instagram about AI scraping and the switch to Cara. She said she is exasperated that Meta is presenting its AI efforts as a tool for creators, who don’t materially benefit when models are trained on their work.

Users consent to Meta’s AI policies when they use its apps, in accordance with its privacy policy and terms. Sullberg said she first joined Instagram around 2011. The first consumer-facing generative image model, OpenAI’s DALL-E, debuted in 2021.

Meta spokesman Thomas Richards told The Washington Post that the company doesn’t have an opt-out option. “Depending on where people live, they can also object to the use of their personal information being used to build and train AI consistent with local privacy laws,” he said.

Jon Lam, a video game artist and creators’ rights activist, spent hours hunting for a way to opt out of AI scraping on Instagram. He found a form, only to learn it applied solely to users in Europe, which has a far-reaching privacy law. Lam said he is feeling “pure anger and fury” at Meta and other AI companies.

“These companies have turned on their customers. We were sold a false promise, which was that social media was built to stay connected to your friends and family and help you share what you’re up to,” Lam said. “A decade later, it’s just this platform for them to harvest data to train on.”

McKernan said they are hopeful that, as big lawsuits play out, actions by creators will put pressure on AI companies to change their policies.

“Complacency is what allows companies like Meta to keep treating content creators — the people who make them money — the way they treat us,” they said.
