Artists Are Deleting Instagram For New App Cara In Protest of Meta AI Scraping
Friday, June 7, 2024, 01:20 AM, from Slashdot
Some artists are jumping ship to the anti-AI portfolio app Cara after Meta began using Instagram content to train its AI models. Fast Company explains: The portfolio app bills itself as a platform that protects artists' images from being used to train AI and allows AI content to be posted only if it's clearly labeled. Based on the number of new users the Cara app has garnered over the past few days, there seems to be a need. Between May 31 and June 2, Cara's user base tripled from fewer than 100,000 to more than 300,000 profiles, skyrocketing to the top of the app store.

Cara is a social networking app for creatives, in which users can post images of their artwork, memes, or just their own text-based musings. It shares similarities with major social platforms like X (formerly Twitter) and Instagram on a few fronts. Users can access Cara through a mobile app or in a browser; both options are free to use. The UI itself is like an arts-centric combination of X and Instagram. In fact, some UI elements seem like they were pulled directly from other social media sites. (It's not the most innovative approach, but it is strategic: as a new app, any barriers to potential adoption need to be low.)
Cara doesn't train any AI models on its content, nor does it allow third parties to do so. According to Cara's FAQ page, the app aims to protect its users from AI scraping by automatically applying 'NoAI' tags to all of its posts. The website says these tags 'are intended to tell AI scrapers not to scrape from Cara.' Ultimately, they appear to be HTML metadata tags that politely ask bad actors not to get up to any funny business, and it's pretty unlikely that they hold any actual legal weight. Cara admits as much, warning its users that the tags aren't a 'fully comprehensive solution and won't completely prevent dedicated scrapers.' With that in mind, Cara describes the 'NoAI' tagging system as a 'necessary first step in building a space that is actually welcoming to artists -- one that respects them as creators and doesn't opt their work into unethical AI scraping without their consent.'

In December, Cara launched another tool called Cara Glaze to defend its artists' work against scrapers. (Users can only use it a limited number of times.) Glaze, developed by the SAND Lab at the University of Chicago, makes it much more difficult for AI models to accurately understand and mimic an artist's personal style. The tool works by learning how AI bots perceive artwork and then making a set of minimal changes that are invisible to the human eye but confusing to the AI model. The AI bot then has trouble 'translating' the art style and generates warped recreations. In the future, Cara also plans to implement Nightshade, another University of Chicago tool that helps protect artwork against AI scrapers. Nightshade 'poisons' AI training data by adding invisible pixels to artwork that can cause AI software to completely misunderstand the image.

Beyond establishing shields against data mining, Cara also uses a third-party service to detect and moderate any AI artwork that's posted to the site. Non-human artwork is forbidden unless it's been properly labeled by the poster.

Read more of this story at Slashdot.
https://tech.slashdot.org/story/24/06/06/2014232/artists-are-deleting-instagram-for-new-app-cara-in-...
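Cara's FAQ doesn't spell out the exact markup of its 'NoAI' tags, but machine-readable opt-outs of this kind are typically expressed as robots meta tags or an X-Robots-Tag response header. The sketch below is a minimal Python illustration of how a compliant scraper might check for such directives before collecting an image; the 'noai'/'noimageai' directive names, the helper function, and the example URL are assumptions for illustration, not Cara's documented implementation.

```python
# Minimal sketch (not Cara's documented implementation): check whether a page
# opts out of AI training via a robots meta tag or an X-Robots-Tag header.
# The directive names "noai" and "noimageai" are assumed conventions.
from html.parser import HTMLParser
from urllib.request import urlopen

NO_AI_DIRECTIVES = {"noai", "noimageai"}


class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag.lower() != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == "robots":
            content = (attrs.get("content") or "").lower()
            self.directives.update(t.strip() for t in content.split(","))


def ai_scraping_disallowed(url: str) -> bool:
    """Return True if the page asks AI scrapers to stay away."""
    with urlopen(url) as resp:
        header = (resp.headers.get("X-Robots-Tag") or "").lower()
        body = resp.read().decode("utf-8", errors="replace")

    parser = RobotsMetaParser()
    parser.feed(body)

    header_directives = {t.strip() for t in header.split(",")}
    return bool(NO_AI_DIRECTIVES & (parser.directives | header_directives))


if __name__ == "__main__":
    # Hypothetical URL, for illustration only.
    print(ai_scraping_disallowed("https://example.com/artwork"))
```

As the article notes, nothing forces a scraper to run a check like this; the tags only matter to crawlers that choose to respect them.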
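The Glaze description above amounts to an adversarial perturbation against the feature representations image models learn. The following is a rough, hypothetical sketch of that general idea, not the actual Glaze algorithm (which steers style features toward a different target with its own loss and optimizer): nudge an image within a small pixel budget so a pretrained feature extractor's output drifts away from the original while the change stays hard for a human to notice. The choice of ResNet-18, the loss, the step count, and the epsilon budget are all illustrative assumptions.

```python
# Rough sketch of the *general* adversarial-perturbation idea behind tools
# like Glaze -- NOT the Glaze algorithm. Model, loss, step count, and the
# epsilon pixel budget are illustrative assumptions.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image


def cloak(image_path: str, epsilon: float = 4 / 255, steps: int = 20) -> torch.Tensor:
    """Return a subtly perturbed copy of the image whose features differ."""
    # Pretrained feature extractor standing in for "how AI bots perceive artwork".
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
    extractor = torch.nn.Sequential(*list(backbone.children())[:-1])
    for p in extractor.parameters():
        p.requires_grad_(False)

    img = TF.to_tensor(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        original_features = extractor(img)

    delta = torch.zeros_like(img, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=1e-2)

    for _ in range(steps):
        features = extractor((img + delta).clamp(0, 1))
        # Maximize the distance from the original representation
        # (an untargeted variant; Glaze instead steers toward a target style).
        loss = -torch.nn.functional.mse_loss(features, original_features)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)  # keep the change visually small

    return (img + delta).clamp(0, 1).squeeze(0).detach()
```

Glaze and Nightshade differ in aim (Glaze is meant to block style mimicry, while Nightshade perturbs images so models trained on them misread what they depict), but both rely on this kind of small, bounded, optimization-driven change to the pixels.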