2024-06-22 15:44:26

Aeontropy on Nostr:

When users first found out about Adobe’s new terms of service (which were quietly updated in February), there was an uproar. Adobe told users it could access their content “through both automated and manual methods” and use “techniques such as machine learning in order to improve [Adobe’s] Services and Software.” Many understood the update as the company forcing users to grant unlimited access to their work, for purposes of training Adobe’s generative AI, known as Firefly.

Late on Tuesday, Adobe issued a clarification: In an updated version of its terms of service agreement, it pledged not to train AI on its users' content stored locally or in the cloud and gave users the option to opt out of content analytics.

The ambiguous language of the earlier terms update, which arrived while Adobe was caught in the crossfire of intellectual property lawsuits, shed light on a climate of acute skepticism among artists, many of whom rely heavily on Adobe for their work. “They already broke our trust,” says Jon Lam, a senior storyboard artist at Riot Games, referring to how award-winning artist Brian Kesinger discovered generated images in the style of his art being sold under his name on Adobe's stock image site without his consent. Earlier this month, the estate of the late photographer Ansel Adams publicly scolded Adobe for allegedly selling generative AI imitations of his work.

Scott Belsky, Adobe’s chief strategy officer, had tried to assuage concerns when artists started protesting, clarifying that machine learning refers to the company’s non-generative AI tools. Photoshop’s Content-Aware Fill, which lets users seamlessly remove objects from an image, is one of many features built on machine learning. But while Adobe insists that the updated terms do not give the company ownership of user content and that it will never use that content to train Firefly, the misunderstanding triggered a bigger discussion about the company’s market monopoly and how a change like this could threaten artists’ livelihoods at any point. Lam is among the artists who still believe that, despite Adobe’s clarification, the company will use work created on its platform to train Firefly without creators’ consent.

The nervousness over nonconsensual use and monetization of copyrighted work by generative AI models is not new. Early last year, artist Karla Ortiz found that prompting various generative AI models with her name produced images imitating her work, an offense that gave rise to a class action lawsuit against Midjourney, DeviantArt, and Stability AI. Ortiz was not alone: Polish fantasy artist Greg Rutkowski found that his name was one of the most commonly used prompts in Stable Diffusion when the tool first launched in 2022.

As the owner of Photoshop and creator of the PDF format, Adobe has reigned as the industry standard for over 30 years, powering the majority of the creative class. Its attempt to acquire the product design company Figma was abandoned in 2023 amid antitrust scrutiny, a testament to the company’s size.

Adobe specifies that Firefly is “ethically trained” on Adobe Stock, but Eric Urquhart, a longtime stock image contributor, insists that “there was nothing ethical about how Adobe trained the AI for Firefly,” pointing out that Adobe does not own the rights to any images from individual contributors. Urquhart originally put his images up on Fotolia, a stock image site, under licensing terms that did not specify any uses for generative AI. Adobe acquired Fotolia in 2015 and rolled out silent terms-of-service updates that later allowed the company to train Firefly on Urquhart’s photos without his explicit consent: “The language in the current change of TOS, it’s very similar to what I saw in the Adobe Stock TOS.”

Since the introduction of Firefly, some artists have made the difficult (and arduous) decision to cancel their Adobe memberships and pivot to tools like Affinity and Clip Studio. Others feel forcibly bound to the software. “Professionally, I can’t quit Adobe,” says Urquhart.

Adobe has acknowledged its responsibility to the creative community in the past. In September 2023, the company announced the Federal Anti-Impersonation Right (FAIR) Act, a legislative initiative that aims to protect artists from misappropriation of their work. The proposal addresses only intentional impersonations used for commercial purposes, raising questions about efficacy (the act would not protect works “accidentally generated” in the style of an artist) and privacy (proving intent would require storing and monitoring user prompts).

Outside of Adobe, organizations are finding new ways to help authenticate works and prevent intellectual property theft. A team of researchers at the University of Chicago developed Nightshade, a tool that “poisons” training data and damages iterations of image-generating AI models, and Glaze, a tool that helps artists “mask” their signature styles from AI companies; a rough sketch of the poisoning idea follows below. In terms of regulation, the Concept Art Association, an organization Lam is also a part of, advocates for artists’ rights with crowd-funded lobbying efforts.
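For the technically curious, the core idea behind such poisoning tools can be described as constrained optimization: nudge an image’s pixels, within a budget small enough to be imperceptible, until a model’s feature extractor associates the image with a different concept. The toy Python sketch below illustrates that idea with a stand-in linear feature extractor; every name, size, and parameter here is a hypothetical assumption for illustration, and this is not Nightshade’s or Glaze’s actual algorithm.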
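```python
# Toy sketch of the *general idea* of training-data poisoning
# (NOT Nightshade's real method, which targets the feature spaces of
# actual diffusion models). All names and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

D_PIXELS, D_FEATURES = 64, 16                 # hypothetical sizes
W = rng.normal(size=(D_FEATURES, D_PIXELS))   # stand-in "feature extractor"

image = rng.uniform(0.0, 1.0, size=D_PIXELS)    # flattened original artwork
target_features = rng.normal(size=D_FEATURES)   # features of a decoy concept

EPSILON = 0.03   # max per-pixel change, so the edit stays visually subtle
LR = 1e-3        # gradient step size

poisoned = image.copy()
for _ in range(500):
    # Loss: squared distance between the image's features and the decoy
    # concept's features; the gradient is analytic for a linear map.
    residual = W @ poisoned - target_features
    grad = 2.0 * W.T @ residual
    poisoned -= LR * grad
    # Project back: keep the perturbation imperceptible and pixels valid.
    poisoned = np.clip(poisoned, image - EPSILON, image + EPSILON)
    poisoned = np.clip(poisoned, 0.0, 1.0)

print("max pixel change:", np.abs(poisoned - image).max())
print("feature distance to decoy:",
      np.linalg.norm(W @ image - target_features), "->",
      np.linalg.norm(W @ poisoned - target_features))
```
The sketch is plain projected gradient descent: each step moves the image’s features toward the decoy concept, then clips the change back inside the pixel budget so the artwork still looks unaltered to a human viewer.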

https://www.wired.com/story/adobe-says-it-wont-train-ai-using-artists-work-creatives-arent-convinced/

Author Public Key
npub1kade5vf37snr4hv5hgstav6j5ygry6z09kkq0flp47p8cmeuz5zs7zz2an