MIT Technology Review » Artificial Intelligence · July 10, 17:34
This tool strips away anti-AI protections from digital art
LightShed is a new technique designed to dismantle the protections artists use to keep their work from being used for AI training. It marks a new phase in the years-long cat-and-mouse game between artists and AI proponents. As generative AI models demand vast quantities of visual material, artists worry that AI will learn and mimic their styles, threatening their livelihoods. LightShed can "wash off" the "poison" applied by tools such as Glaze and Nightshade, making artwork usable for AI training once again. Even so, LightShed's developers stress that the goal is not to steal artists' work, but to warn artists against placing false confidence in existing protection tools.

🎨 Generative AI models need large amounts of visual data for training, and those data sets may include copyrighted artwork without permission, raising artists' concerns that AI will mimic their styles.

🛡️ Tools such as Glaze and Nightshade "poison" artwork by altering its pixels so that AI training systems misread it, protecting an artist's style and subject matter.

💡 LightShed can identify and remove the "poison" applied by tools such as Glaze and Nightshade, making the artwork usable for AI training again.

⚠️ LightShed's developers stress that their work is not meant to steal artwork, but to remind artists that existing protection tools are not a permanent solution.

⚖️ Despite LightShed, the creators of Glaze and Nightshade argue that these tools still act as a deterrent, raising the cost for AI companies of taking artists' work and pushing them to collaborate with artists instead.

A new technique called LightShed will make it harder for artists to use existing protective tools to stop their work from being ingested for AI training. It’s the next step in a cat-and-mouse game—across technology, law, and culture—that has been going on between artists and AI proponents for years. 

Generative AI models that create images need to be trained on a wide variety of visual material, and data sets that are used for this training allegedly include copyrighted art without permission. This has worried artists, who are concerned that the models will learn their style, mimic their work, and put them out of a job.

These artists got some potential defenses in 2023, when researchers created tools like Glaze and Nightshade to protect artwork by “poisoning” it against AI training (Shawn Shan was even named MIT Technology Review’s Innovator of the Year last year for his work on these). LightShed, however, claims to be able to subvert these tools and others like them, making it easy for the artwork to be used for training once again.

To be clear, the researchers behind LightShed aren’t trying to steal artists’ work. They just don’t want people to get a false sense of security. “You will not be sure if companies have methods to delete these poisons but will never tell you,” says Hanna Foerster, a PhD student at the University of Cambridge and the lead author of a paper on the work. And if they do, it may be too late to fix the problem.

AI models work, in part, by implicitly creating boundaries between what they perceive as different categories of images. Glaze and Nightshade change enough pixels to push a given piece of art over this boundary without affecting the image’s quality, causing the model to see it as something it’s not. These almost imperceptible changes are called perturbations, and they mess up the AI model’s ability to understand the artwork.
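The pixel-level changes described above are, in spirit, adversarial perturbations. A minimal NumPy sketch of the core idea, using a hypothetical sign-step perturbation bounded by a small budget `eps` (Glaze and Nightshade use far more sophisticated, style-targeted optimization, so treat this only as an illustration):

```python
import numpy as np

def perturb(image, grad, eps=0.03):
    """Nudge each pixel by at most eps in the direction that increases
    the model's loss (a sign step), leaving the image visually
    near-identical while pushing it toward a different category."""
    adv = image + eps * np.sign(grad)
    return np.clip(adv, 0.0, 1.0)   # keep pixel values valid

rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))             # toy 8x8 RGB "artwork" in [0, 1]
grad = rng.standard_normal(img.shape)   # stand-in for a real loss gradient
adv = perturb(img, grad)
print("max pixel change:", np.abs(adv - img).max())
```

Because every pixel moves by at most `eps`, the perturbed image looks unchanged to a person, yet the accumulated shift can carry it across a model's learned category boundary.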

Glaze makes models misunderstand style (e.g., interpreting a photorealistic painting as a cartoon). Nightshade instead makes the model see the subject incorrectly (e.g., interpreting a cat in a drawing as a dog). Glaze is used to defend an artist’s individual style, whereas Nightshade is used to attack AI models that crawl the internet for art.

Foerster worked with a team of researchers from the Technical University of Darmstadt and the University of Texas at San Antonio to develop LightShed, which learns how to see where tools like Glaze and Nightshade splash this sort of digital poison onto art so that it can effectively clean it off. The group will present its findings at the Usenix Security Symposium, a leading global cybersecurity conference, in August. 

The researchers trained LightShed by feeding it pieces of art with and without Nightshade, Glaze, and other similar programs applied. Foerster describes the process as teaching LightShed to reconstruct “just the poison on poisoned images.” Identifying a cutoff for how much poison will actually confuse an AI makes it easier to “wash” just the poison off. 
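The training recipe described here can be caricatured in a few lines: given pairs of the same image with and without poison, estimate the perturbation as a residual, then subtract that estimate from new poisoned images. This deliberately naive NumPy sketch assumes a single shared poison pattern; the real LightShed is a learned model that generalizes across images and tools, and all names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

clean = rng.random((100, 64))   # 100 toy "artworks", flattened to vectors
# Hypothetical shared perturbation standing in for a tool's "poison"
poison_pattern = 0.05 * np.sign(rng.standard_normal(64))
poisoned = np.clip(clean + poison_pattern, 0.0, 1.0)

# "Training": reconstruct just the poison by averaging the residual
# across known (poisoned, clean) pairs.
estimated_poison = (poisoned - clean).mean(axis=0)

# "Washing": subtract the estimated poison from an unseen poisoned image.
new_clean = rng.random(64)
new_poisoned = np.clip(new_clean + poison_pattern, 0.0, 1.0)
washed = np.clip(new_poisoned - estimated_poison, 0.0, 1.0)

print("error before wash:", np.abs(new_poisoned - new_clean).mean())
print("error after wash: ", np.abs(washed - new_clean).mean())
```

The washed image lands much closer to the original than the poisoned one did, which is the whole point: once the perturbation can be reconstructed, it can be removed.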

LightShed is incredibly effective at this. While other researchers have found simple ways to subvert poisoning, LightShed appears to be more adaptable. It can even apply what it’s learned from one anti-AI tool—say, Nightshade—to others like Mist or MetaCloak without ever seeing them ahead of time. While it has some trouble performing against small doses of poison, those are less likely to kill the AI models’ abilities to understand the underlying art, making it a win-win for the AI—or a lose-lose for the artists using these tools.

Around 7.5 million people, many of them artists with small and medium-size followings and fewer resources, have downloaded Glaze to protect their art. Those using tools like Glaze see it as an important technical line of defense, especially when the state of regulation around AI training and copyright is still up in the air. The LightShed authors see their work as a warning that tools like Glaze are not permanent solutions. “It might need a few more rounds of trying to come up with better ideas for protection,” says Foerster.

The creators of Glaze and Nightshade seem to agree with that sentiment: The website for Nightshade warned the tool wasn’t future-proof before work on LightShed ever began. And Shan, who led research on both tools, still believes defenses like his have meaning even if there are ways around them. 

“It’s a deterrent,” says Shan—a way to warn AI companies that artists are serious about their concerns. The goal, as he puts it, is to put up as many roadblocks as possible so that AI companies find it easier to just work with artists. He believes that “most artists kind of understand this is a temporary solution,” but that creating those obstacles against the unwanted use of their work is still valuable.

Foerster hopes to use what she learned through LightShed to build new defenses for artists, including clever watermarks that somehow persist with the artwork even after it’s gone through an AI model. While she doesn’t believe this will protect a work against AI forever, she thinks this could help tip the scales back in the artist’s favor once again.
