
Nightshade, an AI-based tool designed to help artists protect their artwork, is now available in version 1.0.


The development team recommends that artists apply Nightshade before Glaze, another AI-based artwork protection tool (see below), and that Glaze always be used as well.

The interaction between the two tools is currently being tested. A combined Glaze/Nightshade tool is in development.

After downloading Nightshade, users will need Internet access and approximately 4 GB of disk space for additional ML libraries and resources.


Users who already have Glaze installed can reuse the resource files with Nightshade, so no large additional downloads are required.

Before using Nightshade, the team recommends that you read the user manual for step-by-step instructions.

The application can be downloaded free of charge for both Windows (GPU and CPU) and macOS (CPU only; M1, M2, M3).

Glaze and Nightshade are tools designed to sabotage AI models. Glaze embeds invisible pixel-level changes in original images that fool AI systems into perceiving a false style. For example, Glaze can make a hand-drawn image appear to an AI model as a 3D rendering.

Nightshade, named after the highly poisonous plant, works on the same principle as Glaze, but goes one step further: it is designed to use the manipulated pixels to damage the model by confusing it. For example, the AI model might see a car instead of a train.
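The actual Nightshade perturbations are computed by optimizing against an image model's feature space; the Python sketch below only illustrates the general idea of a bounded, near-invisible pixel change. The function name and the use of simple random noise are illustrative assumptions, not Nightshade's real algorithm.

```python
import numpy as np

def poison_image(image, strength=0.02, seed=0):
    """Illustrative sketch: add a small, near-invisible perturbation to an
    image array with values in [0, 1]. Nightshade derives its perturbations
    by optimization against a model; bounded random noise is used here
    purely to show the concept of imperceptible pixel changes."""
    rng = np.random.default_rng(seed)
    # Keep each per-pixel change within +/- strength so it stays invisible
    noise = rng.uniform(-strength, strength, image.shape)
    return np.clip(image + noise, 0.0, 1.0)

# A dummy "artwork": a 64x64 RGB image of mid-gray pixels
art = np.full((64, 64, 3), 0.5)
poisoned = poison_image(art)
print(float(np.abs(poisoned - art).max()))  # per-pixel change stays tiny
```

The point of the sketch is that the poisoned image looks unchanged to a human while its pixel values differ slightly everywhere; Nightshade crafts those differences so that a model training on the image learns the wrong concept.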


Fewer than 100 of these "poisoned" images could be enough to corrupt an image AI model, the developers suspect. Nightshade is being developed by the same team as Glaze and will be integrated into Glaze.

Summary
  • Nightshade is an AI-based application that helps artists protect their artwork from AI models. The application is now available in version 1.0.
  • The tool works in conjunction with Glaze, another application that inserts invisible pixels into original images to fool and confuse AI systems with fake styles.
  • Nightshade goes one step further and uses manipulated pixels to damage the AI model; fewer than 100 "poisoned" images could be enough to corrupt an image AI model.
Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.