AI in practice

Artists can now poison their AI foes with Nightshade

Matthias Bastian
Image: Midjourney prompted by THE DECODER

Nightshade, an AI-based tool designed to help artists protect their artwork, is now available in version 1.0.

The development team recommends that artists apply Nightshade before Glaze, another AI-based artwork protection tool (see below), and that Glaze always be used.

The interaction between the two tools is currently being tested. A combined Glaze/Nightshade tool is in development.

After downloading Nightshade, users will need Internet access and approximately 4 GB of disk space for additional ML libraries and resources.

Users who already have Glaze installed can reuse the resource files with Nightshade, so no large additional downloads are required.

Before using Nightshade, the team recommends that you read the user manual for step-by-step instructions.

The application can be downloaded here for free for Windows (GPU and CPU) and macOS (M1, M2, M3; CPU only).

Glaze and Nightshade are tools designed to sabotage AI models. Glaze embeds imperceptible pixel-level changes in original images that mislead AI systems into perceiving a false style. For example, Glaze can make a hand-drawn image appear to an AI model as a 3D rendering.

Nightshade, named after the highly poisonous plant, works on the same principle as Glaze but goes a step further: its manipulated pixels are designed to damage AI models trained on the images by confusing them. For example, a poisoned model might see a car instead of a train.
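The core idea behind this kind of pixel-level manipulation can be sketched in a few lines. The following is a toy illustration only, not Nightshade's actual algorithm (which targets large text-to-image models): it shows how a perturbation small enough to be nearly invisible can flip what a simple model perceives. The linear classifier and the class labels ("train"/"car") are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear classifier over a flattened 64-pixel image:
# score > 0 means class 1 ("car"), otherwise class 0 ("train").
weights = rng.normal(size=64)

def predict(image):
    return int(image @ weights > 0)

# Start from a random image, then shift it so the classifier's score
# is exactly -1.0, i.e. the model confidently sees class 0 ("train").
image = rng.uniform(0.0, 1.0, size=64)
image -= (image @ weights + 1.0) * weights / (weights @ weights)

# FGSM-style perturbation: nudge each pixel by at most epsilon in the
# direction that raises the score. On a 0-1 pixel scale, a change of
# 0.05 per pixel is barely visible to a human.
epsilon = 0.05
poisoned = image + epsilon * np.sign(weights)

print(predict(image))     # 0 -> the model sees a "train"
print(predict(poisoned))  # 1 -> the model now sees a "car"
```

Real tools like Glaze and Nightshade optimize far subtler perturbations against deep feature extractors rather than a linear model, but the principle is the same: small coordinated pixel changes steer the model's perception while leaving the image essentially unchanged for humans.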

Fewer than 100 of these "poisoned" images could be enough to corrupt an image AI model, the developers suspect. Nightshade is being developed by the same team as Glaze and will be integrated into Glaze.
