Very little is said about how artists can protect their work from being used to train AI, especially when anyone can just throw an artist's name into a prompt to replicate their style.
So here's something that can help make your work useless, or even harmful, as AI training data.
You may have heard of Nightshade and Glaze, two tools that poison AI training data as a way to combat generative AI. Both came out in 2023, and you can download and use them for free from the University of Chicago's website.
Glaze tries to prevent an artist's style from being copied: it makes subtle changes to an image so that a model trained on it picks up a distorted version of the style.
Nightshade tries to distort what a model thinks is actually in the image, so a picture of a car might get learned as an animal, or whatever.
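Under the hood, both tools rely on adversarial perturbations: tiny, calculated pixel changes that look nearly invisible to a human but push an image's machine-learned features in the wrong direction. Here's a minimal sketch of that general idea, NOT the actual Glaze or Nightshade algorithms (those are far more sophisticated and target the encoders inside image generators). It assumes PyTorch and torchvision are installed; the function name, the ResNet-50 stand-in feature extractor, and the step/budget numbers are all illustrative choices:

```python
# Sketch of the adversarial-perturbation idea behind tools like
# Glaze/Nightshade -- illustrative only, not their real algorithms.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

def cloak(image_path, steps=50, epsilon=8 / 255, alpha=1 / 255):
    # Hypothetical feature extractor standing in for the encoders
    # the real tools target. (ImageNet normalization omitted for brevity.)
    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
    for p in model.parameters():
        p.requires_grad_(False)
    extractor = torch.nn.Sequential(*list(model.children())[:-1])

    to_tensor = T.Compose([T.Resize((224, 224)), T.ToTensor()])
    x = to_tensor(Image.open(image_path).convert("RGB")).unsqueeze(0)

    with torch.no_grad():
        original = extractor(x)  # the features we push AWAY from

    delta = torch.zeros_like(x, requires_grad=True)
    for _ in range(steps):
        features = extractor((x + delta).clamp(0, 1))
        # Maximize feature-space distance from the original, so a model
        # trained on the cloaked image learns the "wrong" representation.
        loss = torch.nn.functional.mse_loss(features, original)
        loss.backward()
        with torch.no_grad():
            delta += alpha * delta.grad.sign()       # gradient ascent step
            delta.clamp_(-epsilon, epsilon)          # keep change imperceptible
            delta.grad.zero_()
    return (x + delta).clamp(0, 1).detach()

# Usage (hypothetical file names):
# cloaked = cloak("my_painting.png")
# T.ToPILImage()(cloaked.squeeze(0)).save("my_painting_cloaked.png")
```

The point is just the mechanic: you and I see (almost) the same picture, but the features a model extracts from it have been nudged somewhere else, so training on it teaches the model the wrong thing.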
It should be noted that these tools aren't future-proof: they can be broken as new AI models come out that learn to see through the perturbations. But they will at the very least slow things down.
https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/
