How artists can poison their pics with deadly Nightshade to deter AI scrapers
University of Chicago boffins this week released Nightshade 1.0, a tool built to punish unscrupulous makers of machine learning models who train their systems on data without getting permission first.
Nightshade is an offensive data-poisoning tool, a companion to Glaze, a defensive style-protection tool that The Register covered in February last year.
Nightshade poisons image files to give indigestion to models that ingest data without permission. It's intended to make those training ima...
Read more at theregister.com