University of Chicago researchers seek to “poison” AI art generators with Nightshade
Altered images could destroy AI model training efforts that scrape art without consent.
@arstechnica is this the watermark / copyright ©️ benchmark the online community needs?
#probably, a huge area of discussion is needed. But, YES.
It would be great if all copyrighted media were by default protected from AI designed to use its content as source material.
@arstechnica super pleased to learn we can use science to poison our doodles so that the robots will think we drew a skyscraper when we actually drew a bagel and that will trick the robots into drawing cities filled with giant bagels.
@arstechnica let's hope this works...
@arstechnica I call bullshit. If I use someone's art as a reference, it falls under fair use and becomes a new work. I don't see any difference in AI doing it. Tell AI to draw a tree 1,000 times and every one of them will be a different artwork. So who got copied? Nobody, because AI does not copy and produce exact replicas.