AI Data Mining Cloak and Dagger: Nightshade AI Poisoning and Anti-Theft
by Aleia Knight | Feb 2024

Probably the biggest commercial use of AI has been for art. Models like DALL-E and Midjourney can create anything from fantasy landscapes of modern people lounging with dragons to a near 1:1 recreation of the Mona Lisa. The biggest pushback against these models has come from artists who, while making their creations public, never consented to having their work mined as training data for AI models. Oftentimes, I see people having an AI model take art specifically from a certain artist and generate a "commission" in that style, rather than paying the artist themselves to make it.

AI has also been used to impersonate real people online through bot accounts, text generation, and image generation.

The deepfake situation alone has escalated to the point that it has reached the desks of White House representatives. A big push for this was the recent Taylor Swift incident, in which a user used AI to scrape images of her from around the internet and create nude images of her, images she never took and that were made without her consent. If this can happen at a realistic scale to a celebrity, imagine the impact it could have on a social and political level, especially in terms of image, trust, and information exchange.

This became even more apparent at the beginning of 2024, when a fake robocall impersonating President Joe Biden was released, urging the voters of New Hampshire not to vote.

