‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • Sweetpeaches69@lemmy.world · 1 year ago

    Just as the other people in this made-up scenario don’t need an app to imagine Scarlett Johansson naked. It’s a moot point.

    • CleoTheWizard@lemmy.world · 1 year ago

      I think most of this is irrelevant, because AI image generation as a tool is inherently hard to limit in this way, and I think it will be so prevalent as to be hard to regulate. What I’m saying is: we should prepare for a future where fake nudes of literally anyone can be made and shared easily. It’s already too late. These tools, as was said earlier, already exist. The only thing we can do is severely punish people who post the photos publicly. Sadly, we know how slow laws are to change. So in that light, we need to legislate based on long-term impact instead of short-term reactions.

    • KairuByte@lemmy.dbzer0.com · 1 year ago

      And? There’s a major difference between “a lookalike of a grown adult” and “AI-generated child porn,” as I’m sure you’re aware. At no point did anyone say child porn was going to be legal, until the person I was replying to brought it up as a strawman argument. ¯\_(ツ)_/¯