Here’s a non-paywalled link to an article published in the Washington Post a few days ago. It’s great to see this kind of thing getting some mainstream attention. Young children have not made an informed decision about whether they want their photos posted online.

  • BreakDecks@lemmy.ml · 9 months ago

    The problem with posting pictures of kids in closed groups is that pervs will just join those groups, because those groups have exactly what they’re looking for. You’re basically making it easier for them.

    It’s not that parents are afraid of their kids being part of a training set, though that is a bad thing in and of itself. It’s more about all of these AI undressing app ads showing up on every social media site, which show just how much of a wild-west situation things currently are, and how in demand this brand of sexual exploitation is.

    Predators are already automating the process so that certain Instagram models get the AI undressing treatment as soon as they upload an exploitable pic. Pretty trivial to do at scale with Instaloader, GroundingDINO, SAM, and Stable Diffusion. Those pics are hosted outside of Instagram, where victims have no power to undo the damage. Kids will get sexually exploited in this process, whether incidentally or intentionally.

    • grrgyle@slrpnk.net · 9 months ago

      I believe by “closed groups” they mean a family or friends chat with like 5 people.

      Although I personally wouldn’t share too much in those groups either.