• Kbin_space_program@kbin.social

    So I played around some more. If I used the term “woman”, I had to add that they were clothed or specify the clothing they were wearing; for one of them I had to add “fully clothed” and specify “a full suit”.

    I went back over the one time it worked previously: they were nude, and it only passed because it put the scene in silhouette, which apparently let it get past the censors.

    But it had absolutely no issue reproducing Iron Man and Ultron from a two-word prompt, and the absolute scariest part is that it can make reproductions of big celebrities.

    • captainlezbian@lemmy.world

      Yeah, that’s the thing, it’s not even surprising. Men are socially treated as the default in our culture, especially by tech people (who are overwhelmingly men and surrounded by other men in social and professional contexts). The cultural sexualization of women showing up in LLMs is exactly what you’d expect: when people prompt for a person doing a thing, they’re looking for that thing, and the same goes for a man, but prompts for women are often for porn. And I would be shocked if that wasn’t a problem Google had to actively combat early on.

      In short, more tech people need to read feminist theory as it relates to what they’re making