• smiletolerantly@awful.systems · 137 points · 5 months ago

    This is hilarious, but also: how could anyone develop such a tool and not at least test it out on their own images? Someone with a public persona no less! Boggles my mind.

  • Maggoty@lemmy.world · 101 points · 5 months ago

    Heat emissions.

    From a picture.

    Hold on, there’s a ringing in my ears… Yeah, yeah, that’s my bullshit alarm going off.

    • lesbian_seagull@lemm.ee · 5 points · 5 months ago

      I kinda want to use it to see if I pass as a cis woman with short hair, but then again is my ego really prepared to be misgendered by some shitass app!?

      Also, too, fuck TERFs 🖕

  • ch00f@lemmy.world · 45 points · 5 months ago

    What’s the play here? Does she not know that people upload highly inaccurate or blatantly fake photos to dating sites all the time?

    What problem does this solve?

    • alyth@lemmy.world · 10 points · 5 months ago

      The problem it solves is that she wants plenty of money for little effort, and morals are not a limiting factor. And what Diplomjodler3 said.

  • smiletolerantly@awful.systems · 40 points · 5 months ago

    OK, I’m starting to have doubts that this is legit. It looks like OP (or OOP, idk) just found a classifier that misclassified that image. Nothing I’m seeing indicates it’s the classifier used for her stupid app.

    • Cybrpwca@beehaw.org · 28 points · 5 months ago

      I fed it a pre-HRT pic and got “Woman, 56% confident”. Lol. I guess it’s kind of affirming to think a machine could see the real me back then?

      • lud@lemm.ee · 9 points · 5 months ago

        I tried the same pic that was used above and it said “Woman, 95% confident”.

        • BananaOnionJuice@lemmy.dbzer0.com · 1 point · 5 months ago

          Did you use the larger cropped picture? Otherwise, I was thinking: what if the AI was actually saying “I’m 97% sure that it’s a man facing away from the camera”?

  • qjkxbmwvz@startrek.website · 34 points · 5 months ago

    According to the screenshot, it doesn’t even call her a trans woman; it calls her a man. Presumably because man and woman are the only options in her little TERF world.

  • nifty@lemmy.world · 27 points · 5 months ago

    As funny as it is, I don’t think people should be uploading their images to this app. Maybe it’s hilariously wrong because it’s trying to data mine?

    • Maggoty@lemmy.world · 3 points · 5 months ago

      Yup, if you give information to a company, it’s now theirs. The old adage about being the product if you’re not paying no longer applies. Now you are the product even if you’re paying.

  • ClockworkOtter@lemmy.world · 19 points · 5 months ago

    I wonder if the AI is detecting that the photo was taken from further away and below eye level, which is more likely for a photo of a man, rather than looking at her facial characteristics.

    • Tyoda@lemm.ee · 17 points · 5 months ago

      It’s possible to manipulate an image so that the original and the new one are indistinguishable to the human eye, but the AI model gives completely different results.

      Like this helpful graphic I found

      Or… edit the HTML…
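
      A minimal sketch of that kind of adversarial nudge, assuming a generic pretrained PyTorch classifier rather than whatever model her app actually uses (the file name and perturbation size are made up for illustration):

        import torch
        import torch.nn.functional as F
        import torchvision.models as models
        import torchvision.transforms as T
        from PIL import Image

        # Stand-in classifier; ImageNet normalization is skipped to keep the sketch short.
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

        preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
        img = preprocess(Image.open("photo.jpg")).unsqueeze(0)  # "photo.jpg" is a placeholder
        img.requires_grad_(True)

        logits = model(img)
        pred = logits.argmax(dim=1)              # whatever the model currently predicts
        loss = F.cross_entropy(logits, pred)
        loss.backward()

        # Nudge every pixel slightly in the direction that increases the loss (FGSM-style).
        eps = 2.0 / 255                          # far too small for a human eye to notice
        adversarial = (img + eps * img.grad.sign()).clamp(0, 1).detach()

        print(pred.item(), model(adversarial).argmax(dim=1).item())  # often two different classes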

    • drcobaltjedi@programming.dev · 16 points · 5 months ago

      Yeah, this is a valid point. Whether that’s exactly what’s happening here I don’t know, but a lot of people don’t realize the weird biases that can show up in training data.

      Like that AI trained to detect if a mole was cancer or not: a lot of the training images that showed cancer had rulers in them, so the AI learned that rulers are cancerous.

      I could easily see something stupid, like the angle the picture was taken from, being something the AI erroneously treated as useful for determining biological sex in this case.
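
      A toy sketch of that failure mode with made-up data (not the real mole classifier): if “ruler in frame” is strongly correlated with the label in the training set, the model happily learns the ruler instead of the mole.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 1000
        cancer = rng.integers(0, 2, n)

        # Spurious feature: suspicious moles tend to get photographed next to a ruler,
        # so "ruler in frame" is almost perfectly correlated with the label.
        ruler = cancer.copy()
        flip = rng.random(n) < 0.05
        ruler[flip] = 1 - ruler[flip]            # only ~5% of photos break the pattern

        # The real signal, but noisy and harder to learn.
        irregularity = cancer + rng.normal(0, 1.5, n)

        X = np.column_stack([ruler, irregularity])
        clf = LogisticRegression().fit(X, cancer)
        print(clf.coef_)                         # the ruler feature gets by far the larger weight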