• smiletolerantly@awful.systems
    139·
    1 year ago

    This is hilarious, but also: how could anyone develop such a tool and not at least test it out on their own images? Someone with a public persona no less! Boggles my mind.

    • schnokobaer@feddit.de
      67·
      1 year ago

      I mean, I bet they did, but you can’t test it with all the photos ever taken of you. Someone probably tried dozens of photos to get this result. Which, to be clear, I admire.

    • Revonult@lemmy.world
      15·
      1 year ago

      They did, they know it doesn’t work, but they are already too far down in the money hole. Gotta grift and bullshit and spread bigotry until they make back the money.

      Edit: Some words

    • jaybone@lemmy.world
      1·
      1 year ago

      Does she have a public persona? Who is this person?

  • Maggoty@lemmy.world
    102·
    1 year ago

    Heat emissions.

    From a picture.

    Hold on there’s a ringing in my ears… Yeah yeah that’s my bullshit alarm going off.

    • JackbyDev@programming.dev
      33·
      1 year ago

      The heat emission is the smoke they’re trying to blow up my ass.

    • BluesF@lemmy.world
      8·
      1 year ago

      Could be analysing 4-band imagery with an NIR layer. But that usually comes from satellite imagery, which would make identifying gender challenging. I’d struggle with just a grainy image of the top of someone’s head, even if I knew how warm it was.

    • ThePyroPython@lemmy.world
      28·
      1 year ago

      Ah, straight out of the Dictatorship Survivor’s Guide for Middle Management playbook.

    • lesbian_seagull@lemm.ee
      5·
      1 year ago

      I kinda want to use it to see if I pass as a cis woman with short hair, but then again is my ego really prepared to be misgendered by some shitass app!?

      Also, too, fuck TERFS 🖕

  • nick@midwest.social
    77·
    1 year ago

    So because she didn’t check herself, you might say she wrecked herself.

  • breadsmasher@lemmy.world
    54·
    1 year ago

    well if it is 99.85% accurate, maybe she is self-hating and hiding

  • ch00f@lemmy.world
    46·
    1 year ago

    What’s the play here? Does she not know that people upload highly inaccurate or blatantly fake photos to dating sites all the time?

    What problem does this solve?

    • Diplomjodler@lemmy.world
      53·
      1 year ago

      The problem that right wing fuckwits always need somebody to hate and discriminate against.

    • alyth@lemmy.world
      10·
      1 year ago

      The problem it solves is that she needs plenty of money for little effort, and morals are not a limiting factor. And what Diplomjodler3 said.

  • smiletolerantly@awful.systems
    40·
    1 year ago

    OK, I’m starting to have doubts that this is legit. Looks like OP (or OOP, idk) just found a classifier which misclassified that image. Nothing I’m seeing indicates that it’s the classifier used for her stupid app.

    • Cybrpwca@beehaw.org
      28·
      1 year ago

      I fed it a pre-HRT pic and got “Woman, 56% confident”. Lol. I guess it’s kind of affirming to think a machine could see the real me back then?

      • lud@lemm.ee
        9·
        1 year ago

        I did the same pic that was used above and it said “Woman, 95% confident”

        • BananaOnionJuice@lemmy.dbzer0.com
          1·
          1 year ago

          Did you use the larger cropped picture? Otherwise, I was thinking: what if the AI was actually saying “I’m 97% sure that it’s a man facing away from the camera”?

  • qjkxbmwvz@startrek.website
    34·
    1 year ago

    According to the screenshot, it doesn’t even call her a trans woman, it calls her a man. Presumably because man and woman are the only options on her little TERF world.

  • Rimu@piefed.social
    28·
    1 year ago

    The AI probably saw that massive boner in her pants and got confused.

  • nifty@lemmy.world
    27·
    1 year ago

    As funny as it is, I don’t think people should be uploading their images to this app. Maybe it’s hilariously wrong because it’s trying to data mine?

    • Maggoty@lemmy.world
      3·
      1 year ago

      Yup, if you give information to a company, it’s now theirs. The old adage about being the product if you’re not paying no longer applies. Now you are the product even if you’re paying.

  • ClockworkOtter@lemmy.world
    19·
    1 year ago

    I wonder if the AI is detecting that the photo is taken from further away and below eye level, which is more likely for a photo of a man, rather than looking at her facial characteristics?

    • Tyoda@lemm.ee
      17·
      1 year ago

      It’s possible to manipulate an image so that the original and the new one are indistinguishable to the human eye, but the AI model gives completely different results.

      Like this helpful graphic I found

      Or… edit the HTML…
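
      Roughly, the trick looks like this. Just a minimal sketch assuming an off-the-shelf PyTorch classifier; the fgsm helper, the epsilon, and the target class are illustrative, not anything this app actually uses:

      ```python
      # Minimal sketch of an adversarial nudge (FGSM-style), not the app's actual model.
      import torch
      import torch.nn.functional as F
      import torchvision.models as models

      # any off-the-shelf classifier works for the demonstration
      model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

      def fgsm(image, target_class, epsilon=0.003):
          """Nudge every pixel a tiny step toward whatever class you want;
          the change is far too small for a human to notice."""
          image = image.clone().requires_grad_(True)
          loss = F.cross_entropy(model(image), torch.tensor([target_class]))
          loss.backward()
          # step against the gradient to raise confidence in target_class
          return (image - epsilon * image.grad.sign()).clamp(0, 1).detach()

      x = torch.rand(1, 3, 224, 224)        # stand-in for a real photo
      x_adv = fgsm(x, target_class=207)     # 207 = "golden retriever" in ImageNet
      print(model(x).argmax().item(), model(x_adv).argmax().item())
      ```

      A single nudge like that is often enough to flip an undefended model’s prediction, even though the two images look identical to a person.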

    • drcobaltjedi@programming.dev
      16·
      1 year ago

      Yeah, this is a valid point. Whether that’s exactly what’s happening here I don’t know, but a lot of people don’t realize how many weird biases can creep into the training data.

      Like that AI trained to detect if a mole was cancer or not. A lot of the training images that showed cancer had rulers in them, so the AI learned that rulers are cancerous.

      I could easily see something stupid like the angle the picture was taken from being something the AI erroneously treated as useful for determining biological sex in this case.
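
      For the curious, here’s a toy version of that ruler effect, assuming scikit-learn; the ruler_present feature and the numbers are made up to mirror the mole story, not from any real dataset:

      ```python
      # Toy sketch of shortcut learning: a spurious feature that happens to line up
      # with the labels in training gets all the weight. Made-up data, not a real study.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 1000
      label = rng.integers(0, 2, n)                 # 1 = "cancerous" in the mole story
      ruler_present = label.astype(float)           # rulers appear in exactly the cancer photos
      real_signal = label + rng.normal(0, 2.0, n)   # the genuine feature, but weak and noisy
      X = np.column_stack([ruler_present, real_signal])

      clf = LogisticRegression().fit(X, label)
      print(clf.coef_)   # the "ruler" column dominates: the model learned the shortcut
      ```

      Same mechanism as the camera-angle guess: if an incidental detail happens to line up with the labels in the training set, the model will happily lean on it.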

  • bluewing@lemm.ee
    11·
    1 year ago

    We are all Trans Women on this blessed day.