This is hilarious, but also: how could anyone develop such a tool and not at least test it out on their own images? Someone with a public persona no less! Boggles my mind.
I mean, I bet they did, but you can’t test it with all the photos ever taken of you. Someone probably tried dozens of photos to get this result. Which, to be clear, I admire.
It looks like it’s the photo from her tweet
I would assume it was taken just for this tweet so all the testing would have been completed by that time.
Well - in this case it’s the photo she posted herself, to announce the app.
Wow
They did, they know it doesn’t work, but they are already too far down in the money hole. Gotta grift and bullshit and spread bigotry until they make back the money.
Edit: Some words
Hubris
Does she have a public persona? Who is this person?
Heat emissions.
From a picture.
Hold on there’s a ringing in my ears… Yeah yeah that’s my bullshit alarm going off.
The heat emission is the smoke they’re trying to blow up my ass.
as in how hot the person in the picture is?
Lmao can’t be hotter than the CEO!
Could be analysing 4-band imagery with an NIR layer. But that usually comes from satellite imagery, which would make identifying gender challenging. I’d struggle with just a grainy image of the top of someone’s head, even if I knew how warm it was.
Well, Big Shaq told us all that Man’s not hot.
Maybe her transphobia is just an attempt to pass better?
Ah, straight out of the Dictatorship Survivor’s Guide for Middle Management playbook.
I kinda want to use it to see if I pass as a cis woman with short hair, but then again is my ego really prepared to be misgendered by some shitass app!?
Also, too, fuck TERFS 🖕
So because she didn’t check herself, you might say she wrecked herself.
☜(゚ヮ゚☜)
Well, if it is 99.85% accurate, maybe she is self-hating and hiding.
What’s the play here? Does she not know that people upload highly inaccurate or blatantly fake photos to dating sites all the time?
What problem does this solve?
The problem that right wing fuckwits always need somebody to hate and discriminate against.
The problem it solves is that she needs plenty of money with little effort and morals are not a limiting factor. And what Diplomjodler3 said.
OK, I’m starting to have doubts that this is legit. Looks like OP (or OOP, idk) just found a classifier which misclassified that image. Nothing I’m seeing indicates that it’s the classifier used for her stupid app.
I fed it a pre-HRT pic and got “Woman, 56% confident”. Lol. I guess it’s kind of affirming to think a machine could see the real me back then?
I did the same pic that was used above and it said “Woman, 95% confident”
Did you use the larger cropped picture? Otherwise, I was thinking: what if the AI was actually saying “I’m 97% sure that’s a man facing away from the camera”?
that’s a solid disclaimer.
It uses the phone’s built in chakra detector
According to the screenshot, it doesn’t even call her a trans woman, it calls her a man. Presumably because man and woman are the only options on her little TERF world.
The AI probably saw that massive boner in her pants and got confused.
As funny as it is, I don’t think people should be uploading their images to this app. Maybe it’s hilariously wrong because it’s trying to data mine?
Yup, if you give information to a company, it’s now theirs. The old adage about being the product if you’re not paying no longer applies: now you are the product even if you’re paying.
They can always tell!
I wonder if the AI is detecting that the photo was taken from further away and below eye level, which is more likely for a photo of a man, rather than looking at her facial characteristics?
It’s possible to manipulate an image so that the original and the new one are indistinguishable to the human eye, but the AI model gives completely different results.
Like this helpful graphic I found
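And it’s not just a meme graphic; the classic version of the trick (FGSM, from the Goodfellow et al. adversarial examples paper) is only a few lines of PyTorch. A minimal sketch, assuming some generic pretrained classifier and an illustrative epsilon (normalization/preprocessing omitted for brevity):

```python
# Minimal FGSM sketch: nudge every pixel by epsilon in the direction that
# increases the loss, so a human sees no difference but the model flips.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

def fgsm(image: torch.Tensor, label: torch.Tensor, epsilon: float = 0.007) -> torch.Tensor:
    """image: (1, 3, H, W) float tensor in [0, 1]; label: (1,) true class index."""
    image = image.clone().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step along the sign of the gradient, i.e. away from the true class.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()
```

Run the original and the perturbed image through the same model and the top prediction will often flip, even though the two look identical side by side.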
Or… edit the HTML…
You think someone would do that? Just go on the internet and lie?
Yeah, this is a valid point. Whether that’s exactly what happened here I don’t know, but a lot of people don’t realize how many weird biases can creep into training data.
Like that AI trained to detect if a mole was cancerous or not. A lot of the training images that showed cancer had rulers in them, so the AI learned that rulers are cancerous.
I could easily see something stupid like the angle the picture was taken from being something the AI erroneously assumed was useful for determining biological sex in this case.
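To make the ruler thing concrete, here’s a toy sketch (synthetic data, made-up feature names) of a model latching onto a shortcut feature that tracks the label in training but not in the wild:

```python
# Toy demo of a spurious correlation: in training, "ruler present" tracks
# the label almost perfectly, so the model leans on it instead of the real
# signal, then falls apart once the shortcut stops holding.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
labels = rng.integers(0, 2, n)            # 1 = "cancer"
signal = labels + rng.normal(0, 1.5, n)   # weak genuine feature
ruler = labels ^ (rng.random(n) < 0.05)   # ruler shows up in ~95% of cancer photos

model = LogisticRegression().fit(np.column_stack([signal, ruler]), labels)
print("weights (signal, ruler):", model.coef_[0])  # ruler weight dominates

# Deployment-like test set where rulers appear at random.
test_labels = rng.integers(0, 2, n)
test_signal = test_labels + rng.normal(0, 1.5, n)
test_ruler = rng.integers(0, 2, n)
print("accuracy without the shortcut:",
      model.score(np.column_stack([test_signal, test_ruler]), test_labels))
```

The ruler weight comes out much bigger than the real signal’s, and accuracy drops toward chance once rulers stop correlating with anything. Swap “ruler” for “camera angle” and you get exactly the failure mode being speculated about here.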
Lol it was 85% confident I was “female”
We are all Trans Women on this blessed day.