AI detectors are categorically useless and should not even be referenced, let alone relied upon, to tell whether something is AI. Treat them with the same skepticism you would a lie detector, i.e. complete fucking quackery. (I can explain this in depth if you’d like, but the long and short of it is: if an AI could reliably distinguish its own work from a human’s without a watermark, that same signal could be used to improve its human mimicry to exactly the standard being detected, destroying the capability.)
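To make that circularity concrete, here’s a toy sketch (every name here is hypothetical, and the “detector” is a deliberately trivial stand-in): any detector you can query becomes a training signal, so a generator can be tuned against it until its score collapses.

```python
def detector(x: float) -> float:
    """Pretend AI-detector: returns P(AI) from a single tell-tale feature."""
    return max(0.0, min(1.0, x))  # higher feature value -> "more AI-looking"

def optimize_against(detector, x: float, steps: int = 100, lr: float = 0.05) -> float:
    """Greedy hill-climbing: nudge the output whenever it lowers the score."""
    for _ in range(steps):
        candidate = x - lr  # reduce the tell-tale feature slightly
        if detector(candidate) < detector(x):
            x = candidate
    return x

start = 0.9                       # initially flagged as 90% AI
tuned = optimize_against(detector, start)
print(detector(start), detector(tuned))  # the score collapses toward zero
```

The point isn’t the toy math; it’s that the loop only needs query access to the detector, which is exactly what a public detection site provides.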
The artist does have a consistent style, and if you follow their page back you can see their work and growth documented like any other actual artist’s, including in-progress pieces. This isn’t AI; the artist’s style has simply been scraped and copied by AI models, and now you’re faulting the artist because their own work was effectively stolen en masse.
Isgen.ai puts that at 88% chance it’s made by Stable Diffusion.
Reversely.ai says 99% AI.
app.illuminarty.ai says 85% probability of being AI.
I’m sorry, but the person who posted this is not being honest about how it was made.
Edit: every pic on that Instagram account screams Stable Diffusion.