The line between inspiration and theft has always been contested in the fashion industry, but artificial intelligence is pushing that line into entirely new territory. A model who goes by Mel on TikTok, posting under the username @vanellimelli030, claims that a fashion brand ran an advertisement built on an AI-generated image that reproduces her face with striking accuracy — replicating her freckles, eyebrow shape, eye color, and a distinctive bump on her nose — without ever contacting her, obtaining her consent, or offering any compensation. She shared a side-by-side comparison on TikTok that quickly spread across social media, and the reaction was swift and overwhelmingly on her side.
“Until today I didn’t even know I had an AI twin,” Mel said in her video. “But today a friend sent me an ad from a fashion brand. And something about it seemed very, very familiar.” The details she pointed to in her comparison were not vague or impressionistic. They were specific enough to be uncomfortable: the placement and pattern of her freckles, the precise shape of her eyebrows, the color of her eyes, and the particular bump she has on her nose — each feature reproduced with the kind of fidelity that does not feel accidental. She noted that whoever created the AI image also appeared to update her hairstyle to something more current than what she wore when the original photos were taken in 2023. “They updated my hairstyle,” she observed. “Because when those photos were taken in 2023, I still had a different cut. So the resemblance is now even greater.”
The observation about the hairstyle is perhaps the most troubling detail in the whole story. It suggests that the image was not simply generated from a broad description but was likely informed by a close familiarity with her appearance, potentially through training data drawn from her existing public photographs. The face was modified just enough to create plausible deniability — “They changed the face a little,” she acknowledged — but the identifying features that make a face recognizably a specific person’s face were left intact. It is a calculated kind of alteration that reflects an awareness that the resemblance was close, combined with a bet that it would not be close enough to trigger consequences.
In the caption of her post, Mel wrote: "they say imitation is the highest form of flattery… but i don't feel flattered. this can't be the future and should not be normalized. What do you guys think?"
The comment section on Mel’s post became a catalog of reactions that illustrate how widely and immediately people recognized the ethical and legal dimensions of what she was describing. “Fashion brands steal other people’s designs, now they’re stealing people’s faces too,” wrote one commenter. Another reduced it to a sharp formulation: “AI is just legalized theft.” Others took a more practical view: “Send them an invoice,” said one. Several were more pointed: “They wanted you, they just didn’t want to pay you,” and “This is a matter for the courts.” The word that appeared repeatedly in different languages and phrasings was “disturbing,” which captures something important — this is not primarily a story about money, though money is certainly part of it, but about a deeply unsettling erosion of the idea that a person’s face belongs to that person.
Mel did not name the brand in her video, which left many commenters frustrated and asking her to identify them publicly. Speculation in the comments pointed to a brand, but Mel neither confirmed nor denied the suggestion, and the brand in question has not publicly addressed the claims. The absence of a named defendant is a feature of many early AI likeness cases — the technology moves faster than either the legal frameworks designed to address it or the social consensus about what constitutes unacceptable use. In the meantime, individuals like Mel are left in the position of having to make their case in the court of public opinion while the formal mechanisms catch up.
The broader context here is one of accelerating anxiety across the modeling and creative industries about what AI-generated imagery means for livelihoods that depend on the commercial value of appearance. Models, photographers, illustrators, and other visual workers have been raising alarms for several years about the ease with which AI systems trained on existing human-created images can generate outputs that replicate or approximate real people’s likenesses without any compensation flowing back to the source. Mel’s story is one specific, named, human face on a phenomenon that is happening at enormous scale without most people realizing it.
The legal landscape around AI-generated likenesses varies significantly by country and is evolving rapidly. Several US states have passed or proposed right-of-publicity laws specifically designed to address the use of someone's likeness without consent, though enforcement across international borders remains largely untested. The European Union's AI Act, which entered into force in 2024 with obligations phasing in over the following years, includes provisions on biometric data and synthetic media that are directly relevant to situations like Mel's, and it represents one of the most ambitious attempts to regulate this space. And in the modeling industry specifically, SAG-AFTRA and several modeling unions have been negotiating AI clauses into contracts since at least 2023 — an industry-wide recognition that this issue was coming long before it became a TikTok story.
Do you think fashion brands should be legally required to obtain a model’s consent before using AI to recreate their likeness in advertising? Share your thoughts in the comments.