
Indie publisher Finji says TikTok used generative AI to alter its advertisements without permission, including one that inserted a racist, sexualized stereotype of a character from Usual June. The studio, known for publishing titles like Night in the Woods and Tunic and for developing Usual June and Overland, says the modified ads ran as if posted from its official account.
Finji CEO Rebekah Saltsman first raised the issue publicly, asking followers on Bluesky, "If you happen to see any Finji ads that look distinctly UN-Finji-like, send me a screencap."
Speaking to IGN, Saltsman says that Finji has "AI turned all the way off" on its TikTok ad account. The team only became aware of the issue after users began commenting on its legitimate ads, questioning strange variations that didn't match the original video creatives. Viewers sent screenshots showing what appeared to be AI-generated variants. One image viewed by IGN shows an altered version of Usual June key art in which protagonist June is redrawn with exaggerated hips and thighs, a bikini bottom, and over-the-knee boots, imagery Saltsman says invokes a harmful stereotype.
In messages reviewed by IGN, a TikTok support agent confirmed Finji had both "Smart Creative" and "Automated Creative" turned off, features that use generative AI to remix or optimize ad assets. Despite that, support initially told Finji there was no evidence that the system added AI-generated content or auto-assembled slideshow assets.
After Finji re-sent the offending screenshot and demanded escalation, TikTok replied, "We are no longer disputing whether this happened," acknowledging "the unauthorized use of AI, the sexualization and misrepresentation of your characters," and promising an internal review.
Days later, however, TikTok support offered a different explanation, saying Finji had been included in "a broader automated initiative" tied to a catalog ads format designed to boost performance, claiming campaigns using mixed assets see a "1.4x ROAS lift" (return on ad spend). The company offered to request that Finji be added to an opt-out blocklist, though approval wasn't guaranteed. When Finji asked why it had been opted in without consent and why it couldn't reliably opt out, support responded that the current representative was "the best internal team available" and that "final findings and actions" had already been provided.
Saltsman called the response baffling. "It's one thing to have an algorithm that's racist and sexist, and another thing to use AI to churn content of your paying business partners … and then to also NOT respond to any of those mistakes in a coherent way? Really?"
She added: "This is my work, my team's work, and mine and my company's reputation, which I've spent over a decade building … here's the real fun thing about all of this: you, who paid for and made representative ads for your work, will never see what they've done to your work. But your audience will. And maybe someone in that audience will be brave enough to reach out to let you in on what tf is happening."
Finji says it has received no further updates despite additional follow-ups, and TikTok declined to comment for the report.
