One of the earliest teen victims bullied by fake nudes has sued to destroy the app she said left her living in “constant fear.”
In her complaint, the teen—who was granted anonymity as a 17-year-old minor—accused ClothOff of intentionally making it easy to generate and distribute child sexual abuse materials (CSAM), as well as nonconsensual intimate images (NCII) of adults. She also alleged that the social media network Telegram helps promote ClothOff through automated bots that have attracted hundreds of thousands of subscribers.
ClothOff’s operation, the teen alleged, goes beyond promoting a single app, which can be used for free to turn an ordinary Instagram photo into CSAM or NCII in “three clicks.”
It’s affiliated with at least 10 other services that can be used to undress images of “anyone,” using the same technology, which ClothOff claims is “always improving.” And “exacerbating the reach and harm of ClothOff’s illegal and predatory activities,” developers and companies can directly access that technology through an API that the teen victim alleged “allows users to create private CSAM and NCII,” including “the most extreme kinds of content while better evading detection.”
“Because the API’s code is easy to integrate, any website, application, or bot can easily integrate it to mass-produce and distribute CSAM and NCII of adults and minors without oversight,” the complaint said, while noting that ClothOff has also inspired “multiple copycat” websites and apps.
On average, ClothOff and its affiliated apps generate 200,000 images daily, the complaint said, and have reached at least 27 million visitors since launching. For users seeking “premium content,” ClothOff accepts credit card or cryptocurrency payments between $2 and $40, the complaint alleged, with the “sole purpose” of profiting by “enticing users to easily, quickly, and anonymously obtain CSAM and NCII of identifiable individuals that are nearly indistinguishable from real photos.”
“ClothOff adds no stamp to images to indicate that they are not real, and viewers unaware of the images’ provenance have no way of knowing whether the images are authentic or fabricated,” the complaint said. “Once created, these images can be shared endlessly without the depicted victim’s knowledge and absent their consent.”
Further, ClothOff allows users to create galleries of fake nudes, suggesting that the platform stores images of victims, the lawsuit alleged. That possibility terrified the teen victim, who fears that ClothOff is training on her image to “better generate CSAM of other girls.”
ClothOff has yet to respond to the lawsuit, but its website claims that the company never saves data and that it’s “impossible” to “undress” images of minors. Any attempt to generate fake nudes of a minor would result in an account ban, the site says.
But the teen’s lawsuit alleged that such disclaimers were not posted when ClothOff produced CSAM based on an Instagram photo taken when she was 14 years old.
“Regardless, this disclaimer from ClothOff is ineffectual and false,” the complaint said, alleging that “ClothOff users could then, and still can, upload photos of clothed girls under the age of 18 to obtain CSAM of them,” and ClothOff does nothing to stop it.
The teen hopes that the court will intervene to end ClothOff’s operations, block all domains associated with it, and prevent any marketing or promotion, including through Telegram bots. She also asked the court to order the deletion of her images, along with any CSAM and NCII that ClothOff may be storing, and sought punitive damages for the “intense” emotional distress she has suffered.
Teen victim expects nightmare will never end
Telegram has apparently already acted to remove the ClothOff bot and will likely seek to end its involvement in the suit.
A spokesperson told The Wall Street Journal that “nonconsensual pornography and the tools to create it are explicitly forbidden by Telegram’s terms of service and are removed whenever discovered.”
For the teen suing, the prime target remains ClothOff itself. Her lawyers think it’s possible that she can get the app and its affiliated sites blocked in the US, the WSJ reported, if ClothOff fails to respond and the court awards her a default judgment.
But no matter the outcome of the litigation, the teen expects to be forever “haunted” by the fake nudes that a high school boy generated without facing any charges.
According to the WSJ, the teen girl sued the boy who she said made her want to drop out of school. Her complaint noted that she was informed that “the individuals responsible and other potential witnesses failed to cooperate with, speak to, or provide access to their electronic devices to law enforcement.”
The teen has felt “mortified and emotionally distraught, and she has experienced lasting consequences ever since,” her complaint said. She has no idea whether ClothOff can continue to distribute the harmful images or how many teens may have posted them online. Because of these unknowns, she’s certain she’ll spend “the remainder of her life” monitoring “for the resurfacing of these images.”
“Knowing that the CSAM images of her will almost inevitably make their way onto the Internet and be retransmitted to others, such as pedophiles and traffickers, has produced a sense of hopelessness” and “a perpetual fear that her images can reappear at any time and be viewed by countless others, possibly even friends, family members, future partners, colleges, and employers, or the public at large,” her complaint said.
The teen’s lawsuit is the newest front in a wider effort to crack down on AI-generated CSAM and NCII. It follows litigation filed last year by San Francisco City Attorney David Chiu, which targeted ClothOff among 16 popular apps used to “nudify” photos of mostly women and young girls.
About 45 states have criminalized fake nudes, the WSJ reported, and earlier this year, Donald Trump signed the Take It Down Act into law, which requires platforms to remove both real and AI-generated NCII within 48 hours of victims’ reports.