Two influencers who got tired of their content being stolen worked together to create their own protection agency that has helped remove over 250,000 hijacked posts.
Twitch streamer Morgpie and anime-focused Instagram creator Zander Small boast a combined following of more than two million fans across multiple platforms.
Both influencers realized they suffered from the same problem: their content was being stolen, sometimes even deepfaked, and reposted on piracy websites, Telegram channels, forums, and social media apps.
It’s a modern problem for the modern age, but no less damaging. Morgpie previously paid thousands of dollars a month for services to root out and take down her stolen content, but even that didn’t stop thieves from ignoring DMCA notices and reposting her videos and streams anyway.
Tired of the constant legal battles, Morgpie and Zander worked together to build their own creator protection agency: Fanlock.
Influencers remove 250K hijacked posts with new creator protection startup
Fanlock is a creator-focused service that finds influencers’ stolen content and gets it taken down. Unlike other agencies of its kind, Fanlock doesn’t stop at issuing DMCA notices; it makes things so expensive for the thieves that they’re essentially forced to comply or risk paying through the nose in fines.
Since launching Fanlock on February 1, Morgpie and Zander’s service has removed over 250K posts from Google and has permanently deleted 75K posts from the sites that were hosting them.
We got the chance to speak with these two influencers about the challenges of running Fanlock on top of their full-time careers as content creators, which Zander admitted is a “nonstop” job.
“We’re basically working two full-time jobs but we’re obsessed with solving this problem. We’re not just founders staring at spreadsheets, we’re in the trenches every day because our own content is on the line too,” he told us.
Zander explained that he takes the lead on Fanlock’s technical side and overall strategy, while Morgpie handles creator outreach and consulting. Morgpie was specifically passionate about Fanlock’s services, having been a victim of high-profile leaks in the past.
“As someone who has dealt with extremely prolific leaks and AI-generated deepfakes, Fanlock is definitely my priority right now!” she said. “Deepfakes of content creators, especially those who do not participate in sex work, are never okay. We hope that with our tools at Fanlock, creators can feel more empowered to take action against this sort of content. We will be fighting the good fight to ensure that creators with deepfaked content are protected and this sort of content is removed.”
Creators raise $200K to battle AI deepfakes and content leaks
Morgpie and Zander raised $200K in a seed round to help them turn Fanlock from an idea into a reality. Zander says he built the service’s scanning infrastructure from scratch, which includes proprietary Telegram indexing across more than 60 million posts.
“We monitor over 10 million websites and we crawl piracy forums, leak sites, file hosts, tube sites, and deep web sources that other services don’t even know exist. We also check for obfuscated versions of creator names that leakers use to hide content, things like binary and hex encoding,” he explained.
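Zander doesn’t detail how Fanlock’s matching works, but the idea he describes can be sketched in a few lines: generate the encoded variants of a creator’s name that leakers use to dodge keyword searches, then check scanned text against all of them. This is a minimal, hypothetical illustration, not Fanlock’s actual code; the function names and the choice of encodings are assumptions based only on the “binary and hex encoding” examples in the quote.

```python
# Illustrative sketch only: Fanlock's real scanner is proprietary.
# Leakers sometimes hide a creator's name by encoding it as hex or
# binary text; a scanner can pre-compute those variants and match them.

def obfuscated_variants(name: str) -> dict[str, str]:
    """Return common text encodings of a name that leakers might use."""
    raw = name.lower().encode("utf-8")
    return {
        "plain": name.lower(),
        "hex": raw.hex(),                             # e.g. "6d6f7267..."
        "binary": " ".join(f"{b:08b}" for b in raw),  # 8-bit groups
    }

def contains_obfuscated(text: str, name: str) -> bool:
    """Check whether any known variant of the name appears in the text."""
    haystack = text.lower()
    return any(variant in haystack for variant in obfuscated_variants(name).values())
```

A real system would add many more transformations (leetspeak, Base64, Unicode homoglyphs) and fuzzy matching, but the principle is the same: search for every disguise of the name, not just the name itself.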
Currently, Fanlock has three employees working to manage the service’s infrastructure. Zander is not taking a salary, relying on income from his own content, so “every dollar of funding goes back into the platform.”
“We wanted to build something by creators, for creators, and that means putting the mission before our own paychecks.”
Fanlock claims average creator has 6,000+ hidden leaks online
Zander explained that the common denominator behind bad actors stealing and reposting creators’ content is “organized profit.” He described it as a “massive shadow economy” that pulls in “tens of thousands of dollars a month” from posts that are “built on the back of creators who are just trying to make a living.”
Even more concerning, Zander revealed that the average creator using Fanlock has over 6,000 leaks across forums, Telegram channels, file hosts, and tube sites.
“Most of them had no idea the majority of those leaks even existed until we scanned for them,” he revealed. “We’ve had creators whose entire private lives were indexed on Google through deepfake forums before they even knew the content existed. That’s the ‘why’ behind everything we do.”
Morgpie and Zander Small reveal how creators can protect their content from being stolen
Given their expertise on the subject, we asked Morgpie and Zander what creators can do to help protect themselves from having their content stolen and what they should do if they find their work reposted online.
“Don’t suffer in silence,” Zander advised. “It happens to almost every successful creator and it’s not your fault. …Also, get professional protection early. Don’t wait until there are tens of thousands of links floating around to start fighting back. And make sure whoever you hire actually escalates past a single DMCA notice, because if they don’t, you’re paying for a service that quits the moment things get hard.”
“Make sure your content is appropriately priced!” Morgpie urged. “If you are selling your content for cheap, this will unfortunately result in your content being easier to leak due to the low price point. Another great strategy for this is curating specific bundles for fans! If you send a fan a particular set of videos and those exact videos start appearing on leak sites, you can pinpoint the source of the leak more easily and ensure the fan is reported.”
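The bundle strategy Morgpie describes amounts to fingerprinting by combination: if every fan receives a distinct set of videos, the exact set that surfaces on a leak site identifies one buyer. The sketch below is a hypothetical illustration of that idea only; the function names and data shapes are assumptions, not anything Fanlock has published.

```python
# Hypothetical sketch of "bundle fingerprinting": give each fan a unique
# combination of videos, so a leaked set points back at a single buyer.
from itertools import combinations

def assign_bundles(fans: list[str], videos: list[str], bundle_size: int) -> dict[str, frozenset[str]]:
    """Assign each fan a distinct combination of videos."""
    combos = combinations(videos, bundle_size)
    return {fan: frozenset(next(combos)) for fan in fans}

def trace_leak(assignments: dict[str, frozenset[str]], leaked: set[str]) -> list[str]:
    """Return the fan(s) whose assigned bundle exactly matches the leaked set."""
    return [fan for fan, bundle in assignments.items() if bundle == frozenset(leaked)]
```

With even a handful of videos, the number of distinct combinations grows fast enough to cover a large fanbase, which is what makes the traceability Morgpie describes practical.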
Zander credited the Take It Down Act as a “huge step” in helping Fanlock successfully remove creators’ stolen content, particularly when it comes to deepfaked posts. They’re able to leverage the act to force removal of non-consensual intimate imagery, using “every legal tool available” to escalate until the matter is resolved.
“Deepfakes are the new frontier of digital abuse,” he said. “Using AI to create explicit content of someone without their consent, especially creators who don’t even work in that space, is horrible.”
“AI is making piracy easier and faster for the bad guys, so we’re building smarter tech to find and remove it. We want creators to feel like they can actually fight back instead of feeling helpless against an algorithm. This is the content that takes the heaviest toll on creators’ mental health and we treat every deepfake case with urgency.”