mr. steal your likeness
Creator's videos are being manipulated by AI to promote questionable products and companies.
I am a proud and not-so-proud active user of the modern internet.
Not proud because, well, catching yourself scrolling on TikTok for hours on end is not something to be proud of. Proud because, thanks to said scrolling, I'm often mistaken for part of the Gen-Z population. In translation: I'm in tune with what's happening in culture and speak the language of a completely different demographic, which is extremely important in marketing and business.
I see how different the reality of some of my 30+ friends who refuse to download TikTok is from mine. Sometimes it feels like we live in two completely different countries, have access to completely different music charts, and belong to completely different generations. That's a topic to unpack another time, but you get the gist.
Seems like those anti-TikTok friends of mine have always been onto something, though. While we were worrying about how much the app knows about us and our preferences, and whether that's good or bad, undetected tech thieves have been lurking after something else entirely: our likeness.
In the past few weeks, creators have been sharing stories of their faces and voices being stolen. Questionable companies selling questionable products are manipulating creator videos to make it seem like those creators are endorsing their products. These videos are actually pretty convincing, so much so that even the most trained eyes wouldn't think twice about them.
A New Era of IP Infringement
The first video I stumbled upon was by creator and podcaster Arielle Lorre. She shared how beauty company Skaind leveraged existing content from her podcast and digitally manipulated it to make it seem like she was talking about Skaind products.
In an effort to have the ad removed, Arielle first messaged the company asking them to take the video down.
Skaind responded, first apologizing but then stating:
“Our marketing team accessed this content through an artificial intelligence platform without being aware it was a recognized person or person with image rights.”
There are a couple of red flags here.
First, the company clearly knew what it was doing: in the video, besides the AI podcast version of Arielle, they reuse her skincare videos and other lifestyle content from her TikTok account. Claiming they weren't aware "it was a recognized person" is clearly a lie. Not to mention that the "interviewer" you can see in the deepfake video is popular creator and podcaster Rich Roll, who - you guessed it - also had his original videos tinkered with for this ad.
The second red flag is what happened after this interaction. Arielle, of course, sent a cease and desist letter to Skaind (which promptly blocked her) and reported the ad to Meta. However, Instagram reviewed the report and responded that "this post does not go against our Community Standards."
Ummmmmm…. what do you mean does not go against our Community Standards?
This irked me, so I went digging.
The first statement I found on this topic in Instagram's Help Center was:
“Under Instagram’s Terms of Use and Meta's Community Standards you can only post content to Instagram or Threads that doesn’t violate someone else's intellectual property rights.”
Interpreted very literally and broadly, that would mean you can't just use someone else's videos, tinker with them, and then post them as your own.
I went to read further.
“It's possible to infringe someone else's copyright when you post their content on Instagram or Threads, or facilitate copyright infringement, even if you:
Bought or downloaded the content (example: a song from iTunes)
Recorded the content onto your own recording device (examples: a song playing in the background during a party, concert, sporting event, wedding, etc.)
Gave credit to the copyright owner
Included a disclaimer that you didn’t intend to infringe copyright
Didn’t intend to profit from it
Modified the work or added your own original material to it
Found the content available on the internet
Saw that others posted the same content as well
Think that the use is a fair use
Are using an unauthorized streaming device or service (examples: a “jailbroken” or “loaded” app or service)”
I'm not a lawyer, so if you are, correct me if I'm wrong. I'd say this means that even if you bought the content, credited the owner, or otherwise checked all the boxes to make sure it's not "stealing," you could still be violating someone else's IP rights.
I read a little further and thought hard about this, but then I paused. Huh… aren't we all technically infringing on others' IP rights almost daily?
Reposting that Pinterest inspo quote you liked so much, sharing a video on your stories from the latest BRAT tour, tweaking a picture to create a meme… that can all be considered a violation of intellectual property rights.
The internet is basically a black hole of stolen and tinkered IP. Since the start of the social media era, we have all essentially infringed on someone’s IP thousands of times. Most of the time, however, it’s nothing of note to warrant lawsuits or reporting on these platforms. That’s the issue, though. Where does the thin line start and where does it end?
Are people outraged about Arielle's situation because we're just not used to this type of IP infringement yet? Will the new era of memes be AI deepfakes? Will our new quote-sharing actually be a clip we cut from Mel Robbins' podcast and posted aesthetically on our feeds, instead of the current Pinterest/Tumblr-style image reshare?
Obviously I’m playing a little bit of devil’s advocate here. It’s my job to question things, remember? I’m a UX Designer and a brat at my core after all. It’s an interesting topic of discussion though… the future of content. A frequent one people, creators, the media, and various experts are having.
"Good artists copy; great artists steal" is a famous saying you'll overhear the creative director types quoting to their buddy next to you while you're in line at Cafe Lyria waiting to order your coffee. In 2025, though, it rings truer than ever. Everyone seems to be stealing, and we're all used to it.
Followers maketh man
This brings me to my third red flag. Going back to Skaind's response to Arielle, the phrases "recognized person" and "person with image rights" give me the heebie-jeebies. So the right to use my likeness depends on what, exactly? The number of followers I have, or whether I'm a public figure? If I have image rights to the video, you can't use it. Buuut if I don't, you can do whatever you want with it, including using my face to sell and endorse products?
Small creators and non-creators have been the ones suffering the most when businesses and other influential people steal their ideas and art, for decades if not centuries. It's nothing new, but it's the sad reality that if you have even a tiny amount of "influence" versus none, you're far more likely to win a case against misuse of your IP or your image.
I'm worried that in this new era of AI deepfakes and IP lawsuits, the gap will just keep growing and the consequences will become unbearable. Having your face photoshopped onto an ad is one thing; CREATING A WHOLE VIDEO OF YOU TALKING is a whole other can of worms… one that needs to be opened and discarded ASAP.
A Marketplace of Faces
The final red flag, which went over my head when I first watched Arielle's video but burned my eyes in the process of writing this letter, is Skaind's mention of an "artificial intelligence platform." Allegedly, their team found Arielle's videos on, I'm assuming, some type of content marketplace or deepfake platform and used them thinking they were labeled "free use." If this is true, and not another lie to cover their asses, then it's even more concerning.
Why are these videos readily available on these platforms if Arielle never granted them those rights? Could she have granted the rights without realizing? I doubt it; her lawyers would have caught it by now, and she probably would have mentioned it in her original video.
So this begs the question: how do we solve this? How do we make sure our videos aren't out there in the ether, being sold for use in not just ads but questionable, reputation-ruining content? This was "just an ad," but what if it were… a revenge porn video? Someone posing as her on dating apps? This can get so ugly, so quickly, and we're barely doing anything about it.
What happened to Arielle and countless other creators and non-creators is not okay and should be regulated.
How do we regulate it?
Now that’s a discussion I’d like to have not just once, but frequently.
Are these discussions even happening?
Barely, or at least I haven't found much about it on the internet. Everything is still very surface-level, and I have an inkling nothing will be done about it until there's a situation that warrants someone testifying before Congress.