In a Facebook ad, a woman with a face similar to that of actress Emma Watson smiles shyly and leans into the camera, appearing to initiate a sexual act. But the woman is not “Harry Potter” star Watson. The ad was part of a massive campaign this week for a deepfake app that allows users to swap any face in any video they like.
Deepfakes are media in which faces or voices have been digitally altered or manipulated. Typically, deepfake creators produce videos in which celebrities appear to participate voluntarily, even though they never did. Increasingly, the technology has been used to create nonconsensual pornography featuring the faces of celebrities, influencers or anyone, including children.
The advertising campaign on Meta shows how this once-advanced technology has rapidly made its way into readily available consumer apps advertised on mainstream parts of the internet. Although many platforms ban manipulative and malicious deepfake content, apps like the one reviewed by NBC News have been able to slip through the cracks.
On Sunday and Monday, an app for creating “deepfake faceswap” videos ran more than 230 ads across Meta’s services, including Facebook, Instagram and Messenger, according to a review of Meta’s ad library. Some of the ads featured what looked like the beginning of a pornographic video, opening with the well-known intro track from the porn platform Pornhub. Seconds in, the women’s faces were swapped with those of famous actresses.
When Lauren Barton, a journalism student in Tennessee, saw the same ad on a different app, she was shocked enough to screen-record it and tweet it, where it has been viewed more than 10 million times, according to Twitter’s view counter.
“It can be used in public schools with high schoolers who are bullied,” Barton said. “It could ruin someone’s life. They could get in trouble at their job. And it’s super easy and free to do. All I had to do was upload a picture of my face and I had access to 50 free templates.”
Watson’s likeness appeared in 127 of the Meta ads. Another 74 featured actress Scarlett Johansson’s face swapped onto women in similarly provocative videos. Neither actress responded to a request for comment.
The caption on 80 of the ads read, “Swap faces with someone. Enjoy yourself with AI Swap Face Technology.”
After NBC News reached out to Meta for comment on Tuesday, all ads for the app were removed from Meta’s services.
While the videos did not show any sexual acts, their suggestive nature illustrates how the app could be used to generate fake sexual content. The app allows users to upload their own videos to manipulate and includes dozens of video templates, many of which appear to be taken from TikTok and similar social media platforms.
Preset categories include “Fashion,” “Bridal,” “For Men,” “For Women,” and “TikTok,” while the category with the most options is called “Hot.” It features videos of scantily clad women and men dancing and posing. After selecting a video template or uploading their own, users can input a photo of anyone’s face, and get a face-swapped version of the video in seconds.
The terms of service for the app, which costs $8 per week, say it does not allow users to impersonate others or upload sexually explicit material through its services. The app developer listed on the App Store is called Ufoto Limited, which is owned by the Chinese parent company Wondershare. Neither company responded to a request for comment.
Meta banned most deepfake content in 2020, and the company prohibits adult content in ads, including nudity, depictions of people in explicit or suggestive positions, or activities that are sexually provocative.
“Our policies prohibit adult content, whether or not it was generated by AI, and we have prohibited this page from advertising on our platform,” a Meta spokesperson said in a statement.
The same ads also appeared in free photo-editing and gaming apps downloaded from Apple’s App Store, where the deepfake app first appeared in 2022 and remained available as a free download for anyone 9 and older.
An Apple representative said the company does not have specific rules governing deepfakes but prohibits apps that contain pornography and defamatory material. Apple said it removed the app from the App Store after being contacted by NBC News.
The app is also available on Google Play, where it is rated “Teen” for “suggestive themes.”
Apple and Google have previously taken action against similar AI face-swap apps, including one that was the subject of a Reuters investigation in December 2021. Reuters found that the app was advertising the creation of “deepfake porn” on pornographic websites. At the time, Apple said it did not have specific guidelines for deepfake apps but prohibited content that was defamatory, discriminatory or likely to intimidate, humiliate or harm anyone. While the app’s rating and advertising campaigns were adjusted, it is still available as a free download on Apple’s App Store and Google Play.
The app reviewed by NBC News is one of the latest in a surge of freely accessible consumer deepfake products.
A search for “deepfakes” on the App Store brings up dozens of apps with similar technical capabilities, including apps promoting “hot” content creation.
Mainstream examples of the technology show celebrities and politicians doing and saying things they never actually said or did. Sometimes the effects are comical.
However, deepfake technology has overwhelmingly been used to create pornography featuring people who did not consent. As the technology has improved and become more widespread, the market for nonconsensual sexual imagery has grown. Some websites allow users to sell nonconsensual deepfake porn behind paywalls.
A 2019 report by DeepTrace, an Amsterdam-based company that monitors synthetic media online, found that 96% of deepfake content online was pornographic.
In January, female Twitch streamers spoke out after a popular male streamer apologized for consuming deepfake porn of his female peers.
Research by independent livestreaming analyst Genevieve Oh found that traffic to the top deepfake porn website spiked after the Twitch streamer’s apology. Oh’s research also found that the number of deepfake pornographic videos has almost doubled every year since 2018. Oh said February saw the highest number of deepfake porn videos ever uploaded in a single month.
While the nonconsensual sharing of sexually explicit photos and videos is illegal in most states, laws specifically addressing deepfake media are in effect only in California, Georgia, New York and Virginia.