Reddit finally bans ‘deepfake’ AI porn videos and images

8 Feb 2018

Reddit logo. Image: djandywdotcom/Flickr (CC BY-SA 2.0)

Reddit and other platforms are cracking down on AI-generated porn videos.

The burgeoning ‘deepfake’ communities on various online platforms have gained significant attention of late.

The videos and images are created using relatively simple artificial intelligence (AI) software that superimposes the faces of celebrities or strangers onto the bodies of porn actors.

A desktop app called FakeApp, released in recent months, made it even easier for people to create such content. The app was first brought to wider attention by Motherboard, which found it was being used to make videos of Taylor Swift, Gal Gadot and other famous faces.

The app requires a large number of photographs taken from a variety of angles to produce accurate results, making celebrities a prime target for such non-consensual pornographic videos.

Many platforms have already taken action

Reddit’s ban of ‘deepfake’ content follows similar moves from Twitter and Pornhub, both of which have banned the pornographic content from their platforms. The videos have been likened to revenge porn, due to the non-consensual use of people’s faces on the bodies of porn actors.

Pornhub’s vice-president, Corey Price, said the company takes “a hard stance against revenge porn, which we believe is a form of sexual assault”. He said users are flagging this content and moderators are removing it as alerts roll in.

GIF-hosting site Gfycat was a well-known source of these pornographic clips, but it banned them recently, citing its terms of service around the removal of “objectionable content”.

Reddit was a major hub for this type of content prior to the ban, with the ‘deepfakes’ subreddit having approximately 90,000 subscribers. Not all of the content shared there was pornographic, as videos featuring spoofs of US president Donald Trump and actor Nicolas Cage were also popular on the forum.

Reddit pulls shutters

Members of the forum began to raise concerns about the potential for images of child abuse to circulate, and Reddit addressed this in the policy accompanying the ban: “Depending on the context, this can, in some cases, include depictions of minors that are fully clothed and not engaged in overtly sexual acts. If you are unsure about a piece of content involving a minor or someone who appears to be a minor, do not post it.”

Both r/deepfakes and r/deepfakesNSFW have been shuttered by Reddit, and the company has reworded the language around its involuntary pornography policy. “Reddit prohibits the dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked.

“Images or video of intimate parts of a person’s body, even if the person is clothed or in public, are also not allowed if apparently created or posted without their permission and contextualised in a salacious manner, eg ‘creepshots’ or ‘upskirt’ imagery.”

The updated policy also addresses the ‘deepfake’ phenomenon directly: “Additionally, do not post images or video of another person for the specific purpose of faking explicit content or soliciting ‘lookalike’ pornography.”

Reddit also closed r/Celebfakes, a forum that had been active for years, which mainly consisted of faked static images.


Ellen Tannam was a journalist with Silicon Republic, covering all manner of business and tech subjects

editorial@siliconrepublic.com