Deepfakes, the subreddit where people were making fake porn using AI tech (some of which involved video game characters), has been shut down. Reddit found it in violation of its policy against “involuntary pornography.” Despite rules prohibiting depictions of “any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission,” Deepfakes had been up for more than two months and gained a following of nearly 90,000 subscribers.

DISCUSSION

Purely playing devil’s advocate here: do those cited rules actually apply?

In these fakes, the actual person in a state of nudity or engaged in an act of sexual conduct did give permission for both the creation and posting of the material. These are edits of existing pornography, after all. Inserting another person’s likeness digitally on top of that, while friggin’ creepy, isn’t the same thing as changing who actually was nude and engaged in sex.

If someone makes porn of a person having sex with a mask on, and that mask depicts someone else such as a famous celebrity, is that somehow different? Is it against these rules? If someone wears a realistic Trump mask in their porno, is that involuntary pornography because Trump didn’t sign off on it?

How about if someone uses rotoscoping to insert drawn art into the video, replacing an actor’s head with that of a cartoon caricature of someone? If instead of a physical mask of Trump, you edit in a political cartoon of Trump, does that change things somehow?

So then, if someone uses computer software to insert a “digital mask” of Trump into their porn video, is that somehow meaningfully different than using a physical mask, or a cartoon mask?

You could absolutely make the case for unauthorized use of a person’s likeness. (Although a case could also be made for a fair-use exception for parody.) But could you realistically argue that any of these examples would actually qualify as “involuntary pornography”?

Personally, fakes like these weird me out. But clearly the rules about involuntary pornography exist to protect people from actual exploitation.

If someone snaps a Polaroid of me, cuts out the face, and pastes it on top of a magazine picture of a nude Playboy Bunny, I can quite rightly be pissed off at them for being creepy, but I couldn’t realistically claim that I was the victim of involuntary pornography.