What Are 'Deepfakes,' and Why Are Pornhub and Reddit Banning Them?


Last year, a video popped up on Reddit that appeared to show Gal Gadot having sex with a man playing her stepbrother. It was fake, of course, but it looked uncannily real. And with it, a new genre of fake celebrity porn was created. Since then, the tangled ethical implications of what people are now calling “deepfakes” are changing how big companies like Reddit, Twitter, and even Pornhub handle X-rated content. Here’s your primer on deepfakes, and what these creepy videos mean for consent, pornography, and fake news.


What is it?

“Deepfakes” are videos that use AI technology to paste a celebrity’s face onto a different body. The term comes from a Reddit user, “deepfakes,” who started posting pornographic celebrity videos made with this face-swapping technology; targets included Gal Gadot, Scarlett Johansson, Aubrey Plaza, and Taylor Swift. (Other videos from different sources have since targeted Natalie Portman, Daisy Ridley, and Michelle Obama, to name a few.) The X-rated videos were posted to a subreddit, also called “deepfakes,” which quickly gained thousands of subscribers. They soon cropped up on Pornhub and similar sites.

How do you do it?

The creator told Motherboard in December that he trained an algorithm on photos and videos of Gadot so it could track her face and expressions at different angles, then used it to swap her face onto the performer’s in existing pornographic videos. “I just found a clever way to do face-swap,” he said. It wasn’t long before an app, FakeApp, surfaced on the deepfakes subreddit; it lets anyone face-swap celebrities, friends, coworkers, and so on, as long as they have the right hardware.
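The approach behind these videos is widely described as an autoencoder with one shared encoder and a separate decoder per person: the swap comes from encoding a frame of person A and decoding it with person B’s decoder. The toy sketch below, which is purely illustrative and not the creator’s actual code, shows that structure with plain linear layers and random vectors standing in for face crops (all names, sizes, and training settings here are invented; real systems use deep convolutional networks trained on thousands of aligned face images).

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT, N = 16, 4, 200          # toy sizes, chosen for illustration

faces_a = rng.normal(size=(N, DIM))  # stand-in for person A's face crops
faces_b = rng.normal(size=(N, DIM))  # stand-in for person B's face crops

enc = rng.normal(scale=0.1, size=(DIM, LATENT))    # shared encoder
dec_a = rng.normal(scale=0.1, size=(LATENT, DIM))  # decoder for person A
dec_b = rng.normal(scale=0.1, size=(LATENT, DIM))  # decoder for person B

mse_before = float(np.mean((faces_a @ enc @ dec_a - faces_a) ** 2))

lr = 0.01
for step in range(500):
    # Train each person's decoder (and the shared encoder) to
    # reconstruct that person's own faces.
    for faces, dec in ((faces_a, dec_a), (faces_b, dec_b)):
        z = faces @ enc          # encode
        recon = z @ dec          # decode with that person's decoder
        err = recon - faces      # reconstruction error
        # gradient descent on mean squared error
        grad_dec = z.T @ err / N
        grad_enc = faces.T @ (err @ dec.T) / N
        dec -= lr * grad_dec
        enc -= lr * grad_enc

mse_after = float(np.mean((faces_a @ enc @ dec_a - faces_a) ** 2))

# The "swap": encode person A's faces, decode with person B's decoder.
swapped = (faces_a @ enc) @ dec_b
```

Because the encoder is shared, it learns a representation of pose and expression common to both people, while each decoder learns one person’s appearance; that is why decoding A’s frames with B’s decoder produces B’s face making A’s expressions.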


Why is it bad?

Pornographic deepfakes are non-consensual and deeply gross. They depict extremely intimate, explicit acts, and are created without permission from the celebrity or the porn actor. Think of them like revenge porn or hacked celebrity nudes, which can be weaponized to humiliate a person (usually a woman) in front of millions of people. Though revenge porn and hacked nudes are different from deepfakes in that the original material doesn’t need to be altered, the ethical issues of consent and objectification of victims are similar. Bottom line: She doesn’t have any say in what her body is made to seem like it’s doing, and strangers are getting off on that.

Who has banned it?

Reddit, Twitter, and Pornhub have all announced measures to wipe explicit deepfakes from their platforms. Pornhub told Motherboard it will remove any “non-consensual” content. Reddit issued a ban on what it called “involuntary pornography” and shut down the deepfakes subreddit. Twitter told The Verge it will suspend accounts that create or share “intimate media” like deepfakes without the subject’s consent. Other platforms like Discord and Gfycat have also moved to ban non-consensual face-swapped pornography.

What’s the punishment?

Legally, there is little protection against deepfakes because the body doing the act isn’t the celebrity’s own. These videos aren’t made from illegally stolen nudes, and as Wired wrote, “You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.” You can even drag the First Amendment into it, defending deepfakes as art, satire, and “free speech” because they weren’t technically created illegally.


What does this mean for AI face tech?

The thing about face-swapping tech is that it isn’t always as icky as fake celebrity porn. There’s a wide market out there for realistic face-swapping that has nothing to do with sex. You know the most famous and high-tech example: Princess Leia’s AI face at the end of Rogue One. And more recently, someone used FakeApp to plaster Nicolas Cage’s face all over iconic movie scenes, which was pretty damn amusing.

But these are just the infant days of the technology. When the tech gets really good, and it isn’t as easy to spot fakes, then “video proof” takes on a whole new meaning. You could argue that a real video showing you doing a bad thing was faked—you might even call it “fake news.” In the end, it dangerously blurs the lines between reality and mimicry, and anyone from celebrities to your mom to the president could get fooled.


What does this mean for you?

For now, Reddit, Twitter, and Pornhub are relying on users to report deepfakes living on their platforms so they can delete them and ban the associated accounts. But deepfakes are still cropping up on these platforms, where policing every piece of content is virtually impossible. So consider this a “see something, say something” situation: If you see a video that stitches a celebrity’s face onto a porn star’s body, alert the website and get it taken down.


Lifestyle – Esquire
