
How deep is your love? A short tale of deepfakes.

Today I want to tell a short tale of deepfakes. Let’s start with an impressive demonstration:  

The best definition I found (because it’s honest and funny) is the following, from Urban Dictionary:

A horrific AI-assisted face swapping app which takes someone’s face and places it on someone else’s body. Particularly great if you’re a creep imaging what your favorite celeb-crush looks like naked.

– Definition from Urban Dictionary

What we can learn from this definition is that deepfakes have something to do with artificial intelligence and that their roots are in porn. Indeed, deep learning technology is used to create realistic videos that present fake content (e.g. a speech that never happened, an altered sex tape of a celebrity, …). The term was coined in 2017 by a Reddit user with the pseudonym deepfakes who distributed deepfake porn, and the technology gained traction mainly among users who swapped celebrity faces onto their favorite porn actors. By now, Reddit has banned deepfakes, and other platforms like Pornhub have followed suit, since porn altered with deepfake technology is classified as non-consensual.

But morph porn was only the beginning. By now, there are modified videos showing youth activist Emma González shredding the Constitution or Barack Obama calling Donald Trump names.

To create a deepfake video, you need footage of a person making body gestures and facial expressions (the source) and then map someone else’s face onto it (the target). In contrast to conventional face swaps, deepfakes preserve the facial expressions of the source, which makes the target look far more realistic. On top of that, machine learning speeds up the process, making it easy and fast to create fake videos.
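
To make the idea a bit more concrete, here is a deliberately toy-sized sketch of the shared-encoder / two-decoder autoencoder approach that early face-swap tools are commonly described as using. This is an illustrative assumption, not a recipe from this post: the shared encoder is meant to capture pose and expression, each decoder learns to render one identity, and a “swap” means encoding a source face and decoding it with the target’s decoder. All layer sizes, names, and the flattened 64×64 input are placeholders.

```python
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened 64x64 RGB face crop (placeholder size)

# Shared encoder: intended to capture pose and expression.
encoder = nn.Sequential(nn.Linear(IMG, 512), nn.ReLU(), nn.Linear(512, 128))
# One decoder per identity: each learns to render "its" face.
decoder_a = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, IMG), nn.Sigmoid())
decoder_b = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, IMG), nn.Sigmoid())

params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(faces_a, faces_b):
    """Each decoder learns to reconstruct faces of its own identity."""
    recon_a = decoder_a(encoder(faces_a))
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

def swap(source_faces_a):
    """Render identity B with the pose/expression of faces from identity A."""
    with torch.no_grad():
        return decoder_b(encoder(source_faces_a))

# Usage (with real face crops instead of random tensors):
# faces_a, faces_b = torch.rand(32, IMG), torch.rand(32, IMG)
# train_step(faces_a, faces_b); fake_b = swap(faces_a)
```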

There’s freeware* available that doesn’t require a lot of skill to create deepfakes. With that, the technology is put into the hands of the average person. The technology behind all this is called a generative adversarial network (GAN) and was introduced by Ian Goodfellow and his colleagues in 2014. For the sake of completeness, you can find the original publication here. Even if you don’t understand the abstract of the paper, you can still use the available freeware, which has helped deepfakes spread rapidly.
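
Since GANs are only named in passing here, the following is a minimal sketch of the adversarial idea behind them: a generator tries to produce samples that a discriminator cannot tell apart from real data, and the two networks are trained against each other. This toy works on made-up 2-D data, not faces; the network sizes, learning rates, and the Gaussian “real” data are arbitrary assumptions for illustration only.

```python
import torch
import torch.nn as nn

latent_dim, data_dim, batch = 16, 2, 64

# Generator: maps random noise to fake samples.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
# Discriminator: outputs the probability that a sample is real.
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.randn(batch, data_dim) * 0.5 + 2.0  # stand-in "real" data
    fake = G(torch.randn(batch, latent_dim))

    # Train the discriminator: label real samples 1, fake samples 0.
    d_loss = bce(D(real), torch.ones(batch, 1)) + bce(D(fake.detach()), torch.zeros(batch, 1))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # Train the generator: try to make the discriminator output 1 for fakes.
    g_loss = bce(D(fake), torch.ones(batch, 1))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()
```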

But without going into too much technical detail, let’s talk about the consequences of deepfakes. It’s easy to imagine that deepfakes could be used not only for fun but also with bad intentions, such as manipulating public opinion on political topics or creating revenge and morph porn. As a short intro, I recommend the following video on the issue:

“Believe nothing you hear, and only half that you see”

– Edgar Allan Poe

By now, all of us are aware that photos can be altered with Photoshop and other software. Still, video has remained a trustworthy source of information. With ever-improving deepfakes, it might get harder to differentiate between real and fake. When videos lose their evidential value, what can we trust in the future? People with malicious intentions, in particular, could benefit from declining trust, claiming facts to be fake and real actions to be inauthentic.

Since I really don’t like to leave you with dystopian thoughts and feelings, enjoy the following video putting Nicolas Cage into movies he never appeared in:

*I do not name the freeware for deepfake face swapping here because, personally, I think the technology currently does more harm than good. If you are interested anyway, you will be able to find it easily.