Fake photographs are nearly ubiquitous on the internet, and soon fake videos will likely be as well. "Deepfake," a portmanteau of "deep learning" and "fake," refers to a method of creating convincing simulated videos by manipulating existing images with the help of artificial intelligence. This technology can be used to make it seem as if people said or did things they never actually said or did: "The algorithmic machine learning technology of deepfakes allows anyone to create a very convincing simulation of a human subject given ample photographic evidence on which to train the machine about what the image should look like."
Here’s a video of President Obama explaining the dangers of deepfakes. Well, it’s actually a deepfake video featuring actor Jordan Peele’s voice and a manipulated version of the former president:
This technology can be used for harmless fun (have you ever wanted to see the Mona Lisa frown?), but its most prevalent use, unfortunately, appears to be in the creation of fake pornographic videos of celebrities.
Reporting on this technology in February 2018, the BBC noted:
In recent weeks there has been an explosion in what has become known as deepfakes: pornographic videos manipulated so that the original actress’s face is replaced with somebody else’s.
As these tools have become more powerful and easier to use, it has enabled the transfer of sexual fantasies from people’s imaginations to the internet. It flies past not only the boundaries of human decency, but also our sense of believing what we see and hear.
Beyond its use for hollow titillation, the sophistication of the technology could bring about serious consequences. The fake news crisis, as we know it today, may only just be the beginning.
A number of actresses, such as Natalie Portman, Emma Watson, and Gal Gadot, have already fallen victim to these deepfake pornographers. In May 2019, we started receiving queries about another deepfake, this time featuring Jenna Fischer, an actress best known for her role as Pam Beesly on the television series The Office.
The website “CelebJihad” posted a doctored video and claimed that it showed “Jenna Fischer’s sex tape.” The video, however, was just another deepfake, created by superimposing the Office actress’s face onto someone else’s body.
The original adult film was created in December 2018 for the “Girls Do Porn” series. We won’t link to that decidedly NSFW content, but here’s a comparison of the deepfake (left) and the original footage (right):
In addition to spreading this doctored video of Fischer, CelebJihad also posted a number of doctored photographs supposedly showing the same actress in various states of undress. Again, we won’t link to the NSFW content, but here are some comparisons to show that these images were doctored:
CelebJihad did present two genuine, albeit much more tame, photographs of Fischer. The first was posted to her Instagram page in March 2018. The other came from a 2006 photo spread in Jane magazine.
Deepfake technology continues to grow more advanced. While it typically takes a rather large data set to create a convincing fake video, CNET reported in May 2019 that Samsung had developed software that could generate a fake clip from a single image:
Software for creating deepfakes — fabricated clips that make people appear to do or say things they never did — usually requires big data sets of images in order to create a realistic forgery. Now Samsung has developed a new artificial intelligence system that can generate a fake clip by feeding it as little as one photo.
Lots of deepfakes, like the one animating the Mona Lisa, are harmless fun. The technology has made possible an entire genre of memes, including one in which Nicolas Cage’s face is placed into movies and TV shows he wasn’t in. But deepfake technology can also be insidious, such as when it’s used to graft an unsuspecting person’s face into explicit adult movies, a technique sometimes used in revenge porn.