Remember the good old days when “Photoshopped” images were all we had to worry about when judging the veracity of content online?
People have been altering images practically since the invention of photography, but up until recently, it was much harder and more expensive to create convincing fake video or audio.
But recent advances in machine learning and artificial intelligence have led to the advent of deepfakes: fake video or audio of events that never happened. Now, people can make anyone, from politicians to celebrities to themselves, say or do whatever they want with remarkably convincing results.
What is a deepfake?
The term deepfake is a mashup of “deep learning” and “fake,” because deepfakes use machine learning and artificial intelligence techniques to generate fabricated audio or video content.
So far, deepfakes are mainly being used for fun, for pranks, and for pornography. One study found that a staggering 96% of deepfake videos online were pornographic.
And they’re not just used for videos. Deepfake technology can create convincing “voice skins” or “voice clones” that can make it sound as though anyone is saying anything you want, and other technology can create utterly convincing still photographs of people who never existed.
Who is making deepfakes?
It’s still relatively difficult to make a convincing deepfake on an ordinary home computer, since doing so requires powerful hardware and specialized software. But that isn’t slowing anyone down.
Deepfakes are being produced by everyone from researchers to amateur enthusiasts, visual effects companies, and porn producers. There's even some evidence that governments and political parties are getting into the game to produce media that might discredit their opponents or extremist groups.
There are companies that will create deepfake videos for you, and even an app called Zao that will put your face (somewhat less convincingly) into clips from your favorite movies and TV shows (though the app has raised serious privacy concerns).
What are the concerns around deepfakes?
The biggest concern around deepfakes is that they will be used maliciously. Their widespread use in pornography has raised very real fears that deepfakes could be used for blackmail or "revenge porn," in which a person's face is superimposed into a pornographic video to embarrass, discredit, or extort them.
Governments and politicians are also concerned that deepfaked videos could be used to sway elections or to fuel social engineering scams that convince people they are receiving information from a trusted source.
Perhaps even more concerning is the growing tendency for genuine footage or audio to be dismissed as fake whenever it reflects poorly on a person.
How do you spot a deepfake?
For now, most deepfakes are presented as such and not intended to really fool anyone, but as technology advances, people are rightfully concerned about how it could be misused.
Lower-quality deepfakes are easier to spot: they might show flickering around the edges of transposed faces, poor lip-syncing, or patchy skin tone and lighting. Poorly rendered fine details, such as hair, jewelry, and teeth, can also be a giveaway. In 2018, a U.S. study found that deepfake subjects didn’t blink often enough, but as soon as this finding was published, creators got savvier and added blinks.
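The blink-frequency heuristic from that 2018 research can be illustrated with a toy sketch. In practice, a per-frame "eye openness" score would come from facial landmarks detected by a computer-vision library; here it is just a list of numbers, and the function names and thresholds are illustrative assumptions, not any published detector's actual values.

```python
def count_blinks(eye_openness, closed_threshold=0.2):
    """Count blinks in a sequence of per-frame eye-openness scores.

    A blink is a transition from open (score above the threshold) to
    closed (score at or below it). In a real system the scores would be
    computed from facial-landmark geometry; here they are plain floats.
    """
    blinks = 0
    was_open = True
    for score in eye_openness:
        if was_open and score <= closed_threshold:
            blinks += 1
            was_open = False
        elif score > closed_threshold:
            was_open = True
    return blinks


def blink_rate_suspicious(eye_openness, fps=30, min_blinks_per_minute=8):
    """Flag footage whose blink rate falls well below the human norm.

    People typically blink 15-20 times per minute; early deepfakes often
    blinked far less. The cutoff of 8 per minute is an illustrative choice.
    """
    minutes = len(eye_openness) / fps / 60
    if minutes == 0:
        return False
    rate = count_blinks(eye_openness) / minutes
    return rate < min_blinks_per_minute
```

Of course, as the article notes, this exact tell has already been patched by deepfake creators, which is why detection now relies on many subtler signals at once.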
But as the technology improves, the fakes get harder and harder to spot. A wide range of institutions are working on deepfake detection methods, and the Deepfake Detection Challenge created a dataset to help researchers work on the problem.
Ironically, AI may be the answer to spotting deepfakes, as it can be trained to spot tiny inconsistencies that the human eye might miss. Another solution might be to use blockchain technology to create and maintain the provenance of media so that origins and potential manipulations can always be accounted for.
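The provenance idea can be sketched with a minimal hash chain. This is a toy illustration, not a real blockchain (there is no consensus mechanism or distributed ledger): each record stores a hash of the media plus the hash of the previous record, so altering any earlier file or entry breaks every later link. All class and function names here are hypothetical.

```python
import hashlib


def record_hash(media_bytes, prev_hash, note):
    """Hash of one provenance record: prior link + media digest + note."""
    h = hashlib.sha256()
    h.update(prev_hash.encode())
    h.update(hashlib.sha256(media_bytes).hexdigest().encode())
    h.update(note.encode())
    return h.hexdigest()


class ProvenanceChain:
    """Append-only list of (media digest, note, link hash) records."""

    def __init__(self):
        self.records = []

    def append(self, media_bytes, note):
        prev = self.records[-1][2] if self.records else "genesis"
        digest = hashlib.sha256(media_bytes).hexdigest()
        self.records.append((digest, note, record_hash(media_bytes, prev, note)))

    def verify(self, media_versions):
        """Recompute every link; False if any media or record was altered."""
        prev = "genesis"
        for (digest, note, rec), media in zip(self.records, media_versions):
            if hashlib.sha256(media).hexdigest() != digest:
                return False
            if record_hash(media, prev, note) != rec:
                return False
            prev = rec
        return True
```

A verifier holding the chain can confirm that each edit of a clip was logged in order, and any undisclosed manipulation shows up as a broken link.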
Whatever the future holds, deepfakes aren’t going away. It’s likely that the technology for creating them and the technology for spotting them will develop hand in hand, and audiences will simply need to remain aware and on alert for deepfakes as they consume their media.
Read more about these and other future trends in my books, Business Trends in Practice: The 25+ Trends That are Redefining Organizations and Tech Trends In Practice: The 25 Technologies That Are Driving The 4th Industrial Revolution.