Deepfake technology is like Photoshop for video. It allows non-technical folks to alter a video’s imagery and audio so completely that, at first and even second glance, it appears authentic. Deepfakes almost put the spin doctors out of business, because the story you start with has already been spun.
An audio voice skin, or clone, is a digital asset that can transform a speaker’s voice in real time. It replicates unique speech patterns, pronunciation and even emotional range. While there are many benevolent applications for this kind of audio manipulation, such as a tool for someone who has lost their voice, audio clones are part and parcel of the larger fake-media world.
Artificial intelligence is the driver of this technology: it takes real photos and audio and ‘learns’ their characteristics. The longer the AI has to churn, and the more data it is given, the better the results.
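To make that “churning” concrete, here is a toy sketch of the pattern described above. It is not a real deepfake model (those are large neural networks trained on faces and voices); it just fits a simple line to noisy data by gradient descent, with all names and numbers invented for illustration. The point it demonstrates is the one in the paragraph: the longer the training loop runs, the closer the model’s output gets to the real signal.

```python
import random

random.seed(0)

def make_data(n):
    """Noisy samples of a 'real' signal y = 2x + 1 that the model must learn."""
    xs = [random.uniform(-1, 1) for _ in range(n)]
    ys = [2 * x + 1 + random.gauss(0, 0.1) for x in xs]
    return xs, ys

def train(xs, ys, steps, lr=0.1):
    """Fit y = w*x + b by gradient descent; more steps give a closer fit."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def error(w, b, xs, ys):
    """Mean squared gap between the model's output and the real data."""
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs, ys = make_data(200)
err_short = error(*train(xs, ys, 10), xs, ys)    # little time to learn
err_long = error(*train(xs, ys, 500), xs, ys)    # much more time to learn
print(f"error after 10 steps: {err_short:.3f}, after 500 steps: {err_long:.3f}")
```

Running it shows the error after 500 steps is far smaller than after 10: the imitation improves with training time, which is exactly why well-resourced fakers produce more convincing results.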
As of last year, there were over 15,000 deepfake videos on the web, of which 96% were porn. Fakers take an actual porn video and superimpose a celebrity’s face onto it. The remainder includes legitimate users, such as visual effects studios and folks just goofing around, but also those who want to discredit, disrupt and sow havoc.
Ian Sample, writing for The Guardian, points to examples of deepfake identities such as non-existent journalists disseminating their news pieces and pundits claiming to work for the Center for Strategic and International Studies. It’s hard to figure out who creates these fake people, but one good guess is foreign agencies trying to manipulate public opinion.
We are flooded with information. Figuring out what’s real and what isn’t, whether its author has an agenda or bias, and whether the information has been adequately corroborated is time-consuming at best and impossible at worst. Add in deepfakes and the problem definitely gets worse: deepfake tech amplifies the distortion game by several degrees.
One way for a liar to deny the truth is to claim that the authentic piece of video or audio making them look bad is in fact a fake. As the public becomes more aware that deepfakes exist, as the technology improves, and as general trust in what we see and read diminishes, this ploy gets easier and more effective.
Like all threats, deepfakes are evolving. At first, there were telltale signs that an image or video had been tampered with, or in this case, artificially created. For example, because the videos are built from photographs that (normally) show a person with their eyes open, early deepfake subjects often never blinked; the AI had no closed eyes to learn from. The technology has since adapted to address this and other tells.
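The blink tell can be illustrated with the eye-aspect-ratio (EAR) heuristic used in some blink detectors: six landmark points around each eye yield a ratio that collapses when the eye closes. The sketch below is a simplified illustration with made-up landmark coordinates and made-up per-frame values; a real detector would extract the landmarks from every frame of the video, and a subject who never dips below the threshold would be suspicious.

```python
import math

def ear(eye):
    """Eye aspect ratio from six landmarks p1..p6 around one eye.
    Open eyes score high; a closed eye collapses the ratio toward zero."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2):
    """Count blinks: a dip below the threshold followed by a recovery."""
    blinks, closed = 0, False
    for value in ear_series:
        if value < threshold:
            closed = True
        elif closed:
            blinks += 1
            closed = False
    return blinks

# Made-up landmark coordinates for an open and a closed eye.
open_eye = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]
print(round(ear(open_eye), 2), round(ear(closed_eye), 2))  # 0.67 0.07

# EAR per video frame (a real pipeline would compute these from
# detected facial landmarks); one dip below 0.2 means one blink.
frames = [0.31, 0.30, 0.12, 0.09, 0.29, 0.32, 0.30]
print(count_blinks(frames))  # 1
```

Simple cues like this worked against early fakes, which is exactly why, as the paragraph above notes, the generators were retrained to blink.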
Deepfake Detection Challenge
To try to keep ahead of this evolving threat, Facebook, Microsoft and Amazon, together with the Partnership on AI (whose media partners include the BBC and The New York Times), came together to promote the Deepfake Detection Challenge. Contestants are asked to create tools to detect manipulated media. First prize is $500,000 – sorry, the submissions deadline was March 31, 2020.
Although creating a polished deepfake requires a high-end desktop with good graphics, plus some expertise to clean up defects, a kid today can create a fake using an app on their phone. It could be fake audio of their parents giving permission to skip school, approving a field trip, or relaying an excuse for missing homework. The possibilities are bounded only by the kid’s motivation, creativity and willingness to deceive.
In another case, someone may be trying to make contact with an individual – perhaps for political or financial gain. Using a fake identity allows them to maneuver in a way they otherwise could not. The fake will have tailor-made attributes, stories and experience, all designed to facilitate the long game.
Think about the many ways bad actors could use deepfakes to discredit, confuse and disrupt.
As it stands today, it’s easy for a single facet of a given – real – story to go viral, get swept up by the media and be turned into a narrative that various factions then exploit to support their agendas. The other facets of the story are buried. Ignored. They quickly lose relevance, because what bleeds, leads. This happens with all kinds of stories, from celebrities cheating on their spouses to politicians trying to game the system.
(…Speaking of awareness and combating deception, check out Chameleon’s newest online course ID Verification)
And finally, here’s a video that includes amazing examples of deepfakes: