Toks David, Lagos

Imagine you’re on Instagram casually checking your notifications, liking an image here and a video there, when you find yourself doing a double-take on a random post: someone who looks just like you in a short explicit video. You check again, and yes, the face is yours – but you’re sure that isn’t ‘you’. For one thing, you never made such a video; for another, it has to be a fake, you tell yourself, because that’s not your body.

But your nightmare gets worse: all of a sudden the video has been shared thousands of times on Instagram, and stills are popping up everywhere from WhatsApp to Twitter. You’re terrified, yet also bewildered. You don’t know what to do because it’s too convincing, and your loved ones and friends are beginning to ask questions. They’ve seen it all!

That scenario is not a thought experiment but a real digital phenomenon called ‘deepfakes’, and it is rapidly taking digital image-manipulation technology into hitherto unknown and terrifying territory.

Forbes magazine calls deepfakes “videos of one person with the face of another mapped over the top.” The results, as described above, can be “excellent and very convincing.”

So, how does this brave new technology work? Well, according to Forbes, “to make a deepfake you need a computer with a Nvidia GPU and plenty of power… Offloading the maths to a GPU means that the processing time is some order of magnitude smaller, though it’s still a long process.”
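For readers curious about the mechanics, the core idea behind most deepfake tools is a pair of autoencoders: one shared encoder learns features common to both faces (expression, pose, lighting), while each person gets their own decoder. Swapping faces means encoding a frame of person A and decoding it with person B’s decoder. The toy sketch below illustrates only that architecture – the names, sizes and random weights are illustrative, not taken from any real deepfake software, and no actual training is shown:

```python
import numpy as np

# Toy sketch of the "shared encoder, two decoders" idea behind deepfakes.
# All names and dimensions here are illustrative assumptions.

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64    # a flattened 64x64 grayscale "face" image
LATENT_DIM = 128      # compressed representation of expression and pose

# One shared encoder learns features common to both people;
# each person gets a private decoder that reconstructs *their* face.
W_encode = rng.normal(scale=0.01, size=(LATENT_DIM, FACE_DIM))
W_decode_a = rng.normal(scale=0.01, size=(FACE_DIM, LATENT_DIM))  # person A
W_decode_b = rng.normal(scale=0.01, size=(FACE_DIM, LATENT_DIM))  # person B

def encode(face):
    # Compress a face down to its pose/expression features.
    return np.tanh(W_encode @ face)

def decode(latent, W_decode):
    # Reconstruct a face from the shared latent features.
    return W_decode @ latent

# The swap: encode a frame of person A, then decode it with B's decoder,
# producing an image of B's face wearing A's expression and pose.
frame_of_a = rng.random(FACE_DIM)
latent = encode(frame_of_a)
fake_frame_of_b = decode(latent, W_decode_b)

print(fake_frame_of_b.shape)  # (4096,)
```

In a real system the two autoencoders are trained on thousands of frames of each face – which is why, as Forbes notes, the heavy matrix arithmetic is offloaded to a GPU.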

And although celebrity fakes – in which the face of a celebrity is doctored onto a different body, setting or scenario – are as old as the tabloid press, deepfake technology takes that art (and science) to a whole new realm of realism.

Understandably, Hollywood and other international celebrities are the prime targets of ‘deepfakery’, so to speak. Just Google ‘Emma Watson sex’ – the actress is a frequent subject of doctored celebrity images – to get a sense of the state of the art (warning: NSFW).


But deepfakes have implications beyond false nudes and sex. Think of the avenues of blackmail they open up, how they can be deployed in fake news and all sorts of propaganda or hoax campaigns by both private and state operatives, or even how the very concept can be used to bury genuine footage by dismissing whatever has been uncovered as just a ‘deepfake’.

The BBC reports that “technology news site Motherboard predicted it would take a year or so before the technique became automated. It ended up taking just a month.”

In both high-profile and low-profile situations, public and private, deepfakes are not just disruptive in terms of potential and real abuse; over time they create a situation where it becomes virtually impossible to trust what’s in front of your eyes and sounding in your ears – whether video, image, audio or a combination of all three.

“Advancements in audio technology, from companies such as Adobe, could combine fakery for both eyes and ears – tricking even the most astute news watcher,” the BBC says.

And even though some image-hosting sites like Gfycat are taking proactive measures to pull down obvious deepfakes, that may still not be enough to stop the ever-improving technology, especially as it becomes cheap enough for anybody to create deepfakes themselves with free apps.

It may eventually come to a point where existing laws have to be updated every few months to keep pace with techniques like deepfakes, which at some point could become so sophisticated as to be absolutely indistinguishable from the real thing.

So, what at first looks and sounds fun, amusing and awe-inspiring tech-wise may yet prove to be one more Pandora’s box that the unscrupulous exploit and the gullible get taken in by. Terrified yet? You should be.