The Future of OSINT — Deepfakes and AI

Intel_Inquirer
3 min read · Jun 7, 2021


It is only a matter of time before artificial intelligence will be fully incorporated into the OSINT space.

What are Deepfakes?

Deepfakes are the product of artificial intelligence used to change, edit, and manipulate videos and photos. Their clarity and highly deceptive nature can be seen in the well-known YouTube video in which lip-syncing technology is used to create a deepfake of Barack Obama.

Without going into the legal, ethical, and societal concerns deepfakes raise, I want to discuss the impact they may have on the OSINT realm in the future.

For starters, deepfakes have become simpler to make and more believable over time. Paris and Donovan note that “consumer-grade animation software like Adobe After Effects enables anyone with average computing power and a little time to create similar audio-visual fakes”. Various tools are now available for free on the internet, including FakeApp, Deepface and Faceswap. At one stage, face swapping was even available as a filter on Instagram Stories, revealing the growing trend and its appeal to the younger generation.

Implications for OSINT

The art of intelligence analysis involves applying a critical lens: questioning information rather than taking everything at face value. With deepfakes added to the mix, investigations will require another dimension of analysis.

In the future, we will need to:

· Question videos and photos for their clarity and validity

· Assess whether certain movements/tasks/actions are feasible

· Look out for any inconsistencies

· Search further into the context and the original source

· Cross-reference with other events or images from other sources
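The cross-referencing step above can be partly automated with perceptual hashing, a standard technique for telling whether a suspect image is a re-encoded copy of a known original. Below is a minimal, illustrative sketch in pure Python — `dhash` and `hamming` are hypothetical helper names, and the “images” are plain 2D lists of greyscale values; a real investigation would use a library such as ImageHash on actual image files.

```python
# Illustrative "difference hash" (dHash) for cross-referencing images.
# dhash() and hamming() are hypothetical helpers written for this sketch;
# the images are plain 2D lists of greyscale values to keep it self-contained.

def dhash(pixels, hash_size=8):
    """Compute a dHash: one bit per horizontal neighbour comparison."""
    rows, cols = len(pixels), len(pixels[0])
    # Downsample the grid to hash_size x (hash_size + 1) by nearest-neighbour
    # sampling, so the hash ignores resolution and compression differences.
    small = [
        [pixels[r * rows // hash_size][c * cols // (hash_size + 1)]
         for c in range(hash_size + 1)]
        for r in range(hash_size)
    ]
    # Each bit records whether a pixel is brighter than its right neighbour.
    bits = []
    for row in small:
        for c in range(hash_size):
            bits.append(1 if row[c] > row[c + 1] else 0)
    return bits

def hamming(h1, h2):
    """Number of differing bits: a small distance suggests the same image."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# Toy demo: a gradient "original", a lightly re-encoded copy, and an
# unrelated image.
original = [[(r + c) * 4 for c in range(32)] for r in range(32)]
near_copy = [[(r + c) * 4 + 1 for c in range(32)] for r in range(32)]
unrelated = [[(r * c) % 251 for c in range(32)] for r in range(32)]

d_copy = hamming(dhash(original), dhash(near_copy))
d_other = hamming(dhash(original), dhash(unrelated))
print(d_copy, d_other)  # the near-copy distance should be much smaller
```

Because the hash encodes only relative brightness, small edits such as re-compression or a uniform brightness shift leave the distance near zero, while an unrelated image lands far away.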

Deepfakes are becoming more difficult to detect, so one must never discount the possibility that a photo or video is a deepfake, made for entertainment, political or personal reasons.

That said, I believe deepfakes will require those in the intelligence field to use tools to determine the validity of photos and videos.

Solutions and Tools Available to Determine Deepfakes

One approach that can assist in determining whether a video or photo is a deepfake is practice. By watching and reviewing videos and photos made with artificial intelligence, you will become more familiar with the clues and mannerisms to look out for. Pinpointing errors, implausible movements or suspicious blinking habits is a difficult task, so attention to detail is key.

Global News Wire has also reported that companies such as Cheq, Metafact, Cyabra, Falso Tech and Sensity provide solutions for detecting deepfakes.

In September 2020, Microsoft released Microsoft Video Authenticator, a tool intended to help counter the widespread creation of deepfakes. Microsoft’s Tom Burt stated:

“Video Authenticator can analyse a still photo or video to provide a percentage chance, or confidence score, that the media is artificially manipulated. It works by detecting the blending boundary of the deepfake and subtle fading or greyscale elements that might not be detectable by the human eye.”
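The intuition behind scoring “blending boundaries” can be shown with a toy example. This is not Microsoft’s actual algorithm — `seam_score` is a hypothetical function invented for this sketch — it simply illustrates the underlying idea that a crude splice often leaves an unusually sharp greyscale jump along the paste seam, while natural images change brightness gradually.

```python
# Toy illustration of "blending boundary" scoring -- NOT Video Authenticator's
# real method, just the intuition: a pasted-in region tends to leave a sharp
# vertical seam of greyscale change that natural images lack.

def seam_score(pixels):
    """Return the largest average column-to-column greyscale jump.

    A high score suggests a sharp vertical seam, e.g. a crude face splice.
    """
    rows, cols = len(pixels), len(pixels[0])
    best = 0.0
    for c in range(cols - 1):
        # Average absolute brightness jump between column c and c+1.
        jump = sum(abs(pixels[r][c + 1] - pixels[r][c]) for r in range(rows)) / rows
        best = max(best, jump)
    return best

# A smooth natural gradient vs. the same image with a bright block pasted in.
smooth = [[c * 2 for c in range(64)] for _ in range(64)]
spliced = [row[:] for row in smooth]
for r in range(16, 48):          # paste a bright square over the middle
    for c in range(24, 40):
        spliced[r][c] = 255

print(seam_score(smooth), seam_score(spliced))
```

Real detectors are far more sophisticated — they learn subtle fading and greyscale cues rather than raw edges — but the principle of converting boundary artefacts into a confidence score is the same.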

OSINTers, there is hope. Several tools are available, and more will continue to appear for the foreseeable future. As artificial intelligence advances, so does the fight against the societal impacts that come with it.

The Benefits of Deepfakes for OSINT Analysis

Deepfakes have also given OSINT practitioners a way to create legitimate-looking profile images for sock puppets (fake accounts). The website “this person does not exist” serves AI-generated images of people who do not exist. Problems arise, however, when more than one image of the same individual is required.

Put your Deepfake detection skills to the test

Test whether you can tell which image is real and which is an AI-generated deepfake by using this link.

Thank you for taking the time to read this article. If you have any questions, please leave a comment!


Intel_Inquirer

OSINT enthusiast, Senior Intelligence Analyst from Sydney. Views are my own.