Explainer: What is a deepfake and how to spot it?

A deepfake is an application of AI technology that manipulates videos, images, and audio of real people. It has been used for hate speech and misinformation, as well as to impersonate people’s likenesses.

Writer: Ghai Aketch

Have you ever wondered how digital content that looks authentic can turn out to be fake? With the rapid evolution of Artificial Intelligence (AI) technology, the sophisticated spread of disinformation and hate speech has become entirely possible.

A deepfake is a product of this advanced technology: an application of sophisticated AI that manipulates videos, images or audio of real people. Its name is derived from deep learning, a type of AI system.

Deepfake software alters images of the targeted person to generate a convincing synthetic likeness. The resulting footage makes real people appear to say words they never spoke or act however the creator wants.

Deepfake videos initially targeted celebrities, making them ‘feature’ in explicit clips that were used to drive clicks to websites.

The creators download pornographic videos and swap in the faces of prominent people, producing results that, to a great extent, look original.

Researchers warn that the ability to manipulate images of real people is especially worrying when political or religious leaders are made to appear to ‘utter’ hate speech towards other groups. Deepfakes have become a major vehicle for spreading hate speech and misinformation.

Although no deepfake clips involving prominent South Sudanese figures have been reported, people should stay alert to the dangers this technology could cause in a politically fragile state should it happen.

When used in a political setting, these emerging image-altering techniques could pose serious cybersecurity and societal threats, according to Security Week.

One of the most recent examples is deepfake footage of the sitting US president, Joe Biden.

The manipulated footage in circulation appeared to show him ridiculing transgender women, allegedly saying, “you will never be a real woman.”

Interestingly, he had publicly supported transgender Americans before the footage emerged. The clip was hard to identify as fake, but keen attention to past events can raise your suspicion.

Between 2019 and 2021, the United Nations Office for Disarmament Affairs (UNODA) held a series of multi-stakeholder sessions highlighting that misuse of ICT may ‘harm international peace and security.’

Fast-advancing technology has also made the underlying machine learning process easier. That is to say, anyone with a computer and internet access can now generate deepfake content with ease.

User-friendly interfaces have also made AI-altered content more widely available on the internet today.

To counter this, developers have reverse engineered the deepfake process. Several deepfake-debunking tools, such as DeepTrace, Microsoft Video Authenticator and DuckDuckGoose, now fight the spread of disinformation through digital content verification.
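
To give a flavour of what such verification tools automate, below is a minimal sketch of the general idea. It is a toy example of my own, not the method used by DeepTrace, Microsoft Video Authenticator or DuckDuckGoose, and the video file name is hypothetical. It uses the free OpenCV library to step through a clip and flag frames where the detected face suddenly jumps or disappears, one of the glitches a crude face swap can leave behind.

    # Toy temporal-consistency check on a video (illustration only, not a real deepfake detector).
    import cv2

    video = cv2.VideoCapture("suspect_clip.mp4")   # hypothetical file name
    face_finder = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    previous_centre = None
    frame_number = 0

    while True:
        ok, frame = video.read()
        if not ok:
            break
        frame_number += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_finder.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            # A face that vanishes for a frame or two can be a sign of clumsy editing.
            print(f"Frame {frame_number}: no face detected")
            previous_centre = None
            continue
        x, y, w, h = faces[0]
        centre = (x + w // 2, y + h // 2)
        if previous_centre is not None:
            jump = abs(centre[0] - previous_centre[0]) + abs(centre[1] - previous_centre[1])
            if jump > 80:   # arbitrary threshold chosen for this illustration
                print(f"Frame {frame_number}: face position jumped by {jump} pixels")
        previous_centre = centre

    video.release()

Professional detectors look at far subtler signals than this, but the principle is the same: examine the footage frame by frame for inconsistencies a human eye might miss.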

As useful as deepfakes are in the entertainment industry, they are increasingly drifting into real crime and insecurity for states.

How to detect a deepfake

The emergence of new technology has made almost anything possible, but people are often baffled when trying to sift genuine information from deepfakes.

Fortunately, even without deepfake-detecting tools, you can try to spot the following signs in computer-doctored footage.

How to spot a deepfake

Additionally, approaching digitally generated content critically helps you detect flaws in deepfakes, for example by weighing how coherent a speech is with the person’s previous statements and events.

Take the recent Joe Biden deepfake that allegedly ‘ridiculed’ transgender women. He had previously signed the same-sex marriage bill into law, saying that Americans have the right to choose whom they want to marry.

Moreover, transgender protections are upheld in some US states and in Washington, D.C., where the federal government sits. Such sudden inconsistencies give you a clear hint that a particular piece of footage is fake.

A time has come when you should not always believe what you see, according to an analytical report on the emergence of deepfakes.

In South Sudan, impersonation of people’s likenesses has mostly come through easy-to-use celebrity photo-swapping apps. However, such images are simple to spot as inauthentic by looking at the skin tone and image orientation.
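
For readers who want to experiment, the skin-tone mismatch mentioned above can even be checked crudely in code. The sketch below is a toy illustration under my own assumptions, not a professional detector, and the photo file name is hypothetical. It uses the free OpenCV library to find a face in a photo and compare the average colour of the face with the area just below it; a large difference can hint that a face was pasted in.

    # Toy skin-tone consistency check (illustrative only; real detectors are far more sophisticated).
    import cv2
    import numpy as np

    image = cv2.imread("suspect_photo.jpg")   # hypothetical file name
    if image is None:
        raise SystemExit("Could not read the photo")

    face_finder = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = face_finder.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        face_region = image[y:y + h, x:x + w]
        # Area just below the face (roughly the neck), clipped to the image bounds.
        below = image[y + h:min(y + 2 * h, image.shape[0]), x:x + w]
        if below.size == 0:
            continue
        face_colour = face_region.reshape(-1, 3).mean(axis=0)
        below_colour = below.reshape(-1, 3).mean(axis=0)
        difference = np.abs(face_colour - below_colour).mean()
        print(f"Face at ({x}, {y}): average colour difference = {difference:.1f}")
        if difference > 40:   # arbitrary threshold chosen for this illustration
            print("  Large mismatch - the face may have been swapped in.")

In practice, simply zooming in on the edges of the face and comparing skin tone by eye catches most of these crude photo swaps.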

In conclusion, available publications suggest deepfakes will keep increasing in number and complexity. Stay curious, because deepfakes spread disinformation through altered images of powerful and famous personalities.
