INDIA: Deepfakes, the AI-driven doctoring of videos and images, have begun to spread their roots across all corners of the world.
In everyday terminology, image doctoring is often called "photoshopping". Cameras and photo-editing technology have become sophisticated, and you may come across an edited image of a superstar or a politician.
A more advanced and influential innovation that is now becoming a global threat is deepfakes.
What are Deepfakes?
Deepfakes (deep learning + fake) are fabricated and doctored videos, images, and sounds created with advanced AI (Artificial Intelligence).
Unless you analyze the video in-depth, you cannot differentiate between a real and a doctored one.
Deepfakes are rapidly becoming popular because of their notorious nature and malign uses.
In September 2019 alone, around 11,000 deepfakes were found online, and 96% of them turned out to be pornographic videos, according to the AI firm Deeptrace.
How are Deepfakes created?
If you have an internet connection and a device to access it on, you can create a deepfake video.
Various techniques can be used to create a deepfake video. At the basic level, you can do some coding yourself, use software that creates the face-swapping video, and then add speech to it with another tool.
Alternatively, you can download software that does all the work for you, but it will deliver low-quality, less convincing deepfakes.
It is estimated that, at the pace of development in the AI sector, it will soon be next to impossible to distinguish a real video from a deepfake.
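The face-swapping approach mentioned above typically relies on a shared encoder with one decoder per person: encode person A's face, then decode it with person B's decoder. The toy sketch below illustrates only that structure; the linear maps, dimensions, and function names are illustrative stand-ins, not any real deepfake tool's API (which would use trained deep convolutional networks):

```python
import numpy as np

# Hypothetical toy dimensions, chosen only for illustration.
FACE_DIM, LATENT_DIM = 64, 8

rng = np.random.default_rng(0)

# One shared encoder compresses any face into a compact latent code.
shared_encoder = rng.normal(size=(LATENT_DIM, FACE_DIM))

# One decoder per person, each reconstructing faces in that person's likeness.
# (In real tools these are deep networks trained on many images of each face.)
decoder_a = rng.normal(size=(FACE_DIM, LATENT_DIM))
decoder_b = rng.normal(size=(FACE_DIM, LATENT_DIM))

def encode(face: np.ndarray) -> np.ndarray:
    """Map a face vector into the shared latent space."""
    return shared_encoder @ face

def face_swap(face_of_a: np.ndarray) -> np.ndarray:
    """The core face-swap trick: encode A's face with the shared
    encoder, then decode it with B's decoder, so the expression
    carries over but the likeness becomes B's."""
    return decoder_b @ encode(face_of_a)

frame = rng.normal(size=FACE_DIM)  # stand-in for one video frame of person A
swapped = face_swap(frame)
print(swapped.shape)
```

Running this over every frame of a video, rather than one vector, is what turns the trick into a full deepfake; the hard part in practice is training the encoder and decoders well enough that the output looks real.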
Are they dangerous?
Yes, deepfakes pose a critical threat to politics and democracy. Even in a normally functioning democracy, a few deepfakes can create havoc among the masses and political parties.
Deepfake technology has the power to create convincing videos of political leaders with the sole intention of disrupting their work and damaging their reputation.
In May 2018, Belgium’s Socialistische Partij Anders (SPA) posted a video on Facebook showing Trump calling on Belgium to withdraw from the Paris climate agreement.
The strange thing that made experts dig into the video was Trump’s hair, which looked odder than usual.
Besides his hair, the movement of his mouth raised concerns over the video’s authenticity, and the voiceover raised the most red flags.
Deepfakes also endanger the ethics of practicing politics, as political leaders can take undue advantage of them during elections.
In February 2020, the Indian politician Manoj Tiwari of the Bharatiya Janata Party (BJP) released two deepfakes to lure votes, with versions of a video in Haryanvi and English.
It was the first time in the history of Indian democracy that deepfakes were used to gain votes.
Until then, doctored and photoshopped images had been used in election campaigns, mostly on hoardings and banners. The use of deepfakes by the party in power marks the beginning of a new era in politics.
In the world’s largest democracy, deepfakes pose a threat that may seem minor but is probably a bigger danger in the making. With the telecom industry offering internet at very cheap rates, the knowledge and resources needed to create deepfakes are much easier to access.
What are the legal provisions?
In India, Section 66D of the Information Technology Act can be applied to punish the culprits. There is no law that directly addresses deepfakes, but Section 66D, which covers cheating by impersonation using computer resources, can be extended to them.
We live in the information era and content passes from one person to another in no time.
In this context, it seems that deepfakes may take deep root in the system, and the only way to handle this malice is to counter it with information.
INFORMATION VS INFORMATION
Tackle deepfake videos with real and factual ones. Expose the doctored videos for what they are and suppress the very motive behind releasing them.
In the future, as technology brings in more innovations, handling and managing deepfakes will become simpler, though never easy.