The Danger And Nuisance Of Deepfake Videos

By Sunil Garodia
First published on 2023-11-06 15:24:45

About the Author

Sunil Garodia is the Editor-in-Chief of indiacommentary.com, a current affairs analyst and political commentator.

A huge controversy - and scare - has emerged after a deepfake video of actor Rashmika Mandanna entering an elevator in a skimpy dress went viral. Subsequent investigation by fact-checking organizations like AltNews showed that it had been created by altering a video of Zara Patel, a British-Indian woman with a huge following on social media.

In the deepfake video, Rashmika is shown entering an elevator in a skimpy dress. On closer examination, however, it is clear that at the point the person in the video reaches the elevator door, it is not Rashmika. As soon as she enters, the face changes to that of the Indian actor. The swap has been done so expertly that it is very difficult to detect; an ordinary viewer would readily believe it is a genuine video of Rashmika Mandanna.

What are deepfakes?

Deepfakes are a type of synthetic media that involves the use of artificial intelligence (AI) and deep learning techniques to create highly realistic and often deceptive videos, audio recordings, or images. These manipulations are typically used to replace one person's likeness or voice with that of another. The term "deepfake" is a portmanteau of "deep learning" and "fake."

The process of creating a deepfake typically involves training a machine learning model, such as a generative adversarial network (GAN) or an autoencoder, using large datasets of images or audio recordings of the target individuals. Once the model is sufficiently trained, it can generate content that convincingly imitates the appearance, voice, or mannerisms of the target person.
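To make that process concrete, below is a minimal, illustrative Python sketch of the classic face-swap setup: one shared encoder and two identity-specific decoders, each trained as an autoencoder on one person's face crops, then crossed at inference so one person's pose and expression are re-rendered with the other's appearance. This is only a sketch under stated assumptions (PyTorch, toy 64x64 images, random stand-in tensors instead of a real dataset), not the method used in any particular deepfake; real pipelines add face detection and alignment, much larger networks, adversarial (GAN) losses and post-processing.

# Illustrative face-swap autoencoder sketch (PyTorch assumed)
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 16x16 -> 32x32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32x32 -> 64x64
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 16, 16)
        return self.net(h)

encoder = Encoder()
decoder_a = Decoder()  # trained to reconstruct person A's faces
decoder_b = Decoder()  # trained to reconstruct person B's faces

# Training (sketch): minimise reconstruction loss separately for each identity,
# while both identities share the same encoder.
loss_fn = nn.MSELoss()
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for aligned face crops of person A
faces_b = torch.rand(8, 3, 64, 64)  # stand-in for aligned face crops of person B
loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
       loss_fn(decoder_b(encoder(faces_b)), faces_b)

# Inference (the "swap"): encode a frame of person A, decode with person B's
# decoder, so the output keeps A's pose and expression but B's appearance.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))

The swap effect comes from the crossed decoding step: because both identities share one encoder, the latent code mainly captures pose and expression, while each decoder re-renders that code with the appearance it was trained on.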

Deepfakes have gained attention and concern because they can be used to create highly convincing forgeries of individuals saying or doing things they never did. While deepfake technology has legitimate uses, such as in the entertainment industry for digital doubles or voice cloning, it also has the potential for misuse, including the creation of misleading or harmful content, such as the video targeting Mandanna.

(The three explanatory paragraphs above are AI-generated content)

But the sad and scary part is that, as of now, there is no remedy against the makers of such deepfakes. IT laws are still evolving, and as technology advances they will have to be amended to keep pace. The law does now require social media platforms to take down such content, either on their own if they spot it or when someone complains, and to act against those who uploaded or circulated it. But that is little consolation to the person subjected to identity theft, possible abuse and loss of reputation.

There has been a clamour for legal action after Mandanna's video went viral. Actor Amitabh Bachchan has also found the issue fit for a legal case. IT minister Rajeev Chandrasekhar called it "dangerous and misinformation" and reminded the social media platforms that they must remove reported content within 36 hours. But he did not say what steps the government will take to amend the law so that the affected person can move against those who make and circulate such deepfakes.