Legal News and Insight around the Globe!

PM Modi Expresses Concern Over Deepfake Scams, Provides 10 Tips to Spot Fake Videos and Audio

LI Network

Published on: November 17, 2023 at 16:31 IST

Prime Minister Narendra Modi has raised alarms about the increasing threat of deepfakes, emphasizing the need for understanding artificial intelligence (AI) and its potential misuse to spread misinformation.

In a recent statement, he urged the media to raise awareness about the challenges posed by deepfakes. During an event in New Delhi, PM Modi shared his concern, citing instances where he came across manipulated videos, including one depicting him singing a Garba song.

He highlighted the looming threat of deepfakes, expressing worries about the potential problems they could cause for individuals and society.

The issue gained attention recently when a viral video featuring actress Rashmika Mandanna turned out to be a deepfake, in which her face had been superimposed onto a video of a British Indian woman, Zara Patel.

Such instances have sparked concerns, especially with the proliferation of deepfake audio and video clips, including those involving political leaders, on platforms like Instagram.

To help individuals identify deepfake videos and audio clips, the following measures are recommended:

  1. Unnatural Eye Movements: Deepfake videos often display unnatural eye movements or gaze patterns, contrasting with the smooth and coordinated eye movements in genuine videos.
  2. Mismatches in Colour and Lighting: Pay attention to inconsistencies in colour tones and lighting conditions, as deepfake creators may struggle to replicate accurate visuals.
  3. Compare Audio Quality: Deepfake videos may use AI-generated audio with subtle imperfections. Comparing audio quality with visual content can reveal discrepancies.
  4. Strange Body Shape or Movement: Watch for unnatural body shapes or movements, such as limbs appearing too long or short, indicative of deepfake manipulation.
  5. Artificial Facial Movements: Examine facial expressions for accuracy, as deepfake software may not always replicate genuine facial movements convincingly.
  6. Unnatural Positioning of Facial Features: Distortions or misalignments in facial features can be a sign of deepfake manipulation.
  7. Awkward Posture or Physique: Deepfakes may struggle to maintain a natural posture or physique, leading to awkward body positions or movements.
  8. Verify Before Sharing: Verify the source of audio or video clips before sharing. If the source is unreliable, refrain from sharing the content.
  9. Stay Informed: Keep up with political developments and the actual statements made by key leaders, so that widely shared clips featuring controversial remarks can be checked against the record before being believed.
  10. Use AI Detection Tools: Utilize AI voice detectors available online, such as aivoicedetector.com and play.ht, to detect AI-generated voices.

PM Modi’s concern underscores the importance of being vigilant in the face of deepfake threats, with these tips serving as practical measures to identify and combat the spread of manipulated content.

Also Read: The Rising Threat: AI Deepfake Scams and Their Hidden Dangers