5 tips to avoid the threat of deepfake videos


By: Pune Pulse

December 4, 2023

Pune: Authorities around the world are issuing a strong warning about the growing threat posed by deepfake videos, as criminals misuse these powerful tools to extract money from victims.

The World Economic Forum (WEF) has already raised the alarm, reporting a sharp year-on-year rise in the number of deepfake videos.

The proliferation of deepfakes is a growing concern across the cybersecurity industry, according to the Vice President of Product Management, Research and Engineering at a renowned cybersecurity firm. Notably, one study found that people could correctly identify deepfakes only 57% of the time.

Types of people affected by deepfakes

The cybersecurity firm’s Vice President emphasizes that awareness is a key factor in deepfake crime: how aware are the people whom these cybercriminals are trying to swindle? Those who know less about what deepfake technology can do are more vulnerable.

This awareness gap widens with age and income: 75% of US adults with annual incomes over $125,000 are aware of voice clones, compared with only 56.5% of those earning less than $50,000.

Exposure is another risk factor. Prominent figures such as politicians and celebrities are especially vulnerable to the damage caused by distorted public perceptions. As deepfake videos become more capable and more widely available, experts predict a sharp rise in related criminal activity, making vigilance and up-to-date protective measures critical as the digital landscape changes.

Ways to avoid the danger of deepfake videos

· Be sceptical of any audio or video that seems too good to be true. Content that looks unbelievable or deliberately shocking may well be a deepfake.

· Proceed with caution when viewing audio or video from unidentified or dubious sources, and check where the content originated before trusting it.

· Look for inconsistencies in the audio or video. For example, if the person’s lips are not moving in time with their voice, it could be a deepfake.

· Check facts before sharing anything. Don’t accept anything you come across online at face value.

· Be cautious about clicking links in emails or texts, even if the message appears to come from someone you know; those who spread deepfakes often use such links to distribute their fake audio and video content. A simple way to sanity-check a link before clicking is sketched after this list.
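As a rough illustration of that last tip, the sketch below checks a link’s domain against a small allow-list of trusted sites and flags near-miss lookalikes, a common trick in phishing and deepfake distribution. The domain names and the 0.8 similarity threshold are purely illustrative assumptions, not recommendations from the article.

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Hypothetical allow-list for illustration only; readers would use the sites they actually trust.
TRUSTED_DOMAINS = {"punepulse.com", "weforum.org"}

def link_verdict(url: str, similarity_threshold: float = 0.8) -> str:
    """Return a rough verdict on a link: trusted, lookalike, or unknown."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    if domain in TRUSTED_DOMAINS:
        return "trusted domain"
    # A domain that is *almost* identical to a trusted one is a classic spoofing trick.
    for trusted in TRUSTED_DOMAINS:
        if SequenceMatcher(None, domain, trusted).ratio() >= similarity_threshold:
            return f"possible lookalike of {trusted} - do not click"
    return "unknown domain - verify independently before clicking"

if __name__ == "__main__":
    for link in [
        "https://www.punepulse.com/article",   # matches the allow-list
        "https://punepu1se.com/login",         # near-miss lookalike, flagged
        "https://example.org/video",           # unfamiliar, treat with caution
    ]:
        print(link, "->", link_verdict(link))
```

This is only a first-pass filter: an unfamiliar domain is not automatically malicious, and a trusted domain can still host misleading content, so the other tips above still apply.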

Shreyas Vange