Rashmika Mandanna’s ‘deepfake’ video goes viral; Amitabh Bachchan calls for legal action

by admin

National crush Rashmika Mandanna is the latest celebrity to fall victim to cybercrime. A ‘deepfake’ video of the ‘Animal’ actress is going viral on social media, sparking outrage among her fans.

The viral video shows a woman, her face morphed into Rashmika’s, dressed in a sultry outfit. Irked social media users demanded stringent action against the makers of the video. Even Amitabh Bachchan, who worked with Rashmika in ‘Goodbye’, expressed his anguish over the viral video.

An infuriated Big B called for legal action against the makers of the viral video. Taking to X, formerly known as Twitter, Amitabh Bachchan wrote, “Yes, this is a strong case for legal”.

🚨 There is an urgent need for a legal and regulatory framework to deal with deepfake in India.

You might have seen this viral video of actress Rashmika Mandanna on Instagram. But wait, this is a deepfake video of Zara Patel.

This thread contains the actual video. (1/3) pic.twitter.com/SidP1Xa4sT

— Abhishek (@AbhishekSay) November 5, 2023

The original video is of Zara Patel, a British-Indian woman with 415K followers on Instagram, who had uploaded it to her own account. Cyber criminals later morphed her face with Rashmika’s and circulated the altered video.

As deepfakes go, the viral clip is convincing enough for ordinary social media users to fall for it. But if you watch the video carefully, you can see that as the woman enters the lift, her face briefly changes from Zara Patel’s to Rashmika’s.

Angry social media users and Rashmika fans stressed the urgent need for a legal and regulatory framework to deal with deepfake apps in India.

“Deepfakes are videos that swap one person’s face for another so that people can be shown doing and saying things they didn’t say or do. They’re now easy to make with free software,” shared a user.

Another user wrote, “Deepfake audio/video to exploit people is already happening. 🥺 What’s the way to prevent getting fooled at a personal level and at the system level? To begin with, shouldn’t the services providing such facilities use copyright for identification, at least in free versions, it would drastically cut down such instances.”
