Deepfake Videos: How To Identify Them And What Indian Law Says About Them; Watch Videos


Deepfake (L) and Original (R)

Deepfake Videos: Bollywood superstar Amitabh Bachchan on Sunday raised an alarm, and rightly so, for the episode has thrown wide open the till-now narrowly opened door to the world of Artificial Intelligence (AI) and deepfake videos, which have the potential to wreak havoc in an individual's life as well as at the global level. Big B pointed out a deepfake video featuring actor Rashmika Mandanna that went viral on social media. In the video, Rashmika Mandanna appeared to be wearing a black swimsuit and cycling shorts while entering an elevator.

When The Duplicate Is ‘Real’

Well, the video looks perfect in terms of presentation. The only catch is that the woman shown in it is NOT Rashmika but another woman, whose footage was doctored to make it look like the actor. A little scrutiny stunned netizens, and of course Rashmika herself: the abrupt eye movement typical of deepfake AI videos gave it away. The original video features a British-Indian woman, Zara Patel, and Rashmika's face was superimposed on Patel's.

We have seen similar videos featuring political personalities and other public figures for a couple of years now, but this latest episode has stirred a hornet's nest.

This Is The Fake Video

This Is The Original Video

Here, we explain as simply as possible how to identify deepfake AI videos.

Odd Or Unusual Eye Movements

One of the most prominent signs of a deepfake video is unusual, unnatural eye movement. The eyes may not be in sync with the rest of the face, may blink oddly or barely at all, and the way the person looks around does not fit the frame the way a real person's gaze would. There is simply no synchronisation between the person's eye movements, actions, and speech.
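For readers who want to go beyond eyeballing, the same heuristic can be roughed out in code. The sketch below is only illustrative: it assumes the dlib and OpenCV libraries, dlib's standard 68-point landmark model file (shape_predictor_68_face_landmarks.dat), a hypothetical clip named suspect_video.mp4, and a rule-of-thumb blink threshold. It counts blinks by tracking the eye aspect ratio frame by frame.

```python
import cv2
import dlib
from scipy.spatial import distance

# Assumed setup: dlib's 68-point landmark model downloaded locally,
# and a hypothetical clip named suspect_video.mp4.
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(pts):
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops sharply when the eye closes.
    a = distance.euclidean(pts[1], pts[5])
    b = distance.euclidean(pts[2], pts[4])
    c = distance.euclidean(pts[0], pts[3])
    return (a + b) / (2.0 * c)

cap = cv2.VideoCapture("suspect_video.mp4")
blinks, eye_was_open = 0, True
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        left = [(shape.part(i).x, shape.part(i).y) for i in range(36, 42)]
        right = [(shape.part(i).x, shape.part(i).y) for i in range(42, 48)]
        ear = (eye_aspect_ratio(left) + eye_aspect_ratio(right)) / 2.0
        eye_open = ear > 0.21  # rough threshold; tune it for the footage at hand
        if eye_was_open and not eye_open:
            blinks += 1
        eye_was_open = eye_open
cap.release()
print("blinks detected:", blinks)
```

People normally blink every few seconds, so a long clip with almost no detected blinks is worth a closer look, though a noisy detector can also miss genuine blinks.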

Artificial Facial Movements, Lip Sync

This giveaway is closely related to the eye movements. The person in the video will appear inconsistent with the setting and with their own speech, and the lip movements will rouse suspicion because they fall out of sync with what is being said.

It is much like actors lip-syncing to a song recorded by someone else: they merely move their lips to make it look as though they are singing it.
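A crude way to quantify that mismatch is sketched below, under several assumptions: the dlib, OpenCV, and librosa libraries, the same 68-point landmark file as above, hypothetical files suspect_video.mp4 and suspect_audio.wav holding the clip and its extracted soundtrack, and the rough premise that louder speech usually means a more open mouth. It compares how wide the mouth opens in each frame with the loudness of the audio at that moment.

```python
import cv2
import dlib
import librosa
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def mouth_opening(shape):
    # Vertical gap between inner-lip landmarks 62 (top) and 66 (bottom),
    # normalised by the inner-mouth width (landmarks 60 and 64).
    top, bottom = shape.part(62), shape.part(66)
    left, right = shape.part(60), shape.part(64)
    width = np.hypot(right.x - left.x, right.y - left.y)
    return np.hypot(bottom.x - top.x, bottom.y - top.y) / max(width, 1e-6)

cap = cv2.VideoCapture("suspect_video.mp4")       # hypothetical file name
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
openings = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    openings.append(mouth_opening(predictor(gray, faces[0])) if faces else 0.0)
cap.release()

# Loudness envelope of the soundtrack, roughly one value per video frame.
y, sr = librosa.load("suspect_audio.wav", sr=None)  # hypothetical file name
rms = librosa.feature.rms(y=y, hop_length=int(sr / fps))[0]

n = min(len(openings), len(rms))
corr = np.corrcoef(openings[:n], rms[:n])[0, 1]
print(f"lip/audio correlation: {corr:.2f}")
```

Genuine footage of a person speaking tends to show at least some positive correlation; a value near zero is suspicious. It is only a coarse check, though: background music, an off-screen speaker, or a poor landmark fit can drag the correlation down even for real footage.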

Visible Differences In Lighting, Background, And Colour Outline

As the heading suggests, a deepfake video will often announce itself through obvious mismatches and discrepancies in the foreground, background, lighting, contrast, and colours.

This is because the duplicate cannot match the original in its entirety; deepfake creators struggle to blend the two seamlessly. Also look out for irregularities between the lighting on the subject's face and the lighting of the surroundings.
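One simple, automatable version of this check is sketched below. It assumes only OpenCV and NumPy, the Haar face detector that ships with OpenCV, and a hypothetical frame grab named suspect_frame.png, and it compares the average lightness of the detected face with that of a ring of pixels around it.

```python
import cv2
import numpy as np

# Haar cascade bundled with OpenCV; the image path is a hypothetical example.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
img = cv2.imread("suspect_frame.png")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
for (x, y, w, h) in faces:
    face_l = lab[y:y + h, x:x + w, 0].mean()          # mean lightness of the face box
    # A ring of pixels just outside the face box, as a crude stand-in for the surroundings.
    pad = int(0.3 * w)
    y0, y1 = max(0, y - pad), min(lab.shape[0], y + h + pad)
    x0, x1 = max(0, x - pad), min(lab.shape[1], x + w + pad)
    ring = lab[y0:y1, x0:x1, 0].astype(float)
    ring[y - y0:y - y0 + h, x - x0:x - x0 + w] = np.nan   # mask out the face itself
    around_l = np.nanmean(ring)
    print(f"face lightness {face_l:.1f} vs surroundings {around_l:.1f}")
```

A large, unexplained gap between the two numbers can hint that the face was lit differently from the scene it was pasted into. Real footage can show such gaps too (spotlights, backlighting), so treat the figure as a prompt for closer inspection, not a verdict.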

Voice And Audio

Deepfake creators also rely on AI-generated voice and other audio content. When such a video features a celebrity or a well-known personality, it is fairly easy to tell that the voice and the ambient sound do not belong to that person or that setting. For content featuring an unknown person, compare the audio carefully with what you see on screen.

Body Movements And Shapes Appear Strange

Moving a little lower, notice the gestures, the gait, and the proportions between the face, the hair, and everything below the chin. If the arms, legs, or hair do not look right, or if the limbs appear strange, uncoordinated, or out of scale with the body or the head, treat the video as a likely deepfake.

Law Against Deepfakes in India

For deepfake crimes that involve capturing, publishing, or transmitting a person's images in mass media and thereby violating their privacy, Section 66E of the IT Act, 2000 applies. The offence is punishable with imprisonment of up to three years or a fine of up to Rs 2 lakh, reports bqprime.com.

Minister of State for Electronics and Information Technology of India Rajeev Chandrasekhar said, “Under the IT rules notified in April, 2023 – it is a legal obligation for platforms to: ensure no misinformation is posted by any user AND, ensure that when reported by any user or govt, misinformation is removed in 36 hrs. If platforms do not comply with this, rule 7 will apply and platforms can be taken to court by aggrieved person under provisions of IPC.”


