
AI/Deepfake

Deepfake Video Of Nirmala Sitharaman Announcing Dubious Investment Project Goes Viral

Written By Kushel Madhusoodan, Edited By Pankaj Menon
Apr 3, 2025

Claim


Union finance minister Nirmala Sitharaman announces a new investment project, according to a “TimesNow” report.

Fact


Viral video found to be a deepfake, platform found to be untrustworthy.

A purported 3-minute-10-second “TimesNow” news report, presented by veteran journalist Rajdeep Sardesai and claiming to show Union finance minister Nirmala Sitharaman endorsing an investment platform, has gone viral across social media. According to the video, every citizen who invests ₹21,000 in the project would receive ₹15,00,000 within the first month.

The archived version of the post can be seen here.

Fact Check

Newschecker first ran a keyword search for “Nirmala Sitharaman investment project TimesNow Rajdeep Sardesai”, which did not yield any credible news reports about such an initiative. We also noticed that the lip movements of both Sardesai and Sitharaman appeared unnatural and were not fully in sync with the speech, raising doubts that the video had been AI-manipulated.

We ran the excerpt featuring Sardesai past Hive Moderation, an AI-content detection tool, which found it 98.8% likely to contain AI-generated or deepfake content. We then ran the audio past Resemble AI’s Deepfake Detector, which stated that the voice was “Fake.”

Newschecker next ran a reverse image search of keyframes featuring Sitharaman, which led us to this Business Today video on YouTube, headlined “FM Nirmala Sitharaman At The IT-BT Post Budget Round Table.” The video was streamed live on February 4, 2025, and shows Sitharaman discussing key takeaways from the 2025 Union Budget. At no point in the 47:42 panel discussion does Sitharaman mention any investment or trading platform, indicating that the viral video was digitally altered.
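For readers who want to reproduce this kind of check, the sketch below shows one way to pull still keyframes out of a clip so they can be uploaded to a reverse image search engine. It uses the OpenCV library; the filename and the five-second sampling interval are illustrative assumptions, not the exact workflow Newschecker followed.

```python
# Minimal sketch: extract one frame every few seconds from a video so the
# stills can be used for a reverse image search.
# Assumes OpenCV is installed (pip install opencv-python).
import cv2

VIDEO_PATH = "viral_clip.mp4"   # hypothetical local copy of the viral video
INTERVAL_SECONDS = 5            # sample one frame every 5 seconds

cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS) or 25  # fall back to 25 fps if metadata is missing
step = int(fps * INTERVAL_SECONDS)

frame_index = 0
saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_index % step == 0:
        cv2.imwrite(f"keyframe_{saved:03d}.jpg", frame)
        saved += 1
    frame_index += 1

cap.release()
print(f"Saved {saved} keyframes for reverse image search")
```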

We then ran the video past Hive Moderation, which found it “99.9% likely to contain AI-generated or deepfake content,” while Resemble AI’s Deepfake Detector again flagged the voice as “Fake,” further confirming that the viral video is a deepfake.
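As an illustration of how such automated detectors are typically queried, the sketch below posts a media file to a generic detection endpoint over HTTP. The URL, header, and response fields are hypothetical placeholders, not the actual APIs of Hive Moderation or Resemble AI, which require their own credentials and documented request formats.

```python
# Minimal sketch of querying an AI-content detection service over HTTP.
# The endpoint, header, and response fields below are hypothetical placeholders;
# consult the detector's own API documentation for the real request format.
import requests

API_URL = "https://detector.example.com/v1/analyze"   # placeholder endpoint
API_KEY = "YOUR_API_KEY"                              # placeholder credential

with open("viral_clip.mp4", "rb") as media:
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"media": media},
        timeout=120,
    )
response.raise_for_status()
result = response.json()

# Hypothetical response shape: a score between 0 and 1 for AI-generated content.
score = result.get("ai_generated_score", 0.0)
print(f"Likelihood of AI-generated/deepfake content: {score:.1%}")
```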

We also ran the link to the trading platform attached to the post, allegedly endorsed by Sitharaman, past Scam Detector, a major online fraud-prevention resource. The detector rated the website “Dubious. Very New. Suspicious.” and gave it a low trust score. “The algorithm scored this website based on issues such as its creation date and high risk of phishing, spamming, and other factors noted in the Dubious. Very New. Suspicious. tags above. Long story short, we recommend staying away from this website,” read the review, further confirming that the viral post was intended to mislead people into investing in a dubious investment scheme.
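One of the signals cited in that review, the domain’s very recent creation date, can be checked independently with a WHOIS lookup. The sketch below uses the python-whois package, with a placeholder domain since the platform’s actual URL is not reproduced here; the 180-day threshold is an illustrative assumption, not a standard cut-off.

```python
# Minimal sketch: look up a domain's registration date as a quick trust signal.
# Assumes the python-whois package (pip install python-whois); the domain is a placeholder.
from datetime import datetime, timezone
import whois

DOMAIN = "example-investment-site.com"  # placeholder for the platform's domain

record = whois.whois(DOMAIN)
created = record.creation_date
# Some registrars return a list of dates; take the earliest one.
if isinstance(created, list):
    created = min(created)

if created is None:
    print("No creation date found; treat the domain with caution.")
else:
    age_days = (datetime.now(timezone.utc) - created.replace(tzinfo=timezone.utc)).days
    print(f"{DOMAIN} was registered {age_days} days ago")
    if age_days < 180:
        print("Very new domain: a common marker of scam investment sites.")
```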


Conclusion

Viral video of Nirmala Sitharaman endorsing an investment project found to be a deepfake.

RESULT

Altered Photo/Video