DEEP FAKES SERIES / POLLS / PORTENTS / WHITE PAPER – 3
One of the biggest threats in today’s technology landscape is computer-generated imagery (CGI): the use of computer graphics to create or enhance visual media (images and video).
These methods have long been the go-to visual effect for major movies, but now that generative AI techniques are improving rapidly and becoming widely accessible, the two technologies are being combined to produce even more convincing fakes.
Perhaps the biggest fallout of deepfake AI will be felt in politics, where the technology is increasingly used to disseminate misinformation about political candidates and to put false statements into the mouths of leaders.
In 2018, a now-viral deepfake video showed President Obama appearing to deliver a statement; the words were later revealed not to be his own, and the video was intended to demonstrate just how realistic manufactured images and voices have become.
In a 2019 Brookings commentary on deepfakes, social media, and the 2020 US elections, UCLA Professor John Villasenor, addressing the worrying issue of voter influence, writes: “Under the right set of circumstances, deepfakes will be very influential. They don’t even have to be particularly good to potentially swing the outcome of an election.
“As with so much in elections, deepfakes are a numbers game. While the presence of tampering in all but the most sophisticated deepfakes can be quickly identified, not everyone who views them will get that message”.
Hustling at the Hustings
Industry watchers recognise that deepfakes have the potential to undermine elections and even to stoke global conflict. One of the clearest and potentially most devastating instances of such conflict manipulation was mentioned earlier: a deepfake of Ukrainian President Volodymyr Zelenskyy asking his troops to surrender to the invading Russians.
2024 is a momentous year for democracies, with about 40 countries, India included, going to the polls. Misleading posts may well become the biggest threat to every political party in the country.
India’s own tryst with AI-driven interference is not new. In 2020, just ahead of the Delhi Assembly polls, in what was probably the first use of deepfake AI in an Indian political campaign, several videos of Bharatiya Janata Party (BJP) leader Manoj Tiwari were circulated to 15 million voters via 5,800 WhatsApp groups.
The videos showed Tiwari levelling allegations against his political opponent Arvind Kejriwal, in English and in Haryanvi. In another incident, an altered video of Madhya Pradesh Congress chief Kamal Nath went viral, creating confusion over the future of the incumbent state government’s “Laadli Behna Scheme”.
It is quite feasible that, in a desperate attempt to swing the numbers through misinformation, interested parties will hire deepfake AI experts.
With election propaganda campaigns set to reach fever pitch over the long, no-holds-barred election season ahead, the task is cut out for political party whips and the Election Commission of India to ensure that deepfake AI videos do not upstage India’s tried and tested door-to-door outreach programs, wall-poster campaigns, and curated media blitzes. Without the necessary guardrails, the damage will be incalculable.
A Case for Regulations
Beyond politics, the targets are still the usual suspects: Bollywood stars, cricketers, and business leaders.
Anil Kapoor filed a lawsuit after finding that AI-generated deepfakes of his likeness and voice were being used to create GIFs, emojis, ringtones, and even sexually explicit content.
Amitabh Bachchan brought a case against Rajat Negi over the unauthorised use of his personality rights and personal attributes, such as his voice, name, image, and likeness, for commercial purposes.
Virat Kohli, Alia Bhatt, Shah Rukh Khan, Ratan Tata, and Katrina Kaif have all been bitten by the bug. The Rashmika Mandanna incident is really just the tip of the iceberg!
All of this makes a clear case for regulating this menacing industry.
(Coming up: How to Catch DeepFakes)