This piece is by WWSG exclusive thought leader Sara Fischer.
AI-driven deepfakes weren’t the disinformation catastrophe that tech companies and global governments feared ahead of a slew of major elections this year, Meta president of global affairs Nick Clegg told reporters Monday.
Why it matters: The spread of broader conspiracy theories has proven to be a much more challenging misinformation threat than AI-doctored photos or videos.
“From what we’ve monitored across our services, it seems these risks did not materialize in a significant way, and that any such impact was modest and limited in scope,” Clegg said.
By the numbers: While Meta's systems did catch several covert attempts to spread election disinformation using deepfakes, "the volumes remained low and our existing policies and processes proved sufficient to reduce the risk around generative AI content," the company said.
During the election periods of major races globally this year, ratings from Meta's international fact-checking partners on AI content related to elections, politics and social topics represented less than 1% of all fact-checked misinformation, per Clegg.
State of play: Meta introduced new policies this year to prevent everyday users from inadvertently spreading election misinformation using its Meta AI chatbot, including blocking the creation of AI-generated media of politicians.
The company said its systems rejected 590,000 user requests to generate AI images of President-elect Trump, Vice President-elect Vance, Vice President Harris, Governor Walz, and President Biden in the month leading up to the election.
Clegg said he didn’t see much activity in terms of bad actors trying to game Meta’s rules around labeling AI-generated imagery in ads.
Zoom out: Meta has invested heavily in broader threat intelligence over the past few years, which Clegg said has helped the company identify coordinated disinformation networks, regardless of whether they use AI or not.
“We seek to build policies and enforcement practices that are agnostic about the origin of the content, whether it’s synthetic or human,” Clegg said.
“That’s why I don’t think the use of AI and generative AI in and of itself was a particularly effective tool for them to evade or trip our wires, because it’s not the means by which we try and identify them in the first place.”
The big picture: Nearly half of the world's population lives in countries that held major elections this year, prompting concerns from intelligence officials around the world about AI deepfakes.
But even in places where there have been concerns about disinformation, deepfakes don’t seem to be the primary challenge.
In Romania, authorities are probing whether poor policy enforcement or fake accounts on TikTok may have aided the far-right presidential candidate Călin Georgescu’s win in the first round of voting.
In India, AI-generated content was used to spread disinformation, but it was also used by candidates to help translate campaign content into different local languages.
Reality check: Images and videos created using generative AI still lack precision, which makes it possible, at least for now, for experts to debunk them.
Most mis- and disinformation that goes viral today features manipulated context, such as a mislabeled photo location, rather than doctored media.
Coordinated inauthentic networks that intentionally use deepfakes to spread disinformation often struggle to build authentic audiences, Clegg said, rendering their AI-generated images and videos ineffective.
The bottom line: The most problematic deepfakes aren't necessarily the most believable ones, but rather the ones shared by people in power to propel narratives or conspiracies that support their campaigns.
Turkish President Recep Tayyip Erdogan showed a manipulated video on stage at a political rally linking a rival to a militant organization considered a terrorist group by the Turkish government.
President-elect Trump shared AI-generated images on social media during his campaign that falsely depicted Taylor Swift and her fans endorsing his campaign for president.