Meta says it sniffed out six new covert influence operations on its platforms in the second quarter of 2024 and expects more ahead of November’s U.S. elections.


What You Need To Know

  • Meta says it sniffed out six new covert influence operations on its platforms in the second quarter of 2024 and expects more ahead of November’s U.S. elections

  • In its quarterly adversarial threat report, released Thursday, the social media giant said the six campaigns originated in Russia, Vietnam and the U.S. 

  • Russia continues to be the top source of covert influence operations targeting people globally, followed by Iran and China, Meta said

  • In the U.S., Meta took down 96 Facebook accounts, 16 pages and 12 groups as well as three Instagram accounts it linked to a network posing as a political advocacy group

In its quarterly adversarial threat report, released Thursday, the social media giant said the six campaigns originated in Russia, Vietnam and the U.S. Meta said it removed many of them before they were able to build up large audiences.

Not all targeted the U.S. 

Russia continues to be the top source of covert influence operations targeting people globally, followed by Iran and China, Meta said. 

Much of the Russian effort appeared focused on undermining support for Ukraine and other countries in the region, including Georgia and Moldova, according to the report. Meta said that in the months leading up to the U.S. elections, it expects Russia-based operations to promote commentary and candidates who oppose aid to Kyiv and criticize those who advocate for boosting Ukraine’s military.

“This could take the shape of blaming economic hardships in the US on providing financial help to Ukraine, painting Ukraine’s government as unreliable, or amplifying voices expressing pro-Russia views on the war and its prospects,” the report said.

One Russian operation targeted English- and French-speaking audiences around the world by, in part, creating a network of fictitious news websites, Meta said. The operators would mix articles promoting the operation’s objectives with news about entertainment, celebrity gossip and other topics.

The nonpolitical stories were likely AI-generated summaries of articles from established news organizations, the report said. The campaign also used AI-generated newsreaders in YouTube videos that focused on criticizing President Joe Biden and Democrats for providing aid to Ukraine, Facebook’s parent company said.

Meta said it removed 12 accounts, 32 pages and five groups on Facebook and three accounts on Instagram tied to the operation.

In the U.S., Meta took down 96 Facebook accounts, 16 pages and 12 groups as well as three Instagram accounts it linked to a network posing as a political advocacy group. “Patriots Run Project” called on “real conservatives” to run for office in Arizona, Michigan, Nevada, Pennsylvania and North Carolina, according to the report.

The operation used accounts belonging to fictitious people, Meta said.

The Vietnam-based operation focused primarily on criticizing Qatar and targeted audiences in Lebanon, the U.S., the United Kingdom and France, Meta said. The campaign was active across multiple apps, ran websites and placed ads on billboards in the U.S. and Lebanon, according to the company.

Meta also provided an update on Doppelganger, a Russian operation it first publicly identified in 2022, which uses a large network of websites to spoof legitimate ones.

The report said Doppelganger remains the most persistent Russia-based campaign. Meta said it has seen “notable shifts” in its tactics. For example, to avoid detection, operators are increasingly using nonpolitical posts and ads, including about innocuous topics such as food and health, but the links then take people to articles about the Russia-Ukraine war or geopolitics on spoofed websites.

After a pause, Doppelganger operators have resumed their campaign of sharing links to fake content, although at a much lower rate, potentially to influence opinion-makers and politicians, Meta said.

Since May alone, Meta has removed more than 5,000 accounts and pages it traced back to Doppelganger. 

The social media company said generative artificial intelligence tactics have provided “only incremental productivity and content-generation gains to threat actors,” but added that the industry’s defense strategies against AI misuse have largely been effective.

Meta said it shares its information about foreign interference threats with other tech platforms, researchers and governments.

Microsoft last week said Iran is accelerating online activity that appears intended to influence the U.S. presidential election. On Saturday, former President Donald Trump, the Republican nominee, said Microsoft informed his campaign that Iran had hacked one of its websites. The FBI said Monday it is investigating the incident.

Meta’s report did not include any information about new campaigns originating in Iran.
