Meta, the parent company of Facebook and Instagram, disclosed that it has disrupted three separate covert influence campaigns originating in Romania, Iran, and China and targeting users in Romania, Azerbaijan, Turkey, Myanmar, Taiwan, and Japan.
The campaigns used techniques such as building large networks of inauthentic accounts and producing AI-generated content to distort public discourse and manipulate political discussion.
The Romanian campaign comprised 658 Facebook accounts, 14 Pages and two Instagram accounts. These accounts impersonated locals, shared content about sports, travel and local news, and commented on posts by Romanian politicians and news outlets.
Meta pointed to the campaign’s “thorough operational security,” including the use of proxy IP infrastructure to hide its origin. The operators used their accounts mainly to post in Romanian about world events and Romania’s upcoming elections.
A second network, linked to Iran, targeted Azeri speakers in Azerbaijan and Turkey. It comprised 17 Facebook accounts, 22 Pages and 21 Instagram accounts.
The fake profiles frequently posed as female journalists and pro-Palestinian activists. They used a variety of methods, including hijacking trending hashtags to piggyback on existing public debate and commenting on their own content to create the illusion of popularity.
Written in Azeri, their posts ranged from the Paris Olympics and Israeli actions in Gaza to boycotts of American brands and criticism of the United States and President Biden. Meta attributed this activity to a threat actor it tracks as Storm-2035.
The third influence operation was based in China and focused on users in Myanmar, Taiwan and Japan. This network included 157 Facebook accounts, 19 Pages, one Group and 17 Instagram accounts.
According to Meta, this campaign used AI to generate profile pictures and ran a so-called “account farm” to churn out new fake profiles at an enormous rate. The operators also reposted one another’s content in English, Burmese, Mandarin and Japanese.
In Myanmar, the content denounced civil resistance movements and rallied behind the military junta. In Japan, the campaign criticized the government’s closeness to the US. In Taiwan, posts alleged corruption among Taiwanese politicians and military leaders, and some Pages claimed to host anonymously submitted content to pass as authentic discussion.
Meta said these networks were identified and removed before they could build substantial audiences of authentic users on its platforms. The findings appear in the company’s quarterly Adversarial Threat Report, released today, which details these and other thwarted influence campaigns.
The takedowns highlight the growing difficulty social media platforms face in countering state-sponsored and other malicious actors seeking to shape public opinion worldwide.