Meta has recently taken down thousands of fake Chinese accounts and pages that were part of the largest known cross-platform covert influence operation in the world. It targeted more than 50 apps, including Facebook, Instagram, X (formerly Twitter), YouTube, TikTok, Reddit, Pinterest, Medium, Blogspot, LiveJournal, VKontakte, Vimeo, and dozens of smaller platforms and forums. For the first time, Meta was able to tie Beijing’s covert influence activities together, confirming that they are part of a single operation known in the security community as Spamouflage and linking it to individuals associated with Chinese law enforcement agencies.
Associated clusters were behind online attacks on pro-democracy protesters in Hong Kong and on the Trump administration, praise for China’s COVID-19 response, and AI-generated fake news anchors promoting the Chinese Communist Party. “Each cluster worked to a clear shift pattern, with bursts of activity in the mid-morning and early afternoon, Beijing time, with breaks for lunch and supper, and then a final burst of activity in the evening,” according to the report. Many of the fake accounts posted spam photos or videos of scenery, food, or fashion between their political posts, likely to camouflage the operation’s strategic goal.
Some “personal” posts, written in the first person, appeared to be copy-pasted from the same numbered list. On Facebook, these pages also spent at least $3,500 on ads. “This is the biggest single takedown of a single network we have ever conducted,” said Ben Nimmo, who heads the Meta security team that tracks global threats. “When you put it together with all the activity we took down across the internet, we concluded it is the largest covert campaign that we know of today.”
The operation, which originated in China, targeted audiences in countries including Taiwan, the U.S., Australia, the U.K., and Japan, as well as Chinese-speaking audiences elsewhere. Despite the covert network’s large size, however, few of its posts gained much traction with real users; instead, the operation relied on inauthentic followers from fake engagement farms in Vietnam, Bangladesh, and Brazil, according to Meta researchers. The influence operation was the seventh from China that Meta has removed in the last six years, four of them in the past year alone, said the company, which published details of the new operation as part of a quarterly security report.
Analysts say the operation is likely just the beginning of China’s bid to expand its covert online influence operations by copying Russia’s playbook. It shows that Chinese operators have adopted and shifted between different strategies, and are willing to invest the time and resources needed to shape global public opinion, says Sandra Quincoses, an intelligence analyst at cybersecurity firm Nisos who researches Chinese networks. The network uncovered by Meta appears strikingly similar to a recently uncovered influence operation China launched in Latin America in early 2023.
One of the most popular TikTok videos, according to Meta, showed a woman arguing in Chinese that life in Xinjiang, a far north-western region of China, was peaceful. China has been under international scrutiny for carrying out repressive policies against Uyghurs and other predominantly Muslim ethnic minorities in the region. The TikTok video garnered more than 7,000 views. “China is investing an enormous amount of money in the full spectrum of state propaganda, of which this is an important part,” said Graham Brookie, senior director of the Atlantic Council’s Digital Forensic Research Lab. Chinese apps like TikTok have been constantly refined to serve as “digital opium” for both domestic and international audiences, with cognitive-warfare tactics observed in Taiwan, Southeast Asia, the EU, and the Americas.
Chinese officials were previously accused of leading disinformation campaigns across social media that targeted Canadian politicians. It was reported that Michael Chong, a member of Canada’s Conservative Party, was likely targeted by a “coordinated network” of WeChat news accounts, which “shared and amplified a large volume of false or misleading narratives” about Chong. The Canadian government noted that it was “highly probable” that China was involved in the alleged campaign. Fake social-media accounts operated with the backing of Chinese government organs target subjects in a coordinated way, skewing the narrative around a range of topics and pushing set agendas. Many of these agendas are supportive of the Chinese government and critical of popular figures or protesters who oppose it. The network continually creates new accounts to add numbers and apparent credibility, using a system of poster and amplifier accounts to “stage” trending topics and create the illusion of viral tweets.
In 2022, Mandiant researchers said the Dragonbridge operation attempted to influence the 2022 U.S. midterm elections and spark protests in the United States against an Australian mining company’s American expansion plans. The influence campaign represents an escalation of both tactics and rhetoric designed to “sow division both between the U.S. and its allies and within the U.S. political system itself.” ASPI’s earlier report said the Chinese Communist Party’s influence operations were probably conducted in parallel, if not collectively, by multiple Chinese party-state agencies – including the People’s Liberation Army’s Strategic Support Force (PLASSF), the Ministry of State Security (MSS), the Central Propaganda Department, the Ministry of Public Security (MPS), and the Cyberspace Administration of China (CAC) – which at times appeared to collaborate with private Chinese companies.
Influence operations are also increasingly setting up off-platform websites and using social media to drive traffic to them. The Chinese operation has adjusted its tactics in other ways to try to evade crackdowns by social media platforms, such as by using alternate websites to redirect its links. Meta researchers expect the threat actors behind the campaign to rebuild and keep trying, despite consistently struggling to reach real people. “Ahead of all the elections next year, we do expect that they will keep on trying,” Nimmo said, referring to how China- and Russia-based disinformation groups are ratcheting up their influence campaigns. In 2024, about 2 billion people are due to vote in more than 50 elections, including in the US, India and the EU.

Because language barriers leave social media firms not fully prepared to tackle misinformation during the elections due to take place around the world in 2024, they should pursue more innovative information-sharing to combat the “digital opium” of cyber-enabled influence operations. Governments should change their language in speeches and policy documents to describe social-media platforms as critical infrastructure, and public diplomacy should be a pillar of any counter-malign-influence strategy. Governments should also support further research on influence operations and other hybrid threats: strong open-source intelligence skills and collection capabilities are a crucial part of investigating such operations and attributing them to rogue regimes like China.