Pro-Russian Information Operations Escalate in Ukraine War
In the three months since the war started, Russian operatives and those allied with the nation's interests have unleashed a deluge of disinformation and fake news to try to sow fear and confusion in Ukraine, a security vendor says.
May 19, 2022
In March, in the middle of Russia's invasion of Ukraine, a video surfaced that showed Ukraine's President Volodymyr Zelensky announcing his country's surrender to the Russian forces. Another story the same month claimed that he had committed suicide, apparently because of Ukrainian military failures, in the military bunker in Kyiv from which he had been directing his country's fight against Russia.
The video was a sophisticated deepfake of Zelensky generated by artificial intelligence. The story of his suicide was a completely concocted report from a group set up to spread fabricated narratives aligned with Russian interests. Both are examples of what Mandiant on Thursday described as systematic, targeted, and organized cyber-enabled information operations (IO) that have targeted Ukraine's population and audiences in other regions of the world since the war began in February.
Many of the actors behind these campaigns are previously known Russian, Belarusian, and other pro-Russian groups. Their goal is threefold, according to Mandiant: to demoralize Ukrainians; to cause division between the beleaguered nation and its allies; and to foster a positive perception of Russia internationally. Also in the fray are actors from Iran and China that are opportunistically using the war to advance their own anti-US and anti-West narratives.
Success Hard to Gauge
The success of these information operations is hard to gauge given their scope, says Alden Wahlstrom, a senior analyst at Mandiant. "With the Russia-aligned activity, we’ve observed multiple instances in which the Ukrainian government has appeared to rapidly engage with and issue counter-messaging to disinformation narratives promoted by [information] operations," he says. But the sheer scale and tempo of operations have made the task challenging, Wahlstrom says. "One concern when looking at this activity in aggregate is that it helps to build an atmosphere of fear and uncertainty among the population in which individuals potentially question the validity of legitimate sources of information."
Mandiant's analysis shows several known groups are behind the information operations activity in Ukraine. Among them is APT28, a threat group that the US government and others have attributed to a unit of the Russian General Staff’s Main Intelligence Directorate (GRU). Mandiant observed members of APT28 using Telegram channels previously associated with the GRU to promote content designed to demoralize Ukrainians and weaken support from allies.
The Belarus-based operator of Ghostwriter, a long-running disinformation campaign in Europe, is another actor active in Ukraine. In April, Mandiant observed the threat actor using what appeared to be a previously compromised website and likely compromised or threat actor-controlled social media accounts to publish and promote fake content aimed at fomenting distrust between Ukraine and Poland, its ally.
In the weeks leading up to Russia's invasion of Ukraine and in the months since then, Mandiant also observed an information campaign tracked as "Secondary Infektion" targeting audiences in Ukraine with fake narratives about the war. It was Secondary Infektion, for instance, that was responsible for the fake report about Zelensky's suicide. The same group also promoted stories about operatives from Ukraine's Azov Regiment — a unit that Russia has labeled as being composed of Nazis — apparently seeking vengeance on Zelensky for allegedly letting Ukrainian soldiers die in Mariupol.
The group was often observed using forged documents, pamphlets, screenshots, and other bogus source materials to support its fabricated narratives.
False Narratives to Sow Fear and Confusion
Mandiant said it observed several other operatives engaged in a wide range of similar information operations in Ukraine, often using bot-generated social media accounts and fake personas to promote a variety of Russia-aligned narratives. These have included fake content about growing resentment in Poland over refugees from Ukraine and claims that Polish criminal gangs were harvesting organs from Ukrainians fleeing into the country.
Often the information operations have coincided with other disruptive and destructive cyber activity, according to Mandiant. For example, the content about Zelensky's alleged surrender to Russia broke at the same time that threat actors hit a Ukrainian organization with a disk-wiping malware tool scheduled to execute three hours before a Zelensky speech to the UN.
Wahlstrom says Mandiant has not been able to definitively link the information operations to the concurrent destructive attacks.
"However, this limited pattern of overlap is worth paying attention to and may suggest that the actors behind the information operations are at least linked to groups with more extensive capabilities," he says. The coordinated attacks also suggest a full spectrum of actors and tactics are being employed in operations targeting Ukraine, Wahlstrom says.
For the most part, the information operations activity that the various groups are conducting in Ukraine appears consistent with their previous activity. But one notable evolution is the prominence of dual-purpose information ops, says Sam Riddell, an analyst at Mandiant. "Popular pro-Russian 'hacktivist' activity and coordinated 'grassroots' campaigns have pursued specific influence objectives while simultaneously attempting to create the impression of broad popular support for the Kremlin," he says.
The conflict in Ukraine has also shown how rapidly information operation assets and infrastructure can be repurposed for the theme of the day, he says. "At the onset of the war, a whole ecosystem of pro-Russian IO assets was able to quickly flip a switch and engage in wartime IO at high volumes," he says. "For defenders, this means that disrupting assets before significant global events break out is paramount."
Mandiant's report coincided with another one from Nisos this week that shed light on an Internet of Things botnet, tracked as "Fronton," that apparently was developed a few years ago at the direction of Russia's Federal Security Service (FSB). The botnet's primary purpose, according to Nisos, is to serve as a platform for creating and distributing fake content and disinformation on a global scale. It includes what Nisos described as a Web-based dashboard called SANA for formulating and deploying trending social media events on a mass scale.
Nisos' report on Fronton is based on a review of documents that were publicly leaked after a hacktivist group called Digital Revolution broke into systems belonging to a subcontractor that developed the botnet for the FSB.
Vincas Ciziunas, research principal at Nisos, says there is no evidence of Fronton or SANA being used in the current conflict between Russia and Ukraine. But presumably the FSB has some use for the technology, Ciziunas adds. "We only have demo footage and documentation," he says. But the FSB did appear to create a fake network of Kazakh users on the Russian social media platform VKontakte, along with fake content related to a squirrel statue in a Kazakh city that appears to have later become the basis for a BBC report.
"The conversation related to the statue led to a BBC report," Ciziunas says. "We did not directly identify any of the social media postings mentioned in the BBC article as having been made by the platform."