Iran, China-linked gangs join Putin’s disinformation war

Pro-Beijing and pro-Iran miscreants are using the war in Ukraine to spread disinformation that supports their governments' political interests, namely advancing anti-Western narratives, according to threat-intel experts at Mandiant.

Additionally, Iranian cyber-campaigns are using Russia’s invasion of its neighbor to take aim at Saudi Arabia and Israel, the researchers found.

In a new report published today, Mandiant's Alden Wahlstrom, Alice Revelli, Sam Riddell, David Mainor and Ryan Serabian analyze several information operations that the team has observed in its response to the conflict in Ukraine. The report also attributes these campaigns to actors that the threat researchers say are operating in support of nation-states including Russia, Belarus, China and Iran.

“They’re opportunistically leveraging the invasion to pursue the goals of known campaigns,” Wahlstrom, a senior analyst at Mandiant, told The Register. “It’s a bit of a view of how other actors can use a major event to pursue their own interests on the global stage.”

Death to the Great Satan

In the case of Iran, this means tailoring existing campaigns, some of which the Mandiant team has tracked for years, to pertain to the Russian invasion of Ukraine while still pushing messaging directed against the US and its allies. This includes the pro-Iran Liberty Front Press (LFP) campaign, which Mandiant started tracking in 2018, and the Iran-aligned Endless Mayfly influence campaign that Citizen Lab reported on in 2019.

In this research, Mandiant has for the first time named a pro-Iran group "Roaming Mayfly," so called because of its similarities to the Endless Mayfly campaign.

“The campaigns aren’t new, but the activity and the focus is,” Revelli, a Mandiant senior manager, said in an interview. “They’ve been leveraging narratives pertaining to the invasion to take aim at the usual adversaries.”

Some of the campaigns include messaging directed at Arabic-language audiences stating that the US has abandoned Ukraine like it did with Afghanistan in 2021. “So that’s really focusing on the US and how it abandons its allies rather than on the invasion in and of itself,” Revelli added.

In other examples of messaging alleging Western hypocrisy and racism, pro-Iran campaigns have used the Ukraine invasion to contrast how the US and its allies responded to Russia with how they dealt with Saudi Arabia after the Saudi-led intervention in Yemen.

Roaming Mayfly, meet Dragonbridge

Mandiant also documented Roaming Mayfly targeting Russian audiences on the eve of the invasion in what the security researchers say is an attempt to increase tensions between Russia and Israel. Specifically, this effort used a fake Twitter account that impersonated Russian journalist Fyodor Lukyanov to suggest Israeli intelligence was supporting Ukraine, and that it had also supported earlier violent protests in 2000, 2004, and 2014.

Also in the research, Mandiant links a pro-Chinese government effort called “Dragonbridge” to an ongoing misinformation campaign that alleges Pentagon-linked labs are conducting biological weapons research in Ukraine. 

While it’s difficult to measure the influence or reach of these types of misinformation campaigns, the general chaos and distrust they sow is in and of itself a successful outcome, Wahlstrom said.

“Creating an influx of questionable information in a conflict zone at least raises questions or concerns about trustworthy news sources — that larger possibility of confusion or not knowing exactly when you can trust if something was posted to a website or if that is real,” he said. 

New details about Russian campaigns

In addition to the Chinese and Iranian campaigns related to the Ukraine war, the research includes information on new Russian information operations. 

This includes a Russian influence campaign known as "Secondary Infektion," which began prior to the ground invasion and spread misinformation about Ukrainian president Volodymyr Zelenskyy. Mandiant linked the operation to a false claim, circulated in March, that Zelenskyy had died by suicide in a military bunker in Kyiv.

A more recent Secondary Infektion campaign, circulated in both Ukrainian and Russian, falsely claimed that the Ukrainian and Polish governments sought to enable Polish troops to deploy in western Ukraine. Another, in early April, claimed that Poland attempted to use an alleged "provocation," staged by Ukraine, to station Polish troops in the country.

Plus, a new Ghostwriter operation, which Mandiant is attributing publicly for the first time, published fake content about Polish criminals harvesting organs from Ukrainian refugees.

“Obviously, that’s an incredibly concerning narrative that has been spread with the potential to create distrust between Ukrainian refugees and the place that they’re fleeing to,” Wahlstrom said.

“But in terms of the context of viewing that as a Ghostwriter operation, it is very much in line with established goals of the campaign, which include fomenting distrust or tensions between different countries in the region, including Poland and Ukraine.”

As a reminder: Ghostwriter, a crew thought to be connected to Russia's GRU military intelligence service, traffics in both disinformation and destructive cyberattacks. The US government blamed Ghostwriter for WhisperGate, the destructive wiper malware strain used against Ukrainian government and private sector networks on January 15, before the ground invasion and aerial bombardment began.

This same group also tried to trick people on Facebook into posting a fake YouTube video purporting to show Ukrainian soldiers emerging from a forest waving white flags. And Ghostwriter tried to hack into “dozens” of Ukrainian military personnel’s Facebook accounts, according to Meta.

When disinformation, destructive attacks collide

This gang, in particular, highlights the intersection of disinformation and destructive attacks that Russian-backed criminals have used against Ukraine since the conflict started.

While Mandiant touches on this alignment in its new report, identifying information operations that occurred concurrently with disruptive or destructive malware (like the wiper attacks at the start of the war), the authors say they can’t definitively link these types of operations because of a “limited pattern of overlap.” 

This suggests that some of the crews behind these information operations “are at least linked to groups with more extensive capabilities,” Wahlstrom added. “And what that means is the skill sets required to conduct a basic information operation vary from what you might require to develop and deploy malware.”

However, both types of threats complement each other in a war zone, he added. "If one of the goals is to create fear, uncertainty amongst the population, then having both a very visible statement, like a website defacement, and then potentially other activity that has a more destructive aspect to it that could catch headlines has a dual impact." ®
