Russia amplifies right-wing influence campaign to undermine support for Ukraine by exploiting Hawaii's tragedy
A deep, deep dive into the symbiotic relationship between right-wing influencers, Russian state media, and the shared disinformation ecosystem in which they live.
More than 100 people have died in the devastating Hawaii fires, and officials expect that figure to go up as search and rescue efforts continue, making this the deadliest wildfire in modern U.S. history. But as people fled for their lives on the ground, a coordinated campaign on Twitter was exploiting their personal tragedy to undermine support for Ukraine, which is facing a crisis of its own.
Last week, I noted that a particular talking point about the Hawaii wildfires — specifically, one that used the phrase “Hawaii, not Ukraine” to pressure the Biden administration to send relief funds to Hawaii instead of Ukraine — was gaining traction on Twitter and appeared to have been seeded with inauthentic activity. The narrative, which started on Aug. 9 with a tweet from a low-follower account created in July 2023, was quickly picked up by a network of right-wing influencers and became so voluminous that it showed up in the results of nearly every Hawaii-related hashtag or trending keyword. The purpose of the campaign was clearly to promote divisiveness: to make people think the U.S. was neglecting Hawaii while continuing to send aid to Ukraine, with the ultimate goal of undermining public support for that assistance. It’s all based on a false narrative, as I explain later, but before we get into that, let’s look at the coordinated activity driving it.
To explore this campaign further, I analyzed a sample of 200 tweets containing the phrase “Hawaii not Ukraine.” I also purposively sampled the earliest tweets using the phrase to trace its origins, map how it spread through this network of right-wing accounts (and, later, Russian state media), and assess whether there was evidence of inauthentic activity, coordination, or platform manipulation.
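For readers curious about the mechanics, here is a minimal sketch of that sampling step in Python. It assumes the matching tweets were already exported to a CSV; the file name and the “text” and “created_at” columns are hypothetical stand-ins, not the actual dataset.

```python
# Minimal sketch of the sampling described above. Assumes tweets matching the
# search were already exported to a CSV; file and column names are
# hypothetical stand-ins.
import pandas as pd

tweets = pd.read_csv("hawaii_not_ukraine.csv", parse_dates=["created_at"])

# Keep tweets containing the key phrase, with or without the comma.
mask = tweets["text"].str.contains(
    r"hawaii,? not ukraine", case=False, regex=True, na=False
)
matches = tweets[mask].sort_values("created_at")

# Purposive sample: the earliest tweets, to trace the narrative's origin.
earliest = matches.head(25)

# Random sample of 200 tweets for the broader content analysis.
study_sample = matches.sample(n=200, random_state=42)
```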
A selection of tweets from the study sample can be seen in the collage below.
Origins
An advanced search revealed that the phrase “Hawaii not Ukraine” appeared for the first time on Aug. 9 at 4:51 pm in a reply to a tweet from the Hawaii Dept. of Transportation. The account that first posted it, which we’ll just call “Mary,” was created just last month and had only 37 followers when I initially analyzed it on Aug. 11. Three minutes later, at 4:54 pm, “Mary” posted the phrase a second time in response to another tweet from the Hawaii Dept. of Transportation. Both tweets implied that sending aid to Ukraine was getting in the way of providing adequate support to Hawaii.
The next use of the phrase came almost exactly one hour later, at 5:57 pm on Aug. 9, but this time the account that posted it was a blue-checkmark account with more than 50,000 followers. The account’s handle indicates that it may be associated with the Barstool brand Old Row Sports, which may be important given Barstool’s reputation for harassment via swarming, trolling, and algorithmic manipulation. The tweet (actually, a quote-tweet) got more than 100,000 views, compared to a combined 622 views for the first two tweets using the phrase. Then, almost exactly an hour later, at 6:52 pm, the fourth tweet using the phrase was posted by a 200,000+ follower account claiming to be a Republican nominee for Arizona attorney general. Though this tweet didn’t get as many views as the previous one (40K compared to 100K), it reached a greater variety of accounts, which was key to the campaign going viral.
Before long, the talking point seeded by a 37-follower account was picked up by right-wing media and blasted across the headlines, becoming the dominant narrative among mainstream and fringe media outlets alike.
Indicators of Coordination and Inauthentic Activity
As I explained last week, this narrative has several key characteristics of a coordinated campaign. First, the degree of repetition is notable. These tweets were not just expressing the same idea; they were using nearly identical language to do so, and all of them included the phrase “Hawaii not Ukraine.” That kind of repetition can also function as a form of SEO or algorithmic manipulation aimed at making a specific phrase go viral and become a trending topic. Many of the tweets in the sample also quote-tweeted or posted the same recycled imagery and video footage, with one particular video appearing in at least 10% of sampled tweets. Notably, that video was quote-tweeted by two of the first four tweets that used the key phrase.
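For illustration, here is a rough sketch of how that kind of near-identical wording can be quantified, using Jaccard similarity over word tokens. The 0.8 threshold is arbitrary, chosen only to flag tweets that share most of their wording.

```python
# Rough sketch: flag pairs of tweets whose wording is nearly identical
# ("copypasta") by comparing word-token sets. The 0.8 threshold is arbitrary.
import re
from itertools import combinations

def tokens(text: str) -> frozenset:
    """Lowercase word tokens, with URLs stripped out."""
    text = re.sub(r"https?://\S+", "", text.lower())
    return frozenset(re.findall(r"[a-z']+", text))

def jaccard(a: frozenset, b: frozenset) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

def near_duplicate_pairs(texts: list[str], threshold: float = 0.8):
    """Return index pairs of tweets that share most of their wording."""
    toks = [tokens(t) for t in texts]
    return [(i, j)
            for i, j in combinations(range(len(toks)), 2)
            if jaccard(toks[i], toks[j]) >= threshold]
```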
In addition to repetition, the reuse of existing content is also characteristic of disinformation and influence campaigns, which often recycle old content like memes and images. In fact, one of the most common ways I identify influence and disinformation campaigns is by performing reverse-image searches and discovering that the same picture has been used repeatedly over time in different campaigns.
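As a rough illustration of the same idea in code, image reuse can also be flagged programmatically with perceptual hashing, which compares compact fingerprints that survive resizing, re-compression, and minor edits. This sketch uses the open-source Pillow and imagehash libraries; the file names are placeholders and the distance threshold is illustrative.

```python
# Sketch: detect recycled imagery via perceptual hashing. Requires Pillow and
# imagehash; file names are placeholders, and the threshold is illustrative.
from PIL import Image
import imagehash

viral_2023 = imagehash.phash(Image.open("viral_maui_image.jpg"))
archived = imagehash.phash(Image.open("image_from_older_campaign.jpg"))

# A small Hamming distance between the hashes suggests the "new" viral image
# is a re-circulated copy of the older one.
if viral_2023 - archived <= 8:
    print("Likely the same underlying image, reused across campaigns.")
```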
Another key indicator of coordination was the timing of the initial tweets, which were posted almost exactly an hour apart. Over the next several days, other temporal trends emerged, including certain time periods (e.g., 8-10 pm) during which a disproportionate number of tweets were sent. Temporal patterns like these have been identified as indicators of coordinated network activity. The accounts also engaged in a pattern of co-tweeting and co-retweeting each other’s content — a behavioral indicator detected in 74% of influence campaigns.
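Both signals are straightforward to surface programmatically. Here is a rough sketch: tweet counts by hour of day, and account pairs that repeatedly retweet the same tweets. The record format is a hypothetical simplification, not an actual platform API.

```python
# Rough sketch of the two timing signals described above: tweet volume by
# hour of day, and pairs of accounts that repeatedly retweet the same tweets
# (co-retweeting). The record format is a hypothetical simplification.
from collections import Counter
from itertools import combinations

def hourly_volume(timestamps):
    """Count tweets per hour of day to spot disproportionate windows (e.g., 8-10 pm)."""
    return Counter(ts.hour for ts in timestamps)

def co_retweet_pairs(retweets, min_shared=5):
    """Flag account pairs that retweeted the same tweets at least min_shared times.

    Each record is assumed to look like:
    {"retweeter": "handle", "original_tweet_id": "123"}
    """
    retweeters_by_tweet = {}
    for r in retweets:
        retweeters_by_tweet.setdefault(r["original_tweet_id"], set()).add(r["retweeter"])

    shared = Counter()
    for accounts in retweeters_by_tweet.values():
        for pair in combinations(sorted(accounts), 2):
            shared[pair] += 1
    return {pair: n for pair, n in shared.items() if n >= min_shared}
```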
Also notable was the activity of the initial account (“Mary”) that first tweeted the phrase “Hawaii not Ukraine.” As stated, this account was created in July 2023 and had only 37 followers and a limited number of original tweets since its creation. (Interestingly, its first-ever tweet was a 34-second-long Twitter Space hosted by the brand-new account. No one spoke during the Space; it was just background music.) Most of its activity consisted of retweeting other accounts — specifically, it exclusively retweeted right-wing influencers and politicians, in a way that appeared purposeful or strategic, almost as if it were training an algorithm.
The first account “Mary” retweeted was “Libs Of TikTok,” one of the most controversial right-wing accounts on social media. But the next retweet was of an account with fewer than 3,000 followers that maintained a Twitter list whose only member was itself. The list, called “Top Stuff,” was a running feed of right-wing influencer content — something often seen in disinformation networks and influence campaigns as a way to organize content for individual accounts to retweet. (This became more common after Twitter explicitly stated that using Direct Messages and private groups to coordinate retweeting and reciprocal engagement was a violation of the platform’s terms of service.) “Mary’s” retweet of this account wouldn’t be notable in most circumstances, but the fact that it was the second account she retweeted after creating her Twitter account suggests that she may have already been familiar with it and engaged with it purposefully. Almost all of “Mary’s” other retweets were of high-profile right-wing influencers, to the point that her account basically mapped out the right-wing Twitter ecosystem through its retweets. Another notable pattern was that an unusually high percentage of the account’s retweets contained images, as is evident in the collage below displaying a sample of the account’s retweets. Research shows that visual imagery can play a uniquely important role in framing news stories by more effectively capturing the audience’s attention, increasing emotional involvement, and improving memory recall. Visual images are often a key component of disinformation campaigns, with both large and small accounts involved in spreading key images.
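That image-heavy pattern is easy to quantify. Here is a tiny sketch, assuming each retweet record carries a hypothetical “media_types” field indicating attached media.

```python
# Tiny sketch: share of an account's retweets that carry images. Assumes each
# retweet record has a hypothetical "media_types" list (e.g., ["photo"]).
def image_retweet_share(retweets) -> float:
    if not retweets:
        return 0.0
    with_images = sum(1 for r in retweets if "photo" in r.get("media_types", []))
    return with_images / len(retweets)
```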
Interestingly, one of the only accounts that “Mary” retweeted repeatedly was that of GOP Sen. John Kennedy, which seems misaligned with her other account activity. Given that “Mary” also engaged with a significant amount of content related to RFK Jr., I wonder if this may be an automated or partially automated account trained to engage with content based on keywords like “Kennedy,” and perhaps those retweets of John Kennedy were an artifact of that keyword-based training.
Looking at all of this activity together, my impression is that “Mary,” the first account to tweet the phrase “Hawaii not Ukraine,” is embedded within an influence network that uses small, unremarkable accounts — i.e., accounts that are likely to go unnoticed — to seed narratives and broadcast signals to a larger group of accounts. These types of accounts are commonly used by political “war rooms” as part of an internal strategy that is known to campaign members but not to outsiders. It’s how they coordinate without leaving a trail of evidence (well, obviously they still left a little bit of evidence — enough for me to write this article…). Of note, the third account that tweeted the phrase “Hawaii, not Ukraine” — the GOP attorney general candidate who helped spread this narrative to a broader network — does have a campaign war room account. These types of operations usually involve a variety of single-purpose accounts, including broadcasters, amplifiers, trolls, narrative architects, gatekeepers, meme generators, anonymous influencers, attack dogs, opinion leaders, and more.
The degree of coordination among these accounts can range from full-on astroturfing to indirect network effects; it often falls somewhere in the middle, or varies throughout the lifecycle of a narrative. In this case, I would describe the seeding and dissemination of this talking point as a form of networked information manipulation: the narrative appears to have been seeded intentionally using specific keywords and possible SEO hijacking, but most of the coordination was not explicit, and was instead the product of network effects (an account seeded the narrative; larger influencer accounts then tweeted it, spreading it to the broader network, which “organically” picked it up, with the encouragement of right-wing websites and influencers who kept the narrative trending). Of course, many of the people tweeting this talking point genuinely believe what they’re saying — but many of those same people would never have formed that belief in the first place if they hadn’t seen people in their social network expressing it.
Russian State Media Amplifies Narrative
A few days after this narrative was seeded and picked up by right-wing influencers, Russia joined in and started boosting the same talking points. In a series of tweets and articles this week, Russian state media outlets RT and Sputnik promoted a variety of narratives claiming that Biden is ignoring Hawaii while giving his attention to Ukraine. RT used one of its favorite tactics — quoting an American to convey a Russia-friendly talking point — by highlighting Ron Paul’s criticism of Biden’s “indifference over the Maui fires” even as the president pushed for more aid for Ukraine, while Sputnik promoted narratives framing Biden as dishonest and uncaring.
This is unsurprising, of course, given that the entire point of the narrative is to criticize Biden’s response to the fires and undermine support for Ukraine by convincing people that continuing to support Ukraine means not supporting Hawaii, or at least not giving it the support it needs. However, even though it’s predictable that Russia would propagate this narrative, trends like this are always important to observe and document, given the close ties between Russian state media and Russian intelligence. These public activities provide a relatively reliable indicator of Russia’s priorities and strategic interests, as well as insight into the narratives they believe are helpful and the tactics they use to propagate and expand on those narratives.
As I noted on Twitter a few days ago, Russia’s involvement in amplifying propaganda aimed at fostering isolationism and undermining public support for continuing to provide aid to Ukraine is just a small part of its long-running information war targeting the West. Russia is quite effective at identifying and exploiting existing tensions and divisions in the U.S. and other Western nations, often using a combination of information-laundering, cross-platform persona accounts, outward-facing propaganda sites like RT and Sputnik, and paid local media. This has allowed them to continue engaging in the narrative wars even as many of their official accounts have been banned or restricted on major social platforms due to their invasion of Ukraine and subsequent involvement in war crimes and human rights violations. Russia is known to have a large number of dormant social media accounts on various platforms that they can (and do) activate when necessary, for a variety of purposes, including posing as citizens of other countries and seeding pro-Russia narratives in a way that looks homegrown and local.
For example, if Russia wanted to spread a conspiracy theory to blame the U.S. military for starting the Hawaii fires by experimenting with directed energy weapons, they may deploy networks of fake personas to start expressing concerns about this on social media, knowing that it will eventually get picked up by local bloggers and fringe media. Or, they may use their extensive network of proxy websites — i.e., sites that are not obviously Russian state media, but are linked to Russian intelligence — to publish ghostwritten articles that are sometimes even authored by personas created by Russia’s military intelligence agency. In a similar form of information-laundering, Russia may publish an article about directed energy weapons through a proxy website or state media outlet, then wait for people to start commenting about it on social media, and then use those social media comments as the basis to write articles about how Americans are worried that the Hawaii fires were started by directed energy weapons controlled by the military. By laundering Russian propaganda so that it appears to be homegrown, local, and organic, Russia is able to manufacture more convincing, persuasive narratives — and in many instances, those narratives are then echoed by the likes of Fox News, Tucker Carlson, Tulsi Gabbard, etc., at which point Russia knows that it has succeeded in injecting foreign propaganda into the bloodstream of America.
Speaking of which, Global Research — a Russian proxy website linked to Russian intelligence — has suddenly published a flurry of articles claiming that the U.S. military is experimenting with directed energy weapons as part of a “climatic warfare” agenda.
Conspiracy theories about directed energy weapons and manipulated images purporting to show such weapons have flooded social media in recent days, with users making baseless assertions that they were the cause of the Hawaii fires. Most of the viral images purportedly showing Hawaii in 2023 have been circulating for years, and many have been used in previous disinformation campaigns — once again highlighting the frequent use of information laundering as a tactic of online deception, and the close ideological alignment between the Kremlin and the far-right in America.
Of course, there’s room for legitimate criticism of the government’s response to the Hawaii fires. Survivors describe the response as “slow, inadequate, and uncoordinated,” according to the Washington Post.
“Some are struggling to find housing and daily necessities. Others say they lack medical aid, generators and transportation to recovery centers to hear news of their missing loved ones,” WaPo reported. “It’s often not clear who is in charge among local officials, the National Guard, the Coast Guard and the Federal Emergency Management Agency, some say.”
But these failures aren’t because the U.S. is providing support to Ukraine, and the U.S. isn’t withholding money from people in Hawaii so it can go to Ukraine instead. In fact, according to the Ukraine Support Tracker, only 24% of U.S. aid to Ukraine is purely financial, while 43% is military aid in the form of equipment and weaponry. Hawaii doesn’t need military equipment and weapons — it needs disaster response, and we’ve known for years that the Federal Emergency Management Agency (FEMA) is woefully unprepared, under-resourced, and understaffed given the increasing rate and severity of natural disasters in the U.S. Beyond these problems, FEMA’s policies mean the agency often fails to help those who need it the most — and again, we’ve known about this for years.
Furthermore, as we saw during COVID, there is often a lack of coordination and mutual understanding between local, state, and federal agencies, which slows down the entire process and often leaves local agencies — those with the most experience, familiarity, and knowledge of the situation on the ground — little choice but to sit and wait for the federal government to arrive, and hope that it brings the right supplies and people. Of course, getting those supplies to an island nearly 3,000 miles away also complicates things just a bit.
This is exactly the type of nuance that disinformation seeks to eliminate. Disinformation thrives in a world of artificial simplicity, where people are either villains or heroes; where the choices people make are either all good or all bad; where context doesn’t exist, and where history can simply be rewritten to align with the latest narrative. Of course, that’s not the real world. But disinformation artists aren’t interested in the real world, nor do they really care about helping the people who live in it — whether they’re in Hawaii, Ukraine, or the house next door. Oh, they’ll happily exploit the crisis, but don’t expect them to stick around to help. Tragedy is too real for them, as are the complexities of the real world and the humans living in it. But in the land of disinformation, we can just write those things right out of the history books — kind of like Russia is doing with those war crimes.