A new study looking at the impact of Russian interference in the 2016 election is making headlines — but unfortunately, most of them aren’t accurate. The study, published by NYU’s Center for Social Media and Politics, found that exposure to tweets from known Russian influence accounts (associated with Russia’s Internet Research Agency) was not related to changes in attitudes, polarization, or voting behavior during the 2016 election.
So what exactly does this mean? We’ll get to that in a minute. First, let’s talk about what this doesn’t mean.
Take a look at the headlines below.
Citing the new NYU research, these headlines — including several from major news outlets like The New York Post, The Intercept, and Fox News — claim that the study debunked the narrative about “Russian bots” influencing the 2016 election.
So what’s the problem?
Well, to begin with, the study didn’t look at “Russian bots” — so it certainly didn’t debunk any narratives about them. We’ll get to the details about what the study did look at in a minute, but for now, it’s important to understand that the word “bots” is not just a catch-all for accounts that engage in nefarious or suspicious behavior. A bot (short for robot) is an account that uses automation software to perform certain tasks, and although previous reporting suggests that bots were used as part of Russia’s 2016 influence operation, this study didn’t purport to look at the impact of bots. In fact, the word “bot” doesn’t appear even once in the paper.
On social media, the study was even more wildly misinterpreted. Many pundits and politicians who have spent the past several years denying that Russian interference happened at all held up this study as “proof” that they were correct all along, and that the media had colluded with Democrats to lie to the American people about Russia’s influence operation. Of course, the study made no such claims, nor did its findings support those bold assertions, but that wasn’t going to stop a good narrative from going viral — especially with the latest “Twitter Files” release from Matt Taibbi providing even more false vindication for so-called “Russiagate skeptics.”
Despite what Matt Taibbi, Aaron Mate, Gateway Pundit, Fox News, Breitbart, and others claimed, the study did not show that Russian interference was a hoax, nor did its findings suggest that the media or Democrats lied about the extent of Russia’s influence operation.
So what did the study actually find? Let’s break that down.
The study retrospectively looked at the relationship between exposure to tweets from Russian foreign influence accounts (the independent variable) and attitudes, polarization, and voting behavior (the dependent, or outcome, variables). It found that exposure to tweets posted by Russia’s Internet Research Agency (IRA) was heavily concentrated — just 1% of users accounted for an estimated 70% of exposures — and that people who identified as “Strong Republicans” were exposed to about nine times as many of these posts as those who identified as Independents or Democrats. The study also found that exposure to IRA tweets was not associated with changes in polarization, attitudes about issues such as immigration, healthcare, and banning Muslim people from entering the U.S., or voting behavior.
The study design and methods were appropriate for answering the research questions that the study set out to investigate, but the problem is that those research questions don’t necessarily tell us much about the actual real-world impact of Russia’s influence campaign. That’s because the primary research question — whether exposure to known IRA tweets influenced polarization, attitudes, or voting behavior — is extremely narrow in scope, to the point of not being generalizable to the influence campaign as a whole. The study does a fine job of answering the questions it asked, and the authors acknowledged the limitations of their findings. However, I am personally not aware of anyone who put forth the argument that mere exposure to several dozen tweets a day for a few months was enough to change attitudes or behavior. In other words, the study answered a question that not many people were really asking. With that said, this article isn’t meant as a critique of the research as much as a rebuttal to the misinterpretation of it.
Let’s take a closer look at why it would be wrong to suggest that this study proves that Russian interference had no impact.
For starters, the analysis only looked at exposure to Twitter posts, not any other part of Russia’s multifaceted malign influence campaign. It didn’t look at the impact of targeted harassment, nor did it consider other social media platforms, even though we know Russia’s influence campaign was active on Reddit, Facebook, YouTube, Instagram, and other platforms like Tumblr and (indirectly) Pinterest, as well as on proxy websites targeting specific groups of Americans (like Veterans) and on the ground in the form of rallies and infiltration of activist groups and protests.
The Internet Research Agency used a variety of tactics to influence social media discourse and increase divisiveness, polarization, and group conflict. Political advertisements were just a small part of this effort: organic posts made up the majority of the IRA’s content and were also its most far-reaching activity. Some of the most widely used strategies among IRA accounts included promoting sensationalist and conspiratorial content, encouraging extreme right-wing voters to be more confrontational, and fostering an online engagement style characterized by hostility, antagonism, trolling, and harassment. According to one former IRA employee, “Our task was to set Americans against their own government: to provoke unrest and discontent.” They did so across nearly every social media platform, often working in a coordinated manner to seed disinformation on fringe platforms and then push it into mainstream news coverage.
As described in a February 2018 indictment, Russian operatives working at the Internet Research Agency were tasked with specific activities to ensure that the influence operation had maximum impact. For example, according to the DOJ, the Internet Research Agency had an entire department devoted to search engine optimization (SEO) and another department devoted to data analysis—so as one group of operatives was working to manipulate algorithms and push selected content to the top of news feeds and search results, another group was analyzing the effectiveness of these strategies and providing feedback to develop better methods of amplification and reaching target groups.
In another department at the Internet Research Agency, employees were tasked with designing and producing graphics, which were then posted on social media and uploaded to websites where others could find and disseminate them. Kremlin operatives pumped out thousands of memes during the 2016 campaign, sharing them on Reddit, Instagram, Pinterest, Twitter, and Facebook, among other online platforms. Some of these memes specifically targeted political candidates, while others sought to sow divisions surrounding social issues like racism and guns. In some instances, seemingly benign memes were used to draw people in and foster a sense of community, so that they would be more receptive to future, pro-Kremlin messaging — a process known as priming, which was used throughout Russia’s influence operation.
Clint Watts, senior fellow at the Center for Cyber and Homeland Security at George Washington University and a Foreign Policy Research Institute fellow, described the dynamics of Russia’s full spectrum influence operations in his testimony before the Senate Judiciary Committee’s Subcommittee on Crime and Terrorism in October 2017:
“[A]n anonymous forgery placed on 4Chan can be discussed by Kremlin Twitter accounts who then amplify those discussions with social bots. Then, a Russian state sponsored outlet on YouTube reports on the Twitter discussion. The YouTube news story is then pushed into Facebook communities, amplified through ads or promoted amongst bogus groups. Each social media company will see but a part of the Kremlin's efforts. Unless all of the social media companies share their data, no one can fully comprehend the scope of Russia's manipulation and the degree of their impact.”
As Watts explained, cross-platform activity is one of the defining features of Russian influence campaigns. It’s not possible, in other words, to accurately measure the impact of such a campaign by singling out only one platform or one tactic, because they all work together to produce a whole that is greater than the sum of its parts.
Looking at the diagram below — which shows a visual representation of Russia’s 2016 influence campaign — Twitter is the small part circled in purple. That’s what the NYU study included in its analysis. All of those other platforms, tactics, and pathways of influence were left out — a limitation that the study authors openly acknowledge.
Previous reporting on the Internet Research Agency also describes how employees worked in teams, engaging with one another and pulling other users into their pro-Kremlin dialogue. This is critical, as it indicates that Russian accounts weren’t simply passively posting tweets for others to engage with, but were actively involved in eliciting engagement and interacting with other accounts. This dynamic was not explored in the new study.
The study also did not look at the impact of Russia’s “hack and leak” operation in which Russia’s military intelligence service (GRU) breached the computer systems belonging to the Democratic National Committee (DNC), the Democratic Congressional Campaign Committee (DCCC), and the Hillary Clinton campaign, and then leaked stolen documents and emails through DCLeaks, Guccifer 2.0, and Wikileaks. Nor did it consider the weaponization of those documents and emails — arguably one of the most important parts of Russia’s influence campaign. Nor did it consider the potential impact of forged documents, believed to be planted by Russian intelligence, that were spread online by Trump allies like Roger Stone.
Perhaps most importantly, the study did not look at the myriad indirect effects, interactions, and network/group dynamics stemming from Russia’s influence campaign. We know, for example, that many American Trump supporters adopted social media tactics that mirrored Russia’s influence activities. We also know that the presence of Russian Twitter accounts modified the composition and activity of entire networks, including by turning Bernie Sanders’ supporters against Hillary Clinton and encouraging them to harass her supporters online — a dynamic that only exacerbated the toxic misogyny that pervaded the 2016 election. Research also shows that Russian social media accounts played (and continue to play) an important role in extremist networks on fringe platforms like Gab and 4chan. Even more importantly, Russia’s influence operation had a significant impact on US media coverage, both directly and indirectly. According to a study by the Columbia Journalism Review that looked at U.S. news coverage from 2015 to 2017, nearly every major news outlet in the U.S. (32 out of 33) used IRA tweets as sources of partisan opinion to cite in stories. More broadly, issues related to Russian influence dominated the U.S. news cycle, which meant that anti-Hillary Clinton sentiment was front-and-center for much of the campaign season, while other issues that were relevant to voters got pushed to the sidelines.
It’s also important to note that the participants in the study who were exposed to IRA tweets were strong partisans (Republicans), and thus unlikely to be easily swayed. However, individuals exposed to Russian propaganda and disinformation on other platforms and through the media may have been more susceptible. And finally, the study’s outcome variables were measured in April and October 2016, which captures a critical window at the height of Russia’s influence operation — but misses some of the most consequential events, including FBI Director James Comey’s Oct. 28, 2016, announcement that the bureau had re-opened the investigation into Hillary Clinton’s use of a private email server. Numerous reports suggest that Comey’s handling of the investigation — and the public announcements surrounding it — may have been influenced by a dubious document believed to be a Russian forgery, which subsequently had a major impact on media coverage in the final days and weeks leading up to the election. And as evidenced by the tweet below, Russian accounts seized on the opportunity.
So what can we conclude here? Well, we know from past research that influence campaigns on social media can be quite effective at swaying public opinion, mobilizing protesters, and facilitating political participation of various types, and previous studies have found evidence that exposure to IRA accounts did result in significant behavioral changes in the online environment. Research also suggests that the impact of Russian influence campaigns varies by platform, and may be stronger on certain platforms, such as Reddit and 4chan, compared to Twitter. And still other research suggests that Russian influence campaigns may indeed impact voters, though the evidence is indirect. For example, one recent study from Columbia University School of International and Public Affairs looked at online betting markets and found that, on Russian holidays — when Russian “trolls” were less active — market odds favoring Republicans dropped, while odds favoring Democrats peaked. Indirect channels of Russian influence such as this “may be the most challenging and important ones to understand,” the authors concluded.
It’s also worth asking whether Russia would really spend $1.5 billion/year on disinformation and propaganda if they didn’t have some reason to believe that this investment would pay off.
Of course, those looking to rewrite history will continue to do just that, and Matt Taibbi will continue to falsely claim that this study is being suppressed even though it was covered by the Washington Post, NY Post, Fox News, The Intercept, RT, and more. Clearly, Taibbi and his fellow travelers know the power of information and disinformation. In fact, as Taibbi has crafted his “Twitter Files” narrative, he has even borrowed a few tactics that bear striking resemblance to those used by Russia, as I pointed out last week.
To be clear, I don’t think Taibbi is a secret Russian agent or anything. I think he just recognizes an effective strategy when he sees one. Of course, this does raise an interesting question: If Taibbi really believes that Russia’s influence tactics are so unsophisticated and ineffective, then why is he borrowing from their toolkit?