
Don’t blame Russian trolls for America’s anti-vaxx problem. Our misinformation is homegrown

I blew the whistle on inauthentic behavior at Facebook. But authentic misinformation is the bigger problem in the west


By Sophie Zhang | The Guardian | August 18, 2021


Anti-vaccination protesters in Los Angeles. Photograph: David McNew/AFP/Getty Images


On 18 May 2021, German YouTuber Mirko Drotschmann tweeted an unusual message: a marketing agency was asking him to share allegedly leaked documents on Covid-19 vaccine deaths. Within a week, French YouTuber Léo Grasset shared similar news. News reports followed: Fazze, a London-based marketing firm with ties to Russia, was offering money to influencers to falsely disparage a Covid-19 vaccine.


This month, Facebook announced that it was banning Fazze. In addition to bribing influencers, Fazze had created misleading anti-vaccine content off-platform and used fake accounts to spread it on Facebook.


Before we continue, let’s clear up some common confusion between inauthentic behavior and misinformation. Misinformation refers to what someone says: “The Earth is flat” is misinformation regardless of who says it. Inauthenticity is about the identity of the speaker: if 1,000 fake accounts say “The Earth is round”, this is still inauthentic.


Facebook banned Fazze not because of its message but because of the shady methods the firm used to spread it. In contrast, users who spread misinformation authentically are generally left alone by Facebook.


The Fazze campaign and Facebook’s takedown resulted in significant media coverage of Russian disinformation and lent credence to the narrative that Russia is an important source of the anti-vaccine propaganda that floods social media.


But the Fazze campaign was a failure. Memes spread by Fazze “received few if any likes, and some were ridiculed by real people … the operation’s Instagram posts attracted around 1,000 likes combined, with most receiving zero”, according to Facebook. Attempts by Fazze to recruit influencers resulted in the campaign’s exposure; ultimately only two influencers signed up.


Meanwhile, the same week that Facebook banned Fazze, a video of an Indiana physician, Dr Dan Stock, making false or misleading claims about masks and vaccines at a local school board meeting went viral on social media. The Stock video racked up more than 92m engagement actions on Facebook – at least a thousand times more than Fazze’s campaign. But as a real person expressing his authentic views, and as an American speaking to fellow Americans, Stock appears to have garnered significantly less concern – and media coverage – than the ineffective but Russian-backed and inauthentic campaign.


When I worked at Facebook, I spent two and a half years combating inauthentic behavior; I was responsible for Facebook taking down inauthentic campaigns by two national governments, and I became a whistleblower because Facebook was unwilling to prioritize my findings. It’s ironic, then, that today I’m arguing that the west is too focused on inauthentic behavior and not enough on the harm done by people acting authentically. (My most important work was in the global south, where governments act with impunity, with serious consequences.)


If Fazze was directed by the Russian government – as many suspect, though it remains unproven – I would argue that the campaign has been a success for Vladimir Putin. The media attention on Fazze played into the incorrect belief that Russia is responsible for significant amounts of western misinformation, inflating the perception of its power and influence.

This gives Putin too much credit. As the Stock video demonstrates, the misinformation is coming from inside the house.


The Fazze case illustrates that inauthentic behavior can receive attention disproportionate to impact. Furthermore, the media has frequently covered questionable allegations of inauthentic behavior, misleading news consumers about the most likely source of misinformation. Days before the 2019 British election, a researcher alleged that fake accounts were spreading misinformation, but our investigation found that real people were responsible. And in February 2020, when news outlets suspected a North Carolina Facebook page to be Russian interference, we found it to be run by a real American pretending to be a Russian troll.


Ultimately, by focusing on inauthenticity in the western world, the media and Facebook have been refighting the last war of 2016 – a focus with consequences. After my departure, Facebook failed to inhibit the Stop the Steal movement, which falsely alleged that Donald Trump had won the 2020 election, because the company could not determine whether the movement was “a coordinated effort to delegitimize the election” or “free expression by users”. But misrepresentation and inauthentic behavior are only two of 26 community standards. Facebook explicitly bans coordinating or advocating harmful or criminal activity, but it nevertheless failed to stop extremists who organized on Facebook to storm the US Capitol on 6 January. “We learned a lot from these cases,” Facebook staff wrote in a post-mortem. But the damage to society and the rule of law had already been done. Today, Facebook is probably trying to figure out how to stop a repeat of 2020. Four years ago, it sought to avoid repeating the mistakes of 2016. Facebook must learn to respond more flexibly to new threats, or it will be conducting another post-mortem in 2025.
