Should you spend time arguing with the anti-vaccine lobby on the internet? After all, someone is always wrong on the internet and duty calls.
A paper analysing a retrospective sample of 1.8 million tweets suggests caution is required.1 David Broniatowski and colleagues examined vaccine-related tweets in this sample, comparing how ordinary users posted vaccine messages with how trolls and bots did. Trolls and bots produced vaccine material more often than ordinary users, sowing conflict and disinformation among other users. Much of this material was sourced from Russia.
The use of disinformation, or as the Russians call it dezinformatsiya, is not new. Soviet disinformation started back in the 1920s and continued throughout the Cold War. After the Cold War ended, it found new life under Putin and his troll factories.
“The term dezinformatsiya denoted a variety of techniques and activities to purvey false or misleading information that Soviet bloc active measures specialists sought to leak into the foreign media. From the Western perspective, disinformation was a politically motivated lie, but Soviet bloc propagandists believed their disinformation campaigns merely highlighted greater truths by exposing the real nature of capitalism.”
Dezinformatsiya is less about the evils of capitalism these days, and more about Russian nationalism. Modern-day disinformation attempts include provocations aimed at regime change in Montenegro, spinning concern about suspected Russian chemical attacks in the UK as hate speech against Russia, and accusing Finland of putting Russian children in prison. That is before you even get to Russian involvement in the 2016 US presidential election and the Brexit referendum.
Russian attempts to create discord and polarisation are endemic. In Ukraine, disinformation was a key component of the Russian annexation of Crimea and of the continuing separatist narrative there. If there is a fracture in your society, the Russians will be there prying at it, handing out free icepicks to anyone willing to lend a hand. They have had great success.
In 2016 in the US they even managed to organise a real-world Black Lives Matter protest using Facebook, and in an act of genius also organised the counter-protest against it. In 2017, a photograph of a Muslim woman walking past the victims of a terrorist attack in the UK, malignly characterised as showing her indifference, was widely circulated and commented on in a very polarised debate.
The provenance of the initial tweet that created a maelstrom on social media and eventually broke through to the mainstream media? A Russian troll account.
The ensuing outrage, from those with good and bad intentions alike, served the bot's intention of spreading the disinformation wider. Russian disinformation often seeks to stoke debates on race or immigration in order to destabilise the political establishment in Europe and the US, echoing old Soviet tactics. That pro-Russian populists, of the far right and far left, may benefit from this is no surprise.
Russian disinformation is a complex operation, ranging from so-called troll farms and bots to the Twitter feeds of official embassies and Russian-funded media such as Sputnik or Russia Today. The latter has a veneer of respectability, with political figures, including the current leader of the UK opposition, willing to appear as guests.
To counter Russian dezinformatsiya, the EU set up a dedicated task force. Its website, EU vs Disinfo, states that it recorded 3,800 disinformation cases between September 2015 and spring 2018, and it provides numerous examples of recent disinformation, including disinformation on health issues such as vaccination.
One of the worst cases of health-related disinformation occurred at the tail end of the Cold War, when the Soviets sought to exploit the HIV/AIDS crisis in Operation Infektion.2 In 1983, a letter, almost certainly written by the KGB, was published in an Indian magazine suggesting that HIV had been developed by the US government. Disinformation was created and planted outside Russia, and then built upon. The KGB were experts in exploiting real concerns, such as the Tuskegee experiment, and converting them into wider conspiracies. Over the remainder of the 1980s the story was amplified through various media outlets, eventually outliving the KGB's own use of the conspiracy. A survey in 2005 found that 50% of African Americans believed HIV was man-made, with over 25% believing it had been made in a government lab. In sub-Saharan Africa the conspiracy mutated, leading to suggestions of “AIDS-oiled” condoms from the US, and to Robert Mugabe calling AIDS a “white man’s plot.”
Similarly, modern-day anti-vaccine material can cause harm. In Europe, more than 41,000 people were infected with measles in the first six months of 2018, leading to 37 deaths. Pushing anti-vaccine propaganda has a price, and disinformation can now be planted directly into the social media feeds of whole populations. There is evidence that tweets with more emotional or moral content attract more engagement on Twitter, so it is no surprise to find that many of the tweets identified by Broniatowski et al1 are designed to be provocative, touching on racial, ethnic and socio-economic issues:
“#VaccinateUS messages included several distinctive arguments that we did not observe in the general vaccine discourse. These included arguments related to racial/ethnic divisions, appeals to God, and arguments on the basis of animal welfare. These are divisive topics in US culture, which we did not see frequently discussed in other tweets related to vaccines. For instance, “Apparently only the elite get ‘clean’#vaccines. And what do we, normal ppl, get?! #VaccinateUS” appears to target socioeconomic tensions that exist in the United States. By contrast, standard antivaccine messages tend to characterize vaccines as risky for all people regardless of socioeconomic status”
The accounts also posted pro-vaccine messages; the trolls’ intention was to keep the discussion going. Human users would be retweeted by the troll accounts, so benign pro-vaccine messages by non-trolls would feed the trolls, “giving the false impression of legitimacy to both sides”. The paper argues that reducing pro-vaccination tweets in these engagements might make less anti-vaccine content visible overall, and that public health messages need to be carefully crafted to avoid strengthening such anti-vaccine material.
However, it is a game that many will still play; people with the best of intentions will feed the trolls. Knowing that the game exists is at least a start.