‘The Firehose of Falsehood’
Russian disinformation tactics have changed little since the Soviet era; the strategies deployed against adversaries then are almost identical to those used today. At the heart of these tactics is a goal of creating chaos, confusion, and controversy. We saw it in the press conference organised by the Russian Ministry of Defence in the days after the downing of MH17. Officials claimed they did not know who downed the aircraft and suggested that no one could know. Instead, they offered viewers several possible explanations – all of them contradictory. Seems weird? Quite genius, actually. While viewers got no answers from the press conference, the Russian government succeeded in doing one thing – confusing everyone.
Today, as the war in Ukraine continues, these tactics persist. When the Russian Air Force bombed a maternity hospital on 9 March, pro-Kremlin voices deployed the ‘crisis actor’ conspiracy theory, which portrays victims of violence as paid performers in order to suggest that the event never happened or was entirely staged. This goes beyond ‘false flag’ accusations, which misrepresent motives; instead, it aims to delegitimise the source of the information – in this case, the on-the-ground journalists photographing people after the bombing. Of course, there is a kernel of truth to the story (this is how they confuse people): the woman accused of being a ‘crisis actor’ is indeed a beauty influencer and model, BUT she is genuinely pregnant and really was there when the hospital was attacked. Because she is a model, the Russian government was able to sow at least some doubt about whether the situation was what it seemed.
The success of these tactics also depends on this confusion and doubt being amplified in the media. RAND calls this Russian model of disinformation ‘the firehose of falsehood’ because it is a) dependent on large numbers of people sharing partial truths or pure fiction and b) repetitive and unconcerned with consistency. So, when the Russian Embassy to the UK tweets that the attack on the hospital was staged, or when several sock-puppet accounts on Twitter share the ‘crisis actor’ story, it spreads, it confuses, and it works.
These tactics not only confuse people; they create an illusion of truth that can be hard to break. After all, people are lazy: faced with constant information overload, they rarely take the time to research, review, and check the source of a story. Even when information initially assumed to be valid is later proven false, people tend to remember only the initial headline. And so the good old belief that ‘the truth always wins’ no longer holds, because if one thing is certain these days, it is that the truth almost never wins.
PGI’s Social Media Intelligence Analysts combine modern exploitation technology with deep human analytical expertise covering the social media platforms themselves and the behaviours and intentions of those who use them. Our experienced analyst team has a deep understanding of how various threat groups use social media and follows a three-pronged approach – focused on content, behaviour, and infrastructure – to assess and substantiate threat landscapes.