Godwin’s Law

Digital Threat Digest


Whenever an analyst at PGI uses the term ‘fake news’ in a report, out comes the red pen. Part of the problem is that there isn’t really a fixed definition of ‘fake news’; it can cover everything from satire to sensationalism to clickbait and, as such, is too imprecise to use in any meaningful analysis. The connotations of satire in particular also diminish the severity of the concept, evoking the humour of The Onion or general harmless misreporting rather than addressing the problem at hand. ‘Disinformation’ as a term is colder and more precise, and conveys part of that implicit severity that comes with intent to deceive. The way we talk about these issues matters, because mainstream media describing something as ‘fake news’ doesn’t make it accessible; it masks the brutality and extremity of a lot of this content. It also places too much emphasis on isolated pieces of content and ignores that narratives are but one constituent part of sophisticated efforts to manipulate online audiences at scale. An article describing a fake leak that Trudeau will turn into a Transformer to tackle the Freedom Convoy is undoubtedly ‘fake news’. Labelling an article describing sustained and well-financed campaigns to allege that no civilians were massacred in Bucha as ‘fake news’ is demeaning.

To pivot slightly, there are many rules of the internet, formal and informal, written and unwritten. Godwin’s Law states that the longer a discussion is online, the more likely it is that someone will bring up Adolf Hitler. It doesn’t matter whether the discussion is about supercars or the politics of the South China Sea – discussions invariably descend into comparison to the Nazis. Chatrooms, discussion boards, comments on BBC Good Food recipes, it doesn’t matter – someone’s going to be called a Nazi, usually by the party losing the argument. Godwin’s Law is typically confined to the internet, because hitting the Nazi button is a fairly nuclear option in a debate. Sergei Lavrov, Russia’s Minister of Foreign Affairs, bucked this trend when he sought to further propagate Russian efforts to frame the invasion of Ukraine as a process of de-Nazification in asserting that ‘the most rabid antisemites tend to be Jews’ and ‘Hitler also had Jewish blood’.

These claims aren’t so much ‘fake news’, or even ‘disinformation’, as they are an effort to rewrite both history and reality in real time. Rules of the internet are inevitable, but rules of engagement only function when both parties follow them. Russia has demonstrated its refusal to follow the laws of war in the physical world, so it shouldn’t be surprising that it has crossed the threshold in the accompanying information war.

There are multiple problems here, and they’re all intertwined. There are morally repugnant people on the internet, much as there are morally repugnant people in real life. A subset of conspiracy theorists fall under that classification: not the ones who believe that Avril Lavigne was replaced with a clone, but those who believe in white nationalist Great Replacement theories, or that the Sandy Hook shooting used crisis actors, or who adhere to antisemitic blood libel conspiracies, or deny the Holocaust happened. These theories are bad, but so long as they’re as far from the mainstream as can be, they’re basic bad. When they are weaponised by threat actors, dragged up from the depths of the internet and mainstreamed by the highest level of politician, they become complex bad.

Antivaxxers largely confined themselves to health food shop noticeboards prior to the mainstreaming of the vaccine debate during the Covid-19 pandemic. US militias largely confined themselves to cosplaying Rambo in the forests of northern Michigan prior to being fed soundbites of election fraud by cable news. Now Lavrov has made a play at Holocaust denial, questioning what for so long was as close to a globally accepted universal truth of an atrocity as we’ll ever get. As a society we tolerated the antivaxxers, but it took just a few short months to see that the weaponisation of antivaxx movements was far more a sustained influence operation than mere ‘fake news’. Checks and balances and institutional integrity just about held as militias stormed the Capitol, but now the US political landscape is more polarised than ever. Both cases show that the world is weary, that digital resilience is weak, and that entire generations are susceptible to manipulation at scale. Maybe emerging generations who were born in the echo chamber will see through it, but maybe they won’t. The Russian invasion of Ukraine and the accompanying disinformation campaigns are further threatening the integrity of the general information environment, and now Lavrov has unleashed the Holocaust deniers, offering them a seat at the table of mainstream discourse.

There’s only so long that a population under siege can hold out. Food supplies in physical conflict run low. Resilience is finite in an information war.

PGI’s Digital Investigations Team brings you the Digital Threat Digest, get SOCMINT and OSINT insights into disinformation, influence operations and online harms, straight to your inbox.

PGI’s Social Media Intelligence Analysts combine modern exploitation technology with deep human analytical expertise covering the social media platforms themselves and the behaviours and intents of those who use them. Our experienced analyst team have a deep understanding of how various threat groups use social media, and follow a three-pronged approach focused on content, behaviour and infrastructure to assess and substantiate threat landscapes.

Ready to get started? Speak to one of our experts.

If you have any questions about our services or would like to learn more about our consultants here at PGI, please get in touch with us and speak with one of the team, call us on +44 (0)845 600 4403 or email us at sales@pgitl.com
