As volatility stemming from the pandemic continues into 2022, uncertainty prevails in the digital landscape, with authoritarian regimes and state and non-state groups posing significant threats online. PGI examines five key trends that are likely to shape online threats in the year ahead.
Growing hybrid extremism
The detection and removal of extremism and disinformation on social media will remain a persistent challenge. Just as the line between free speech and hate speech will be continually contested, so too will the boundaries between different extremist narratives themselves.
In recent years, political violence and civil unrest have been carried out and incited by individuals with diverse belief systems, rather than by adherents of homogenous extremist ideologies alone. Many web-based extremists no longer stick to strict categories of thought, instead espousing hybrid worldviews that draw ideas and grievances from across the political spectrum. Narratives and terms previously thought to be separate now often co-occur in digital spaces, bringing together UFO theories with white-genocide conspiracies, incel ideology with neo-Nazi accelerationism, and historic antisemitic conspiracy theories with contemporary vaccine hesitancy and anti-lockdown sentiment.
The global pandemic has undoubtedly facilitated this process by providing disparate extremist worldviews with a common grievance around which to build conspiracy theories about COVID-19. A major task for 2022 will be to scrutinise the aspects of social media and platform migration that allow these divergent groups to engage in the same social spaces and to influence one another's beliefs. This diversification of thought broadens the range of those vulnerable to extremist concepts, with fringe and seemingly apolitical conspiracy theories carrying a heightened risk of introducing more people to hard-line and violent extremist narratives.
Increasingly organic influence operations
In 2022, influence operations (IOs) will continue to evolve and become more sophisticated. The logical progression of IOs over time has been to transition from entirely inauthentic (fake content written by fake authors, hosted on fake media entities, amplified by fake accounts) to entirely authentic (real content written by real people, hosted on real media entities, amplified by real organic users).
The problem here is twofold: not only are organic IOs generally more impactful than inauthentic ones, they are also more difficult to detect. Without inauthentic points of content, behaviour, or infrastructure, they appear to merely constitute an idea going viral. You don't need a network of fake accounts to promote vaccine hesitancy when you have a dedicated audience of real people ready to spread that content for you.
This organic process is made possible by a polarised information environment, of the kind now plainly found across much of the world. From elections in France, to the US midterms, to emergent conflict in Ethiopia, to continued conspiracy theories around vaccines, fractured information environments make it easier for threat actors to manipulate audiences through authentic channels.
Unchecked autocrats on social media
Authoritarian regimes and autocratic leaders have increasingly engaged in manipulation on social media to maintain control over public life domestically and target international audiences through disinformation campaigns. In particular, autocratic leaders are able to amplify their messages consistently to their supporters while simultaneously marginalising opposition voices online.
Autocratic governments such as China have engaged not only in traditional methods of digital autocracy, namely censorship, but also in flooding domestic and foreign platforms with pro-government content. This not only restricts the flow of information but makes public debate significantly more difficult, with detractors unable to contend with the volume of spurious claims, irrelevant information, and targeted attacks.
The sheer volume of disinformation these governments and individual leaders are able to seed online makes it difficult for social media platforms to effectively police these actions. In turn, these limited and ineffective responses to disinformation campaigns are likely to embolden autocrats, particularly those seeking re-election or looking to maintain their grip on power in 2022.
Continued anti-democratic internet shutdowns
In 2021, authoritarian governments routinely enacted complete and partial internet shutdowns, curtailing access to information and the ability of opponents to organise. PGI expects this to remain a key tool in the arsenal of authoritarian regimes for maintaining control in the year ahead, as the penalties and costs for these states remain relatively limited.
According to the UN, internet shutdowns are becoming more frequent and lasting longer when they occur. For example, the military government in Myanmar conducted protracted shutdowns in the weeks following the coup in February 2021 and continues to curtail access to social media websites in the country.
Shutdowns are likely to be a first move, rather than a last resort, as governments face mass unrest, as seen in Kazakhstan in early January 2022. Governments are also likely to invoke the threat of mass unrest and the alleged spread of disinformation to justify complete or partial shutdowns in the lead-up to elections, curtailing opposition activity.
Diaspora involvement in conflicts
Social media continues to provide significant opportunities for diaspora groups to maintain an active influence in conflicts taking place in their former homelands. While the involvement of diaspora groups in this manner is nothing new—nor necessarily nefarious—social media platforms often make the process of raising funds for militias, sharing disinformation and hate speech, and lobbying governments significantly easier.
In 2021, Ethiopian influencers—sometimes paid by the national government—attempted to raise funds for militia groups linked to their various ethnic groups. Similarly, amateur nationalist commentators supportive of the Myanmar junta routinely use social media to promote pro-military narratives to Burmese nationals within the country and abroad.
These trends are likely to continue, if not accelerate, in the year ahead as these conflicts persist. Autocratic governments and militia groups in particular have found diaspora influencers to be useful and relatively inexpensive tools for spreading their messages abroad. Individual partisans without direct ties to a militia or government can also lend significant legitimacy to a group's struggle online, seemingly rallying support for their faction organically. These factors suggest that diaspora involvement in conflicts, facilitated by social media platforms, will persist into the year ahead.
PGI's Digital Investigations Team combines modern exploitation technology with deep human analytical expertise covering the social media platforms themselves and the behaviours and intents of those who use them. Our experienced analysts have a deep understanding of how various threat groups use social media and follow a three-pronged approach focused on content, behaviour and infrastructure to assess and substantiate threat landscapes.
We monitor in a range of languages, dialects and scripts, and draw on our geopolitical intelligence background to provide contextual analysis and frame risks within specific geopolitical contexts. Our combined technical and geopolitical approach, complemented by extensive and niche language capabilities, allows us to provide a layer of region-specific intelligence and insight alongside technical artefact data.
Contact us to talk about how we can help you counter online threats.