
The three constraints on Influence Operations - Digital Threat Digest

PGI’s Digital Investigations Team brings you the Digital Threat Digest: SOCMINT and OSINT insights into disinformation, influence operations, and online harms.


To most people, online influence operations involve competing ideologies battling it out in the public sphere. They recognise that this conflict is often lopsided, with certain ideas better attuned to our base human instincts and to algorithmic virality. This isn’t necessarily wrong, but I think it tells only a small part of the story.

If you break these operations and campaigns down into their constituent pieces, you find they are a function of three constraints: objectives, resources, and time. To justify its existence, an operation must first have a concrete and measurable goal, with clear metrics of success or failure. Next, operators must consider their available resources, from bots and software tools to hardware and staff. Lastly, they must account for time: operations have a window of viability before the sociopolitical context evolves and renders the campaign irrelevant. In this way, a person or entity engaging in these propaganda campaigns functions just like any business or similar organisation, which is bound by the same fundamental constraints.

A recent article in Wired struck me as a great example of these three constraints in action. The author shows how businesses have emerged around US intelligence agencies to connect them to the vast repository of ad data collected by apps and digital platforms. The article describes how terrifyingly granular this information can get, and how organisations such as the CIA could use it in nearly all of their missions. And because the data is technically public, there is no reason it can’t also be exploited by Chinese or Russian authorities.

Nevertheless, the development and use of “ADINT” operate according to the framework I’ve outlined. Private companies connect intelligence agencies to online ad data markets, which in turn widens the scope of any potential future influence operation. It increases the information resources at the agencies’ disposal and allows them to broaden or deepen their existing objectives. Furthermore, the detail and real-time feedback of this data extends the window in which their campaigns remain effective.

Ultimately, these tools will lose some of their effectiveness as the public becomes aware of their existence. Those who feel at risk will improve their operational security or even learn how to manipulate the data to set traps and red herrings. My key takeaway is that when trying to understand the behaviour of those running information operations, you should do so through this prism of objectives, resources, and time, because that is the basic framework within which actors decide which objectives to pursue and how to pursue them.


More about Protection Group International's Digital Investigations

Our Digital Investigations Analysts combine modern exploitative technology with deep human analytical expertise that covers the social media platforms themselves and the behaviours and intents of those who use them. Our experienced analyst team have a deep understanding of how various threat groups use social media, and follow a three-pronged approach focused on content, behaviour, and infrastructure to assess and substantiate threat landscapes.

Disclaimer: Protection Group International does not endorse any of the linked content.