The three constraints on Influence Operations - Digital Threat Digest
![Double circle designs part4](https://pgi.imgix.net/assets/uploads/images/Blog-posts/Double-circle-designs-part4.png?auto=compress%2Cformat&fit=crop&fm=webp&h=349&ixlib=php-3.1.0&upscale&w=349&tone=light)
PGI’s Digital Investigations Team brings you the Digital Threat Digest, SOCMINT and OSINT insights into disinformation, influence operations, and online harms.
Last week, the world-renowned camera company Leica announced the release of its first ‘Encryption Verification’ camera with content credentials built in. The camera is designed to ensure the authenticity of photos taken on it, with features that embed each photo’s metadata, including the location, camera make, picture mode, and details of any editing history. Images can also be verified through an external website or through the Leica app, providing another avenue to confirm their legitimacy. Designed initially for photojournalists, the product arrives at a time when images are highly susceptible to misattribution, manipulation, and even wholesale generation by artificial intelligence. From AI images of Donald Trump to misattributed photos of conflict zones, the integrity of photos is more important than ever to maintaining public trust in what we see online.
One drawback is the hefty $9,000 price tag, which will create barriers to access. The technology behind the product is still exciting, however, and will hopefully become more widely available down the line to consumers, journalists, and the wider public. Photo verification services are not new, but embedding the technology into everyday products such as cameras tackles the problem directly at the point of capture. In August, Canon partnered with Reuters and Starling Lab, an academic research lab, on a pilot programme exploring how embedding metadata within images might increase viewers’ trust in an image’s integrity.
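Point-of-capture verification schemes like the one described above work by cryptographically binding a photo’s metadata to the image at the moment it is taken, so that any later change to either breaks the seal. The sketch below illustrates the general idea in Python. It is a deliberately simplified stand-in: it uses a symmetric HMAC where real content-credential systems such as the C2PA standard use public-key signatures and certificate chains, and every name and value in it is illustrative rather than Leica’s or Canon’s actual implementation.

```python
import hashlib
import hmac
import json

# Hypothetical device key. Real systems keep a private signing key in
# secure hardware and publish the matching public key for verification.
SECRET_KEY = b"example-device-key"


def sign_capture(image_bytes: bytes, metadata: dict) -> str:
    """Bind metadata to the image at capture time by signing both together."""
    payload = hashlib.sha256(image_bytes).hexdigest() + json.dumps(
        metadata, sort_keys=True
    )
    return hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()


def verify_capture(image_bytes: bytes, metadata: dict, signature: str) -> bool:
    """Recompute the seal; any change to pixels or metadata invalidates it."""
    payload = hashlib.sha256(image_bytes).hexdigest() + json.dumps(
        metadata, sort_keys=True
    )
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


photo = b"\xff\xd8...raw image bytes..."  # placeholder image data
meta = {"camera": "ExampleCam", "location": "51.5N, 0.1W", "edits": []}

sig = sign_capture(photo, meta)
assert verify_capture(photo, meta, sig)           # untouched image verifies
assert not verify_capture(photo + b"x", meta, sig)  # any pixel change breaks the seal
```

Because the metadata is covered by the same signature as the image, a spoofed location or a scrubbed editing history is as detectable as a manipulated pixel, which is what makes embedding this at the point of capture stronger than after-the-fact analysis.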
This pilot and other initiatives by camera brands and technology companies are encouraging, especially given the ease with which manipulated images can spread like wildfire on social media. There are, of course, questions about accessibility and how this type of technology will develop. For now, though, the move towards cementing authenticity and legitimacy within photos and the products that capture them signals a new approach to tackling misinformation. This will benefit not just photographers and photojournalists, but also members of the OSINT community who work to fight misattributed and manipulated content.
More about Protection Group International's Digital Investigations
Our Digital Investigations Analysts combine modern exploitative technology with deep human analytical expertise covering both the social media platforms themselves and the behaviours and intents of those who use them. Our experienced analyst team has a deep understanding of how various threat groups use social media, and follows a three-pronged approach focused on content, behaviour and infrastructure to assess and substantiate threat landscapes.
Disclaimer: Protection Group International does not endorse any of the linked content.