Disinformation

Thinking about information manipulation campaigns through the diamond model

Anaïs Meunier
Analyst - OWN-CERT
21/2/2024
The aim of this article is to study the diamond model for the analysis of influence operations developed by Charity Wright, in order to understand its usefulness in the field of combating information manipulation.

This post looks at the diamond model for analyzing influence operations developed by Charity Wright[1], a threat intelligence analyst at Recorded Future. By questioning the terms it proposes, it aims to understand how the model can be used in the field of combating information manipulation.

Vocabulary

Information manipulation campaigns are a widely discussed topic in both the general press and academic literature. However, the terminology used to describe them varies and is often used indiscriminately. We will thus not refer to propaganda or information warfare. First, according to Jean-Baptiste Jeangène Vilmer [2], the term propaganda is "too broad", as it describes "an attempt to influence the opinion and conduct of society in such a way that people adopt a certain opinion and conduct[3]". Second, the term "information warfare" is also too broad: it describes a global and imminent threat and leaves us frozen like a rabbit in the headlights, not knowing what we are dealing with or how to respond.

The term disinformation describes deliberately distorted information, but it does not convey the idea of a political will to do harm. The term "fake news" describes sensational information that plays on people's emotions and imitates real news; its goal is to make the public react through affect.

Finally, influence operations are linked to diplomacy and politics, and are not necessarily problematic.

The term that seems to describe the subject under study most accurately is "information manipulation campaign". Jeangène Vilmer offers the following definition: "a coordinated campaign to disseminate false or knowingly misleading news, with the political intention of causing harm[4]". For this reason, this article will not use the term "influence operation", even though it is based on Charity Wright's model.

The Diamond Model for Influence Operations Analysis

The Diamond Model is a conceptual tool used in Cyber Threat Intelligence. It was developed in 2013 by Sergio Caltagirone, Andrew Pendergast, and Christopher Betz as the "Diamond Model of Intrusion Analysis". It may seem very conceptual, but it can be used to:

• Understand, visualize and capitalize on (that is, record for later reuse) a campaign in a standardized language (e.g. STIX).

• Understand the scope of application of a framework of tactics, techniques and procedures (TTPs) or of a Kill Chain.

• Visualize all the elements that make up a campaign.

• Enable analysts to understand what data they are dealing with. In fact, depending on their status or the organization they belong to, analysts will not have access to the same type of data (open sources, from intelligence services, shared by a company, etc.[5]).

The diamond model for influence operations is inspired by CTI. It is shaped as a flat diamond whose four corners represent the different elements of the campaign: the campaign's actor, its target, the capabilities at its disposal (in terms of Tactics, Techniques, and Procedures, or TTPs, for which we'll use the DISARM framework) and its infrastructure (its technical means), with, at its center, the narratives on which it relies to support the campaign. These corners are linked by two main axes. The socio-political axis enables the adversary to develop narratives for the target audience.

Figure 1. The Diamond Model socio-political axis

The second axis is technical, linking TTPs with the technical infrastructures needed to deploy them.

Figure 2. The Diamond Model technical axis
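
To make this structure concrete, here is a minimal sketch of the diamond as a Python data structure. It is purely illustrative: the field names follow the adapted model described above, and the DISARM technique reference is a placeholder rather than an actual technique ID.

```python
from dataclasses import dataclass, field

@dataclass
class DiamondCampaign:
    """The four corners of the diamond plus the central narrative."""
    adversary: str                 # socio-political axis, upper corner
    victims: list[str]             # socio-political axis, lower corner
    capabilities: list[str]        # technical axis: TTPs (DISARM-style references)
    infrastructure: list[str]      # technical axis: technical means
    narratives: list[str] = field(default_factory=list)  # center of the diamond

# Illustrative instance; the DISARM ID below is a placeholder, not a real technique.
campaign = DiamondCampaign(
    adversary="state-aligned actor",
    victims=["audience", "target", "incidentally exposed users"],
    capabilities=["DISARM T0XXX - develop inauthentic news sites"],
    infrastructure=["typosquatted domains", "social media accounts"],
    narratives=["discredit support for Ukraine"],
)
```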

The structure of the diamond, adapted to the manipulation of information, is highly effective for describing campaigns. However, its creator makes two conceptual choices, one of which I find problematic, especially if you want to be able to translate campaigns into STIX 2.1 (more about that later).

Charity Wright considers that, because campaigns can be outsourced to private companies, these actors cannot be called adversaries, even though they occupy the corner of the diamond dedicated to the adversary[6]. She has therefore replaced the malicious actor (Adversary in the original model) with the term Influencer.

Figure 3. The Diamond Model of Intrusion Analysis, Sergio Caltagirone, Andrew Pendergast and Christopher Betz, 2013
Figure 4. The Diamond Model for Influence Operations Analysis, Charity Wright (Source: Recorded Future)

I find this choice questionable. Outsourcing a campaign to a private company enables the threat actor to obtain technical infrastructure or capability support (in the sense of TTPs). In the RRN/Doppelgänger campaign[7], for example, sources identified "the involvement of Russian or Russian-speaking individuals". Russian service providers, including a digital marketing company, are then seen as infrastructure, or as support provided to a malicious actor.

A private company is thus a capability for the adversary (in terms of TTPs) or an infrastructure (a web hosting company, for example). The actor is not simply an influencer who has set up a campaign with no intention of causing harm. If it replaces the adversary of the original model, it has an intention, supported by a narrative and aimed at an audience. It is responsible for the campaign and its goals. It is therefore an adversary. Private companies or influencers will, depending on their actions, sit in the Capabilities, Infrastructure or Victims corners of the diamond model.

The audience

Defining the audience for an information manipulation campaign raises the question of its purpose, impact, and target.

The heterogeneity of actors and campaigns makes them difficult to model clearly. However, since the audience is at once the vector, the victim and the target of campaigns, it is necessary to understand its constituent parts.

It seems to me that the question of the audience involves three structuring elements: victims, audience, and target.

Figure 5. Diagram of victims of information manipulation campaigns, Anais Meunier 2024, Source: OWN

These three notions are interconnected. Victims include both the intended audience and the campaign target. The target may be part of the audience, but is sometimes reached only through the audience relaying and commenting on campaign artifacts (posts, visuals, etc.). In this case, the victims are all the people affected by the campaign but not targeted by it.
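
To make this nesting explicit, here is a minimal sketch using Python sets; the names and memberships are purely illustrative.

```python
# All people affected by the campaign's artifacts, directly or through relays.
victims = {"alice", "bob", "carol", "dave"}

# The audience the campaign addresses: a subset of the victims.
audience = {"alice", "bob"}

# The target may sit inside the audience, or, as here, be reached only
# indirectly, through the audience relaying and commenting on artifacts.
target = {"carol"}

assert audience <= victims and target <= victims  # both are victims
assert not target <= audience                     # here the target is reached indirectly

# Victims affected by the campaign without being targeted by it:
affected_not_targeted = victims - target
```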

The campaign may miss its target or even its audience, but it will always have victims: they are exposed to its artifacts (all the audio, video and textual media disseminating the campaign narrative). These artifacts accumulate in what Arild Bergh defines as "online information sediments"[8].

Bergh invites us to think of social media as a moving river, carrying particles that eventually settle in a delta where they accumulate. These particles are the artifacts of the various campaigns. Each new artifact feeds this sediment and raises its level.

This view of the audience has no impact on Wright's diamond model. However, it does argue for keeping the term "victim" found in the original model, since it can encompass the various components described above.

The Impact

Recent news reports illustrate a case of information manipulation, in the context of the war in Ukraine, concerning so-called French mercenaries killed in a bombardment in Kharkiv[9]. The comments on the article, as well as reactions published on social networks, highlight the poor quality of this incident. While some of the French names did indeed refer to existing people, the others were clearly fake and poorly crafted.

In campaigns, audience, target, and victim can be undifferentiated. The malicious actor can set up numerous incidents to see which ones finally reach the target, or wait for the incident's audience, including useful idiots, to relay the campaign and give it visibility. In this particular case, under the guise of denouncing the poor quality of the incident, the audience ends up being the vector.

Why does it matter? Because, if we follow Arild Bergh's work, relaying these low-quality incidents feeds the substrate of past campaigns: the new artifacts sediment with those from previous ones.

This sedimentation is a way of understanding impact without the aid of metrics (numbers of visits, likes, shares or interactions, for example). Impact is precisely what analysts, researchers and politicians struggle to quantify.

This sedimentation will feed future campaigns and, in the long term, lead to action. Indeed, information manipulation campaigns are generally not designed to change people's minds: their goal is to confirm or amplify dissension in society, right up to the breaking point.

Conclusion

Understanding the complexity of a campaign through a synthetic diagram may seem reductive. However, this schematization helps structure an analysis, and it can aid in understanding and communicating about campaigns. Finally, choosing the right vocabulary and arguing for that choice makes it possible to share one's vision of the concepts at work in the fight against information manipulation.

Returning to the diamond model itself, we can simply formalize it as follows, using the terms as defined in this post.

Figure 6. Diamond model for information manipulation campaigns, Anaïs Meunier 2024
(Source: OWN)

This diagram takes up the fundamental elements of the diamond model for influence operations: the technological and socio-political axes, the narrative at its core, the capabilities and the infrastructures. Finally, the adversary and the victims redefine the campaign's actors and targets.

We could, however, make it more complex by making explicit the various relationships between its constituent elements.

Figure 7. Complex diamond model for information manipulation campaigns, Barbara Louis Sidney, Anaïs Meunier, 2024 (Source: OWN)

This complex diagram highlights what is, in my opinion, the real purpose of this model. To me, the diamond model was the simplest way of communicating how information manipulation campaigns can be capitalized on and described in STIX 2.1 format[10]. The links representing the relationships between the various elements of the diamond model thus evoke the relationships linking entities (STIX Domain Objects) and observables in the STIX 2.1 ontology.
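
As a concrete illustration, the sketch below uses the OASIS python-stix2 library to express the diamond's corners and edges as STIX 2.1 objects and relationships. The mapping chosen here (Threat Actor, Tool, Infrastructure and Identity objects tied together by a Campaign) is one plausible encoding, not a normative one; note that plain STIX 2.1 has no dedicated narrative object, which is precisely where extensions such as Filigran's[10] come in.

```python
# Minimal sketch with the OASIS python-stix2 library (pip install stix2).
# Object choices and names are illustrative, not a normative mapping.
from stix2 import (Bundle, Campaign, Identity, Infrastructure,
                   Relationship, ThreatActor, Tool)

adversary = ThreatActor(name="State-aligned actor")                  # Adversary corner
campaign = Campaign(name="Illustrative IM campaign")
victims = Identity(name="Exposed audience", identity_class="group")  # Victim corner
capability = Tool(name="Inauthentic news-site network")              # Capabilities corner
infra = Infrastructure(name="Typosquatted domains")                  # Infrastructure corner

# The diamond's edges become STIX relationship objects (SROs).
relationships = [
    Relationship(campaign, "attributed-to", adversary),
    Relationship(campaign, "uses", capability),
    Relationship(campaign, "uses", infra),
    Relationship(campaign, "targets", victims),
]

print(Bundle(adversary, campaign, victims, capability, infra,
             *relationships).serialize(pretty=True))
```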

These different approaches allow analysts to choose how they wish to share the results of their investigations. As with the ABC[11] and CIB[12] frameworks for describing campaigns, there is no single way to model or share information. However, before embarking on a complex representation, it is worth thinking about its goals: do we want to explain, to guide, to capitalize on, to share? The true purpose will indicate the most appropriate way of communicating information, as well as the tools for doing so.

Footnotes

[1] Charity Wright, 'The Diamond Model for Influence Operations Analysis', Recorded Future White Paper, n.d., 24.

[2] Jean-Baptiste Jeangène Vilmer et al., Les manipulations de l'information : un défi pour nos démocraties (Paris: Centre d'analyse, de prévision et de stratégie (CAPS) and Institut de recherche stratégique de l'École militaire (IRSEM), 2018).

[3] Jean-Marie Domenach, La propagande politique (Paris: Presses universitaires de France, 1965).

[4] Jeangène Vilmer et al., op. cit., p. 21.

[5] James Pamment and Victoria Smith, Attributing Information Influence Operations: Identifying Those Responsible for Malicious Behaviour Online (Riga: StratCom | NATO Strategic Communications Centre of Excellence, 2022) <https://stratcomcoe.org/publications/attributing-information-influence-operations-identifying-those-responsible-for-malicious-behaviour-online/244> [accessed 13 November 2022].

[6] Wright, op. cit., p. 20.

[7] Viginum, RRN : une campagne numérique de manipulation de l'information complexe, 19 June 2023 <https://www.sgdsn.gouv.fr/files/files/20230619_NP_VIGINUM_RAPPORT-CAMPAGNE-RRN_VF.pdf> [accessed 26 January 2024], p. 3.

[8] Arild Bergh, 'Understanding Influence Operations in Social Media: A Cyber Kill Chain Approach', Journal of Information Warfare 19, no. 4 (2020): 110–31. 'Given this environment, this paper suggests that the content of social media is best thought of as "online information sediments". In the world of big data, the term "data lakes" is used to describe raw, unstructured data in large quantities, emphasizing the vastness of the data (Walker & Alrehamy 2015).'

[9] '« Mercenaires français » tués en Ukraine : une opération de désinformation russe, selon Paris', Le Monde.fr, 25 January 2024 <https://www.lemonde.fr/international/article/2024/01/25/mercenaires-francais-tues-en-ukraine-une-operation-de-desinformation-russe-selon-paris_6212913_3210.html> [accessed 26 January 2024].

[10] In STIX 2.1 format, enhanced by Filigran to describe information manipulation campaigns.

[11] Camille François and Transatlantic Working Group, 'Actors, Behaviors, Content: A Disinformation ABC. Highlighting Three Vectors of Viral Deception to Guide Industry & Regulatory Responses', 2019 <https://docs.house.gov/meetings/SY/SY21/20190926/109980/HHRG-116-SY21-Wstate-FrancoisC-20190926-SD001.pdf>.

[12] 'Coordinated Inauthentic Behavior Archives', Meta, 2018 <https://about.fb.com/news/tag/coordinated-inauthentic-behavior/> [accessed 15 January 2024].

