
Including digital technologies in peace agreements

Digital technologies play an increasingly significant role in armed conflict. In response, conflict parties and mediators have begun integrating digital technologies into negotiation processes. Several ‘social media peace agreements’ have been brokered, and clauses covering digital technologies have been included in broader peace agreements. These early agreements demonstrate that restraint in the online space can be negotiated.

However, despite these developments, most mediators still have limited experience in facilitating negotiations over how peace agreements can seek to restrict harmful uses of digital technologies. To address this gap, we discuss five crucial process design questions for mediators to consider when managing the digital dimensions of conflict.

When should digital technologies be integrated into a peace process?

The centrality of digital technologies in any peace process will depend on their prevalence and impact in the conflict. Integrating the digital dimension into conflict analysis is increasingly essential. Such analysis should determine how digital technologies are being deployed, by whom and with what effect, including how gender dynamics intersect with access to and usage of these technologies (see Table 1 for descriptions and examples).

Analysis should determine how digital technologies are being deployed, by whom and with what effect, including how gender dynamics intersect.
Accord 30

When analysis suggests that digital technologies represent an important facet of the conflict, mediators might proactively explore with conflict parties how to incorporate digital technologies into the negotiating agenda. As the use of digital technologies in conflict grows, with what the Office of the UN High Commissioner for Human Rights has recognised as a correspondingly ‘dramatic impact’ on civilian populations, conflict parties or other stakeholders may themselves increasingly call for this. In 2022, for example, alarmed by the humanitarian impact of a multi-year internet shutdown in Tigray affecting millions of people, civil society groups urged the African Union to address the shutdown in peace talks between the Ethiopian Government and the Tigray People’s Liberation Front.

Nonetheless, there may be cases where, despite the prevalence of digital technologies in a conflict, the opportunities for mediation on digital issues are limited. For example, there may be less scope for negotiation if there is significant asymmetry in digital capabilities between the parties (for example, if only one party controls the tools necessary to engage in some of the digital behaviours described in Table 1). Analysis should thus determine both the scope of digital issues and their ripeness for mediation.

Dedicated or integrated agreements?

Where conflict parties are willing to negotiate on digital issues, they must decide whether these issues should be addressed in the form of a dedicated agreement, protocol or annex, or rather as clauses within a wider agreement. Dedicated digital agreements provide an opportunity for greater detail and nuance, and the opportunity to address a wider spectrum of digital behaviours. For example, in Nigeria’s 2023 elections, a Code of Conduct on social media was signed between parties, candidates and influencers in Kaduna State, outlining a range of problematic digital behaviours (for example, harassment, political disinformation, and inauthentic accounts). However, standalone agreements can lead to the side-lining of digital issues and commitments, as they might not be negotiated or implemented as part of the broader process. This could be particularly problematic if the parties’ leaders do not fully understand and own the resulting agreement.

An alternative is to integrate digital clauses into broader agreements. This has the advantage of giving digital issues greater visibility and (in principle) according them equal importance to other issues. However, it limits the amount of detail possible and may not enable the full range of problematic digital behaviours to be addressed. In such cases, mediators could encourage relevant clauses to include concrete implementation mechanisms, providing a mandate to continue the negotiating process. For example, the 2020 Libya ceasefire agreement calls on parties to ‘halt the currently rampant media escalation and hate speech’ on ‘websites’ and establishes an implementation sub-committee.

Table 1: Uses of digital technologies in conflict


Digital technologies and ceasefires

If conflict parties decide to integrate digital technologies into broader agreements, they may do so in various ways: as part of political prevention efforts (such as electoral codes of conduct), a ceasefire (aiming to stop or control violence), or a more comprehensive peace agreement.

Given the growing convergence between digital technologies and warfighting, negotiated restraints on certain digital behaviours may become a more frequent feature of ceasefires. Indeed, during the past decade simple provisions related to harmful social media use have emerged in national and local ceasefires (Kenya, Libya, South Sudan, Syria and Yemen), and local internet shutdowns have been ended following ceasefires in northern Ethiopia and Myanmar (Rakhine).

In addition, the military use of cyber operations witnessed in inter-state wars such as that in Ukraine has led to growing discussion of how such operations might feature in a future ceasefire agreement. A consistent theme emerging from practitioner guidance on ceasefires is the need for precision around prohibited behaviour and incident management mechanisms. It would thus make sense to specify particular uses of digital technologies where they might directly or indirectly threaten the stability of a broader ceasefire regime.

Which type of clause: principles, practices or processes?

Regardless of the type of agreement within which digital technologies are included, relevant clauses may fall into one of three categories: principles, practices or processes.

  1. Principles are general aspirational statements intended to signal an acknowledgment of the significance of the digital realm. They serve to reaffirm or commit to upholding existing legal obligations, rights or emerging norms related to the responsible use of digital technologies – without necessarily detailing specific actions or accompanying monitoring and verification (M&V). For example, parties may agree to:



    • Adhere to existing obligations under international law, including international humanitarian and international human rights law relating to cyber operations.

    • Recognise access to the internet as part of the right to information on which people’s lives, well-being, and security depend during conflict and/or as a key enabler of other political, economic, and social rights.

    • In light of the heightened risks posed in conflict by disinformation and other forms of information manipulation, recognise a safe, constructive, and responsible social media space as a public good.

  2. Practices are detailed commitments to prohibit or encourage specific types of digital behaviours. Such commitments require precise definitions of prohibited or permitted tactics and may have associated M&V mechanisms. Parties could:

    • Commit to responsible social media use or to refrain from certain social media behaviour. For example, political parties in Thailand, ahead of the 2023 elections, agreed to refrain from the ‘creation or dissemination of false information or content’ and the ‘deployment of networks of coordinated accounts… to systematically disseminate harmful information or content with malicious and manipulative intent’.

    • Draw from elements of the voluntary norms negotiated by the Group of Governmental Experts at the UN related to the responsible use of digital technologies by states. For instance, parties could agree to refrain from using information and communication technologies (ICTs) or physical means to target critical information infrastructure essential to the delivery of public and humanitarian services.

  3. Processes are measures to monitor, coordinate, share information or ensure the implementation of commitments made by signatories – for example, information-sharing mechanisms that reduce or manage incidents while building confidence. Parties could:

    • Draw upon a set of confidence-building measures developed by the Organization for Security and Cooperation in Europe that focus on information sharing, voluntary cooperation, and establishment of communication channels to reduce the risks of misperception, escalation, and conflict during cyber incidents.

    • Establish a monitoring and dialogue body that commits to regular meetings, reports and to engage with social media platform administrators in the event of violations, like that contained in the social media peace agreement facilitated by the Centre for Humanitarian Dialogue (HD) in Nigeria’s Plateau State in 2021 (see article by Medinat Malefakis).

Table 2 presents a non-comprehensive list of the types of clauses that might fall under each category.

Table 2: Digital clauses in peace agreements


 

Political parties, candidates and influencers in Kaduna State, Nigeria agree a code of conduct on social media to address problematic digital behaviours during Nigeria’s 2023 elections. © Centre for Humanitarian Dialogue

How to monitor, verify and attribute agreements covering digital technologies?

Attribution of responsibility for the use of certain digital technologies – especially offensive cyber operations – is difficult. The anonymity and lack of geographic boundaries in the digital space make it hard to determine who is responsible for a cyber incident or influence operation, and whether those responsible were acting under the direction of a conflict party. That said, digital forensic techniques are evolving and certain types of technical attribution are now feasible. As such, mediators may work with parties to:

  • Focus initially on what can be monitored, such as social media behaviour on official accounts.
  • Design monitoring approaches focused on information sharing and liaison mechanisms. This can help parties create dispute resolution mechanisms to prevent unwanted escalation from digital incidents.
  • Adopt differentiated monitoring approaches for different digital behaviours. For example, the commitments of conflict parties to end internet shutdowns are relatively easy to monitor, because control is in the hands of the government. Internet traffic can also be relatively easily tracked through open-source sites.
  • Promote inclusive process design to enable the participation of populations harmed by digital technology use. This could help cultivate constituencies pressing for the implementation of technology-related provisions – even where those provisions cannot be easily monitored. For example, initial evidence gathered by Access Now indicates that women are particularly harmed by shutdowns, suggesting that women’s groups may become powerful advocates for ending them.
  • Partner with researchers, platforms, and organisations with experience in attributing responsibility for cyberattacks or in developing traceable methods for proxy actors operating across a coordinated network on social media.

How should the mediation process engage with the private sector on digital technology use?

Social media platforms, telecommunications companies and other network operators are an inextricable part of the digital landscape, making them important actors to engage with. Key stakeholders in the mediation process will need to carefully consider the potential reputational and other risks of engaging with these and other private companies during the negotiation and implementation process. Mediators should carefully discuss with the parties what roles may be necessary for the private sector to achieve their mutually agreed objectives. Such objectives might include: cyber incident response, restoration of internet or mobile services, devoting resources to monitor hate speech, or removal of inauthentic networks.

Mediators and parties could engage with social media companies to address overlaps and inconsistencies between prohibitions on certain social media behaviours in peace agreements and the platforms’ content moderation and terms of service policies.


Platforms can play a crucial role in implementing provisions related to content that risks imminent violence, as is being trialled through the establishment of trusted partner links between Meta and the social media agreement monitoring bodies established by HD. Even in cases where overlap is limited, platforms can assist in investigating problematic behaviour, providing data access, and exploring alternative mechanisms beyond content removal to support implementation efforts. Stakeholders may also consider involving other private sector actors, such as digital marketing agencies, if their information campaigns significantly contribute to exacerbating the conflict. Indeed, mediators and conflict parties often lack the necessary technical expertise to address digital issues. Recruiting or partnering with dedicated experts from qualified bodies (such as academic centres or computer emergency response teams) can help to build the technical capacity necessary to negotiate digital issues.

Adapting mediation to the digital age

The role of digital technologies in conflict is likely to grow, and mediators will need to respond to this challenge. Integrating new content does not require wholesale changes to the logic or design of mediation, but new expertise and resourcing are likely to be required, either within mediation teams or through strategic partnerships with outside experts and the private sector. Conflict analysis should routinely explore the digital dimension of conflicts, and processes should be designed to incorporate relevant digital behaviours effectively. Clauses on principles, practices and processes relevant to digital technologies are therefore likely to feature more often, both in standalone agreements and as clauses within a range of broader ceasefire and peace agreements.


Recognising the influence of the digital age and adapting processes accordingly is crucial for mediation to respond to new conflict dynamics and remain an effective means of addressing contemporary conflicts.