Disinformation Toolkit: How does online disinformation affect international NGOs and civil society?


Targeting Western-Backed Organizations:
The White Helmets

While the volunteer first responders known as the White Helmets have gained international attention for their search-and-rescue operations in the Syrian civil war, they have also become the target of a sustained disinformation campaign intended to sow confusion about the conflict in Syria. Disinformation agents leveraged a network of news sites, including RT and Sputnik News, to publish articles characterizing the White Helmets as a terrorist organization with ties to Al-Qaeda and access to chemical weapons. The claims were amplified on social media: RT-affiliated reporters shared the fake news content with their followers, who in turn spread it through their own networks, fueling doubts about the White Helmets’ motivations. Russia-backed disinformation campaigns against the White Helmets not only distract attention from the aftermath of airstrikes; they also work to justify Russia’s role in backing President Bashar al-Assad against Syrian rebels.

New Threats, More Work:
The Together Project  

Organizations that must defend their credibility against disinformation are burdened with new expenses and workflows to mitigate risk. The time a team spends identifying and responding to disinformation can be considerable.

The Together Project is a hub of advocacy and solidarity for U.S.-based NGOs that provide development and humanitarian relief around the world and that confront discrimination or targeted prejudicial regulations in the U.S. because of their operating principles or religious faith. One member of the Together Project reports that it spent more than $100,000 in one year on outside SEO (search engine optimization) consultants to improve search results for the organization and its leaders. Another member organization calls attention to the difficulty of monitoring and managing its social media accounts.

Asymmetrical Information Environments:
Spreading Anti-Muslim Narratives on Facebook in Myanmar

More than half a million Rohingya, an ethnic minority group, have fled Myanmar since August 2017 to escape violence at the hands of the government-backed military. The United Nations has described the persecution as a “textbook example of ethnic cleansing.” The violence has grown in large part, says the United Nations, due to unsubstantiated rumors and doctored photos that have gone viral on Facebook in Myanmar and that have spread or reinforced dangerous, false beliefs about the Rohingya. The images, even when debunked, have fueled waves of anti-Rohingya fervor.

The rise in disinformation about the Rohingya took place alongside the adoption of smartphones and an increase in mobile connectivity throughout Myanmar over the past five years. Social media is the main source of news, and due to the nature of the platform, content spreads quickly without context or fact-checking. This situation demonstrates how disinformation, through social media, incites real-life harassment and violence during a sensitive transition period.

Interest in so-called “fake news” around the U.S. elections has led to a conflation of terms when discussing false information online and, at times, messy arguments about how disinformation campaigns manifest, spread, and affect society. 

Online disinformation can take many forms: promoting accurate information in false contexts, manipulating original content, or completely fabricating and promoting imposter content. 

Strategies to disseminate online disinformation cover a wide spectrum. The following methods have been documented by international organizations: 

  1. Coordinated bot networks (see the White Helmets case study);
  2. Use of fake domains, in which an adversary creates a website or social media profile that closely resembles that of a targeted organization; 
  3. Hijacking attacks called “double switch attacks,” in which adversaries gain control of an organization’s or individual’s accounts and spread disinformation through them; and 
  4. DNS (Domain Name System) redirection, in which state-owned telecoms re-route traffic intended for specific websites to alternative websites.
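As a simple illustration of the “fake domains” tactic above, the sketch below flags domain names that closely resemble an organization’s legitimate domain but are not identical, using a string-similarity ratio. The domain names and the 0.8 threshold are hypothetical assumptions for illustration, not drawn from this report; real monitoring tools use more sophisticated techniques (homoglyph detection, certificate transparency logs, etc.).

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two domain names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_lookalikes(real_domain: str, candidates: list[str],
                    threshold: float = 0.8) -> list[str]:
    """Flag candidate domains that closely resemble the real domain
    but are not identical -- a common sign of an imposter site."""
    flagged = []
    for candidate in candidates:
        if candidate.lower() == real_domain.lower():
            continue  # the organization's own domain is not suspicious
        if similarity(real_domain, candidate) >= threshold:
            flagged.append(candidate)
    return flagged

if __name__ == "__main__":
    # Hypothetical example: one lookalike with a character substituted.
    seen = ["examplerelief.org", "exampl3relief.org", "unrelated.net"]
    print(flag_lookalikes("examplerelief.org", seen))
```

A check like this could run periodically against newly registered domains or referral logs, giving small organizations an early warning before an imposter site gains traction.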

It is useful to assess disinformation content based on whom the campaigns intend to target: 

  1. Individuals: visible or politically connected leaders of organizations, and national and international staff. 
  2. Organizations: network organizations, funders, and small grassroots and community-based organizations that partner with large organizations. 
  3. Affected populations: parts of society that receive assistance from international organizations (e.g., refugees, interfaith communities). 

Groups interviewed for this report cited several examples of how disinformation attacks can negatively impact their organizations. The most frequently cited consequence is interference with organizations’ ability to provide critical information. In 2017, Access Now’s Digital Security Helpline documented several cases in which leaders in Bahrain, Myanmar, and Venezuela had problems recovering their accounts after they were taken over by an adversary that disseminated false information through those accounts. Second, the costs of responding to these threats and concerns are high. Fighting off trolls and false claims, for example, has cost members of the Together Project significant human resources and capital. Without question, these attacks put in-country staff, partner organizations, and the community as a whole at risk; and if they go unchecked, many worry the attacks could force operations to halt.

Conditions for Vulnerability to Disinformation Attacks  

Researchers have noted that specific environmental conditions may heighten vulnerability to disinformation and put an international organization at higher risk of becoming the target of a disinformation attack. Attacks may be more likely to occur in regions affected by active conflict, authoritarian governments, and/or uneven connectivity infrastructure. The following conditions are notable: 

  1. Lack of reliable and credible information. In environments where press freedoms are under threat, journalists are intimidated, or the state controls the media, disinformation can reach wider audiences. When information is created and distributed with malicious intent in places where information on specific topics is not readily available, it can have harmful effects on vulnerable audiences. 
  2. High levels of ambient fear. In environments experiencing high degrees of uncertainty, including areas affected by conflict, humanitarian emergencies, or natural disasters, audiences may be more susceptible to misinterpreting or acting on disinformation; this, in turn, increases the negative effect and impact it has on an audience.  
  3. Asymmetrical information environments. When there is a lack of access to information, existing information channels are vulnerable to co-optation or manipulation. Asymmetry can be caused by media ownership, political issues around language, and even practical issues such as the information dissemination mechanisms that are employed.  
  4. Political events or power transitions. Critical social and political periods and events, such as the period before national elections, present environments that exacerbate the trends described above, and provide fertile opportunities for governments to surveil and limit the flow of information. 
  5. Past history of political and other leaders targeting civil society. Environments where there is some benefit to be gained in undermining the credibility of international actors such as human rights groups may be more vulnerable.