Disinformation Toolkit: Preparing for Online Disinformation Threats


Individual organizations and leaders within the international development and humanitarian assistance community have begun to test new methods for protecting organizations against false or exaggerated claims online that disparage them or their work. This section provides questions to help groups determine their organizational vulnerabilities, along with practical ideas for protecting against possible risks.

Anticipating the disinformation strategies and techniques used by state and nonstate actors is an evolving area of practice. Organizations will need to support their staff in developing dynamic ways to identify and respond to disinformation, and to move from ad hoc responses to more streamlined workflows for handling it. This section provides suggestions for developing dynamic reporting and response mechanisms for identifying, assessing, and acting on disinformation threats.

Identifying Your Risk 

Preparing for disinformation, responding to it, and sharing insights and learnings about attacks will naturally fall to the communication and security leads within your organization. As front-line actors, these internal experts are best positioned to stay informed about disinformation risks and to update existing risk assessment and response activities to address disinformation threats.

For Communication Leaders: 

Discuss your organization’s disinformation-related risks to identify weak spots and opportunities for proactively preparing for a possible attack. Conduct a media threat assessment as part of larger risk assessments (see resources at the end of this report) and seek to answer the following questions:

  1. Has your organization suffered from a disinformation event before?
  2. If so, was the organization able to determine who was behind it, and why?
  3. What steps did the organization take before? Are these defensive steps still valid or sufficient?

Train yourself and the team members most likely to be on social media to identify disinformation.

  1. Are you aware of what early warning signals might be? (For example, are you aware of what a bot might look like?)  

For Security Leaders: 

Disparaging attacks against organizations and leaders, even when false, have in the past posed physical threats to offices and individuals. Online disinformation should therefore be an issue security leads are briefed on as they develop risk mitigation and emergency crisis response plans, and they should seek to answer the following questions:

  1. Who might gain by undermining your organization’s credibility?
  2. What tools do they have at their disposal (e.g., access to state media)?
  3. Would you have the ability to respond (consider messaging, channels to reach key populations, allies, etc.)?

Developing Your Organization’s Risk Mitigation Plan 

Organizations working on highly visible issues or with at-risk beneficiary communities should discuss what kind of risk mitigation plan is needed. This section summarizes steps you might consider taking to develop a strategy for identifying and responding to online disinformation that could affect your organization’s operations and the safety of your staff, as well as your beneficiaries.

We recommend thinking about your disinformation preparation strategy in four parts: 

Your Strategy to Tackle Disinformation

  1. Evaluate your media and information ecosystem. 
  2. Determine who is spreading the false information about your organization, leaders, or programs and develop a hypothesis about why they are sharing this information. 
  3. Determine what they are spreading or saying and how it is spreading.  
  4. Take actions to counter this information and work with your organization’s leaders to integrate these preferred actions into existing workflows within your organization.  

Below are some strategies for taking these actions. These suggestions should be viewed as conversation starters for you and your communications and security staff. The steps that you decide to take or not take should be tailored to the unique context in which your organization operates.

(1) Your Media Ecosystem 

Understand online media use and the online media environment in which your programs operate. The first question to ask yourself is: How vulnerable is my media environment to abuse? 

Action needed

The organizations interviewed for this report noted a growing need to monitor social media for conversations about their work and their organizations; several indicated that watching local online media is usually an afterthought. Consult your national staff and learn from them how information is received by, travels to, and circulates within the communities that matter most to your organization.

Possible discussion questions could include: 

Questions about your audience:

  1. How do people get information about news, politics, and their community? 
  2. What sources of information are most important for political news?
  3. What information sources seem to matter to your core audiences (more than others)? 

Questions about your threats:

  1. Who are the distributors (i.e., who shares the posts that go viral) that affect your work or your organization?  
  2. Who are likely creators (i.e., who develops the content that goes viral) of false claims that affect your work or your organization? 
  3. Do you have any hypotheses on how they disseminate their information and messages?
  4. What are their motivations?  

(2) Who Creates Disinformation? And Why? 

Disinformation researchers cite two primary types of actors that create and disseminate disinformation content:

  1. State or state-aligned groups and political actors who create and spread disinformation in pursuit of political goals. The Kremlin’s tactics are well documented.9 More recently, in the Philippines, the president’s office has built a propaganda machine, in the form of fake accounts and bot networks, that disparages organizations and journalists and disseminates narratives with specific political goals.
  2. Nonstate actors such as terrorist organizations, extremist groups, political parties, and corporate actors who have developed and distributed disinformation online. These groups have political aims to recruit supporters, create confusion, or disparage groups who oppose them. 

Note: Be careful to distinguish groups with politically motivated goals from individuals and groups motivated by economic incentives who create and disseminate false information. These are actors who have found ways to earn a living by creating and disseminating false information; they may support state and nonstate actors in achieving their political goals. In the United States, reports of Macedonian teenagers building false-information content farms showed how these cottage industries generate revenue around the creation and dissemination of false information. Civil society and advocacy groups have also promoted disinformation for satirical purposes. 

A common goal for these groups is to sow confusion or discontent among targeted communities. In Myanmar, for example, Facebook has repeatedly been flooded with doctored photos and false information from outside sources after major terrorist attacks.

Disinformation can also take the form of ongoing, more diffuse attacks to discredit individuals or organizations, conducted through paid pro-government commentators; political bots; hijacked accounts (hacking, impersonation, phishing); fake news around elections; and pro-government media and propaganda.

(3) What are They Saying? Where is it Appearing? 

Disinformation is disseminated across the Internet through websites; social media platforms such as Facebook, Twitter, and Instagram; and smartphone messaging applications such as WhatsApp and Viber. Where it appears will vary depending on how actors seek to reach their intended audiences.

Commonly cited areas where disinformation has appeared include the following:

  • Articles on websites
  • Facebook pages
  • Messages through Facebook Messenger, WhatsApp, and Viber
  • Posts in Facebook groups
  • Comments on highly visible news pages
  • Posts on Instagram
  • Tweets on Twitter

In preparing this report, we found that international NGOs knew disinformation was a problem and said their colleagues were monitoring these trends and behaviors, but admitted that collecting data on them was challenging.

Organizations may consider developing a system to systematically record and log problematic posts, photos, or text content in a spreadsheet (see appendix for a sample) as they occur, and to share these materials with other groups experiencing attacks or observing worrisome trends. By aggregating this information, research partners may be able to identify the sources and networks behind the spread of disinformation. CrowdTangle is one tool organizations can use to see where specific pieces of online content are shared and to separate amplifiers from sources.
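As one illustration, the logging system described above could start as a small script that appends each problematic post to a shared CSV file. This is a minimal sketch; the column names and file name here are hypothetical, and should be adapted to match the sample spreadsheet in the appendix.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical column set; adapt to the sample spreadsheet in the appendix.
FIELDS = ["logged_at", "platform", "url", "summary", "reach_estimate", "screenshot_file"]

def log_post(logfile, platform, url, summary, reach_estimate="", screenshot_file=""):
    """Append one problematic post to a shared CSV log, creating the file if needed."""
    path = Path(logfile)
    is_new = not path.exists()
    with path.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()  # write column names only once
        writer.writerow({
            "logged_at": datetime.now(timezone.utc).isoformat(),
            "platform": platform,
            "url": url,
            "summary": summary,
            "reach_estimate": reach_estimate,
            "screenshot_file": screenshot_file,
        })

# Example entry (hypothetical URL and claim):
log_post("disinfo_log.csv", "Facebook", "https://example.com/post/1",
         "False claim that our clinic charges beneficiaries fees")
```

Because the log is a plain CSV, it can be opened in any spreadsheet program and shared with partner organizations or research partners without special tooling.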


Advantages and Disadvantages of Countering Online Disinformation

Action: Let the disinformation die out and monitor conversations.
Advantage: Allows a conversation that may not be visible to your audience to die out more quickly.
Disadvantage: Audiences that may have engaged with the disinformation may harbor false views about you and your organization.

Action: Directly counter the disinformation and refute false claims using your organization’s existing online media channels.
Advantage: Allows organizations to correct false statements or claims about them or their work. (If this course is taken, it should be done swiftly.)
Disadvantage: Developing and publishing content, and then monitoring the response to it, takes time and human resources. Counter-messages can also backfire, reinforcing the initial false claims or disinformation.

Action: Promote alternative messages that provide information to your audience through new narratives.
Advantage: Allows you to change the conversation by presenting new information or alternative messages.
Disadvantage: Developing and publishing content, and then monitoring the response to it, takes time and human resources.


(4) Decide Whether and How to Take Further Action 

Discuss with your communications and security leads whether action needs to be taken to counter disinformation. 

Depending on the circumstances and your organization’s goals, the responses outlined in the table above are options for countering disinformation events. 

If your organization has experienced a large-scale disinformation event, you may also consider the following actions: 

  1. Archiving social media content. If this is an area of increased vulnerability for you, consider connecting with open source investigation labs or media organizations that focus on social media information archiving.
  2. Conducting a formal, after-event assessment. Discuss how you would have handled the event differently, or resources that you wish you would have had. Discuss and assess the experience so that you can be prepared for the next event. 
  3. Discuss the event with partners and donors. Discuss what happened to you and your colleagues with critical stakeholders, including your partners and donors. 
  4. Engage platforms. Disinformation is also an urgent issue for technology platforms to address. If you encounter issues when engaging platforms directly to request removal of content, inform your organization’s policy contact.  
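The archiving step above can be sketched as a small script that stores each captured post with a UTC timestamp and a SHA-256 content hash, so an archived copy can later be shown to be unaltered. This is an illustration with a hypothetical file layout, not a substitute for working with specialist open source investigation or archiving organizations.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def archive_post(archive_dir, url, content):
    """Save captured post content as JSON with a timestamp and SHA-256 hash.

    The hash lets you later demonstrate that the archived copy is unaltered.
    """
    outdir = Path(archive_dir)
    outdir.mkdir(parents=True, exist_ok=True)
    digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    record = {"url": url, "captured_at": stamp, "sha256": digest, "content": content}
    outfile = outdir / f"{stamp}_{digest[:12]}.json"  # timestamped, hash-tagged filename
    outfile.write_text(json.dumps(record, ensure_ascii=False, indent=2), encoding="utf-8")
    return outfile

# Example capture (hypothetical URL and content):
path = archive_post("archive", "https://example.com/post/1",
                    "Screenshot text of the false claim, captured before deletion")
```

Keeping the archive as plain JSON files makes it easy to hand over to researchers or media organizations that specialize in social media archiving.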

Engaging National Staff 

International NGOs and civil society organizations interviewed for this report suggested that national staff teams should be involved in threat assessment and response activities related to disinformation. On-the-ground staff may be more likely to identify problematic trends as they occur and will have valuable perspectives on what an appropriate response might be. Discuss and identify pathways for team members to share patterns and behaviors, as well as steps they might take to flag and, if necessary, respond to disinformation. Additional recommendations follow below. 

  • Develop an internal system for documenting and reporting instances of disinformation online that may affect an organization’s operations. Discussing the issue with staff, and designating a preferred method of communication around the problem, would highlight the importance of sharing events when they occur. It would also allow organizational leaders to get a more accurate picture of threats against the organization. 
  • An important element of doing this successfully is developing an open culture where staff members feel comfortable and are encouraged to report disinformation events as they occur. 

Longer Term Strategies: Building Community Resilience 

Proactive measures to establish relationships, build trust, and share information about what organizations do and who they are make a strong defense against false claims. Conversely, groups with weak community relationships that infrequently share information with their communities will be more susceptible to disinformation attacks. Practitioners know this work is essential, but it is rarely a priority when working under stress or in crisis environments where immediate relief or protection is needed. Below are suggestions for getting started quickly and taking steps to prepare your organization for an unexpected disinformation event.

  • Proactively develop relationships with credible information sources. Based on the media ecosystem assessment suggested above, build relationships with a network of trusted journalists. Organize one-on-one meetings to brief them on your work, regularly invite them to your events and activities where appropriate, and maintain a drumbeat of information to these journalists.
  • Identify and coordinate with partners who share the same vulnerabilities. International NGOs contacted in preparing this report have suggested the importance and value of investing time in identifying and working with like-minded organizations to discuss vulnerabilities and attacks when they occur. For example, the Together Project, a coalition supported by InterAction, has developed a space for Muslim-interest foundations in the U.S. to find allies who can carry important messages to different constituencies, including larger interfaith coalitions. These relationships have allowed the alliance to strategically deploy surrogates to promote positive messages at the local level (whether it is commemorating a holiday, supporting disaster response, or sharing content around significant political events) and to members of Congress when advocating for specific issues. Working together as a network and addressing the problem together has been an essential part of sharing insights and brainstorming solutions. 
  • Develop a plan for proactively communicating who you are and what you do locally. Working on sensitive issues often means there is a tension between the need to be discreet and the need to be vocal in correcting inaccurate information or promoting accurate details. Encouraging the spread of your messages can help you shape your narratives and help others reject information that is inconsistent with what they know about your organization. If you do not proactively share what your organization does and what you stand for, someone else may fill the information gaps with inaccurate information. International NGOs and civil society organizations often feel uncomfortable proactively advocating for their work, but organizations need to do more than ever to promote who they are and to share these messages with their partners and stakeholders. Discuss with your colleagues how to balance proactive communications about your activities and events against the potential risk of that information negatively affecting the communities you support.
  • Anticipate risk, and share resources before the crisis. NGOs have noted the benefit of developing systems for translating stock messages to be used in crisis situations. Translators Without Borders, through its proactive “Words of Relief” communications program, translates critical messages before crises occur. The organization developed a library of statements on topics such as flood warnings so that when attacks or disasters occur, people are better informed. This approach was deployed successfully with the Red Cross and the International Federation of Red Cross and Red Crescent Societies (IFRC) during the 2017 hurricane season in the Caribbean, with messages translated into Creole and Spanish in late September and October of 2017. Translators Without Borders emphasized the need to provide the right content: relevant, and in an accessible format. While this toolkit focuses primarily on online disinformation campaigns, some audiences rely on other mechanisms to receive and share information (which may not be online, due to lack of technology, connectivity, or trust in online sources). Effective responses need to account for the information landscape in each particular context.