Carnegie Endowment for International Peace
In late December 2020, Carnegie’s Partnership for Countering Influence Operations (PCIO) published a systematic review of 84 public policy proposals related to disinformation released since 2016. In addition to providing a database of these policy papers, this analysis found that:
- The majority of policy recommendations focused on tech platforms and governments, given their preeminent access to data and resources, while recommendations for civil society, academic, and multilateral organizations received less attention despite their important role in combatting disinformation.
- The most frequent policy recommendations concerned data and information sharing, media literacy, support for fact-checkers, and platform regulation, moderation, and transparency, followed by better cross-national coordination, oversight, and sanctions. Relatively few recommendations addressed norm building, user autonomy and privacy, platform accountability, or antitrust, among other areas.
- More than half of the policy papers called for greater coordination between actors, especially cross-sector collaboration. Yet nearly three-quarters of the papers reviewed cited no other research, suggesting that closer coordination among policy researchers and their recommendations could help advance a unified agenda.
See the summary and full database HERE.
National Endowment for Democracy (NED)
NED’s January 2021 paper, Mapping Civil Society Responses to Disinformation, examines the role of civil society organizations in combatting disinformation by mapping current initiatives (described in the section above) and surveying CSOs working on this issue. The authors highlight the following insights:
- Civil society must prioritize skill-diffusion and knowledge-transfer initiatives, which are currently lacking given the number of isolated efforts taking place across this sector.
- A lack of access to social media data is hampering the ability of civil society organizations to determine the depth of the challenge, as well as the efficacy of current approaches.
- There is a deep need to improve coordination among civil society actors working on the disinformation challenge. Too many initiatives operate in isolation.
- Civil society organizations have uneven access to tech platforms. Many, particularly in developing countries, were ignored entirely when they raised concerns with platforms.
See the full analysis and recommendations HERE.
New America
- New America’s January 2018 report Digital Deceit explores the business models behind disinformation and proposes a series of reforms to digital advertising technology for tech platforms, governments, and civil society to consider. Its recommendations focus on transparency; cybersecurity; public education and media literacy; public-service journalism and fact-checking; corporate responsibility and platform accountability; and consumer empowerment. Learn more HERE.
- Another relevant New America resource is the Ranking Digital Rights project, which evaluates the digital rights records of some of the most popular tech platforms. The project proposes a series of reforms for tech companies and governments aimed at re-empowering citizens in the digital sphere, including giving users greater control over their data, increasing transparency, and making platforms and their algorithms more accountable to users. Learn more HERE.
International Center for Non-Profit Law (ICNL)
ICNL has published a briefer on the legal dilemma of cracking down on disinformation while respecting free speech, a challenge with which governments around the world are now grappling. In Responding to the Disinformation Dilemma: A Policy Prospectus of Legal Responses to Disinformation, ICNL recommends three legal approaches to combatting disinformation around the globe:
- The use of existing laws, including tort law, cyber-bullying and cyber-stalking laws, and laws against fraud.
- New laws to curb disinformation and coordinated inauthentic behavior, such as anti-bot laws and transparency laws, some of which have already been enacted in sub-national jurisdictions.
- Newly proposed legal and regulatory rules not currently in place, such as two-way enforcement of terms of service, independent regulatory agencies or administrative tribunals representing users’ digital rights, enforceable complaint and review mechanisms, deeper public investment in media literacy, transparency requirements for content moderation, and platform-level controls on the forwarding of messages.
Learn more HERE.