Center for American Progress

5 Key Recommendations for Social Media Platforms on Handling Hacked Materials Around Elections

This column offers five clear recommendations to social media platforms on how they should handle hacked materials for the remainder of the 2020 election season and into the future.

A woman uses her cellphone while walking past a fountain in Los Angeles on August 13, 2019. (Getty/Frederic J. Brown/AFP)

In recent years, the release of hacked materials has been weaponized as a political tactic to undermine electoral contests around the world. As the retrospective report from the U.S. Senate Select Committee on Intelligence, indictments from the Department of Justice, and independent research attest, the United States was the victim of one such effort in 2016. Foreign operatives weaponized the incentives of the press, design of social media, and free flow of information online to distort, distract, and divert attention with stolen information during a key time period in the general election. Claire Wardle at First Draft described this instance as a malinformation campaign:

The term [malinformation] describes genuine information that is shared with an intent to cause harm. An example of this is when Russian agents hacked into emails from the Democratic National Committee and the Hillary Clinton campaign and leaked certain details to the public to damage reputations.

Other examples include the hack and leak of Emmanuel Macron’s emails during the final round of the 2017 French presidential election as well as the hack and leak of government trade documents ahead of the 2019 U.K. election.

Influence operations involving hacked materials are also a threat to the U.S. 2020 general election. Statements from major technology companies, issued after meeting with U.S. election security professionals, named preparation for a hack and leak operation as a key step in election protection. A September statement from the FBI and Cybersecurity and Infrastructure Security Agency warned of “false claims of hacked voter information” as a likely threat vector in attacking U.S. election legitimacy. The introduction of manipulated or forged documents as part of a leak—a practice the Citizen Lab terms “tainted leaks”—or vying for media attention by presenting forged documents as leaked documents, a method Joan Donovan and Brian Friedberg term “leak forgery,” are also potential threats.

While recent examples from abroad suggest that a hacked materials attack may come extremely close to Election Day, potential instances have already occurred. Clear policies must be developed before additional threats arise.

Recommended hacked materials policies

  • Every social network should develop and publish a hacked materials policy reflecting these impending threats for the 2020 U.S. general election. This policy should apply ahead of and following Election Day, lasting through the certification of results and any transfers of power. Such a policy should specifically cover hacked or stolen materials and should not be subject to modification based on politically motivated complaints during the election and postelection period. After the election process is fully completed, platforms should engage in a postmortem analysis, enable independent research of their actions, and update their hacked materials policies accordingly.
  • Hacked materials policies should prohibit, at a minimum, the direct linking, posting, or distribution of hacked materials with direct relevance to the 2020 election but exempt reporting or discussion of those hacks. Such a policy would not impede the essential work of the press in critically covering the contents or events surrounding a hack and leak operation, nor public discussion of it, but would mitigate the wide distribution of its uncontextualized contents.
  • Platforms should apply stricter standards and quicker enforcement in close proximity to elections. Recent history suggests that foreign actors are likely to target the critical window in the days leading up to an election for the distribution of harmful materials, and platforms must make every effort to prevent their vulnerable products from again being weaponized to this end. Given the potentially extended nature of this year’s election results period due to higher rates of mail-in voting, platforms should also consider the days and weeks following the election as a high-risk period meriting additional scrutiny.
  • Platforms must enforce these policies before hacked material garners mass attention, at which point much of the damage is already done. To prevent such a scenario more effectively, social media platforms should curb the distribution of potentially hacked materials by reducing algorithmic promotion. Platforms should alter content moderation processes in the spirit of virality circuit breakers, which may help prevent the wide distribution of disinformation and malinformation around the 2020 election, including efforts in the form of real or forged hacked materials. For example, platforms made swift and welcome, if imperfect, efforts to limit the spread of a story that more than 50 former intelligence officials decried as having “all the classic earmarks of a Russian information operation.”
  • Social media platforms must be accountable for their outsize influence on democratic discourse. Such accountability includes unprecedented transparency about their efforts to prevent their products from being used to impede democratic processes. As the election nears, platforms must be entirely and proactively transparent about policy and product changes, enforcement efforts, and monitoring efforts in a way that better enables ongoing independent scrutiny. In parallel with calls for data preservation around COVID-19 moderation efforts, platforms should preserve and share detailed data on their 2020 election moderation actions to enable future independent research on platform efforts during this time period.

Social media platforms have a responsibility to prevent their products from again being used to interfere with the democratic process: The events of the 2016 election cannot be allowed to repeat. In the weeks ahead and beyond, clear policies and swift action on politically weaponized hacked materials will continue to be necessary.

Adam Conner is the vice president of Technology Policy at the Center for American Progress. Erin Simpson is an associate director of Technology Policy at the Center. 

Appendix: Assorted platforms’ hacked materials policies 

The positions of American Progress, and our policy experts, are independent, and the findings and conclusions presented are those of American Progress alone. A full list of supporters is available here. American Progress would like to acknowledge the many generous supporters who make our work possible.
