Social Media and the 2022 Midterm Elections: Anticipating Online Threats to Democratic Legitimacy

Social media companies continue to allow attacks on U.S. democracy to proliferate on their platforms, undermining election legitimacy, fueling hate and violence, and sowing chaos.

A person takes a photo as Facebook CEO Mark Zuckerberg speaks during a joint hearing of the U.S. Senate Commerce, Science, and Transportation Committee and the U.S. Senate Judiciary Committee on April 10, 2018, in Washington, D.C. (Getty/AFP/Brendan Smialowski)

Authors’ note: This issue brief contains descriptions and direct quotes of online threats, including some of an offensive nature.

The American election system and democratic process have been under continuous informational assault for the past few years. All the while, social media companies continue to abdicate responsibility for their role in facilitating these attacks on U.S. election integrity. The Center for American Progress has forcefully advocated, and continues to advocate, for social media companies to do more to prevent abuse of their products to spread mass hate and disinformation.

As the nation nears the 2022 U.S. midterm elections, CAP has spent the past year working with partners to urge companies to make the meaningful changes needed. Yet it is clear that social media companies have again refused to grapple with their complicity in fueling hate and informational disorder ahead of the midterms. With key exceptions, companies have again offered cosmetic changes and empty promises not backed up by appropriate staffing or resources. In many cases, companies are backing away from the commitments and efforts they made around the 2020 U.S. election. This again leaves the country’s elections vulnerable and sends a troubling signal about how the companies will treat elections around the world.

This issue brief outlines what is needed from social media companies and identifies three of the top threats they pose to the 2022 midterm elections—the season opener for the 2024 presidential election.

Social media companies and the 2020 U.S. general election

A mammoth effort by election deniers to attack the democratic system and undermine its legitimacy is underway. These attacks on the election process, combined with an overall threat environment marked by election denial and harassment of election workers, pose a profound threat to the 2022 election and to American democracy writ large.

After unfounded claims of fraud spread during the 2020 presidential election, considerable threats to the democratic process remain, including the refusal to certify election results, increased voter intimidation, and abuse of electoral laws.1 Election workers, meanwhile, are facing unprecedented harassment and intimidation,2 and protecting those individuals must become a priority.3

Social media companies are not the cause of these threats, yet they have allowed their platforms to be easily abused to carry out those attacks. In 2020, CAP advocated not only for improved rules and enforcement to counter election delegitimization but also for significant changes to front-end social media interfaces and back-end algorithmic management decisions that could more fundamentally mitigate the spread of disinformation online.4 Some of these, including recommendations advancing the idea of viral circuit breakers, were reportedly adopted by platforms.5 Yet warnings that the threats would not only continue but change and worsen in the post-election environment went unheeded. Firms shut down protective features,6 even as violent groups ramped up their efforts to organize the January 6 insurrection at the U.S. Capitol. CAP’s explicit call for action in October 2020 warned: “During [the postelection period], new threats will emerge as the information environment evolves with each phase of the election process—from initial counts and media projections to canvassing and recounts to the Electoral College and congressional certification. … Aggrieved groups may seek to organize disruption of recounts or the Electoral College process.”7

Following the white supremacist insurrection on January 6, CAP called for the immediate removal of then-President Donald Trump for inciting racist violence and attacks on democracy.8 CAP joined with allies to address the egregious interlanguage disparities in social media content moderation during the election9 and, in its sweeping 2021 report, “How To Regulate Tech: A Technology Policy Framework for Online Services,” proposed a path forward on tech regulation that could structurally address these issues and the United States’ failure to meaningfully regulate all online services.10

Social media companies, the 2022 U.S. midterms, and beyond

In comparison to the 2020 general election, the 2022 midterms offer fewer unknowns. The public is more accustomed to voting by mail, drop boxes, and the extended period of ballot counting that follows election night. However, the known threat has worsened: An unprecedented number of commentators and candidates are advancing false claims about the democratic process and election integrity. Those claims circulate through a media environment of cable news, talk radio, podcasts, and livestreams that is highly charged with falsehoods and vulnerable to manipulation. The rise of conservative-centric digital platforms has exacerbated this environment,11 creating space for the “big lie”—the unfounded falsehood that the 2020 U.S. election was “stolen” from Donald Trump—to flourish.12

CAP anticipates that the 2022 midterms will open the curtain on these rhetorical and procedural assaults, which will continue to grow ahead of the 2024 general election. Election deniers may loudly launch baseless legal challenges. Politically motivated election officials may refuse to certify sound election results. Voters may elect public officials who hypocritically deny the security of elections, putting them in positions of power over election administration. And most obviously, former President Trump may declare his intent to run for president in 2024, even before any newly elected officials are seated, giving him and his constant stream of disinformation increased media attention.

Thus, aggressive action is needed from social media companies beyond the midterms; informational assaults on democracy will occur every day between now and the completion of the 2024 Electoral College process in early 2025. While social media platforms are not the sole cause of these challenges to democracy, they are a critical accelerant.13

Recognizing these threats, CAP joined with civil rights, racial justice, and election integrity partners to launch renewed efforts ahead of the 2022 midterms. As co-chair of the Change the Terms coalition,14 CAP met with companies to urge them to #FixTheFeed and stop amplifying hate ahead of the midterm elections,15 co-authored a clear set of demands for social media companies supported by more than 120 civil rights and democracy groups (see “Midterm demands to social media companies”),16 and supported a second letter—once again urging action in the public interest.17

Still, the companies—including Meta (parent company of Facebook and Instagram), Twitter, and TikTok—have fallen short of those demands.18 Proactively providing information about how to vote to social media users in multiple languages is a welcome effort, but it falls short of what is needed to ensure that disinformation targeting non-English-speaking communities is appropriately identified, labeled, and removed. Policies that prohibit violent rhetoric are, likewise, essential but of little use if not backed up by sufficient staffing and enforcement resources. Programs to detect and curtail coordinated harassment of election officials and election workers are also to be cheered, but such programs are insufficient if the same companies continue to platform the dangerous and violent lies that motivate that harassment.

Midterm demands to social media companies

For more, read the initial letter from the Leadership Conference on Civil and Human Rights, Common Cause, Free Press, Lawyers’ Committee for Civil Rights Under Law, and CAP—which includes the following recommendations:19

  1. Introduce friction to reduce the distribution of content containing electoral disinformation.
  2. Focus on disinformation targeting non-English-speaking communities.
  3. Consistently enforce civic integrity policies during both election and nonelection cycles.
  4. Prioritize enforcement to combat the “big lie.”
  5. Consistently apply civic integrity policies to all live content as a means of combating election disinformation.
  6. Prioritize fact-checking of electoral content, including political advertisements and posts from public officials.
  7. Provide real-time access to social media data for external researchers and watchdogs.
  8. Provide greater transparency of political advertisements, enforcement practices, and algorithmic models.

Given companies’ insufficient efforts to make the fundamental changes needed, the underlying risks that motivated civil society advocacy and recommendations throughout the summer persist. Now, only days ahead of the election, three megathreats to the 2022 midterm elections, posed by social media companies, have coalesced: 1) election subversion theater—that is, providing a space for election deniers to rhetorically “act out” the big lie for the public, even as their procedural challenges fail; 2) online harassment of election workers and voters; and 3) post-election informational chaos surrounding results and counting delays, stoked to baselessly delegitimize the democratic process.

Ahead of the 2020 election, social media platforms did not pay enough attention to CAP and other organizations’ warnings about these threats and the post-election period; there can be no excuse for a repeat in 2022 and the start of the 2024 election cycle.

Ongoing social media threats to the 2022 midterms and beyond

Threat No. 1: Election subversion theater

The threats to democratic legitimacy are—and have long been—twofold: procedural threats and perceptive threats. As noted above, there are tangible procedural threats to American elections,20 including some partisan officials’ refusal to certify, voter suppression, and tampering with results by aggrieved election workers. But there are also perceptive threats to democratic legitimacy, as the legitimacy of the election and sustainability of democracy rest in part on social trust and public perception of election integrity. Whenever public perception of an election diverges from reality, it creates frictions that threaten to destabilize democracy. This is, of course, true in either direction: An illegitimate election perceived as legitimate and a legitimate election perceived as illegitimate both carry catastrophic consequences for democracy.

A herculean effort has been made to secure the procedural legitimacy of U.S. elections. Election workers and public officials have worked tirelessly to administer secure, fair elections, even in the face of challenges presented by the COVID-19 pandemic and threats from anti-democratic forces. But procedural security is not enough; democratic legitimacy rests on the ability of the American people to perceive and believe in that security. If half the battle is securing the elections, the other half is clearly communicating that security and making transparent just how safe elections are. In the face of procedurally secure elections, election deniers will double down on rhetorical efforts to destroy the perceived legitimacy of the democratic process.

These attempts constitute a type of “election subversion theater,” acting out the media cycle around baseless claims of fraud in order to create the impression that there were instances of fraud or election insecurities, even when there were not. Media coverage of these false claims and accompanying online conversation about manufactured events or baseless legal challenges can leave the public with the impression that something about the election was amiss, even when those claims are empty. In this way, even if procedural attempts to undermine the election fail, election subversion theater poses a threat in its ability to undermine accurate public perception and trust.

In recognition of that, election officials, local leaders, and grassroots groups have been working hard to show their work and let people know that elections are secure.21 As discussed, however, social media companies are failing to do the same. Facebook and Twitter have failed to continue enforcement around the big lie, as users’ posts questioning the integrity of the 2020 election are neither labeled as misleading nor removed. Many platforms claim their civic integrity policies apply only to the current election, leaving the door open for false claims to spread outside of election cycles. Other rules are, likewise, too narrowly constructed to be effective, drawing on insufficient standards around whether something could be considered “opinion” even if it relies on damaging claims that have been disproven in court again and again. Perfunctory blog posts that hand-wave vaguely about disinformation while sidestepping the elephant in the room (enforcement around the big lie, including against the more than 100 candidates and the former president promoting it) offer little support to election workers who are concerned their efforts might be undermined by election deniers on social media.

Ultimately, social media platforms have a significant role to play in ensuring that democratic legitimacy is not attacked in massive, coordinated, and baseless ways in the court of public opinion.22 If platforms allow their tools to stoke confusion and sow doubt about secure election processes, the work of public officials everywhere to secure the procedural legitimacy of the election will be undercut by rhetorical campaigns against it. Wherever election deniers’ efforts to substantively disrupt secure elections fail, they will turn to rhetorical tactics to create the illusion of disruption and chaos. Social media companies that fail to disrupt election subversion theater, but rather provide the national stage and necessary tools to carry it out, are complicit in these assaults on American democracy.

Threat No. 2: Online harassment and intimidation of election workers

This past summer, Wandrea ArShaye “Shaye” Moss, a former election worker in Georgia, provided harrowing testimony to the U.S. House Select Committee to Investigate the January 6th Attack on the United States Capitol, detailing the harassment she faced after former President Donald Trump and his allies targeted her and her mother in the weeks and months after the 2020 presidential election in an effort to cast doubt on its results. Moss faced death threats, vitriolic and racist comments, and harassment in her daily life. She was ultimately forced to leave her job as an election official and even go into hiding.23

This type of intimidation is, unfortunately, not an isolated incident. Threats directed at election workers have become more acute and pervasive, even though there is no evidence of election fraud or misconduct. In a recent survey conducted by the Brennan Center for Justice, 1 in 3 election officials reported feeling unsafe because of their job, and 77 percent reported that threats against election officials have increased in recent years.24 To be sure, harassment is not exclusive to social media, but online threats are nonetheless widespread and highly concerning. The Brennan Center report found that 37 percent of election workers who have experienced threats reported that those threats were made on social media.

A 2021 Reuters investigation revealed details about the targets of these threats and the alarming content they were confronted with,25 including:

  • The Colorado secretary of state received multiple death threats across Facebook and Instagram, including: “Watch your back. I KNOW WHERE YOU SLEEP, I SEE YOU SLEEPING. BE AFRAID … I hope you die” and “Guess who is going to hang when all the fraud is revealed? (*Hint ..look in the mirror).”
  • A Facebook message to a deputy to the Philadelphia city commissioner read: “EVERYONE WITH A GUN IS GOING TO BE AT YOUR HOUSE- AMERICANS LOOK AT THE NAME- ANOTHER JEW CAUGHT UP IN UNITED STATES VOTER FRAUD.”

As noted above, CAP has repeatedly emphasized the importance of election workers and has called for policy and legal solutions to protect them from continued harassment. The U.S. Department of Justice has launched a task force to combat threats against election workers, but social media companies must also take seriously the spread of disinformation and harassment that fuels these threats.26 Death threats, or any violent rhetoric, must be flagged and removed immediately—not only during the election cycle but on a consistent basis.

Beyond that, social media companies need to introduce mechanisms to slow the spread of election disinformation. Threats against and harassment of election workers that occur offline often stem from the disinformation that is allowed to fester online. Election conspiracy theories, outright lies, and even calls for violence are rampant on social media and contribute to an environment that makes the lives of election workers increasingly unsafe. In fact, the Brennan Center survey reported that nearly 2 in 3 election officials believe that false information makes their jobs more dangerous, and more than 3 in 4 believe that social media companies have not done enough to stop the spread of false information.27

Consistent, proactive, and robust enforcement of civic integrity policies, permanent bans of users who have made public death threats, and the use of friction—that is, slowing the spread of disinformation through product changes such as not recommending disallowed content on “explore” pages or disabling users’ ability to share or retweet before reading an article—are essential for mitigating harassment.28
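To make the idea of friction concrete, the following minimal Python sketch shows one way such product changes could be expressed as a single distribution decision: keeping a flagged post off recommendation surfaces and requiring a click-through before resharing, while reserving removal for violent threats. It is purely illustrative; the `Post` fields, classifier flags, and decision names are assumptions for the sake of example, not any platform’s actual systems or API.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str
    flagged_civic_integrity: bool        # hypothetical flag set by classifiers or reviewers
    author_has_violent_threat_strike: bool

def apply_friction(post: Post) -> dict:
    """Return an illustrative distribution decision for a post.

    Friction slows spread instead of relying on removal alone: flagged posts are
    excluded from recommendation surfaces, and one-tap resharing is replaced with
    a prompt asking the user to open the item first.
    """
    if post.author_has_violent_threat_strike:
        # Death threats and violent rhetoric call for removal, not mere friction.
        return {"action": "remove_and_escalate"}
    if post.flagged_civic_integrity:
        return {
            "action": "limit_distribution",
            "exclude_from_explore": True,          # do not recommend on discovery pages
            "require_click_before_reshare": True,  # interstitial before share/retweet
            "label": "Get official information about how elections are run.",
        }
    return {"action": "no_change"}

if __name__ == "__main__":
    post = Post("p1", "The count is rigged!!",
                flagged_civic_integrity=True,
                author_has_violent_threat_strike=False)
    print(apply_friction(post))  # -> limit_distribution with friction settings applied
```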

Threat No. 3: Post-election informational chaos

As CAP wrote ahead of the 2020 election in “Results Not Found: Addressing Social Media’s Threat to Democratic Legitimacy and Public Safety After Election Day,” the online threat environment undergoes a transition between the pre- and post-election periods.29 Once polls close, voters are no longer seeking information to inform their vote. As such, online platforms must shift their approach in order to preserve civic integrity, moving to a more aggressive posture toward removing inaccurate and inflammatory content that seeks to delegitimize the election or sow chaos.

Failure to police the post-election information environment vigilantly comes with grave risks, most obviously violent insurrection attempts. In the post-election period of 2020, the relentless spread of disinformation by former President Trump and his allies culminated in insurrectionists storming the U.S. Capitol on January 6 to attempt to disrupt the peaceful transfer of power. The events of that day should loom large in the minds of online platforms as the midterm elections are carried out.

Online platforms have every piece of evidence and no excuse not to be prepared for the same tactics from political candidates in the 2022 election. They must remain vigilant and dedicate extra resources during and after this election, up until the point that state and local elected officials are seated for their new terms and the new Congress has been convened. In addition to continuing to enforce against false claims about past elections and carefully monitoring for premature delegitimization attempts, such as declaring an election stolen or rigged, platforms should prepare the following actions for after the polls close:

  • Prepare moderation systems for the post-election features that became issues in 2020, including the so-called “blue shift” that comes from counting mail-in ballots, normal counting errors that are legally corrected, and routine delays caused by weather or other ordinary factors. While content labeling—that is, the informational tags or warnings attached to social media posts with potentially sensitive content—has shown mixed efficacy in reported studies, time-sensitive labels should be used to append information on vote counting and administration processes.30
  • Take into account off-platform declarations from candidates—for example, in press conferences—to delegitimize the election. As part of this effort, serial purveyors of election delegitimization should be removed.
  • Employ viral circuit breakers to ensure that the spread of false election information or delegitimization does not become immediately damaging; a minimal illustrative sketch of this approach appears after this list.
  • Proactively monitor for—and expeditiously remove—attempts to create conspiracy theories about election workers. Election workers are often volunteers or temporary employees whose duties end soon after, if not on, Election Day, and they may not be provided the same protections as full-time election administrators. This also goes for both the full-time and temporary staff overseeing ballot counting, canvassing, storage, and certification, who have become victims of increasingly vitriolic attacks. These threats can sometimes surface months after the election has ended, so these protections should not be solely limited to any heightened civic integrity period.
  • Proactively monitor for—and remove—any events or calls to action to converge at ballot counting or storage facilities in order to ensure that election records are not physically at risk. Law enforcement should also be notified of pending potential real-world actions.
  • Proactively monitor for and remove calls to disrupt the peaceful transfer of power, including during gubernatorial inaugurations and swearing-in ceremonies for critical but lesser-known roles, such as secretary of state. Again, law enforcement should be notified of pending threats.
  • Prohibit advertisements that promote the big lie, delegitimize the election, or otherwise declare elections stolen or rigged.
  • Prepare for the possibility of runoff elections, which would combine both pre- and post-election circumstances beyond November and complicate civic integrity enforcement. Vigilance is needed on rhetoric, posts, and advertisements questioning the integrity of an election before and during any runoff elections—from not only the candidates on the ballot but also any other political figures who campaign in the state.
  • Ensure any civic integrity period does not end until after state and local elected officials are seated for their new terms and the new Congress has been convened.
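The viral circuit breaker mentioned in the list above can be sketched in simplified form. The Python code below is a hypothetical example, not a description of any platform’s actual system: the 15-minute window, the share threshold, and the function names are all assumptions. The idea is simply that a post whose share velocity spikes is temporarily held out of algorithmic amplification until human review can occur.

```python
import time
from collections import defaultdict, deque
from typing import Optional

# Illustrative thresholds; a real system would tune these per surface and topic.
WINDOW_SECONDS = 15 * 60   # measure share velocity over a sliding 15-minute window
SHARE_THRESHOLD = 5000     # shares within the window that trip the breaker

_share_times = defaultdict(deque)  # post_id -> timestamps of recent shares
_paused_posts = set()              # posts whose amplification is paused pending review

def record_share(post_id: str, now: Optional[float] = None) -> None:
    """Log a share event and trip the circuit breaker if velocity is too high."""
    now = time.time() if now is None else now
    shares = _share_times[post_id]
    shares.append(now)
    # Drop share events that fall outside the sliding window.
    while shares and now - shares[0] > WINDOW_SECONDS:
        shares.popleft()
    if len(shares) >= SHARE_THRESHOLD:
        _paused_posts.add(post_id)  # stop recommending or boosting until reviewed

def is_amplification_paused(post_id: str) -> bool:
    return post_id in _paused_posts

if __name__ == "__main__":
    # Simulate a rumor being shared 5,000 times in under 10 minutes.
    for i in range(SHARE_THRESHOLD):
        record_share("election-rumor-123", now=float(i) * 0.1)
    print(is_amplification_paused("election-rumor-123"))  # True: held for review
```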

Finally, there is the possibility that during the months between the polls closing and the swearing in of the 118th U.S. Congress, prominent political figures may officially declare themselves candidates in the 2024 election.31 This would mark the start of the 2024 election cycle, and if that were to happen, social media platforms should not let their civic integrity policies or election rules expire after the 2022 election but should continue them through 2023 and beyond.

Conclusion

American democracy remains in a perilous state. The commitment of a large swath of political candidates to the big lie means that the political environment is uniquely tilted against the integrity of the U.S. election system. Online platforms are not the sole cause of this crisis. But companies’ continued refusal to make the fundamental changes required to stop their tools from platforming hate and election subversion theater makes them complicit. Given their lackluster preparations for the 2022 midterms, social media companies are choosing again to abdicate their public responsibilities rather than accept accountability for how their tools are being used to destroy the public institutions that maintain the free society in which they operate.

Acknowledgements

The authors would like to thank our colleagues on the CAP Democracy Policy team—Will Roberts, Alex Tausanovitch, and Greta Bedekovics—for their help as well as our allies at the Leadership Conference on Civil and Human Rights, Common Cause, Free Press, Lawyers’ Committee for Civil Rights Under Law, and the Change the Terms coalition for their partnership with CAP on these critical issues over the years.

Endnotes

  1. Alex Tausanovitch, “How To Save American Democracy” (Washington: Center for American Progress, 2022), available at https://www.americanprogress.org/article/how-to-save-american-democracy/.
  2. Michael Sozan, “Poll Workers Are Indispensable to the November Election,” Center for American Progress, October 6, 2022, available at https://www.americanprogress.org/article/poll-workers-are-indispensable-to-the-november-election/.
  3. Greta Bedekovics, “Protecting Election Workers and Officials From Threats and Harassment During the Midterms,” Center for American Progress, October 13, 2022, available at https://www.americanprogress.org/article/protecting-election-workers-and-officials-from-threats-and-harassment-during-the-midterms/.
  4. Erin Simpson and Adam Conner, “Fighting Coronavirus Misinformation and Disinformation: Preventive Product Recommendations for Social Media Platforms” (Washington: Center for American Progress, 2020), available at https://www.americanprogress.org/article/fighting-coronavirus-misinformation-disinformation/; Adam Conner and Erin Simpson, “Results Not Found: Addressing Social Media’s Threat to Democratic Legitimacy and Public Safety After Election Day” (Washington: Center for American Progress, 2020), available at https://www.americanprogress.org/article/results-not-found-addressing-social-medias-threat-democratic-legitimacy-public-safety-election-day/.
  5. The Interface With Casey Newton, “New ideas for fighting COVID-19 misinformation,” August 20, 2020, available at https://www.getrevue.co/profile/caseynewton/issues/new-ideas-for-fighting-covid-19-misinformation-272134.
  6. Craig Timberg, Elizabeth Dwoskin, and Reed Albergotti, “Inside Facebook, Jan. 6 violence fueled anger, regret over missed warning signs,” The Washington Post, October 22, 2021, available at https://www.washingtonpost.com/technology/2021/10/22/jan-6-capitol-riot-facebook/.
  7. Conner and Simpson, “Results Not Found.”
  8. Center for American Progress, “STATEMENT: CAP’s Adam Conner Says Twitter’s Decision To Ban Trump Is Long Overdue, Calls for All Social Platforms To Permanently Suspend His Accounts,” Press release, January 8, 2021, available at https://www.americanprogress.org/press/statement-caps-adam-conner-says-twitters-decision-ban-trump-long-overdue-calls-social-platforms-permanently-suspend-accounts/; Change the Terms, “Trump’s Incitement of Racist Violence Predated January 6,” February 11, 2021, available at https://osbcontent.s3-eu-west-1.amazonaws.com/PC-09383.pdf.
  9. Shannon Bond, “’Ya Basta Facebook’ Says Company Must Curb Misinformation In Spanish,” NPR, March 16, 2021, available at https://www.npr.org/2021/03/16/977613561/ya-basta-facebook-says-company-must-curb-misinformation-in-spanish.
  10. Erin Simpson and Adam Conner, “How To Regulate Tech: A Technology Policy Framework for Online Services” (Washington: Center for American Progress, 2021), available at https://www.americanprogress.org/article/how-to-regulate-tech-a-technology-policy-framework-for-online-services/.
  11. Tiffany Hsu, “News on Fringe Social Sites Draws Limited but Loyal Fans, Report Finds,” The New York Times, October 6, 2022, available at https://www.nytimes.com/2022/10/06/technology/parler-truth-social-telegram-pew.html.
  12. Center for American Progress, “STATEMENT: A Year After the Violent Attack on the Capitol, Insurrectionists and Those Who Spurred Them On Must Be Held Accountable, CAP’s Patrick Gaspard Says,” Press release, January 5, 2022, available at https://www.americanprogress.org/press/statement-a-year-after-the-violent-attack-on-the-capitol-insurrectionists-and-those-who-spurred-them-on-must-be-held-accountable-caps-patrick-gaspard-says/.
  13. Paul M. Barrett, Justin Hendrix, and J. Grant Sims, “Fueling the Fire: How Social Media Intensifies U.S. Political Polarization — And What Can Be Done About It” (New York: NYU Stern Center for Business and Human Rights, 2021), available at https://bhr.stern.nyu.edu/polarization-report-page.
  14. Change the Terms, “About,” available at https://www.changetheterms.org/about (last accessed October 2022).
  15. Change the Terms, “Social-Media Companies Must #FixTheFeed,” available at https://www.changetheterms.org/fix-the-feed (last accessed October 2022).
  16. The Leadership Conference on Civil and Human Rights, “The Leadership Conference and 120 Civil Rights & Democracy Groups Urge Social Media Platforms to Take Meaningful Steps to Address Election Disinformation,” May 12, 2022, available at https://civilrights.org/resource/the-leadership-conference-and-120-civil-rights-democracy-groups-urge-social-media-platforms-to-take-meaningful-steps-to-address-election-disinformation/; Reuters, “U.S. groups urge social media companies to fight ‘Big Lie,’ election disinformation,” May 12, 2022, available at https://www.reuters.com/world/us/us-groups-urge-social-media-companies-fight-big-lie-election-disinformation-2022-05-12/.
  17. The Leadership Conference on Civil and Human Rights, “Leadership Conference and Democracy Groups Urge Social Media Platforms to Address Voter Disinformation Ahead of Midterms,” October 13, 2022, available at https://civilrights.org/resource/leadership-conference-and-democracy-groups-urge-social-media-platforms-to-address-voter-disinformation-ahead-of-midterms/.
  18. Free Press, “Empty Promises: Inside Big Tech’s Weak Effort to Fight Hate and Lies in 2022” (Florence, MA: 2022), available at https://www.freepress.net/sites/default/files/2022-10/empty_promises_inside_big_techs_weak_effort_to_fight_hate_and_lies_in_2022_free_press_final.pdf.
  19. The Leadership Conference on Civil and Human Rights, “The Leadership Conference and 120 Civil Rights & Democracy Groups Urge Social Media Platforms to Take Meaningful Steps to Address Election Disinformation.”
  20. Tausanovitch, “How To Save American Democracy.”
  21. The Columbus Dispatch, “Letters: 6 reasons Ohioans can trust elections are full of safe, secure,” October 18, 2022, available at https://www.dispatch.com/story/opinion/letters/2022/10/18/letters-are-elections-in-ohio-safe-and-secure-columbus-invisible-disabilities-week-mike-dewine/69568401007/; Tim Vandenack, “Utah officials tout election security as ballots hit mailboxes,” Standard-Examiner, October 17, 2022, available at https://www.standard.net/news/government/2022/oct/17/utah-officials-tout-election-security-as-ballots-hit-mailboxes/; Amber Delay, “Moffat County officials want voters to know local elections are secure,” Craig Daily Press, October 17, 2022, available at https://www.craigdailypress.com/news/county-officials-take-several-steps-to-ensure-election-security/.
  22. U.S. House Select Committee to Investigate the January 6th Attack on the United States Capitol, “Hearings,” available at https://january6th.house.gov/legislation/hearings (last accessed October 2022).
  23. John F. Kennedy Presidential Library and Museum, “2022 Profile in Courage Award: Wandrea’ ArShaye Moss,” available at https://www.jfklibrary.org/events-and-awards/profile-in-courage-award/award-recipients/defending-democracy-2022/wandrea-arshaye-moss (last accessed October 2022).
  24. Brennan Center for Justice, “Local Election Officials Survey (March 2022),” March 10, 2022, available at https://www.brennancenter.org/our-work/research-reports/local-election-officials-survey-march-2022; Brennan Center for Justice, “Election Officials Under Attack” (New York: Brennan Center for Justice, 2021), available at https://www.brennancenter.org/our-work/policy-solutions/election-officials-under-attack.
  25. Linda So and Jason Szep, “U.S. election workers get little help from law enforcement as terror threats mount,” Reuters, September 8, 2021, available at https://www.reuters.com/investigates/special-report/usa-election-threats-law-enforcement.
  26. U.S. Department of Justice, “Justice Department Launches Task Force To Combat Threats Against Election Workers,” July 29, 2021, available at https://www.justice.gov/opa/blog/justice-department-launches-task-force-combat-threats-against-election-workers-0.
  27. Brennan Center for Justice, “Local Election Officials Survey (March 2022).”
  28. “Friction” is anything that inhibits user action within a digital interface. See Simpson and Conner, “Fighting Coronavirus Misinformation and Disinformation.”
  29. Conner and Simpson, “Results Not Found.”
  30. Kayla Gogarty, “Facebook keeps touting its labels, but data suggests labels actually amplified Trump’s misinformation,” Media Matters for America, June 2, 2021, available at https://www.mediamatters.org/facebook/facebook-keeps-touting-its-labels-data-suggests-labels-actually-amplified-trumps; Katherine Clayton and others, “Real Solutions for Fake News? Measuring the Effectiveness of General Warnings and Fact-Check Tags in Reducing Belief in False Stories on Social Media,” Political Behavior 42 (2020): 1073–1095, available at https://doi.org/10.1007/s11109-019-09533-0.
  31. Kathryn Watson, “Kellyanne Conway says Trump ‘wants his old job back,’ and would like to announce within weeks,” CBS News, October 1, 2022, available at https://www.cbsnews.com/news/kellyanne-conway-trump-wants-old-job-back-would-like-to-announce-within-weeks/; Jonathan Lemire, “Election anxiety creeps inside the White House,” Politico, October 26, 2022, available at https://www.politico.com/news/2022/10/26/election-anxiety-begins-to-creep-inside-the-white-house-00063524.

The positions of American Progress, and our policy experts, are independent, and the findings and conclusions presented are those of American Progress alone. A full list of supporters is available here. American Progress would like to acknowledge the many generous supporters who make our work possible.

Authors

Erin Simpson

Former Director, Technology Policy

Adam Conner

Vice President, Technology Policy

Ashleigh Maciolek

Former Research Associate
