Center for American Progress

Unions Give Workers a Voice Over How AI Affects Their Jobs

Collective bargaining is a powerful tool workers can use to ensure artificial intelligence and algorithmic technology improve their jobs instead of making working conditions worse, and workers have won several recent contracts that give them power over how AI will affect their working lives.

A few employees work at their computers at a tech startup office in San Francisco, March 2021. (Getty/Justin Sullivan)

Introduction and summary

As policymakers seek solutions to the current and future impacts of artificial intelligence (AI) on workers across the country, workers themselves are using their union bargaining power to negotiate contract provisions that prevent the elimination of jobs, place limits on surveillance and algorithmic management, and enable workers to benefit from productivity boosts offered by AI tools.1

AI and machine learning technologies2 are being used in new ways to automate nonroutine tasks, from writing code and human-sounding text to managing schedules, promising an increase in productivity for some workers. Nevertheless, many workers are understandably nervous that they will be denied the benefits of AI technology. Even as worker productivity increased over the past several decades, those gains went “everywhere but the paychecks of the bottom 80% of workers,” according to research from the Economic Policy Institute.3 For many workers, AI further threatens to automate parts or all of their jobs or worsen conditions by replacing human decision-making with algorithmic management driven by data harvested via invasive surveillance. Sam Altman, CEO of ChatGPT developer OpenAI, predicts “jobs are definitely going to go away, full stop.”4

While policymakers can take advantage of existing worker protections to ensure the use of AI in the workplace benefits workers and consider additional legislative steps, the examples set by unions make clear that policymakers also must complement these protections by strengthening the right to join a union and bargain collectively. Existing law can be applied in a way that protects workers from some of the potential harms of the use of AI in the workplace and hiring. Jennifer Abruzzo, general counsel of the National Labor Relations Board (NLRB), argued in a 2022 memo on electronic hiring and algorithmic management that many of the AI technologies used by employers are already illegal under settled law and urged the board to adopt a framework for protecting employees from surveillance and algorithmic management that interferes with protected activity.5 The Equal Employment Opportunity Commission issued technical assistance in 2021 on compliance with the Americans with Disabilities Act6 and algorithmic decision-making tools in hiring and employment as part of a larger algorithmic fairness initiative.7 Across the Biden administration, policymakers can further consider how existing law already regulates the use of AI.

Policymakers in Congress and the administration must center workers’ needs in their response to the use and development of AI8 through measures that strengthen workers’ right to come together in unions; ensure AI augments, rather than replaces, workers; prepare the workforce for AI adoption; and help meet the needs of workers who are displaced.9 Lawmakers in Congress are advocating for bills—such as the Stop Spying Bosses Act and the No Robot Bosses Act—that would protect workers from certain threats from AI. These policies should complement bills that strengthen unions, including the Protecting the Right to Organize (PRO) Act, which would stiffen penalties for union busting and enhance protections for workers trying to organize their colleagues,10 and the Public Service Freedom to Negotiate Act, which proposes strengthening organizing rights in the public sector.11

The Stop Spying Bosses Act and the No Robot Bosses Act

The Stop Spying Bosses Act and the No Robot Bosses Act, both introduced in Congress in 2023, address different aspects of employees’ work lives.

The Stop Spying Bosses Act12 would prohibit employers from using workplace surveillance to monitor worker organizing or to make AI-driven behavioral predictions. The bill would also require employers who surveil employees to disclose their data collection practices so workers know how their data are being collected and used, and it would establish a Privacy and Technology Division at the U.S. Department of Labor to administer the law.

The No Robot Bosses Act13 would prohibit employers from relying exclusively on automated decision-making systems in making employment decisions such as hiring or firing workers. The bill would require testing and oversight of decision-making systems to ensure they do not have a discriminatory impact on workers, and when automated systems are used to help make a decision, it would require employers to describe to workers how the system was used and to allow workers or job applicants to dispute the system’s output with a human.

A comprehensive policy covering AI in the workplace would combine legislation to regulate novel uses of AI that harm workers, enforcement of existing law against uses of AI that already run afoul of existing employee protections, and stronger rights to organize a union and collectively bargain. Workers in some of the nation’s largest unions have worked tirelessly to negotiate over the uses of new technologies and set an example for how other unions can bargain over AI, but policymakers must open the door for more workers to advocate for themselves over how they can reap the promised benefits of AI.

Unions prevent workers from being replaced

In some industries where AI’s deployment threatens to replace workers, unions are winning control over the ways employers can use AI technology and how employees should be compensated. In late 2023, casino workers in Las Vegas represented by the Culinary Workers Union won a new contract that includes a severance package of $2,000 for each year worked if an employee’s role is eliminated due to “technology or AI.”14 AI was a contentious issue for workers in two of the largest and most highly publicized strikes in 2023: the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA)15 strike of actors and the Writers Guild of America (WGA)16 strike of writers. Both unions negotiated contract resolutions with the Alliance of Motion Picture and Television Producers (AMPTP) to address the risks AI posed to workers.

SAG-AFTRA, WGA, generative AI, and digital replicas

AI directly threatens to reduce or replace work completed by writers and actors alike. Generative AI, such as ChatGPT, can let anyone generate human-sounding scripts, screenplays, and articles based on natural language prompts.17 Generative AI technology also enables digital replicas that allow computer-generated models of actors to be visually inserted into scenes and give performances that actors were never present for.18 Both instances not only threaten to displace work performed by skilled writers and actors, but also use actors’ and writers’ preexisting work to produce the output. This means a writer’s own scripts or an actor’s own performances can be used to create new digital ones—and potentially replace the writer or actor. Both SAG-AFTRA and the WGA reached agreements with the AMPTP that for now alleviate much of the threat that workers’ visual likenesses or written work could be used without compensation or in place of work completed by union members. The SAG-AFTRA agreement includes provisions regarding consent and compensation for use of digital replicas powered by AI.19 The WGA contract places limits on how studios can use AI with human-written material, empowering writers to choose whether to use AI technology in completing their work and enabling writers, rather than the studio alone, to gain from the productivity benefits.20

Because unions democratically advocate for the best interests of their membership, they can also push for contract provisions that matter to workers and their unique circumstances, including workplace-specific concerns regarding the use of AI. In the film industry, credits for workers’ completed work on film projects earn workers money through residuals for reuse of credited work21 and help workers land new jobs and command higher earnings;22 credits have been the subject of negotiations with unions since the early days of organizing in film.23 As a result, writers run the risk of losing out on the compensation they deserve when their work appears on screen without being credited; AI increases the likelihood of this since it uses publicly available data created by humans, sometimes including copyrighted material, to create its scripts.24 The WGA fought to prevent AI from using material created by WGA workers and ultimately reached a contract with the AMPTP that explicitly includes a provision that AI-written material cannot be considered source material when determining writing credits.25


The contract SAG-AFTRA members ratified with the AMPTP in December 2023 covers television, streaming, and theatrical productions and has extensive language on AI and its use in producing digital replicas that look like real actors.

The SAG-AFTRA agreement defines an “Employment-Based Digital Replica” as a digital performance produced in “connection with employment on a motion picture” and “with the performer’s physical participation … for the purpose of portraying the performer in photography or sound track in which the performer did not actually perform.”26 This means producers making a movie can scan an actor’s performance, then use an AI-generated version of the actor to shoot a scene that actor was not present for, but the contract does not give producers carte blanche to use the replica however they want without paying the actor. Not only is the worker’s consent required, with a “reasonably specific description of the intended use” of the digital copy, but the worker is also entitled to their “pro rata daily rate or the minimum rate, whichever is higher, for the number of production days that Producer determines the performer would have been required to work had the performer instead performed those scene(s) in person,” along with residuals. In short, producers cannot scan an actor and then use that digital likeness for free, without the actor’s consent and compensation.

The agreement also states that the technology cannot be used to create digital copies of an actor for productions that the actor is not already working on. These are called “independently created digital replicas,” which create “the clear impression that the asset is a natural performer whose voice and/or likeness is recognizable as the voice and/or likeness of an identifiable natural performer.” These replicas, too, require consent accompanied by a specific description of their use, as well as bargaining and compensation.

Finally, actors negotiated compensation for and asserted control over using generative AI tools to create new performances by using old ones. Producers have to give “[n]otice to Union and an opportunity to bargain in good faith over appropriate consideration, if any, if a Synthetic Performer is used in place of a performer who would have been engaged under this Agreement in a human role”—making it harder for producers to simply use synthetic performers as a way to avoid hiring real actors. Producers also have to bargain with individual performers before using AI to generate synthetic performers that share a performer’s distinctive facial features.


The contract agreed on by the WGA with the AMPTP in September 2023 amends the previous contract between writers and producers with an additional article, Article 72, that deals entirely with AI.27 The article makes clear that writing created by AI or generative AI cannot be “considered literary material” under WGA contracts. “Literary material” is the work product writers produce, so under the contract, the output of AI is not a substitute for the work of actual WGA writers.28 Section C of Article 72 goes further to cover situations where writers are hired to use AI-generated content “as the basis for writing literary material” and ensures that writers are still entitled to the payment, credit, and appropriate rights for their work that they would have enjoyed without AI involvement. The article further states that producers “may not require, as a condition of employment, that a writer use a GAI [generative AI] program which generates written material that would otherwise be ‘literary material’ … if written by a writer” and notes this would prevent, for example, a production company from forcing a writer to use ChatGPT to complete their work.

The contract reflects the “uncertain and rapidly developing” legal landscape around AI, and under it, “Each Company agrees to meet with the Guild during the term of this Agreement at least semi-annually at the request of the Guild and subject to appropriate confidentiality agreements to discuss and review information related to the Company’s use and intended use of GAI [generative AI] in motion picture development and production,” giving workers control over how AI use develops in the future. By establishing strong baseline standards to meet the current needs of workers and lay the groundwork for future negotiations, the contract offers an example for how workers can protect their needs in negotiating over AI.

Unions bargain over surveillance

AI-powered tools need vast datasets to function. Employers can harvest these data through invasive surveillance of workers, while AI companies may simultaneously gather even more information by processing large amounts of data automatically. If an employer plans to implement an AI-powered tool in its workplace, it may need data unique to its particular warehouses, trucks, or employee laptops. Employers may gather those data by tracking workers via means such as cameras and microphones,29 now powered by AI,30 as well as by monitoring work laptops issued to employees.31

The problem of employee surveillance is nothing new, and unions have been addressing workers’ concerns for decades since surveillance is not only invasive to workers’ privacy and a detriment to job quality but also a core part of the union-busting toolkit.32 The NLRB ruled as early as 1997 that the use of hidden surveillance cameras is a mandatory point on which employers must bargain with unions.33 The Communications Workers of America (CWA) has long negotiated over the use of surveillance in call centers, as many workers feared surveillance could be used against them to initiate disciplinary proceedings. Over the past 30 years, CWA contracts with AT&T, Verizon, Lumen/CenturyLink, and other telecommunications companies have placed limits on how many calls can be monitored, prevented recorded calls from being used to punish employees, and ensured call monitoring is used only to offer feedback for workers and to help train them.34

Today, surveillance continues to harm workers and poses a renewed threat to workers’ ability to come together in unions.35 Efficient AI-enabled processing of bulk data from sensors such as cameras allows more widespread use of surveillance in the workplace, reducing workers’ privacy and harming job quality.36 For example, workers have alleged that Amazon uses its data to calculate abstract performance metrics37 for its warehouse workers that place pressure on employees to overwork themselves and contribute to high rates of injury, stress, and taking unpaid leave,38 creating what one group of researchers called a “corporate police state.”39 In response, Amazon said it has made progress in improving warehouse safety, does not use camera technology in its warehouses to monitor employees, and offers workers multiple methods for reporting safety concerns.40 Workers have also alleged that Amazon monitors employee communications for labor organizing efforts,41 and journalists have detailed Amazon’s surveillance of private Facebook groups used by workers.42 The NLRB ordered the unsuccessful 2021 union vote at an Amazon warehouse in Bessemer, Alabama, to be rerun after a regional director of the NLRB found that Amazon “created the impression of surveillance” during the mail voting process.43

At the same time, Amazon workers who advocate for unions see unions as a means of addressing workplace surveillance, especially the use of surveillance and automated systems in disciplining employees. Chris Smalls, a former Amazon warehouse employee and the organizer who led a successful unionization vote at Amazon’s Staten Island warehouse in 2022, cited an algorithm-driven employee monitoring system as “one of the big reasons people want to unionize” in 2021, asking “Who wants to be surveilled all day? It’s not prison. It’s work.”44 Joshua Brewer, who helped organize the unsuccessful 2021 union vote at the Amazon warehouse in Bessemer, similarly said when asked why many workers want a union: “The workers answer to a lot of robotic information systems that deliver their discipline, and they have no say in it.”45

The Guardian reports that surveilling calls remains common at call centers,46 which unions have argued worsens job quality.47 Other industries are experimenting with new forms of surveillance. Hospitals have tracked nurses via a range of technologies including sensor badges that monitor when they wash their hands,48 and Uber has long tracked drivers via their smartphones, using the data to develop algorithms that predict driver performance for the purpose of increasing safety.49 The use of surveillance to curtail organizing is such a risk for workers that the NLRB general counsel issued a memo in 2022 urging the NLRB to recognize the threat that surveillance poses to worker organizing.50

The harms posed by surveillance can be mitigated through collective bargaining. In 2022, UPS began to refit its trucks with surveillance cameras that monitor drivers inside their trucks and can continually record and stream data.51 This surveillance increased pressure on workers and interfered with their ability to complete their work in a manageable way, including sorting packages in extreme summer heat. Drivers voiced frustration over the company’s choice to install new surveillance features in their vehicles while it refused to install air conditioning, with one driver alleging “surveillance and discipline are used to make us work faster.”52

The agreement the Teamsters reached with UPS in 2023 after their nationwide strike curtails the use of surveillance in trucks and prevents the potential replacement of workers with automated technology.53 Article 6, Section 6 of the contract covers tracking and surveillance technologies and their use in discipline. The protections are strong and guarantee that human managers are involved at every step of the process, ensuring, “No employee shall be disciplined based solely upon information received from GPS, telematics, or any successor system that similarly tracks or surveils an employee’s movements unless they engage in dishonesty.” Dishonesty explicitly does not alone consist of a “driver’s failure to accurately recall what is reflected by the technology,” and UPS “must confirm by direct observation or other corroborating evidence” behaviors that could result in firing or discipline, preventing tracking technologies from being the sole witness to alleged violations and giving workers a layer of human protection to ensure technology is being used fairly in the disciplinary process. Workers also cannot be warned about infractions based on tracking systems without first having a “verbal counseling session” about that infraction. Surveillance cameras for recording audio and video inside a vehicle cab are banned, and cameras that face outside the vehicle also cannot be used for discipline. These provisions alleviate the burden that surveillance places on workers to overperform in order to avoid being disciplined based on tracking or surveillance technology without human involvement.

Article 6, Section 4 establishes a committee between UPS and the Teamsters to review planned technological changes, which are defined under expansive language to include “any meaningful change in equipment or materials which results in a meaningful change in the work, wages, hours, or working conditions of any classification of employees in the bargaining unit or diminishes the number of workers in any classification of employees in the bargaining unit.” Under the agreement, UPS also agreed to notify the union well in advance of plans to implement any change that falls under this definition and to strike agreements with the union about the change being introduced. Furthermore, “If a technological change creates new work that replaces, enhances or modifies bargaining unit work, bargaining unit employees will perform that new or modified work,” ensuring workplace changes through technological advancement do not cut Teamsters out of the workplace altogether.54

Unions can help when workers have problems with AI management

For decades, employers have experimented with tools that offload management decisions such as scheduling and performance evaluation to automated systems. More recently, AI has enabled powerful automated management systems that sometimes conflict with workers’ ability to manage themselves or make decisions that workers do not understand or find unfair. Collective bargaining has enabled many workers to negotiate solutions that introduce a human element when needed and preserve the autonomy of individual workers on the job.

AI management

AI has increasingly been implemented to assign tasks, schedule workers, and evaluate performance, offering a way to monitor performance metrics and needs in real time and assign work accordingly.55 Past uses of statistical and computational algorithms, however, have shown that optimizing for narrow productivity targets can harm workers.56

AI management of workers can strip workers’ power to manage the workload themselves. Housekeepers at a hotel that implemented algorithmic management of room-cleaning assignments described how the app-based management solution prevented workers from efficiently managing their own flows, denied them access to necessary information that would have made their jobs easier in deciding how to complete necessary room-cleaning work, and placed a higher workload on workers, all of which can negatively affect worker well-being.57 Academic research so far has found that algorithmic management of platform workers—that is, employees whose work is managed by smart technologies—can increase pressure on workers while making their income, workload, and scheduling less predictable and harder to manage.58 Higher stress and lower worker well-being can themselves negatively affect productivity as well, undermining the original goal of increasing productivity through algorithmic management.59

While the machine learning technologies now becoming mainstream may seem novel, the use of inscrutable algorithms for management, including hiring and firing, is not. Forty-four states and Washington, D.C., had implemented “value-added models” by 201560 to evaluate a teacher’s “value” to student academic achievement based on statistical analysis of student performance on standardized test scores.61 The models are difficult for nonstatisticians to understand, meaning teachers often could not get clear explanations as to why they failed the algorithm’s test. As the American Statistical Association warned in 2014,62 it was necessary to have “high-level statistical expertise” to “develop the models and interpret their [the models’] results,” which mirrors concerns today that AI is too difficult to explain.63 Despite this, some school districts used these models to make decisions about firing teachers. The District of Columbia Public Schools’ (DCPS) teacher evaluation system, IMPACT, introduced in 2009, resulted in DCPS Chancellor Michelle Rhee firing hundreds of teachers64—and a 2021 DCPS review found IMPACT had “disparate outcomes between white teachers and teachers of color.”65 The Education Value-Added Assessment System, implemented by the Houston Independent School District starting in 2007, resulted in 221 teachers’ contracts not being renewed by the school district in 2011.66 Seven teachers and their union, the Houston Federation of Teachers, sued the school district, and in 2017, the district agreed to stop using the system’s scores to terminate teachers unless the teachers could test or challenge the score independently.67

Driver deactivation

Today, many drivers for ride-sharing apps such as Uber and Lyft have suffered “deactivations,” made via algorithm, whereby drivers cannot accept new rides using the apps they rely on as a source of income.68 A 2023 survey of 810 drivers for Uber and Lyft in California found that 66 percent of drivers were deactivated at some point by one or both of the companies, and 30 percent of surveyed drivers who were deactivated reported being given no explanation by either Uber or Lyft.69 The automated and opaque system powering deactivations results in workers being terminated without an easy-to-understand explanation.70

To combat this, ride-sharing drivers in Seattle organized Drivers Union, an association to help them advocate collectively for protections at the state, local, and industry levels.71 Because American labor unions must organize on a workplace-by-workplace basis, many workers, including app-based platform workers, cannot organize via a traditional NLRB-certified election. Instead, drivers pushed for better policies from their state and local lawmakers.72

In 2021, Seattle implemented the Transportation Network Company Driver Deactivation Rights Ordinance,73 which established a panel that allows drivers to appeal their termination to a government-run workers board, with representation from the state-regulated and Drivers Union-operated Driver Resource Center.74 The center was highly effective at helping drivers retain their jobs: based on a study of more than 1,400 cases, drivers with Driver Resource Center representation got their deactivations overturned 80 percent of the time.75

Drivers also advocated for a state law that came into effect across Washington state in 2023,76 guaranteeing minimum wages, paid sick time, and workers’ compensation, as well as expanding deactivation protections to cover the entire state.77 Drivers Union used this law as a foundation to reach a termination agreement with Uber in 2023,78 under which drivers who lose access for three or more days can file an appeal through the Driver Resource Center and, if no resolution is reached in 30 days, require the company to show just cause for the termination.79

Workers can act as partners in introducing AI to the workplace

While certain uses of AI technology can directly harm workers, the technology also promises productivity gains, and unions empower workers to share in those gains. Although many workers are anxious about the introduction of AI into the workplace, some are hopeful that the technology can automate many of the repetitive, taxing, or undesirable tasks of their jobs and ensure that work time is better spent.80 A 2023 survey by the Organization for Economic Cooperation and Development (OECD) found that a majority of manufacturing and financial services workers say AI has had a positive impact on their performance and mental health.81

Nevertheless, there is a risk that productivity benefits accrue only to employers, particularly if they find they need fewer workers to complete tasks and start laying off employees or “deskilling” existing jobs, lowering the skill level those jobs require in order to pay less.82 The 2023 OECD study also found that more than 40 percent of workers in manufacturing and financial services expect AI to lower their wages within a decade.83

Last year, several major unions and labor organizations won a seat at the table for workers in determining the future of the development and use of AI in their workplaces. In December 2023, the AFL-CIO and Microsoft created a platform for worker input into AI design and a dialogue over public policy to set guardrails for AI deployment.84 The CWA, which represents workers in industries with AI exposure such as game development and AI development itself, has developed bargaining principles for making sure AI works for its workers.85 Using these principles and building on their neutrality agreement with Microsoft, CWA workers at Microsoft video game subsidiary ZeniMax Media reached a tentative agreement86 that commits ZeniMax to provide notice about AI implementation while ensuring that its use of these tools boosts productivity and satisfaction without harming workers.87 Similarly, workers at the Financial Times editorial branch FT Specialist unionized with WGA East and ratified a contract that requires FT Specialist to “discuss in advance the introduction of any new technology,” and the union can bargain over the effects of these changes.88

How unions negotiate over new technology

Unions have long negotiated the introduction of new technologies into the workplace in a way that benefits workers. As one recent example, the COVID-19 pandemic led as many as 35 percent of workers whose jobs can be done remotely to work from home full time in 2023.89 Working from home became preferable for many workers who enjoyed saving time and money on their commute and being able to maintain a better work-life balance, though uncertainty existed over whether employers and workers would prefer working from home in the long term.

As a result, many unions that represent workers pushed for greater flexibility in their contracts. Workers at a range of companies won reductions in the number of days per week or month workers needed to report in person to the office, new ways for workers to request time to work from home, and increased availability of fully remote options. Nearly 700 technology workers at The New York Times went on strike in October 2023 over the publication’s return-to-work policy.90 At the federal Government Accountability Office, 2,500 unionized workers won an agreement for a flexible remote-work policy in September 2023,91 in contrast to many other federal workers being brought back into the office, as did a union representing 7,500 Environmental Protection Agency employees in 2021.92 Unionized university staff at schools including Harvard93 and within the City University of New York system94 solidified and extended remote-work policies as well. The CWA reached agreements in 2022 with AT&T95 and Verizon96 that established terms for employees working from home, including stipends for remote-work costs, pay protection during system outages, and limits on the use of webcams for surveillance.

A similar process is taking place where AI technologies are being introduced, although uncertainty over their exact uses remains. The CWA and ZeniMax reached a tentative agreement over using AI.97 ZeniMax agreed to limit itself to “uses of AI that augment human ingenuity and capacities, to ensure that these tools enhance worker productivity, growth, and satisfaction without causing workers harm.” While a final contract has not yet been reached, the language was developed as part of a “Proactive Bargaining” strategy by the CWA to address the effects of AI on its members rather than letting employers take the first step and put unions on the back foot.98 Dylan Burton, a QA tester at ZeniMax, hailed the agreement: “This agreement empowers us to shape the ways we may choose to use AI in our work and also gives us the means to address those impacts before their potential implementation.”99


Technological change affecting working conditions is nothing new, and as employers bring new AI technologies into the workplace, unions give workers a powerful tool for ensuring AI improves their jobs rather than worsening conditions or eliminating positions. Many researchers100 and labor unions are highlighting the need for unions to play a key role101 in negotiating the development and introduction of new technology into the workplace. Unfortunately, federal labor law makes joining a union far harder than it needs to be, denying many workers a voice on the job. Policymakers should therefore take a comprehensive approach to AI that includes empowering workers to speak up on the job while ensuring they share in the benefits of AI technology in the workplace. Strengthening the right to organize through policies such as the Protecting the Right to Organize Act will enable more workers to use collective bargaining to advocate for themselves as new technologies enter the workplace.


  1. U.S. Senate Committee on Health, Education, Labor, and Pensions, “AI and the Future of Work: Moving Forward Together,” October 31, 2023, available at
  2. Although a range of definitions for AI exist, Congress offered a working definition under Title 15, Section 9401 of the U.S. Code: “The term ‘artificial intelligence’ means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments. Artificial intelligence systems use machine and human-based inputs to- (A) perceive real and virtual environments; (B) abstract such perceptions into models through analysis in an automated manner; and (C) use model inference to formulate options for information or action.” See U.S. Code, “15 USC 9401: Definitions,” available at (last accessed February 2024).
  3. Economic Policy Institute, “The Productivity–Pay Gap,” available at (last accessed February 2024).
  4. Ross Andersen, “Does Sam Altman Know What He’s Creating?”, The Atlantic, July 24, 2023, available at
  5. National Labor Relations Board Office of the General Counsel, “Memorandum GC 23-02: Electronic Monitoring and Algorithmic Management of Employees Interfering with the Exercise of Section 7 Rights” (Washington: 2022), available at
  6. Equal Employment Opportunity Commission, “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees,” May 12, 2022, available at
  7. Equal Employment Opportunity Commission, “Artificial Intelligence and Algorithmic Fairness Initiative,” available at (last accessed February 2024).
  8. Rose Khattar, “Will AI Benefit or Harm Workers?”, Center for American Progress, August 24, 2023, available at
  9. Patrick Gaspard, “Patrick Gaspard’s Statement for the Senate AI Insight Forum on Workforce” (Washington: Center for American Progress, 2023), available at
  10. Richard L. Trumka Protecting the Right to Organize Act of 2023, H.R. 20, 118th Cong., 1st sess. (February 28, 2023), available at
  11. Public Service Freedom to Negotiate Act of 2021, H.R. 5727, 117th Cong., 1st sess. (October 26, 2021), available at
  12. Stop Spying Bosses Act, S. 262, 118th Cong., 1st sess. (February 2, 2023), available at
  13. No Robot Bosses Act, S. 2419, 118th Cong., 1st sess. (July 20, 2023), available at
  14. Gennadiy Shevtsov, “AI in Hospitality: Robotic Baristas and AI Chefs,” Medium, January 14, 2024, available at
  15. Screen Actors Guild-American Federation of Television and Radio Artists, “Artificial Intelligence Resources,” available at (last accessed February 2024).
  16. Drew Richardson, “Hollywood’s AI issues are far from settled after writers’ labor deal with studios,” CNBC, October 16, 2023, available at
  17. Noam Scheiber and John Koblin, “Will a Chatbot Write the Next ‘Succession’?”, The New York Times, April 29, 2023, available at
  18. Screen Actors Guild-American Federation of Television and Radio Artists, “Regulating Artificial Intelligence” (Los Angeles: 2023), available at
  19. Screen Actors Guild-American Federation of Television and Radio Artists, “TV/Theatrical Contracts 2023: Summary of Tentative Agreement” (Los Angeles: 2023), available at
  20. Writers Guild of America, “Summary of the 2023 WGA MBA,” available at (last accessed February 2024).
  21. Writers Guild of America, “Residuals Survival Guide,” available at (last accessed March 2024).
  22. Writers Guild of America, “Screen Compensation Guide,” available at (last accessed February 2024).
  23. Ben Schwartz, “Raging Bullshit: Credits and the Hollywood Economy,” The Nation, June 8, 2023, available at
  24. Dan Milmo, “‘Impossible’ to create AI tools like ChatGPT without copyrighted material, OpenAI says,” The Guardian, January 8, 2024, available at
  25. Writers Guild of America, “Memorandum of Agreement for the 2023 WGA Theatrical and Television Basic Agreement” (Los Angeles: 2023), available at
  26. Screen Actors Guild-American Federation of Television and Radio Artists, “TV/Theatrical Contracts 2023.”
  27. Writers Guild of America, “Memorandum of Agreement for the 2023 WGA Theatrical and Television Basic Agreement.”
  28. Writers Guild of America, “Screen Credits Manual” (Los Angeles: 2018), available at
  29. Teamsters for a Democratic Union, “UPS Installs On-Truck Surveillance Cameras,” available at (last accessed February 2024).
  30. Kathryn Zickuhr, “Workplace surveillance is becoming the new normal for U.S. workers” (Washington: Washington Center for Equitable Growth, 2021), available at
  31. Zoë Corbyn, “‘Bossware is coming for almost every worker’: the software you might not realize is watching you,” The Guardian, April 27, 2022, available at
  32. Alana Semuels, “Some Companies Will Do Just About Anything to Stop Workers from Unionizing,” Time, October 13, 2022, available at
  33. National Labor Relations Board, “Colgate-Palmolive Co – Decision Summary,” May 2, 1997, available at
  34. Communications Workers of America, “Protections against Abusive Monitoring” (Washington: 2014), available at
  35. Annette Bernhardt, Lisa Kresge, and Kung Feng, “Response to the White House Office of Science and Technology Policy Request for Information on Automated Worker Surveillance and Management” (Berkeley, CA: University of California, Berkeley Labor Center, 2023), available at; National Labor Relations Board Office of the General Counsel, “Memorandum GC 23-02.”
  36. Steven Greenhouse, “‘Constantly monitored’: the pushback against AI surveillance at work,” The Guardian, January 7, 2024, available at
  37. Katrina Pham, “‘It Kind of Feels Like Prison’: Injured, Burned Out and Under Surveillance at Amazon,” In These Times, October 26, 2023, available at; Jay Greene, “Amazon’s employee surveillance fuels unionization efforts: ‘It’s not prison, it’s work’,” The Washington Post, December 2, 2021, available at
  38. Beth Gutelius and Sanjay Pinto, “Pain Points: Data on Work Intensity, Monitoring, and Health at Amazon Warehouses” (Chicago: University of Illinois Chicago Center for Urban Economic Development, 2024), available at
  39. Tamar L. Lee and others, “Amazon’s Policing Power: A Snapshot from Bessemer” (Piscataway, NJ: Rutgers School of Management and Labor Relations, 2022), available at
  40. Annie Palmer, “Amazon’s focus on speed, surveillance drives higher warehouse worker injuries, study finds,” CNBC, October 25, 2023, available at; Amazon, “Read Amazon’s response to Oxfam’s workplace safety allegations,” April 10, 2024, available at
  41. Lauren Kaori Gurley and Janus Rose, “Amazon Employee Warns Internal Groups They’re Being Monitored For Labor Organizing,” Vice, September 24, 2020, available at
  42. Lauren Kaori Gurley and Joseph Cox, “Inside Amazon’s Secret Program to Spy On Workers’ Private Facebook Groups,” Vice, September 1, 2020, available at
  43. National Labor Relations Board, Region 10, “Case 10-RC-269250: Decision and Direction of Second Election” (Atlanta: 2021), available at
  44. Greene, “Amazon’s employee surveillance fuels unionization efforts.”
  45. Steven Greenhouse, “‘Everyone in the Community Is Cheering Us On’,” The American Prospect, March 2, 2021, available at
  46. Greenhouse, “‘Constantly monitored’.”
  47. Ameenah Salaam, “Written Comments for AI Insight Forum on Workforce” (Washington: Communications Workers of America, 2023), available at
  48. Christina Farr, “Some hospitals are tracking Covid-19 by adding sensors to employees’ badges,” CNBC, August 2, 2020, available at
  49. Belle Lin, “Uber Patents Reveal Experiments with Predictive Algorithms to Identify Risky Drivers,” The Intercept, October 30, 2021, available at
  50. National Labor Relations Board Office of the General Counsel, “Memorandum GC 23-02.”
  51. Elliot Lewis and Matt Leichenger, “UPS Says No to Air Conditioning, But Here’s a Surveillance Camera,” Labor Notes, August 18, 2022, available at
  52. Matt Leichenger, “UPS Is Installing Surveillance Cameras in Our Trucks, but Not Air Conditioning,” Jacobin, August 5, 2022, available at
  53. International Brotherhood of Teamsters, “National Master United Parcel Service Agreement,” available at (last accessed February 2024).
  54. Ibid.
  55. Emilia F. Vignola and others, “Workers’ Health under Algorithmic Management: Emerging Findings and Urgent Research Questions,” International Journal of Environmental Research and Public Health 20 (2) (2023): 1239, available at
  56. Perry Stein, “D.C. teacher evaluation system has academic benefits, but is racially biased, new study finds,” The Washington Post, August 13, 2021, available at
  57. Franchesca Spektor and others, “Designing for Wellbeing: Worker-Generated Ideas on Adapting Algorithmic Management in the Hospitality Industry” (Pittsburgh: Designing Interactive Systems Conference, 2023), available at
  58. Vignola and others, “Workers’ Health under Algorithmic Management.”
  59. Andrew Sharpe and Shahrzad Mobasher Fard, “The current state of research on the two-way linkages between productivity and well-being” (Geneva: International Labor Organization, 2022), available at
  60. Margarita Pivovarova, Audrey Amrein-Beardsley, and Tray Geiger, “Value-added models: What the experts say,” Phi Delta Kappan, October 1, 2016, available at
  61. Jesse Rothstein, “Can value-added models identify teachers’ impacts?” (Berkeley, CA: University of California, Berkeley Institute for Research on Labor and Employment, 2016), available at
  62. American Statistical Association, “ASA Statement on Using Value-Added Models for Educational Assessment” (Alexandria, VA: 2014), available at
  63. White House Office of Science and Technology Policy, “Notice and Explanation: You Should Know That an Automated System Is Being Used and Understand How and Why It Contributes to Outcomes That Impact You,” available at (last accessed March 2024).
  64. Veronica DeVore and Imani Cheers, “D.C. Schools Chief Rhee Fires 241 Teachers Using New Evaluation System,” PBS, July 23, 2010, available at
  65. Stein, “D.C. teacher evaluation system has academic benefits, but is racially biased, new study finds”; District of Columbia Public Schools, “IMPACT Review Findings,” available at (last accessed March 2024).
  66. Audrey Amrein-Beardsley, “The Education Value-Added Assessment System (EVAAS) on Trial: A Precedent-Setting Lawsuit with Implications for Policy and Practice” (Tempe, AZ: Arizona State University, 2019), available at
  67. American Federation of Teachers, “Federal Suit Settlement: End of Value-Added Measures for Teacher Termination in Houston,” Press release, October 10, 2017, available at
  68. Asian Americans Advancing Justice and Rideshare Drivers United, “Fired by an App: The Toll of Secret Algorithms and Unchecked Discrimination on California Rideshare Drivers” (Washington: 2023), available at
  69. Ibid.
  70. Ruby de Luna, “Seattle introduces legislation to protect gig workers from abrupt termination,” KUOW, May 23, 2023, available at
  71. Drivers Union, “Deactivation Support,” available at (last accessed February 2024).
  72. The Associated Press, “California court says Uber, Lyft can treat state drivers as independent contractors,” NPR, March 14, 2023, available at
  73. Seattle Office of Labor Standards, “Deactivation Appeals Panel,” available at (last accessed February 2024).
  74. Aurelia Glass and David Madland, “Momentum for Worker Standards Boards Continues To Grow,” Center for American Progress, September 7, 2023, available at
  75. Lindsey Schwartz, Nic Weber, and Eva Maxfield Brown, “Deactivation with and without Representation: The Role of Dispute Arbitration for Seattle Rideshare Drivers” (Seattle: University of Washington Puget Sound Clinic for Public Interest Technology, 2023), available at
  76. Drivers Union, “Driver FAQ: What’s in the New Expand Fairness Law?”, available at (last accessed February 2024).
  77. Transportation Network Companies, Washington 67th Legislature (2022), available at
  78. KING 5 Staff, “New labor agreement protects Washington Uber drivers from unwarranted termination,” KING 5, September 19, 2023, available at
  79. Washington State Department of Labor and Industries, “Uber drivers in Washington state gain appeal rights under first-in-nation agreement,” Press release, September 25, 2023, available at
  80. Josie Cox, “AI anxiety: The workers who fear losing their jobs to artificial intelligence,” BBC, July 13, 2023, available at
  81. Marguerita Lane, Morgan Williams, and Stijn Broecke, “The impact of AI on the workplace: Main findings from the OECD AI surveys of employers and workers” (Paris: Organization for Economic Cooperation and Development, 2023), available at
  82. David Kunst, “Deskilling among manufacturing production workers” (Paris: Centre for Economic Policy Research, 2019), available at
  83. Lane, Williams, and Broecke, “The impact of AI on the workplace.”
  84. Microsoft, “AFL-CIO and Microsoft announce new tech-labor partnership on AI and the future of the workforce,” Press release, December 11, 2023, available at
  85. Communications Workers of America, “Communications Workers of America Announces Union Principles for Artificial Intelligence in the Workplace,” Press release, December 6, 2023, available at
  86. Communications Workers of America, “ZeniMax Workers Reach Agreement with Microsoft on Contractors and AI,” Press release, December 14, 2023, available at
  87. Communications Workers of America, “ZeniMax Workers United-CWA Collective Bargaining Yields First-of-Its-Kind Tentative Agreement with Microsoft Over Use of AI in the Workplace,” Press release, December 11, 2023, available at
  88. Writers Guild of America East, “WGA East Members at FT Specialist Ratify First Union Contract,” Press release, August 23, 2023, available at
  89. Kim Parker, “About a third of U.S. workers who can work from home now do so all the time,” Pew Research Center, March 30, 2023, available at
  90. Reuters, “New York Times tech workers to strike over return-to-office rules,” October 30, 2023, available at
  91. Molly Weisner, “GAO union employees keep flexible work options in new contract,” Federal Times, September 20, 2023, available at
  92. Rachel Frazin, “EPA union announces agreements with agency expanding work from home,” The Hill, December 1, 2021, available at
  93. Harvard Union of Clerical and Technical Workers, “Negotiating Remote Work Arrangements,” available at (last accessed March 2024).
  94. Ari Paul, “Union wins remote work extensions,” PSC-CUNY, Press release, July 2023, available at
  95. Communications Workers of America and AT&T Mobility Services LLC, “2022 Regional Labor Agreement,” available at (last accessed February 2024).
  96. Communications Workers of America and Verizon New York Inc. and others, “2022 Common Issues Memorandum of Understanding” (New York: 2022), available at
  97. Communications Workers of America, “ZeniMax Workers United-CWA Collective Bargaining Yields First-of-Its-Kind Tentative Agreement with Microsoft Over Use of AI in the Workplace.”
  98. Communications Workers of America, “Communications Workers of America Announces Union Principles for Artificial Intelligence in the Workplace.”
  99. Communications Workers of America, “ZeniMax Workers Reach Agreement with Microsoft on Contractors and AI.”
  100. Lisa Kresge, “Negotiating Workers’ Rights at the Frontier of Digital Workplace Technologies in 2023,” University of California, Berkeley Labor Center, December 18, 2023, available at
  101. Center for Labor and a Just Economy, “Worker Power and Voice in the AI Response” (Cambridge, MA: 2024), available at

The positions of American Progress, and our policy experts, are independent, and the findings and conclusions presented are those of American Progress alone. A full list of supporters is available here. American Progress would like to acknowledge the many generous supporters who make our work possible.


Aurelia Glass

Policy Analyst, Inclusive Economy

