Washington, D.C. — Republican leaders in Congress are again considering legislation that would block states from regulating artificial intelligence (AI), a move that would undermine valuable consumer protections and safety safeguards without an adequate federal framework in place.
A new analysis from the Center for American Progress highlights the dangers of this latest effort to preempt state AI laws, which could be inserted into the annual National Defense Authorization Act. This follows the failed effort this summer to include a moratorium on state AI laws in the Big Beautiful Bill, which the Senate removed in a 99-1 vote.
This action is being driven by the largest tech firms that want to reduce public scrutiny and shield their practices from accountability, the analysis says.
“A moratorium on state AI laws would freeze progress in the states without offering any federal protections in return,” said Nicole Alvarez, a senior policy analyst at CAP and author of the analysis. “States are moving forward with real policies that respond to real problems, and blocking those efforts would not create clarity. Blocking states from regulating AI would only protect industry interests and silence the only voices trying to keep this technology accountable.”
Rather than wait for federal action, states, which have long served as the laboratories of American democracy, have advanced laws that promote transparency, prevent abuse, and restrict dangerous uses of AI in sectors such as employment, housing, and health care. These state efforts are critical because they allow for policy experimentation, developing best practices that can later inform federal legislation.
Poorly crafted federal preemption and blanket moratoriums on state AI laws in the absence of federal standards are dangerous approaches. The analysis urges Congress to reject any push to centralize control over AI policy at the expense of state authority.
Read the analysis: “Moratoriums and Federal Preemption of State Artificial Intelligence Laws Pose Serious Risks” by Nicole Alvarez
For more information, or to speak with an expert, please contact Sam Hananel at [email protected].