To be clear, there is no existing national policy framework for AI, and creating one would be the job of Congress, not the president. Similarly, only Congress can pass a law that legally preempts state laws. More specifically, the president does not have the legal authority to simply preempt or strike down disfavored state laws via EO.
Then why did President Trump issue this AI National Framework EO? Because Congress has not made meaningful progress on any national framework for AI. Meanwhile, states are exercising their rights and responsibilities to legislate on AI, and though the actual number of AI laws on the books is small today—compared to the talking point of thousands of proposed state AI bills—the number of state AI laws is growing.
AI companies and their trade groups have generally asserted that significant regulatory guardrails at the state or federal level would constrain their ability to innovate. Due to the lack of federal laws addressing AI, a few states have stepped in and begun to pass laws erecting basic guardrails for the powerful new technology. That has led the industry to claim that a proliferation of state AI laws will create a patchwork of more than 50 different rules that will make it difficult or impossible for AI companies to operate or innovate. Thus, AI industry trade groups and top AI companies have identified the preemption of state AI laws as a top priority. Republicans in Congress have failed twice this year in their attempts to include a moratorium or federal preemption of state AI laws in critical bills, most recently failing to include it in the National Defense Authorization Act (NDAA) and previously losing a 99-1 vote that removed an AI moratorium from the Big Beautiful Bill (BBB) before it passed earlier this year. The National AI Framework EO represents a last attempt to provide some kind of federal check against state AI laws. However, as outlined below, it is the most legally dubious effort yet. The EO itself admits, “Until such a national standard exists, however, it is imperative that my Administration takes action to check the most onerous and excessive laws emerging from the States that threaten to stymie innovation.”
The AI National Framework EO primarily directs the attorney general to head an “AI Litigation Task Force” to challenge state AI laws deemed “onerous” and directs federal agencies to withhold certain Broadband Equity, Access, and Deployment (BEAD) funding and any discretionary grants from states with “onerous” AI laws. This EO represents a potentially sweeping and legally unjustifiable incursion into the rights of states; will likely serve primarily as a form of threat and intimidation against states; and does not uphold the administration’s promises to exempt AI child safety or data center laws from its attempted preemption.
Unprecedented incursion on the states
Much as he may want to, the president cannot simply assert the negation of state (or federal) laws through EO. The Constitution clearly articulates that Congress is empowered to make the nation’s laws and that the executive branch, led by the president, must take care to faithfully execute those laws. The December 11 AI National Framework EO is similar, though not identical, to a draft EO published by news outlets in November, so detailed legal analyses of that leaked draft remain relevant and accurate with respect to the final December EO.
It is important to note that the AI National Framework EO does not technically assert a blanket negation of state AI laws but merely directs the creation of an “AI Litigation Task Force” led by the attorney general “whose sole responsibility shall be to challenge State AI laws inconsistent with the policy set forth” in the EO. The determination of what is “inconsistent” or “onerous” falls to the secretary of commerce and others in the federal government.
It bears reiterating that the AI National Framework EO is an unprecedented and likely unconstitutional attempt to insert the power of the federal executive branch into the powers of state and local governments. This unconstitutional EO is also an inversion of the position historically held by conservatives on the importance and primacy of states’ rights and federalism. Conservative governors such as Ron DeSantis of Florida responded to reports of the EO with alarm.
States should choose to fight this EO’s sweeping encroachment on state powers purely on principle. Every governor and state attorney general, regardless of party, has a vested interest in preserving their own powers and should seek to fight this EO in court immediately, if only to preserve the ability to push back against a future president who might invoke the same authority.
Enforcement by threat
The main power of the AI EO is the threat of federal action, whether lawsuits or funding restrictions, to coerce and deter states from passing any new AI laws or regulations or from enforcing their existing ones.
The AI EO relies on two main mechanisms to enforce its attacks against state AI laws: an “AI Litigation Task Force” and “Restrictions on State Funding,” particularly certain BEAD funds. More troublingly, it directs restrictions on funding from all executive departments and agencies’ discretionary grant programs, which could potentially total billions of dollars.
The ordered attempts by the federal government’s “AI Litigation Task Force” to overturn state AI laws appear to rest primarily on a novel dormant commerce clause argument advanced by the venture capital firm Andreessen Horowitz in September. That argument is echoed in the EO’s language directing the government to “challenge State AI laws inconsistent with the policy set forth . . . in this order, including on grounds that such laws unconstitutionally regulate interstate commerce.”
As experts such as LawAI have noted in analyzing the November draft EO, which used almost identical language to the December AI National Framework EO, “analysis indicates that this commerce clause argument, at least with respect to the state laws specifically referred to in the DEO [Draft Executive Order], is legally meritless and unlikely to succeed in court.” Similarly, any decision to withhold federal discretionary grants from a state because of its AI laws would likely prompt immediate litigation from states as well.
The EO also directs future restrictions on remaining BEAD nondeployment funds for states with “onerous AI laws,” though not on the broader tranche of statutorily required and already obligated BEAD deployment funds. The federal government has only recently begun approving states’ final proposals to release their full BEAD funding, but 24 states and territories have yet to have their final proposals approved. The secretary of commerce, who ultimately oversees the BEAD program, is also tasked with identifying states with “onerous AI laws,” a determination that will be used to decide which states have federal funds withheld. As CAP has previously noted during the debates over the proposed state AI moratorium in the BBB, the administration holds the sole ability to approve a state’s BEAD final proposal. States that have not yet received final approval should therefore be vigilant in ensuring that their BEAD final proposal approval is not affected by this EO once the secretary of commerce identifies states with “onerous AI laws.”
If the federal government took either of the primary actions against states outlined in the EO, challenging their laws in court or withholding BEAD or discretionary grant funds, it would likely result in immediate litigation from the affected states. But resolving that litigation would take months or even years, forcing states to weigh the time and financial costs of passing new legislation or defending existing AI laws. That appears to be at least part, if not the main, intent of this EO.
Child safety and data centers are not exempted
David Sacks, the special advisor for AI and crypto, claimed in a tweet ahead of the AI EO’s release that child safety and local rules on data centers would not be preempted, referring to what he called the “4 C’s,” with the first two C’s being “child safety” and “communities” (referring to data centers):
- Child safety – Preemption would not apply to generally applicable state laws. So state laws requiring online platforms to protect children from online predators or sexually explicit material (CSAM) would remain in effect.
- Communities – AI preemption would not apply to local infrastructure. That’s a separate issue. In short, preemption would not force communities to host data centers they don’t want.
Yet the actual EO, in “Sec. 8. Legislation,” states only that the administration is to “prepare a legislative recommendation establishing a uniform Federal policy framework for AI that preempts State AI laws that conflict with the policy set forth in this order,” and that this recommendation “shall not propose preempting otherwise lawful State AI laws relating to: (i) child safety protections; (ii) AI computer and data center infrastructure, other than generally applicable permitting reforms.”
To be clear, the AI EO does not in any way, shape, or form direct the federal government not to sue, or not to withhold federal discretionary grants from, states with laws related to AI child safety or data centers, which would have been an easy inclusion in the EO. Instead, it merely says that the administration’s recommendation to Congress for legislation “establishing a uniform Federal policy framework for AI that preempts State AI laws” should not preempt state laws relating to child safety and data centers; nothing in that provision binds the federal government not to sue or withhold federal funds from states with AI child safety or data center laws. The failure to exclude state child safety and data center AI laws from the EO’s ordered federal action is even starker given that one of the other “4 C’s,” censorship, is specifically highlighted as a consideration for state AI laws, with the requirement to “identify laws that require AI models to alter their truthful outputs.”
It is hard to square Sacks’ tweet, “In summary, we’ve heard the concerns about the 4 C’s [Child Safety, Communities, Creators and Censorship], and the 4 C’s are protected,” with the decision not to specifically exclude state AI child safety and data center laws from federal action while specifically including the issue of censorship in the EO. States should not trust tweets from this administration promising otherwise when the EO itself goes out of its way not to exempt those laws.
Response from states
This is another in a long line of illegal EOs from the Trump administration. States should not cave to the federal government or shy away from carrying out their responsibilities. Indeed, on the day this EO was issued, a state legislature stood up to intimidation and pressure from the Trump administration. More importantly, it is critical for elected officials to recognize how unpopular attempts to shield AI companies from any laws are and to work hard to reject those attempts.
In response to this EO, states should come together and prepare joint legal action to challenge the EO’s legality generally and seek a preliminary injunction. This is an attack on the rights of states that should not—and cannot—go unchallenged. This is a potentially rare instance where state attorneys general of both parties can unite in opposition to an overreaching federal executive policy.
Separately, as January approaches and new state legislative sessions convene across the country, states should pass AI laws that get to the heart of the real concerns people have about AI, with common sense solutions prohibiting the worst abuses of the technology, such as being fired autonomously by an algorithm.
AI companies claim that states passing AI laws is a disaster that must be preemptively preempted by Congress. But it is clear now that states are the only ones acting, and state action may be the only thing that causes Congress to take real action to develop a national framework.
A real federal legislative framework for AI is needed, one that addresses key safety and other issues. This framework may or may not eventually include a form of federal preemption. A federal preemption of state laws without a new federal framework, whether attempted legally by Congress or unconstitutionally by the president, would be a dangerous mistake and would imperil the safe and broad development and adoption of this technology to advance our national interest. This EO exists because Congress has, thus far, failed to do its duty, and the president is attempting to act illegally in the face of that inaction while he and his administration reward some of his strongest corporate supporters. States should not shy away from doing their duty to address the concerns around AI and should not let the executive branch tell them what laws they cannot pass solely because Congress failed to act.