Trump Administration Eyes AI Preemption of State Laws

On March 20, 2026, the Trump administration issued a National Policy Framework for Artificial Intelligence outlining the White House's non-binding "wish list" for federal AI regulation.

Objective Facts

On December 11, 2025, the White House issued an executive order, "Ensuring a National Policy Framework for Artificial Intelligence" (the Executive Order), intended to weaken state-level regulation of artificial intelligence through a combination of targeted litigation led by the Department of Justice, administrative reinterpretation of existing laws, conditional federal funding and preemption of existing state laws through a federal policy framework. The Commerce Department's evaluation, published on the March deadline, flagged laws in Colorado, California, and New York for particular scrutiny. The evaluation feeds into a DOJ task force, which is expected to begin filing federal legal challenges by summer 2026; cases are projected to take two to three years to resolve. The National Policy Framework for AI, released on March 20, contains legislative recommendations organized around seven pillars: child protection, AI infrastructure, intellectual property, censorship and free speech, innovation, workforce preparation, and preemption of state AI laws. The volume of state AI legislation reflects a bipartisan consensus at the state level that AI regulation cannot wait for a Congress that has repeatedly failed to act. Regional media perspectives track closely with U.S. framing on this domestic federalism dispute, though some European outlets emphasize concerns that U.S. deregulation could undercut EU standards.

Left-Leaning Perspective

The Trump administration renewed its call for preemption of state-level AI laws as part of its National AI Policy Framework, drawing Democratic opposition. Senator Brian Schatz (D-Hawaii) led a group of six senators in introducing the GUARDRAILS Act to repeal President Trump's executive order seeking to prevent states from regulating artificial intelligence. "Embracing the amazing possibilities of AI can't come at the cost of leaving Americans vulnerable to its profound risks, which is exactly what President Trump's Executive Order tries to do," said Schatz. "Discouraging states from enacting common-sense regulation that protects people from potential AI harms is dangerous."

Representatives Yvette Clarke (D-N.Y.) and Don Beyer (D-Va.), along with Schatz, raised concerns regarding federal preemption, accountability and oversight. On March 20, 2026, Rep. Beyer, alongside Reps. Doris Matsui (D-Calif.), Ted Lieu (D-Calif.), Sara Jacobs (D-Calif.) and April McClain Delaney (D-Md.), introduced the House version of the Guaranteeing and Upholding Americans' Right to Decide Responsible AI Laws and Standards (GUARDRAILS) Act, which would repeal the Trump Administration's EO establishing a national AI policy framework and effectively block efforts to impose a moratorium on state-level AI regulation.

Caucus Co-Chair Josh Gottheimer (D-N.J.) criticized the AI Framework as lacking meaningful accountability, arguing that federal preemption is appropriate only if it replaces state regimes with a comprehensive and protective national standard, a bar the framework fails to meet. Caucus Co-Chair Valerie Foushee (D-N.C.) echoed those concerns, warning that the AI Framework "lacks meaningful guardrails" and overlooks AI's real-world impacts on jobs, communities and resources, while taking "the wrong approach" by limiting state and local authority.
The Center for American Progress characterized the AI National Framework EO as primarily directing the attorney general to head an "AI Litigation Task Force" to challenge state AI laws deemed "onerous," and directing federal agencies to withhold certain Broadband Equity, Access, and Deployment (BEAD) funding from states with such laws. In the group's view, this represents a potentially sweeping and legally unjustifiable incursion into the rights of states and serves as a form of threat and intimidation.

Adam Billen, vice president of Encode, a nonprofit focused on child safety and threats posed by AI, stated, "Even if everything is overturned in the executive order, the chilling effect on states' willingness to protect their residents is going to be huge because they're all now going to fear getting attacked directly by the Trump administration," adding, "That is the point of all of this - it is to create massive legal uncertainty and gray areas and give the companies the chance to do whatever they want."

Public Knowledge's position is that federal preemption of state AI laws may be warranted on a 1:1 basis, with each displaced state protection replaced by an equivalent federal one. However, the group argues, the regulation-impedes-innovation mindset underpinning the Trump and congressional Republican preemption effort is entirely backwards: giving the public a hand in shaping the technology and showing the government has a hand on the wheel can build the trust needed to accelerate adoption, while broad deregulatory preemption could further polarize an already-skeptical public.

Right-Leaning Perspective

Several Republican lawmakers and members of House leadership have praised the EO as a necessary interim step, including Senate Commerce Committee Chair Ted Cruz (R-Texas), who previously supported a 10-year moratorium on state and local AI laws. "I look forward to working with the White House and members of the Commerce Committee to advance meaningful AI legislation that safeguards free speech, establishes regulatory sandboxes, protects children and provides a national standard for AI in the United States," Cruz stated.

Senator Marsha Blackburn (R-Tenn.), who released her own sweeping draft AI legislation, emphasized focusing on bills that can pass both chambers: "Over the last few months, I have worked diligently with the White House, conservative leaders, child safety advocates, members of the creative community, and AI innovators to develop legislation that can garner bipartisan support and accomplish the President's goals." Blackburn wants her vision for American artificial intelligence, with strict child safeguards, to prevail over a tech-accelerationist push that overrides state regulations while including only minimal safeguards. Her bill differs in key aspects from the White House National Framework on AI, but she believes it will fulfill President Donald Trump's request for Congress to pass the first national standard for AI: "I do think this is the vehicle that is set to travel forward at committee."

The framework's industry-friendly priorities also won praise. NetChoice director Patrick Hedger said it shows the White House knows "what is at stake and what it will take to win the future," and Daniel Castro, director of the Center for Data Innovation, said the framework avoids the "worst instincts in today's AI debate," including "alarmism" about unemployment and worries that AI training infringes on copyright.
The Trump Administration's legal theory stems from the AI Action Plan, which prioritized preventing the imposition of ideological constraints on AI development. The Administration posits that if an AI model is trained on data reflecting societal patterns, forcing developers to alter the model's outputs to mitigate bias compels them to produce results that are less faithful to the underlying data. Under this interpretation, such mitigation renders the model less "truthful" and, therefore, deceptive.

Deep Dive

The Trump administration's AI preemption strategy represents a deliberate shift from traditional federalism toward centralized federal control of AI regulation, justified by economic competitiveness. The December 11, 2025 executive order seeks to weaken state-level regulations through a combination of targeted litigation led by the Department of Justice, administrative reinterpretation of existing laws, conditional federal funding and preemption of existing state laws through a federal policy framework. This follows a clear pattern in the Trump Administration's AI policy of seeking to limit state-level regulation and consolidate authority at the federal level: the Administration had previously pursued legislative preemption through the proposed One Big Beautiful Bill Act, which included a 10-year moratorium on new state AI regulations. Taken together, these initiatives reflect a strategy to limit the scope of state AI rules and promote a uniform AI governance framework set by the federal government.

Each side correctly identifies what the other omits. Conservatives omit that the volume of state legislation reflects a bipartisan consensus at the state level that AI regulation cannot wait for a Congress that has repeatedly failed to act, and that the Senate voted 99 to 1 to strip the AI preemption provision from that bill, with only Senator Thom Tillis voting to keep it, before the bill was signed into law without any restrictions on state AI legislation, indicating deep congressional skepticism of preemption. Progressives omit that the exemptions from preemption reflect intense pressure from Republican-controlled states such as Florida that have been working to pass laws to protect consumers and residents from the negative impacts of AI, showing internal Republican divisions on deregulation. They also omit that DOJ challenges to state laws appear unlikely to succeed, since courts are unlikely to grant injunctions unless the Administration makes a more defined legal showing than is apparent from the face of the EO; even so, companies should monitor DOJ-initiated challenges for their potential influence on how states approach near-term enforcement.

What matters next: The DOJ task force is expected to begin filing federal legal challenges by summer 2026, and cases are projected to take two to three years to resolve. Many agencies appear not to have taken the various actions ordered by the EO despite the passage of their respective deadlines. For now, employers should continue to monitor and comply with existing state and local laws in this space, but be prepared for uncertainty and change in 2026. Congressional willingness to pass preemption legislation remains uncertain given the 99-1 Senate vote and razor-thin Republican House margins in an election year.

Apr 29, 2026

Left says: Senator Schatz argues that "Embracing the amazing possibilities of AI can't come at the cost of leaving Americans vulnerable to its profound risks" and that "Discouraging states from enacting common-sense regulation that protects people from potential AI harms is dangerous."
Right says: Michael Kratsios, science and technology adviser to Trump, told one publication: "We need one national AI framework, not a 50-state patchwork."
✓ Common Ground
The addition of child safety exemptions reflects intense pressure from Republican-controlled states such as Florida that have been working to pass laws to protect consumers and residents from the negative impacts of AI, indicating that resistance to the EO could come from a bipartisan coalition of states.
Some voices on the left and right share concern that many agencies appear to have not yet taken the various actions ordered by the EO despite the passage of their respective deadlines, suggesting agreement on the practical challenges of executive implementation.
Prominent Democrats and Republicans both criticized the EO as too closely aligned with industry and contrary to federalist principles, suggesting it will face stiff opposition.
Attempts to include a preemption provision or a quasi-moratorium in the NDAA sputtered in the face of Democratic and some Republican opposition. Moreover, earlier attempts at legislation to impede state AI regulations have drawn opposition from many Republicans in Congress who support at least some state-based AI regulation.

◈ Tone Comparison

Democratic critics use protective language like "vulnerable to its profound risks" and emphasize "common-sense regulation," positioning state laws as necessary safeguards. Republican supporters invoke competitive framing, with industry voices calling for a "light-touch regulatory environment" to enable innovation. Democrats invoke constitutional limits; Republicans invoke competitive necessity.