DeSantis Defies Trump on AI Rules with Florida Citizen AI Bill

Florida’s governor has openly broken ranks with President Trump on how to handle artificial intelligence — framing a suite of state-level protections as necessary guardrails against deepfakes, labor disruption and the sprawling infrastructure that powers modern AI, even as the White House issues an executive order aimed at creating a single national AI rulebook.

Poster-style illustration of a Citizen AI Bill balancing state protections and federal policy.

Background

The year’s policy headlines around AI have landed on three overlapping fronts: federal efforts to standardize rules and limit state-by-state divergence, states moving to protect children and consumers from specific harms, and rising local resistance to the physical infrastructure — hyperscale data centers — required to run the largest models. The federal administration issued an executive order laying out a national policy framework and an AI Litigation Task Force to assess and, where appropriate, challenge state laws deemed “onerous”; at the same time, several states have advanced or enacted targeted laws addressing AI companions, deepfakes, and insurance uses of algorithms. What makes the current moment unusual is that one of the Republican Party’s highest-profile governors, Ron DeSantis of Florida, has put himself in explicit opposition to the President’s federal posture. DeSantis frames his approach as local stewardship: a proposed “Citizen Bill of Rights for Artificial Intelligence” and companion measures aimed at curbing hyperscale data-center growth and protecting families. That divergence is not merely rhetorical; it lays the groundwork for a real policy contest over preemption, regulatory authority, and how the United States balances innovation with social and civic risk.

What DeSantis said — the public break​

Key quotations and posture​

At events in mid-December, Governor DeSantis delivered a blunt critique of AI’s social and economic trajectories. He warned against the idea that synthetic media — “fake videos” and “fake songs” — will lead society toward a technological utopia, and he framed certain strains of AI adoption as a risk to human dignity and democratic institutions. “The idea of this transhumanist strain, that somehow this is going to supplant humans and this other stuff, we have to reject that with every fiber of our being,” he said during an event in Jupiter, Florida. He later repeated: “Let’s not try to act like some type of fake videos or fake songs are going to deliver us to some kind of utopia.” These comments are noteworthy because they come at a time when the White House is urging a national, minimally burdensome framework to accelerate AI deployment. DeSantis’ rhetoric places him politically at odds with the administration’s pro-innovation stance while aligning with parents, local communities, and critics worried about civic harms and resource strain.

Political framing​

DeSantis frames his proposals as pragmatic, not merely cultural. His messaging emphasizes concrete harms: job displacement, the erosion of trust via deepfakes, the strain on electricity and water resources posed by hyperscale centers, and the mental-health risks tied to “companion” chatbots aimed at teens. That shift from traditional culture-war framing to a protectionist, economic and consumer-focused set of claims is what makes his divergence from the President politically consequential.

What Florida proposes: the Citizen Bill of Rights and data-center curbs​

The proposal in brief​

Governor DeSantis announced a legislative package he calls the Artificial Intelligence Bill of Rights, paired with measures to protect ratepayers and local control against hyperscale data centers. Highlights from the state press release include:
  • Reinforcing protections against deepfakes and explicit material that exploit minors.
  • Prohibiting the use of a person’s name, image, or likeness by AI without consent in contexts that harm or monetize that use (political ads, fraud).
  • Notices when a consumer is interacting with AI (transparency for chatbots).
  • Parental controls and parental access to minors’ conversations with large language models.
  • Restrictions on AI being used as a substitute for licensed therapy or mental-health counseling.
  • Rules to prevent public utility rate increases or taxpayer subsidies to support hyperscale data center buildouts.

Local opposition to data centers​

DeSantis is also championing local authority to restrict or reject data-center projects on environmental, water-use, noise, and grid-strain grounds. He and allied local officials have spotlighted specific proposals — such as large projects in St. Lucie County and elsewhere — to make the economic and quality-of-life case against unconstrained hyperscale growth. DeSantis’ camp emphasizes no taxpayer subsidies, utility protections, and local permitting authority as central elements.

The federal response: Trump’s executive order and legal limits​

What the executive order does​

The White House issued an executive order on December 11, 2025, that sets out a national policy emphasizing a “minimally burdensome” federal framework designed to preserve U.S. competitiveness in AI. The order instructs the Commerce Department to evaluate state AI laws and tasks the Attorney General with creating an AI Litigation Task Force to challenge state statutes the administration deems inconsistent with federal policy. It also ties eligibility for certain federal funds to alignment with the administration’s AI policy priorities. Note: some contemporaneous press coverage referenced December 12 as the date of the action; the official text and White House release are dated December 11. When reporting around a fast-moving executive action, relying on the official published order is the best way to confirm precise timing and language.

What it cannot do — constitutional and practical limits​

An executive order cannot unilaterally overturn state statutes or permanently preempt state legislative authority where the Constitution and federal statutes do not grant that specific power. Governor DeSantis has publicly argued precisely that point, posting on social media that an executive order cannot preempt state legislative action and that only Congress could theoretically enact preemption through legislation. That is a defensible legal reading: the federal government can preempt state law where Congress legislates pursuant to its enumerated powers or where federal law occupies the field, but unilateral presidential orders are more limited and face immediate legal scrutiny when they purport to override state regulatory authority. The White House's attempt to condition certain federal spending or program eligibility on state compliance — a common administrative lever — raises real political and legal questions (e.g., the Commerce Department policy notices and BEAD funding language within the EO). Those levers can be powerful in practice, but they are still more likely to produce litigation than to instantly sweep aside state-level initiatives.

The moratorium battle: Congress, the Senate vote, and the politics of pause​

A significant episode that shaped the federal-state fight was the push earlier in 2025 to include a long moratorium on state-level AI regulation in large federal legislation. Lawmakers considered language that would block states from enacting AI rules for ten years; that provision proved politically toxic and was struck by an overwhelming Senate vote (99–1) in July 2025. The moratorium’s removal demonstrated strong bipartisan resistance — including among Republicans — to a federal carve-out that would have insulated tech companies from local regulation. That history helps explain both the White House’s urgency and DeSantis’ determination to preserve state authority. Important caveat: moratorium proposals have surfaced repeatedly in various forms (budget packages, defense bills, and “one big” bills), and their language and legislative posture changed over time. Treat references to the “10-year moratorium” as shorthand for a particular legislative episode; the precise scope and enforcement mechanisms proposed earlier in the year varied across drafts.

Deepfakes, teen companion bots and the human costs​

Why DeSantis is focused on synthetic media and AI companions​

DeSantis’ emphasis on deepfakes and AI companions maps to two concrete civic anxieties. First, deepfakes (synthetic audio and video that impersonates real people) pose political and reputational risks and can erode trust in news and institutions. Second, AI companions — chatbots and “character” bots that simulate relationships — have been implicated in a wave of policy concern after several tragic cases where teens interacted with bots before harming themselves. DeSantis and others point to both phenomena as examples where state-level action could protect children and civic discourse.

What California and other states have done​

California enacted companion-bot rules that require operators to refer users reporting suicidal ideation to crisis services and impose other safety requirements and penalties; the law also strengthened penalties for nonconsensual deepfake sexual content involving minors. Those measures followed high-profile investigations and lawsuits alleging connections between chatbot interactions and teen suicides. At the same time, experts caution that causation is complex: litigation and investigations are ongoing, and linking a bot’s conversational content to a specific outcome requires careful forensic work. Policymaking nonetheless proceeded on a harm‑reduction basis after repeated, painful case reports.

Takeaway on causation vs. prevention​

Policymakers are acting on risk, not absolute proof of causal chains. The frequency and pattern of harms reported — and the vulnerabilities of minors using emotionally persuasive systems — make precautionary laws politically and ethically compelling to many state leaders, even while courts and researchers continue to evaluate precise causal pathways. Where possible, laws emphasize crisis-referral mechanisms, transparency, and age protections rather than attempting to ban broad classes of innovation outright.

Data centers: the overlooked battleground​

Why local opposition matters​

Hyperscale data centers are the physical backbone of cloud AI. They require vast electricity and water resources, generate continuous noise and heat, and alter local grid dynamics. Communities confronted with proposed mega‑centers often raise concerns about:
  • Electricity rates and grid reliability when massive new loads are added.
  • Water consumption and local environmental impacts.
  • Noise and light pollution from constant cooling systems and backup generators.
  • Land use changes and the loss of agricultural or conservation lands.
DeSantis’ proposals tap into those concrete quality-of-life and economic questions; his team argues that Floridians should not be left to subsidize corporate infrastructure through higher utility rates or tax giveaways.

The industry counterargument​

Tech companies and some economists push back: data centers create jobs, spur ancillary investment, and can be sited with modern efficiency to minimize environmental impact. Industry argues that scaled deployment is essential for national competitiveness in AI; it also points to innovation in data-center efficiency and renewable energy procurement as partial mitigations. The policy choice — whether to prioritize local constraints or national industrial capacity — is the crux of this debate.

Analysis: strengths, weaknesses and political risk​

Strengths of DeSantis’ approach​

  • Political traction: Local control and consumer-protection messaging resonate across party lines and with voters who see infrastructure and child safety as immediate problems.
  • Targeted interventions: The proposed Bill of Rights focuses on specific, actionable items (parental controls, NIL protections, prohibiting AI therapy) that are simpler to implement and defend than sweeping federal bans.
  • Leverage on local planning: Data center pushback can be effective at the municipal level, offering a practical check on rapid hyperscale expansion.

Weaknesses and risks​

  • Fragmentation risk: If every state enacts different technical standards, compliance costs and legal uncertainty could slow smaller startups or push investment overseas.
  • Economic tradeoffs: Aggressive curbs on data centers might raise costs for local businesses and consumers or cause employers and investment to relocate.
  • Legal exposure: Some measures could face preemption challenges, and the federal government may deploy funding conditions that bite into state budgets.
  • Policy design complexity: Banning “AI therapy” or restricting model uses can have unintended consequences for accessibility and legitimate telehealth applications if definitions are not meticulously crafted.

The political gamble​

DeSantis’ public break with the President is a calculated gamble: it stakes out a populist, governor‑level protector role that can play well with voters at home while testing the bounds of GOP cohesion on tech policy. For the industry, the split increases regulatory uncertainty while giving state legislatures cover to act. For national politics, it tests whether the White House will prioritize centralized federal uniformity or tolerate state experimentation.

Practical takeaways for stakeholders​

For state lawmakers and regulators​

  • Prioritize narrow, defensible rules: crisis referral, transparency, and parental controls are easier to defend in court than broad operational bans.
  • Build interoperable standards: coordinate with neighboring states to reduce harmful patchworks where feasible.
  • Use permitting and local planning authority to evaluate infrastructure impacts comprehensively (grid, water, noise).

For technology companies​

  • Expect a mixed patchwork: budget for compliance and legal teams that can handle state-by-state variance.
  • Invest in proven safety mechanics: provenance metadata, robust crisis-handling protocols, and transparent age‑assurance methods (a minimal sketch of a crisis-referral check follows this list).
  • Engage early with local communities on data‑center siting and shared-benefit agreements to reduce political friction.
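To make the "safety mechanics" point concrete, here is a minimal, hypothetical sketch of the kind of crisis-referral and AI-disclosure wrapper a chatbot operator might place in front of model output. Everything in it is an assumption for illustration: the phrase list, function names, and referral wording are not drawn from any Florida or California statute, and production systems would rely on trained classifiers, human escalation, and jurisdiction-specific crisis resources rather than simple keyword matching.

```python
# Hypothetical sketch only: phrase list, names, and referral text are illustrative,
# not statutory language. Real deployments use trained classifiers and human review.

CRISIS_PHRASES = (
    "kill myself",
    "end my life",
    "want to die",
    "hurt myself",
)

# U.S. 988 Suicide & Crisis Lifeline; the exact wording a law would require is assumed here.
REFERRAL_MESSAGE = (
    "It sounds like you may be going through something serious. "
    "You can reach the 988 Suicide & Crisis Lifeline by calling or texting 988."
)

AI_DISCLOSURE = "You are chatting with an AI assistant, not a human."


def needs_crisis_referral(user_message: str) -> bool:
    """Return True if the message contains any flagged phrase (naive keyword match)."""
    text = user_message.lower()
    return any(phrase in text for phrase in CRISIS_PHRASES)


def build_response(user_message: str, model_reply: str) -> str:
    """Prepend the AI disclosure and, when flagged, the crisis referral to the model's reply."""
    parts = [AI_DISCLOSURE]
    if needs_crisis_referral(user_message):
        parts.append(REFERRAL_MESSAGE)
    parts.append(model_reply)
    return "\n\n".join(parts)


if __name__ == "__main__":
    print(build_response("I want to die", "I'm here to listen."))
```

The design point is that disclosure and referral live outside the model: they are deterministic wrappers that can be logged, audited, and shown to regulators regardless of how the underlying model behaves.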

For Windows users, consumers and civic groups​

  • Watch for disclosure rules in your state — you may gain the right to know when you’re talking with AI or to access records of minors’ conversations under parental-control rules.
  • Be cautious about emotionally persuasive AI companions; seek human help for mental-health crises and use the crisis-referral tools that companies may be legally required to provide.

What to watch next​

  • Legal challenges to the executive order’s leverage: lawsuits and administrative litigation are likely if the administration tries to condition federal funds or file preemption challenges against states. The balance between executive guidance and statutory preemption will be litigated quickly.
  • Florida’s legislative calendar: DeSantis said his proposals will move in the state legislature; how those bills are drafted (definitions, exceptions, enforcement) will determine their longevity and susceptibility to federal challenge.
  • Other states’ trajectories: California and New York’s companion-bot rules show a path for child-safety regulation; several states may follow with variations that create either harmonization (if states coordinate) or fragmentation.
  • Industry reaction: real siting decisions for data centers, corporate pledges on provenance and safety, or litigation funding could materially shift where investments land.

Conclusion​

The DeSantis–Trump split over AI policy reframes a debate that many observers assumed would be resolved along predictable party lines. Instead, it places state-level consumer protection and community control at the center of a national argument about competitiveness, civil resilience, and the distributional impacts of a technology that touches media, labor and civic life. The policy fight is now legal, political and technical at once: executive orders and federal leverage will collide with municipal land-use politics, state legislatures will write nuanced protections with real-world consequences, and companies must adapt to a landscape where both data gravity (where the compute sits) and policy gravity (whose rules prevail) matter.
For Windows users, IT managers and local officials, the near future will be defined by a pragmatic triage: protect people (especially children), shield community resources from unpriced externalities, and keep innovation pathways open so that useful AI can deliver productivity gains. The details — precise definitions, enforcement mechanisms, and the balance between federal and state authority — will be fought in legislatures, courts and planning boards over the coming year. The stakes are not abstract: they are about who controls the rules of engagement for machines that can alter speech, work and governance at planetary scale.
Source: PCMag Australia, "DeSantis Breaks With Trump on AI: ‘Fake Videos’ Are Not ‘Some Kind of Utopia’"
 
