Regulators Target Cloud Lock-In: Fees, Bundles and DMA Probes

Antitrust enforcers across the United States, the United Kingdom and the European Union have zeroed in on the commercial plumbing of the internet — the hyperscale cloud providers — to ask a blunt question: are today’s dominant cloud contracts and pricing practices deliberately sticky, and are those practices harming competition and innovation?

Background / Overview

Regulators’ actions in 2025 follow years of complaints, industry research and high‑profile market reports documenting how a small number of hyperscalers — chiefly Amazon Web Services (AWS), Microsoft Azure and Google Cloud — now account for the lion’s share of cloud infrastructure spending. The U.K. Competition and Markets Authority (CMA) completed a 21‑month market investigation in July 2025 that concluded AWS and Azure each account for roughly 30–40% of UK cloud spend, with Google Cloud trailing in the single digits; the CMA flagged egress fees, incompatible architectures and restrictive licensing as key contributors to lock‑in.
Those national findings helped catalyze action in Brussels and Washington. On 18 November 2025 the European Commission opened three coordinated market investigations under the Digital Markets Act (DMA): company‑level probes into AWS and Microsoft Azure to determine whether particular cloud services should be treated as “gatekeepers,” and a horizontal study testing whether the DMA’s toolkit is fit for cloud infrastructure markets. In the United States the Federal Trade Commission (FTC) has been building a record since a 2023 Request for Information, collecting industry feedback about data egress fees, minimum‑spend commitments, bundled licensing, opaque billing, and technical switching costs. The FTC’s interest widened in 2024–2025 as startups and AI developers reported abrupt pricing changes and contractual entanglements that, they say, made migration or multicloud strategies impractical.
Industry trackers corroborate concentration at a global level: independent market research shows the three hyperscalers capturing roughly 60–65% of global cloud infrastructure spend in 2024–2025, with AWS typically around the low‑to‑mid‑30s as a percentage and Microsoft and Google occupying the next largest slices. Those numbers help explain why regulators view switching frictions as more than an annoyance — they are potential structural barriers to effective competition.

Why regulators are focused on “stickiness”

The complaint vector: egress fees, commitments and bundles

Regulatory interest targets a clear set of commercial mechanics that, together, make switching providers expensive, time‑consuming or operationally risky for business customers:
  • Data egress fees (charges to move data out of a provider) that can balloon migration costs and deter customers from leaving a platform (a rough cost sketch follows this list). The CMA devoted working papers to the empirical effects of these fees; the European Commission’s sectoral probe likewise lists egress charges as a priority.
  • Minimum‑spend commitments and long‑term discounts that effectively function as exclusive‑dealing incentives: big discounts in return for contractual commitments can privilege incumbents and penalize customers who want to spread workloads across multiple clouds. The FTC’s RFI found multiple comments flagging these provisions as lock‑in mechanisms.
  • Bundled licensing and proprietary packaging, including software licensing terms that create price differentials depending on whether a workload runs on the vendor’s own cloud. The CMA specifically highlighted Microsoft licensing as a potential barrier to running Microsoft workloads on rival clouds. Other complainants — including Google Cloud in a 2024 complaint — focused on the same mechanics.
  • Opaque billing, unilateral price changes and control‑plane differences that make cost forecasting and migration planning difficult, especially for startups on tight margins. The FTC’s outreach collected many such anecdotes.
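To make the egress‑fee mechanics concrete, here is a back‑of‑the‑envelope sketch of how data‑transfer‑out charges can weigh on a decision to leave a provider. The per‑GB rate, free tier, estate size and monthly bill are illustrative assumptions, not any provider’s actual price list.

```python
# Back-of-the-envelope migration cost driven by egress fees.
# All rates and volumes below are hypothetical assumptions for illustration only.

def migration_egress_cost(data_tb: float, egress_per_gb: float, free_tier_gb: float = 100.0) -> float:
    """One-off data-transfer-out cost (USD) to move `data_tb` terabytes off a provider."""
    billable_gb = max(data_tb * 1024 - free_tier_gb, 0.0)
    return billable_gb * egress_per_gb

estate_tb = 500            # assumed size of the data estate to migrate
rate_per_gb = 0.09         # assumed internet egress rate, USD per GB
monthly_hosting = 40_000   # assumed current monthly infrastructure bill, USD

one_off = migration_egress_cost(estate_tb, rate_per_gb)
print(f"One-off egress cost: ${one_off:,.0f}")
print(f"Equivalent to {one_off / monthly_hosting:.1f} months of the current bill")
```

Under these assumed numbers, the one‑off exit bill alone exceeds a month of hosting spend, before any engineering or licensing costs are counted.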

Why these practices matter now (and why AI raises the stakes)

Cloud is no longer incidental infrastructure; it is the strategic backbone for banking, government services, streaming platforms and, crucially, large‑scale AI training and inference. AI workloads rely on specialized accelerators, high‑throughput networking and integrated stacks — and those resources are not evenly distributed across providers. When compute and data portability are asymmetric, the cost of changing providers rises not only in dollars but in lost access to capacity and specialized tooling. Regulators view that dynamic as a multiplier of potential market power.

What the agencies are actually doing

United Kingdom — CMA: fact‑finding and SMS recommendation

The CMA’s final report (31 July 2025) closed its market investigation and recommended pursuing Strategic Market Status (SMS) probes for AWS and Microsoft under the U.K.’s new digital markets regime, citing concentration, low switching rates (customers switch providers at an estimated rate of less than 1% per year), and contractual and technical frictions that materially limit customer mobility. The CMA’s approach prioritized targeted remedies — transparency, portability standards, and potentially SMS designation — rather than wholesale structural breakups at this stage.

European Union — DMA market investigations and a horizontal probe

The European Commission’s 18 November 2025 announcement initiated two company‑level DMA market investigations, one on AWS and one on Azure, and a horizontal study assessing whether DMA obligations (non‑discrimination, interoperability, portability) map coherently to cloud infrastructure. If the Commission finds certain cloud services act as “important gateways,” it can designate them as gatekeepers and impose ex‑ante obligations with tight compliance timetables and steep fines for breaches.

United States — FTC fact‑gathering and AI partnerships scrutiny

The FTC launched its cloud probe via a 2023 Request for Information and has continued to gather detailed submissions and documents; in parallel the agency used its 6(b) authority in 2024–2025 to study large cloud–AI partnerships (e.g., Microsoft‑OpenAI, AWS‑Anthropic) for potential exclusivity or tied‑spend effects that could raise switching costs for AI developers. The FTC’s posture to date has been research‑driven, with the option to pivot to enforcement if the evidence supports a finding of anticompetitive conduct.

How the hyperscalers respond — claims, concessions and counterarguments

The hyperscalers reject the “lock‑in” framing while acknowledging complexity.
  • Amazon (AWS) argues that egress fees reflect legitimate network and transport costs and that price controls could produce unintended harms by undermining investment in infrastructure. AWS pointed regulators to commercial programs to ease switching and, notably, announced in March 2024 that it would waive certain data‑transfer‑out fees for customers who wanted to move to rival clouds — a public concession intended to blunt one of the main complaints. Reuters documented the change at the time.
  • Microsoft describes the cloud market as dynamic and competitive, stressing continued investment and innovation (especially for AI) and disputing empirical claims that its licensing practices are exclusionary. Microsoft told the CMA the agency “misses the mark,” arguing market dynamics, not restrictive conduct, explain its commercial position.
  • Google framed regulatory scrutiny as a chance to correct pricing and choice imbalances. Google’s high‑profile complaint against Microsoft (filed in 2024) crystallized the licensing dispute; Google later withdrew that formal complaint after the Commission launched the DMA investigations, viewing the DMA process as the more effective remediation route.
These rebuttals are not purely rhetorical: hyperscalers can and do point to investments in interconnection, open‑source contributions and migration tooling that lower apparent switching costs for some workloads. The central regulatory question is whether those investments are sufficient — and whether the remaining frictions are the product of competition on the merits or exclusionary design.

The AI competition angle: why regulators care about compute, not just cookies

AI startups report that cloud pricing opacity and contractual ties materially affect their unit economics when training and serving models. The FTC’s 6(b) study of large AI partnerships showed how multi‑billion‑dollar investments, equity stakes and revenue‑share arrangements between clouds and AI firms can embed spending commitments and preferential access to compute that advantage certain partners. When access to specialized accelerators and managed AI stacks becomes a competitive bottleneck, commercial terms that privilege in‑house models or partners become competition vectors, not mere commercial detail.
Cloud‑native AI also amplifies technical lock‑in: model weights, dataset formats, orchestration pipelines and accelerator‑specific optimizations are not trivially portable. Even if raw data and VM images can be exported, full fidelity of performance and cost at scale is a separate engineering question — and one that creates switching costs beyond the nominal egress bill. Regulators are treating those technical realities as part of the competitive assessment.
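One practical hedge against this kind of technical lock‑in is exporting models to an open interchange format. The sketch below assumes PyTorch and its ONNX exporter are available, with a toy network standing in for a real model; it reduces framework and serving‑stack dependence but does nothing about accelerator‑specific performance tuning.

```python
# Minimal sketch: export a trained model to the open ONNX format so serving is not
# tied to one framework or one cloud's managed inference stack.
# The tiny model and file name are placeholders for illustration only.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    """Stand-in for a production model; a real network would be far larger."""
    def __init__(self, in_features: int = 32, classes: int = 4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_features, 64), nn.ReLU(), nn.Linear(64, classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = TinyClassifier().eval()
example_input = torch.randn(1, 32)   # example batch used to trace the graph for export
torch.onnx.export(
    model,
    example_input,
    "classifier.onnx",
    input_names=["features"],
    output_names=["logits"],
)
# classifier.onnx can now be served by any ONNX-compatible runtime, which reduces
# (though does not eliminate) dependence on a single provider's inference stack.
```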

Economic analysis: the trade‑offs of intervention

Scholars and policy analysts are split on remedies. Some—citing concentration and observed switching frictions—argue for ex‑ante constraints and enforced portability standards. Others caution that heavy‑handed rules can reduce incentives to invest in data centers, networks and specialized AI hardware, ultimately raising costs and slowing innovation.
  • The International Center for Law & Economics (ICLE) warned the CMA that concentration alone is a poor proxy for harm and that designating firms as having Strategic Market Status could be disproportionate. ICLE emphasized careful economic analysis before imposing ex‑ante constraints.
  • Academic work (for example, a TSE working paper co‑authored by Gary Biglaiser and colleagues) frames cloud economics as distinct from classic consumer platforms; the paper highlights how egress fees and vertical integration can create switching costs while noting that banning a single instrument (egress fees) may push providers to invent alternative mechanisms for raising exit costs. In short, the economics point to trade‑offs: remedies may raise consumer surplus in one dimension while shifting costs into others.
One thing regulators will need is careful empirical measurement: how often do customers actually migrate, what are the realized switching costs in representative migration projects, and do pricing differentials persist after competitive responses? Headline numbers — “up to 400% markups” or multi‑billion aggregate damages quoted in some industry commentary — must be treated as contested and will require forensic contract‑level evidence before being relied upon in enforcement or remedies. Several regulators explicitly stress this evidentiary standard.
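To illustrate the kind of measurement involved, the sketch below computes an annual switching rate and realized switching costs from a handful of invented migration records. The figures are placeholders; any real analysis would rest on the contract‑level evidence regulators are now gathering.

```python
# Illustrative only: the records below are invented to show the shape of the
# measurement, not real market data.
from statistics import median

customers_at_start_of_year = 2_000            # assumed customer base
completed_migrations = [                      # hypothetical migration projects
    {"data_tb": 120, "egress_usd": 11_000, "engineering_usd": 85_000},
    {"data_tb": 45,  "egress_usd": 4_200,  "engineering_usd": 30_000},
    {"data_tb": 300, "egress_usd": 27_500, "engineering_usd": 240_000},
]

switching_rate = len(completed_migrations) / customers_at_start_of_year
realized_costs = [m["egress_usd"] + m["engineering_usd"] for m in completed_migrations]
egress_shares = [m["egress_usd"] / (m["egress_usd"] + m["engineering_usd"]) for m in completed_migrations]

print(f"Annual switching rate: {switching_rate:.2%}")
print(f"Median realized switching cost: ${median(realized_costs):,.0f}")
print(f"Egress share of the median project cost: {median(egress_shares):.0%}")
```

Even in this toy example, the egress bill is only a fraction of the realized cost of leaving, which is why regulators are probing contractual and technical frictions as well as fees.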

Potential remedies on the table

Regulators have a menu of interventions that scale from transparency to structural measures. Likely near‑term options include:
  • Transparency requirements for billing, egress pricing and portability costs to enable meaningful price comparisons (a comparison sketch at the end of this section illustrates the idea).
  • Portability and interoperability mandates, potentially enforced through DMA‑style obligations or national equivalents, requiring export tooling, standardized APIs or guaranteed data‑format portability for business users.
  • Limits on switching charges, consistent with the EU Data Act, which phases out switching fees by 12 January 2027 and caps the reduced charges permitted in the interim. The Data Act explicitly constrains switching charges and requires pre‑contract disclosure of switching costs.
  • Fair‑use licensing obligations that prevent discriminatory surcharges for running commercial software on rival clouds (one of Google’s main claims).
  • Targeted designation and behavioural remedies (e.g., SMS or DMA gatekeeper duties) for the largest providers, which could force non‑discrimination and interoperability obligations.
Less likely, but not impossible in the long run, are structural remedies — divestiture or enforced separation — because they would be disruptive, legally fraught and politically contentious. Most regulators so far prefer to pursue behavioural and interoperability remedies, at least as the first line of action.
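As an illustration of what mandated transparency could enable, the sketch below models a pre‑contract switching‑cost disclosure and ranks offers by total exit cost. The disclosure fields and figures are hypothetical (neither the Data Act nor any regulator prescribes this format), but standardized numbers of this kind are what would make such comparisons routine for buyers.

```python
# Hypothetical illustration of how standardized switching-cost disclosures could
# make offers comparable; no regulator has prescribed this exact format.
from dataclasses import dataclass

@dataclass
class SwitchingDisclosure:
    provider: str
    egress_per_gb_usd: float       # disclosed data-transfer-out rate
    switching_fee_usd: float       # flat charge for assisted switching, if any
    notice_period_days: int        # contractual notice required before exit

def exit_cost(d: SwitchingDisclosure, estate_gb: float) -> float:
    """Total disclosed cost of leaving, given an estate size in GB."""
    return d.egress_per_gb_usd * estate_gb + d.switching_fee_usd

offers = [
    SwitchingDisclosure("Provider A", 0.09, 0.0, 30),
    SwitchingDisclosure("Provider B", 0.05, 2_500.0, 60),
]
for offer in sorted(offers, key=lambda d: exit_cost(d, estate_gb=200_000)):
    print(f"{offer.provider}: estimated exit cost ${exit_cost(offer, 200_000):,.0f}")
```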

Practical implications for IT buyers and enterprises

For procurement teams, platform engineers and CIOs, the unfolding investigations will matter in concrete ways:
  • Short term: expect more vendor dialogues and possible contractual concessions as providers respond to regulator questions. AWS’s 2024 removal of some data‑transfer fees for switching customers is an example of preemptive commercial change. Yet that concession does not resolve all portability or licensing friction.
  • Medium term: potential mandates (transparency, portability) will make multi‑cloud strategies more tractable cost‑wise — but the engineering work of re‑architecting for multi‑cloud will remain non‑trivial for many production workloads.
  • Strategic: buyers should inventory the true cost of lock‑in in their environments — not only egress bills but custom integrations, managed‑service bindings, and accelerator‑specific optimizations — and quantify migration scenarios to inform negotiation leverage.
  • For AI teams: evaluate dependency on cloud‑specific managed AI stacks and specialized hardware. Where feasible, design model training and serving pipelines with portability in mind (containerized workloads, open model formats, abstraction layers for accelerators).
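As a small example of the “abstraction layers for accelerators” idea, the sketch below shows training code that asks for the best available device at runtime rather than hard‑coding one vendor’s hardware. It uses standard PyTorch device selection; real pipelines would layer similar indirection over storage, orchestration and serving.

```python
# Minimal sketch of an accelerator abstraction layer: training code asks for the
# best available device at runtime instead of hard-coding one vendor's accelerator.
# Device names follow PyTorch conventions; nothing here is cloud-specific.
import torch

def best_available_device() -> torch.device:
    """Prefer NVIDIA CUDA, then Apple MPS, then CPU, without hard-coding any of them elsewhere."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    mps = getattr(torch.backends, "mps", None)   # guard for older PyTorch builds
    if mps is not None and mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = best_available_device()
model = torch.nn.Linear(16, 1).to(device)        # placeholder model
batch = torch.randn(8, 16, device=device)        # placeholder batch
loss = model(batch).pow(2).mean()                # dummy objective, just to exercise the device
loss.backward()
print(f"Ran a forward/backward pass on: {device}")
```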

Where the evidence is strong — and where it isn’t

The following assertions are well supported by public records and market data:
  • Regulators in the U.K., EU and U.S. have active inquiries into cloud pricing and contractual practices; the EC launched DMA market investigations on 18 November 2025.
  • The CMA’s July 2025 final decision quantified AWS and Azure’s substantial UK market shares and identified egress fees and licensing frictions as sources of lock‑in.
  • Independent market trackers show the top three hyperscalers capturing roughly 60–65% of global cloud infrastructure spend in 2024–2025.
The following claims are headline allegations that require careful empirical verification before being elevated to regulatory findings:
  • Specific markup figures (for example, widely‑circulated claims of “up to 400%” markups for certain licensing scenarios) are contestable and derive from selective scenarios or vendor‑supplied comparisons; regulators have flagged such numbers as allegations pending documentary verification. Regulators will require contract‑level evidence to validate any large aggregate damages calculation.
  • The causal attribution that any single practice (e.g., banning egress fees outright) will unambiguously improve consumer welfare is uncertain: academic work shows that providers might replace one switching instrument with others, and remedies can shift costs into other products or reduce investment incentives. These are empirical questions that demand counterfactual analysis.

What comes next — process, timing and likely chronology

  • The CMA has closed its market investigation and recommended further SMS designation probes; follow‑up U.K. action could begin in 2026.
  • The European Commission set an accelerated fact‑finding timetable for the DMA company inquiries (roughly 12 months from launch), though complex technical evidence can elongate the process. The horizontal study assessing DMA fit‑for‑purpose may run longer.
  • The FTC continues to gather submissions and produce staff reports; it retains the authority to pivot from study to enforcement if the record supports a finding of anticompetitive conduct. The FTC’s 6(b) work on AI partnerships shows how the agency is using data collection tools to probe embedded commercial arrangements beyond simple price‑setting.

Conclusion — a regulatory moment for cloud that demands nuance

The current wave of investigations is not a single‑issue sprint but a multi‑jurisdictional campaign to determine whether cloud commercial and technical design choices erect structural barriers to competition. Regulators are rightly focused on real‑world switching costs — those that combine fees, licensing differentials, proprietary APIs and specialized hardware access into durable lock‑in.
Policymakers face a classic regulatory trade‑off: remedy lock‑in risks without undermining the investment incentives that built the global cloud platform ecosystem. The most promising path is targeted, evidence‑based remedies that improve transparency and portability where switching frictions are unjustified, coupled with careful measurement to avoid blunt instruments that simply shift the barrier to a different form.
For IT leaders and developers, the practical advice is unchanged but urgent: catalogue the full scope of vendor dependencies, quantify migration costs for your critical workloads, and insist on contractual clarity around egress, portability tooling and licensing parity. The regulatory spotlight will likely force more public answers from providers; when that happens, customers who have done their homework will be best positioned to translate regulatory change into competitive advantage.
Source: Are Clouds Too Sticky? Antitrust Authorities Probe Lock-in Pricing Complaints | JD Supra