Microsoft’s new flex routing behavior for Microsoft 365 Copilot is a textbook example of how a helpful capacity feature can become a compliance headache overnight. For eligible tenants in the EU and EFTA, Microsoft now allows some Copilot inferencing to move outside the EU Data Boundary during peak demand, including to the United States, Canada, or Australia. The catch is not that the data is unprotected in transit or at rest; the catch is that processing location still matters for organizations trying to stay aligned with GDPR obligations and their own privacy policies.
Overview
The immediate issue is simple: Microsoft has made a routing choice that improves service resilience, but it has also shifted more responsibility onto tenant administrators. According to Microsoft’s own documentation, flex routing is on by default for eligible tenants created after March 25, 2026, while existing eligible tenants were told to check Message Center for their default setting. That means many organizations may not even realize that an AI prompt can be processed outside Europe unless someone actively audits the tenant configuration.
That is why the TechRadar report landed with such force. It framed the feature as a potential GDPR trap, and that concern is not outrageous. Microsoft says the data remains encrypted in transit and at rest, but it also explicitly states that LLM inferencing may occur outside the EU Data Boundary during peak demand. For privacy teams, legal teams, and procurement teams, “encrypted” does not automatically mean “no cross-border transfer issues”.
There is also a broader context here: Microsoft has spent years publicly emphasizing the EU Data Boundary as a major trust and sovereignty milestone. The company says it completed the boundary in 2025 and presents it as an industry-leading framework for storing and processing data in-region. Flex routing is therefore not an isolated toggle; it is a stress test of how far those commitments extend when AI demand spikes.
And this is not merely a European bureaucratic concern. Under GDPR, organizations that handle personal data about EU residents must think in terms of governance, lawful basis, retention, transfer mechanisms, vendor contracts, and documented risk assessments. Microsoft’s guidance explicitly points organizations toward DPIAs, and its GDPR documentation reminds customers that the regulation applies to organizations that offer goods and services to people in the EU or analyze their data, regardless of where the business itself is located. That is a wide net.
What Microsoft Actually Changed
The key change is not that Copilot suddenly became unsafe. It is that Microsoft has introduced a load-balancing behavior that can send some Copilot inference workloads outside the EU Data Boundary when demand peaks. Microsoft says this is designed to maintain a consistent Copilot experience, which tells you the engineering motive: preserve responsiveness, even if that means borrowing compute capacity from other regions.
Flex routing in plain English
In practical terms, flex routing means Copilot can ask for help from data centers beyond Europe when local capacity is strained. Microsoft says this can include the US, Canada, and Australia, and that the system is intended to operate only during periods of peak demand rather than continuously. That sounds narrow, but in enterprise environments even occasional cross-border processing can trigger policy or legal review.
The company also distinguishes between data at rest and inferencing. Data at rest stays within the EU Data Boundary, except for limited pseudonymized data that may be stored outside it for security and operational purposes. The inference step itself, however, is where the prompt gets interpreted and the output produced, and that is the step Microsoft is willing to move.
That distinction matters because a lot of non-specialist commentary collapses the entire issue into a simple “data leaves Europe” headline. The more precise concern is that processing can be outside Europe even if the storage model remains largely regional. For many compliance teams, that is enough to demand a fresh legal analysis.
Why the default matters
The real controversy is the default setting. Microsoft says eligible tenants created after March 25, 2026 have flex routing enabled by default, and the Learn page was last updated on April 3, 2026. That means businesses onboarding Copilot now may inherit a configuration they never intended to approve explicitly. Defaults are policy. Defaults are also where compliance failures hide.
For existing tenants, the story is slightly better because Microsoft directs administrators to Message Center guidance. But “slightly better” is not the same as “safe.” If a feature affects regulated data flows, administrators should not rely on passive notices buried in a service dashboard. They need a documented review process. That is what mature governance looks like.
- New eligible tenants may have flex routing enabled automatically.
- Existing tenants must verify their own default state.
- Peak-load inference may move to non-EU regions.
- Data at rest remains mostly inside the EU Data Boundary.
- Pseudonymized data may still be stored outside the EU Data Boundary.
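The default behavior summarized above reduces to a small decision rule. Here is a minimal sketch of that rule, assuming an illustrative `likely_default` helper and tenant attributes that are not real Microsoft fields; the only documented fact it encodes is the March 25, 2026 cutoff, and the actual setting must always be verified in the admin center.

```python
from datetime import date

# Cutoff stated in Microsoft's documentation: eligible tenants created
# after March 25, 2026 start with flex routing enabled by default.
FLEX_ROUTING_DEFAULT_CUTOFF = date(2026, 3, 25)

def likely_default(created: date, eligible: bool, multi_geo: bool) -> str:
    """Illustrative guess at a tenant's starting flex-routing state.

    This is not an API call; it only encodes the rules described in the
    documentation, as a reasoning aid for administrators.
    """
    if not eligible or multi_geo:
        return "setting not available"   # multi-geo tenants are excluded
    if created > FLEX_ROUTING_DEFAULT_CUTOFF:
        return "enabled by default"      # newer tenants inherit the default
    return "check Message Center"        # existing tenants: default varies

print(likely_default(date(2026, 5, 1), eligible=True, multi_geo=False))
```

The point of writing it down this way is that the answer depends on three tenant attributes, none of which an administrator can safely assume without checking.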
Why GDPR Teams Are Paying Attention
The legal sensitivity here comes from the fact that GDPR is not just about whether data is encrypted; it is about whether processing, transfer, access, and accountability are all handled lawfully. Microsoft’s own GDPR documentation says organizations must consider data subject rights, breach notification, and DPIAs. In other words, privacy compliance is not a box-ticking exercise; it is a governance framework.
Cross-border processing is the pressure point
Microsoft says that when flex routing is enabled, inferencing may happen in the US, Canada, or Australia during periods of peak demand. It also says that if flex routing is allowed, associated pseudonymized data may be stored outside the EU Data Boundary for security and operational purposes. For some organizations, that combination will be acceptable under their transfer mechanism and risk posture; for others, it will not.
This is where the TechRadar warning has some bite. A business can be perfectly comfortable with Microsoft’s broader cloud posture and still decide that AI inference outside the EU is a bridge too far. That is especially true where employee prompts, customer content, or commercially sensitive material may be included in Copilot interactions. The legal issue is contextual, not universal.
Organizations often forget that vendor language about compliance support is not the same thing as a warranty of compliance for the customer. Microsoft says Copilot provides broad compliance offerings and certifications, including GDPR, and that it is committed to complying with applicable laws. But Microsoft also says the customer still has to configure settings that align with its own requirements. That division of labor is normal in cloud computing, but it can be uncomfortable in AI.
The role of transfer mechanisms
Microsoft’s positioning is that cross-border transfers can remain compliant if protected by mechanisms such as the EU-US Data Privacy Framework or Standard Contractual Clauses. That is important, because it suggests the company is not claiming “no transfer.” It is claiming “transfer with safeguards.” For some enterprises, that will be enough. For others, legal teams may still want a more conservative setup.
The practical challenge is that many businesses do not have a single global answer for AI tools. They may allow one type of transfer for HR systems, another for marketing data, and a third for internal productivity tools. Copilot sits awkwardly in the middle because it touches email, documents, meetings, chats, and other sensitive work content. That makes classification harder and risk higher.
- GDPR scope is broader than many IT teams assume.
- Cross-border processing can trigger transfer analysis even if encrypted.
- DPIAs are often required for new or high-risk processing.
- Vendor certifications help, but they do not replace customer governance.
- Copilot’s broad content access raises classification complexity.
Microsoft’s EU Data Boundary Promise Under Strain
Microsoft has spent a lot of corporate energy turning the EU Data Boundary into a trust signal. The company says the boundary significantly reduces cloud data flows out of the EU and provides detailed transparency documentation for customers. That message works well when the conversation is about storage, residency, or regional processing commitments. It gets fuzzier when capacity management pushes active inference elsewhere.
A promise designed for one era, tested by another
The original political and commercial logic behind the EU Data Boundary was straightforward: reassure European customers that cloud services would not casually export their data. Microsoft’s Trust Center frames the boundary as part of “industry-leading data protection,” and the company has repeatedly highlighted reduced data flows and stronger transparency. Flex routing does not overthrow that story, but it does complicate it.
That complexity is what matters. AI systems are not static storage systems. They are dynamic, elastic, and often resource-hungry, especially under peak demand. A service boundary that worked neatly for conventional cloud workloads may become more porous once inference capacity is pooled globally. That is an architectural issue, not merely a marketing issue.
Microsoft’s documentation tries to keep the distinction clean by saying data remains encrypted and at-rest storage remains in-region. But from a compliance perspective, the legal question is whether customers were adequately informed and whether the configuration matches their expectations and obligations. When the feature is enabled by default, those questions get harder, not easier.
What the documentation does — and does not — settle
Microsoft is actually quite transparent in the new Learn page. It explains where flex routing can occur, who can disable it, and which admin centers control the setting. It also states that the Power Platform admin center respects the Microsoft 365 admin center setting unless the Power Platform configuration is more restrictive. That is useful operational detail, but it is not the same as a legal green light.
The documentation also makes clear that not every tenant will even see the option. Customers with multi-geo capabilities are excluded from the setting, even if their tenant is based in the EU or EFTA. That restriction suggests Microsoft is trying to avoid collisions with more complex data-residency architectures, but it also means administrators need to understand which tenant class they actually have.
- Microsoft is emphasizing transparency rather than denial.
- The EU Data Boundary remains intact for at-rest storage.
- Inference routing is now a separate policy layer.
- Multi-geo tenants face different constraints.
- The documentation reduces ambiguity but does not eliminate compliance work.
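The precedence rule described on the Learn page ("respects the Microsoft 365 setting unless the Power Platform configuration is more restrictive") can be modeled as a most-restrictive-wins conjunction. A minimal sketch under that reading, with the boolean simplification being my assumption rather than Microsoft's formal semantics:

```python
# "More restrictive wins": the Power Platform admin center follows the
# Microsoft 365 admin center setting unless its own configuration is
# stricter. Booleans are a simplification of the documented rule.
def effective_flex_routing(m365_allows: bool, power_platform_allows: bool) -> bool:
    """Effective policy for a workload governed by both admin centers."""
    return m365_allows and power_platform_allows

# Either admin center saying "do not allow" blocks cross-region inference.
print(effective_flex_routing(True, False))
```

The practical consequence for auditors is that checking only one admin center is not enough to know what a given workload will actually do.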
How to Turn It Off
For organizations that decide flex routing is not acceptable, Microsoft provides a path to disable it. The setting is managed in the Microsoft 365 admin center by someone with the AI Administrator role. Microsoft says administrators should go to Copilot, then Settings, then View all, and then select the flexible inferencing option for peak load periods.
The admin steps
If you need the shortest operational version, the flow is straightforward:
- Sign in to the Microsoft 365 admin center as an administrator with the AI Administrator role.
- Navigate to Copilot.
- Open Settings.
- Select View all.
- Find Flexible inferencing during peak load periods.
- Choose Do not allow flex routing.
Still, administrators should not treat the toggle as the end of the story. The change should be documented, and the reason for the choice should be recorded in the privacy or security control register. That recordkeeping may matter later if auditors or regulators ask how the decision was made.
Governance you should do before changing the toggle
Turning it off is easy. Proving that you thought it through is the harder part. Organizations should make sure legal, security, and data protection stakeholders have all reviewed the decision, especially if Copilot is being used with sensitive documents or regulated personal data.
- Verify whether the tenant is eligible for the setting.
- Check whether the tenant was created before or after March 25, 2026.
- Confirm whether multi-geo changes the available options.
- Document the chosen routing policy.
- Update internal privacy notices if processing behavior changes.
- Reassess your DPIA if Copilot usage is broad or sensitive.
Enterprise Impact vs Consumer Expectations
This issue is overwhelmingly an enterprise story, because Microsoft 365 Copilot is a workplace service and the compliance burden falls on the organization, not on an individual consumer. But the consumer instinct still matters because employees tend to assume that if a tool is branded as “Copilot” and lives inside Microsoft 365, it behaves consistently everywhere. That assumption is now more dangerous than helpful.
Why enterprises feel the pain first
Enterprises are the ones who have to decide whether cross-border inference is acceptable for HR, finance, legal, operations, and executive workloads. They are also the ones with the highest chance of having contractual promises to customers, suppliers, and workers that go beyond Microsoft’s standard cloud terms. A capacity feature that is fine for one department can be a compliance issue for another.
The practical challenge is that Copilot is designed to be everywhere. It touches mail, meetings, documents, and chat, which means it can easily ingest all kinds of personal and business content. That makes the boundary between productivity and regulated processing much thinner than it would be in a standalone analytics tool. It is the breadth of Copilot that makes the routing question so serious.
Some enterprises will also see this as a governance maturity test. If a company cannot quickly identify which regions its AI processing uses, it likely does not have a robust AI control framework yet. That is not just a Microsoft problem; it is a sign that the organization’s own AI adoption has outrun its policy layer.
Why consumers and smaller businesses should still care
Small businesses may think this issue is too specialized to matter, but that would be a mistake. GDPR applies based on the nature of the data and the people involved, not on company size alone. A small firm processing EU customer or employee data still has obligations, and it still needs to know whether a vendor feature changes data flow behavior.
There is also a less obvious risk: smaller teams are less likely to have a privacy lawyer or dedicated compliance engineer watching Microsoft Message Center updates. That makes default settings especially powerful. If a tool turns something on by default, many small organizations will never discover the change until something goes wrong. That is the kind of silent risk regulators dislike most.
- Enterprises face the biggest governance burden.
- Small businesses face the biggest visibility problem.
- Copilot’s broad access increases the sensitivity of routing choices.
- Default-on behavior can bypass internal review.
- Compliance maturity matters as much as vendor promises.
Competitive and Market Implications
Microsoft’s move will not exist in a vacuum. It will influence how rivals position their own enterprise AI services, especially in Europe where data sovereignty has become a sales requirement rather than a niche feature. Vendors that can promise stricter in-region processing may gain an advantage with cautious buyers, even if their AI experiences are less flexible.
The sovereignty premium is real
European buyers increasingly ask a simple question: where does the model run, where does the data travel, and what happens when the region is busy? Microsoft’s answer is effectively, “We can burst globally while keeping guardrails in place.” That may be technically sensible, but competitors will absolutely use it to pitch themselves as more rigidly sovereign.
This could become a subtle but important market split. Some customers will prefer maximum AI availability and accept broader routing. Others will prioritize jurisdictional certainty over a marginally smoother experience. The companies that win those deals will be the ones that explain their trade-offs most clearly. Transparency will matter almost as much as the actual architecture.
Microsoft also risks reinforcing a perception problem: that compliance is becoming a configurable afterthought rather than a first-class product promise. That perception may not be fair, but it can still shape procurement outcomes. In enterprise software, trust is often lost not because the feature is catastrophic, but because it feels administratively inconvenient.
What rivals may do next
Expect competing platforms to sharpen their language around regional inferencing, residency guarantees, and administrative control. Some will emphasize that prompts never leave a specified region; others will stress that inferencing can be locally pinned by default. A few may even market the absence of dynamic bursting as a compliance feature.
That does not mean Microsoft is “wrong” to use flex routing. It means the competitive bar for AI trust is rising. Customers now expect not just a powerful assistant, but a predictable one. In regulated industries, predictability is a feature.
- Sovereignty requirements can shape buying decisions.
- Competitors may advertise stricter regional pinning.
- Default settings increasingly influence procurement.
- Transparency is becoming a competitive differentiator.
- AI trust is now tied to administrative control, not just model quality.
Strengths and Opportunities
Microsoft deserves credit for being unusually explicit about what flex routing does, where it can route, and how administrators can turn it off. The feature may be controversial, but the documentation is not vague, and that transparency gives customers a real chance to make informed choices. The upside is that companies with less sensitive workloads can preserve service quality without guessing about the engineering trade-off.
The company also appears to be trying to balance growth, capacity, and regional commitment rather than abandoning European data protections outright. That matters because AI capacity shortages are a genuine operational issue, and rigidly local-only systems can degrade user experience during demand spikes. In that sense, flex routing is a pragmatic response to a real scalability problem.
- Clear admin control over the setting.
- Explicit disclosure of possible routing regions.
- Data at rest remains inside the EU Data Boundary.
- Encrypted transit reduces exposure during transfer.
- Peak-load routing may improve responsiveness.
- More mature tenants can align the setting with internal policy.
- Customers retain the option to disable the feature.
Risks and Concerns
The biggest risk is not technical failure; it is silent mismatch between vendor defaults and customer expectations. If an organization assumes Copilot stays in-region and never checks the tenant setting, it may unknowingly create a compliance problem. That risk is amplified by default-on behavior for newer eligible tenants.
Another concern is the ambiguity between pseudonymized storage, cross-border inference, and legal transfer analysis. Microsoft provides safeguards, but many organizations will still need to decide whether the combination is appropriate for their data categories and contractual obligations. What is “allowed” in a technical sense is not always acceptable in a governance sense.
- Default-enabled behavior can surprise administrators.
- Cross-border inferencing may conflict with internal policies.
- Pseudonymized storage outside the boundary may need documentation.
- Employee and customer notices may need updating.
- DPIAs may be required or need revision.
- Multi-geo environments may face added complexity.
- Legal comfort with SCCs or the Data Privacy Framework may vary.
Looking Ahead
The next phase of this story is likely to be less about whether flex routing exists and more about how often organizations choose to disable it. If the majority of European tenants opt out, that will signal that compliance sensitivity still outweighs capacity convenience. If most keep it enabled, Microsoft will have evidence that customers are comfortable with the trade-off.
We should also expect Microsoft to keep refining its European data narrative as AI workloads grow. The company has already invested heavily in EU Data Boundary messaging and sovereign controls, and that messaging will need to coexist with increasingly global inference infrastructure. That tension will not disappear; it will just become a normal part of enterprise AI procurement.
A likely near-term development is more detailed guidance from customers, regulators, and consultants on when flex routing requires a DPIA, when it should be documented as a transfer risk, and when it should simply be disabled. The market is still learning how to govern AI elasticity without compromising sovereignty claims.
- More admins will audit Copilot routing settings.
- Privacy teams may update DPIAs for AI usage.
- Procurement teams may tighten vendor questionnaires.
- Competitors may push stricter regional guarantees.
- Microsoft may expand its guidance as adoption grows.
Source: TechRadar This new Microsoft 365 Copilot feature could make thousands of businesses break GDPR – here's how to turn it off