Microsoft has quietly added a default‑on setting to Copilot that explicitly permits the assistant to “use data from Bing, MSN, Edge, and other Microsoft products you’ve used.” If you care about privacy, discoverability, or simply keeping what you do in other Microsoft services out of an AI assistant’s memory, you should seriously consider turning it off right now.
Background / Overview
Microsoft introduced the idea of persistent personalization — sometimes called memory — for Copilot to make the assistant more useful across sessions. The Memory feature can remember facts you explicitly tell Copilot, infer preferences from usage, and apply that information to future replies. In recent hands‑on reporting, a buried secondary control labeled Microsoft usage data appeared under Copilot’s Memory or Personalization settings. The control reads, in plain text, that it will “Let Copilot use data from Bing, MSN, Edge, and other Microsoft products you’ve used.” Reporters found the toggle turned on by default for many accounts.
This is not just a UI tweak. When enabled, it allows Copilot to draw on non‑chat signals generated across Microsoft’s consumer properties and feed them into Copilot’s personalization pipeline. Microsoft says the data is used to personalize the Copilot experience and — critically — that such product usage signals are not used to train its public foundation models. But Microsoft’s public guidance does not publish a definitive, item‑by‑item inventory of what exactly “usage data” contains, and that ambiguity is the core of the concern for privacy‑minded users and compliance teams.
What the new toggle actually says — and why wording matters
The visible change
The new control is labeled as a Microsoft usage data toggle inside the Memory/Personalization area of Copilot’s settings. The UI description explicitly lists Bing, MSN, and Edge as sources and finishes with “and other Microsoft products you’ve used.” That phrase is broad by design and allows for a range of cross‑product signals to be considered usage data. Early testing by journalists and community members shows the toggle is enabled by default for many accounts.
Why the wording is important
Words matter here. “Usage data” can mean an enormous spectrum of signals: search queries, topics you click on, the links you open, feed articles you engage with, which Edge tabs are open, even high‑level telemetry about features you use. Because Microsoft’s UI groups these signals under one switch and hasn’t published a granular list for consumers, users can’t know precisely what they are giving Copilot permission to ingest without additional documentation or auditing tools. Community analysis and reporting have attempted to infer likely components, but those remain inferences rather than confirmed inventories.
What Microsoft says (and what it doesn’t say)
Microsoft’s public documentation on Copilot personalization and memory clarifies several important points:
- Memory and personalization are enabled by default for eligible users unless tenant admins or individual users change the controls.
- Memories are stored in a user’s Exchange mailbox in a hidden folder; they are subject to mailbox security and are discoverable by administrators via eDiscovery and Microsoft Purview tools. That means memories can appear as normal corporate data for compliance purposes.
- Microsoft states product usage signals are used for personalization, not to train public foundation models, but the documentation also makes clear that aggregated telemetry and diagnostic signals can be used for product improvement. The boundary between personalization, telemetry, and product improvement is a nuanced one that relies on contractual commitments and practical auditability.
These official facts are important: storage location, discoverability, and default status are objective and verifiable. What Microsoft has not published is an exhaustive, itemized list of every signal that falls under “Microsoft usage data” in the toggle — and that is the precise gap that should make users and IT teams wary.
Why you might want to turn this off immediately
1) Default‑on expands the blast radius without active consent
A default‑on toggle dramatically increases the number of users included in the personalization data flow simply because most people don’t comb through every setting. A buried option in Memory means hundreds of millions of Microsoft accounts could be seeding cross‑product signals into Copilot by default without explicit awareness. Multiple independent outlets confirmed the setting was on by default during testing.
2) Ambiguity about what counts as “usage data”
Microsoft’s lack of a public, granular list of included signals means users must assume the worst: if you were signed in to a Microsoft account while performing a sensitive search or interacting with sensitive content, the metadata of that interaction could be used to inform Copilot’s responses. Community reporting suggests plausible inclusions — Bing search history, Edge browsing signals, MSN feed activity — but conclusive proof of the full scope is not available. Treat any ambiguity involving private or sensitive content as a risk.
3) Discoverability in legal and corporate contexts
For enterprise users, remember: Copilot memory is stored in Exchange mailboxes and can be found via Purview eDiscovery. That means what Copilot remembers — even inferred facts — may be subject to legal holds, data subject requests, or internal audits. If you are an employee or an organization handling regulated data, this changes the compliance picture in ways that must be actively managed by admin policy. Microsoft documents this storage and discoverability, which makes the privacy implications concrete for corporate customers.
4) Turning off the toggle does not necessarily delete previous signals
Community testing and multiple help guides show the Microsoft usage data toggle prevents new cross‑product signals from flowing into Copilot going forward, but it does not automatically wipe what has already been collected. A separate “Delete all memory” action is required to purge stored memories. That two‑step flow means disabling the toggle is necessary but not sufficient if you want a clean slate.
5) Sensitive data and unexpected inferences
Even if Copilot doesn’t ingest your raw files or messages, combining seemingly innocuous signals (searches, visited pages, clicked headlines) can create sensitive inferences about political views, health concerns, finances, or relationships. Those inferences matter because AI assistants use context to generate responses and suggestions; the more context they receive, the easier it is to create accurate — and potentially intrusive — personalization. Where sensitive data may be involved, err on the side of limiting cross‑product signal sharing.
The case for leaving it on (briefly): benefits Microsoft emphasizes
To be balanced, there are genuine benefits to allowing Copilot to use product‑usage signals:
- Improved relevance: Copilot can surface answers and suggestions tuned to your recent interests and signaled preferences. If you regularly search for cooking recipes, Copilot’s recommendations can become demonstrably more useful.
- Convenience: Memory promises fewer repetitive prompts; Copilot can remember names, preferences, and background details you’ve previously shared or that are inferred from your activity. This is valuable for users who want a long‑running assistant that “knows” them.
- Feature development: Product teams use aggregated usage signals to improve features and fix problems; some degree of telemetry is essential for iterative improvement.
Those benefits matter, and some users will prefer the convenience tradeoff. The point of this article is not to banish the feature but to make sure users and admins understand the tradeoffs and choose deliberately rather than by accident.
How to check and disable the Microsoft usage data toggle (step‑by‑step)
Below are practical, verified steps you can follow right now. The UI wording and exact menu location may vary slightly by Copilot client (web, Edge sidebar, Windows Copilot app), but the sequence is consistent.
- Open Copilot in your web browser or Copilot app and sign in with your Microsoft account.
- Click your account avatar (bottom left in the web UI) and choose Settings.
- Open Memory or Personalization (label varies by client).
- Under Memory, locate the Microsoft usage data toggle — the description typically reads: Let Copilot use data from Bing, MSN, Edge, and other Microsoft products you’ve used. Switch that toggle to Off.
- To remove previously collected memories, click Delete all memory (or Delete memory) and confirm. Turning off the toggle alone will not purge existing stored memory.
- Optionally, toggle the master Personalization and memory control to Off to stop Copilot from saving new memories entirely. Be aware: this will reduce Copilot’s ability to recall preferences across sessions.
Edge users: you can also access Copilot settings through the Edge sidebar or the Copilot toolbar icon and follow the same path to Privacy → Memory → Microsoft usage data. Mobile users: open the Copilot app, tap your profile menu, and navigate to Memory. Community guides document these paths and illustrate the UI.
What enterprise IT and compliance teams need to do now
- Audit tenant defaults — Microsoft’s documentation confirms that Enhanced personalization is on by default for eligible tenants and end users. Admins who need stricter controls should proactively disable Enhanced personalization at the tenant level to prevent users from enabling memory. This is a global, admin‑level control.
- Map Copilot memory to retention and eDiscovery processes — Copilot memories live in Exchange mailboxes and are searchable via eDiscovery/Purview. Compliance teams must update documentation, retention maps, and discovery playbooks so Copilot memory items are covered by legal holds and data subject requests when relevant.
- Communicate to employees — Organizations should issue immediate guidance to employees on how to disable Microsoft usage data and delete memories if appropriate. Provide screenshots and exact steps tailored to the company’s common client configurations. Community threads and help sites contain ready‑made walkthroughs you can adapt.
- Update training and acceptable use policies — Where employees were previously told not to post sensitive data into chats, update policies to include Copilot memory behavior, cross‑product signal considerations, and the new default settings.
- Consider technical mitigations — If needed, use admin controls available via Microsoft Graph to script tenant‑level enforcement, and test how Copilot behavior changes when Enhanced personalization is disabled. Microsoft documents Graph resources for enhancedPersonalizationSetting that admins can query and set programmatically.
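As a rough illustration of what scripted tenant enforcement could look like: the resource name enhancedPersonalizationSetting comes from the Microsoft documentation referenced above, but the endpoint path, property name, and payload shape in this sketch are assumptions — verify them against the current Microsoft Graph reference before use. The snippet only constructs the request; it does not send anything.

```python
import json

# Hypothetical Microsoft Graph base and path. The enhancedPersonalizationSetting
# resource name is documented by Microsoft; this exact path is an ASSUMPTION
# for illustration only.
GRAPH_BASE = "https://graph.microsoft.com/beta"
SETTING_PATH = "/admin/copilot/enhancedPersonalizationSetting"  # hypothetical


def build_disable_request(access_token):
    """Return the pieces of a PATCH request that would turn the setting off.

    The property name isEnabledInOrganization is a placeholder; check the
    real schema in the Graph reference.
    """
    return {
        "method": "PATCH",
        "url": GRAPH_BASE + SETTING_PATH,
        "headers": {
            "Authorization": "Bearer " + access_token,
            "Content-Type": "application/json",
        },
        "body": json.dumps({"isEnabledInOrganization": False}),
    }


if __name__ == "__main__":
    req = build_disable_request("<token>")
    print(req["method"], req["url"])
    print(req["body"])
```

Building the request separately from sending it lets admins log and review exactly what a rollout script would change before pointing it at a production tenant.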
Technical and legal pitfalls to watch for
- Retention misalignment: Microsoft’s documentation notes that Copilot memories do not obey standard Purview retention labels in a straightforward way; saved memories are retained until the user deletes them or until admins take explicit Graph/eDiscovery actions. That divergence can create compliance blind spots if retention governance isn’t updated to include Copilot memory.
- Discovery surprises: Because memories are stored in mailboxes using the Contact item class, a broad export of contacts could unintentionally include Copilot memory. Admins and discovery teams need to use the guidance Microsoft provides for finding Copilot memory specifically.
- Ambiguous signal scope: The lack of a published signal inventory — i.e., the exact list of Bing/Edge/MSN items Copilot can ingest — creates ambiguity for risk assessments. Where legal or regulatory risk is material, this ambiguity must be documented and, if necessary, escalated to Microsoft through contractual channels for clarification or audit access.
Practical recommendations (quick checklist)
- For individual users:
- Turn off Microsoft usage data in Copilot > Settings > Memory.
- Use Delete all memory to remove previously collected items if you want a clean slate.
- Consider disabling the master Personalization and memory toggle if you prefer no cross‑session memory.
- For enterprise admins:
- Audit tenant setting for Enhanced personalization and set it to Off if your risk posture demands it.
- Update retention maps and eDiscovery procedures to include Copilot memory.
- Communicate clear, step‑by‑step guidance to end users with screenshots and support channels.
- For privacy officers and legal teams:
- Treat Copilot memory as discoverable mailbox content and align it with DSR workflows and legal hold procedures.
- Where the business needs more certainty, request clarifying documentation from Microsoft or pursue contractual assurances.
What remains unverified — exercise caution
Several reasonable but currently unverified assertions circulate in forums and social reporting, and they merit explicit caution:
- The precise, exhaustive list of signals classified as Microsoft usage data has not been published by Microsoft. Independent labs and journalists have made educated inferences (Bing history, Edge browsing signals, MSN engagement), but the definitive inventory is not public. Users should treat descriptions of specific signal elements as likely rather than confirmed until Microsoft publishes a full list.
- Claims that Copilot will or will not use particular types of sensitive content (e.g., full unredacted files, private attachments) for personalization vary depending on product tier and tenant policy. Microsoft’s documentation draws distinctions between personalization and training of foundation models, but those boundaries are contractual and technical; when in doubt, treat sensitive materials as off‑limits.
If you need absolute certainty about signal scope for regulatory reasons, the only reliable path is to seek written clarification from Microsoft through formal support or contractual channels.
Community reaction and context
Community threads, help forums, and Windows‑focused outlets have been active since the change surfaced. Many privacy‑minded users called attention to the default status of the toggle and the two‑step disable/delete flow; others pointed out the convenience tradeoffs of keeping personalization enabled. Our own review of community threads and curated reporting captured these reactions, along with practical walkthroughs documenting how the toggle appears across clients. Those threads give admins and users repeatable, step‑by‑step paths to validate the UI on their own accounts.
Final analysis — balancing convenience against control
Microsoft’s decision to centralize cross‑product personalization signals under a single, default‑on toggle in Copilot is a significant UX and privacy moment. On one hand, the ability to draw on Bing, Edge, and other signals can make Copilot measurably more helpful: fewer repeated prompts, more relevant suggestions, and a more personal assistant experience. On the other hand, the default nature of the control, the broad wording of “other Microsoft products,” and the lack of an exhaustive public inventory of included signals make this a setting that should not be left to chance.
For privacy‑conscious consumers and for enterprises with compliance obligations, the prudent choice is to disable the Microsoft usage data toggle and delete existing memories until you can assess what information Copilot has retained. Administrators should treat this change as a governance event: audit tenant settings, update policies, and inform users. For everyone else, be deliberate — if you prefer convenience, understand the tradeoffs; if you prefer control, act now.
Turning off a single toggle takes a minute. The privacy and legal consequences of not knowing what an AI assistant remembers can last much longer. Act deliberately; don’t let default settings decide for you.
Conclusion: Copilot’s new cross‑product memory toggle is a powerful convenience and a potential privacy hazard. Microsoft provides controls — but their placement and default state make them easy to miss. Review your Copilot settings today, disable the Microsoft usage data option if you value privacy, and delete any stored memories you don’t want hanging around in your mailbox.
Source: bgr.com
You Should Disable This Invasive New Microsoft Feature Right Now - Here's Why - BGR