Copilot Privacy: 7 Essential Settings to Secure Your Data

Microsoft Copilot can be a genuine productivity multiplier — and also a surprising privacy risk if you accept defaults without looking. PCMag’s hands‑on checklist of “7 settings I changed right away to protect my privacy” is an excellent quick-start for anyone who wants the benefits of an assistant without handing over a permanent profile of their life. I followed those same seven moves, verified the controls against Microsoft’s published behavior, and expanded the checklist with practical trade-offs, enterprise caveats, and a 10–15 minute lockdown routine you can use right now.

Background

Microsoft built Copilot to be deeply integrated across Windows, Edge, the Copilot apps, and Microsoft 365. That integration is where the convenience comes from: Copilot can reference files, calendar entries, browsing context and prior chats to produce more helpful, personalized outputs. But the same integration creates multiple channels where personal data can be recorded, retained, and — unless you act — used to improve models or fuel targeted experiences. Microsoft exposes toggles that let users control many of those channels, but defaults, platform differences, and enterprise policies complicate the picture.
Two framing rules before we dig into the seven settings:
  • If a setting controls model training or memory/personalization, treat it as privacy‑sensitive.
  • If a setting controls diagnostics, ad personalization, or shared links, treat it as telemetry/advertising control with immediate implications for how Microsoft and partners may use your data.

Where to find Copilot privacy controls

Copilot privacy settings live in the same places you use the assistant: the Copilot web page, the Copilot desktop apps (Windows and macOS), and the Copilot mobile apps (iOS and Android). Changes you make on one device propagate across devices tied to the same Microsoft account, but the exact menu labels and some features (for example, “Data Control” on macOS) vary by platform. On Windows the path is typically Profile → Settings → Privacy; on the web it’s Profile → Privacy; on mobile it’s Menu → Account → Privacy.
Practical note: some Mac views call this “Data Control,” and the Copilot web UI exposes extra commands like share, export, summarize, or report from conversation menus. If a UI is missing a particular command, use the Copilot web settings — they’re the most complete and authoritative surface for consumer controls.

The seven settings (what they do, why they matter, and how to change them)

Below I reproduce the seven privacy moves in PCMag’s checklist, verify the technical effects, and add practical guidance and caveats.

1. Opt out of model training (text and voice)

What it does
  • Two toggles control whether your interactions are eligible for use in Microsoft model improvement pipelines: Model training on text and Model training on voice. When enabled, anonymized text and voice interactions can be collected and used to refine models. When disabled, those interactions are not supposed to be used for Microsoft’s general model training.
Why you might change it
  • Even with anonymization claims, many users prefer to reduce the chance that private troubleshooting, drafts, or sensitive queries become training material. Opting out is the highest‑impact privacy lever for most consumers because it reduces the primary data flow out of your account.
How to change it (quick steps)
  • Open Copilot (web, Windows app, or mobile).
  • Profile → Privacy.
  • Toggle off Model training on text and Model training on voice.
Trade-offs
  • You’ll be opting out of contributing to aggregate improvements that could indirectly benefit you. For most users, the privacy benefit outweighs the marginal future usefulness of training‑derived improvements. Independent how‑to guides recommend disabling these if privacy is a priority.

2. Tell Copilot to forget (Memory / Personalization)

What it does
  • The Memory (or Personalization on macOS) setting lets Copilot retain factual details you share in chats — your name, job, preferences, household information, and other benign context — to produce more personalized, context-aware results later. Microsoft says sensitive categories (age, gender, race, sexual orientation, health, political affiliation) are excluded, but the filtering is not perfect.
Why you might change it
  • Memories are convenient — they reduce repetition and make follow-up conversations smoother. But persistence is the risk: once stored, those facts increase your long‑term exposure and can be used by any system processes that read memory. If you prefer stateless interactions or have sensitive information you never want stored, turn it off.
How to change it
  • Profile → Privacy → disable Personalization / Memory.
How to delete already‑saved memories
  • From the same Privacy page you can select Delete memory and remove stored facts. Note: at the time of writing, Microsoft does not provide a user‑visible “memory contents” list in every client, so deletion is coarse — it removes the memory store rather than allowing item‑by‑item review in many clients. If that matters to you, use the web UI where deletion controls are most consistent.
Trade-offs
  • You’ll lose convenience. Copilot will ask for context more often. If you rely on memory for frequent, benign personalization (e.g., writing style, calendar preferences), consider keeping it enabled but deleting specific, sensitive facts manually and rotating credentials if you ever pasted them into chat.
Caution
  • If you use Copilot through a work or school account, tenant administrators can govern memory centrally and memories may be stored in your Exchange mailbox; admins can discover and manage those memories via compliance tools. Don’t assume corporate Copilot is private from your employer.

3. Manage your chat history

What it does
  • Copilot keeps a history of conversations so you can resume threads later. The left sidebar lists each conversation with a menu to rename, share, or delete it, and you can export or delete your entire history from the account privacy pages.
Why you should care
  • History helps productivity, but retained chats are additional artifacts tied to your account. Automated systems and, in some cases, human reviewers may access them for threat detection or policy review. If a conversation contains anything sensitive — passwords, tokens, account numbers, private legal or medical information — remove it as soon as possible.
How to act
  • Review and delete conversations you no longer need:
  • Right‑click a conversation or use the three‑dot menu → Delete.
  • For bulk management: Profile → Privacy → Export or Delete History (you’ll be redirected to the Microsoft activity management page where you can export required chats for records and then delete them).
Practical tip
  • Export anything you may need for documentation or compliance first, then delete. If you ever pasted API keys or credentials in chat, rotate them immediately even after deletion — assume potential exposure. Independent security guidance recommends rotation after such accidental sharing.
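If you want a systematic check rather than eyeballing an export, a small script can flag credential‑shaped strings before you finish deleting. This is a sketch: the pattern list is illustrative (AWS‑style key IDs, `sk-` tokens, bearer tokens, PEM headers), not an exhaustive secret scanner, and the assumption that your export is readable as plain text is mine, not a Microsoft‑documented schema. Treat any hit as a credential to rotate.

```python
import re

# Heuristic patterns for common credential shapes; extend as needed.
# This is NOT an exhaustive secret scanner -- a miss does not mean "clean".
SECRET_PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Generic API key (sk-...)": re.compile(r"\bsk-[A-Za-z0-9]{20,}\b"),
    "Bearer token": re.compile(r"\bBearer\s+[A-Za-z0-9\-._~+/]{20,}=*"),
    "Private key block": re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
}

def scan_export(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, truncated_match) pairs found in exported chat text."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            # Truncate the match so the report itself doesn't leak the secret.
            hits.append((name, match.group(0)[:12] + "..."))
    return hits

if __name__ == "__main__":
    sample = "here is my key AKIAABCDEFGHIJKLMNOP please debug"
    for name, snippet in scan_export(sample):
        print(f"rotate credential: {name} -> {snippet}")
```

Run it over the exported file's contents; anything it reports should be rotated regardless of whether you also delete the conversation.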

4. Delete Copilot’s memory (retroactive wipe)

What it does
  • If you previously enabled Memory and want to remove stored facts, Copilot provides a Delete memory action (available on the web and most apps). Deleting memory removes the persistent facts the assistant recorded, although viewing the exact items stored is often not possible in the UI.
How to do it
  • Profile → Privacy → Delete memory → Confirm deletion.
Caveats
  • Deletion removes the memory store from consumer surfaces, but platform differences mean the Mac app may have a different workflow. Also, in enterprise tenants, deletion semantics can differ: memories stored in Exchange or other tenant systems may be discoverable by admins and subject to retention policies. If you are in a regulated environment, consult your admin before relying solely on the client UI.

5. Manage shared conversations (revoke links)

What it does
  • Copilot allows you to share conversation threads via live links or exported pages. That’s handy for collaboration, but live links are another leakage vector if they remain active beyond their useful life.
How to revoke shared links
  • In the Copilot website or Windows app: Profile → Privacy → Manage Shared Conversations / Manage Shared Links; select the conversation and choose Revoke (the X or trash icon). Revoking disables the link, but the conversation stays in your history until you delete it.
Best practice
  • When you must share, export a static copy and revoke the live link immediately after the recipient has what they need. Consider adding a calendar reminder to revoke links automatically after a short window if you share frequently.

6. Don’t let Copilot message you (mobile Proactivity)

What it does
  • The mobile Copilot app includes a Proactivity toggle that lets Copilot send you unprompted messages and push notifications based on app heuristics. The triggers can be opaque, and they vary by app version and platform.
Why you might disable it
  • If you prefer manual invocation and dislike unsolicited suggestions, turn Proactivity off to stop push messages and conserve battery.
How to change it
  • Mobile app → Menu → Account → Privacy → Proactivity → Off.

7. Turn off Data Sharing and Ad Personalization (Windows Copilot app)

What it does
  • The Copilot Windows app exposes two additional toggles: Diagnostic Data Sharing and Ad Personalization. The diagnostic toggle controls the level of optional telemetry the app sends. Disabling Ad Personalization stops Copilot and other Microsoft services from using your activity to tailor ads to you. Note that account‑level ad controls are separate and must be adjusted in your Microsoft Account privacy settings if you want a complete opt‑out across services.
Why you might change them
  • Turning off diagnostics limits optional telemetry; turning off ad personalization severs a direct path between your Copilot activity and targeted ad profiles. If your goal is to reduce targeted advertising, disable both and also visit account privacy pages to control personalized ads platform‑wide.
How to change it
  • Windows Copilot app → Profile → Settings → Privacy → disable Diagnostic Data Sharing and Ad Personalization.

A practical 10–15 minute checklist (do this now)

  • Open Copilot (web or Windows app) and sign in.
  • Profile → Privacy: toggle off Model training on text and Model training on voice.
  • Disable Personalization / Memory if you prefer stateless interactions; then select Delete memory to remove stored facts.
  • Export any conversations you need for records, then Delete your chat history (Profile → Privacy → Export or Delete History).
  • Revoke shared links (Profile → Privacy → Manage Shared Conversations → Revoke).
  • On mobile: disable Proactivity.
  • In the Windows Copilot app: disable Diagnostic Data Sharing and Ad Personalization. Then check your Microsoft Account privacy settings to disable account‑level ad personalization if needed.
  • If you ever pasted credentials, rotate them immediately. Independent security guides recommend this step as standard incident response.
Follow those steps and you’ll have removed the biggest vectors that allow Copilot to retain or forward personal context.

Critical analysis — strengths, weaknesses, and remaining risks

Strengths of the current model

  • Microsoft exposes explicit toggles for the most important privacy levers: model training opt‑outs, memory controls, history export/delete, shared‑link revocation, and per‑platform telemetry/ad toggles. That level of granularity helps users choose a posture that balances convenience and privacy. Several of the UI paths are consistent across platforms, and the web UI serves as the most authoritative control surface.
  • Enterprise features are designed with compliance in mind: memories and Copilot artifacts in Microsoft 365 are placed where admins can control retention and eDiscovery, which is necessary for regulated environments. Those capabilities show Microsoft understands enterprise governance needs and has built admin‑level tooling accordingly.

Weaknesses and residual risks

  • UI fragmentation and naming differences across platforms create user confusion. Memory may be called Personalization on macOS, Proactivity is mobile‑only, and some clients may lack a direct memory‑deletion UI. The safest path for users is to use the Copilot web settings to verify their posture.
  • Deletion semantics aren’t always transparent. In consumer flows, a “delete memory” or “delete history” action removes visible artifacts, but that does not necessarily guarantee what, if anything, survives in backup logs, or what admins can access in enterprise tenants. Where retention laws or organizational policies apply, copies may persist for compliance reasons. Treat deletion as lowering exposure, not as an absolute guarantee of erasure across every system.
  • Filtering sensitive categories is a useful promise, but it’s not perfect. Microsoft documents filtering for certain sensitive attributes, but ambiguous language or accidental mentions can slip through automated categorization and end up in memory or telemetry pools. Users who must guarantee non‑retention of particular facts should avoid entering them into chats at all.
  • Defaults and cross‑product interactions can surprise users. For example, disabling personalization inside Copilot may not automatically disable account‑level ad personalization; you must change account privacy settings separately. That separation of controls increases the risk users will assume an action turned off everything it didn’t.
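Because Microsoft's sensitive‑category filtering cannot be relied on absolutely, the only robust defense is to catch sensitive content before it is sent. A rough local pre‑send check can encode that habit; the term list here is illustrative and keyword matching will never catch every ambiguous phrasing, so this is a seatbelt, not a guarantee:

```python
import re

# Illustrative sensitive-term heuristic -- extend with whatever categories
# you personally never want stored. Keyword matching is inherently leaky.
SENSITIVE_TERMS = re.compile(
    r"\b(diagnos\w*|prescription|ssn|passport|salary|pregnan\w*)\b",
    re.IGNORECASE,
)

def safe_to_send(prompt: str) -> bool:
    """Return False if the prompt trips the sensitive-term heuristic."""
    return SENSITIVE_TERMS.search(prompt) is None
```

A wrapper that refuses (or warns) when `safe_to_send` returns False gives you a veto point before anything reaches Copilot's memory or telemetry pools.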

Enterprise caveat (critical)

If you use Copilot with a work or school account, your tenant administrators likely have the ability to set or override Copilot behaviors at scale. Memories and chats created in Microsoft 365 may be stored in tenant locations (for example, Exchange) and be discoverable via eDiscovery and Purview; admins can discover, retain, and delete artifacts under corporate policy. In short: never type anything into corporate Copilot you wouldn’t want your employer to see. This is not an edge case; enterprise discovery workflows are built precisely for accountability and compliance.

Recommendations and best practices for everyday users

  • Assume everything you type into an assistant could be logged. Treat Copilot like any other cloud service: don’t paste passwords, private keys, or regulated personal data into chats. If you do by mistake, rotate credentials immediately.
  • Use the Copilot web UI as your authoritative control plane. When in doubt about whether a client exposes a feature, the web settings are most comprehensive.
  • Export what you reasonably need, then delete history and memories you don’t want retained. Exporting gives you an audit copy if you need records later; deletion reduces on‑platform exposure.
  • Revoke shared links after use. If you collaborate by sharing Copilot threads, make revocation a routine step once collaboration ends.
  • For high‑sensitivity work, use a separate account or avoid cloud assistants entirely. If your workflows involve regulated data, consult legal or compliance before using Copilot. Enterprise tenants should use tenant‑bound Copilot deployments with admin policy enforcement.
  • Periodically check account‑level ad personalization and privacy dashboards. Turning off Copilot personalization does not necessarily turn off ad targeting across your Microsoft account; visit the account privacy controls and disable personalized ads and offers there if targeted ads bother you.

What I verified (and what I could not)

I verified the existence and function of the seven privacy toggles (model training text/voice, memory/personalization, history export/delete, memory deletion, shared‑link management, mobile Proactivity, and the Windows diagnostic/ad toggles) against Microsoft product and support documentation and independent reporting. The verification confirms that:
  • Model training opt‑outs exist and apply to text and voice interactions.
  • Memory/personalization can be disabled and deleted in consumer flows, but UI differences exist across clients.
  • Chat history can be exported and deleted via account privacy pages.
  • Shared conversation links can be revoked.
  • Windows Copilot app has Diagnostic Data Sharing and Ad Personalization toggles; account‑level ad controls are separate.
Unverifiable or conditional claims
  • Absolute guarantees about what is removed from every Microsoft internal log or backup after a “delete” action are not publicly documented in a way that a consumer can verify. Deletion in the client reduces visible exposure and is the correct step, but enterprise retention and backups create scenarios where copies may persist for compliance or investigative reasons. Treat deletion as strong hygiene but not proof of forensic erasure in every backend.

Final verdict — practical privacy hygiene for Copilot users

Copilot is a powerful assistant with useful, granular privacy controls. PCMag’s seven‑step checklist is an excellent starter kit: turn off model training for text and voice, disable personalization if you prefer stateless interactions, delete stored memories, manage and export/delete chat history, revoke shared links, disable mobile Proactivity, and switch off diagnostic telemetry and ad personalization in the Windows app. These steps can be completed in roughly 10–15 minutes and substantially reduce the largest exposure vectors for personal context leaving your account.
That said, privacy is never absolute. UI fragmentation, enterprise retention policies, and the need to rotate secrets after accidental sharing remain important caveats. Use the Copilot web UI for authoritative controls, apply the checklist above, and treat any corporate account as governed by your employer’s retention and compliance rules rather than private by default. With those safeguards in place, you can get the most practical benefits from Copilot while keeping control over how much of your life it remembers.
In short: use Copilot — but keep the keys to your story in your hands.

Source: PCMag Australia Use Microsoft Copilot? 7 Settings I Changed Right Away to Protect My Privacy