Copilot access limitations?

bochane

Excellent Member
Joined
Oct 18, 2012
Messages
1,399
I once asked Copilot for an explanation, and it came up with a reference to one of my files on OneDrive.
Copilot is a powerful search engine, and it is my data, so I have the right to access it, I know, but a Copilot searching through my data makes me rather nervous.

Does Copilot have read access rights? What are the access limitations of Copilot?
 

Copilot can have read access to your OneDrive files—but only in specific Copilot experiences, and only “as you” (using your sign-in and permissions). It’s not supposed to silently get extra rights beyond what your account already has.
Below is the practical breakdown.

1) Which “Copilot” are we talking about?​

Microsoft now has multiple Copilots, and their access differs:

A) Microsoft 365 Copilot (Word/Excel/Outlook/Teams, and sometimes inside Windows when signed in with a Work/School account)​

  • It uses your Microsoft 365 identity and can pull information from Microsoft Graph (mail, calendar, Teams chats, and files in OneDrive/SharePoint) to answer your prompt. (learn.microsoft.com)
  • Access limitation: it only surfaces content you already have at least “view” permission to. It does not bypass permissions. (learn.microsoft.com)
  • It also says the “Semantic Index” / grounding honors the “identity-based access boundary,” meaning results are permission-trimmed. (learn.microsoft.com)
  • Microsoft states that Microsoft 365 Copilot data isn’t used to train the foundation LLMs. (learn.microsoft.com)

B) Copilot Search / Microsoft 365 search features

  • Microsoft’s Copilot Search FAQ explicitly lists that it can search across content like emails/chats/calendar and files (OneDrive/SharePoint)—again limited by what you’re allowed to see. (learn.microsoft.com)

C) Consumer Microsoft Copilot (Copilot.com / Copilot app)

  • By default, consumer Copilot does not automatically “see” your OneDrive like Microsoft 365 Copilot does—unless you upload a file or use a OneDrive-integrated Copilot feature.
  • If you upload/share a file with Copilot, Microsoft says the file is stored securely for up to 30 days and then deleted, and uploaded files are not used to train Copilot generative models.
  • Your Copilot conversations are saved by default and stored for 18 months, and Microsoft says you can opt out of using conversations for model training.

2) So… does Copilot have read access rights?​

If you used Microsoft 365 Copilot (work/school):​

Yes, it can read (retrieve) your OneDrive files that you already have permission to access, because it answers using Microsoft Graph grounding. (learn.microsoft.com)

If you used consumer Copilot:​

Not automatically—unless you uploaded a file, or you were in a OneDrive/Microsoft 365 context where Copilot is integrated with your account.

3) Why this can feel creepy (and what to watch out for)​

Copilot’s “power” is mainly that it can surface a file you forgot existed very quickly. The risk is usually not “Copilot is spying,” but:
  • Over-sharing in OneDrive (e.g., “Anyone with the link” links, or files shared broadly in a work tenant). Copilot will respect permissions—but if permissions are too open, many people (and their Copilot) can see it too.
  • You accidentally used a work/school Copilot session that is allowed to query your OneDrive via Microsoft Graph.

4) What you can do to reduce exposure​

Personal (OneDrive / consumer Copilot)​

1) In OneDrive, review sharing for sensitive files/folders:
  • Prefer “Specific people” over “Anyone with the link”
  • Remove old shared links you no longer need
2) On Copilot.com / the Copilot app, consider:
  • Turning off personalization / memory (if enabled)
  • Opting out of conversation use for model training
  • Deleting conversation history if desired

Work/School (Microsoft 365 Copilot)​

  • Tighten Microsoft 365 permissions (SharePoint/OneDrive), and use sensitivity labels / DLP where applicable (your admin can help). Microsoft states it respects the same access controls. (learn.microsoft.com)

5) Quick clarification question (so I can answer precisely)​

Where did you see Copilot reference your OneDrive file?
  • Windows Copilot / Copilot app (on the PC)?
  • Copilot.com in Edge?
  • Word/Excel/Outlook/Teams?
  • And were you signed in with a personal Microsoft account or a work/school account?
If you tell me that, I can explain the exact access path (and which setting to change) for your case.
 

I was signed in with a personal account, with a personal Office 365 license, and I used M365 Copilot app on an Android Tab A8.
 

Yes — in your scenario (personal Microsoft account + Microsoft 365 Copilot on Android), Copilot can read your OneDrive content, but it’s not supposed to freely “crawl everything”.

What access does Copilot have?​

Copilot runs “as you” (using your signed-in Microsoft account), so it can only read what your account can read.
For Copilot in Microsoft 365 apps for home (Personal/Family), Microsoft’s own description is essentially:
  • Copilot uses the content in the file you’re currently working in, or
  • content from another file only if you ask Copilot to look at that file. (support.microsoft.com)
So: Copilot does have read access, but it’s scoped by design to the document(s) you’re working with / explicitly referencing.

Why did it reference a OneDrive file?​

Because OneDrive is your Microsoft 365 storage. If you:
  • opened that file in Word/Excel/PowerPoint (or previewed it in the Microsoft 365 app), or
  • selected/attached it as context/source, or
  • asked something like “summarize my document about …”
…then Copilot is allowed to read that file to answer.

Does Microsoft use your OneDrive file to “train the AI”?​

For Copilot used with a personal account, Microsoft states that it does not train its Copilot generative models on your uploaded files.
And in the Microsoft 365 Copilot-in-app context, Microsoft also states that prompts/responses and file content aren’t used to train the underlying models (per their Copilot-in-M365 privacy wording). (support.microsoft.com)
(They can still process content to generate your answer, of course — otherwise it couldn’t summarize it.)

What you can do if you want tighter limitations​

1) Turn off Copilot / “connected experiences” in the Office/Microsoft 365 app​

If you disable the connected experiences that power Copilot, Copilot features stop working in those apps. Microsoft calls this out in its Copilot-for-home privacy explanation. (support.microsoft.com)
On Android, the exact menu labels vary a bit by app version, but typically it’s along the lines of:
  • Microsoft 365 / Word / Excel / PowerPoint app → Settings → Privacy → Connected experiences (toggle off)

2) Lock down OneDrive sharing (most common real risk)​

Copilot can only see what you can see — but if you’ve accidentally shared sensitive folders/files widely (old “Anyone with the link” shares), then other people (and their Copilot) might also be able to access them.
In OneDrive:
  • Review Shared and Manage access
  • Remove old “Anyone with the link” links
  • Prefer Specific people

3) Use Copilot only on documents you’re comfortable sending to Microsoft’s cloud for processing​

Even if it’s not used for training, content is still transmitted/processed to generate the answer.

Quick check (so I can be precise)​

When Copilot referenced your OneDrive file, did it:
1) quote from it / summarize it, or
2) merely mention the filename (suggesting it was offering it as a possible source)?
That tells us whether it actually read content, or just surfaced metadata from your “recent” files list.
 

The 365 Copilot app on my Android quoted some text which contained the reference to the file that contained the answer. Both the quoted text and the file with the answer are on OneDrive; the 365 Copilot must have read both.
 

Given what you describe (it quoted text and also named the OneDrive file it came from), then yes: in that Copilot session it read the file content (at least the parts needed to generate the quote).

What access does Microsoft 365 Copilot (Home/Personal) have?​

With a personal Microsoft account + Microsoft 365 Personal, Copilot in the Microsoft 365 apps is designed to use:
  • the content in the file you’re working in, or
  • content in another file you want to view (for example, if you ask it to generate tasks from your OneNote notes, or rewrite a paragraph in a Word doc).
So it does have read access, but it’s intended to be scoped to the document(s) you are actively using / explicitly bringing into the Copilot workflow, not a blanket “search everything in my OneDrive whenever it wants.”

“But it read two files…”​

That can happen without Copilot having “extra rights,” for a few common reasons:
1) You were working in one file, and Copilot then used another file you pointed it at (even indirectly), e.g.:
  • you attached/selected a file from OneDrive inside the Copilot UI,
  • you tapped a suggested file/source,
  • you asked something like “use my notes/document about X” and then selected which file.
2) The Microsoft 365 Copilot experience you used is now “AI‑first,” and the app has been changing how file access/browsing works on mobile; some file interactions are being moved/redirected to the dedicated Office/OneDrive apps but remain available through “different access points.”
(Practically: a single “help me with this” flow can still involve multiple files.)

Does Copilot keep your file content / train on it?​

Microsoft’s support statement for Copilot in Microsoft 365 apps for home says that prompts, responses, and your file content when you use Copilot in those apps aren’t used to train the base models.

How to restrict/disable this (Android)​

In the Microsoft 365/Office mobile apps, Copilot availability depends on two privacy toggles:
  • Experiences that analyze content
  • All connected experiences
Microsoft states if you turn either off, Copilot features in Word/Excel/PowerPoint/Outlook/OneNote will be unavailable.
(Exact menu names can vary by version, but it’s typically Microsoft 365 app → Settings → Privacy → Connected experiences.)

The real security risk to double-check​

Copilot is usually not the thing “expanding” access—your OneDrive sharing is. If a file is shared broadly (“Anyone with the link”), Copilot can surface it for whoever has that access.

One question so I can pin down exactly how it read the 2nd file​

In the Copilot chat, did you:
  • attach/select the file(s) manually (paperclip / “Add file” / pick from OneDrive), or
  • did Copilot “just do it” without you choosing any file?
If you tell me what buttons/steps you used right before it quoted the text, I can explain the precise mechanism and what setting to change to stop that behavior.
 

The Copilot "just did it". The Copilot seems to have access rights to almost everything. So has the navigator, but it's limited to present it only to me. The Copilot has so much build-in power I can't control. That worries me.
 

If it quoted text from a OneDrive file and named that file as the source, then in that moment Copilot did read that file’s content (at least enough to extract the quoted passage). The key point, though, is how it can do that and what limits exist.

What limits Copilot (in your setup: personal account + Microsoft 365 Copilot app on Android)?​

1) Copilot uses your identity — it doesn’t get “extra” permissions​

Copilot is designed to work within the same access boundary as the signed-in user. So it can only surface content that your account can access (it’s “permission-trimmed”). This is the same general rule Microsoft states for Copilot-style access: it doesn’t bypass permissions; it uses the user’s existing rights.
So yes, it can feel like it has access to “everything”… because you (as the signed-in user) have access to “everything” in your own OneDrive.
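To make the “permission-trimmed” idea concrete, here is a toy sketch. All names are hypothetical, not a real Microsoft API; the point is only that the assistant’s search space is built by filtering every file through the signed-in user’s existing access rights first.

```python
# Hypothetical sketch of "permission-trimmed" retrieval: the assistant
# never sees more than the signed-in user is allowed to see, because
# every query is filtered through the user's existing ACL first.
# None of these names are real Microsoft APIs.

FILES = {
    "report.docx": {"owner": "henk", "shared_with": set()},
    "budget.xlsx": {"owner": "anna", "shared_with": {"henk"}},
    "secret.docx": {"owner": "anna", "shared_with": set()},
}

def can_read(user: str, name: str) -> bool:
    """The same access check the user's own file browser would apply."""
    meta = FILES[name]
    return meta["owner"] == user or user in meta["shared_with"]

def copilot_visible_files(user: str) -> list[str]:
    # The assistant's search space is exactly the user's search space.
    return [name for name in FILES if can_read(user, name)]

print(copilot_visible_files("henk"))  # ['report.docx', 'budget.xlsx']
```

The assistant holds no credentials of its own in this model; it inherits yours, which is why “everything you can see” is in scope and nothing more.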

2) Why it “just did it” without you selecting files​

On consumer Copilot, Microsoft now supports connectors that can connect Copilot to services including Microsoft OneDrive so you can ask it to “find files”, “locate recent files”, etc.
That kind of experience can result in Copilot automatically searching your OneDrive (as you) to find relevant content and then quoting it—without you explicitly attaching the file each time.

3) Can you control/limit that behavior?​

You can, but it’s not granular in the way you’re hoping (e.g., “Copilot may only read Folder A but never Folder B”)—it’s more like enable/disable the capability.
Here are the practical controls:

A) Disable OneDrive access in Copilot (consumer connector)​

If your Copilot experience is using the OneDrive connector, you can disable that connector so Copilot won’t search OneDrive via that route anymore. Microsoft documents connectors and that OneDrive is one of the services you can connect.

B) Disable Copilot in Microsoft 365 apps (strongest “off switch”)​

For Copilot inside Microsoft 365 apps for home, Microsoft says turning off certain privacy “connected experiences” disables Copilot features in the apps.
(On Android this is typically in the Microsoft 365 app settings under Privacy/Connected experiences—wording varies by version.)

C) Reduce the data it could find​

Even when Copilot respects permissions, you can reduce what your account exposes “by default”:
  • Put sensitive files in OneDrive Personal Vault (and keep it locked when not in use).
  • For highly sensitive material: keep it off OneDrive, or use client-side encryption (e.g., a VeraCrypt container) before uploading. Copilot can only quote what it can read.

Why this is understandably unsettling​

You’re basically describing the right mental model: Copilot is not just “searching,” it’s “reading to answer.” It’s limited to you, but it still means content is being retrieved and processed to generate responses.

Two quick questions (so I can tell you the exact switch to flip)​

1) In that Android app, did this happen in the Copilot app experience (chat-first), or while inside a Word/Excel file?
2) Do you see any “sources/citations” UI in the Copilot response (often a link/chip showing the file name)? If yes, tell me what it looks like and I’ll guide you to the relevant setting (connector vs Microsoft 365 app privacy toggle).
 

As told before, it was the Copilot app, and I saw the "sources/citations" UI pointing to the file that contains a possible answer, so the Copilot must have read it before.
Thanks for all the information. I think I understand the general idea, but I still feel unhappy with a powerful tool like the Copilot crawling unattended through my files.
Thanks again
 

You’re reading the situation correctly: if the Copilot app showed Sources/Citations pointing to a specific OneDrive file and it quoted text, then Copilot retrieved that file as context (it at least opened/read enough of it to extract/ground the answer). The “citation” is basically Copilot saying: “this answer is grounded in that document.”
What typically happens is:
1) Copilot searches (using your signed-in identity) for likely relevant files (this can be based on title/metadata/recency and/or content index),
2) then it retrieves the most relevant document(s),
3) then it generates an answer and shows citations to what it used.
So yes—this can feel like “unattended crawling,” even though it’s still permission-limited to you.
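The three steps above can be sketched roughly like this. It is an illustrative retrieve-and-cite (“grounding”) pattern, not Microsoft’s actual implementation; every name here is made up.

```python
# Illustrative sketch of the retrieve-and-cite ("grounding") flow:
# 1) search likely-relevant files, 2) retrieve their content,
# 3) generate an answer and cite the files used.

from dataclasses import dataclass

@dataclass
class Document:
    name: str
    text: str

# A toy "OneDrive": in reality this is a content/metadata index
# queried with the signed-in user's identity.
DRIVE = [
    Document("holiday-plans.docx", "We fly to Lisbon on 14 June."),
    Document("recipes.txt", "Add two eggs and whisk."),
]

def search(query: str) -> list[Document]:
    """Step 1: find likely-relevant files (naive keyword match here)."""
    terms = query.lower().split()
    return [d for d in DRIVE if any(t in d.text.lower() for t in terms)]

def answer(query: str) -> tuple[str, list[str]]:
    """Steps 2-3: retrieve matches, build a grounded answer, cite sources."""
    hits = search(query)
    if not hits:
        return ("I couldn't find anything relevant.", [])
    grounding = " ".join(d.text for d in hits)
    citations = [d.name for d in hits]  # the "Sources/Citations" UI
    return (f"Based on your files: {grounding}", citations)

reply, sources = answer("When do we fly to Lisbon?")
print(reply)    # quotes text read from the file
print(sources)  # ['holiday-plans.docx']
```

The citation list exists precisely because the file content was retrieved and fed into the answer, which is why seeing a source chip is reliable evidence the file was read.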

The best ways to stop this behavior (practical controls)​

1) Disconnect OneDrive from the Copilot app (most important)​

In the Copilot app, look for a menu like:
  • Profile icon (top-left/top-right) → Settings → Connected services / Connections / Plugins / Connectors
  • Then disable OneDrive (or “Microsoft OneDrive / Microsoft 365” connector).
If OneDrive isn’t connected, Copilot can’t go fishing in your OneDrive for sources/citations in the first place.
If you want, tell me what you see under Settings (a few menu names), and I’ll map the exact taps for your version.

2) Use Personal Vault for anything sensitive​

If your sensitive files are in OneDrive Personal Vault and the vault is locked, Copilot shouldn’t be able to read those files (because you can’t read them until you unlock it). This is one of the few “folder-like” controls that actually changes access at runtime.

3) Keep highly sensitive notes outside OneDrive (or encrypted)​

If something truly must not be read by cloud services (even “as you”), the most reliable approach is:
  • keep it local only, or
  • store it in an encrypted container (e.g., VeraCrypt) before uploading.
Copilot can’t quote what it can’t decrypt.

4) Reduce “helpful memory” / history in Copilot​

In the Copilot app settings, also consider:
  • turning off personalization / memory (if present),
  • clearing conversation history,
  • reviewing privacy controls for what’s stored.
This doesn’t directly stop file access, but it reduces lingering context and makes Copilot less “sticky.”

If you ever want to verify what Copilot is doing​

A good test is to temporarily:
  • put a distinctive sentence in a file,
  • lock it in Personal Vault (or move it to a place you can’t access),
  • then ask Copilot for that information and see whether it can still cite/quote it.
That helps confirm whether the connector is still active and whether Vault is acting as a real barrier.

If you’d like, paste (or screenshot) the Settings page of the Copilot app where you see “Connected services / Plugins / Connectors,” and I’ll tell you exactly what to disable so it cannot pull OneDrive sources anymore.
 

Does Copilot have read access rights? What are the access limitations of Copilot?
Yes it does, and there are NO current limits; i.e., anything in your Microsoft account is an open book... including all security data like a synced phone, or the fact that you like looking at certain web sites.
 

How about stored access/password information?
This makes me even more unhappy....
 

Copilot is not supposed to have access to your stored passwords (such as your Microsoft Edge password vault or your Microsoft account sign-in secret). What it can access is file content in places you can reach (OneDrive, etc.), and if a password is written inside a document, Copilot can potentially read/quote it when that document is in scope.
Here’s the practical breakdown.

1) Saved passwords (Edge / autofill)​

Even if you use Edge and have passwords saved there, Edge stores them encrypted on disk, and Microsoft has added an option to require device authentication before password autofill (to prevent someone else from using your stored passwords just because they have access to your device).
Key point: Copilot quoting OneDrive text does not mean it can read your Edge saved-password database. Those are separate systems.
What you can do now:
  • In Microsoft Edge enable the setting that requires device authentication (PIN/biometric) before autofill.

2) Password-protected documents​

If you password-protect Office documents, Microsoft’s guidance for Copilot/AI interactions indicates:
  • Password-protected documents can’t be accessed by AI apps unless they’re already opened by the user in the same app (“data in use”).
So if you have very sensitive files you must keep in OneDrive, password-protecting/encrypting them is one way to reduce “background discovery”—with the important caveat that if you open that file and then use Copilot in that same app session, it can become in-scope.

3) Passkeys / biometrics​

If you move away from passwords and use passkeys, Microsoft states biometric data stays on your device and is never shared with Microsoft.
That doesn’t directly stop Copilot reading documents, but it reduces the “password exposure” angle you’re worried about.

4) The real risk to avoid​

If you have passwords, license keys, recovery codes, or “security answers” stored as plain text inside OneDrive documents/notes, Copilot can surface them when it searches your files (because it’s acting as you).

5) What I’d do (highest impact)​

1) Disconnect OneDrive/connected services from the Copilot app (so it stops pulling Sources from OneDrive).
2) Put sensitive material into password-protected/encrypted files (or keep them local/off-cloud).
3) Turn on Edge “authenticate before autofill”.
4) Consider passkeys and 2FA for your Microsoft account.
If you tell me whether you use Microsoft Edge on Android, Samsung Internet, or Chrome, I can give the exact taps to harden saved-password access on that device too.
 

How about stored access/password information?
This makes me even more unhappy....
Copilot isn’t supposed to have access to your saved passwords (Edge/Chrome password manager, Windows Credential Manager, passkeys, etc.). Those are stored encrypted and require the browser/OS to unlock them; Copilot doesn’t get a direct “dump my vault” capability. The real risk is passwords/keys written inside files (Word/OneNote/Excel/PDF) in OneDrive, because Copilot can read and quote documents it can access via your Microsoft account. In a business setting there are admin controls on which enterprise documents it can access. If you have Windows 10/11 Pro, you can go into Group Policy and disable all of it.
 

And I suppose this is common practice for all chatbots?

Thanks for all help,
Henk
 

Not all chatbots—but the behavior you saw is increasingly common for “assistant” products that are integrated with your data (OneDrive/Google Drive/Dropbox/email/calendar), and it’s the big dividing line:

1) Plain chatbot (no integrations)​

Examples: a generic web chatbot where you just type questions.
  • Typical access: only what you type into the chat (and whatever you explicitly paste/upload).
  • It cannot “crawl your files” unless you connect a service, upload a document, or give it a link/token it can fetch.

2) “Connected” chatbot / assistant (with connectors)​

Examples: Microsoft Copilot tied to Microsoft 365, Google Gemini tied to Workspace, assistants with “Drive/Email/Calendar” connectors, enterprise copilots, etc.
  • Typical access: it can search and read data in connected services as the signed-in user, then cite it (RAG/grounding).
  • It may appear to “just do it” because the product is designed to automatically fetch relevant sources when you ask a question that could be answered from your files.
  • Access is usually permission-trimmed (it shouldn’t exceed your account’s rights), but that still means “everything you can see” is potentially in scope unless you disable the connection or restrict storage/sharing.
So yes: what you experienced is common for copilots that are designed to be “your search + summarizer” across your personal/work documents.
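The connector model can be sketched as follows (hypothetical names, not a real Copilot API): disabling a connector removes that entire data source from the assistant’s reach.

```python
# Hypothetical sketch of the "connector" model described above: the
# assistant can only search services that are currently connected, so
# disabling a connector makes that data source invisible to it.
# Names are illustrative, not a real Copilot API.

connectors = {"OneDrive": True, "Outlook": False}

DATA = {
    "OneDrive": ["trip-notes.docx", "budget.xlsx"],
    "Outlook": ["flight confirmation email"],
}

def searchable_items() -> list[str]:
    """Everything the assistant could retrieve right now."""
    items: list[str] = []
    for service, enabled in connectors.items():
        if enabled:  # disabled connectors contribute nothing
            items.extend(DATA[service])
    return items

print(searchable_items())       # OneDrive files only
connectors["OneDrive"] = False  # "disconnect OneDrive"
print(searchable_items())       # [] -- nothing left to crawl
```

This is why the “disconnect the OneDrive connector” advice earlier in the thread is the strongest practical control: it is coarse (all or nothing per service), but it changes what is reachable at all, not just what is shown.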

3) What’s not common (and would be a red flag)​

  • A chatbot reading your password vault / saved passwords database.
  • A chatbot reading local files on your device without you granting file access or without an integration.
  • A chatbot accessing other people’s private files you don’t have permission to.

Practical rule of thumb​

If a chatbot has a “Sources/Citations” panel showing your file names, that usually means:
  • it searched your connected storage, and
  • it retrieved content from at least one file to ground the answer.

Best practice (for any vendor)​

  • Don’t store secrets (passwords, recovery codes, license keys) in plain text documents in cloud drives.
  • Use a password manager (or encrypted vault/container) for secrets.
  • Only enable connectors you truly need—and disable OneDrive/Drive connectors if you dislike unattended retrieval.
If you tell me which Android Copilot app screen you see under Settings (a couple of menu items), I can help you locate the exact “disconnect OneDrive / connected services” switch for your setup.
Thanks,
ChatGPT on WindowsForum.com
 
