Microsoft has confirmed a logic error in Microsoft 365 Copilot Chat that, for a window of weeks spanning late January and early February 2026, allowed the assistant's "Work" chat experience to read and summarize email messages stored in users' Sent Items and Drafts folders, including messages explicitly labeled Confidential and covered by Microsoft Purview sensitivity labels and Data Loss Prevention (DLP) policies.

The company logged the issue internally as service advisory CW1226324 and attributes it to a server-side code error that let items from those folders into Copilot's retrieval pipeline; the flaw was detected in late January and has since been patched. The lapse is uncomfortable precisely because Copilot did what it was built to do: it read, indexed and summarized corporate communications, and in the process it summarized content that organizations had explicitly marked Confidential, exposing a gap between AI convenience and long-standing enterprise access controls and reigniting questions about how embedded AI features must be governed inside Microsoft 365.

The fix has also prompted a wider rethink of how DLP must work in an era of always-on AI assistants. Microsoft has quietly tightened one of the most consequential guardrails for enterprise AI: Purview DLP policies that block Copilot from processing sensitivity-labeled files will now apply to Word, Excel, and PowerPoint files regardless of where they are stored.
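Microsoft has not published the faulty code, but the class of failure is familiar: a retrieval pipeline is supposed to drop any mailbox item whose sensitivity label the DLP policy excludes, and any item sitting in an out-of-scope folder, before the content ever reaches the model. The Python sketch below is purely illustrative, with hypothetical names and policy shapes rather than anything from Microsoft's implementation; it shows how a single permissive fallback in that kind of filter can quietly let Confidential items from Drafts and Sent Items through.

```python
from dataclasses import dataclass

@dataclass
class MailItem:
    """Hypothetical view of a mailbox item as a retrieval pipeline might see it."""
    subject: str
    folder: str              # e.g. "Inbox", "Drafts", "Sent Items"
    sensitivity_label: str   # e.g. "General", "Confidential"

# Illustrative policy inputs, not Microsoft's actual configuration:
# labels the DLP policy excludes from assistant processing, and the
# folders the "Work" chat is meant to retrieve from.
EXCLUDED_LABELS = {"Confidential", "Highly Confidential"}
ALLOWED_FOLDERS = {"Inbox", "Archive"}

def is_retrievable_buggy(item: MailItem) -> bool:
    # Logic error: the label check only runs for folders the code expects,
    # so anything from Drafts or Sent Items skips DLP enforcement entirely
    # and falls through to the permissive default.
    if item.folder in ALLOWED_FOLDERS:
        return item.sensitivity_label not in EXCLUDED_LABELS
    return True  # should be False: out-of-scope folders must be dropped

def is_retrievable_fixed(item: MailItem) -> bool:
    # Enforce both conditions unconditionally: an in-scope folder AND a label
    # the DLP policy permits, before content ever reaches the model.
    return (item.folder in ALLOWED_FOLDERS
            and item.sensitivity_label not in EXCLUDED_LABELS)

if __name__ == "__main__":
    items = [
        MailItem("Q1 pipeline review", "Inbox", "General"),
        MailItem("Merger terms (draft)", "Drafts", "Confidential"),
        MailItem("Counter-offer details", "Sent Items", "Confidential"),
    ]
    for item in items:
        print(f"{item.subject!r:<26} buggy={is_retrievable_buggy(item)!s:<5} "
              f"fixed={is_retrievable_fixed(item)}")
```

If the bug was even roughly this shape, the takeaway for administrators is that policy enforcement has to live at the retrieval boundary itself rather than rely on upstream scoping assumptions, which is also the direction of Microsoft's follow-up change extending label-based blocking to Office files regardless of where they sit.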
Background
In...